
Unable to populate dataset with YAML configuration file or Dataset provider implementation

See original GitHub issue

I have the following basic test, which uses JUnit 5, MockK, Testcontainers, and Database Rider.

@TestInstance(TestInstance.Lifecycle.PER_CLASS)
@ExtendWith(DBUnitExtension::class, MockKExtension::class)
@DBRider
internal class CaseControllerDbImplTest {
    @MockK
    private lateinit var assetSecurityProvider: AssetSecurityProvider

    @MockK
    private lateinit var bdoConfigController: BdoConfigController

    @MockK
    private lateinit var hibernateController: HibernateController

    @MockK
    private lateinit var processConfigController: ProcessConfigController

    @MockK
    private lateinit var processSecurityProvider: ProcessSecurityProvider

    @MockK
    private lateinit var rolesController: RolesController

    @MockK
    private lateinit var workflowQueryController: WorkflowQueryController

    @MockK
    private lateinit var tasksController: TasksController

    private lateinit var connectionHolder: ConnectionHolder // DBRider
    private val postgres = DatabaseContainer.PostgreSQL // Testcontainers
    private lateinit var sut: CaseController
    private lateinit var session: Session // Hibernate session

    @BeforeAll
    fun initAll() {
        MockKAnnotations.init(this)
        postgres.start()
        session = Hibernate(postgres.jdbcUrl, postgres.username, postgres.password,
                WorkflowProcess::class.java).session
        session.doWork { connectionHolder = ConnectionHolder { it } } // Bind Hibernate connection
    }

    @BeforeEach
    fun init() {
        every { hibernateController.session } returns session

        sut = CaseControllerDbImpl(
                assetSecurityProvider,
                bdoConfigController,
                hibernateController,
                processConfigController,
                processSecurityProvider,
                rolesController,
                workflowQueryController,
                tasksController)
    }

    @AfterEach
    fun tearDown() {
        clearAllMocks()
        session.close()
    }

    @AfterAll
    fun tearDownAll() {
        unmockkAll()
        session.close()
        postgres.stop()
    }

    @Test
    @DataSet(provider = WorkflowDatasetProvider::class)
    fun `should find the process instance`() {
        val result = sut.getProcessInstance("ap", "invoices", "10000", "root")

        assertThat(result).isNotNull()
    }
}

I spent a lot of time trying to populate the database table from entries in a YAML file, but while debugging the Database Rider code I found that the dataset resource input stream is never found, irrespective of the path provided. I am now trying a programmatic approach by creating a dataset provider class, and I receive the following log entries.
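For context, the declarative approach I was attempting looks like this — a hypothetical `src/test/resources/datasets/workflow.yml` (the file name, table name, and column names here are illustrative, not taken from the real project), referenced from the test via `@DataSet("workflow.yml")`:

```yaml
# Hypothetical dataset file: src/test/resources/datasets/workflow.yml
# Table and column names are illustrative; they must match your actual schema.
WorkflowProcesses:
  - id: 10000
    client: "ap"
    process: "invoices"
    owner: "root"
```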

OpenJDK 64-Bit Server VM warning: Sharing is only supported for boot loader classes because bootstrap classpath has been appended
11:01:49.776 [main] INFO  org.testcontainers.dockerclient.DockerClientProviderStrategy - Loaded org.testcontainers.dockerclient.UnixSocketClientProviderStrategy from ~/.testcontainers.properties, will try it first
11:01:51.056 [main] INFO  org.testcontainers.dockerclient.UnixSocketClientProviderStrategy - Accessing docker with local Unix socket
11:01:51.056 [main] INFO  org.testcontainers.dockerclient.DockerClientProviderStrategy - Found Docker environment with local Unix socket (unix:///var/run/docker.sock)
11:01:51.262 [main] INFO  org.testcontainers.DockerClientFactory - Docker host IP address is localhost
11:01:51.308 [main] INFO  org.testcontainers.DockerClientFactory - Connected to docker: 
  Server Version: 19.03.8
  API Version: 1.40
  Operating System: Docker Desktop
  Total Memory: 1989 MB
11:01:51.476 [main] INFO  org.testcontainers.utility.RegistryAuthLocator - Credential helper/store (docker-credential-desktop) does not have credentials for quay.io
11:01:51.956 [main] INFO  org.testcontainers.DockerClientFactory - Ryuk started - will monitor and terminate Testcontainers containers on JVM exit
11:01:51.956 [main] INFO  org.testcontainers.DockerClientFactory - Checking the system...
11:01:51.957 [main] INFO  org.testcontainers.DockerClientFactory - ✔︎ Docker server version should be at least 1.6.0
11:01:52.069 [main] INFO  org.testcontainers.DockerClientFactory - ✔︎ Docker environment should have more than 2GB free disk space
11:01:52.093 [main] INFO  🐳 [postgres:9.6.12] - Creating container for image: postgres:9.6.12
11:01:52.136 [main] INFO  org.testcontainers.utility.RegistryAuthLocator - Credential helper/store (docker-credential-desktop) does not have credentials for index.docker.io
11:01:52.225 [main] INFO  🐳 [postgres:9.6.12] - Starting container with ID: ce27aa2674a0c65dad091048fb68cbed684e771a4cf5a7ce6d2d24e06c11cc4a
11:01:52.502 [main] INFO  🐳 [postgres:9.6.12] - Container postgres:9.6.12 is starting: ce27aa2674a0c65dad091048fb68cbed684e771a4cf5a7ce6d2d24e06c11cc4a
11:01:56.546 [main] INFO  🐳 [postgres:9.6.12] - Container postgres:9.6.12 started in PT6.813481S
11:01:56.714 [main] INFO  org.hibernate.Version - HHH000412: Hibernate Core {5.4.10.Final}
11:01:56.821 [main] INFO  org.hibernate.annotations.common.Version - HCANN000001: Hibernate Commons Annotations {5.1.0.Final}
11:01:56.974 [main] INFO  org.hibernate.c3p0.internal.C3P0ConnectionProvider - HHH010002: C3P0 using driver: null at URL: jdbc:postgresql://localhost:32812/app-server?loggerLevel=OFF
11:01:56.974 [main] INFO  org.hibernate.c3p0.internal.C3P0ConnectionProvider - HHH10001001: Connection properties: {password=****, user=postgres}
11:01:56.974 [main] INFO  org.hibernate.c3p0.internal.C3P0ConnectionProvider - HHH10001003: Autocommit mode: false
11:01:56.974 [main] WARN  org.hibernate.c3p0.internal.C3P0ConnectionProvider - HHH10001006: No JDBC Driver class was specified by property hibernate.connection.driver_class
11:01:57.008 [MLog-Init-Reporter] INFO  com.mchange.v2.log.MLog - MLog clients using slf4j logging.
11:01:57.187 [main] INFO  com.mchange.v2.c3p0.C3P0Registry - Initializing c3p0-0.9.5.3 [built 27-January-2019 00:11:37 -0800; debug? true; trace: 10]
11:01:57.229 [main] INFO  org.hibernate.c3p0.internal.C3P0ConnectionProvider - HHH10001007: JDBC isolation level: <unknown>
11:01:57.252 [main] INFO  com.mchange.v2.c3p0.impl.AbstractPoolBackedDataSource - Initializing c3p0 pool... com.mchange.v2.c3p0.PoolBackedDataSource@6311873f [ connectionPoolDataSource -> com.mchange.v2.c3p0.WrapperConnectionPoolDataSource@b8e1ca35 [ acquireIncrement -> 3, acquireRetryAttempts -> 30, acquireRetryDelay -> 1000, autoCommitOnClose -> false, automaticTestTable -> null, breakAfterAcquireFailure -> false, checkoutTimeout -> 0, connectionCustomizerClassName -> null, connectionTesterClassName -> com.mchange.v2.c3p0.impl.DefaultConnectionTester, contextClassLoaderSource -> caller, debugUnreturnedConnectionStackTraces -> false, factoryClassLocation -> null, forceIgnoreUnresolvedTransactions -> false, forceSynchronousCheckins -> false, identityToken -> z8kflta9xc0z7e181tlee|3070f3e6, idleConnectionTestPeriod -> 60, initialPoolSize -> 3, maxAdministrativeTaskTime -> 0, maxConnectionAge -> 0, maxIdleTime -> 2520, maxIdleTimeExcessConnections -> 0, maxPoolSize -> 15, maxStatements -> 5000, maxStatementsPerConnection -> 0, minPoolSize -> 3, nestedDataSource -> com.mchange.v2.c3p0.DriverManagerDataSource@df286719 [ description -> null, driverClass -> null, factoryClassLocation -> null, forceUseNamedDriverClass -> false, identityToken -> z8kflta9xc0z7e181tlee|63917fe1, jdbcUrl -> jdbc:postgresql://localhost:32812/app-server?loggerLevel=OFF, properties -> {password=******, user=******} ], preferredTestQuery -> select 1;, privilegeSpawnedThreads -> false, propertyCycle -> 0, statementCacheNumDeferredCloseThreads -> 0, testConnectionOnCheckin -> false, testConnectionOnCheckout -> true, unreturnedConnectionTimeout -> 0, usesTraditionalReflectiveProxies -> false; userOverrides: {} ], dataSourceName -> null, extensions -> {}, factoryClassLocation -> null, identityToken -> z8kflta9xc0z7e181tlee|51eb0e84, numHelperThreads -> 3 ]
11:01:57.574 [main] INFO  org.hibernate.dialect.Dialect - HHH000400: Using dialect: org.hibernate.dialect.PostgreSQL95Dialect
11:01:58.536 [main] INFO  org.hibernate.orm.connections.access - HHH10001501: Connection obtained from JdbcConnectionAccess [org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator$ConnectionProviderJdbcConnectionAccess@2507a170] for (non-JTA) DDL execution was not in auto-commit mode; the Connection 'local transaction' will be committed and the Connection will be set into auto-commit mode.
11:01:58.572 [main] INFO  org.hibernate.engine.transaction.jta.platform.internal.JtaPlatformInitiator - HHH000490: Using JtaPlatform implementation: [org.hibernate.engine.transaction.jta.platform.internal.NoJtaPlatform]

11:01:59.063 [main] INFO  org.dbunit.database.DatabaseConfig - The property ending with 'schema' was not found. Please notify a dbunit developer to add the property to the class org.dbunit.database.DatabaseConfig
11:01:59.063 [main] INFO  org.dbunit.database.DatabaseConfig - The property ending with 'replacers' was not found. Please notify a dbunit developer to add the property to the class org.dbunit.database.DatabaseConfig
11:01:59.067 [main] INFO  com.github.database.rider.core.dataset.DataSetExecutorImpl - DBUnit configuration for dataset executor 'junit5':
cacheConnection: true
cacheTableNames: true
mergeDataSets: false
caseSensitiveTableNames: false
caseInsensitiveStrategy: UPPERCASE
leakHunter: false
schema: null
allowEmptyFields: false
fetchSize: 100
qualifiedTableNames: false
batchSize: 100
batchedStatements: false
caseSensitiveTableNames: false
replacers: [com.github.database.rider.core.replacers.DateTimeReplacer@5e0442dd, com.github.database.rider.core.replacers.UnixTimestampReplacer@18e76101, com.github.database.rider.core.replacers.NullReplacer@79349b61]



java.lang.RuntimeException: Could not create dataset for test 'should find the process instance'.

	at com.github.database.rider.core.RiderRunner.runBeforeTest(RiderRunner.java:47)
	at com.github.database.rider.junit5.DBUnitExtension.beforeTestExecution(DBUnitExtension.java:58)
	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeBeforeTestExecutionCallbacks$4(TestMethodTestDescriptor.java:142)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:72)
	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeBeforeMethodsOrCallbacksUntilExceptionOccurs(TestMethodTestDescriptor.java:156)
	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeBeforeTestExecutionCallbacks(TestMethodTestDescriptor.java:141)
	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:112)
	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:59)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:105)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:72)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:95)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:71)
	at java.base/java.util.ArrayList.forEach(ArrayList.java:1507)
	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:110)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:72)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:95)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:71)
	at java.base/java.util.ArrayList.forEach(ArrayList.java:1507)
	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:110)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:72)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:95)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:71)
	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:32)
	at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57)
	at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:51)
	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:220)
	at org.junit.platform.launcher.core.DefaultLauncher.lambda$execute$6(DefaultLauncher.java:188)
	at org.junit.platform.launcher.core.DefaultLauncher.withInterceptedStreams(DefaultLauncher.java:202)
	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:181)
	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:128)
	at com.intellij.junit5.JUnit5IdeaTestRunner.startRunnerWithArgs(JUnit5IdeaTestRunner.java:69)
	at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
	at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:230)
	at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:58)
Caused by: com.github.database.rider.core.exception.DataBaseSeedingException: Could not initialize dataset: , dataset provider: uk.co.processflows.platform.workflow.WorkflowDatasetProvider
	at com.github.database.rider.core.dataset.DataSetExecutorImpl.createDataSet(DataSetExecutorImpl.java:161)
	at com.github.database.rider.core.RiderRunner.runBeforeTest(RiderRunner.java:45)
	... 35 more
Caused by: org.dbunit.dataset.DataSetException: Exception while searching the dependent tables.
	at org.dbunit.database.DatabaseSequenceFilter.sortTableNames(DatabaseSequenceFilter.java:105)
	at org.dbunit.database.DatabaseSequenceFilter.<init>(DatabaseSequenceFilter.java:67)
	at com.github.database.rider.core.dataset.DataSetExecutorImpl.performSequenceFiltering(DataSetExecutorImpl.java:314)
	at com.github.database.rider.core.dataset.DataSetExecutorImpl.createDataSet(DataSetExecutorImpl.java:144)
	... 36 more
Caused by: org.dbunit.util.search.SearchException: org.dbunit.dataset.NoSuchTableException: The table 'WORKFLOWPROCESSES' does not exist in schema 'null'
	at org.dbunit.database.search.AbstractMetaDataBasedSearchCallback.getNodes(AbstractMetaDataBasedSearchCallback.java:154)
	at org.dbunit.database.search.AbstractMetaDataBasedSearchCallback.getNodesFromImportedKeys(AbstractMetaDataBasedSearchCallback.java:99)
	at org.dbunit.database.search.ImportedKeysSearchCallback.getEdges(ImportedKeysSearchCallback.java:53)
	at org.dbunit.util.search.DepthFirstSearch.reverseSearch(DepthFirstSearch.java:264)
	at org.dbunit.util.search.DepthFirstSearch.search(DepthFirstSearch.java:148)
	at org.dbunit.util.search.DepthFirstSearch.search(DepthFirstSearch.java:104)
	at org.dbunit.database.search.TablesDependencyHelper.getDependentTables(TablesDependencyHelper.java:92)
	at org.dbunit.database.search.TablesDependencyHelper.getDependentTables(TablesDependencyHelper.java:73)
	at org.dbunit.database.DatabaseSequenceFilter.getDependencyInfo(DatabaseSequenceFilter.java:201)
	at org.dbunit.database.DatabaseSequenceFilter.sortTableNames(DatabaseSequenceFilter.java:101)
	... 39 more
Caused by: org.dbunit.dataset.NoSuchTableException: The table 'WORKFLOWPROCESSES' does not exist in schema 'null'
	at org.dbunit.database.search.AbstractMetaDataBasedSearchCallback.getNodes(AbstractMetaDataBasedSearchCallback.java:186)
	at org.dbunit.database.search.AbstractMetaDataBasedSearchCallback.getNodes(AbstractMetaDataBasedSearchCallback.java:149)
	... 48 more

Is there something I’m failing to do?

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 11 (5 by maintainers)

Top GitHub Comments

1 reaction
rmpestano commented, Apr 16, 2020

I found your issue. The problem is that the dataset provider uses the global DBUnit configuration, because we cannot read the @DBUnit annotation from inside a dataset provider class. Because of this, caseSensitiveTableNames was not being applied by the dataset provider.

To fix it, you can either pass a configuration to the dataset builder:

val builder = DataSetBuilder(DBUnitConfig().addDBUnitProperty("caseSensitiveTableNames", true))

or add a dbunit.yml in src/test/resources and enable case-sensitive table names:

properties:
  caseSensitiveTableNames: true

I also put the table name between quotes, otherwise Hibernate won’t respect the name’s case:

@Table(name="\"WorkflowProcesses\"")

I also changed the table name in your dataset provider to WorkflowProcesses.
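Putting the three fixes together, the provider could look roughly like this — a sketch, not the maintainer's exact code, assuming Database Rider's `DataSetProvider` and `DataSetBuilder` API (import paths may differ by version) and illustrative column names and values:

```kotlin
import com.github.database.rider.core.api.dataset.DataSetProvider
import com.github.database.rider.core.configuration.DBUnitConfig
import com.github.database.rider.core.dataset.builder.DataSetBuilder
import org.dbunit.dataset.IDataSet

// Sketch of a case-sensitive dataset provider. Column names and values are
// illustrative assumptions; match them to the real WorkflowProcesses schema.
class WorkflowDatasetProvider : DataSetProvider {
    override fun provide(): IDataSet {
        // The config must be passed here: @DBUnit on the test class is not
        // visible from inside a dataset provider.
        val builder = DataSetBuilder(
                DBUnitConfig().addDBUnitProperty("caseSensitiveTableNames", true))
        return builder
                .table("WorkflowProcesses") // must match the quoted @Table name exactly
                .row()
                    .column("id", 10000)
                    .column("client", "ap")
                    .column("process", "invoices")
                .build()
    }
}
```

With caseSensitiveTableNames enabled, the table name is no longer upper-cased to WORKFLOWPROCESSES, which is what produced the NoSuchTableException in the stack trace above.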


Here is a working zip of your project:

rider-sample-fixed.zip

0 reactions
rmpestano commented, Apr 15, 2020

I just added a section about dataset provider configuration in the docs, see here.
