Operations on tables aren't executed when merged datasets are used
Hi everyone. I use the mergeDataSets feature, and if I use a class-level dataset with empty tables, the data from the method-level dataset is not inserted. Here are the source files:
Base test class
@DBRider
@DataSet(value = "datasets/empty.yml", tableOrdering = {"first_table", "second_table", "third_table"})
abstract class BaseTest {
}
Child test class
public class ChildTest extends BaseTest {
@Test
@DataSet(value = "datasets/dataset_with_data.yml", useSequenceFiltering = false)
@ExpectedDataSet(value = "datasets/dataset_with_data.yml", orderBy = "id")
void childTest() throws Exception {
        ...
    }
}
empty.yml
first_table:
second_table:
third_table:
dataset_with_data.yml
first_table:
- name: first_record
- name: second_record
second_table:
- name: first_record
- name: second_record
I want to use the empty.yml file to simplify cleaning of the database (I use the CLEAN_INSERT strategy) and avoid duplicating the empty tables in every dataset. The @DataSet properties cleanBefore and cleanAfter cannot be used, as they clear more tables than we use.
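For context, class-level and method-level datasets are only combined when dataset merging is switched on; in Database Rider this is done with the mergeDataSets flag on the @DBUnit annotation (verify the attribute name against the Database Rider version you use). A sketch of the base class with merging enabled:

```java
// Sketch only: assumes Database Rider's @DBUnit(mergeDataSets = true) flag.
@DBRider
@DBUnit(mergeDataSets = true)
@DataSet(value = "datasets/empty.yml",
         tableOrdering = {"first_table", "second_table", "third_table"})
abstract class BaseTest {
}
```

With this in place, the method-level @DataSet from ChildTest is merged with empty.yml, which is exactly the scenario where the skip described below occurs.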
Why the operation isn't executed: DataSetExecutorImpl
@Override
public void createDataSet(DataSetConfig dataSetConfig) {
...
operation.execute(getRiderDataSource().getDBUnitConnection(), resultingDataSet);
CompositeOperation
public void execute(IDatabaseConnection connection, IDataSet dataSet)
throws DatabaseUnitException, SQLException
{
logger.debug("execute(connection={}, dataSet={}) - start", connection, dataSet);
for (int i = 0; i < _actions.length; i++)
{
DatabaseOperation action = _actions[i];
action.execute(connection, dataSet);
}
}
AbstractBatchOperation
@Override
public void execute(IDatabaseConnection connection, IDataSet dataSet)
throws DatabaseUnitException, SQLException
{
logger.debug("execute(connection={}, dataSet={}) - start", connection,
dataSet);
DatabaseConfig databaseConfig = connection.getConfig();
IStatementFactory factory = (IStatementFactory) databaseConfig
.getProperty(DatabaseConfig.PROPERTY_STATEMENT_FACTORY);
boolean allowEmptyFields = connection.getConfig()
.getFeature(DatabaseConfig.FEATURE_ALLOW_EMPTY_FIELDS);
// for each table
ITableIterator iterator = iterator(dataSet);
while (iterator.next())
{
ITable table = iterator.getTable();
String tableName = table.getTableMetaData().getTableName();
logger.trace("execute: processing table='{}'", tableName);
// Do not process empty table
if (isEmpty(table))
{
continue;
}
There is a check that skips empty tables, but in our situation we have a CompositeTable whose _metaData comes from the first table (the one from empty.yml), so table processing is skipped even though the second table contains data.
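The effect can be modelled in a few lines of plain Java. The Meta/Table records below are simplified stand-ins for DBUnit's ITableMetaData/ITable (an assumption for illustration, not DBUnit's real classes); composite() mimics CompositeTable taking its metadata from the first table, and isEmpty() mirrors the guard in AbstractBatchOperation:

```java
import java.util.List;

public class EmptyMetadataSkipDemo {

    // Simplified stand-ins for DBUnit's ITableMetaData / ITable.
    record Meta(String tableName, List<String> columns) {}
    record Table(Meta meta, List<List<Object>> rows) {}

    // Like DBUnit's CompositeTable(ITable, ITable): the metadata of the
    // FIRST table becomes the metadata of the combined table.
    static Table composite(Table t1, Table t2) {
        List<List<Object>> rows = new java.util.ArrayList<>(t1.rows());
        rows.addAll(t2.rows());
        return new Table(t1.meta(), rows);
    }

    // Mirrors the guard in AbstractBatchOperation: a table whose metadata
    // has no columns (or which has no rows) is treated as empty and skipped.
    static boolean isEmpty(Table t) {
        return t.meta().columns().isEmpty() || t.rows().isEmpty();
    }

    public static void main(String[] args) {
        // first_table as declared in empty.yml: no columns, no rows
        Table fromEmptyYml = new Table(new Meta("first_table", List.of()), List.of());
        // first_table as declared in dataset_with_data.yml: one column, two rows
        Table fromDataYml = new Table(
                new Meta("first_table", List.of("name")),
                List.of(List.of("first_record"), List.of("second_record")));

        Table merged = composite(fromEmptyYml, fromDataYml);
        // The merged table has 2 rows, but inherits the empty metadata and
        // is therefore skipped by the isEmpty() guard.
        System.out.println("rows=" + merged.rows().size()
                + " skipped=" + isEmpty(merged));  // rows=2 skipped=true
    }
}
```

This is exactly the interaction the issue describes: the rows survive the merge, but the metadata does not, so the insert is silently skipped.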
The root cause here: CompositeTable
/**
* Creates a composite table that combines the specified tables.
* The metadata from the first table is used as metadata for the new table.
*/
public CompositeTable(ITable table1, ITable table2) {
_metaData = table1.getTableMetaData();
_tables = new ITable[] {table1, table2};
}
As a result, the metadata from the second table is never used. Can we merge the metadata from both tables?
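One possible direction (an illustration only, not the fix that was actually released) is to prefer whichever table's metadata actually has columns when building the composite, so an empty class-level table cannot mask a populated method-level one. Using the same simplified Meta/Table stand-ins as above:

```java
import java.util.List;

public class MergedMetadataSketch {

    // Simplified stand-ins for DBUnit's ITableMetaData / ITable (assumption
    // for illustration, not DBUnit's real classes).
    record Meta(String tableName, List<String> columns) {}
    record Table(Meta meta, List<List<Object>> rows) {}

    // Hypothetical composite: fall back to the second table's metadata when
    // the first table (e.g. from empty.yml) declares no columns. DBUnit's
    // real CompositeTable always takes table1's metadata.
    static Table compositePreferringColumns(Table t1, Table t2) {
        Meta meta = t1.meta().columns().isEmpty() ? t2.meta() : t1.meta();
        List<List<Object>> rows = new java.util.ArrayList<>(t1.rows());
        rows.addAll(t2.rows());
        return new Table(meta, rows);
    }

    // Same guard as in the skip demo above.
    static boolean isEmpty(Table t) {
        return t.meta().columns().isEmpty() || t.rows().isEmpty();
    }

    public static void main(String[] args) {
        Table empty = new Table(new Meta("first_table", List.of()), List.of());
        Table data = new Table(new Meta("first_table", List.of("name")),
                List.of(List.of("first_record"), List.of("second_record")));

        Table merged = compositePreferringColumns(empty, data);
        System.out.println("columns=" + merged.meta().columns()
                + " skipped=" + isEmpty(merged));  // columns=[name] skipped=false
    }
}
```

A fuller fix would union the column sets of both tables, but even this simple preference rule is enough to stop the empty class-level dataset from hiding the method-level rows.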
Issue Analytics
- Created 3 years ago
- Comments: 15 (7 by maintainers)
Top GitHub Comments
Thank you too :) I have never shared my experience on any blog. Maybe it's time to start. If I create a post I will share the link with you.
Really great news! Regarding the performance improvements, it is expected that useSequenceFiltering = true will slow down the tests, even more so if you use a real database with a lot of tables instead of a dedicated database for testing. On the other hand, useSequenceFiltering = true simplifies the configuration in most cases, so it's always a compromise; if you need tuning then you need to configure a bit more by using tableOrdering.
It would be nice if you could create a detailed blog post with all the gains you got with this configuration. I can create a new section on README with external resources and link the post there.
Thank you very much for all the feedback. I'll release 1.18.0 with these changes in the next few days.