Dashboard export/import doesn't work in v1.3
Getting the following error when exporting and then importing dashboards.
This Session's transaction has been rolled back due to a previous exception during flush.
To begin a new transaction with this Session, first issue Session.rollback().
Original exception was: Dataset [reporting].[public].[lookup] already exists
(Background on this error at: http://sqlalche.me/e/13/7s2a)
Expected results
Prior to the recent 1.3 update, we were able to do the following.
Environment setup and workflow:
- We have 2 environments: dev and prod.
- We do our dashboard development in the dev environment.
- Once a dashboard looks good, we deploy the changes to prod in 3 steps (see the sketch after this list):
  3.1 Export the dashboard from dev
  3.2 Import the dashboard into prod
  3.3 Replace the slug on the new dashboard so that users don't need to update their bookmarks
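For reference, here is a minimal sketch of how that dev → prod promotion could be scripted. The issue itself goes through the UI (the log below shows the legacy /superset/import_dashboards/ endpoint), so the v1 REST endpoints, hostnames, credentials, dashboard ids, and the sales-overview slug used here are assumptions for illustration only.

```python
# Illustrative sketch only -- the workflow in this issue goes through the UI.
# Hostnames, credentials, dashboard ids and the slug are hypothetical, and the
# v1 REST endpoints used here are an assumption about the deployment.
import requests

DEV = "https://superset-dev.example.com"
PROD = "https://superset-prod.example.com"


def login(base_url: str, username: str, password: str) -> requests.Session:
    """Authenticate against the DB auth provider and attach the JWT to a session."""
    s = requests.Session()
    resp = s.post(
        f"{base_url}/api/v1/security/login",
        json={"username": username, "password": password, "provider": "db", "refresh": True},
    )
    resp.raise_for_status()
    s.headers["Authorization"] = f"Bearer {resp.json()['access_token']}"
    return s


# 3.1 Export the dashboard from dev (bundle format depends on VERSIONED_EXPORT).
dev = login(DEV, "deploy_user", "***")
export = dev.get(f"{DEV}/api/v1/dashboard/export/", params={"q": "!(42)"})  # 42 = dev dashboard id
export.raise_for_status()

# 3.2 Import the exported bundle into prod.
prod = login(PROD, "deploy_user", "***")
prod.post(
    f"{PROD}/api/v1/dashboard/import/",
    files={"formData": ("dashboard_export.zip", export.content)},
    data={"overwrite": "true"},
).raise_for_status()

# 3.3 Move the stable slug onto the newly imported dashboard so bookmarks keep working.
new_dashboard_id = 123  # hypothetical: look up the id created by the import
prod.put(f"{PROD}/api/v1/dashboard/{new_dashboard_id}", json={"slug": "sales-overview"})
```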
Actual results
We get the following error: Dataset %s already exists
{"asctime": "2021-08-19 15:00:13,977", "threadName": "Dummy-10563", "request_id": "803a6e3e-0ea2-488a-846b-f6550e4ff4fd", "levelname": "ERROR", "name": "superset.views.core", "lineno": 695, "message": "Dataset [reporting].[public].[lookup] already exists", "exc_info": "Traceback (most recent call last):
File "/app/superset/views/core.py", line 680, in import_dashboards
{import_file.filename: import_file.read()}, database_id
File "/app/superset/dashboards/commands/importers/v0.py", line 353, in run
import_dashboards(db.session, content, self.database_id)
File "/app/superset/dashboards/commands/importers/v0.py", line 323, in import_dashboards
new_dataset_id = import_dataset(table, database_id, import_time=import_time)
File "/app/superset/datasets/commands/importers/v0.py", line 110, in import_dataset
database_id,
File "/app/superset/datasets/commands/importers/v0.py", line 204, in import_datasource
session.flush()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/scoping.py", line 163, in do
return getattr(self.registry(), name)(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 2536, in flush
self._flush(objects)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 2678, in _flush
transaction.rollback(_capture_exception=True)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
with_traceback=exc_tb,
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
raise exception
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 2638, in _flush
flush_context.execute()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/unitofwork.py", line 422, in execute
rec.execute(self)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/unitofwork.py", line 589, in execute
uow,
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/persistence.py", line 213, in save_obj
) in _organize_states_for_save(base_mapper, states, uowtransaction):
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/persistence.py", line 387, in _organize_states_for_save
mapper.dispatch.before_update(mapper, connection, state)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/event/attr.py", line 322, in __call__
fn(*args, **kw)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/events.py", line 719, in wrap
fn(*arg, **kw)
File "/app/superset/connectors/sqla/models.py", line 1654, in before_update
raise Exception(get_dataset_exist_error_msg(target.full_name))
Exception: Dataset [reporting].[public].[lookup] already exists"}
{"asctime": "2021-08-19 15:00:13,981", "threadName": "Dummy-10563", "request_id": "803a6e3e-0ea2-488a-846b-f6550e4ff4fd", "levelname": "ERROR", "name": "superset.views.base", "lineno": 444, "message": "This Session's transaction has been rolled back due to a previous exception during flush. To begin a new transaction with this Session, first issue Session.rollback(). Original exception was: Dataset [reporting].[public].[lookup] already exists (Background on this error at: http://sqlalche.me/e/13/7s2a)", "exc_info": "Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/usr/local/lib/python3.7/site-packages/flask_appbuilder/security/decorators.py", line 109, in wraps
return f(self, *args, **kwargs)
File "/app/superset/utils/log.py", line 241, in wrapper
value = f(*args, **kwargs)
File "/app/superset/views/core.py", line 707, in import_dashboards
databases = db.session.query(Database).all()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3373, in all
return list(self)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3535, in __iter__
return self._execute_and_instances(context)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3557, in _execute_and_instances
querycontext, self._connection_from_session, close_with_result=True
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3572, in _get_bind_args
mapper=self._bind_mapper(), clause=querycontext.statement, **kw
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3550, in _connection_from_session
conn = self.session.connection(**kw)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1141, in connection
execution_options=execution_options,
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1147, in _connection_for_bind
engine, execution_options
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 409, in _connection_for_bind
self._assert_active()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 296, in _assert_active
code="7s2a",
sqlalchemy.exc.InvalidRequestError: This Session's transaction has been rolled back due to a previous exception during flush. To begin a new transaction with this Session, first issue Session.rollback(). Original exception was: Dataset [reporting].[public].[lookup] already exists (Background on this error at: http://sqlalche.me/e/13/7s2a)"}
{"asctime": "2021-08-19 15:00:13,984", "threadName": "Dummy-10563", "request_id": "803a6e3e-0ea2-488a-846b-f6550e4ff4fd", "levelname": "INFO", "name": "maf", "lineno": 61, "message": "response for POST https://reports-uat.storyboard.nielsen.com/superset/import_dashboards/ 500 INTERNAL SERVER ERROR", "method": "POST", "url": "https://reports-uat.storyboard.nielsen.com/superset/import_dashboards/", "status": "500 INTERNAL SERVER ERROR"}
Screenshots
How to reproduce the bug
- Set the "VERSIONED_EXPORT": True feature flag (see the config sketch after this list)
- Start superset with examples
- Export an existing dashboard
- Import it back to the same instance
- See error
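For reference, the flag from the first step above is enabled in superset_config.py. A minimal sketch (the flag defaults to False in 1.3):

```python
# superset_config.py -- opt in to the new, versioned dashboard export/import format.
FEATURE_FLAGS = {
    "VERSIONED_EXPORT": True,  # default is False in Superset 1.3
}
```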
Environment
(please complete the following information):
- superset version: 1.3
- python version: 3.7.9
- node.js version: v14.15.5
- any feature flags active:
Checklist
Make sure to follow these steps before submitting your issue - thank you!
- I have checked the superset logs for python stacktraces and included it here as text if there are any.
- I have reproduced the issue with at least the latest released version of superset.
- I have checked the issue tracker for the same issue and I haven’t found one similar.
Additional context
Tracked the error down to pull request #15909, and specifically to this change.
Top GitHub Comments
We have the same issue after upgrading Superset from 1.2.0 to 1.3.0. We haven't touched the "VERSIONED_EXPORT" flag; it is still False, which is the default. This used to work in 1.2.0 without any issue. Now we end up with this error: "Dataset <xyz> already exists!"
This was a user mistake: turning off the feature flag moved the import back to the old (legacy) mechanism: https://github.com/apache/superset/pull/16569