Bulk data exports
The ability to dump data from a Keystone, via the Admin UI (probably the current view). At a technical level there are two broad approaches:
A) Pull the data into Node and produce a file for download
This gives us full control of the format (Excel file, CSV, JSON, etc.) so it's nicest for the user. We can pick columns, control ordering, and include hydrated (app-generated/virtual) values.
However, without some significant work (generating from a worker, batching items) this approach does not scale well. The prior art (KS4) uses this approach and it's known to have caused outages on large sites.
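As a rough illustration of the batching mitigation, here's a minimal sketch of a streaming CSV export route. The `fetchPage` helper, the `/export.csv` path, and the column names are all hypothetical stand-ins, not an actual Keystone API; it only shows the shape of the technique (fixed-size batches plus backpressure on the response stream).

```js
const express = require('express');

// Hypothetical data-access helper: stands in for whatever the list
// adapter exposes. Returns up to `limit` items starting at `offset`.
async function fetchPage(offset, limit) {
  // e.g. SELECT ... LIMIT/OFFSET, or a Mongo cursor with skip/limit
  return [];
}

// Quote a single CSV field per RFC 4180.
function csvField(value) {
  const s = value == null ? '' : String(value);
  return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
}

const app = express();

app.get('/export.csv', async (req, res, next) => {
  const columns = ['id', 'name', 'email']; // hypothetical column selection
  res.setHeader('Content-Type', 'text/csv');
  res.setHeader('Content-Disposition', 'attachment; filename="export.csv"');
  res.write(columns.join(',') + '\n');

  try {
    const BATCH = 1000;
    for (let offset = 0; ; offset += BATCH) {
      const items = await fetchPage(offset, BATCH);
      if (items.length === 0) break;
      for (const item of items) {
        const ok = res.write(columns.map((c) => csvField(item[c])).join(',') + '\n');
        // Respect backpressure so a slow client doesn't balloon memory.
        if (!ok) await new Promise((resolve) => res.once('drain', resolve));
      }
    }
    res.end();
  } catch (err) {
    next(err);
  }
});

app.listen(3000);
```

Batching plus backpressure keeps memory flat, but the request still occupies the Node process for the duration of the export, which is why offloading to a worker comes up as the complementary fix.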
B) Leverage DB platform functionality
Alternatively, we may want to leverage functionality of the underlying DB directly. Specifically…
- `mongodump` can be given conditions to filter the documents dumped from a collection. Issues: single collection only, not hydrated, the format is difficult to consume, and it dumps all fields.
- The pgsql `copy` command can write queries out directly to a CSV (it may also be possible to dump JSON this way). Fewer issues than `mongodump`: `copy` can query across multiple tables/views, control columns, order, format values, etc. It's also super fast; locally I can dump 80k+ records (20 MB) to a CSV in under 200 ms. A minimal sketch follows after this list.
- Etc. for other DB platforms.
These tools have a much higher capacity to scale, but at the cost of some flexibility (esp. Mongo). This is also something we'd need to work into the adapter framework.
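For the Postgres case, a sketch of what streaming `COPY ... TO STDOUT` straight into the HTTP response could look like. It assumes the `pg` and `pg-copy-streams` packages and a hypothetical `users` table; it is not wired into Keystone's adapter framework, just an illustration of the technique.

```js
const express = require('express');
const { Pool } = require('pg');
const { to: copyTo } = require('pg-copy-streams');

const pool = new Pool(); // connection details via standard PG* env vars

const app = express();

app.get('/export.csv', async (req, res, next) => {
  const client = await pool.connect();
  res.setHeader('Content-Type', 'text/csv');
  res.setHeader('Content-Disposition', 'attachment; filename="export.csv"');

  // COPY picks columns, ordering and CSV formatting in one statement;
  // Postgres streams the bytes, so Node never holds the rows in memory.
  const stream = client.query(copyTo(
    `COPY (SELECT id, name, email FROM users ORDER BY name)
     TO STDOUT WITH (FORMAT csv, HEADER)`
  ));
  stream.on('end', () => client.release());
  stream.on('error', (err) => {
    client.release();
    next(err);
  });
  stream.pipe(res);
});

app.listen(3000);
```

Since Postgres does the filtering and formatting, Node only shuttles bytes, which is consistent with the ~200 ms for 80k+ records figure above. The trade-off, as noted, is that hydrated/virtual fields never enter the picture.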
Top GitHub Comments
Adding hooks to the Admin UI opens a lot of doors and is becoming a higher priority. For this we would need extension points for adding UI elements in `admin-ui`.