Moving the data files and raw processing to a separate repository
I wanted to ask your opinion about moving the data files, including the raw data processing workflows, to a separate repository.
We maintain Corpuscles.jl (https://github.com/JuliaPhysics/Corpuscles.jl), a Julia package for accessing the data files provided by Particle, and the data update process is currently done manually.
If the data lived in a separate Git repository, we could integrate it easily and even automate the update process.
In KM3NeT we are currently discussing a similar approach for neutrino fluxes, and the general acceptance of the idea is pretty good.
Issue Analytics
- State:
- Created 2 years ago
- Comments: 6 (6 by maintainers)

No worries, Eduardo! :) I certainly don't want to trigger more work just because of a single external package. So far we simply check and update the data sources manually, and it's not a big deal at all. I will probably go ahead and automate it based on the current structure, and whenever there is time or more need for consolidation I'll refactor.

Hi @tamasgal! Apologies that I forgot about this issue of yours. I tend to agree that the data files are rather tied to the package, since we digest some and produce others, and I would prefer to keep things as they are. That said, I have had for ages (more or less literally) a start of work on consolidating a couple of the files and simplifying things a bit (largely what we massage into our CSV). I will try to push forward on that, though time is often against me.
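For the "automate it based on the current structure" idea above, one minimal approach is to periodically fetch the upstream data file and compare hashes with the local copy, triggering an update only when they differ. The sketch below assumes a hypothetical upstream URL and local filename; the actual paths used by Particle/Corpuscles.jl are not confirmed in this thread.

```python
import hashlib
import urllib.request

# Hypothetical upstream location of a Particle data file, used purely
# for illustration; the real path/filename may differ.
UPSTREAM_URL = (
    "https://raw.githubusercontent.com/scikit-hep/particle/"
    "master/src/particle/data/particle_data.csv"
)


def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()


def upstream_changed(local_path: str) -> bool:
    """True if the upstream data file's hash differs from the local copy."""
    with urllib.request.urlopen(UPSTREAM_URL) as response:
        upstream_digest = sha256_of(response.read())
    with open(local_path, "rb") as f:
        local_digest = sha256_of(f.read())
    return upstream_digest != local_digest
```

Run from a scheduled CI job (e.g. a weekly cron trigger), such a check can open an update PR only when the data actually changed, which keeps the manual workflow intact until a fuller consolidation happens.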