de431t.bsp file handling error
See original GitHub issue. (Please excuse my English.)
When I execute spicey_error.py with the de431t.bsp ephemeris data file, the error below occurs. No error occurs with the other de43*t.bsp files, so I think there is a problem with how SpiceyPy handles de431t.bsp.
The size of de431t.bsp is about 3.4 GB, so the problem may be related to large-file handling.
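A quick way to test the large-file hypothesis is to compare the kernel's size against the signed 32-bit byte-offset limit of about 2.1 GB (the limit the maintainer later confirms below). This is a minimal sketch; the helper name `kernel_fits` is made up for illustration:

```python
import os

# Largest byte offset a signed 32-bit long can hold (2**31 - 1, about 2.1 GB).
MAX_KERNEL_BYTES = 2**31 - 1

def kernel_fits(path):
    """Return True if the kernel is small enough for 32-bit file offsets."""
    return os.path.getsize(path) <= MAX_KERNEL_BYTES
```

A 3.4 GB file like de431t.bsp would fail this check, while the smaller de43*t.bsp kernels would pass it.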
Error:
Traceback (most recent call last):
  File "\myCode\jpl\spicey_error.py", line 9, in <module>
    state, lightTimes = spice.spkez(ID, et, 'J2000', 'NONE', SSB_ID)
  File "\home\anaconda3\lib\site-packages\spiceypy\spiceypy.py", line 102, in with_errcheck
    check_for_spice_error(f)
  File "\home\anaconda3\lib\site-packages\spiceypy\spiceypy.py", line 87, in check_for_spice_error
    raise stypes.SpiceyError(msg)
spiceypy.utils.support_types.SpiceyError:
================================================================================
Toolkit version: N0066
SPICE(DAFBEGGTEND) --
Beginning address (352432997) greater than ending address (352432996).
spkez_c --> SPKEZ --> SPKGEO --> SPKPVN --> SPKR02 --> DAFGDA
================================================================================
spicey_error.py:
import spiceypy as spice

# Load the meta-kernel, which in turn loads naif0012.tls and de431t.bsp
spice.furnsh("./de431t.txt")

SSB_ID = 0  # Solar System Barycenter
ID = 1      # Mercury barycenter
tdb = '2020-07-01 00:00:00 TDB'
et = spice.str2et(tdb)  # TDB time string -> ephemeris seconds past J2000

# State (position and velocity) of body ID relative to SSB_ID in the J2000
# frame, with no aberration correction
state, lightTimes = spice.spkez(ID, et, 'J2000', 'NONE', SSB_ID)
print(state, lightTimes)
de431t.txt:
\begindata
   KERNELS_TO_LOAD = (
      'naif0012.tls',
      'de431t.bsp',
   )
\begintext
Environment: Windows 10 Pro x64, conda 4.8.3, Python 3.7.6, SpiceyPy 3.0.2
Issue Analytics
- State:
- Created: 3 years ago
- Comments: 35 (23 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@pdh0710 @jdiazdelrio After communicating with the NAIF, I have learned that this is a known issue with the 64-bit Windows CSPICE library: that library is limited to kernels no bigger than 2.1 GB. This is not a bug in SpiceyPy or in the conda-forge feedstock that builds CSPICE for conda users. The underlying cause is the 32-bit long I noted earlier, and simply changing it to a 64-bit long (long long) may conflict with constraints from ANSI C. As such, I won't consider writing a patch to the C code, as it may cause unintended errors elsewhere.
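The address in the error message is consistent with this explanation: DAF files are addressed in 8-byte double-precision words, so the byte offset corresponding to address 352432997 already exceeds what a signed 32-bit long can hold. A sketch of the arithmetic (not CSPICE's actual code):

```python
# DAF addresses count 8-byte double-precision words.
address = 352432997             # the "beginning address" from the error
byte_offset = address * 8       # byte position inside de431t.bsp

INT32_MAX = 2**31 - 1           # largest value a signed 32-bit long can hold

print(byte_offset)              # 2819463976 -- about 2.8 GB into the 3.4 GB file
print(byte_offset > INT32_MAX)  # True: the offset overflows a 32-bit long
```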
I think the only solution for you @pdh0710 would be to use spkmerge to make a smaller kernel file, or to use some of the smaller files provided by SSD/NAIF. It primarily depends on how big a time range you need for your work; if you really need the full time span, then spkmerge would be my recommendation. Otherwise, a Linux or macOS system would not share the same limitation. I will leave this issue open for a day or two and then I will close it.
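For reference, an spkmerge setup file that subsets de431t.bsp to a shorter time span might look like the following. This is a sketch under stated assumptions: the output file name and the time window are placeholders, and a leapseconds kernel must be supplied.

```
LEAPSECONDS_KERNEL = naif0012.tls

SPK_KERNEL         = de431t_subset.bsp
  SOURCE_SPK_KERNEL = de431t.bsp
    BEGIN_TIME      = 2000 JAN 01 00:00:00 TDB
    END_TIME        = 2040 JAN 01 00:00:00 TDB
```

Running `spkmerge` on this setup file would produce de431t_subset.bsp; a window of a few decades should come out well under the 2.1 GB limit.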
Yes. Here’s the output:
Right now it downloads at 350 Mbps.