Failure to index table with iloc[] after dumping/loading the table to a pickle file
EDIT: Temporary Workaround
See https://github.com/astropy/astropy/issues/11332#issuecomment-781561392
Description
I am saving and loading a Table object to a pickle file, but after loading, the table cannot be indexed using the iloc method. I found that the problem is that after loading, the table no longer has the Table.primary_key attribute set.
Steps to Reproduce
from astropy.table import Table
import pickle
t = Table([(1, 2, 3, 4), (10, 1, 9, 9)], names=('a', 'b'), dtype=['i8', 'i8'])
t.add_index('a')
print(t.iloc[2]) #--> [3, 9]
with open("t_test.pkl" , "wb") as f:
pickle.dump(t, f, protocol=0)
t_ = pickle.load(open("t_test.pkl", "rb"))
print(t_.iloc[2])
leads to the following error:
-----------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-173-7e3638d00d00> in <module>
11 t_ = pickle.load(open("t_test.pkl", "rb"))
12
---> 13 print(t_.iloc[2])
~/.pyenv/versions/adap/lib/python3.8/site-packages/astropy/table/index.py in __getitem__(self, item)
953 else:
954 key = self.table.primary_key
--> 955 index = self.indices[key]
956 rows = index.sorted_data()[item]
957 table_slice = self.table[rows]
~/.pyenv/versions/adap/lib/python3.8/site-packages/astropy/table/index.py in __getitem__(self, item)
809 raise IndexError(f"No index found for {item}")
810
--> 811 return super().__getitem__(item)
812
813
TypeError: list indices must be integers or slices, not NoneType
and here is the result of accessing the primary_key attribute of the original and the post-pickle table:
t.primary_key, t_.primary_key
# (('a',), None)
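Given that, a temporary workaround is to re-establish the indexing state on the loaded table by hand before using iloc. The following is only a sketch based on the discussion in this thread (the workaround in the linked comment above may differ in detail), assuming the table was indexed on column 'a' as in the reproduction:
# Manual workaround sketch, assuming the index was on column 'a':
if not t_.indices:            # the index itself was not restored -> recreate it
    t_.add_index('a')
if t_.primary_key is None:    # only the primary key was lost -> set it by hand
    t_.primary_key = ('a',)
print(t_.iloc[2])             # should work again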
System Details
macOS-10.15.7-x86_64-i386-64bit
Python 3.8.6 (default, Jan 5 2021, 15:15:33) [Clang 12.0.0 (clang-1200.0.32.28)]
Numpy 1.19.5
astropy 4.2
Scipy 1.6.0
Matplotlib 3.3.3
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@pllim - I must have made a mistake in my test because I am reproducing the failure you reported. The oddity is that it only fails if t has had an index created. Based on my recollection of how pickling works for tables this makes no sense, but obviously my recollection is not serving me well. 😄 Here is my minimum failing example:
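(The minimum failing example itself is not preserved in this mirror; the following is only an illustrative sketch of the contrast described, not the original code.)
import pickle
from astropy.table import Table

plain = Table({'a': [1, 2, 3]})
plain2 = pickle.loads(pickle.dumps(plain))
print(plain2['a'][1])        # a table without an index round-trips fine

indexed = Table({'a': [1, 2, 3]})
indexed.add_index('a')
indexed2 = pickle.loads(pickle.dumps(indexed))
print(indexed2.primary_key)  # None on affected versions -> indexed2.iloc raises the TypeError above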
I'm not sure if this has something to do with it, but this problem also happens when you create a new TimeSeries out of an old one, like so:
ts2 = ts["time", "Test"]
ts2.iloc[:]
This leads to the same error. I have figured out that for some reason doing this doesn't assign a new primary key, so if you set that by hand things work, but before that it's problematic. Hope this helps; if it's not helpful to this conversation then I apologize, it just seemed like a closely related problem.
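A sketch of that TimeSeries scenario with the manual fix described in the comment (the column name 'Test' and the explicit add_index guard are assumptions; exact behavior may vary by astropy version):
import astropy.units as u
from astropy.timeseries import TimeSeries

ts = TimeSeries(time_start='2021-01-01T00:00:00', time_delta=1 * u.s, n_samples=4)
ts['Test'] = [1.0, 2.0, 3.0, 4.0]    # hypothetical data column
if not ts.indices:
    ts.add_index('time')             # make sure an index exists so iloc is usable

ts2 = ts['time', 'Test']             # column selection creates a new TimeSeries
print(ts2.primary_key)               # reportedly None, which is what breaks ts2.iloc[:]

ts2.primary_key = ('time',)          # setting the primary key by hand, as described above
print(ts2.iloc[:])                   # per the comment, this then works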