Overflow Error When Attempting to Save Large Amounts of Data #41

Open
@0Maximus0

I have been using deepdish to save dictionaries containing large amounts of data, and I ran into the following error when attempting to save a particularly large file. I have tried saving the data both with and without compression, and the error occurs either way. Can you help me out with this?

File "C:/Users/xxxxxxxx/Documents/Python_Scripts/Data_Scripts/Finalized_Data_Review_Presentations/data_save_cc_test.py", line 513, in
dd.io.save('%s/Data/%s_%s_cc_data.h5'%(directory,m_list[m],list_type),cc_data,('blosc', 9))

File "C:\Users\xxxxxxxx\AppData\Local\Continuum\anaconda2\lib\site-packages\deepdish\io\hdf5io.py", line 596, in save
filters=filters, idtable=idtable)

File "C:\Users\xxxxxxxx\AppData\Local\Continuum\anaconda2\lib\site-packages\deepdish\io\hdf5io.py", line 304, in _save_level
_save_pickled(handler, group, level, name=name)

File "C:\Users\xxxxxxxx\AppData\Local\Continuum\anaconda2\lib\site-packages\deepdish\io\hdf5io.py", line 172, in _save_pickled
node.append(level)

File "C:\Users\xxxxxxxx\AppData\Local\Continuum\anaconda2\lib\site-packages\tables\vlarray.py", line 547, in append
self._append(nparr, nobjects)

File "tables/hdf5extension.pyx", line 2032, in tables.hdf5extension.VLArray._append

OverflowError: Python int too large to convert to C long
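Editor's note: the traceback shows the data falling back to deepdish's pickle path (`_save_pickled`), which appends the entire pickled blob to a PyTables `VLArray`. The byte count is then converted to a C `long`, which is only 32 bits on Windows, so a single pickled object larger than roughly 2 GiB overflows. A minimal workaround sketch, under the assumption that `cc_data` is a dict whose individual values each pickle to well under 2 GiB; the helper functions below are hypothetical illustrations, not part of deepdish:

```python
import deepdish as dd

def save_in_pieces(path_prefix, data):
    """Save each top-level entry of `data` to its own HDF5 file,
    so no single pickled object hits the ~2 GiB C-long limit that
    VLArray.append runs into on Windows."""
    for key, value in data.items():
        dd.io.save('%s_%s.h5' % (path_prefix, key), value, ('blosc', 9))

def load_pieces(path_prefix, keys):
    """Reassemble the dictionary from the per-key files."""
    return {key: dd.io.load('%s_%s.h5' % (path_prefix, key))
            for key in keys}
```

Alternatively, converting the large values to NumPy arrays before saving may avoid the pickle fallback entirely, since deepdish stores arrays as native HDF5 datasets rather than pickled blobs.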
