When does a MemoryError occur?
I've been working with netCDF files for a while as part of my research. I once tried to import a file of around 52 GB containing the data for a climate parameter, which I needed to store in a list. I was working in Jupyter with Python 2, and the import failed with a MemoryError. When does this happen? What is the limit?
2 Answers
The limit is related to the memory available to your Python process, which is not necessarily the total amount of RAM on your machine.
To handle a large file, don't read it all at once; read it chunk by chunk (for a text file, that might mean line by line).
[edit]
Obviously, between reads you need to free the memory used by the previous chunk.
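For a netCDF file like the one in the question, a minimal sketch of chunked reading with the netCDF4 library might look like this (the file name, variable name, and chunk size are assumptions for illustration):

```python
from netCDF4 import Dataset

CHUNK = 1000  # number of records to read per iteration (assumed value)

# Opening the dataset does not load the data; it only reads the metadata.
ds = Dataset("climate_data.nc")        # hypothetical file name
var = ds.variables["temperature"]      # hypothetical variable name

total = var.shape[0]
running_sum = 0.0

for start in range(0, total, CHUNK):
    # Slicing the variable reads only this chunk from disk.
    chunk = var[start:start + CHUNK]
    running_sum += chunk.sum()
    # 'chunk' is rebound on the next iteration, so the previous
    # array becomes garbage-collectable and its memory is freed.

ds.close()
print("sum over all records:", running_sum)
```

The key point is that only one chunk lives in memory at a time, instead of a Python list holding all 52 GB at once.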
Hello nav!
I ran into the same problem.
Could you please let me know how you dealt with it?