On Linux I’m using the resource module:
import resource
resource.setrlimit(resource.RLIMIT_AS, (megs * 1024 * 1024, resource.RLIM_INFINITY))

This caps the process's total virtual address space at `megs` megabytes; once the soft limit is hit, allocations fail and Python raises MemoryError instead of the OOM killer stepping in. (The original `1048576L` and `-1L` were Python 2 long literals; in Python 3 plain ints work, and `resource.RLIM_INFINITY` is clearer than `-1` for "no hard limit".)
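Here is a minimal self-contained sketch of the same idea, assuming Linux; `limit_address_space` is a hypothetical helper name, and it clamps the requested cap to the existing hard limit so `setrlimit` cannot fail with ValueError in a container that already imposes one:

```python
import resource

def limit_address_space(megs):
    # Cap this process's virtual address space (Linux only).
    # The hard limit is preserved, so a privileged caller could raise it later.
    soft, hard = resource.getrlimit(resource.RLIMIT_AS)
    new_soft = megs * 1024 * 1024
    if hard != resource.RLIM_INFINITY:
        new_soft = min(new_soft, hard)  # the soft limit cannot exceed the hard limit
    resource.setrlimit(resource.RLIMIT_AS, (new_soft, hard))
    return new_soft

# Cap at 4 GiB; an allocation that would push the process past the cap
# then raises MemoryError rather than exhausting the machine.
cap = limit_address_space(4096)
soft, _ = resource.getrlimit(resource.RLIMIT_AS)
print(soft == cap)
```

Note that RLIMIT_AS limits address space, not resident memory, so memory-mapped files and reserved-but-untouched pages all count against it.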