Go 1.3 Garbage collector not releasing server memory back to system

First, note that Go itself doesn't always shrink its own memory space: https://groups.google.com/forum/#!topic/Golang-Nuts/vfmd6zaRQVs The heap is freed (you can check this using runtime.ReadMemStats()), but the process's virtual address space does not shrink; that is, your program will not return memory to the operating system. On Unix-based platforms we use a system call to tell …
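A minimal sketch of what the answer describes: the heap shrinks after a GC (observable via runtime.ReadMemStats), and debug.FreeOSMemory can be used to hint the runtime to return freed pages to the OS. The 100 MiB allocation size here is just an illustration.

```go
package main

import (
	"fmt"
	"runtime"
	"runtime/debug"
)

// heapAfterRelease allocates a large slice, drops it, forces a GC,
// and asks the runtime to return the freed pages to the OS.
// It returns the live heap size in bytes afterwards.
func heapAfterRelease() uint64 {
	b := make([]byte, 100<<20) // ~100 MiB
	b[0] = 1                   // touch it so the allocation is real
	b = nil                    // drop the only reference

	runtime.GC()         // collect the slice; HeapAlloc drops
	debug.FreeOSMemory() // hint the OS to reclaim the freed pages

	var m runtime.MemStats
	runtime.ReadMemStats(&m)
	return m.HeapAlloc // small again: the heap was freed internally
}

func main() {
	fmt.Printf("live heap after GC: %d KiB\n", heapAfterRelease()>>10)
}
```

Note that even after FreeOSMemory the virtual address space reported by the OS may stay large; it is the resident set that shrinks.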

How to determine needed memory of Keras model?

I created a complete function based on the answer of Fabrício Pereira:

```python
def get_model_memory_usage(batch_size, model):
    import numpy as np
    try:
        from keras import backend as K
    except ImportError:
        from tensorflow.keras import backend as K

    shapes_mem_count = 0
    internal_model_mem_count = 0
    for l in model.layers:
        layer_type = l.__class__.__name__
        if layer_type == 'Model':
            internal_model_mem_count += get_model_memory_usage(batch_size, l)
        single_layer_mem …
```

How much faster is the memory usually than the disk?

I'm surprised: Figure 3 in the middle of this article, The Pathologies of Big Data, says that memory is only about 6 times faster than disk for sequential access (350 Mvalues/sec for memory versus 58 Mvalues/sec for disk), but about 100,000 times faster for random access.
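A quick sanity check of the sequential ratio, using the two throughput numbers quoted from the article's Figure 3:

```go
package main

import "fmt"

func main() {
	// Sequential throughput figures quoted from "The Pathologies of Big Data".
	memory := 350.0 // Mvalues/sec
	disk := 58.0    // Mvalues/sec
	fmt.Printf("sequential speedup: %.1fx\n", memory/disk) // about 6x
}
```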

iOS6 MKMapView using a ton of memory, to the point of crashing the app, anyone else notice this?

After a lot of playing around and testing different ideas, some of which were mentioned here, the final solution that worked for me was as follows. Instead of creating new MKMapViews as needed in the app, I added an mkMapView property to my AppDelegate and created it only when needed. Once it has been created, …

How to set Apache Spark Executor memory

Since you are running Spark in local mode, setting spark.executor.memory won't have any effect, as you have noticed. The reason is that the Worker "lives" within the driver JVM process that you start when you launch spark-shell, and the default memory for that process is 512M. You can increase that by setting spark.driver.memory …
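As a sketch of the fix the answer points toward (the value 4g here is only an illustration, not a recommendation), the driver memory can be raised when launching spark-shell:

```shell
# In local mode the Worker lives inside the driver JVM,
# so raise the driver's memory rather than the executor's.
spark-shell --driver-memory 4g

# Equivalent form using a configuration property:
spark-shell --conf spark.driver.memory=4g
```

Note that spark.driver.memory must be set at launch time; setting it from inside an already-running session has no effect, because the JVM heap is fixed at startup.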