Difference between local allocatable and automatic arrays

For the sake of clarity, I’ll briefly mention terminology. The two arrays are both local variables and arrays of rank 1: alloc_array is an allocatable array; automatic_array is an explicit-shape automatic object. Being local variables, their scope is that of the procedure. Automatic arrays and unsaved allocatable arrays come to an end when execution of … Read more

TensorFlow: Memory leak even while closing Session?

TL;DR: Closing a session does not free the tf.Graph data structure in your Python program, and if each iteration of the loop adds nodes to the graph, you’ll have a leak. Since your function feedForwardStep creates new TensorFlow operations and you call it within the for loop, there is a leak in your code, albeit … Read more

Why does the stack address grow towards decreasing memory addresses?

First, it’s platform dependent. On some architectures the stack is allocated from the bottom of the address space and grows upwards. Assuming an architecture like x86, where the stack grows downwards from the top of the address space, the idea is pretty simple:

===============  Highest Address (e.g. 0xFFFF)
|             |
|    STACK    |
|             |
|-------------|  <- Stack … Read more