TensorFlow: Memory leak even when closing Session?

TL;DR: Closing a session does not free the tf.Graph data structure in your Python program, and if each iteration of the loop adds nodes to the graph, you’ll have a leak.

Since your function feedForwardStep creates new TensorFlow operations, and you call it inside the for loop, there is a leak in your code, albeit a subtle one.

Unless you specify otherwise (using a with tf.Graph().as_default(): block), all TensorFlow operations are added to a global default graph. This means that every call to tf.constant(), tf.matmul(), tf.Variable(), etc. adds nodes to this global data structure, which is never freed as long as the process holds a reference to it. There are two ways to avoid this:
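You can observe the growth directly by counting the operations in the default graph after each iteration. This is a minimal TF 1.x-style sketch (using xrange to match your snippet); the op count increases on every pass, even though no Session is ever opened:

```python
import tensorflow as tf

for step in xrange(3):
    # Each tf.constant() call silently adds a node to the default graph.
    c = tf.constant(step)
    # The number of ops grows by one per iteration and is never reclaimed.
    print(len(tf.get_default_graph().get_operations()))
```

Closing (or never opening) a session has no effect on this count, which is exactly the leak described above.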

  1. Structure your program so that you build the graph once, then use tf.placeholder() ops to feed in different values in each iteration. You mention in your question that this might not be possible.
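If restructuring is possible, the pattern looks like this. A hedged sketch in TF 1.x style, with hypothetical names (x, w, y) and shapes standing in for whatever feedForwardStep computes:

```python
import numpy as np
import tensorflow as tf

# Build the graph exactly once, before the loop.
x = tf.placeholder(tf.float32, shape=[None, 10])  # input fed each iteration
w = tf.Variable(tf.truncated_normal([10, 5]))     # parameters created once
y = tf.matmul(x, w)                               # the same ops are reused

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in xrange(200):
        # Only the fed value changes; no new nodes are added to the graph,
        # so memory usage stays flat across iterations.
        result = sess.run(y, feed_dict={x: np.random.rand(32, 10)})
```

This is the idiomatic (and fastest) option, since TensorFlow can optimize and cache the single graph.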

  2. Explicitly create a new graph in each for loop. This might be necessary if the structure of the graph depends on the data available in the current iteration. You would do this as follows:

    for step in xrange(200):
        with tf.Graph().as_default(), tf.Session() as sess:
            # Remainder of loop body goes here.
    

    Note that in this version, you cannot use Tensor or Operation objects from a previous iteration. (For example, it’s not clear from your code snippet where test comes from.)
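Filling in the loop body, the second pattern might look like the following sketch. The constant-doubling ops are a stand-in for whatever your feedForwardStep builds; the point is that everything is recreated inside the with block:

```python
import tensorflow as tf

for step in xrange(200):
    with tf.Graph().as_default(), tf.Session() as sess:
        # All ops must be (re)created here, e.g. by calling your
        # feedForwardStep inside this block. A trivial stand-in:
        c = tf.constant(step, dtype=tf.float32)
        doubled = tf.multiply(c, 2.0)
        result = sess.run(doubled)
    # When the `with` block exits, nothing references the graph any
    # longer, so Python can garbage-collect it and memory stays bounded.
```

Rebuilding the graph every iteration costs time, so prefer option 1 when the graph structure allows it.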
