PyTorch ValueError: optimizer got an empty parameter list

Your NetActor does not directly store any nn.Parameter. Moreover, all the other layers it eventually uses in forward are stored as a plain Python list in self.nn_layers. If you want self.actor_nn.parameters() to know that the items stored in the list self.nn_layers may contain trainable parameters, you should work with containers. Specifically, make self.nn_layers an nn.ModuleList … Read more
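A minimal sketch of the fix, assuming NetActor is a simple feed-forward stack (the layer sizes and forward logic here are illustrative, not the asker's actual code). Swapping the plain list for nn.ModuleList is what makes the sub-modules visible to .parameters():

```python
import torch
import torch.nn as nn

class NetActor(nn.Module):
    def __init__(self, sizes):
        super().__init__()
        # A plain Python list would hide these layers from .parameters();
        # nn.ModuleList registers each one as a sub-module.
        self.nn_layers = nn.ModuleList(
            nn.Linear(n_in, n_out) for n_in, n_out in zip(sizes, sizes[1:])
        )

    def forward(self, x):
        for layer in self.nn_layers:
            x = torch.relu(layer(x))
        return x

actor = NetActor([4, 8, 2])
# Two Linear layers, each with a weight and a bias: 4 parameter tensors.
print(len(list(actor.parameters())))
# The optimizer now sees a non-empty parameter list.
optimizer = torch.optim.SGD(actor.parameters(), lr=0.1)
```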

Tensorflow and Multiprocessing: Passing Sessions

You can’t use Python multiprocessing to pass a TensorFlow Session into a multiprocessing.Pool in the straightforward way, because the Session object can’t be pickled (it’s fundamentally not serializable, since it may manage GPU memory and similar process-local state). I’d suggest parallelizing the code using actors, which are essentially the parallel computing analog of “objects” and … Read more
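The actor pattern can be sketched with the standard library alone: a long-lived worker process creates the unpicklable resource locally and callers talk to it through queues. The SessionActor class and the dict standing in for the session are hypothetical scaffolding; with TensorFlow 1.x the worker would create tf.Session() at the marked spot instead:

```python
import multiprocessing as mp

def session_worker(inbox, outbox):
    # Create the unpicklable resource *inside* the worker process
    # instead of passing it across process boundaries. With TensorFlow
    # this is where tf.Session() would be created; a dict stands in
    # here so the sketch runs without TF installed.
    session = {"total": 0}
    while True:
        msg = inbox.get()
        if msg is None:  # shutdown sentinel
            break
        session["total"] += msg
        outbox.put(session["total"])

class SessionActor:
    """Actor-style wrapper: one process owns the session; callers
    interact with it only through message queues."""
    def __init__(self):
        self.inbox = mp.Queue()
        self.outbox = mp.Queue()
        self.proc = mp.Process(target=session_worker,
                               args=(self.inbox, self.outbox))
        self.proc.start()

    def run_step(self, x):
        self.inbox.put(x)
        return self.outbox.get()

    def close(self):
        self.inbox.put(None)
        self.proc.join()

if __name__ == "__main__":
    actor = SessionActor()
    print(actor.run_step(3))  # 3
    print(actor.run_step(4))  # 7 (state persists inside the worker)
    actor.close()
```

Frameworks like Ray provide this pattern directly (an @ray.remote class becomes an actor), but the core idea is the same: the session never crosses a process boundary, only messages do.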