Your `NetActor` does not directly store any `nn.Parameter`. Moreover, all other layers it eventually uses in `forward` are stored as a plain Python list in `self.nn_layers`.
If you want `self.actor_nn.parameters()` to know that the items stored in the list `self.nn_layers` may contain trainable parameters, you should work with containers. Specifically, making `self.nn_layers` an `nn.ModuleList` instead of a plain list should solve your problem:

```python
self.nn_layers = nn.ModuleList()
```
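To illustrate, here is a minimal sketch of a module like your `NetActor` (the layer sizes and architecture are assumptions, since your full class isn't shown). Because the layers are appended to an `nn.ModuleList`, they are registered as submodules, so their parameters show up in `parameters()`:

```python
import torch
import torch.nn as nn

class NetActor(nn.Module):
    def __init__(self, sizes=(4, 16, 2)):  # layer sizes are illustrative
        super().__init__()
        # nn.ModuleList registers each appended layer as a submodule,
        # so its parameters are visible to self.parameters()
        self.nn_layers = nn.ModuleList()
        for in_f, out_f in zip(sizes[:-1], sizes[1:]):
            self.nn_layers.append(nn.Linear(in_f, out_f))

    def forward(self, x):
        for layer in self.nn_layers[:-1]:
            x = torch.relu(layer(x))
        return self.nn_layers[-1](x)

actor = NetActor()
# With a plain list, this would print 0; with nn.ModuleList the
# Linear layers' weights and biases are found.
print(sum(p.numel() for p in actor.parameters()))  # → 114
```

Had `self.nn_layers` remained a plain `list`, `parameters()` would return an empty iterator and the optimizer would have nothing to train.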