Alternative way to freeze layer parameters in Flux.jl


I am trying to train a generative model for MNIST. To speed up training, I plan to take the latent-space layers of an already pretrained discriminator and incorporate them into my model, which should let me train in a lower-dimensional space. My question is about syntax: I know it is possible to train only certain parameters with Flux.params, but all my models use the withgradient(neural_network) do ... end style instead. Is there a way to specify directly, when building a neural network, which layers should be frozen? A minimal sketch of my setup follows.
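Here is a rough sketch of what I mean. The layer names, sizes, and the MSE loss are placeholders for illustration, not my actual model:

```julia
using Flux  # assuming Flux >= 0.14 with explicit-style gradients

# Placeholder layers: `encoder` stands for the latent-space layers taken
# from the pretrained discriminator, `decoder` for the new generative part.
model = Chain(
    encoder = Chain(Dense(784 => 128, relu), Dense(128 => 32)),  # pretrained, want frozen
    decoder = Chain(Dense(32 => 128, relu), Dense(128 => 784)),  # new, to be trained
)

opt_state = Flux.setup(Adam(0.001), model)

x = rand(Float32, 784, 64)  # stand-in for a batch of flattened MNIST images

# Training step with the do-block syntax mentioned above:
loss, grads = Flux.withgradient(model) do m
    Flux.mse(m(x), x)
end
Flux.update!(opt_state, model, grads[1])
```

In this sketch, I would want update! to leave the parameters of model.layers.encoder untouched while still training the decoder, ideally by marking the frozen layers at the point where the model is built.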
