Suppose I have observed data $y_i \sim N(0, 1/w(x_i))$, where $x_i = i$ for $i = 1, \dots, 100$, so the variance of each observation is $1/w(x_i)$. I define the true function $w(x_i) = (x_i - 50)^2$ and want to recover it with a Flux neural network; denote the fitted network by $\tilde{w}(x_i)$.
Below is my current code. Training throws a DomainError, which I believe happens because nothing stops the network output $\tilde{w}(x_i)$ from going negative, which makes the distribution improper (the variance, and hence the scale passed to Normal, must be positive). Is there a way to impose a constraint on the Flux model so that $\tilde{w}(x_i) > 0$?
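For concreteness, something like the following is what I have in mind (forcing positivity through an output transform), though I'm not sure it's the idiomatic Flux way; the softplus and exp choices below are just my guesses:

using Flux
## one idea: use a positive output activation, e.g. softplus(x) = log(1 + exp(x)) > 0
constrained_model = Dense(1 => 1, softplus)
## or keep the layer unconstrained and exponentiate its output
constrained_model2 = Chain(Dense(1 => 1), x -> exp.(x))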
My code:
using Flux
using Flux: train!
using LinearAlgebra, Random, Statistics, Distributions, SparseArrays, StatsBase # stats
using Plots # for plotting
## data simulation
T = 100
times = hcat(1:T...)                  # 1×T input matrix (Flux expects features × observations)
w = [(times[t] - 50)^2 for t in 1:T]  # true function w(x) = (x - 50)^2
Sig = Diagonal(1 ./ w)                # diagonal covariance with variances 1/w(x_i) (note w(50) = 0, so that entry is Inf)
mu = zeros(T)
y = rand(MvNormal(mu, Sig), 1)
plot(y)
## specify NN
model = Dense(1 => 1)   # single linear layer; its output is unconstrained, so it can be negative
## specify loss function as negative log-likelihood (using univariate specification to make things easier)
loss(model, x, y) = -sum([logpdf(Normal(0, 1 / model(x)[i]), y[i]) for i in 1:T])
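## (note: Normal's second argument is the standard deviation, not the variance, so with
##  variance 1/w the scale would strictly be sqrt(1 / model(x)[i]); either way it must be positive)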
## initial loss
loss(model, times, y)
## optimiser (plain gradient descent)
opt = Descent()
## structure data
data = [(times, y)]
## train it, baby
for epoch in 1:200
train!(loss, model, data, opt)
end
## RESULTING ERROR:
ERROR: DomainError with -0.8109541:
Normal: the condition σ >= zero(σ) is not satisfied.
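For what it's worth, the negative value reported in the error reproduces the failure on its own, which is why I believe the unconstrained network output is the culprit:

using Distributions
## the offending value from the error above, passed as the scale of a Normal,
## trips the same check in Distributions.jl
Normal(0, -0.8109541)   # DomainError: σ >= zero(σ) is not satisfied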