Replies: 4 comments
-
(I realise that I might need to create a component array from my vector; if that is the case, I don't mind that being required.)
-
For the layer interface, check https://lux.csail.mit.edu/stable/manual/interface#Layer-Interface.
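For reference, here is a minimal sketch of that layer interface (an illustrative example, not from this thread; the Scale layer and its field names are hypothetical, and the names follow the current LuxCore API):

```julia
using LuxCore, Random

# A hypothetical layer that multiplies its input element-wise by learned scales.
struct Scale <: LuxCore.AbstractLuxLayer
    dims::Int
end

# Parameters are declared as a NamedTuple; Lux.setup collects these across all
# layers, which is why `ps` comes out as a layered named tuple.
LuxCore.initialparameters(rng::AbstractRNG, l::Scale) = (scale = rand(rng, Float32, l.dims),)
LuxCore.initialstates(::AbstractRNG, ::Scale) = NamedTuple()

# A layer is called as (input, parameters, state) and returns (output, state).
(l::Scale)(x, ps, st) = (ps.scale .* x, st)
```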
-
Thanks a lot, I think I am almost there now. Using

```julia
model = Lux.Chain(
    Lux.Dense(1 => 4, Lux.softplus),
    Lux.Dense(4 => 4, Lux.softplus),
    Lux.Dense(4 => 1, Lux.softplus)
)
rng = Random.default_rng()
ps, st = Lux.setup(rng, model)
```

I get a good starting point. Checking, my parameters are in layered named tuples, where e.g. ps.layer_1.weight is a 4-valued vector (I presume with all the weights of the layer's 4 nodes). Is there a direct interface to construct such a layered named tuple from a normal vector? The reason that I want to work with normal vectors is that that is what …
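(A sketch of one way to do that round trip with ComponentArrays; the variable names here are made up, but getdata and getaxes are part of the ComponentArrays API:)

```julia
using ComponentArrays

ps_ca = ComponentArray(ps)    # layered named tuple -> array backed by a flat vector
flat = getdata(ps_ca)         # the underlying plain Vector of parameter values
ax = getaxes(ps_ca)           # the layout (layer_1.weight, layer_1.bias, ...)

ps_rebuilt = ComponentArray(flat, ax)  # plain vector -> layered structure again
ps_rebuilt.layer_1.weight              # indexes just like ps.layer_1.weight
```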
-
Ok, I think I have figured out how to do it, thanks a lot for the help. Optimization can work over ComponentArrays:

```julia
# Fetch packages.
using Optimization, Lux, LuxCore, Zygote, Plots, Random, ComponentArrays
import OptimizationOptimisers: Adam

# Create training data and the true function (which we want the neural network to approximate).
f(x) = 0.5 * (x^3) / (x^3 + 0.7^3)
xs = collect(0.0:0.1:4.0)
ys = [(0.9 + 0.2rand()) * f(x) for x in xs]

# Create the model.
model = Lux.Chain(
    Lux.Dense(1 => 4, Lux.softplus),
    Lux.Dense(4 => 4, Lux.softplus),
    Lux.Dense(4 => 1, Lux.softplus)
)
ps, st = Lux.setup(Random.default_rng(), model)

# Check input/output.
ps_ca = ComponentArray(ps)
input = reshape(xs, 1, length(xs))
target = reshape(ys, 1, length(xs))
ypred = LuxCore.stateless_apply(model, input, ps)

# Add a non-neural-network field that we want to optimise in parallel.
ps_ca_expanded = ComponentArray(ps_ca; new = 10.0)

# Define the loss function and run the optimization.
function loss(ps, (input, target))
    ypred = LuxCore.stateless_apply(model, input, ps)
    err = sum(abs2, ypred .- target)
    return err + abs(ps.new)
end
optf = OptimizationFunction(loss, AutoZygote())
prob = OptimizationProblem(optf, ps_ca_expanded, (input, target))
@time res = Optimization.solve(prob, Adam(0.001); maxiters = 100000)

# Check the output.
plot(f, xs[1], xs[end])
plot!(xs, ys; seriestype = :scatter)
ypred = LuxCore.stateless_apply(model, input, res.u)
plot!(xs, ypred[1, :]; label = "Predicted", color = :red)
res.u.new # should be close to `0.0`.
```
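(One optional addition, in case it is useful: Optimization.solve accepts a callback keyword, which makes it easier to watch the loss over those 100000 iterations. A sketch, assuming a recent Optimization.jl where the callback receives an OptimizationState; the printing interval is an arbitrary choice:)

```julia
# Print the loss every 10000 iterations; returning true would halt the solve early.
callback = function (state, l)
    state.iter % 10000 == 0 && println("Iteration $(state.iter): loss = $(l)")
    return false
end
res = Optimization.solve(prob, Adam(0.001); maxiters = 100000, callback)
```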
-
Say that I have a Lux model. Two questions:
How do I get the output of my neural network for these parameter values?
I have spent the last couple of days trying to get something like this to work, but I really have not been able to. I have been trying to make a basic example where I can train a model using Optimization (where I have a neural network and some other things I want to optimize), and then build from there. Unfortunately, I cannot get this to work. Doc pages I have been reading include:
https://lux.csail.mit.edu/stable/introduction/
https://lux.csail.mit.edu/stable/tutorials/beginner/1_Basics
https://docs.sciml.ai/Optimization/stable/optimization_packages/optimization/#Train-NN-with-Sophia
https://lux.csail.mit.edu/stable/tutorials/beginner/5_OptimizationIntegration
https://lux.csail.mit.edu/stable/tutorials/beginner/2_PolynomialFitting
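Regarding the first question, the pattern that ended up working in the comments above boils down to this sketch (model, ps, and st are assumed to come from Lux.setup, and x is an input batch):

```julia
y, st_new = Lux.apply(model, x, ps, st)    # stateful call; returns output and updated state
y = LuxCore.stateless_apply(model, x, ps)  # stateless shortcut when the model carries no state
```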