Reversed loss update direction for CS-FMU and unchanged loss for ME-FMU with FMUParameterRegistrator #142
Thanks for the issue, this is very interesting!
Hi @ThummeTo, I suspect that the first issue does not only happen when using … It seems to easily get stuck at a local optimum; the result is shown below. The plot comparison shows that the FMU output during the training process is linear, which is weird, I think. NOTE: the FMU used in the results above was exported by OpenModelica v1.23.0 (64-bit), which may be the main reason, but I am not sure why.
Can you compare the gradients?
There are some tests in FMISensitivity.jl that you can check out: FMISensitivity tests (here, we compare many gradients to ground-truth gradients).
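A minimal way to do such a comparison is to evaluate the same loss gradient with two independent AD backends; this is only a sketch, where `lossSum` and the flat parameter vector `p` are stand-ins for the ones from the training setup below:

```julia
using ForwardDiff, ReverseDiff

# Evaluate the gradient of the same loss with two independent AD backends.
# `lossSum` and `p` are stand-ins for the loss closure and flat parameter
# vector used during training (see the setup sketched below).
grad_fwd = ForwardDiff.gradient(lossSum, p)   # forward mode, as ground truth
grad_rev = ReverseDiff.gradient(lossSum, p)   # reverse mode, as used in training

# A large discrepancy here points at a broken sensitivity path:
@info "max. abs. gradient deviation: $(maximum(abs.(grad_fwd .- grad_rev)))"
```

If the two disagree, the FMISensitivity tests linked above are the right place to narrow down which sensitivity rule breaks.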
Currently, I am trying to use `FMUParameterRegistrator` to do parameter calibration/estimation, but I encounter some issues. I ran experiments on both ME-type and CS-type FMUs (a sketch of the full setup follows the experiment list below):
**ME-type FMU**

- Model: SpringPendulum1D.mo (from FMIZoo.jl)
- Exporting tool: Dymola (I directly use the FMU from FMIZoo.jl)
- Result: good, the parameter can be tuned correctly after training.
- Model: SpringPendulum1D.mo (from FMIZoo.jl)
- Exporting tool: OpenModelica v1.23.0 (64-bit)
- Result: the loss does not change during training, so the parameter is not tuned correctly (the loss function I use is sketched below).
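The loss has the shape used throughout the FMIFlux.jl examples; a minimal sketch, where the reference arrays `posData`/`velData`, the state indices, and the MSE choice are assumed details:

```julia
using FMI, FMIFlux

# Sketch following the FMIFlux.jl example style; posData/velData are the
# reference trajectories on the time grid tSave (assumed names).
function lossSum(p)
    solution = neuralFMU(x₀; parameters=params, p=p, showProgress=true, saveat=tSave)

    posNet = fmiGetSolutionState(solution, 1; isIndex=true)  # position over time
    velNet = fmiGetSolutionState(solution, 2; isIndex=true)  # velocity over time

    # mean squared error against the reference trajectories
    return FMIFlux.Losses.mse(posNet, posData) + FMIFlux.Losses.mse(velNet, velData)
end
```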
Actually, I found that this issue comes from wrong return values of `neuralFMU(x₀; parameters=params, p=p, showProgress=true, saveat=tSave)` inside `lossSum`: `posNet` is an array holding the same value at every time step, and `velNet` is an array of linearly increasing values, e.g. `posNet = [0.5, 0.5, 0.5, ..., 0.5, 0.5, 0.5]` and `velNet = [0.0, 0.1, 0.2, 0.3, ..., 40.0]`, while both should look like sine waves. Notably, this only happens during `FMIFlux.train!`; the solution is normal if I run `neuralFMU(x₀; parameters=params, p=p, showProgress=true, saveat=tSave)` on its own.

I am not sure whether `canGetAndSetFMUstate="false"` could be the reason for the weird FMU solution: OpenModelica does not seem to support this capability for ME-type FMUs, even when I follow the OpenModelica guideline to enable it.
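As a side note, this capability flag can be read from the loaded model description; a minimal sketch, assuming the FMIImport struct fields mirror the FMI 2.0 `modelDescription.xml` attributes:

```julia
using FMI

fmu = fmiLoad("SpringPendulum1D.fmu")  # path is a placeholder

# canGetAndSetFMUstate is an attribute of the <ModelExchange>/<CoSimulation>
# elements in modelDescription.xml (field names assumed to mirror the XML):
me = fmu.modelDescription.modelExchange
cs = fmu.modelDescription.coSimulation
println("ME canGetAndSetFMUstate: ", me === nothing ? "n/a" : me.canGetAndSetFMUstate)
println("CS canGetAndSetFMUstate: ", cs === nothing ? "n/a" : cs.canGetAndSetFMUstate)
```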
**CS-type FMU**

- Exporting tool: Dymola (I directly use the FMU from FMIZoo.jl)
- Result: the loss keeps increasing, i.e. the parameter is updated in exactly the opposite direction. If I negate the return value of the loss, training behaves relatively normally, but I don't think that is a proper fix.

- Exporting tool: OpenModelica v1.23.0 (64-bit)
- Result: good, the parameter can be tuned correctly after training.
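For completeness, here is roughly the setup meant above, following the FMIFlux.jl parameter-estimation example pattern; the FMU path, the tunable parameter `"mass.m"` and its start value, the time grid, and the `train!` call details are assumptions that may differ between FMIFlux versions:

```julia
using FMI, FMIFlux, DifferentialEquations
using FMIFlux.Flux

# Load the FMU in ME mode (path and FMU type are placeholders):
fmu = fmiLoad("SpringPendulum1D.fmu"; type=:ME)

tStart, tStop = 0.0, 4.0                # assumed time horizon
tSave = collect(tStart:0.01:tStop)

# Register a tunable FMU parameter ("mass.m" with start value 1.0 is an
# assumed example) so it becomes part of the trainable parameter vector:
net = Chain(FMUParameterRegistrator(fmu, ["mass.m"], [1.0]),
            x -> fmu(x=x, dx_refs=:all))

neuralFMU = ME_NeuralFMU(fmu, net, (tStart, tStop), Tsit5(); saveat=tSave)

# Train against lossSum as sketched above (the train! signature follows the
# recent FMIFlux.jl examples):
optim = Adam(1e-3)
FMIFlux.train!(lossSum, neuralFMU, Iterators.repeated((), 500), optim)
```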
**Conclusion**

There are two issues when using `FMUParameterRegistrator`:

1. With the ME-type FMU exported by OpenModelica, the loss does not change during training; inside `FMIFlux.train!`, the NeuralFMU returns a constant `posNet` and a linearly increasing `velNet`.
2. With the CS-type FMU exported by Dymola, the loss update direction is reversed, i.e. the loss keeps increasing during training.

If more information is needed, please tell me, thank you!