Per-dimension constraints on RBFKernel lengthscales #2338
-
The constraints can take Tensors as lower and/or upper bounds -- just specify a bound for each lengthscale individually when making the GreaterThan (or Interval) constraint. Rather than hard constraints, you could also consider placing priors over the lengthscales that more softly encourage larger lengthscales. In terms of getting the same hypers every time -- the optimization problem is non-convex, so your only real recourse here would be to optimize multiple times. That said, if you are seeing stability issues, it's probably not just optimization failing here -- when things are working well, you'll usually end up with a reasonably consistent model. What exactly are you seeing that makes you think multiple solutions are the problem?
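For example, something along these lines (a sketch: F=7 matches the question, and the bound values and prior hyperparameters are purely illustrative):

```python
import torch
from gpytorch.constraints import GreaterThan
from gpytorch.kernels import RBFKernel
from gpytorch.priors import GammaPrior

F = 7
# One lower bound per lengthscale: effectively unconstrained everywhere except
# dimension 4 (the 5th feature), which is bounded away from zero. The bound
# values here are illustrative.
lower_bounds = torch.full((1, F), 1e-4)
lower_bounds[0, 4] = 0.5

constrained_kernel = RBFKernel(
    ard_num_dims=F,
    lengthscale_constraint=GreaterThan(lower_bounds),
)

# Softer alternative: a prior that nudges lengthscales upward instead of a
# hard bound. The Gamma hyperparameters are illustrative, not a recommendation.
prior_kernel = RBFKernel(
    ard_num_dims=F,
    lengthscale_prior=GammaPrior(3.0, 6.0),
)
```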
-
I am fitting a GP to a dataset with 379 points in F=7 dimensions. I use BoTorch, but the question concerns the underlying GPyTorch. I am using the combination of RBFKernel, FixedNoiseGP, ExactMarginalLogLikelihood, and fit_gpytorch_mll.
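Roughly like this (a sketch: the training tensors and noise level are placeholders standing in for my actual data):

```python
import torch
from botorch.fit import fit_gpytorch_mll
from botorch.models import FixedNoiseGP
from gpytorch.kernels import RBFKernel
from gpytorch.mlls import ExactMarginalLogLikelihood

# Placeholder data standing in for the real dataset: 379 points in F=7
# dimensions, with a fixed, known observation noise.
F = 7
train_X = torch.rand(379, F, dtype=torch.double)
train_Y = torch.rand(379, 1, dtype=torch.double)
train_Yvar = torch.full_like(train_Y, 1e-4)

model = FixedNoiseGP(
    train_X,
    train_Y,
    train_Yvar,
    covar_module=RBFKernel(ard_num_dims=F),
)
mll = ExactMarginalLogLikelihood(model.likelihood, model)
fit_gpytorch_mll(mll)

print(model.covar_module.lengthscale)  # one lengthscale per input dimension
```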
What is worrisome is that after fitting, the 5th lengthscale comes out very low, which leads to heavy overfitting on the corresponding feature.
As recommended here, I put a GreaterThan constraint on my RBFKernel lengthscale.
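Along these lines (a sketch: the bound value is illustrative, not the exact one I used):

```python
from gpytorch.constraints import GreaterThan
from gpytorch.kernels import RBFKernel

# A single scalar lower bound applied to all 7 lengthscales at once;
# the value 0.1 is illustrative.
kernel = RBFKernel(
    ard_num_dims=7,
    lengthscale_constraint=GreaterThan(0.1),
)
```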
This regularizes the GP fine, but it still feels like I may be constraining the other dimensions too much.
Q1: Is there a way to create a list of F separate constraints, so I can bound just one dimension's lengthscale?
Also, I noticed that the learnt lengthscales are very unreliable: on each refit, the values come out significantly different.
Q2: How can I achieve stable fitting of the GP, in the sense that the lengthscales end up about the same on each refit?