Changing prior mean after model is trained #1219
-
I'm using a FixedNoiseGP and, after training on the whole data set, I'd like to run Expected Improvement on 2 subregions using the global GP. I'm interested in finding the optimum for each region, but let's suppose that one region, on average, performs very well and the other performs very poorly. If I compare the EI for these regions it is not a fair comparison, because the mean of the global GP drives the overall performance for both regions (under-representing the high-performing region and vice versa, which leads to problems when comparing EI values across regions). Instead, I'd like to train a GP on the whole dataset to get a global set of hyperparameters and then simply replace the prior mean on the model to represent the individual regions. I can achieve the desired outcome by training a FixedNoiseGP and then swapping out its mean module and toggling train/eval:
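A minimal sketch of that kind of swap (illustrative only, not the original snippet; the toy data and `region_mean` are hypothetical):

```python
import torch
from botorch.fit import fit_gpytorch_mll
from botorch.models import FixedNoiseGP
from gpytorch.means import ConstantMean
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy stand-in for the global data set (hypothetical values).
train_X = torch.rand(20, 2, dtype=torch.double)
train_Y = train_X.sum(dim=-1, keepdim=True).sin()
train_Yvar = torch.full_like(train_Y, 1e-4)

# Train once on all the data to get a global set of hyperparameters.
model = FixedNoiseGP(train_X, train_Y, train_Yvar)
mll = ExactMarginalLogLikelihood(model.likelihood, model)
fit_gpytorch_mll(mll)

# Swap in a region-specific constant prior mean (region_mean is illustrative),
# keep it fixed, and toggle train()/eval() so the cached prediction strategy
# is discarded and rebuilt on the next posterior call.
region_mean = 0.8
new_mean = ConstantMean().to(train_X)
new_mean.constant.data.fill_(region_mean)
new_mean.constant.requires_grad_(False)
model.mean_module = new_mean
model.train()  # ExactGP.train() drops the cached prediction strategy
model.eval()
```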
Thinking about the predictive step, it should be simpler than this: I should be able to replace the mean function with a constant value (as above) and then also update the vector of distances from the mean used in the prediction step; the variance is unaffected by the prior mean, so it shouldn't need to be changed. After searching, I've found `prediction_strategy.mean_cache`, which holds the values I need to change, but it doesn't seem to be working, as I can't seem to set it. I wonder if anyone might know the easiest way to do this, or could propose another way that might, even fractionally, reduce the overhead of running the `model.train()`, `model.eval()` lines. I'm running them thousands of times in my current context.
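For reference, here is a plain-torch sketch of the standard exact-GP predictive equations behind this reasoning (illustrative names and shapes only, not GPyTorch's internal API): the posterior mean is m(x*) + K*X (KXX + D)^-1 (y - m(X)), so the only quantity that depends on the prior mean is the residual solve (the analogue of `mean_cache`), while the posterior variance has no prior-mean term.

```python
import torch

def posterior_mean_and_var(K_XX, K_sX, K_ss_diag, y, noise_var, m_train, m_test):
    """Exact-GP predictions with an explicit prior mean (illustrative helper).

    K_XX: (n, n) train/train kernel, K_sX: (q, n) test/train kernel,
    K_ss_diag: (q,) prior variances at the test points, noise_var: (n,) fixed noise,
    m_train: (n,) prior mean at train points, m_test: (q,) prior mean at test points.
    """
    A = K_XX + torch.diag(noise_var)
    # The residual solve is the only quantity that depends on the prior mean.
    mean_cache = torch.linalg.solve(A, y - m_train)
    mean = m_test + K_sX @ mean_cache
    # The variance term never involves the prior mean.
    var = K_ss_diag - (K_sX * torch.linalg.solve(A, K_sX.T).T).sum(dim=-1)
    return mean, var
```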
-
You could technically delete it, but it won't work, since `mean_cache` will be re-generated from other cached values, such as `model.prediction_strategy.train_prior_dist`. You could delete that as well, but then you'd have to replace it, otherwise you'll get an error. You can give it a try and see if it speeds things up.

In general, I am a bit suspicious of the idea of training the `covar_module` along with the `mean_module` and then swapping out the `mean_module` with something else. Wouldn't it be better to set a custom `mean_module` from the start (e.g., fix it …
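A rough sketch of that alternative, assuming the default `ConstantMean` used by `FixedNoiseGP` (the data and `region_mean` are hypothetical): fix the constant to the desired value before fitting, so only the covariance hyperparameters are trained.

```python
import torch
from botorch.fit import fit_gpytorch_mll
from botorch.models import FixedNoiseGP
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy data (hypothetical stand-in for the region of interest).
train_X = torch.rand(20, 2, dtype=torch.double)
train_Y = train_X.sum(dim=-1, keepdim=True).sin()
train_Yvar = torch.full_like(train_Y, 1e-4)

model = FixedNoiseGP(train_X, train_Y, train_Yvar)

# Fix the constant prior mean before fitting, so the optimizer only sees the
# covar_module (and likelihood) hyperparameters.
region_mean = 0.8  # hypothetical region-specific value
model.mean_module.constant.data.fill_(region_mean)
model.mean_module.constant.requires_grad_(False)

mll = ExactMarginalLogLikelihood(model.likelihood, model)
fit_gpytorch_mll(mll)  # optimizes only parameters with requires_grad=True
```

Because the constant has `requires_grad=False`, the fitting routine leaves it untouched, which matches the suggestion of fixing the mean from the start rather than swapping it out afterwards.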