
Optimizing over top level inputs to a custom featurization layer #2125

Answered by Balandat
sbernasek asked this question in Q&A

You almost got it right. The @t_batch_mode_transform decorator that is applied in ProbabilityOfImprovement here: https://github.com/pytorch/botorch/blob/main/botorch/acquisition/analytic.py#L162 (and in most other acquisition functions) means that the tensor seen by the forward methods of the transforms is at least three-dimensional, i.e. has at least one batch dimension. That means that your indexing is off. If you change your method as follows, it works as you intended:

def transform(self, X: torch.Tensor) -> torch.Tensor:
    # X has shape batch_shape x q x d; working along dim=-1 leaves the batch dims intact.
    product = torch.prod(X, dim=-1, keepdim=True)
    return torch.cat([X, product], dim=-1)
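
For concreteness, here is a minimal, self-contained sketch (the AppendProduct class name is hypothetical, standing in for the custom featurization layer) showing that with dim=-1 the extra feature is appended correctly even when a t-batch dimension is present:

import torch

class AppendProduct:
    # Hypothetical stand-in for the custom featurization layer; the transform
    # method matches the corrected version above.
    def transform(self, X: torch.Tensor) -> torch.Tensor:
        product = torch.prod(X, dim=-1, keepdim=True)
        return torch.cat([X, product], dim=-1)

X = torch.rand(4, 1, 2)                    # t-batch of 4, q=1, d=2
print(AppendProduct().transform(X).shape)  # torch.Size([4, 1, 3])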

Pro tip: Since BoTorch and GPyTorch make heavy use of batching and in principle wor…

Answer selected by sbernasek