Noticed by @myrazma
The problem is that, in hybrid optimization, we directly call BoTorch's `optimize_acqf_mixed` routine, bypassing our cardinality constraint machinery.

Here is a minimal reproducing example:
```python
from baybe import Campaign
from baybe.constraints import ContinuousCardinalityConstraint
from baybe.parameters import CategoricalParameter, NumericalContinuousParameter
from baybe.searchspace import SearchSpace
from baybe.targets.numerical import NumericalTarget
from baybe.utils.dataframe import add_fake_measurements

parameters = [
    NumericalContinuousParameter("p1", (0, 1)),
    NumericalContinuousParameter("p2", (0, 1)),
    CategoricalParameter("p_cat", ["A", "B", "C"]),
]
constraints = [ContinuousCardinalityConstraint(["p1", "p2"], max_cardinality=1)]
searchspace = SearchSpace.from_product(parameters, constraints)
objective = NumericalTarget("t", "MAX").to_objective()
campaign = Campaign(searchspace, objective)

recommendations = campaign.recommend(10)
add_fake_measurements(recommendations, campaign.targets)
campaign.add_measurements(recommendations)

recommendations = campaign.recommend(10)
print(recommendations)
```
Output looks like:
```
      p_cat        p1        p2
index
1         B  0.046313  0.009410
1         B  0.010788  0.283344
1         B  0.030576  0.076704
1         B  0.012316  0.405818
1         B  0.857327  0.101991
1         B  0.821122  0.000000
1         B  0.870719  0.000000
1         B  0.000000  0.251797
1         B  0.028825  0.327676
1         B  0.900114  0.110927
```
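With `max_cardinality=1` on `("p1", "p2")`, at most one of the two parameters may be nonzero in any recommendation, yet most rows above have both active. A small sketch of a check that makes the violation explicit (`count_violations` is a hypothetical helper, not part of BayBE):

```python
import pandas as pd


def count_violations(df: pd.DataFrame, cols=("p1", "p2"), max_cardinality=1) -> int:
    """Count rows whose number of nonzero entries exceeds the allowed cardinality."""
    cardinality = (df[list(cols)] != 0.0).sum(axis=1)
    return int((cardinality > max_cardinality).sum())


# Two rows taken from the output above: the first has both parameters
# nonzero (cardinality 2), the second has only p2 active (cardinality 1).
df = pd.DataFrame({"p1": [0.046313, 0.000000], "p2": [0.009410, 0.251797]})
print(count_violations(df))  # 1
```

Running this over the full batch above flags nearly every row, confirming that the constraint is silently ignored on the hybrid path.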