Bayesian Optimizer is a JavaScript library for optimizing black-box functions using Bayesian optimization with Gaussian processes.
- Supports multi-dimensional input spaces.
- Adjustable optimization parameters (exploration, number of candidates, etc.).
- Gaussian process regression with the Matérn kernel.
- Expected Improvement (EI) acquisition function.
- Zero dependencies.
- Type definitions included.
Install the package using npm:

```sh
npm install bayesian-optimizer
```

or with yarn:

```sh
yarn add bayesian-optimizer
```
```js
import { BayesianOptimizer } from "bayesian-optimizer";

// Define your objective function
const objectiveFunction = async (params) => {
  // Your objective function logic here.
  // Example: a negative paraboloid, whose maximum is at (0, 0).
  return -(params.x ** 2 + params.y ** 2);
};

// Define the search space for the objective function
const searchSpace = {
  x: { min: -5, max: 5 },
  y: { min: -5, max: 5 },
};

// Initialize the optimizer
const optimizer = new BayesianOptimizer({
  exploration: 0.1, // Optional, default is 0.01
  numCandidates: 100, // Optional, default is 100
  kernelNu: 1.5, // Optional, default is 1.5
  kernelLengthScale: 1.0, // Optional, default is 1.0
});

// Optimize the objective function over 100 steps
await optimizer.optimize(objectiveFunction, searchSpace, 100);

// Get the best parameters found
const bestParams = optimizer.getBestParams();
```
`BayesianOptimizer` is the main class for performing Bayesian optimization.
`constructor(options?: { exploration?: number; numCandidates?: number; kernelNu?: number; kernelLengthScale?: number })`

Creates a new instance of the BayesianOptimizer.

- `options`: An optional object with the following properties:
  - `exploration`: The exploration parameter (xi) for the Expected Improvement acquisition function. Controls the exploration-exploitation trade-off. Default is `0.01`.
  - `numCandidates`: The number of candidates sampled for each optimization step. Default is `100`.
  - `kernelNu`: Controls the smoothness of the Matérn kernel. Default is `1.5`.
  - `kernelLengthScale`: Controls the length scale of the Matérn kernel. Default is `1.0`.
`optimize(objectiveFunction: ObjectiveFunction, searchSpace: { [key: string]: ParameterRange }, numSteps: number): Promise<void>`

Optimizes the given objective function over the specified search space for the given number of steps.

- `objectiveFunction`: The function to optimize.
- `searchSpace`: An object that defines the range of each parameter of the objective function.
- `numSteps`: The number of optimization steps to perform.

`getBestParams()`

Returns the best parameters found during the optimization.
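For the usage example above, the result is a plain object keyed by the search-space parameters (the printed values below are hypothetical; actual results vary from run to run):

```js
const bestParams = optimizer.getBestParams();
// The keys match the search space definition, e.g. { x: 0.12, y: -0.05 }
console.log(bestParams);
```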
Bayesian optimization relies on Gaussian process regression, which is a powerful technique for modeling an unknown function using a set of observed data points. In this library, we use the Matérn kernel as the covariance function for the Gaussian process. The Matérn kernel is a popular choice in Bayesian optimization due to its flexibility and ability to model various degrees of smoothness in the underlying function.
The Matérn kernel has two parameters, ν (nu) and l (length scale), which can be adjusted to control the smoothness and scale of the function being modeled. By default, this library uses a ν of 1.5 and a length scale of 1. These defaults can be overridden via the `kernelNu` and `kernelLengthScale` options when initializing the BayesianOptimizer.
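For intuition, the ν = 1.5 and ν = 2.5 cases have simple closed forms. The sketch below is illustrative only, not the library's internal code, and `maternKernel` is a hypothetical helper:

```js
// Matérn covariance between two points at Euclidean distance r,
// for the two most common smoothness settings.
function maternKernel(r, nu, lengthScale) {
  if (r === 0) return 1; // identical points: full covariance
  const s = r / lengthScale;
  if (nu === 1.5) {
    const a = Math.sqrt(3) * s;
    return (1 + a) * Math.exp(-a); // k(r) = (1 + √3·r/l)·exp(−√3·r/l)
  }
  if (nu === 2.5) {
    const a = Math.sqrt(5) * s;
    return (1 + a + (a * a) / 3) * Math.exp(-a); // k(r) = (1 + √5·r/l + 5r²/(3l²))·exp(−√5·r/l)
  }
  throw new Error("Only nu = 1.5 and nu = 2.5 are sketched here");
}

// Covariance decays with distance; a larger length scale slows the decay.
console.log(maternKernel(0.5, 1.5, 1.0)); // ≈ 0.78
console.log(maternKernel(0.5, 1.5, 2.0)); // ≈ 0.93
```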
In Bayesian optimization, the acquisition function determines which point in the search space to evaluate next. This library uses Expected Improvement (EI), the most commonly used acquisition function: for each candidate it computes the expected amount by which the objective would improve over the current best observation, balancing exploration and exploitation. High EI values indicate greater potential for improvement, guiding the optimizer toward promising regions of the search space.
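Concretely, for a candidate with GP posterior mean μ and standard deviation σ, a current best observation f\*, and exploration parameter ξ (the `exploration` option), EI = d·Φ(z) + σ·φ(z), where d = μ − f\* − ξ and z = d/σ. The sketch below is illustrative, not the library's internal code:

```js
// Expected Improvement for a maximization problem.
function expectedImprovement(mu, sigma, bestSoFar, xi = 0.01) {
  if (sigma === 0) return 0; // no uncertainty => no expected improvement
  const d = mu - bestSoFar - xi;
  const z = d / sigma;
  return d * normCdf(z) + sigma * normPdf(z);
}

// Standard normal PDF.
function normPdf(z) {
  return Math.exp(-0.5 * z * z) / Math.sqrt(2 * Math.PI);
}

// Standard normal CDF via the Abramowitz–Stegun erf approximation.
function normCdf(z) {
  return 0.5 * (1 + erf(z / Math.SQRT2));
}

function erf(x) {
  const sign = x < 0 ? -1 : 1;
  const t = 1 / (1 + 0.3275911 * Math.abs(x));
  const poly =
    ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t - 0.284496736) * t + 0.254829592) * t;
  return sign * (1 - poly * Math.exp(-x * x));
}
```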
- Aaron Fitzpatrick (Afitzy98)
Contributions are welcome! Please open an issue or submit a pull request on the GitHub repository.
There are several possible expansions that could be added to this library. One is support for additional acquisition functions, such as Probability of Improvement (PI) or Upper Confidence Bound (UCB), which are also commonly used in Bayesian optimization; a sketch of UCB follows below. Another is the inclusion of other kernel functions, which affect the smoothness and scale of the function being modeled; adding more kernels would increase the flexibility of the library and let users model a wider range of functions.
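As a taste of how small such an addition could be, UCB is a one-line scoring rule on the GP posterior (a sketch, not part of the current API):

```js
// Upper Confidence Bound: favor points with a high mean or high uncertainty.
// kappa trades off exploitation (mean) against exploration (uncertainty).
const ucb = (mu, sigma, kappa = 2.0) => mu + kappa * sigma;
```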
MIT License