This method uses an amount of memory that is quadratic in the number of variables to be optimized. It is generally very effective, but if your problem has a very large number of variables then it isn't appropriate. Instead, you should try the lbfgs_search_strategy.
This method uses an amount of memory that is linear in the number of variables to be optimized. So it is capable of handling problems with a very large number of variables. However, it is generally not as good as the BFGS algorithm (see the bfgs_search_strategy class).
Note that BOBYQA only works on functions of two or more variables. So if you need to perform derivative-free optimization on a function of a single variable then you should use the find_max_single_variable function.
The following paper, published in 2009 by M.J.D. Powell, describes the workings of the BOBYQA algorithm in detail.
The BOBYQA algorithm for bound constrained optimization without derivatives by M.J.D. Powell
Note that BOBYQA only works on functions of two or more variables. So if you need to perform derivative-free optimization on a function of a single variable then you should use the find_min_single_variable function.
This method uses an amount of memory that is linear in the number of variables to be optimized. This makes it an excellent method to use when an optimization problem has a large number of variables.
Note also that this is actually a helper function for creating newton_search_strategy_obj objects.
Minimize: f(w) == 0.5*dot(w,w) + C*R(w)

Where R(w) is a user-supplied convex function and C > 0.
Optimized Cutting Plane Algorithm for Large-Scale Risk Minimization by Vojtech Franc, Soren Sonnenburg; 10(Oct):2157--2192, 2009.
Bundle Methods for Regularized Risk Minimization by Choon Hui Teo, S.V.N. Vishwanathan, Alex J. Smola, Quoc V. Le; 11(Jan):311--365, 2010.
Minimize: f(alpha) == 0.5*trans(alpha)*Q*alpha

subject to the following constraints:
    sum(alpha) == nu*y.size()
    0 <= min(alpha) && max(alpha) <= 1
    trans(y)*alpha == 0

Where all elements of y must be equal to +1 or -1 and f is convex. This means that Q should be symmetric and positive-semidefinite.
Minimize: f(alpha) == 0.5*trans(alpha)*Q*alpha + trans(p)*alpha

subject to the following constraints:
    for all i such that y(i) == +1:  0 <= alpha(i) <= Cp
    for all i such that y(i) == -1:  0 <= alpha(i) <= Cn
    trans(y)*alpha == B

Where all elements of y must be equal to +1 or -1 and f is convex. This means that Q should be symmetric and positive-semidefinite.
Minimize: f(alpha) == 0.5*trans(alpha)*Q*alpha - trans(alpha)*b

subject to the following constraints:
    sum(alpha) == C
    min(alpha) >= 0

Where f is convex. This means that Q should be symmetric and positive-semidefinite.
Minimize: f(p) == 0.5*trans(p)*B*p + trans(g)*p

subject to the following constraint:
    length(p) <= radius