Patterns in static

Apophenia

apop_mle_settings Struct Reference

Data Fields

double * starting_pt
 
char * method
 
double step_size
 
double tolerance
 
double delta
 
int max_iterations
 
int verbose
 
double dim_cycle_tolerance
 
int n_tries
 
int iters_fixed_T
 
double k
 
double t_initial
 
double mu_t
 
double t_min
 
gsl_rng * rng
 
apop_data ** path
 

Detailed Description

The settings for maximum likelihood estimation (including simulated annealing).

Field Documentation

double apop_mle_settings::dim_cycle_tolerance

If zero (the default), follow the usual optimization procedure. If >0, cycle across dimensions: fix all but the first dimension at the starting point and optimize only the first dimension. Then fix all but the second dimension and optimize the second. Continue through all dimensions, until the log likelihood at the outset of one cycle through the dimensions is within this amount of the previous cycle's log likelihood. There will always be at least two cycles.

int apop_mle_settings::max_iterations

Ignored by simulated annealing. Other methods halt after this many iterations if they have not yet found an optimum.

char* apop_mle_settings::method

The method to be used for the optimization. All strings are case-insensitive.

String Name: Notes

"NM simplex" (Nelder-Mead simplex): Does not use gradients at all. Can sometimes get stuck.

"FR cg" (Fletcher-Reeves conjugate gradient; the default): Conjugate gradient methods use derivatives. They converge to the optimum of a quadratic function in one step; performance degrades as the objective departs from quadratic.

"BFGS cg" (Broyden-Fletcher-Goldfarb-Shanno conjugate gradient)

"PR cg" (Polak-Ribiere conjugate gradient)

"Annealing" (simulated annealing): Slow, but works for objectives of arbitrary complexity, including stochastic objectives.

"Newton" (Newton's method): Searches by finding a root of the derivative. Expects the gradient to be reasonably well behaved.

"Newton hybrid" (Newton's method/gradient descent hybrid): Finds a root of the derivative via the Hybrid method: if Newton proposes stepping outside of a certain interval, an alternate method is used. See the GSL manual for discussion.

"Newton hybrid no scale" (Newton's method/gradient descent hybrid with spherical scale): As above, but using a simplified trust region.

apop_data** apop_mle_settings::path

If not NULL, record each vector tried by the optimizer as one row of this apop_data set. Each row of the matrix element holds the vector tried; the corresponding element in the vector is the evaluated value at that vector (after out-of-constraints penalties have been subtracted). A new apop_data set is allocated at the pointer you send in. This data set has no names; add them as desired. Sample use:

apop_data *mypath;
Apop_model_add_group(mymodel, apop_mle, .path=&mypath);
apop_model *out = apop_estimate(mydata, mymodel);
apop_data_print(mypath, .output_name="search");

double* apop_mle_settings::starting_pt

An array of doubles (i.e., double*) suggesting a starting point. If NULL, use an all-ones vector. Note that if v is a gsl_vector, then v->data is of the right form (provided v is not a slice of a matrix).
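For instance, to start the search from coordinates held in a gsl_vector, pass its data pointer. A fragment, assuming GSL is linked and an apop_model *mymodel exists (not a compilable program on its own):

```c
gsl_vector *v = gsl_vector_alloc(2);
gsl_vector_set(v, 0, 0.5);
gsl_vector_set(v, 1, 1.5);
/* v->data is a plain double* of the right form,
   provided v is not a slice of a matrix. */
Apop_model_add_group(mymodel, apop_mle, .starting_pt=v->data);
```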

double apop_mle_settings::step_size

The initial step size.

double apop_mle_settings::tolerance

The precision the minimizer uses as its stopping criterion. This is only loosely related to the precision of the estimated variables.

int apop_mle_settings::verbose

Give status updates as we go. This is orthogonal to the apop_opts.verbose setting.

Autogenerated by doxygen on Wed Oct 15 2014 (Debian 0.999b+ds3-2).