Command Reference
Estimation of an elastic net model, including options for Lasso and ridge regression.
equation_name.enet(options) y x1 [x2 x3 ...] [@vw(...)]
List the dependent variable first, followed by a list of the independent variables. Use a “C” if you wish to include a constant or intercept term. Variable weights should be incorporated with the “@vw” tag, e.g., “@vw(x1, 0.5)”.
Specification Options
penalty=arg (default=“el”)
Type of penalty: “el” (elastic net), “ridge” (ridge regression), “lasso” (Lasso).
alpha=arg (default=“.5”)
Value of the mixing parameter. Must be a value between zero and one.
lambda=arg
Value of the penalty parameter. Can be a single number, a list of space-delimited numbers, a workfile series object, or left blank for an EViews-supplied list (default). Values must be zero or greater.
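The interaction of the two parameters can be sketched numerically: the elastic net penalty blends an L1 (Lasso) term and an L2 (ridge) term, with alpha = 1 giving pure Lasso and alpha = 0 pure ridge. The sketch below uses the common glmnet-style convention for the penalty; treat the exact scaling as an assumption, not EViews source:

```python
# Hypothetical sketch of the elastic net penalty (an assumption following
# the common glmnet convention, not EViews source code):
#   penalty = lambda * (alpha * ||b||_1 + (1 - alpha)/2 * ||b||_2^2)
def enet_penalty(beta, lam, alpha):
    l1 = sum(abs(b) for b in beta)      # L1 norm (Lasso term)
    l2 = sum(b * b for b in beta)       # squared L2 norm (ridge term)
    return lam * (alpha * l1 + (1.0 - alpha) / 2.0 * l2)

beta = [1.0, -2.0, 0.5]
print(enet_penalty(beta, 0.1, 1.0))  # alpha = 1: pure Lasso, lam * ||b||_1
print(enet_penalty(beta, 0.1, 0.0))  # alpha = 0: pure ridge, lam/2 * ||b||_2^2
```

Intermediate alpha values trade off the Lasso term's variable selection against the ridge term's shrinkage of correlated regressors.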
General Options
xtrans=arg (default=“none”)
Transformation of the regressor variables: “none” (none), “L1” (L1), “L2” (L2), “stdsmpl” (sample standard deviation), “stdpop” (population standard deviation), “minmax” (min-max).
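The listed transformations can be sketched as follows. These are hypothetical Python helpers following the usual definitions of each scaling; whether EViews centers the series before the standard-deviation scalings is an assumption worth checking:

```python
import math

# Hypothetical sketches of the regressor scalings named above
# (assumptions about the usual definitions, not EViews source code).
def scale_l1(x):       # "L1": divide by the L1 norm
    s = sum(abs(v) for v in x)
    return [v / s for v in x]

def scale_l2(x):       # "L2": divide by the L2 norm
    s = math.sqrt(sum(v * v for v in x))
    return [v / s for v in x]

def scale_stdsmpl(x):  # "stdsmpl": divide by the sample std. dev. (n - 1)
    m = sum(x) / len(x)
    s = math.sqrt(sum((v - m) ** 2 for v in x) / (len(x) - 1))
    return [v / s for v in x]

def scale_stdpop(x):   # "stdpop": divide by the population std. dev. (n)
    m = sum(x) / len(x)
    s = math.sqrt(sum((v - m) ** 2 for v in x) / len(x))
    return [v / s for v in x]

def scale_minmax(x):   # "minmax": map onto [0, 1]
    lo, hi = min(x), max(x)
    return [(v - lo) / (hi - lo) for v in x]
```

Standardizing the regressors matters for penalized estimation because the penalty treats all coefficients on the same scale.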
lambdaratio=arg (default=0.0001)
Ratio of minimum to maximum lambda for EViews-supplied list.
nlambdas=arg (default=100)
Number of lambdas for EViews-supplied list.
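When no lambda is supplied, a log-spaced path running from a maximum lambda down to lambdaratio times that maximum is the standard construction (as in glmnet). A sketch under that assumption:

```python
import math

# Sketch of a log-spaced lambda path (an assumption about how the
# EViews-supplied list is built, following the common glmnet convention).
def lambda_path(lambda_max, lambda_ratio=1e-4, nlambdas=100):
    lo = math.log(lambda_max * lambda_ratio)
    hi = math.log(lambda_max)
    step = (hi - lo) / (nlambdas - 1)
    # Largest lambda first, decreasing toward lambda_max * lambda_ratio.
    return [math.exp(hi - i * step) for i in range(nlambdas)]

path = lambda_path(2.0)
print(len(path))  # 100
```

With the defaults above, the path spans four orders of magnitude, which is why a small lambdaratio is appropriate when a very lightly penalized fit is of interest.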
maxit=integer
Maximum number of iterations.
conv=number
Set the convergence criterion. The criterion is based upon the maximum of the percentage changes in the scaled estimates. Values outside the range 1e-24 to 0.2 are reset to the nearest endpoint of that range.
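The stopping rule described above (largest percentage change across the scaled coefficient estimates falling below the criterion) can be sketched as follows; the guard against division by zero is an implementation assumption, not EViews source:

```python
# Sketch of the convergence test described above: stop when the maximum
# percentage change in the scaled estimates drops below the criterion.
def converged(beta_old, beta_new, conv=1e-8):
    worst = max(abs(new - old) / max(abs(old), 1e-300)  # guard vs. old == 0
                for old, new in zip(beta_old, beta_new))
    return worst < conv

print(converged([1.0, 2.0], [1.0, 2.0 + 1e-10]))  # True
print(converged([1.0, 2.0], [1.1, 2.0]))          # False
```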
w=arg
Weight series or expression.
wtype=arg (default=“istdev”)
Weight specification type: inverse standard deviation (“istdev”), inverse variance (“ivar”), standard deviation (“stdev”), variance (“var”).
wscale=arg
Weight scaling: EViews default (“eviews”), average (“avg”), none (“none”).
The default setting depends upon the weight type: “eviews” if “wtype=istdev”, “avg” for all others.
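Under the usual interpretation of these weight types, the specified series maps to a multiplicative observation weight as sketched below; the exact scaling EViews applies should be treated as an assumption and verified against the weighted least squares documentation:

```python
import math

# Sketch of how a weight series value w maps to the multiplicative
# observation weight under each wtype (an assumption about the usual
# interpretation, not EViews source code).
def obs_weight(w, wtype):
    if wtype == "istdev":  # w is an inverse standard deviation
        return w
    if wtype == "ivar":    # w is an inverse variance
        return math.sqrt(w)
    if wtype == "stdev":   # w is a standard deviation
        return 1.0 / w
    if wtype == "var":     # w is a variance
        return 1.0 / math.sqrt(w)
    raise ValueError("unknown wtype: " + wtype)

print(obs_weight(4.0, "ivar"))  # 2.0
print(obs_weight(4.0, "var"))   # 0.5
```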
showopts / -showopts
[Do / do not] display the starting coefficient values and estimation options in the estimation output.
prompt
Force the dialog to appear from within a program.
p
Print basic estimation results.
Cross Validation Options
cvmethod=arg (default=“kfold”)
Cross-validation method: “kfold” (k-fold), “shuffle” (shuffle), “leavepout” (leave p out), “leave1out” (leave one out), “rolling” (rolling window), “expanding” (expanding window).
cvmeasure=arg (default=“mse”)
Error measurement from cross-validation: “mse” (mean-squared error), “mae” (mean absolute error), “r2” (r-squared).
Proportion of data or number of data points in training set for shuffle method.
Proportion of data or number of data points in test set for shuffle method.
Number of shuffle method repetitions.
nfolds=arg
Number of folds for k-fold method.
Number of data points left out for leave p out method.
Gap between the training set and future test set for rolling and expanding window methods.
Number of initial data points held out of rolling and expanding window methods.
Window size for rolling window method.
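As an illustration of the default k-fold scheme above, a minimal cross-validation loop scoring held-out folds by mean-squared error might look like the generic Python sketch below (the fold construction and the `fit`/`predict` callables are hypothetical, not EViews internals):

```python
# Generic k-fold cross-validation sketch (not EViews internals): split the
# sample into k folds, fit on k-1 folds, score the held-out fold by MSE,
# and average the fold scores.
def kfold_mse(y, x, fit, predict, nfolds=5):
    n = len(y)
    scores = []
    for k in range(nfolds):
        test = set(range(k, n, nfolds))            # every nfolds-th point
        train = [i for i in range(n) if i not in test]
        model = fit([y[i] for i in train], [x[i] for i in train])
        errs = [(y[i] - predict(model, x[i])) ** 2 for i in test]
        scores.append(sum(errs) / len(errs))
    return sum(scores) / nfolds

# Toy usage: the "model" is just the training-set mean of y.
fit = lambda y, x: sum(y) / len(y)
predict = lambda m, xi: m
print(kfold_mse([1.0, 1.0, 1.0, 1.0], [0, 0, 0, 0], fit, predict, nfolds=2))
```

In the elastic net setting, a loop of this shape is run once per candidate lambda, and the lambda with the best cross-validation score (here MSE; MAE or r-squared under other cvmeasure settings) is selected.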
Random Number Options
seed=positive_integer from 0 to 2,147,483,647
Seed the random number generator.
If not specified, EViews will seed the random number generator with a single integer draw from the default global random number generator.
rnd=arg (default=“kn” or method previously set using rndseed).
Type of random number generator: improved Knuth generator (“kn”), improved Mersenne Twister (“mt”), Knuth’s (1997) lagged Fibonacci generator used in EViews 4 (“kn4”), L’Ecuyer’s (1999) combined multiple recursive generator (“le”), Matsumoto and Nishimura’s (1998) Mersenne Twister used in EViews 4 (“mt4”).
Examples
test.enet(penalty=lasso, alpha=1, lambda=.04065, conv=1e-8, maxit=5000) ystdz v1 v2 v3 c
estimates a lasso model with a single lambda value, no regressor transformation, and an optimization convergence limit of 1e-8 with a maximum of 5000 iterations.
test.enet(alpha=0, xtrans=stdpop, cvmeasure=mae, nfolds=10) ystdz v1 v2 v3 c
estimates an elastic net model with alpha = 0 (a ridge regression model solved numerically), population standardization of the regressors, mean absolute error as the cross-validation error measure, and k-fold cross validation with ten folds.
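To build intuition for the Lasso example above, note that in the single standardized-regressor case the Lasso solution reduces to soft thresholding of the least-squares coefficient. This is a textbook illustration, not the EViews solver:

```python
# Soft-thresholding operator: the closed-form Lasso solution for a single
# standardized regressor (a standard textbook result, not the EViews solver).
def soft_threshold(b_ols, lam):
    if b_ols > lam:
        return b_ols - lam   # shrink positive coefficients toward zero
    if b_ols < -lam:
        return b_ols + lam   # shrink negative coefficients toward zero
    return 0.0               # small coefficients are set exactly to zero

print(soft_threshold(0.8, 0.04065))   # shrunk toward zero by lam
print(soft_threshold(0.02, 0.04065))  # below the threshold: exactly zero
```

The second call shows why the Lasso performs variable selection: coefficients whose least-squares values fall inside the threshold are dropped from the model entirely, which ridge regression (alpha = 0) never does.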
See “Elastic Net and Lasso” for a discussion of elastic net, ridge regression, and Lasso models.