Type: | Package |
Title: | Maximum Penalized Likelihood Estimation with Extended Lasso Penalty |
Version: | 0.3 |
Date: | 2022-05-13 |
Author: | B N Mandal <mandal.stat@gmail.com> and Jun Ma <jun.ma@mq.edu.au> |
Maintainer: | B N Mandal <mandal.stat@gmail.com> |
Depends: | R (≥ 3.1.1) |
Description: | Estimates the coefficients of extended lasso penalized linear regression and generalized linear models. Currently the lasso and elastic net penalties are supported for linear and generalized linear models. The package uses an accurate approximation of the L1 penalty together with a modified Jacobi algorithm to estimate the coefficients. Functions are provided for plotting the solutions, for predicting coefficients at given values of lambda, and for cross validation to select a suitable value of lambda for the data at hand. A function for estimation in fused lasso penalized linear regression is also included. For more details, see Mandal, B. N. (2014). Computational methods for L1 penalized GLM model fitting, unpublished report submitted to Macquarie University, NSW, Australia. |
License: | GPL-2 | GPL-3 [expanded from: GPL (≥ 2)] |
NeedsCompilation: | no |
Packaged: | 2022-05-13 07:02:03 UTC; b |
Repository: | CRAN |
Date/Publication: | 2022-05-13 08:50:08 UTC |
Error bars
Description
The function places error bars on cross-validation plots.
Usage
bars(x, up, low, width = 0.03, ...)
Arguments
x |
A vector of x coordinates at which the error bars are drawn |
up |
A vector giving the upper extent of the error bars |
low |
A vector giving the lower extent of the error bars |
width |
Width of the error bar end caps. Default is width = 0.03 |
Details
This function is internal and used by cross validation routines.
Author(s)
B N Mandal and Jun Ma
Examples
x=rnorm(5)
up=rep(1,5)
low=rep(1,5)
plot(x)
bars(x,up,low)
Extract coefficients from a fitted extlasso object
Description
The function returns the coefficients from a fitted ‘extlasso’ object.
Usage
## S3 method for class 'extlasso'
coef(object,...)
Arguments
object |
An ‘extlasso’ object obtained using the ‘extlasso’ function. |
... |
Not used |
Value
Estimated coefficients for the different values of lambda, ordered from the maximum to the minimum value of lambda
Author(s)
B N Mandal and Jun Ma
References
Mandal, B.N. and Jun Ma, (2014). A Jacobi-Armijo Algorithm for LASSO and its Extensions.
Examples
x=matrix(rnorm(100*30),100,30)
y=sample(c(0,1),100,replace=TRUE)
g1=extlasso(x,y,family="binomial")
coef(g1)
x=matrix(rnorm(100*30),100,30)
y=rnorm(100)
g1=extlasso(x,y,family="normal")
coef(g1)
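#As a quick check (an illustrative sketch, not part of the original examples),
#the number of rows of the returned coefficient matrix should match the length
#of the lambda sequence stored in the fitted object g1 from above.
cf=coef(g1)
dim(cf)            #expected: one row of coefficients per lambda value
length(g1$lambdas) #length of the lambda sequence used in the fit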
k-fold cross validation for penalized generalized linear models for binomial family
Description
The function performs k-fold cross validation to select the best value of the regularization parameter.
Usage
cv.binomial(x,y,k=5,nlambda=50,tau=1,plot=TRUE,errorbars=TRUE)
Arguments
x |
x is a matrix of order n x p, where n is the number of observations and p is the number of predictor variables. Rows should represent observations and columns should represent predictor variables. |
y |
y is a vector of the response variable of order n x 1. |
k |
Number of folds for cross validation. Default is k=5. |
nlambda |
Number of lambda values to be used for cross validation. Default is nlambda=50. |
tau |
Elastic net mixing parameter with 0 ≤ tau ≤ 1; tau=1 gives the lasso penalty and tau=0 gives the ridge penalty. Default is tau=1. |
plot |
If TRUE, produces a plot of cross validated prediction mean squared errors against lambda. Default is TRUE. |
errorbars |
If TRUE, error bars are drawn in the plot. Default is TRUE. |
Value
Produces a plot and returns a list with the following components:
lambda |
Value of lambda for which average cross validation error is minimum |
pmse |
A vector of average cross validation errors for various lambda values |
lambdas |
A vector of lambda values used in cross validation |
se |
A vector containing standard errors of cross validation errors |
Note
This function need not be called by the user. It is called internally by the cv.extlasso function.
Author(s)
B N Mandal and Jun Ma
References
Mandal, B.N. and Jun Ma, (2014). A Jacobi-Armijo Algorithm for LASSO and its Extensions.
Examples
x=matrix(rnorm(100*30),100,30)
y=sample(c(0,1),100,replace=TRUE)
cv.binomial(x,y,k=5)
k-fold cross validation for penalized generalized linear models for normal/binomial/poisson family
Description
The function performs k-fold cross validation to select the best value of the regularization parameter.
Usage
cv.extlasso(x,y,family=c("binomial","normal","poisson"),k=5,
nlambda=50,tau=1,plot=TRUE, errorbars=TRUE)
Arguments
x |
x is a matrix of order n x p, where n is the number of observations and p is the number of predictor variables. Rows should represent observations and columns should represent predictor variables. |
y |
y is a vector of the response variable of order n x 1. |
family |
family is either "normal", "binomial" or "poisson". |
k |
Number of folds for cross validation. Default is k=5. |
nlambda |
Number of lambda values to be used for cross validation. Default is nlambda=50. |
tau |
Elastic net mixing parameter with 0 ≤ tau ≤ 1; tau=1 gives the lasso penalty and tau=0 gives the ridge penalty. Default is tau=1. |
plot |
If TRUE, produces a plot of cross validated prediction mean squared errors / deviances against lambda. Default is TRUE. |
errorbars |
If TRUE, error bars are drawn in the plot. Default is TRUE. |
Value
Produces a plot and returns a list with the following components:
lambda |
Value of lambda for which average cross validation error is minimum |
pmse |
A vector of average cross validation errors for various lambda values |
lambdas |
A vector of lambda values used in cross validation |
se |
A vector containing standard errors of cross validation errors |
Note
This function uses prediction mean squared error for the normal family and deviance for the binomial and poisson families.
Author(s)
B N Mandal and Jun Ma
References
Mandal, B.N. and Jun Ma, (2014). A Jacobi-Armijo Algorithm for LASSO and its Extensions.
Examples
#normal family
x=matrix(rnorm(100*30),100,30)
y=rnorm(100)
cv.extlasso(x,y,family="normal",k=5)
#binomial family
x=matrix(rnorm(100*30),100,30)
y=sample(c(0,1),100,replace=TRUE)
cv.extlasso(x,y,family="binomial",k=5)
#poisson family
x=matrix(rnorm(100*30),100,30)
y=sample(c(1:5),100,replace=TRUE)
cv.extlasso(x,y,family="poisson",k=5)
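#The selected lambda can then be used with a full fit; the following sketch is
#illustrative (the object names cv1 and fit1 are not part of the package) and
#assumes the normal-family setup above.
x=matrix(rnorm(100*30),100,30)
y=rnorm(100)
cv1=cv.extlasso(x,y,family="normal",k=5)   #cv1$lambda is the selected value
fit1=extlasso(x,y,family="normal")
predict(fit1,mode="lambda",at=cv1$lambda)  #coefficients at the selected lambda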
k-fold cross validation for penalized generalized linear models for normal family
Description
The function performs k-fold cross validation to select the best value of the regularization parameter.
Usage
cv.normal(x,y,k=5,nlambda=50,tau=1,plot=TRUE,errorbars=TRUE)
Arguments
x |
x is a matrix of order n x p, where n is the number of observations and p is the number of predictor variables. Rows should represent observations and columns should represent predictor variables. |
y |
y is a vector of the response variable of order n x 1. |
k |
Number of folds for cross validation. Default is k=5. |
nlambda |
Number of lambda values to be used for cross validation. Default is nlambda=50. |
tau |
Elastic net mixing parameter with 0 ≤ tau ≤ 1; tau=1 gives the lasso penalty and tau=0 gives the ridge penalty. Default is tau=1. |
plot |
If TRUE, produces a plot of cross validated prediction mean squared errors against lambda. Default is TRUE. |
errorbars |
If TRUE, error bars are drawn in the plot. Default is TRUE. |
Value
Produces a plot and returns a list with the following components:
lambda |
Value of lambda for which average cross validation error is minimum |
pmse |
A vector of average cross validation errors for various lambda values |
lambdas |
A vector of lambda values used in cross validation |
se |
A vector containing standard errors of cross validation errors |
Note
This function need not be called by the user. It is called internally by the cv.extlasso function.
Author(s)
B N Mandal and Jun Ma
References
Mandal, B.N. and Jun Ma, (2014). A Jacobi-Armijo Algorithm for LASSO and its Extensions.
Examples
x=matrix(rnorm(100*30),100,30)
y=rnorm(100)
cv.normal(x,y,k=10)
k-fold cross validation for penalized generalized linear models for poisson family
Description
The function performs k-fold cross validation to select the best value of the regularization parameter.
Usage
cv.poisson(x,y,k=5,nlambda=50,tau=1,plot=TRUE,errorbars=TRUE)
Arguments
x |
x is a matrix of order n x p, where n is the number of observations and p is the number of predictor variables. Rows should represent observations and columns should represent predictor variables. |
y |
y is a vector of the response variable of order n x 1. |
k |
Number of folds for cross validation. Default is k=5. |
nlambda |
Number of lambda values to be used for cross validation. Default is nlambda=50. |
tau |
Elastic net mixing parameter with 0 ≤ tau ≤ 1; tau=1 gives the lasso penalty and tau=0 gives the ridge penalty. Default is tau=1. |
plot |
If TRUE, produces a plot of cross validated prediction mean squared errors against lambda. Default is TRUE. |
errorbars |
If TRUE, error bars are drawn in the plot. Default is TRUE. |
Value
Produces a plot and returns a list with the following components:
lambda |
Value of lambda for which average cross validation error is minimum |
pmse |
A vector of average cross validation errors for various lambda values |
lambdas |
A vector of lambda values used in cross validation |
se |
A vector containing standard errors of cross validation errors |
Note
This function need not be called by the user. It is called internally by the cv.extlasso function.
Author(s)
B N Mandal and Jun Ma
References
Mandal, B.N. and Jun Ma, (2014). A Jacobi-Armijo Algorithm for LASSO and its Extensions.
Examples
x=matrix(rnorm(100*30),100,30)
y=sample(c(1:5),100,replace=TRUE)
cv.poisson(x,y,k=5)
Entire regularization path of penalized generalized linear model for normal/binomial/poisson family using modified Jacobi Algorithm
Description
The function computes the coefficients of a penalized generalized linear model for the normal, binomial or poisson family using a modified Jacobi algorithm for a sequence of lambda values. Currently the lasso and elastic net penalties are supported.
Usage
extlasso(x,y,family=c("normal","binomial","poisson"),intercept=TRUE,
normalize=TRUE,tau=1,alpha=1e-12,eps=1e-6,tol=1e-6,maxiter=1e5, nstep=100,min.lambda=1e-4)
Arguments
x |
x is a matrix of order n x p, where n is the number of observations and p is the number of predictor variables. Rows should represent observations and columns should represent predictor variables. |
y |
y is a vector of the response variable of order n x 1. y should follow either the normal, binomial or poisson distribution. |
family |
family should be one of "normal", "binomial" or "poisson". |
intercept |
If TRUE, the model includes an intercept; otherwise it does not. |
normalize |
If TRUE, the columns of the x matrix are normalized to have mean 0 and norm 1 prior to fitting the model. The coefficients are returned on the original scale at the end. Default is normalize = TRUE. |
tau |
Elastic net mixing parameter with 0 ≤ tau ≤ 1; tau = 1 gives the lasso penalty and tau = 0 gives the ridge penalty. Default is tau = 1. |
alpha |
A small positive constant used in the smooth approximation of the absolute value function in the L1 penalty. Default is alpha = 1e-12. |
eps |
A threshold used to set a coefficient to zero when its value lies between -eps and +eps. Default is eps = 1e-6. |
tol |
Tolerance criterion for convergence of solutions. Default is tol = 1e-6. |
maxiter |
Maximum number of iterations allowed for solving the optimization problem for a particular lambda. Default is maxiter = 1e5; this rarely needs to be increased. |
nstep |
Number of steps from the maximum to the minimum value of lambda. Default is nstep = 100. |
min.lambda |
Minimum value of lambda. Default is min.lambda=1e-4. |
Value
An object of class ‘extlasso’ with the following components:
beta0 |
A vector of order nstep of intercept estimates. Each value denotes an estimate for a particular lambda. The corresponding lambda values are available in the ‘lambdas’ element of the ‘extlasso’ object. |
coef |
A matrix of order nstep x p of slope estimates. Each row denotes the solution for a particular lambda. The corresponding lambda values are available in the ‘lambdas’ element of the ‘extlasso’ object. Here p is the number of predictor variables. |
lambdas |
Sequence of lambda values for which coefficients are obtained |
L1norm |
L1norm of the coefficients |
norm.frac |
Fractions of norm computed as L1 norm at current lambda divided by maximum L1 norm |
lambda.iter |
Number of iterations used for different lambdas |
of.value |
Objective function values |
normx |
Norm of x variables |
Author(s)
B N Mandal and Jun Ma
References
Mandal, B.N. and Jun Ma, (2014). A Jacobi-Armijo Algorithm for LASSO and its Extensions.
Examples
#LASSO
x=matrix(rnorm(100*30),100,30)
y=rnorm(100)
g1=extlasso(x,y,family="normal")
plot(g1)
plot(g1,xvar="lambda")
#Elastic net
g2=extlasso(x,y,family="normal",tau=0.6)
plot(g2)
plot(g2,xvar="lambda")
#Ridge regression
g3=extlasso(x,y,family="normal",tau=0)
plot(g3)
plot(g3,xvar="lambda")
#L1 penalized GLM for binomial family
x=matrix(rnorm(100*30),100,30)
y=sample(c(0,1),100,replace=TRUE)
g1=extlasso(x,y,family="binomial")
plot(g1)
plot(g1,xvar="lambda")
#Elastic net with GLM with binomial family
g2=extlasso(x,y,family="binomial",tau=0.8)
plot(g2)
plot(g2,xvar="lambda")
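#The intercept and normalize arguments can also be switched off; the following
#is an illustrative sketch (not part of the original examples) that also
#inspects the iteration counts and objective function values of the fit.
x=matrix(rnorm(100*30),100,30)
y=rnorm(100)
g4=extlasso(x,y,family="normal",intercept=FALSE,normalize=FALSE)
g4$lambda.iter   #iterations used at each lambda
g4$of.value      #objective function values along the path
plot(g4)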
Coefficients of penalized generalized linear models for a given lambda for binomial family
Description
The function computes regression coefficients for a penalized generalized linear model at a given lambda value when the response variable follows the binomial distribution.
Usage
extlasso.binom.lambda(n,p,p1,sumy,beta0.old,beta1.old,x,y,
dxkx0,dxkx1,tau,lambda1,alpha,tol,maxiter,eps,xbeta.old,mu1)
Arguments
n |
Number of observations |
p |
Number of predictors |
p1 |
Number of active predictors |
sumy |
Sum of y values |
beta0.old |
Initial value of intercept |
beta1.old |
A vector of initial values of slope coefficients |
x |
An n by p1 matrix of predictors |
y |
A vector of n observations |
dxkx0 |
In case of a model with intercept, first diagonal of X'X |
dxkx1 |
Diagonals of X'X |
tau |
Elastic net parameter. Default is 1 |
lambda1 |
The value of lambda |
alpha |
Approximation to be used for absolute value. Default is 10^-6 |
tol |
Tolerance criterion. Default is 10^-6 |
maxiter |
Maximum number of iterations. Default is 10000 |
eps |
Value used to set beta to zero if -eps < beta < eps. Default is 10^-6 |
xbeta.old |
An n by 1 vector of xbeta values |
mu1 |
The value of mu at beta.old |
Details
This function is internal and used by the extlasso.binomial function. The user need not call this function.
Value
A list with the following components:
beta0.new |
Intercept estimate |
beta1.new |
Slope coefficient estimates |
conv |
"yes" means converged and "no" means did not converge |
iter |
Number of iterations to estimate the coefficients |
ofv.new |
Objective function value at solution |
xbeta.new |
xbeta values at solution |
mu1 |
Value of mu at solution |
Author(s)
B N Mandal and Jun Ma
Entire regularization path of penalized generalized linear model for binomial family using modified Jacobi Algorithm
Description
The function computes the coefficients of a penalized generalized linear model for the binomial family using a modified Jacobi algorithm for a sequence of lambda values. Currently the lasso and elastic net penalties are supported.
Usage
extlasso.binomial(x,y,intercept=TRUE,normalize=TRUE,tau=1,
alpha=1e-12,eps=1e-6,tol=1e-6,maxiter=1e5, nstep=100,min.lambda=1e-4)
Arguments
x |
x is a matrix of order n x p, where n is the number of observations and p is the number of predictor variables. Rows should represent observations and columns should represent predictor variables. |
y |
y is a vector of the response variable of order n x 1. y should follow the binomial distribution and should be a vector of 0s and 1s. |
intercept |
If TRUE, the model includes an intercept; otherwise it does not. |
normalize |
If TRUE, the columns of the x matrix are normalized to have mean 0 and norm 1 prior to fitting the model. The coefficients are returned on the original scale at the end. Default is normalize = TRUE. |
tau |
Elastic net mixing parameter with 0 ≤ tau ≤ 1; tau = 1 gives the lasso penalty and tau = 0 gives the ridge penalty. Default is tau = 1. |
alpha |
A small positive constant used in the smooth approximation of the absolute value function in the L1 penalty. Default is alpha = 1e-12. |
eps |
A threshold used to set a coefficient to zero when its value lies between -eps and +eps. Default is eps = 1e-6. |
tol |
Tolerance criterion for convergence of solutions. Default is tol = 1e-6. |
maxiter |
Maximum number of iterations allowed for solving the optimization problem for a particular lambda. Default is maxiter = 1e5; this rarely needs to be increased. |
nstep |
Number of steps from the maximum to the minimum value of lambda. Default is nstep = 100. |
min.lambda |
Minimum value of lambda. Default is min.lambda=1e-4. |
Value
An object of class ‘extlasso’ for which plot, predict and coef methods exist. The object has the following components:
beta0 |
A vector of order nstep of intercept estimates. Each value denotes an estimate for a particular lambda. The corresponding lambda values are available in the ‘lambdas’ element of the ‘extlasso’ object. |
coef |
A matrix of order nstep x p of slope estimates. Each row denotes the solution for a particular lambda. The corresponding lambda values are available in the ‘lambdas’ element of the ‘extlasso’ object. |
lambdas |
Sequence of lambda values for which coefficients are obtained |
L1norm |
L1norm of the coefficients |
norm.frac |
Fractions of norm computed as L1 norm at current lambda divided by maximum L1 norm |
lambda.iter |
Number of iterations used for different lambdas |
of.value |
Objective function values |
normx |
Norm of x variables |
Author(s)
B N Mandal and Jun Ma
References
Mandal, B.N. and Jun Ma, (2014). A Jacobi-Armijo Algorithm for LASSO and its Extensions.
Examples
x=matrix(rnorm(100*30),100,30)
y=sample(c(0,1),100,replace=TRUE)
g1=extlasso.binomial(x,y)
plot(g1)
plot(g1,xvar="lambda")
g1$of.value
Coefficients of penalized generalized linear models for a given lambda for normal family
Description
The function computes regression coefficients for a penalized generalized linear model at a given lambda value when the response variable follows the normal distribution.
Usage
extlasso.norm.lambda(n,p,p1,x,y,xpx,dxpx,xpy,beta.old,
tau,alpha,lambda1,tol,maxiter,eps,xbeta.old)
Arguments
n |
Number of observations |
p |
Number of predictors. |
p1 |
Number of active predictors |
x |
An n by p1 matrix of predictors. |
y |
A vector of n observations. |
xpx |
Matrix X'X |
dxpx |
Diagonals of X'X |
xpy |
Vector X'y |
beta.old |
A vector of initial values of beta. |
tau |
Elastic net parameter. Default is 1 |
alpha |
Approximation to be used for absolute value. Default is 10^-6. |
lambda1 |
The value of lambda |
tol |
Tolerance criterion. Default is 10^-6 |
maxiter |
Maximum number of iterations. Default is 10000. |
eps |
Value used to set beta to zero if -eps < beta < eps. Default is 10^-6 |
xbeta.old |
An n by 1 vector of xbeta values. |
Details
This function is internal and used by the extlasso.normal function. The user need not call this function.
Value
A list with the following components:
beta.new |
Coefficient estimates |
conv |
"yes" means converged and "no" means did not converge |
iter |
Number of iterations to estimate the coefficients |
ofv.new |
Objective function value at solution |
xbeta.new |
xbeta values at solution |
Author(s)
B N Mandal and Jun Ma
Entire regularization path of penalized generalized linear model for normal family using modified Jacobi Algorithm
Description
The function computes the coefficients of a penalized generalized linear model for the normal family using a modified Jacobi algorithm for a sequence of lambda values. Currently the lasso and elastic net penalties are supported.
Usage
extlasso.normal(x,y,intercept=TRUE,normalize=TRUE,tau=1,alpha=1e-12,
eps=1e-6,tol=1e-6,maxiter=1e5,nstep=100,min.lambda=1e-4)
Arguments
x |
x is a matrix of order n x p, where n is the number of observations and p is the number of predictor variables. Rows should represent observations and columns should represent predictor variables. |
y |
y is a vector of the response variable of order n x 1. y should follow the normal distribution. |
intercept |
If TRUE, the model includes an intercept; otherwise it does not. |
normalize |
If TRUE, the columns of the x matrix are normalized to have mean 0 and norm 1 prior to fitting the model. The coefficients are returned on the original scale at the end. Default is normalize = TRUE. |
tau |
Elastic net mixing parameter with 0 ≤ tau ≤ 1; tau = 1 gives the lasso penalty and tau = 0 gives the ridge penalty. Default is tau = 1. |
alpha |
A small positive constant used in the smooth approximation of the absolute value function in the L1 penalty. Default is alpha = 1e-12. |
eps |
A threshold used to set a coefficient to zero when its value lies between -eps and +eps. Default is eps = 1e-6. |
tol |
Tolerance criterion for convergence of solutions. Default is tol = 1e-6. |
maxiter |
Maximum number of iterations allowed for solving the optimization problem for a particular lambda. Default is maxiter = 1e5; this rarely needs to be increased. |
nstep |
Number of steps from the maximum to the minimum value of lambda. Default is nstep = 100. |
min.lambda |
Minimum value of lambda. Default is min.lambda=1e-4. |
Value
An object of class ‘extlasso’ for which plot, predict and coef methods exist. The object has the following components:
intercept |
Value of intercept: TRUE or FALSE as used in input |
coef |
A matrix of order nstep x p if intercept is FALSE, or nstep x (p+1) if intercept is TRUE with the first column containing the intercept estimates. Each row denotes the solution for a particular lambda. The corresponding lambda values are available in the ‘lambdas’ element of the ‘extlasso’ object. Here p is the number of predictor variables. |
lambdas |
Sequence of lambda values for which coefficients are obtained |
L1norm |
L1norm of the coefficients |
norm.frac |
Fractions of norm computed as L1 norm at current lambda divided by maximum L1 norm |
lambda.iter |
Number of iterations used for different lambdas |
of.value |
Objective function values |
normx |
Norm of x variables |
Author(s)
B N Mandal and Jun Ma
References
Mandal, B.N. and Jun Ma, (2014). A Jacobi-Armijo Algorithm for LASSO and its Extensions.
Examples
x=matrix(rnorm(100*30),100,30)
y=rnorm(100)
g1=extlasso.normal(x,y)
plot(g1)
plot(g1,xvar="lambda")
g1$of.value
Coefficients of penalized generalized linear models for a given lambda for Poisson family
Description
The function computes regression coefficients for a penalized generalized linear model at a given lambda value when the response variable follows the Poisson distribution.
Usage
extlasso.pois.lambda(n,p,p1,sumy,beta0.old,beta1.old,x,y,dxkx0,dxkx1,
tau,lambda1,alpha,tol,maxiter,eps,xbeta.old,mu1)
Arguments
n |
Number of observations |
p |
Number of predictors |
p1 |
Number of active predictors |
sumy |
Sum of y values |
beta0.old |
Initial value of intercept |
beta1.old |
A vector of initial values of slope coefficients |
x |
An n by p1 matrix of predictors |
y |
A vector of n observations |
dxkx0 |
In case of a model with intercept, first diagonal of X'X |
dxkx1 |
Diagonals of X'X |
tau |
Elastic net parameter. Default is 1 |
lambda1 |
The value of lambda |
alpha |
Approximation to be used for absolute value. Default is 10^-6 |
tol |
Tolerance criterion. Default is 10^-6 |
maxiter |
Maximum number of iterations. Default is 10000 |
eps |
Value used to set beta to zero if -eps < beta < eps. Default is 10^-6 |
xbeta.old |
An n by 1 vector of xbeta values |
mu1 |
The value of mu at beta.old |
Details
This function is internal and used by the extlasso.poisson function. The user need not call this function.
Value
A list with the following components:
beta0.new |
Intercept estimate |
beta1.new |
Slope coefficient estimates |
conv |
"yes" means converged and "no" means did not converge |
iter |
Number of iterations to estimate the coefficients |
ofv.new |
Objective function value at solution |
xbeta.new |
xbeta values at solution |
mu1 |
Value of mu at solution |
Author(s)
B N Mandal and Jun Ma
Entire regularization path of penalized generalized linear model for poisson family using modified Jacobi Algorithm
Description
The function computes the coefficients of a penalized generalized linear model for the poisson family using a modified Jacobi algorithm for a sequence of lambda values. Currently the lasso and elastic net penalties are supported.
Usage
extlasso.poisson(x,y,intercept=TRUE,normalize=TRUE,tau=1,alpha=1e-12,
eps=1e-6,tol=1e-6,maxiter=1e5,nstep=100,min.lambda=1e-4)
Arguments
x |
x is a matrix of order n x p, where n is the number of observations and p is the number of predictor variables. Rows should represent observations and columns should represent predictor variables. |
y |
y is a vector of the response variable of order n x 1. y should follow the poisson distribution and should be a vector of counts. |
intercept |
If TRUE, the model includes an intercept; otherwise it does not. |
normalize |
If TRUE, the columns of the x matrix are normalized to have mean 0 and norm 1 prior to fitting the model. The coefficients are returned on the original scale at the end. Default is normalize = TRUE. |
tau |
Elastic net mixing parameter with 0 ≤ tau ≤ 1; tau = 1 gives the lasso penalty and tau = 0 gives the ridge penalty. Default is tau = 1. |
alpha |
A small positive constant used in the smooth approximation of the absolute value function in the L1 penalty. Default is alpha = 1e-12. |
eps |
A threshold used to set a coefficient to zero when its value lies between -eps and +eps. Default is eps = 1e-6. |
tol |
Tolerance criterion for convergence of solutions. Default is tol = 1e-6. |
maxiter |
Maximum number of iterations allowed for solving the optimization problem for a particular lambda. Default is maxiter = 1e5; this rarely needs to be increased. |
nstep |
Number of steps from the maximum to the minimum value of lambda. Default is nstep = 100. |
min.lambda |
Minimum value of lambda. Default is min.lambda=1e-4. |
Value
An object of class ‘extlasso’ for which plot, predict and coef methods exist. The object has the following components:
beta0 |
A vector of order nstep of intercept estimates. Each value denotes an estimate for a particular lambda. The corresponding lambda values are available in the ‘lambdas’ element of the ‘extlasso’ object. |
coef |
A matrix of order nstep x p of slope coefficients. Each row denotes the solution for a particular lambda. The corresponding lambda values are available in the ‘lambdas’ element of the ‘extlasso’ object. |
lambdas |
Sequence of lambda values for which coefficients are obtained |
L1norm |
L1norm of the coefficients |
norm.frac |
Fractions of norm computed as L1 norm at current lambda divided by maximum L1 norm |
lambda.iter |
Number of iterations used for different lambdas |
of.value |
Objective function values |
normx |
Norm of x variables |
Author(s)
B N Mandal and Jun Ma
References
Mandal, B.N. and Jun Ma, (2014). A Jacobi-Armijo Algorithm for LASSO and its Extensions.
Examples
x=matrix(rnorm(100*30),100,30)
y=sample(c(1:5),100,replace=TRUE)
g1=extlasso.poisson(x,y)
plot(g1)
plot(g1,xvar="lambda")
g1$of.value
Coefficients of fused lasso penalized regression for a given pair of lambda1 and lambda2 values
Description
The function computes regression coefficients for a fused lasso penalized regression model for a given pair of lambda1 and lambda2 values.
Usage
fl.lambda(n,p,x,y,xpx,dxpx,xpy,beta.old,ofv.old,alpha,
lambda1,lambda2,tol,maxiter,eps,xbeta.old)
Arguments
n |
Number of observations |
p |
Number of predictors. |
x |
An n by l matrix of predictors, where n is the number of observations and l is the number of active variables. |
y |
a vector of n observations. |
xpx |
The X'X matrix |
dxpx |
A vector of order l of diagonal elements of x'x |
xpy |
A vector of order l containing x'y |
beta.old |
A vector of initial values of beta. Optional |
ofv.old |
Objective function value at beta.old |
alpha |
Approximation to be used for absolute value. Default is 10^-6. |
lambda1 |
The value of lambda1 |
lambda2 |
The value of lambda2 |
tol |
Tolerance criterion. Default is 10^-7 |
maxiter |
Maximum number of iterations. Default is 100000. |
eps |
Value for which beta is set to zero if -eps<beta<eps. Default is 10^-6 |
xbeta.old |
An n by 1 vector of xbeta values. Optional |
Details
This function is internal and used by the fusedlasso function. The user need not call this function.
Value
A list with the following components:
beta.new |
Coefficient estimates |
conv |
"yes" means converged and "no" means did not converge |
iter |
Number of iterations to estimate the coefficients |
ofv.new |
Objective function value at solution |
Author(s)
B N Mandal and Jun Ma
Particular fold of a data after k fold partition
Description
The function returns a particular fold after k-fold partitioning by the kfold function.
Usage
fold(data1,k,i)
Arguments
data1 |
A matrix. |
k |
Number of folds |
i |
The fold to be returned |
Details
This function is internal and used by cross validation routines.
Value
A matrix containing the specified fold.
Author(s)
B N Mandal and Jun Ma
Examples
data=matrix(rnorm(10*4),10,4)
kfold(data,3)
fold(data,3,2)
Fused lasso penalized linear regression
Description
The function computes coefficients of a fused lasso penalized linear regression model using a modified Jacobi gradient descent algorithm for a given pair of lambda1 and lambda2 values.
Usage
fusedlasso(x,y,lambda1,lambda2,intercept=TRUE,normalize=TRUE,
alpha=1e-6,eps=1e-6,tol=1e-8,maxiter=1e5)
Arguments
x |
x is a matrix of order n x p, where n is the number of observations and p is the number of predictor variables. Rows should represent observations and columns should represent predictor variables. |
y |
y is a vector of the response variable of order n x 1. |
lambda1 |
The value of lambda1 |
lambda2 |
The value of lambda2 |
intercept |
If TRUE, the model includes an intercept; otherwise it does not. |
normalize |
If TRUE, the columns of the x matrix are normalized to have mean 0 and norm 1 prior to fitting the model. The coefficients are returned on the original scale at the end. Default is normalize = TRUE. |
alpha |
A small positive constant used in the smooth approximation of the absolute value function in the penalty. Default is alpha = 1e-6. |
eps |
A threshold used to set a coefficient to zero when its value lies between -eps and +eps. Default is eps = 1e-6. |
tol |
Tolerance criterion for convergence of solutions. Default is tol = 1e-8. |
maxiter |
Maximum number of iterations allowed for solving the optimization problem. Default is maxiter = 1e5; this rarely needs to be increased. |
Value
An object of class ‘extlasso’ with the following components:
intercept |
Value of intercept: TRUE or FALSE as used in input |
coef |
A vector of order p+1 if intercept is TRUE, with the first element being the intercept estimate, or a vector of order p if intercept is FALSE. Here p is the number of predictor variables. |
lambda1 |
The value of lambda1 |
lambda2 |
The value of lambda2 |
L1norm |
L1norm of the coefficients |
lambda.iter |
Number of iterations |
of.value |
Objective function value |
Author(s)
B N Mandal and Jun Ma
References
Mandal, B.N. and Jun Ma, (2014). A Jacobi-Armijo Algorithm for LASSO and its Extensions.
Examples
n=50
p=100
rho=0
beta=rep(0,p)
beta[1:20]=1
beta[11:15]=2
beta[25]=3
beta[41:45]=1
x=matrix(rnorm(n*p),n,p)
y=x%*%beta+rnorm(n,0,0.5)
f1<-fusedlasso(x,y,lambda1=0.1,lambda2=1)
plot(beta,col="blue",type="b",pch=1,ylim=range(beta,f1$coef))
lines(f1$coef,type="b",lty=1,col="black")
legend("topright",pch=1,lty=1,merge=TRUE,text.col=c("blue","black"),legend=c("True","Fitted"))
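#In the usual fused lasso formulation lambda2 weights the differences between
#successive coefficients, so a larger lambda2 should give a flatter, more
#piecewise-constant profile. An illustrative sketch reusing x, y and f1 from
#above (f2 is an illustrative name, not part of the package).
f2<-fusedlasso(x,y,lambda1=0.1,lambda2=10)
plot(f1$coef,type="b",col="black",ylim=range(f1$coef,f2$coef))
lines(f2$coef,type="b",col="red")   #profile with stronger fusion penalty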
k-fold partition of data at random
Description
The function partitions a data set into k folds of equal sizes at random.
Usage
kfold(data1,k)
Arguments
data1 |
A matrix. |
k |
Number of folds |
Details
This function is internal and used by cross validation routines.
Value
A matrix with the fold identification in the first column.
Author(s)
B N Mandal and Jun Ma
Examples
data=matrix(rnorm(10*4),10,4)
kfold(data,3)
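#Because the fold label is returned in the first column, the fold sizes can be
#checked directly; a small illustrative sketch using the data matrix above.
d1=kfold(data,3)
table(d1[,1])   #number of observations assigned to each of the 3 folds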
Deviances for hold out data in cross validation
Description
The function computes the deviance for the hold-out data of the ith fold.
Usage
msefun.binomial(lambda1,f1,xi,yi)
Arguments
lambda1 |
value of lambda |
f1 |
A fitted ‘extlasso’ object |
xi |
Hold out data of predictor variables |
yi |
Hold out data of response variables |
Details
This function is internal and used by cross validation routines.
Value
A value of deviance
Author(s)
B N Mandal and Jun Ma
Prediction mean squared errors for hold out data in cross validation
Description
The function computes prediction mean squared errors for the hold-out data of the ith fold.
Usage
msefun.normal(lambda1,f1,xi,yi)
Arguments
lambda1 |
value of lambda |
f1 |
A fitted ‘extlasso’ object |
xi |
Hold out data of predictor variables |
yi |
Hold out data of response variables |
Details
This function is internal and used by cross validation routines.
Value
A value of prediction mean squared error
Author(s)
B N Mandal and Jun Ma
Deviances for hold out data in cross validation
Description
The function computes the deviance for the hold-out data of the ith fold.
Usage
msefun.poisson(lambda1,f1,xi,yi)
Arguments
lambda1 |
value of lambda |
f1 |
A fitted ‘extlasso’ object |
xi |
Hold out data of predictor variables |
yi |
Hold out data of response variables |
Details
This function is internal and used by cross validation routines.
Value
A value of deviance
Author(s)
B N Mandal and Jun Ma
Plot of regularization path
Description
Produces a plot of the entire regularization path from an ‘extlasso’ object obtained using the ‘extlasso’ function.
Usage
## S3 method for class 'extlasso'
plot(x,xvar=c("lambda","L1norm","fraction of norm"),...)
Arguments
x |
An ‘extlasso’ object obtained using the ‘extlasso’ function. |
xvar |
The variable for the x-axis: xvar="lambda" plots the regularization path against lambda, xvar="L1norm" against the L1 norm of the coefficients, and xvar="fraction of norm" against the fraction of the maximum L1 norm of the coefficients. Default is xvar="L1norm". |
... |
Optional graphical parameters passed to the matplot() function |
Value
A plot of the regularization path is produced.
Author(s)
B N Mandal and Jun Ma
References
Mandal, B.N. and Jun Ma, (2014). A Jacobi-Armijo Algorithm for LASSO and its Extensions.
Examples
x=matrix(rnorm(100*30),100,30)
y=rnorm(100)
g1=extlasso(x,y,family="normal")
plot(g1)
plot(g1,xvar="lambda")
x=matrix(rnorm(100*30),100,30)
y=sample(c(0,1),100,replace=TRUE)
g1=extlasso(x,y,family="binomial")
plot(g1)
plot(g1,xvar="lambda")
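#The remaining xvar option and the pass-through of graphical parameters to
#matplot() can be exercised as well; an illustrative sketch reusing g1 from
#the binomial example above.
plot(g1,xvar="fraction of norm",lty=1)   #path against fraction of maximum L1 norm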
Prediction of coefficients of a penalized linear regression or generalized linear models
Description
The function computes the estimated coefficient values at a given lambda, L1 norm or fraction of norm, using an ‘extlasso’ object obtained from the ‘extlasso’ function.
Usage
## S3 method for class 'extlasso'
predict(object,mode=c("fraction","norm","lambda"),at=0,...)
Arguments
object |
An ‘extlasso’ object obtained using the ‘extlasso’ function. |
mode |
If mode="lambda", prediction is made for a given lambda; if mode="norm", prediction is made for a given L1 norm; and if mode="fraction", prediction is made for a given fraction of norm. Default is mode="lambda". |
at |
A value at which prediction is to be made. Default is at = 0. |
... |
Not used. Other arguments to predict. |
Value
A vector of estimated coefficients of length p (if intercept=FALSE in the ‘extlasso’ object) or p+1 (if intercept=TRUE) at the given value of lambda, L1 norm or fraction of norm. Here p is the number of predictor variables.
Author(s)
B N Mandal and Jun Ma
References
Mandal, B.N. and Jun Ma, (2014). A Jacobi-Armijo Algorithm for LASSO and its Extensions.
Examples
x=matrix(rnorm(100*30),100,30)
y=sample(c(0,1),100,replace=TRUE)
g1=extlasso(x,y,family="binomial")
predict(g1,mode="lambda",at=0.1)
predict(g1,mode="L1norm",at=1)
predict(g1,mode="fraction",at=0.5)
x=matrix(rnorm(100*30),100,30)
y=rnorm(100)
g1=extlasso(x,y,family="normal")
predict(g1,mode="lambda",at=0.09)
predict(g1,mode="L1norm",at=0.6)
predict(g1,mode="fraction",at=0.8)