Lasso is Equivalent to Adaptive Quadratic Penalization

Yves Grandvalet and Stéphane Canu

Abstract:
Adaptive ridge is a special form of ridge regression, balancing the quadratic penalization on each parameter of the model. We show the equivalence between adaptive ridge and lasso (least absolute shrinkage and selection operator). This equivalence states that both procedures produce the same estimate. Least absolute shrinkage can thus be viewed as a particular quadratic penalization. From this observation, we derive a fixed-point algorithm to compute the lasso solution. We finally present a series of possible applications of this type of algorithm in regression problems: kernel regression, additive modeling and neural net training.
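The abstract only names the fixed-point algorithm, so the sketch below is an illustrative reading of it rather than the paper's exact procedure: each iteration solves a ridge-type problem whose per-coefficient quadratic weights are recomputed from the current estimate (roughly lambda / |beta_j|), so that a fixed point satisfies the lasso optimality conditions. The function name lasso_via_adaptive_ridge and the parameters lam, n_iter and eps are hypothetical, chosen only for this example.

import numpy as np

def lasso_via_adaptive_ridge(X, y, lam, n_iter=100, eps=1e-8):
    # Illustrative fixed-point / iteratively reweighted ridge sketch for the
    # lasso objective (1/2)||y - X beta||^2 + lam * sum_j |beta_j|.
    # This is an assumed reading of the abstract, not the paper's exact algorithm.
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # start from ordinary least squares
    for _ in range(n_iter):
        # Adaptive quadratic weights: coefficients close to zero receive a very
        # large penalty and are driven toward zero (eps avoids division by zero).
        w = lam / np.maximum(np.abs(beta), eps)
        # One weighted ridge step: (X'X + diag(w)) beta = X'y.
        beta_new = np.linalg.solve(X.T @ X + np.diag(w), X.T @ y)
        if np.max(np.abs(beta_new - beta)) < 1e-10:
            beta = beta_new
            break
        beta = beta_new
    return beta

# Small synthetic check: a sparse coefficient vector should be approximately
# recovered, with the irrelevant coefficients shrunk essentially to zero.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
beta_true = np.zeros(10)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.1 * rng.standard_normal(50)
print(np.round(lasso_via_adaptive_ridge(X, y, lam=1.0), 3))

At a fixed point the weight on each nonzero coefficient equals lam * sign(beta_j) / beta_j, so the quadratic penalty term matches the lasso stationarity condition, which is the sense in which an adaptive quadratic penalization reproduces least absolute shrinkage here.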

 
 

