Abstract:
Adaptive ridge is a special form of ridge regression,
balancing the quadratic penalization on each parameter of the
model. We show the equivalence between adaptive ridge and lasso
(least absolute shrinkage and selection operator), in the sense
that both procedures produce the same estimate. Least
absolute shrinkage can thus be viewed as a particular quadratic
penalization. From this observation, we derive a fixed-point
algorithm to compute the lasso solution. Finally, we present a
series of possible applications of this type of algorithm to
regression problems: kernel regression, additive modeling, and
neural net training.
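
To make the fixed-point idea concrete, here is a minimal numerical
sketch in Python, not the paper's exact algorithm: assuming the lasso
objective (1/2)||y - Xb||^2 + lam * sum_j |b_j|, each iteration solves
a ridge problem whose per-coefficient quadratic penalties are
rebalanced from the current estimate. The function name
lasso_fixed_point and all parameter names are our own illustrative
choices.

    import numpy as np

    def lasso_fixed_point(X, y, lam, n_iter=200, eps=1e-8):
        # Hypothetical sketch of a fixed-point (iteratively reweighted
        # ridge) scheme for the lasso, in the spirit of the adaptive
        # ridge / lasso equivalence; assumes the objective
        # (1/2)||y - Xb||^2 + lam * sum_j |b_j|.
        beta = np.linalg.lstsq(X, y, rcond=None)[0]  # least-squares start
        for _ in range(n_iter):
            # Rebalanced quadratic penalties: d_j = lam / |beta_j|,
            # floored at eps to avoid division by zero.
            d = lam / np.maximum(np.abs(beta), eps)
            # Ridge step with per-coefficient penalties d_j.
            beta_new = np.linalg.solve(X.T @ X + np.diag(d), X.T @ y)
            if np.max(np.abs(beta_new - beta)) < eps:
                beta = beta_new
                break
            beta = beta_new
        return beta

With this choice of weights, a fixed point of the iteration satisfies
the lasso stationarity condition X^T(Xb - y) + lam * sign(b_j) = 0 on
the nonzero coefficients, which is what motivates reading least
absolute shrinkage as a rebalanced quadratic penalization.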