quality and high-quality medicine. To meet this requirement, we need to select the effective features from the Traditional Chinese medicine fingerprint. Because the characteristics of the fingerprint are well suited to the methods described above, we apply LASSO to identify the quality of Traditional Chinese medicine from its fingerprint.
Figure 1: Traditional Chinese Medicine Fingerprint.
We treat each fingerprint as an observation \(x_i\); the value of the \(j\)-th peak is the value of the \(j\)-th feature \(x_{ij}\), and \(y_i\) is the medicine-type identifier (the label).
So the Traditional Chinese Medicine fingerprint model is

\[
\hat{\beta} = \arg\min_{\beta} \sum_{i=1}^{N} \Big( y_i - \sum_{j} \beta_j x_{ij} \Big)^{2}
\quad \text{subject to} \quad \sum_{j} |\beta_j| \le t.
\]
The model has the same form as the LASSO, so the solution methods for the LASSO are also available for this model.
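As a minimal sketch of how such a model could be fitted in practice, assuming scikit-learn's penalized form of the LASSO (equivalent to the constrained form above for a suitable \(t\)); the peak data below are synthetic and `alpha` is an arbitrary illustrative value, not a tuned choice:

```python
# Sketch of the fingerprint model with scikit-learn's Lasso.
# Assumptions: synthetic peak data; sklearn's penalized objective
# (1/(2N))||y - X beta||^2 + alpha * ||beta||_1, which corresponds to
# the constrained "subject to" form above for some t.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
N, p = 60, 20                         # N fingerprints, p chromatographic peaks
X = rng.normal(size=(N, p))           # x_ij: value of the j-th peak of fingerprint i
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]      # only a few peaks carry information
y = X @ beta_true + 0.1 * rng.normal(size=N)   # y_i: medicine-type identifier

model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_)         # peaks kept by the L1 penalty
print(selected)
```

The L1 constraint drives most coefficients exactly to zero, so `selected` recovers the few informative peaks rather than all twenty.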
4 CONCLUSIONS
LASSO is a powerful method for variable selection. However, it is not without weaknesses, and many techniques based on LASSO have been proposed to address them. The fused LASSO is among the most widely used, because it meets the demands of many practical problems; however, there is still no efficient computational method for the fused LASSO on complicated problems. The adaptive LASSO is a creative procedure that penalizes each coefficient with a different weight; the "oracle properties" are a good feature of the adaptive LASSO. The relaxed LASSO was proposed to overcome correlation among variables, which harms the prediction accuracy of a regression model. The solution of the group LASSO is sparse at the level of groups of variables. SCAD estimates are sparse, continuous, and unbiased. The Elastic Net, which combines the advantages of both ridge regression and LASSO, is another popular procedure.
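The adaptive LASSO idea above (a different penalty weight for each coefficient) can be sketched by rescaling columns with weights from an initial OLS estimate and then running an ordinary LASSO. This is an illustrative sketch, assuming scikit-learn, synthetic data, and an arbitrary `alpha`:

```python
# Sketch of the adaptive LASSO: penalty weight w_j = 1/|beta_ols_j|,
# encoded by rescaling column j of X before a plain Lasso fit.
# Assumptions: synthetic data; OLS as the initial estimator; alpha arbitrary.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(1)
N, p = 100, 8
X = rng.normal(size=(N, p))
beta_true = np.array([3.0, 0.0, 0.0, 1.5, 0.0, 0.0, 0.0, 0.0])
y = X @ beta_true + 0.2 * rng.normal(size=N)

ols = LinearRegression().fit(X, y)
w = 1.0 / np.abs(ols.coef_)           # large initial coefficient -> small penalty
lasso = Lasso(alpha=0.05).fit(X / w, y)   # column rescaling encodes the weights
beta_hat = lasso.coef_ / w                # map back to the original scale
selected = np.flatnonzero(beta_hat)
print(selected)
```

Because the weights penalize small initial coefficients heavily and large ones lightly, the truly relevant variables survive while the noise variables are driven to zero, which is the mechanism behind the oracle properties.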
REFERENCES
Breiman, L., 1995. Better Subset Regression Using the
Nonnegative Garrote. Technometrics, 37, 373-384.
Frank, I. E., Friedman, J. H., 1993. A Statistical View of Some Chemometrics Regression Tools. Technometrics, 35(2), 109-135.
Hoerl, A. E., Kennard, R. W., 1970. Ridge Regression: Applications to Nonorthogonal Problems. Technometrics, 12(1), 69-82.
Tibshirani, R., 1996. Regression shrinkage and selection
via the Lasso. J. Roy. Stat. Soc. B, 58, 267–288.
Fu, W. J., 1998. Penalized Regressions: The Bridge versus the Lasso. Journal of Computational and Graphical Statistics, 7(3), 397-416.
Efron, B., Hastie, T., Johnstone, I., & Tibshirani, R., 2004.
Least angle regression. Ann. Stat., 32, 407–499.
Zhao, P., Yu, B., 2007. Stagewise Lasso. Journal of Machine Learning Research, 8.
Tibshirani, R., Saunders, M., Rosset, S., Zhu, J., Knight, K., 2005. Sparsity and smoothness via the fused lasso. Journal of the Royal Statistical Society Series B, 67, 91-108.
Zou, H., 2006. The Adaptive Lasso and its Oracle
Properties. Journal of the American Statistical
Association, 101, 1418-1429.
Meinshausen, N., 2007. Relaxed Lasso. Computational Statistics and Data Analysis, 52, 374-393.
Yuan, M., Lin, Y., 2006. Model selection and estimation
in regression with grouped variables. Journal of the
Royal Statistical Society: Series B (Statistical
Methodology), 68(1), 49-67.
Zou, H., Hastie, T., 2005. Regularization and variable
selection via the elastic net. Journal of the Royal
Statistical Society Series B, 67, 301-320.
Peng, X., 2005. Variable Selection Methods and Their Applications in Quantitative Structure-Property Relationship (QSPR).
ICEIS 2011 - 13th International Conference on Enterprise Information Systems