Open Access
Wuhan Univ. J. Nat. Sci.
Volume 27, Number 2, April 2022
Page(s) 128 - 134
Published online 20 May 2022
  1. Bickel P J, Levina E. Covariance regularization by thresholding [J]. The Annals of Statistics, 2008, 36(6): 2577-2604.
  2. Cai T T, Zhang C H, Zhou H H. Optimal rates of convergence for covariance matrix estimation [J]. The Annals of Statistics, 2010, 38(4): 2118-2144.
  3. Rothman A J, Levina E, Zhu J. Generalized thresholding of large covariance matrices [J]. Journal of the American Statistical Association, 2009, 104(485): 177-186.
  4. Xue L Z, Ma S Q, Zou H. Positive-definite ℓ1-penalized estimation of large covariance matrices [J]. Journal of the American Statistical Association, 2012, 107(500): 1480-1491.
  5. Friedman J, Hastie T, Tibshirani R. Sparse inverse covariance estimation with the graphical lasso [J]. Biostatistics, 2008, 9(3): 432-441.
  6. Rothman A J. Positive definite estimators of large covariance matrices [J]. Biometrika, 2012, 99(3): 733-740.
  7. Kang X N, Deng X W. On variable ordination of Cholesky-based estimation for a sparse covariance matrix [J]. The Canadian Journal of Statistics, 2021, 49(2): 283-310.
  8. Choi Y G, Lim J, Roy A, et al. Fixed support positive-definite modification of covariance matrix estimators via linear shrinkage [J]. Journal of Multivariate Analysis, 2019, 171: 234-249.
  9. Zhang B, Zhou J, Li J. Improved covariance matrix estimators by multi-penalty regularization [C]// 2019 22nd International Conference on Information Fusion (FUSION). Washington D C: IEEE, 2019: 1-7.
  10. Friedman J, Hastie T, Höfling H, et al. Pathwise coordinate optimization [J]. The Annals of Applied Statistics, 2007, 1(2): 302-332.
  11. Wu T T, Lange K. Coordinate descent algorithms for lasso penalized regression [J]. The Annals of Applied Statistics, 2008, 2(1): 224-244.
  12. Hu Y, Chi E C, Allen G I. ADMM algorithmic regularization paths for sparse statistical machine learning [C]// Splitting Methods in Communication, Imaging, Science, and Engineering. Berlin: Springer-Verlag, 2016: 433-459.
  13. Wahlberg B, Boyd S, Annergren M, et al. An ADMM algorithm for a class of total variation regularized estimation problems [J]. IFAC Proceedings Volumes, 2012, 45(16): 83-88.
  14. Mohan K, Chung M, Han S, et al. Structured learning of Gaussian graphical models [J]. Advances in Neural Information Processing Systems, 2012: 629-637.
  15. Mohan K, London P, Fazel M, et al. Node-based learning of multiple Gaussian graphical models [J]. Journal of Machine Learning Research, 2014, 15(1): 445-488.
  16. Chi E C, Lange K. Splitting methods for convex clustering [J]. Journal of Computational and Graphical Statistics, 2015, 24(4): 994-1013.
  17. Boyd S, Parikh N, Chu E, et al. Distributed optimization and statistical learning via the alternating direction method of multipliers [J]. Foundations and Trends in Machine Learning, 2011, 3(1): 1-122.
  18. He B S, Yang H, Wang S L. Alternating direction method with self-adaptive penalty parameters for monotone variational inequalities [J]. Journal of Optimization Theory and Applications, 2000, 106(2): 337-356.