Showing 23 of 23 results
kisungyou
ADMM:Algorithms using Alternating Direction Method of Multipliers
Provides algorithms to solve popular optimization problems in statistics, such as regression or denoising, based on the Alternating Direction Method of Multipliers (ADMM). See Boyd et al. (2010) <doi:10.1561/2200000016> for a complete introduction to the method.
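For readers new to the method, the core ADMM iteration for the lasso (as described in the Boyd et al. reference above) can be sketched in a few lines of base R. This is a generic illustration of the algorithm, not the ADMM package's own interface:

# Generic ADMM sketch for the lasso: minimize 0.5*||A x - b||^2 + lambda*||x||_1
# (illustration of the method from Boyd et al. 2010, not the ADMM package API)
soft <- function(v, k) sign(v) * pmax(abs(v) - k, 0)   # soft-thresholding operator

admm_lasso <- function(A, b, lambda, rho = 1, iters = 200) {
  p <- ncol(A)
  x <- z <- u <- numeric(p)
  AtA_rho <- crossprod(A) + rho * diag(p)               # cached matrix for the x-update
  Atb <- crossprod(A, b)
  for (i in seq_len(iters)) {
    x <- solve(AtA_rho, Atb + rho * (z - u))            # x-update (ridge-type solve)
    z <- soft(x + u, lambda / rho)                      # z-update (proximal step for the L1 norm)
    u <- u + x - z                                      # dual (scaled multiplier) update
  }
  z
}

set.seed(1)
A <- matrix(rnorm(100 * 10), 100, 10)
b <- A %*% c(3, -2, rep(0, 8)) + rnorm(100)
round(admm_lasso(A, b, lambda = 10), 2)                 # recovers a sparse coefficient vector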
Maintained by Kisung You. Last updated 4 years ago.
77.5 match 6 stars 6.08 score 15 scripts 9 dependents
mgallow
ADMMsigma:Penalized Precision Matrix Estimation via ADMM
Estimates a penalized precision matrix via the alternating direction method of multipliers (ADMM) algorithm. It currently supports a general elastic-net penalty that allows for both ridge and lasso-type penalties as special cases. This package is an alternative to the 'glasso' package. See Boyd et al (2010) <doi:10.1561/2200000016> for details regarding the estimation method.
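A minimal usage sketch follows, assuming the package's main entry point is ADMMsigma() with elastic-net parameters lam and alpha as suggested by the description; consult the package documentation for the exact signature:

# Hypothetical sketch: estimate a penalized precision matrix with ADMMsigma.
# The function name ADMMsigma() and the lam/alpha arguments are assumptions;
# see ?ADMMsigma for the actual interface.
library(ADMMsigma)
set.seed(1)
X <- MASS::mvrnorm(n = 100, mu = rep(0, 5), Sigma = diag(5))  # toy Gaussian data
fit <- ADMMsigma(X, lam = 0.1, alpha = 0.5)                   # alpha = 1 lasso, alpha = 0 ridge
fit                                                           # inspect the estimated precision matrix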
Maintained by Matt Galloway. Last updated 7 years ago.
admm, covariance-matrix, glasso, lasso, precision-matrix, ridge, openblas, cpp
24.0 match 4 stars 4.86 score 12 scripts
osqp
osqp:Quadratic Programming Solver using the 'OSQP' Library
Provides bindings to the 'OSQP' solver. The 'OSQP' solver is a numerical optimization package for solving convex quadratic programs, written in 'C' and based on the alternating direction method of multipliers. See <doi:10.48550/arXiv.1711.08013> for details.
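A small quadratic program in the form min 0.5*x'Px + q'x subject to l <= Ax <= u can be solved roughly as follows (a minimal sketch; see the package documentation for full details):

# Minimal sketch of solving a small QP with osqp:
#   minimize 0.5 * x'Px + q'x   subject to   l <= Ax <= u
library(osqp)
P <- diag(2)                      # quadratic term (positive semidefinite)
q <- c(-1, -1)                    # linear term
A <- rbind(c(1, 1), diag(2))      # constraint matrix: x1 + x2 <= 1, x >= 0
l <- c(-Inf, 0, 0)                # lower bounds
u <- c(1, Inf, Inf)               # upper bounds
res <- solve_osqp(P, q, A, l, u, pars = osqpSettings(verbose = FALSE))
res$x                             # solution vector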
Maintained by Balasubramanian Narasimhan. Last updated 9 months ago.
admm, convex-optimization, lasso, machine-learning, operator-splitting, quadratic-programming, cpp
11.0 match 12 stars 8.21 score 47 scripts 65 dependents
egpivo
SpatPCA:Regularized Principal Component Analysis for Spatial Data
Provides regularized principal component analysis incorporating smoothness, sparseness and orthogonality of eigen-functions by using the alternating direction method of multipliers algorithm (Wang and Huang, 2017, <DOI:10.1080/10618600.2016.1157483>). The method can be applied to either regularly or irregularly spaced data, including 1D, 2D, and 3D.
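A rough usage sketch, assuming the main function is spatpca() taking a matrix of locations x and a data matrix Y; the actual argument names and return components may differ, so check the package help:

# Hypothetical sketch of regularized PCA for 1D spatial data with SpatPCA.
# The call spatpca(x, Y) is assumed from the package description; see
# ?spatpca for the actual interface.
library(SpatPCA)
set.seed(1)
locations <- matrix(seq(-5, 5, length.out = 50), ncol = 1)        # 50 sites on a line
pattern   <- exp(-locations[, 1]^2)
pattern   <- pattern / sqrt(sum(pattern^2))                       # unit-norm spatial pattern
Y <- rnorm(100) %o% pattern + matrix(rnorm(100 * 50), 100, 50)    # 100 noisy realizations
fit <- spatpca(x = locations, Y = Y)
str(fit, max.level = 1)                                           # inspect the fitted components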
Maintained by Wen-Ting Wang. Last updated 7 months ago.
admm, covariance-estimation, eigenfunctions, lasso, matrix-factorization, pca, rcpparmadillo, rcppparallel, regularization, spatial, spatial-data-analysis, splines, openblas, cpp, openmp
11.0 match 20 stars 5.53 score 17 scripts
cran
cqrReg:Quantile, Composite Quantile Regression and Regularized Versions
Estimates quantile regression (QR) and composite quantile regression (CQR) with an adaptive lasso penalty using interior point (IP), majorize-and-minimize (MM), coordinate descent (CD), and alternating direction method of multipliers (ADMM) algorithms.
Maintained by Jueyu Gao. Last updated 3 years ago.
21.9 match 5 stars 2.48 score 20 scripts 1 dependents
egpivo
SpatMCA:Regularized Spatial Maximum Covariance Analysis
Provides regularized maximum covariance analysis incorporating smoothness, sparseness and orthogonality of coupled patterns by using the alternating direction method of multipliers algorithm. The method can be applied to either regularly or irregularly spaced data, including 1D, 2D, and 3D (Wang and Huang, 2018 <doi:10.1002/env.2481>).
Maintained by Wen-Ting Wang. Last updated 7 months ago.
admm, cca, cross-covariance, lasso, matrix-factorization, rcpparmadillo, rcppparallel, splines, openblas, cpp, openmp
11.0 match 5 stars 3.40 score 4 scripts
egpivo
QuantRegGLasso:Adaptively Weighted Group Lasso for Semiparametric Quantile Regression Models
Implements an adaptively weighted group Lasso procedure for simultaneous variable selection and structure identification in varying coefficient quantile regression models and additive quantile regression models with ultra-high dimensional covariates. The methodology, grounded in a strong sparsity condition, establishes selection consistency under certain weight conditions. To address the challenge of tuning parameter selection in practice, a BIC-type criterion named high-dimensional information criterion (HDIC) is proposed. The Lasso procedure, guided by HDIC-determined tuning parameters, maintains selection consistency. Theoretical findings are strongly supported by simulation studies. (Toshio Honda, Ching-Kang Ing, Wei-Ying Wu, 2019, <DOI:10.3150/18-BEJ1091>).
Maintained by Wen-Ting Wang. Last updated 4 months ago.
admm, group-lasso, high-dimensional, quantile-regression, rcpp, rcpparmadillo, openblas, cpp
11.0 match 2 stars 3.30 score 2 scripts
wleoncio
MADMMplasso:Multi Variate Multi Response ADMM with Interaction Effects
This package allows one to model a multi-variate, multi-response problem with interaction effects. It combines the usual squared error loss for the multi-response problem with penalty terms that encourage correlated responses to form groups, and it allows for modeling main and interaction effects that exist within the covariates. The optimization method employed is the Alternating Direction Method of Multipliers (ADMM). The implementation is based on the methodology presented in Quachie Asenso, T., & Zucknick, M. (2023) <doi:10.48550/arXiv.2303.11155>.
Maintained by Waldir Leoncio. Last updated 2 months ago.
8.5 match 1.70 score
dakep
pense:Penalized Elastic Net S/MM-Estimator of Regression
Robust penalized (adaptive) elastic net S and M estimators for linear regression. The methods are proposed in Cohen Freue, G. V., Kepplinger, D., Salibián-Barrera, M., and Smucler, E. (2019) <https://projecteuclid.org/euclid.aoas/1574910036>. The package implements the extensions and algorithms described in Kepplinger, D. (2020) <doi:10.14288/1.0392915>.
Maintained by David Kepplinger. Last updated 8 months ago.
linear-regression, pense, regression, robust-regression, robust-statistics, openblas, cpp, openmp
1.8 match 4 stars 6.06 score 48 scripts
cran
TraceAssist:Nonparametric Trace Regression via Sign Series Representation
Efficient method for fitting a nonparametric matrix trace regression model. The detailed description can be found in C. Lee, L. Li, H. Zhang, and M. Wang (2021), Nonparametric Trace Regression via Sign Series Representation, <arXiv:2105.01783>. The method employs the aggregation of structured sign series for trace regression (ASSIST) algorithm.
Maintained by Chanwoo Lee. Last updated 4 years ago.
5.2 match 1.70 score
pbombina
admmDensestSubmatrix:Alternating Direction Method of Multipliers to Solve the Densest Submatrix Problem
Solves the problem of identifying the densest submatrix in a given or sampled binary matrix, Bombina et al. (2019) <arXiv:1904.03272>.
Maintained by Polina Bombina. Last updated 5 years ago.
3.0 match 2.70 score
bips-hb
CVN:Covariate-Varying Networks
Inferring high-dimensional Gaussian graphical networks that change with multiple discrete covariates. Louis Dijkstra, Arne Godt, Ronja Foraita (2024) <arXiv:2407.19978>.
Maintained by Ronja Foraita. Last updated 1 month ago.
graphical-models, high-dimensional-statistics, network-analysis, cpp
1.7 match 3.70 score 7 scripts
dppalomar
spectralGraphTopology:Learning Graphs from Data via Spectral Constraints
In the era of big data and hyperconnectivity, learning high-dimensional structures such as graphs from data has become a prominent task in machine learning and has found applications in many fields such as finance, health care, and networks. 'spectralGraphTopology' is an open source, documented, and well-tested R package for learning graphs from data. It provides implementations of state-of-the-art algorithms such as Combinatorial Graph Laplacian Learning (CGL), Spectral Graph Learning (SGL), Graph Estimation based on Majorization-Minimization (GLE-MM), and Graph Estimation based on Alternating Direction Method of Multipliers (GLE-ADMM). In addition, graph learning has been widely employed for clustering, for which specific algorithms are available in the literature. To this end, we provide an implementation of the Constrained Laplacian Rank (CLR) algorithm.
Maintained by Ze Vinicius. Last updated 2 years ago.
0.5 match 2 stars 5.91 score 135 scripts 1 dependents
cran
HSDiC:Homogeneity and Sparsity Detection Incorporating Prior Constraint Information
We explore sparsity and homogeneity of regression coefficients incorporating prior constraint information. A general pairwise fusion approach is proposed to deal with sparsity and homogeneity detection when combining prior convex constraints. We develop a modified alternating direction method of multipliers (ADMM) algorithm to obtain the estimators.
Maintained by Yaguang Li. Last updated 6 years ago.
2.2 match 1 stars 1.00 score 1 scripts
gumeo
accSDA:Accelerated Sparse Discriminant Analysis
Implementation of sparse linear discriminant analysis, which is a supervised classification method for multiple classes. Various novel optimization approaches to this problem are implemented including alternating direction method of multipliers ('ADMM'), proximal gradient (PG) and accelerated proximal gradient ('APG') (See Atkins 'et al'. <arXiv:1705.07194>). Functions for performing cross validation are also supplied along with basic prediction and plotting functions. Sparse zero variance discriminant analysis ('SZVD') is also included in the package (See Ames and Hong, <arXiv:1401.5492>). See the 'github' wiki for a more extended description.
Maintained by Gudmundur Einarsson. Last updated 1 year ago.
0.5 match 5 stars 3.40 score 10 scripts
chongwu-biostat
prclust:Penalized Regression-Based Clustering Method
Clustering is unsupervised and exploratory in nature, yet it can be performed through penalized regression with grouping pursuit. In this package, we provide two algorithms for fitting penalized regression-based clustering (PRclust) with non-convex grouping penalties, such as the group truncated lasso, MCP and SCAD. One algorithm is based on quadratic penalty and the difference convex method; another is based on difference convex and ADMM (DC-ADMM), which is more efficient. Generalized cross-validation and a stability-based method are provided to select the tuning parameters. The Rand index, adjusted Rand index and Jaccard index are provided to estimate the agreement between estimated cluster memberships and the truth.
Maintained by Chong Wu. Last updated 8 years ago.
0.5 match 2.70 score 6 scripts
maurobernardi
fdaSP:Sparse Functional Data Analysis Methods
Provides algorithms to fit linear regression models under several popular penalization techniques and functional linear regression models based on Majorizing-Minimizing (MM) and Alternating Direction Method of Multipliers (ADMM) techniques. See Boyd et al. (2010) <doi:10.1561/2200000016> for a complete introduction to the method.
Maintained by Mauro Bernardi. Last updated 1 year ago.
0.5 match 1.00 score
aaamini
sbmSDP:Semidefinite Programming for Fitting Block Models of Equal Block Sizes
An ADMM implementation of SDP-1, a semidefinite programming relaxation of the maximum likelihood estimator for fitting a block model. SDP-1 has a tendency to produce equal-sized blocks and is ideal for producing a form of network histogram approximating a nonparametric graphon model. Alternatively, it can be used for community detection. (This is experimental code, proceed with caution.)
Maintained by Arash A. Amini. Last updated 10 years ago.
0.5 match 1 stars 1.00 score 1 scripts
xylam
DWDLargeR:Fast Algorithms for Large Scale Generalized Distance Weighted Discrimination
Solves large-scale distance weighted discrimination problems. The main algorithm is a symmetric Gauss-Seidel based alternating direction method of multipliers (ADMM) method. See Lam, X.Y., Marron, J.S., Sun, D.F., and Toh, K.C. (2018) <doi:10.48550/arXiv.1604.05473> for more details.
Maintained by Xin-Yee Lam. Last updated 7 months ago.
0.5 match 1.00 score 2 scripts
cran
ddpca:Diagonally Dominant Principal Component Analysis
Efficient procedures for fitting the DD-PCA (Ke et al., 2019, <arXiv:1906.00051>) by decomposing a large covariance matrix into a low-rank matrix plus a diagonally dominant matrix. The implementation of DD-PCA includes the convex approach using the Alternating Direction Method of Multipliers (ADMM) and the non-convex approach using the iterative projection algorithm. Applications of DD-PCA to large covariance matrix estimation and global multiple testing are also included in this package.
Maintained by Fan Yang. Last updated 6 years ago.
0.5 match 1.00 score