Showing 33 of 33 results
rcppcore
RcppArmadillo: 'Rcpp' Integration for the 'Armadillo' Templated Linear Algebra Library
'Armadillo' is a templated C++ linear algebra library (by Conrad Sanderson) that aims towards a good balance between speed and ease of use. Integer, floating point and complex numbers are supported, as well as a subset of trigonometric and statistics functions. Various matrix decompositions are provided through optional integration with LAPACK and ATLAS libraries. The 'RcppArmadillo' package includes the header files from the templated 'Armadillo' library. Thus users do not need to install 'Armadillo' itself in order to use 'RcppArmadillo'. From release 7.800.0 on, 'Armadillo' is licensed under Apache License 2; previous releases were licensed under MPL 2.0 from version 3.800.0 onwards and under LGPL-3 prior to that; 'RcppArmadillo' (the 'Rcpp' bindings/bridge to Armadillo) is licensed under the GNU GPL version 2 or later, as is the rest of 'Rcpp'.
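A minimal sketch of the header-only workflow described above, assuming only the standard 'Rcpp' attributes interface; the function and data below are illustrative:

```r
library(Rcpp)

cppFunction(depends = "RcppArmadillo", code = '
  arma::vec ols_fitted(const arma::mat& X, const arma::vec& y) {
    // Least-squares fit via Armadillo, then return the fitted values X * beta
    arma::vec beta = arma::solve(X, y);
    return X * beta;
  }
')

X <- cbind(1, matrix(rnorm(40), ncol = 2))   # 20 x 3 design matrix
y <- rnorm(20)
head(ols_fitted(X, y))
```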
Maintained by Dirk Eddelbuettel. Last updated 1 day ago.
armadillo, c-plus-plus, rcpp, rcpparmadillo, openblas, cpp, openmp
200 stars 18.85 score 1.9k scripts 3.4k dependents
covaruber
sommer: Solving Mixed Model Equations in R
Structural multivariate-univariate linear mixed model solver for estimation of multiple random effects with unknown variance-covariance structures (e.g., heterogeneous and unstructured) and known covariance among levels of random effects (e.g., pedigree and genomic relationship matrices) (Covarrubias-Pazaran, 2016 <doi:10.1371/journal.pone.0156744>; Maier et al., 2015 <doi:10.1016/j.ajhg.2014.12.006>; Jensen et al., 1997). REML estimates can be obtained using the Direct-Inversion Newton-Raphson and Direct-Inversion Average Information algorithms for the problems r x r (r being the number of records) or using the Henderson-based average information algorithm for the problem c x c (c being the number of coefficients to estimate). Spatial models can also be fitted using the two-dimensional spline functionality available.
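A hedged usage sketch, assuming only the mmer() solver and its fixed/random/data arguments; the data frame is simulated for illustration:

```r
library(sommer)

# Simulated one-way data: 30 genotypes, 4 replicates each
geno_effect <- rep(rnorm(30, sd = 2), each = 4)
dat <- data.frame(
  genotype = factor(rep(paste0("g", 1:30), each = 4)),
  yield    = 10 + geno_effect + rnorm(120)
)

# Intercept-only fixed part, one random genotype effect estimated by REML
fit <- mmer(fixed = yield ~ 1, random = ~ genotype, data = dat)
summary(fit)
```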
Maintained by Giovanny Covarrubias-Pazaran. Last updated 3 days ago.
average-information, mixed-models, rcpparmadillo, openblas, cpp, openmp
44 stars 12.63 score 300 scripts 10 dependents
mlampros
ClusterR: Gaussian Mixture Models, K-Means, Mini-Batch-Kmeans, K-Medoids and Affinity Propagation Clustering
Gaussian mixture models, k-means, mini-batch-kmeans, k-medoids and affinity propagation clustering with the option to plot, validate, predict (new data) and estimate the optimal number of clusters. The package takes advantage of 'RcppArmadillo' to speed up the computationally intensive parts of the functions. For more information, see (i) "Clustering in an Object-Oriented Environment" by Anja Struyf, Mia Hubert, Peter Rousseeuw (1997), Journal of Statistical Software, <doi:10.18637/jss.v001.i04>; (ii) "Web-scale k-means clustering" by D. Sculley (2010), ACM Digital Library, <doi:10.1145/1772690.1772862>; (iii) "Armadillo: a template-based C++ library for linear algebra" by Sanderson et al (2016), The Journal of Open Source Software, <doi:10.21105/joss.00026>; (iv) "Clustering by Passing Messages Between Data Points" by Brendan J. Frey and Delbert Dueck, Science 16 Feb 2007: Vol. 315, Issue 5814, pp. 972-976, <doi:10.1126/science.1136800>.
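A brief sketch of the k-means interface, assuming only the KMeans_rcpp() function and its clusters/num_init/max_iters arguments:

```r
library(ClusterR)

dat <- as.matrix(iris[, 1:4])

# k-means with multiple random restarts; the result holds cluster assignments
km <- KMeans_rcpp(dat, clusters = 3, num_init = 5, max_iters = 100)
table(km$clusters, iris$Species)
```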
Maintained by Lampros Mouselimis. Last updated 9 months ago.
affinity-propagation, cpp11, gmm, kmeans, kmedoids-clustering, mini-batch-kmeans, rcpparmadillo, openblas, cpp, openmp
84 stars 11.08 score 640 scripts 24 dependents
norskregnesentral
shapr: Prediction Explanation with Dependence-Aware Shapley Values
Complex machine learning models are often hard to interpret. However, in many situations it is crucial to understand and explain why a model made a specific prediction. Shapley values are the only prediction explanation framework with a solid theoretical foundation. Previously known methods for estimating Shapley values do, however, assume feature independence. This package implements methods which account for feature dependence, and thereby produce more accurate estimates of the true Shapley values. An accompanying 'Python' wrapper ('shaprpy') is available through the GitHub repository.
Maintained by Martin Jullum. Last updated 2 days ago.
explainable-ai, explainable-ml, rcpp, rcpparmadillo, shapley, openblas, cpp, openmp
154 stars 10.59 score 175 scripts 1 dependents
bioc
BASiCS: Bayesian Analysis of Single-Cell Sequencing data
Single-cell mRNA sequencing can uncover novel cell-to-cell heterogeneity in gene expression levels in seemingly homogeneous populations of cells. However, these experiments are prone to high levels of technical noise, creating new challenges for identifying genes that show genuine heterogeneous expression within the population of cells under study. BASiCS (Bayesian Analysis of Single-Cell Sequencing data) is an integrated Bayesian hierarchical model to perform statistical analyses of single-cell RNA sequencing datasets in the context of supervised experiments (where the groups of cells of interest are known a priori, e.g. experimental conditions or cell types). BASiCS performs built-in data normalisation (global scaling) and technical noise quantification (based on spike-in genes). BASiCS provides an intuitive detection criterion for highly (or lowly) variable genes within a single group of cells. Additionally, BASiCS can compare gene expression patterns between two or more pre-specified groups of cells. Unlike traditional differential expression tools, BASiCS quantifies changes in expression that lie beyond comparisons of means, also allowing the study of changes in cell-to-cell heterogeneity. The latter can be quantified via a biological over-dispersion parameter that measures the excess of variability that is observed with respect to Poisson sampling noise, after normalisation and technical noise removal. Due to the strong mean/over-dispersion confounding that is typically observed for scRNA-seq datasets, BASiCS also tests for changes in residual over-dispersion, defined by residual values with respect to a global mean/over-dispersion trend.
Maintained by Catalina Vallejos. Last updated 5 months ago.
immunooncology, normalization, sequencing, rnaseq, software, geneexpression, transcriptomics, singlecell, differentialexpression, bayesian, cellbiology, bioconductor-package, gene-expression, rcpp, rcpparmadillo, scrna-seq, single-cell, openblas, cpp, openmp
83 stars 10.14 score 368 scripts 1 dependents
hypertidy
fasterize: Fast Polygon to Raster Conversion
Provides a drop-in replacement for rasterize() from the 'raster' package that takes polygon vector or data frame objects, and is much faster. There is support for the main options provided by the rasterize() function, including setting the field used and the background value, and options for aggregating multi-layer rasters. Uses the scan line algorithm attributed to Wylie et al. (1967) <doi:10.1145/1465611.1465619>. Note that the repository was originally hosted at 'GitHub' 'ecohealthalliance/fasterize' but was migrated to 'hypertidy/fasterize' in March 2025, and can be found indexed on 'R-universe' <https://cran.r-universe.dev/fasterize>.
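A small sketch of the drop-in usage, assuming only fasterize() with its field/background arguments plus 'sf' and 'raster' for constructing the inputs:

```r
library(sf)
library(raster)
library(fasterize)

# Two unit-square polygons carrying a numeric attribute to burn into the raster
polys <- st_sf(
  value = c(1, 2),
  geometry = st_sfc(
    st_polygon(list(rbind(c(0, 0), c(1, 0), c(1, 1), c(0, 1), c(0, 0)))),
    st_polygon(list(rbind(c(1, 0), c(2, 0), c(2, 1), c(1, 1), c(1, 0))))
  )
)

template <- raster(xmn = 0, xmx = 2, ymn = 0, ymx = 1, resolution = 0.1)
r <- fasterize(polys, template, field = "value", background = 0)
plot(r)
```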
Maintained by Michael Sumner. Last updated 21 days ago.
raster, rcpp, rcpparmadillo, sf, spatial, cpp
182 stars 10.05 score 14 dependents
mlampros
OpenImageR: An Image Processing Toolkit
Incorporates functions for image preprocessing, filtering and image recognition. The package takes advantage of 'RcppArmadillo' to speed up computationally intensive functions. The histogram of oriented gradients descriptor is a modification of the 'findHOGFeatures' function of the 'SimpleCV' computer vision platform, the average_hash(), dhash() and phash() functions are based on the 'ImageHash' python library. The Gabor Feature Extraction functions are based on 'Matlab' code of the paper, "CloudID: Trustworthy cloud-based and cross-enterprise biometric identification" by M. Haghighat, S. Zonouz, M. Abdel-Mottaleb, Expert Systems with Applications, vol. 42, no. 21, pp. 7905-7916, 2015, <doi:10.1016/j.eswa.2015.06.025>. The 'SLIC' and 'SLICO' superpixel algorithms were explained in detail in (i) "SLIC Superpixels Compared to State-of-the-art Superpixel Methods", Radhakrishna Achanta, Appu Shaji, Kevin Smith, Aurelien Lucchi, Pascal Fua, and Sabine Suesstrunk, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 34, num. 11, p. 2274-2282, May 2012, <doi:10.1109/TPAMI.2012.120> and (ii) "SLIC Superpixels", Radhakrishna Achanta, Appu Shaji, Kevin Smith, Aurelien Lucchi, Pascal Fua, and Sabine Suesstrunk, EPFL Technical Report no. 149300, June 2010.
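A minimal sketch of one descriptor function, assuming only HOG() with its cells/orientations arguments; the input is a synthetic grayscale matrix:

```r
library(OpenImageR)

# Synthetic 64 x 64 grayscale "image" with values in [0, 1]
img <- matrix(runif(64 * 64), nrow = 64, ncol = 64)

descriptor <- HOG(img, cells = 4, orientations = 6)
length(descriptor)   # length of the concatenated HOG feature vector
```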
Maintained by Lampros Mouselimis. Last updated 2 years ago.
filtering, gabor-feature-extraction, gabor-filters, hog-features, image, image-hashing, processing, rcpparmadillo, recognition, slic, slico, superpixels, openblas, cpp, openmp
60 stars 9.86 score 358 scripts 8 dependents
mlampros
KernelKnn: Kernel k Nearest Neighbors
Extends the simple k-nearest neighbors algorithm by incorporating numerous kernel functions and a variety of distance metrics. The package takes advantage of 'RcppArmadillo' to speed up the calculation of distances between observations.
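A hedged classification sketch, assuming only the KernelKnn() function with its data/TEST_data/y/k/method/weights_function/Levels arguments:

```r
library(KernelKnn)

X <- as.matrix(iris[, 1:4])
y <- as.numeric(iris$Species)        # class labels coded 1, 2, 3
idx <- sample(nrow(X), 100)          # simple train / test split

probs <- KernelKnn(data = X[idx, ], TEST_data = X[-idx, ], y = y[idx],
                   k = 5, method = "euclidean",
                   weights_function = "gaussian",
                   regression = FALSE, Levels = 1:3)
head(probs)                          # class probabilities for held-out rows
```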
Maintained by Lampros Mouselimis. Last updated 2 years ago.
cpp11, distance-metric, kernel-methods, knn, rcpparmadillo, openblas, cpp, openmp
17 stars 9.16 score 54 scripts 13 dependents
smac-group
simts: Time Series Analysis Tools
A system of easy-to-use tools supporting time series analysis courses. In particular, it incorporates a technique called the Generalized Method of Wavelet Moments (GMWM), as well as its robust implementation, for fast and robust parameter estimation of time series models, as described, for example, in Guerrier et al. (2013) <doi:10.1080/01621459.2013.799920>. More details can also be found in the paper linked to via the URL below.
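A short sketch of the simulation side, assuming only the gen_gts() generator and the AR1()/WN() model constructors:

```r
library(simts)

# Latent AR(1) process observed with additive white noise
model <- AR1(phi = 0.9, sigma2 = 1) + WN(sigma2 = 0.25)

xt <- gen_gts(n = 1000, model = model)
plot(xt)
```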
Maintained by Stéphane Guerrier. Last updated 2 years ago.
rcpp, rcpparmadillo, simulation, time-series, timeseries, timeseries-data, openblas, cpp
15 stars 7.68 score 59 scripts 4 dependents
coatless-rpkg
RcppEnsmallen: Header-Only C++ Mathematical Optimization Library for 'Armadillo'
'Ensmallen' is a templated C++ mathematical optimization library (by the 'MLPACK' team) that provides a simple set of abstractions for writing an objective function to optimize. Provided within are various standard and cutting-edge optimizers that include full-batch gradient descent techniques, small-batch techniques, gradient-free optimizers, and constrained optimization. The 'RcppEnsmallen' package includes the header files from the 'Ensmallen' library and pairs the appropriate header files from 'armadillo' through the 'RcppArmadillo' package. Therefore, users do not need to install 'Ensmallen' nor 'Armadillo' to use 'RcppEnsmallen'. Note that 'Ensmallen' is licensed under 3-Clause BSD, 'Armadillo' starting from 7.800.0 is licensed under Apache License 2, 'RcppArmadillo' (the 'Rcpp' bindings/bridge to 'Armadillo') is licensed under the GNU GPL version 2 or later. Thus, 'RcppEnsmallen' is also licensed under similar terms. Note that 'Ensmallen' requires a compiler that supports 'C++14' and 'Armadillo' 10.8.2 or later.
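A hedged sketch of the intended pattern, assuming the usual 'ensmallen' interface of an objective class exposing EvaluateWithGradient() and the ens::L_BFGS optimizer; all other names are illustrative:

```r
Rcpp::sourceCpp(code = '
// [[Rcpp::plugins(cpp14)]]
// [[Rcpp::depends(RcppArmadillo, RcppEnsmallen)]]
#include <RcppEnsmallen.h>

// Least-squares objective f(b) = ||y - X b||^2 with its analytic gradient.
class LinRegFn {
 public:
  LinRegFn(const arma::mat& X, const arma::vec& y) : X_(X), y_(y) {}
  double EvaluateWithGradient(const arma::mat& b, arma::mat& g) {
    const arma::vec r = y_ - X_ * b;
    g = -2.0 * X_.t() * r;
    return arma::dot(r, r);
  }
 private:
  const arma::mat& X_;
  const arma::vec& y_;
};

// [[Rcpp::export]]
arma::vec lbfgs_lm(const arma::mat& X, const arma::vec& y) {
  LinRegFn fn(X, y);
  ens::L_BFGS opt;                                // quasi-Newton optimizer
  arma::mat b = arma::zeros<arma::mat>(X.n_cols, 1);
  opt.Optimize(fn, b);
  return b.col(0);
}
')

X <- cbind(1, rnorm(50))
y <- drop(X %*% c(2, -1)) + rnorm(50, sd = 0.1)
lbfgs_lm(X, y)   # coefficients should land near c(2, -1)
```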
Maintained by James Joseph Balamuta. Last updated 4 months ago.
armadillo, cpp11, ensmallen, optimization, rcpp, rcpparmadillo, openblas, cpp, openmp
31 stars 7.67 score 1 scripts 14 dependents
mlampros
textTinyR: Text Processing for Small or Big Data Files
It offers functions for splitting, parsing, tokenizing and creating a vocabulary for big text data files. Moreover, it includes functions for building a document-term matrix and extracting information from it (term associations, most frequent terms). It also includes functions for calculating token statistics (collocations, look-up tables, string dissimilarities) and functions to work with sparse matrices. Lastly, it includes functions for Word Vector Representations (i.e. 'GloVe', 'fasttext') and incorporates functions for the calculation of (pairwise) text document dissimilarities. The source code is based on 'C++11' and exported to R through the 'Rcpp', 'RcppArmadillo' and 'BH' packages.
Maintained by Lampros Mouselimis. Last updated 1 year ago.
bh, boost, cpp11, processing, rcpp, rcpparmadillo, text, openblas, cpp, openmp
39 stars 7.64 score 244 scripts 1 dependents
mlampros
elmNNRcpp: The Extreme Learning Machine Algorithm
Training and predict functions for Single Hidden-layer Feedforward Neural Networks (SLFN) using the Extreme Learning Machine (ELM) algorithm. The ELM algorithm differs from traditional gradient-based algorithms in its very short training times (it does not require any iterative tuning, which makes learning very fast), and there is no need to set other parameters such as learning rate, momentum, or number of epochs. This is a reimplementation of the 'elmNN' package using 'RcppArmadillo' after the 'elmNN' package was archived. For more information, see "Extreme learning machine: Theory and applications" by Guang-Bin Huang, Qin-Yu Zhu, Chee-Kheong Siew (2006), Elsevier B.V, <doi:10.1016/j.neucom.2005.12.126>.
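A minimal regression sketch, assuming only elm_train()/elm_predict() with the x/y/nhid/actfun arguments; note that y is passed as a one-column matrix:

```r
library(elmNNRcpp)

x <- as.matrix(scale(mtcars[, c("wt", "hp", "disp")]))
y <- matrix(mtcars$mpg, ncol = 1)

fit   <- elm_train(x, y, nhid = 20, actfun = "relu")
preds <- elm_predict(fit, newdata = x)
cor(preds, y)   # in-sample fit of the single-hidden-layer network
```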
Maintained by Lampros Mouselimis. Last updated 2 years ago.
armadillo, elm, extreme-learning-machine, rcpparmadillo, openblas, cpp, openmp
14 stars 7.06 score 39 scripts 7 dependents
jmgirard
circumplex: Analysis and Visualization of Circular Data
Circumplex models, which organize constructs in a circle around two underlying dimensions, are popular for studying interpersonal functioning, mood/affect, and vocational preferences/environments. This package provides tools for analyzing and visualizing circular data, including scoring functions for relevant instruments and a generalization of the bootstrapped structural summary method from Zimmermann & Wright (2017) <doi:10.1177/1073191115621795> and functions for creating publication-ready tables and figures from the results.
Maintained by Jeffrey Girard. Last updated 5 months ago.
circular, circumplex, data-analysis, ggplot2, interpersonal, psychology, rcpparmadillo, tidyverse, openblas, cpp, openmp
11 stars 6.54 score 52 scripts
polkas
miceFast: Fast Imputations Using 'Rcpp' and 'Armadillo'
Fast imputations under the object-oriented programming paradigm. A few functions are also offered for working with popular R packages such as 'data.table' or 'dplyr'. The biggest improvement in time performance is achieved for calculations where a grouping variable has to be used. A single evaluation of a quantitative model for the multiple imputations is another major enhancement. A further major improvement is one of the fastest predictive mean matching implementations in R, thanks to presorting and binary search.
Maintained by Maciej Nasinski. Last updated 2 months ago.
cpp, fast, fast-imputations, grouping, imputation, imputations, matrix, mro, multiple-imputation, rcpp, rcpparmadillo, vif, weighting, openblas, cpp, openmp
20 stars 5.94 score 29 scripts
mlampros
VMDecomp: Variational Mode Decomposition
'RcppArmadillo' implementation for the Matlab code of the 'Variational Mode Decomposition' and 'Two-Dimensional Variational Mode Decomposition'. For more information, see (i) 'Variational Mode Decomposition' by K. Dragomiretskiy and D. Zosso in IEEE Transactions on Signal Processing, vol. 62, no. 3, pp. 531-544, Feb.1, 2014, <doi:10.1109/TSP.2013.2288675>; (ii) 'Two-Dimensional Variational Mode Decomposition' by Dragomiretskiy, K., Zosso, D. (2015), In: Tai, XC., Bae, E., Chan, T.F., Lysaker, M. (eds) Energy Minimization Methods in Computer Vision and Pattern Recognition. EMMCVPR 2015. Lecture Notes in Computer Science, vol 8932. Springer, <doi:10.1007/978-3-319-14612-6_15>.
Maintained by Lampros Mouselimis. Last updated 2 years ago.
rcpparmadillo, variational-mode-decomposition, openblas, cpp, openmp
8 stars 5.78 score 1 scripts 5 dependents
tmsalab
hmcdm: Hidden Markov Cognitive Diagnosis Models for Learning
Fitting hidden Markov models of learning under the cognitive diagnosis framework. Supports estimation of the hidden Markov diagnostic classification model, the first-order hidden Markov model, the reduced-reparameterized unified learning model, and the joint learning model for responses and response times.
Maintained by Sunbeom Kwon. Last updated 2 years ago.
cognitive-diagnostic-models, psychometrics, rcpp, rcpparmadillo, openblas, cpp, openmp
7 stars 5.70 score 12 scripts
egpivo
SpatPCA: Regularized Principal Component Analysis for Spatial Data
Provides regularized principal component analysis incorporating smoothness, sparseness and orthogonality of eigen-functions by using the alternating direction method of multipliers algorithm (Wang and Huang, 2017, <DOI:10.1080/10618600.2016.1157483>). The method can be applied to either regularly or irregularly spaced data, including 1D, 2D, and 3D.
Maintained by Wen-Ting Wang. Last updated 7 months ago.
admm, covariance-estimation, eigenfunctions, lasso, matrix-factorization, pca, rcpparmadillo, rcppparallel, regularization, spatial, spatial-data-analysis, splines, openblas, cpp, openmp
20 stars 5.53 score 17 scripts
uscbiostats
aphylo: Statistical Inference and Prediction of Annotations in Phylogenetic Trees
Implements a parsimonious evolutionary model to analyze and predict gene-functional annotations in phylogenetic trees as described in Vega Yon et al. (2021) <doi:10.1371/journal.pcbi.1007948>. Focusing on computational efficiency, 'aphylo' makes it possible to estimate pooled phylogenetic models, including thousands (hundreds) of annotations (trees) in the same run. The package also provides the tools for visualization of annotated phylogenies, calculation of posterior probabilities (prediction) and goodness-of-fit assessment featured in Vega Yon et al. (2021).
Maintained by George Vega Yon. Last updated 1 year ago.
annotations, inference, phylogenetics, rcpparmadillo, cpp
6 stars 5.49 score 104 scripts
coatless-rpkg
rgen: Random Sampling Distribution C++ Routines for Armadillo
Provides C++ routines for popular sampling distributions based on 'Armadillo', through a header-file approach.
Maintained by James Joseph Balamuta. Last updated 1 year ago.
armadillo, random-distributions, rcpp, rcpparmadillo
4 stars 5.38 score 1 scripts 4 dependents
tmsalab
cIRT: Choice Item Response Theory
Jointly model the accuracy of cognitive responses and item choices within a Bayesian hierarchical framework as described by Culpepper and Balamuta (2015) <doi:10.1007/s11336-015-9484-7>. In addition, the package contains the datasets used within the analysis of the paper.
Maintained by James Joseph Balamuta. Last updated 3 years ago.
armadillo, bayesian, choice, cognitive-diagnostic-models, gibbs-sampling, item-response-theory, rcpparmadillo, openblas, cpp, openmp
4 stars 5.14 score 23 scripts
tmsalab
simcdm: Simulate Cognitive Diagnostic Model ('CDM') Data
Provides efficient R and 'C++' routines to simulate cognitive diagnostic model data for Deterministic Input, Noisy "And" Gate ('DINA') and reduced Reparameterized Unified Model ('rRUM') from Culpepper and Hudson (2017) <doi:10.1177/0146621617707511>, Culpepper (2015) <doi:10.3102/1076998615595403>, and de la Torre (2009) <doi:10.3102/1076998607309474>.
Maintained by James Joseph Balamuta. Last updated 1 year ago.
cognitive-diagnostic-models, psychometrics, rcpp, rcpparmadillo, simulation, openblas, cpp
4.95 score 15 scripts 2 dependents
mlampros
fastGLCM: 'GLCM' Texture Features
Two 'Gray Level Co-occurrence Matrix' ('GLCM') implementations are included: The first is a fast 'GLCM' feature texture computation based on 'Python' 'Numpy' arrays ('Github' Repository, <https://github.com/tzm030329/GLCM>). The second is a fast 'GLCM' 'RcppArmadillo' implementation which is parallelized (using 'OpenMP') with the option to return all 'GLCM' features at once. For more information, see "Artifact-Free Thin Cloud Removal Using Gans" by Toizumi Takahiro, Zini Simone, Sagi Kazutoshi, Kaneko Eiji, Tsukada Masato, Schettini Raimondo (2019), IEEE International Conference on Image Processing (ICIP), pp. 3596-3600, <doi:10.1109/ICIP.2019.8803652>.
Maintained by Lampros Mouselimis. Last updated 2 years ago.
glcm, rcpparmadillo, openblas, cpp, openmp
5 stars 4.40 score 2 scripts
a91quaini
intrinsicFRP: An R Package for Factor Model Asset Pricing
Functions for evaluating and testing asset pricing models, including estimation and testing of factor risk premia, selection of "strong" risk factors (factors having nonzero population correlation with test asset returns), heteroskedasticity and autocorrelation robust covariance matrix estimation and testing for model misspecification and identification. The functions for estimating and testing factor risk premia implement the Fama-MacBeth (1973) <doi:10.1086/260061> two-pass approach, the misspecification-robust approaches of Kan-Robotti-Shanken (2013) <doi:10.1111/jofi.12035>, and the approaches based on tradable factor risk premia of Quaini-Trojani-Yuan (2023) <doi:10.2139/ssrn.4574683>. The functions for selecting the "strong" risk factors are based on the Oracle estimator of Quaini-Trojani-Yuan (2023) <doi:10.2139/ssrn.4574683> and the factor screening procedure of Gospodinov-Kan-Robotti (2014) <doi:10.2139/ssrn.2579821>. The functions for evaluating model misspecification implement the HJ model misspecification distance of Kan-Robotti (2008) <doi:10.1016/j.jempfin.2008.03.003>, which is a modification of the prominent Hansen-Jagannathan (1997) <doi:10.1111/j.1540-6261.1997.tb04813.x> distance. The functions for testing model identification specialize the Kleibergen-Paap (2006) <doi:10.1016/j.jeconom.2005.02.011> and the Chen-Fang (2019) <doi:10.1111/j.1540-6261.1997.tb04813.x> rank test to the regression coefficient matrix of test asset returns on risk factors. Finally, the function for heteroskedasticity and autocorrelation robust covariance estimation implements the Newey-West (1994) <doi:10.2307/2297912> covariance estimator.
Maintained by Alberto Quaini. Last updated 8 months ago.
factor-models, factor-selection, finance, identification-tests, misspecification, rcpparmadillo, risk-premium, openblas, cpp, openmp
7 stars 4.39 score 1 scripts
konrad1991
paropt: Parameter Optimizing of ODE-Systems
Enables optimization of parameters of ordinary differential equations. 'SUNDIALS' is used to solve the ODE system (see Hindmarsh, Alan C., Peter N. Brown, Keith E. Grant, Steven L. Lee, Radu Serban, Dan E. Shumaker, and Carol S. Woodward. (2005) <doi:10.1145/1089014.1089020>). For the optimization itself, the particle swarm algorithm is used (see: Akman, Devin, Olcay Akman, and Elsa Schaefer. (2018) <doi:10.1155/2018/9160793> and Sengupta, Saptarshi, Sanchita Basak, and Richard Peters. (2018) <doi:10.3390/make1010010>).
Maintained by Krämer Konrad. Last updated 9 months ago.
optimization, paropt, particle-swarm-optimization, rcpp, rcpparmadillo, cpp
3 stars 4.26 score 12 scripts
tmsalab
dina: Bayesian Estimation of DINA Model
Estimate the Deterministic Input, Noisy "And" Gate (DINA) cognitive diagnostic model parameters using the Gibbs sampler described by Culpepper (2015) <doi:10.3102/1076998615595403>.
Maintained by James Joseph Balamuta. Last updated 5 years ago.
armadillo, bayesian, gibbs-sampler, irt, item-response-theory, psychometrics, rcpp, rcpparmadillo, openblas, cpp
14 stars 3.85 score 3 scripts
egpivo
SpatMCA: Regularized Spatial Maximum Covariance Analysis
Provides regularized maximum covariance analysis incorporating smoothness, sparseness and orthogonality of coupled patterns by using the alternating direction method of multipliers algorithm. The method can be applied to either regularly or irregularly spaced data, including 1D, 2D, and 3D (Wang and Huang, 2018 <doi:10.1002/env.2481>).
Maintained by Wen-Ting Wang. Last updated 7 months ago.
admm, cca, cross-covariance, lasso, matrix-factorization, rcpparmadillo, rcppparallel, splines, openblas, cpp, openmp
5 stars 3.40 score 4 scripts
tmsalab
slcm: Sparse Latent Class Model for Cognitive Diagnosis
Perform a Bayesian estimation of the exploratory Sparse Latent Class Model for Binary Data described by Chen, Y., Culpepper, S. A., and Liang, F. (2020) <doi:10.1007/s11336-019-09693-2>.
Maintained by James Joseph Balamuta. Last updated 2 months ago.
latent-class-model, psychometrics, rcpparmadillo, sparse, openblas, cpp, openmp
2 stars 3.30 score 1 scripts
egpivo
QuantRegGLasso: Adaptively Weighted Group Lasso for Semiparametric Quantile Regression Models
Implements an adaptively weighted group Lasso procedure for simultaneous variable selection and structure identification in varying coefficient quantile regression models and additive quantile regression models with ultra-high dimensional covariates. The methodology, grounded in a strong sparsity condition, establishes selection consistency under certain weight conditions. To address the challenge of tuning parameter selection in practice, a BIC-type criterion named high-dimensional information criterion (HDIC) is proposed. The Lasso procedure, guided by HDIC-determined tuning parameters, maintains selection consistency. Theoretical findings are strongly supported by simulation studies. (Toshio Honda, Ching-Kang Ing, Wei-Ying Wu, 2019, <DOI:10.3150/18-BEJ1091>).
Maintained by Wen-Ting Wang. Last updated 5 months ago.
admm, group-lasso, high-dimensional, quantile-regression, rcpp, rcpparmadillo, openblas, cpp
2 stars 3.30 score 2 scripts
tmsalab
iccbeta: Multilevel Model Intraclass Correlation for Slope Heterogeneity
A function and vignettes for computing an intraclass correlation described in Aguinis & Culpepper (2015) <doi:10.1177/1094428114563618>. This package quantifies the share of variance in a dependent variable that is attributed to group heterogeneity in slopes.
Maintained by Steven Andrew Culpepper. Last updated 5 years ago.
armadillo, correlation, intraclass-correlation, rcpp, rcpparmadillo, openblas, cpp
2 stars 3.00 score 5 scripts
tmsalab
edina: Bayesian Estimation of an Exploratory Deterministic Input, Noisy and Gate Model
Perform a Bayesian estimation of the exploratory deterministic input, noisy and gate (EDINA) cognitive diagnostic model described by Chen et al. (2018) <doi:10.1007/s11336-017-9579-4>.
Maintained by James Joseph Balamuta. Last updated 5 years ago.
cognitive-diagnostic-models, cpp, dina, ecdm, item-response-theory, psychometrics, rcpparmadillo, openblas, cpp
2 stars 3.00 score 1 scripts
tmsalab
fourPNO: Bayesian 4 Parameter Item Response Model
Estimate Barton & Lord's (1981) <doi:10.1002/j.2333-8504.1981.tb01255.x> four parameter IRT model with lower and upper asymptotes using Bayesian formulation described by Culpepper (2016) <doi:10.1007/s11336-015-9477-6>.
Maintained by Steven Andrew Culpepper. Last updated 5 years ago.
armadillo, cognitive-diagnostic-models, gibbs-sampler, item-response-theory, rcpp, rcpparmadillo, openblas, cpp, openmp
1 stars 2.70 score 5 scripts
tmsalab
rrum: Bayesian Estimation of the Reduced Reparameterized Unified Model with Gibbs Sampling
Implementation of a Gibbs sampling algorithm for Bayesian estimation of the Reduced Reparameterized Unified Model ('rrum'), described by Culpepper and Hudson (2017) <doi:10.1177/0146621617707511>.
Maintained by James Joseph Balamuta. Last updated 1 year ago.
armadillo, cdm, cognitive-diagnostic-models, gibbs-sampling-algorithm, psychometrics, rcpparmadillo, rrum, openblas, cpp, openmp
2.70 score 3 scripts
tmsalab
errum: Exploratory Reduced Reparameterized Unified Model Estimation
Perform a Bayesian estimation of the exploratory reduced reparameterized unified model (ErRUM) described by Culpepper and Chen (2018) <doi:10.3102/1076998618791306>.
Maintained by James Joseph Balamuta. Last updated 5 years ago.
cognitive-diagnostic-modeling, cpp, ecdm, item-response-theory, psychometrics, rcpparmadillo, rrum, openblas, cpp, openmp
2.70 score