Showing 15 of 15 results
airoldilab
sgd:Stochastic Gradient Descent for Scalable Estimation
A fast and flexible set of tools for large scale estimation. It features many stochastic gradient methods, built-in models, visualization tools, automated hyperparameter tuning, model checking, interval estimation, and convergence diagnostics.
Maintained by Junhyung Lyle Kim. Last updated 1 year ago.
big-data, data-analysis, gradient-descent, statistics, openblas, cpp
62.1 match 62 stars 7.25 score 71 scripts
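For orientation, a minimal sketch of fitting a large linear model with the package's sgd() function, assuming its formula interface and the model = "lm" option shown in the package documentation; argument names and the coefficient accessor should be checked against the installed version.

```r
library(sgd)

# Simulate a large linear-regression problem
set.seed(42)
n <- 1e5; d <- 5
X <- matrix(rnorm(n * d), ncol = d)
theta <- rep(2, d)
dat <- data.frame(y = X %*% theta + rnorm(n), x = X)

# Fit by stochastic gradient descent
fit <- sgd(y ~ ., data = dat, model = "lm")
fit$coefficients  # estimated coefficients (assumption: stored under this name)
```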
eagerai
fastai:Interface to 'fastai'
The 'fastai' <https://docs.fast.ai/index.html> library simplifies training fast and accurate neural networks using modern best practices. It is based on research into deep learning best practices undertaken at 'fast.ai', including 'out of the box' support for vision, text, tabular, audio, time series, and collaborative filtering models.
Maintained by Turgut Abdullayev. Last updated 11 months ago.
audio, collaborative-filtering, darknet, darknet-image-classification, fastai, medical, object-detection, tabular, text, vision
8.8 match 118 stars 9.40 score 76 scripts
mlverse
torch:Tensors and Neural Networks with 'GPU' Acceleration
Provides functionality to define and train neural networks similar to 'PyTorch' by Paszke et al (2019) <doi:10.48550/arXiv.1912.01703> but written entirely in R using the 'libtorch' library. Also supports low-level tensor operations and 'GPU' acceleration.
Maintained by Daniel Falbel. Last updated 9 days ago.
4.3 match 520 stars 16.52 score 1.4k scripts 38 dependents
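As a quick illustration of the workflow the description implies, the sketch below trains a single linear layer with torch's built-in SGD optimizer; it uses standard torch-for-R calls (nn_linear(), optim_sgd(), nnf_mse_loss()) and is meant as orientation rather than a canonical example.

```r
library(torch)

# Synthetic regression data as torch tensors
x <- torch_randn(100, 3)
y <- torch_matmul(x, torch_randn(3, 1)) + 0.1 * torch_randn(100, 1)

# One linear layer trained with plain SGD
net <- nn_linear(3, 1)
opt <- optim_sgd(net$parameters, lr = 0.1)

for (epoch in 1:100) {
  opt$zero_grad()                        # reset accumulated gradients
  loss <- nnf_mse_loss(net(x), y)        # forward pass and loss
  loss$backward()                        # backpropagate
  opt$step()                             # SGD update
}
```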
sgdinference-lab
SGDinference:Inference with Stochastic Gradient Descent
Estimation and inference methods for large-scale mean and quantile regression models via stochastic (sub-)gradient descent (S-subGD) algorithms. The inference procedure handles cross-sectional data sequentially: (i) updating the parameter estimate with each incoming "new observation", (ii) aggregating it as a Polyak-Ruppert average, and (iii) computing an asymptotically pivotal statistic for inference through random scaling. The methodology used in the SGDinference package is described in detail in the following papers: (i) Lee, S., Liao, Y., Seo, M.H. and Shin, Y. (2022) <doi:10.1609/aaai.v36i7.20701> "Fast and robust online inference with stochastic gradient descent via random scaling". (ii) Lee, S., Liao, Y., Seo, M.H. and Shin, Y. (2023) <arXiv:2209.14502> "Fast Inference for Quantile Regression with Tens of Millions of Observations".
Maintained by Youngki Shin. Last updated 1 year ago.
inference, sgd, stochastic-gradient-descent, subgradient, openblas, cpp
15.7 match 1 star 3.70 score 4 scripts
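A hedged sketch of the workflow described above, assuming the package exposes an sgdi_lm() routine for linear mean regression with random-scaling confidence intervals (the quantile counterpart would be analogous); function and argument names should be verified against the package manual.

```r
library(SGDinference)

# Simulated cross-sectional data, processed in one sequential pass
set.seed(1)
n <- 1e5
x1 <- rnorm(n); x2 <- rnorm(n)
y <- 1 + 0.5 * x1 - 0.25 * x2 + rnorm(n)
dat <- data.frame(y = y, x1 = x1, x2 = x2)

# SGD estimate with Polyak-Ruppert averaging and random-scaling inference
fit <- sgdi_lm(y ~ x1 + x2, data = dat)  # assumption: main linear-regression entry point
fit
```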
cristiancastiglione
sgdGMF:Estimation of Generalized Matrix Factorization Models via Stochastic Gradient Descent
An efficient framework to estimate high-dimensional generalized matrix factorization models using penalized maximum likelihood under a dispersion exponential family specification. Both deterministic and stochastic methods are implemented for the numerical maximization. In particular, the package implements the stochastic gradient descent algorithm with a block-wise mini-batch strategy to speed up the computations and an efficient adaptive learning rate schedule to stabilize the convergence. All the theoretical details can be found in Castiglione, Segers, Clement, Risso (2024, <https://arxiv.org/abs/2412.20509>). Other optimization methods considered are alternating iteratively re-weighted least squares and a quasi-Newton method with a diagonal approximation of the Fisher information matrix, as discussed in Kidzinski, Hui, Warton, Hastie (2022, <http://jmlr.org/papers/v23/20-1104.html>).
Maintained by Cristian Castiglione. Last updated 14 days ago.
7.2 match 10 stars 7.75 score 108 scripts
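The snippet below is not the package API but a base-R illustration of the core idea, mini-batch stochastic gradient descent on a low-rank factorization, using a plain Gaussian loss instead of the package's exponential-family likelihoods, penalties, and adaptive learning rates.

```r
set.seed(1)
n <- 500; m <- 40; k <- 3
Y <- matrix(rnorm(n * k), n, k) %*% matrix(rnorm(k * m), k, m) + rnorm(n * m, sd = 0.1)

U <- matrix(rnorm(n * k, sd = 0.1), n, k)   # row factors
V <- matrix(rnorm(m * k, sd = 0.1), m, k)   # column factors
rate <- 0.01; batch <- 50

for (iter in 1:2000) {
  rows <- sample(n, batch)                  # block-wise mini-batch of rows
  R <- Y[rows, ] - U[rows, ] %*% t(V)       # residuals on the batch
  gU <- -R %*% V                            # gradient w.r.t. the sampled rows of U
  gV <- -t(R) %*% U[rows, ]                 # gradient w.r.t. V
  U[rows, ] <- U[rows, ] - rate * gU
  V <- V - rate * gV
}
mean((Y - U %*% t(V))^2)                    # reconstruction error
```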
theoreticalecology
sjSDM:Scalable Joint Species Distribution Modeling
A scalable and fast method for estimating joint Species Distribution Models (jSDMs) for big community data, including eDNA data. The package estimates a full (i.e. non-latent) jSDM with different response distributions (including the traditional multivariate probit model). The package allows the user to perform variation partitioning (VP) / ANOVA on the fitted models to separate the contribution of environmental, spatial, and biotic associations. In addition, the total R-squared can be further partitioned per species and site to reveal the internal metacommunity structure; see Leibold et al., <doi:10.1111/oik.08618>. The internal structure can then be regressed against environmental and spatial distinctiveness, richness, and traits to analyze metacommunity assembly processes. The package includes support for accounting for spatial autocorrelation and the option to fit responses using deep neural networks instead of a standard linear predictor. As described in Pichler & Hartig (2021) <doi:10.1111/2041-210X.13687>, scalability is achieved by using a Monte Carlo approximation of the joint likelihood implemented via 'PyTorch' and 'reticulate', which can be run on CPUs or GPUs.
Maintained by Maximilian Pichler. Last updated 27 days ago.
deep-learning, gpu-acceleration, machine-learning, species-distribution-modelling, species-interactions
6.6 match 69 stars 7.64 score 70 scripts
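A hedged sketch of fitting a joint SDM on simulated presence/absence data, assuming sjSDM()'s Y/env interface and an iter argument as in the package documentation; a working 'PyTorch' installation reachable through 'reticulate' is required, and argument names should be checked against the installed version.

```r
library(sjSDM)

# Community matrix (sites x species) and environmental covariates (sites x predictors)
set.seed(1)
Y   <- matrix(rbinom(100 * 10, 1, 0.4), nrow = 100, ncol = 10)
env <- matrix(rnorm(100 * 3), nrow = 100, ncol = 3)

# Joint model with a linear environmental predictor; runs on the CPU by default
m <- sjSDM(Y = Y, env = env, iter = 50L)  # assumption: iter controls fitting iterations
summary(m)
```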
techtonique
bcn:Boosted Configuration Networks
Boosted Configuration (neural) Networks for supervised learning.
Maintained by T. Moudiki. Last updated 6 months ago.
machine-learning, neural-networks, statistical-learning, cpp
5.6 match 5 stars 4.00 score 4 scripts
egenn
rtemis:Machine Learning and Visualization
Advanced Machine Learning and Visualization. Unsupervised Learning (Clustering, Decomposition), Supervised Learning (Classification, Regression), Cross-Decomposition, Bagging, Boosting, Meta-models. Static and interactive graphics.
Maintained by E.D. Gennatas. Last updated 1 month ago.
data-science, data-visualization, machine-learning, machine-learning-library, visualization
1.8 match 145 stars 7.09 score 50 scripts 2 dependents
jwdietrich21
SPINA:Structure Parameter Inference Approach
Calculates constant structure parameters of endocrine homeostatic systems from equilibrium hormone concentrations. Methods and equations have been described in Dietrich et al. (2012) <doi:10.1155/2012/351864> and Dietrich et al. (2016) <doi:10.3389/fendo.2016.00057>.
Maintained by Johannes W. Dietrich. Last updated 7 years ago.
4.5 match 1.52 score 11 scripts
cran
BayesFluxR:Implementation of Bayesian Neural Networks
Implementation of 'BayesFlux.jl' for R; it extends the 'Flux.jl' machine learning library to Bayesian neural networks. The goal is not to be the fastest production-ready library, but rather to allow more people to use and do research on Bayesian neural networks.
Maintained by Enrico Wegner. Last updated 1 year ago.
1.8 match 1.70 score
bflammers
ANN2:Artificial Neural Networks for Anomaly Detection
Training of neural networks for classification and regression tasks using mini-batch gradient descent. Special features include a function for training autoencoders, which can be used to detect anomalies, and some related plotting functions. Multiple activation functions are supported, including tanh, relu, step and ramp. For the use of the step and ramp activation functions in detecting anomalies using autoencoders, see Hawkins et al. (2002) <doi:10.1007/3-540-46145-0_17>. Furthermore, several loss functions are supported, including robust ones such as Huber and pseudo-Huber loss, as well as L1 and L2 regularization. The possible options for optimization algorithms are RMSprop, Adam and SGD with momentum. The package contains a vectorized C++ implementation that facilitates fast training through mini-batch learning.
Maintained by Bart Lammers. Last updated 4 years ago.
anomaly-detection, artificial-neural-networks, autoencoders, neural-networks, robust-statistics, openblas, cpp, openmp
0.5 match 13 stars 5.59 score 60 scripts
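A brief, hedged sketch of the autoencoder-based anomaly detection mentioned above; the autoencoder() entry point with a hidden.layers argument follows the package documentation, while the reconstruct() accessor and its anomaly_scores element are assumptions to verify against the package reference.

```r
library(ANN2)

# Mostly "normal" observations plus a few injected outliers
set.seed(1)
X <- rbind(matrix(rnorm(950 * 4), ncol = 4),
           matrix(rnorm(50 * 4, mean = 6), ncol = 4))

# Autoencoder with a 2-unit bottleneck, trained by mini-batch gradient descent
fit <- autoencoder(X, hidden.layers = c(8, 2, 8))

# Reconstruction errors: large values flag potential anomalies
rec <- reconstruct(fit, X)                        # assumption: reconstruction helper
head(sort(rec$anomaly_scores, decreasing = TRUE)) # assumption: score element name
```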
atarkhan
bigSurvSGD:Big Survival Analysis Using Stochastic Gradient Descent
Fits the Cox model via stochastic gradient descent (SGD). This implementation avoids the computational instability of the standard Cox model when dealing with large datasets. Furthermore, it scales to large datasets that do not fit in memory. It also handles large sparse datasets using a proximal stochastic gradient descent algorithm.
Maintained by Aliasghar Tarkhan. Last updated 5 years ago.
0.5 match 7 stars 2.85 score 1 script
captainyc
higrad:Statistical Inference for Online Learning and Stochastic Approximation via HiGrad
Implements the Hierarchical Incremental GRAdient Descent (HiGrad) algorithm, a first-order method for finding the minimizer of a function in the online learning setting, much like stochastic gradient descent (SGD). In addition, the method attaches a confidence interval to its estimates to assess the uncertainty of its predictions. See Su and Zhu (2018) <arXiv:1802.04876> for details.
Maintained by Yuancheng Zhu. Last updated 7 years ago.
0.5 match 1.70 score 7 scripts
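The package's own interface is not reproduced here; instead, the base-R sketch below runs the plain Polyak-Ruppert averaged SGD baseline that HiGrad builds on (HiGrad additionally splits the iterate sequence into overlapping threads to produce the confidence intervals).

```r
set.seed(1)
n <- 1e5; d <- 3
X <- matrix(rnorm(n * d), n, d)
theta_true <- c(1, -2, 0.5)
y <- drop(X %*% theta_true) + rnorm(n)

theta <- rep(0, d)       # current SGD iterate
theta_bar <- rep(0, d)   # Polyak-Ruppert running average

for (t in 1:n) {
  eta <- 0.1 / t^0.6                          # decaying step size, a standard choice
  g <- (sum(X[t, ] * theta) - y[t]) * X[t, ]  # stochastic gradient of the squared loss
  theta <- theta - eta * g
  theta_bar <- theta_bar + (theta - theta_bar) / t
}
theta_bar  # approaches theta_true as n grows
```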