Showing 3 of 3 results
cmmid
paramix: Aggregate and Disaggregate Continuous Parameters for Compartmental Models
A convenient framework for aggregating and disaggregating continuously varying parameters (for example, case fatality ratio as it varies with age) for proper parametrization of lower-resolution compartmental models (for example, models with broad age categories) and subsequent upscaling of model outputs to high resolution (for example, as needed when calculating age-sensitive measures like years of life lost).
Maintained by Carl Pearson. Last updated 4 months ago.
2 stars 5.26 score 3 scripts
forestry-labs
distillML: Model Distillation and Interpretability Methods for Machine Learning Models
Provides several methods for model distillation and interpretability for general black box machine learning models, as well as treatment effect estimation methods. For details on the algorithms implemented, see <https://forestry-labs.github.io/distillML/index.html>. Authors: Brian Cho, Theo F. Saarinen, Jasjeet S. Sekhon, Simon Walter.
Maintained by Theo Saarinen. Last updated 2 years ago.
bart distillation-model explainable-machine-learning explainable-ml interpretability interpretable-machine-learning machine-learning model random-forest xgboost
7 stars 3.92 score 12 scripts
cran
distillery: Method Functions for Confidence Intervals and to Distill Information from an Object
Some very simple method functions for confidence interval calculation, bootstrap resampling aimed at atmospheric science applications, and distilling pertinent information from a potentially complex object; primarily used in conjunction with the packages extRemes and SpatialVx. To reference this package, and for a tutorial on the bootstrap functions, please see Gilleland (2020) <doi:10.1175/JTECH-D-20-0069.1> and Gilleland (2020) <doi:10.1175/JTECH-D-20-0070.1>.
Maintained by Eric Gilleland. Last updated 4 months ago.
2.78 score 7 dependents