Showing 5 of 5 results
christopherdhare
basicspace:Recovering a Basic Space from Issue Scales
Provides functions to estimate latent dimensions of choice and judgment using Aldrich-McKelvey and Blackbox scaling methods, as described in Poole et al. (2016, <doi:10.18637/jss.v069.i07>). These techniques allow researchers (particularly those analyzing political attitudes, public opinion, and legislative behavior) to recover spatial estimates of political actors' ideal points and stimuli from issue scale data, accounting for perceptual bias, multidimensional spaces, and missing data. The package uses singular value decomposition and alternating least squares (ALS) procedures to scale self-placement and perceptual data into a common latent space for the analysis of ideological or evaluative dimensions. Functionality also includes tools for assessing model fit, handling complex survey data structures, and reproducing simulated datasets for methodological validation.
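For example, the Aldrich-McKelvey routine can be run on a matrix of issue-scale placements roughly as follows (a minimal sketch assuming the package's aldmck() function and the bundled LC1980 liberal-conservative data described in the JSS article; argument values are illustrative):

    library(basicspace)

    data(LC1980)  # 1980 ANES liberal-conservative 7-point placements (bundled example data)

    # Aldrich-McKelvey scaling: respondent = 1 marks the self-placement column,
    # polarity picks a stimulus constrained to the left of the recovered dimension
    result <- aldmck(LC1980, respondent = 1, polarity = 2,
                     missing = c(0, 8, 9), verbose = FALSE)

    summary(result)
    result$stimuli  # recovered stimulus positions on the latent dimension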
Maintained by Christopher Hare. Last updated 3 months ago.
38.1 match 1 stars 2.89 score 39 scripts
f-rousset
blackbox:Black Box Optimization and Exploration of Parameter Space
Performs prediction of a response function from simulated response values, allowing black-box optimization of functions estimated with some error. Includes a simple user interface for such applications, as well as more specialized functions designed to be called by the Migraine software (Rousset and Leblois, 2012 <doi:10.1093/molbev/MSR262>; Leblois et al., 2014 <doi:10.1093/molbev/msu212>; and see URL). The latter functions are used for prediction of likelihood surfaces and implied likelihood ratio confidence intervals, and for exploration of predictor space of the surface. Prediction of the response is based on ordinary Kriging (with residual error) of the input. Estimation of smoothing parameters is performed by generalized cross-validation.
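A minimal sketch of the intended workflow, assuming init_grid() and bboptim() as the main entry points (function names and arguments are recalled from the package documentation and may differ; the noisy two-parameter objective is purely illustrative):

    library(blackbox)

    # Noisy objective evaluated at named parameter vectors (illustrative only)
    fr <- function(v) 10 * (v["y"] - v["x"]^2)^2 + (1 - v["x"])^2 + rnorm(1, sd = 0.1)

    set.seed(123)
    parsp  <- init_grid(lower = c(x = 0, y = 0), upper = c(x = 2, y = 2), nUnique = 40L)
    simuls <- data.frame(parsp, fr = apply(parsp, 1, fr))  # simulated response values

    # Kriging-based prediction of the response surface, then black-box optimization
    opt <- bboptim(simuls)
    print(opt)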
Maintained by François Rousset. Last updated 1 year ago.
61.5 match 1.79 score 8 scripts 1 dependents
bgreenwell
ebm:Explainable Boosting Machines
An interface to the 'Python' 'InterpretML' framework for fitting explainable boosting machines (EBMs); see Nori et al. (2019) <doi:10.48550/arXiv.1909.09223> for details. EBMs are a modern type of generalized additive model that use tree-based, cyclic gradient boosting with automatic interaction detection. They are often as accurate as state-of-the-art blackbox models while remaining completely interpretable.
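A minimal fitting sketch, assuming the package exposes an ebm() formula interface with a standard predict() method, and that a Python environment with 'interpret' is available via 'reticulate':

    library(ebm)  # assumes a working Python environment with the 'interpret' package

    # Fit an explainable boosting machine to a built-in regression data set
    fit <- ebm(mpg ~ ., data = mtcars)

    # Predictions from the fitted glassbox model
    head(predict(fit, newdata = mtcars))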
Maintained by Brandon M. Greenwell. Last updated 12 days ago.
ai, blackbox, explainable-ai, explainable-machine-learning, explainable-ml, glassbox, iml, interpretability, interpretability-and-explainability, interpretable, interpretable-ai, interpretable-machine-learning, interpretable-ml, interpretable-models, machine-learning, xai
11.5 match 1 stars 4.60 score
rbgramacy
laGP:Local Approximate Gaussian Process Regression
Performs approximate GP regression for large computer experiments and spatial datasets. The approximation is based on finding small local designs for prediction (independently) at particular inputs. OpenMP and SNOW parallelization are supported for prediction over a vast out-of-sample testing set; GPU acceleration is also supported for an important subroutine. OpenMP and GPU features may require special compilation. An interface to lower-level (full) GP inference and prediction is provided. Wrapper routines for blackbox optimization under mixed equality and inequality constraints via an augmented Lagrangian scheme, and for large scale computer model calibration, are also provided. For details and a tutorial, see Gramacy (2016) <doi:10.18637/jss.v072.i01>.
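For example, local approximate prediction over an out-of-sample grid looks roughly like this (a minimal sketch assuming the package's aGP() interface; the toy 1-d data are illustrative):

    library(laGP)

    # Toy 1-d "computer experiment": noisy sine response on a dense input grid
    X  <- matrix(seq(0, 2 * pi, length.out = 200), ncol = 1)
    Z  <- sin(X[, 1]) + rnorm(200, sd = 0.05)
    XX <- matrix(seq(0, 2 * pi, length.out = 25), ncol = 1)  # out-of-sample inputs

    # Local approximate GP: a small local design is built independently for each row of XX
    fit <- aGP(X, Z, XX, start = 6, end = 30, verb = 0)
    fit$mean  # approximate predictive means at XX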
Maintained by Robert B. Gramacy. Last updated 2 years ago.
2.3 match 8 stars 5.47 score 166 scripts 2 dependents
cran
ciu:Contextual Importance and Utility
Implementation of the Contextual Importance and Utility (CIU) concepts for Explainable AI (XAI). A recent description of CIU can be found in e.g. Främling (2020) <arXiv:2009.13996>.
Maintained by Kary Främling. Last updated 2 years ago.
1.9 match 1.00 score