Illuminating the Black Box: Variable Importance for Interpretable Machine Learning

  • July 8, 2021, from 12:00 to 1:00 pm
  • Please RSVP by emailing Lisa Maturo at Lisa.M.Maturo@christianacare.org.
  • Please include your full name, email address, and institution/organization. 
  • We will provide instructions on obtaining CME credit for attendance.

Machine learning approaches offer an alternative to traditional linear regression-based approaches for modeling disease outcomes and often provide greater flexibility for addressing complex relationships between predictors and outcomes. However, their use in clinical practice has been limited in part by the lack of interpretability of the relationship between predictors and the outcome of interest. Graphical approaches and measures of relative variable importance have been developed to improve the interpretability of machine learning models. This talk will provide an overview of common supervised machine learning methods (e.g., ensemble methods, support vector machines, and neural networks) and discuss currently implemented and available importance measures and graphical approaches that provide semi-quantitative measures of the associations between predictors and the outcome.
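To give a flavor of the kind of importance measure the talk will cover, the sketch below computes permutation variable importance for a random forest using scikit-learn. The dataset, model, and parameter choices here are illustrative assumptions, not taken from the talk itself, and the talk's own methods may differ.

    # A minimal sketch of permutation variable importance, a common
    # model-agnostic importance measure. Dataset and model choices are
    # illustrative assumptions, not drawn from the talk.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True, as_frame=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Fit an ensemble model (random forest) as an example "black box".
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)

    # Permutation importance: shuffle each predictor in turn and measure
    # how much held-out accuracy drops; larger drops suggest the model
    # relies more heavily on that variable.
    result = permutation_importance(model, X_test, y_test,
                                    n_repeats=20, random_state=0)

    # Print the five most important predictors with their mean drop
    # in accuracy and its standard deviation across repeats.
    for i in result.importances_mean.argsort()[::-1][:5]:
        print(f"{X.columns[i]}: {result.importances_mean[i]:.3f} "
              f"+/- {result.importances_std[i]:.3f}")

For a graphical counterpart of the sort the talk describes, scikit-learn's PartialDependenceDisplay.from_estimator can plot how the model's predictions vary with a single predictor while averaging over the others.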

Meet the Speaker

Bethany J. Wolf, PhD

Dr. Wolf is an Associate Professor of Biostatistics in the Department of Public Health Sciences at the Medical University of South Carolina and a member of the DE-CTR BERD Core. Her statistical expertise is in prediction modeling, and her research focuses on developing novel machine learning methods for prediction, approaches to model optimization, and methods for quantifying variable importance and performing variable selection.

This activity has been approved for AMA PRA Category 1 Credit™.
