
Investigation of the robustness of state-of-the-art methods for anxiety detection in real-world conditions

Description

I am new to ACCESS, with a little past experience running code on NCSA's Blue Waters. As a self-taught programmer, I would value learning from an experienced mentor.

Here's an overview of my project:

Anxiety detection is an actively studied topic, but existing methods struggle to generalize and perform outside of controlled lab environments. I propose to critically analyze state-of-the-art detection methods, quantify the failure modes of existing applied machine learning models, and introduce methods that make them robust to real-world challenges. The study will begin with a sensitivity analysis of the existing best-performing models, followed by tests of current hypotheses about why these models fail in real-world settings. We expect this to yield a deeper understanding of why models fail and to let us use explainability to design better in-lab experimental protocols and machine learning models that perform well in real-world scenarios. Findings will dictate future directions, which may include improving personalized health detection, carefully designing experimental protocols that enable transfer learning to extend the reach of existing anxiety detection models, and using explainability techniques to inform better sensing methods and hardware.
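For illustration, here is a minimal sketch of the kind of sensitivity analysis described above: evaluating a trained classifier under increasing levels of additive Gaussian noise on its input features and recording how accuracy degrades. The data, features, and model choice here are hypothetical placeholders, not the project's actual pipeline.

# Sketch: noise-injection sensitivity analysis of a trained classifier.
# All names and data below are placeholders for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for physiological features and anxiety labels.
X = rng.normal(size=(1000, 16))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Sweep noise levels and record accuracy degradation on the test set.
for sigma in [0.0, 0.1, 0.25, 0.5, 1.0]:
    X_noisy = X_test + rng.normal(scale=sigma, size=X_test.shape)
    acc = accuracy_score(y_test, model.predict(X_noisy))
    print(f"noise sigma={sigma:.2f}  accuracy={acc:.3f}")

The same loop can be repeated over different noise types (e.g., sensor dropout or motion-like artifacts) to produce the per-condition performance breakdown listed in the deliverables.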

HPC Resources Needed (if known)
GPU hours on NCSA Delta
Project Deliverables
Model trained on the available data, with a detailed performance analysis across different conditions (noise types and levels) to inform next steps in model building
Researcher(s)
Institution
University of Illinois at Urbana-Champaign
Status
Complete
Mentor(s)
Student(s)