Using Mobile Data and Deep Models to Assess Auditory Verbal Hallucinations

Psychology

S. Mirjafari, A. T. Campbell, et al.

This research by Shayan Mirjafari, Andrew T. Campbell, Subigya Nepal, and Weichen Wang investigates the use of mobile data and deep learning to assess auditory verbal hallucinations (AVH). Combining ecological momentary assessments with deep neural networks, the study demonstrates the potential of mobile technology for real-time AVH evaluation.

Abstract
This paper explores the use of mobile data and deep learning models to assess auditory verbal hallucinations (AVH). Researchers collected ecological momentary assessments (EMA) four times daily for a month from 435 participants who experience hearing voices. Participants also recorded audio diaries and passively provided mobile sensing data. The study used VGGish and BERT for feature extraction from audio and text data, respectively, along with a novel approach that uses VGGish to transform mobile sensing data. A hybrid neural network model, employing transfer learning and data fusion techniques, achieved a 54% top-1 and 72% top-2 F1 score in predicting AVH valence. The research demonstrates the potential of mobile technology for real-time AVH assessment.
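The abstract describes a multimodal pipeline: per-modality embeddings (VGGish for audio, BERT for text, a VGGish-style transform for sensing data) fused and fed to a classifier over valence classes. A minimal sketch of feature-level fusion is shown below; the embedding dimensions are assumptions (VGGish emits 128-d embeddings and BERT-base 768-d; the sensing dimension and the three-class head are purely illustrative, as the paper's exact architecture is not given here).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality embeddings (stand-ins for real model outputs):
audio_emb = rng.standard_normal(128)    # audio diary -> VGGish (128-d)
text_emb = rng.standard_normal(768)     # transcript  -> BERT-base (768-d)
sensing_emb = rng.standard_normal(128)  # mobile sensing -> VGGish-style transform

# Feature-level fusion: concatenate the modality embeddings into one vector.
fused = np.concatenate([audio_emb, text_emb, sensing_emb])  # shape (1024,)

# Illustrative linear classification head over hypothetical valence classes.
n_classes = 3
W = rng.standard_normal((n_classes, fused.size)) * 0.01
logits = W @ fused

# Softmax to obtain a probability distribution over the classes.
probs = np.exp(logits - logits.max())
probs /= probs.sum()

print(fused.shape)  # (1024,)
```

A top-2 score, as reported in the abstract, counts a prediction as correct when the true class is among the two highest-probability classes rather than only the single highest.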
Published On
Apr 01, 2023
Authors
Shayan Mirjafari, Andrew T Campbell, Subigya Nepal, Weichen Wang
Tags
auditory verbal hallucinations
mobile data
deep learning
transfer learning
neural networks
real-time assessment
ecological momentary assessment