Abstract
Variational Bayes (VB) methods have emerged as a fast and computationally efficient alternative to Markov chain Monte Carlo (MCMC) methods for scalable Bayesian estimation of mixed multinomial logit (MMNL) models. This paper addresses two gaps: extant VB methods are limited to utility specifications with only individual-specific taste parameters, and the finite-sample properties of VB estimators are unknown. The study extends VB methods to utility specifications with both fixed and random parameters and conducts a simulation-based evaluation benchmarking the extended VB methods against MCMC and maximum simulated likelihood estimation (MSLE). The results suggest that the VB variants (except those using a modified Jensen's inequality lower bound) perform as well as MCMC and MSLE in prediction and parameter recovery, with VB-NCVMP-Δ up to 16 times faster than MCMC and MSLE.
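For context, a standard MMNL utility specification with both fixed and random (individual-specific) taste parameters can be sketched as follows; the notation here is illustrative and not necessarily the paper's own:

\[
U_{ntj} = \alpha^{\top} x_{ntj} + \beta_n^{\top} z_{ntj} + \varepsilon_{ntj},
\qquad \beta_n \sim \mathcal{N}(\mu, \Sigma),
\qquad \varepsilon_{ntj} \sim \text{i.i.d. Gumbel},
\]

where $\alpha$ collects fixed parameters shared across decision-makers and $\beta_n$ collects individual-specific random parameters. The choice probability then requires integrating a logit kernel over the distribution of $\beta_n$,

\[
P_{ntj} = \int \frac{\exp\!\left(\alpha^{\top} x_{ntj} + \beta_n^{\top} z_{ntj}\right)}{\sum_{k} \exp\!\left(\alpha^{\top} x_{ntk} + \beta_n^{\top} z_{ntk}\right)} \, \phi(\beta_n \mid \mu, \Sigma) \, d\beta_n,
\]

which is the integral that VB, MCMC, and MSLE approximate in different ways.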
Publisher
Transportation Research Part B
Published On
Dec 12, 2019
Authors
Prateek Bansal, Rico Krueger, Michel Bierlaire, Ricardo A. Daziano, Taha H. Rashidi
Tags
Variational Bayes
Bayesian estimation
mixed multinomial logit models
prediction
parameter recovery
Markov chain Monte Carlo
finite-sample properties