Tuesday, April 27, 2021

Variational Bayes for Network Estimation

Check out the fine paper, "Fast and Accurate Variational Inference for Large Bayesian VARs with Stochastic Volatility," by Chan and Yu (2020), which builds on earlier work by Koop and Korobilis (2018) and Gefang, Koop, and Poon (2019) and much earlier work. By using a global approximating density, the variational approach abandons the naive hope of achieving perfection in the limit of an MCMC sequence, in exchange for massive speed gains even in very high dimensions (without sacrificing much accuracy when done well). 
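For readers who want the flavor of the variational idea in a few lines of code, here is a minimal sketch. It is emphatically not Chan-Yu's algorithm; it is generic coordinate-ascent mean-field variational Bayes for a toy Bayesian regression, with made-up simulated data and hyperparameters, shown only to illustrate "optimize a tractable approximating density instead of running a long MCMC chain."

```python
# Minimal mean-field variational Bayes sketch (NOT the Chan-Yu algorithm):
# coordinate-ascent VI for a toy Bayesian linear regression
#   y = X beta + e,  e ~ N(0, 1/tau),  beta ~ N(0, I/alpha),  tau ~ Gamma(a0, b0),
# with approximating density q(beta, tau) = q(beta) q(tau).
# Simulated data and hyperparameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N, K = 200, 5
X = rng.standard_normal((N, K))
beta_true = rng.standard_normal(K)
y = X @ beta_true + 0.5 * rng.standard_normal(N)

alpha, a0, b0 = 1.0, 1e-2, 1e-2   # prior hyperparameters (assumed)
E_tau = 1.0                        # initial guess for E[tau]

for _ in range(100):               # coordinate-ascent sweeps
    # q(beta) = N(m, S)
    S = np.linalg.inv(alpha * np.eye(K) + E_tau * X.T @ X)
    m = E_tau * S @ X.T @ y
    # q(tau) = Gamma(a_n, b_n); expected squared residual uses mean and covariance of q(beta)
    resid2 = np.sum((y - X @ m) ** 2) + np.trace(X @ S @ X.T)
    a_n = a0 + 0.5 * N
    b_n = b0 + 0.5 * resid2
    E_tau = a_n / b_n

print("variational posterior mean of beta:", m.round(2))
print("true beta:                         ", beta_true.round(2))
```

The whole "chain" is a handful of matrix operations per sweep, which is why the approach scales to dimensions where MCMC becomes painful.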

Superficially, Chan-Yu looks like just another "Bayesian VARs with stochastic volatility" paper (not that there's anything wrong with that!). But here's a key chain of thought:

(1) Networks are productively characterized and understood by interpreting them as VARs and then using standard VAR estimation, decomposition, and visualization technology (see the Diebold-Yilmaz connectedness work and the references therein); a rough code sketch of that pipeline appears below this list.

(2) But many interesting networks are very high-dimensional, which presented a problem for taking network-VAR analysis to the next level, where, for example, one might want a 5000-dimensional network VAR.

(3) But ultra-high-dimensional situations are now much less problematic, thanks to variational Bayes. Moving in that direction, Chan and Yu do variational Bayes for the DDLY (Demirer-Diebold-Liu-Yilmaz) bank network (approximately 100 dimensions), and moreover they incorporate stochastic volatility.
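To make point (1) concrete, here is a rough numpy sketch of the network-as-VAR pipeline: estimate a VAR, compute a forecast-error variance decomposition, and read the off-diagonal shares as directional network links. It uses simulated data, a VAR(1), and a simple Cholesky-based decomposition purely for illustration; the Diebold-Yilmaz papers use generalized variance decompositions and real data, so treat this as a sketch of the idea, not their procedure.

```python
# Illustrative network-as-VAR sketch (Diebold-Yilmaz flavor, not their exact method):
# VAR estimation -> forecast-error variance decomposition -> connectedness table.
# Simulated data, VAR(1), Cholesky identification, and horizon H are assumptions.
import numpy as np

rng = np.random.default_rng(1)
T, n, H = 500, 4, 10                       # sample size, variables, FEVD horizon

# simulate a stable VAR(1) as stand-in data
A = 0.4 * np.eye(n) + 0.1 * rng.standard_normal((n, n))
y = np.zeros((T, n))
for t in range(1, T):
    y[t] = y[t - 1] @ A.T + rng.standard_normal(n)

# OLS estimation of the VAR(1)
Y, X = y[1:], y[:-1]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
U = Y - X @ A_hat.T
Sigma = U.T @ U / (T - 1 - n)
P = np.linalg.cholesky(Sigma)              # identification via Cholesky (assumption)

# MA coefficients Psi_h = A_hat^h and H-step forecast-error variance decomposition
Psi = [np.eye(n)]
for h in range(1, H):
    Psi.append(Psi[-1] @ A_hat)
contrib = sum((Ph @ P) ** 2 for Ph in Psi)        # contrib[i, j]: shock j -> variable i
fevd = contrib / contrib.sum(axis=1, keepdims=True)

# connectedness table: off-diagonal shares are the directional network links
print(np.round(fevd, 2))
print("total connectedness:", round(1 - np.trace(fevd) / n, 2))
```

With 4 variables the OLS step is trivial; at 100 or 5000 variables it is exactly where things break down, and that is where the shrinkage priors and variational approximation of Chan-Yu come in.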
