Monday, June 18, 2018

10th ECB Workshop on Forecasting Techniques, Frankfurt

Starts now; program here. Looks like a great lineup. Most of the papers are posted, and the organizers also plan to post presentation slides following the conference. Presumably I'll blog on some of the presentations in the coming weeks.

Monday, June 11, 2018

Deep Neural Nets for Volatility Dynamics

There doesn't seem to be much need for nonparametric nonlinear modeling in empirical macro and finance. Not that lots of smart people haven't tried. The two key nonlinearities (volatility dynamics and regime switching) just seem to be remarkably well handled by tightly-parametric customized models (GARCH/SV and Markov-switching, respectively). 
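Just to emphasize how tightly parametric the workhorse really is, here's a quick sketch, my own illustration rather than anything from a particular paper, that fits a three-parameter GARCH(1,1) using the Python arch package. The placeholder simulated "returns" and all settings are mine; with real data you'd pass in actual (percent) returns.

```python
# A minimal illustration (mine): the workhorse GARCH(1,1) has just three
# volatility parameters. Assumes numpy and the "arch" package (pip install arch).
import numpy as np
from arch import arch_model

rng = np.random.default_rng(0)
r = 100 * rng.standard_normal(1000)      # placeholder returns; use real percent returns in practice

am = arch_model(r, mean='Zero', p=1, q=1)  # zero-mean GARCH(1,1), Normal errors by default
res = am.fit(disp='off')
print(res.params)                          # omega, alpha[1], beta[1] -- that's the whole model
```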

But the popular volatility models are effectively linear (ARMA) in squares. Maybe that's too rigidly constrained. Volatility dynamics seem like something that could be nonlinear in ways much richer than just ARMA in squares. 
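To make the "ARMA in squares" claim concrete (this is a standard textbook identity, not something from the paper below): write the GARCH(1,1) recursion, define the squared-return surprise, and substitute out the latent variance.

```latex
\sigma_t^2 = \omega + \alpha r_{t-1}^2 + \beta \sigma_{t-1}^2,
\qquad v_t \equiv r_t^2 - \sigma_t^2
\;\;\Longrightarrow\;\;
r_t^2 = \omega + (\alpha + \beta)\, r_{t-1}^2 + v_t - \beta v_{t-1}.
```

Since v_t is a martingale difference, the squared return follows an ARMA(1,1): linear dynamics in squares, just with an unusual heteroskedastic innovation.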

Here's an attempt using deep neural nets. I'm not convinced by the paper -- it needs much more thorough analysis, with many more results than the 22 numbers reported in the "GARCH" and "stocvol" columns of its Table 1 -- but I'm intrigued.
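For readers who want to see what a "neural GARCH" might look like mechanically, here's a toy sketch. It is my own construction, not the model in the paper: a small MLP maps lagged squared returns into a log conditional variance and is trained by Gaussian quasi-maximum likelihood. All architecture and tuning choices are illustrative.

```python
# A toy "neural volatility" sketch (mine, NOT the linked paper's model).
# Assumes numpy and PyTorch are available.
import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)
rng = np.random.default_rng(0)

# Simulate returns from a GARCH(1,1) purely for illustration.
T, omega, alpha, beta = 2000, 0.05, 0.10, 0.85
r = np.zeros(T)
sig2 = omega / (1 - alpha - beta)
for t in range(T):
    r[t] = np.sqrt(sig2) * rng.standard_normal()
    sig2 = omega + alpha * r[t] ** 2 + beta * sig2

# Inputs: a window of p lagged squared returns; target: the next return.
p = 10
X = np.column_stack([r[i:T - p + i] ** 2 for i in range(p)])
X = torch.tensor(X, dtype=torch.float32)
ret = torch.tensor(r[p:], dtype=torch.float32)

# Small MLP outputs log variance, which keeps the fitted variance positive.
net = nn.Sequential(nn.Linear(p, 32), nn.ReLU(),
                    nn.Linear(32, 16), nn.ReLU(),
                    nn.Linear(16, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for epoch in range(500):
    log_var = net(X).squeeze(-1)
    # Gaussian negative log-likelihood (up to a constant) of r_t given sigma_t^2.
    nll = 0.5 * (log_var + ret ** 2 / torch.exp(log_var)).mean()
    opt.zero_grad()
    nll.backward()
    opt.step()

print(f"in-sample average NLL: {nll.item():.4f}")
```

Whether that extra flexibility buys anything out of sample relative to plain GARCH or SV is, of course, exactly the empirical question.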

It's quite striking that neural nets, which have been absolutely transformative in other areas of predictive modeling, have thus far contributed so little in economic / financial contexts. Maybe the "deep" versions will change that, at least for volatility modeling. Or maybe not. 

Thursday, June 7, 2018

Machines Learning Finance

FRB Atlanta recently hosted a meeting on "Machines Learning Finance". Kind of an ominous, threatening (Orwellian?) title, but there were lots of (non-threatening...) pieces. I found the surveys by Ryan Adams and John Cunningham particularly entertaining. A clear theme on display throughout the meeting was that "supervised learning" -- the main strand of machine learning -- is just function estimation, and in particular, conditional mean estimation. That is, regression. It may involve high dimensions, non-linearities, binary variables, etc., but at the end of the day it's still just regression. If you're a regular No Hesitations reader, the "insight" that supervised learning = regression will hardly be novel to you, but it's still good to see it disseminating widely.
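If you want the point in one toy example (mine, not from the conference): a "supervised learner" fit to noisy data is simply estimating the conditional mean E[y|x]. Everything below, including the use of scikit-learn's MLPRegressor and the sin(x) data-generating process, is an illustrative assumption.

```python
# A toy illustration (mine): "supervised learning" here is just nonparametric
# estimation of the conditional mean E[y|x], i.e., regression.
# Assumes numpy and scikit-learn are available.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=(2000, 1))
y = np.sin(x[:, 0]) + 0.3 * rng.standard_normal(2000)   # true conditional mean: sin(x)

net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                   random_state=0).fit(x, y)

x_new = np.array([[-2.0], [0.0], [2.0]])
print(np.column_stack([net.predict(x_new), np.sin(x_new[:, 0])]))  # fitted vs. true E[y|x]
```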