Tuesday, January 7, 2020

Ice-Free Arctic Summers are Coming VERY Soon

A very happy New Year to all!

Here's a new D&R to start it off:

"Probability Assessments of an Ice-Free Arctic: 
Comparing Statistical and Climate Model Projections"
by
Francis X. Diebold and Glenn D. Rudebusch
arXiv:1912.10774 [stat.AP, econ.EM].

The downward trend in Arctic sea ice is a key factor determining the pace and intensity of future global climate change; moreover, declines in sea ice can have a wide range of additional environmental and economic consequences. Based on several decades of satellite data, in a new paper Glenn Rudebusch and I provide statistical forecasts of Arctic sea ice extent during the rest of this century (Diebold and Rudebusch, "Probability Assessments of an Ice-Free Arctic: Comparing Statistical and Climate Model Projections", arXiv:1912.10774 [stat.AP, econ.EM]). Our results indicate that sea ice is diminishing at an increasing rate, in sharp contrast to average projections from the CMIP5 global climate models, which foresee a gradual slowing of sea ice loss even in high carbon emissions scenarios. Our long-range statistical projections also deliver probability assessments of the timing of an ice-free Arctic. This analysis indicates almost a 60 percent chance of a seasonally ice-free Arctic Ocean in the 2030s -- much earlier than the average projection from global climate models.
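To fix ideas, here's a deliberately stylized sketch of the kind of exercise involved: fit a deterministic trend to September sea ice extent and simulate future paths to assess the probability of crossing the conventional one-million-square-kilometer "ice-free" threshold. The data below are simulated and the model is far simpler than the one in the paper; see the paper itself for the actual specification and results.

```python
# Stylized sketch of trend-based probability assessment for an ice-free Arctic.
# This is NOT the Diebold-Rudebusch model -- just a minimal illustration of the
# idea: fit a trend to September sea ice extent, then simulate future paths.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical annual September extent data (million sq km), 1979-2019.
years = np.arange(1979, 2020)
extent = 7.5 - 0.002 * (years - 1979) ** 2 + rng.normal(0, 0.4, years.size)

# Fit a quadratic trend by least squares.
X = np.column_stack([np.ones_like(years), years - 1979, (years - 1979) ** 2])
beta, *_ = np.linalg.lstsq(X, extent, rcond=None)
resid_sd = np.std(extent - X @ beta, ddof=3)

# Simulate future paths and record the first "ice-free" year (< 1 million sq km).
future = np.arange(2020, 2061)
Xf = np.column_stack([np.ones_like(future), future - 1979, (future - 1979) ** 2])
trend = Xf @ beta
n_sim = 10_000
paths = trend + rng.normal(0, resid_sd, (n_sim, future.size))
first_free = np.array([future[p < 1.0][0] if (p < 1.0).any() else -1
                       for p in paths])

# Probability of the first ice-free September falling in the 2030s.
p_2030s = np.mean((first_free >= 2030) & (first_free <= 2039))
print(f"P(first ice-free year in 2030s) = {p_2030s:.2f}")
```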

Friday, December 27, 2019

Holiday Haze


Your blogger will be back in the New Year. 

Meanwhile, Happy Holidays to all!

Wednesday, December 4, 2019

Penn Econometrics Colloquium this Saturday

The annual Greater New York Metropolitan Area Econometrics Colloquium will be hosted at Penn this Saturday, 12/7/2019. The program is now available and appears below.


The 14th Greater New York Metropolitan Area Econometrics Colloquium
Conference Venue
Forum 250, 2nd Floor,
133 South 36th Street, Philadelphia, PA, 19104
The Ronald O. Perelman Center for Political Science and Economics (PCPSE)
University of Pennsylvania
Organizing Committee
Karun Adusumilli, Xu Cheng, Frank Diebold, Wayne Gao, Frank Schorfheide
Sponsors
Department of Economics, University of Pennsylvania
Penn Institute for Economic Research
Warren Center for Network and Data Sciences
Program
Each presentation is 20 minutes plus 5 minutes discussion
8:30-9:00  Breakfast and Registration
9:00-10:15  Session 1. Chair: Wayne Gao
“Adaptation Bounds for Confidence Bands under Self-Similarity” by Timothy Armstrong
“Nonparametric Identification under Independent Component Restrictions” by Ivana Komunjer and Dennis Kristensen
“Local Projection Inference is Simpler and More Robust Than You Think” by José Luis Montiel Olea and Mikkel Plagborg-Møller
10:15-10:45  Break
10:45-12:00  Session 2. Chair: Karun Adusumilli
“Identification through Sparsity in Factor Models” by Simon Freyaldenhoven
“Predictive Properties of Forecast Combination, Ensemble Methods, and Bayesian Predictive Synthesis” by Kosaku Takanashi and Kenichiro McAlinn
“Learning Latent Factors from Diversified Projections and its Applications to Over-Estimated and Weak Factors” by Jianqing Fan and Yuan Liao
12:00-1:30  Lunch
1:30-2:45  Session 3. Chair: Xu Cheng
“Bootstrap with Cluster-dependence in Two or More Dimensions” by Konrad Menzel
“Robust Inference about Conditional Tail Features: A Panel Data Approach” by Yuya Sasaki and Yulong Wang
“On Binscatter” by Matias Cattaneo, Richard Crump, Max Farrell, and Yingjie Feng
2:45-3:15  Break
3:15-4:30  Session 4. Chair: Frank Schorfheide
“Estimation in Auction Models with Shape Restrictions” by Joris Pinkse and Karl Schurter
“Empirical Framework for Cournot Oligopoly with Private Information” by Gaurab Aryal and Federico Zincenko
“Identification of Structural and Counterfactual Parameters in a Large Class of Structural Econometric Models” by Lixiong Li
4:30-4:45  Break
4:45-6:00  Session 5. Chair: Frank Diebold
“A Short T Interactive Panel Data Model with Fixed Effects” by Jinyong Hahn and Nese Yildiz
“Salvaging Falsified Instrumental Variable Models” by Matthew Masten and Alexandre Poirier
“Bootstrap-Based Inference for Cube Root Asymptotics” by Matias Cattaneo, Michael Jansson, and Kenichi Nagasawa

Climate Change Economics and Finance

The latest IMF Finance & Development focuses on the economics of climate change. I feared it would be a forgettable collection of green-washed puff pieces, but fortunately I was wrong. There's actually a lot of insightful material, and of course a lot to discuss and debate, and it's an easy and enjoyable read. Thanks to Rahim Kanani at the IMF for alerting me.

FEATURED ARTICLES:
  • The Adaptive Age: No institution or individual can stand on the sidelines in the fight against climate change // by Kristalina Georgieva
  • The Greatest Balancing Act: When it comes to sustaining the vital symbiosis between the economic and the natural world, we all can do more // by David Attenborough and Christine Lagarde
  • Carbon Calculus: For deep greenhouse gas emission reductions, a long-term perspective on costs is essential // by Kenneth Gillingham
  • Fifty Shades of Green: The world needs a new, sustainable financial system to stop runaway climate change // by Mark Carney
  • Putting a Price on Pollution: Carbon pricing strategies could hold the key to meeting the world’s climate stabilization goals // by Ian Parry
  • Investing in Resilience: Disaster-prone countries are strengthening their ability to withstand climate events // by Bob Simison
  • Climate Change and Financial Risk: Central banks and financial regulators are starting to factor in climate change // by Pierpaolo Grippa, Jochen Schmittmann, and Felix Suntheim         
  • Reaping What We Sow: Smart changes to how we farm and eat can have a huge impact on our planet // by Nicoletta Batini                
  • A Greener Future for Finance: The successes and challenges of green bonds offer lessons for sustainable finance // by Afsaneh Beschloss and Mina Mashayekhi

Friday, November 22, 2019

Google's Quantum Computer

It's been a long time since I blogged on quantum computing.  Recently things have really heated up, with Google's new machine. Check out today's interesting interpretive Penn interview here. It's of special interest to us Penn folk, since modern computing started with Penn's ENIAC, and now a massive second revolution is in the wings. (If you want some of the deep science, see here.)




Saturday, November 16, 2019

George Tauchen 70th

Lots of good econometrics at Duke in honor of George Tauchen's 70th! Great conference. Check here for the first pages of some of George's greatest hits, as well as some fantastic "word clouds" summarizing the evolution of his research in the 80's, 90's, 00's, 10's, and all-time. Conference program and papers here. My slides for the Diebold-Rudebusch diurnal temperature range (DTR) paper are below. The link to George's work is the study of the daily range as a volatility measure, as for example in Gallant-Hsu-Tauchen, "Using Daily Range Data to Calibrate Volatility Diffusions." In George's case it's an asset return volatility context; in our DTR case it's a climate volatility context.
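For readers who haven't seen range-based volatility measurement, here's a minimal sketch of the basic idea using Parkinson's (1980) range-based variance estimator on simulated data. It is only an illustration of why the daily high-low range carries volatility information; Gallant-Hsu-Tauchen calibrate a full volatility diffusion to range data, which is much richer than this.

```python
# Minimal sketch: the daily high-low range as a volatility proxy (Parkinson, 1980).
# Simulated data only; not the Gallant-Hsu-Tauchen procedure.
import numpy as np

def parkinson_var(high, low):
    """Daily variance proxy from the high-low range: (ln(H/L))^2 / (4 ln 2)."""
    return np.log(high / low) ** 2 / (4.0 * np.log(2.0))

# Fake price data: 250 trading days of close, high, and low.
rng = np.random.default_rng(1)
close = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 250)))
high = close * np.exp(np.abs(rng.normal(0, 0.005, 250)))
low = close * np.exp(-np.abs(rng.normal(0, 0.005, 250)))

daily_var = parkinson_var(high, low)
print("Annualized range-based vol:", np.sqrt(250 * daily_var.mean()))

# The same construction applies in the climate context: for the diurnal
# temperature range (DTR), the analogue of (high, low) is (daily max
# temperature, daily min temperature).
```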

Monday, November 11, 2019

FRBSF on Climate Change

Just back from the fantastic Federal Reserve Bank of San Francisco Conference on the Economics of Climate Change. Program and links to papers here. Every paper was a highlight. Check out, for example, Bansal-Kiku-Ochoa here on the more "structural empirical" side, or Pesaran et al. here on the more "reduced-form empirical" side (although they have theory too). Really good stuff. The slides for my discussion of Pesaran et al. follow.

Friday, November 8, 2019

Panel GLS w Arbitrary Cov Matrices

Check out the fine new paper, "Feasible Generalized Least Squares for Panel Data with Cross-sectional and Serial Correlations," by Jushan Bai, Sung Hoon Choi, and Yuan Liao.

Really nice feasible GLS (yes, GLS!) panel regression allowing for general disturbance heteroskedasticity, serial correlation, and/or spatial correlation.

It's an interesting move back to efficient GLS estimation instead of punting on efficiency and just White-washing s.e.'s with a HAC estimator, which I always found rather unsettling. (See here and here.)  The cool twist is that Bai et al. allow for very general disturbance covariance structures, just as with HAC, without giving away efficiency since they do GLS. The best of both worlds!
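To see the FGLS logic in its simplest form, here's a toy example in a pure time-series setting with AR(1) disturbances. This is emphatically NOT the Bai-Choi-Liao estimator, which handles general cross-sectional and serial correlation in panels with high-dimensional covariance estimation; it just shows the three-step recipe -- OLS, estimate the disturbance covariance from OLS residuals, re-estimate by GLS.

```python
# Toy illustration of the feasible-GLS logic (not the Bai-Choi-Liao estimator):
# 1) OLS, 2) estimate the disturbance covariance from OLS residuals, 3) GLS.
import numpy as np

rng = np.random.default_rng(2)
T, k = 200, 3

# Simulate a regression with AR(1) disturbances (one simple departure from iid).
X = np.column_stack([np.ones(T), rng.normal(size=(T, k - 1))])
beta_true = np.array([1.0, 2.0, -1.0])
u = np.zeros(T)
for t in range(1, T):
    u[t] = 0.7 * u[t - 1] + rng.normal()
y = X @ beta_true + u

# Step 1: OLS.
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Step 2: estimate the AR(1) coefficient from OLS residuals;
# build Omega with (i, j) element proportional to rho^|i-j|.
e = y - X @ b_ols
rho = (e[1:] @ e[:-1]) / (e[:-1] @ e[:-1])
Omega = rho ** np.abs(np.subtract.outer(np.arange(T), np.arange(T)))

# Step 3: GLS with the estimated covariance.
Oinv = np.linalg.inv(Omega)
b_gls = np.linalg.solve(X.T @ Oinv @ X, X.T @ Oinv @ y)
print("OLS: ", b_ols.round(3))
print("FGLS:", b_gls.round(3))
```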

So nice to have one of the paper's authors, Yuan Liao, visiting Penn Economics this year. What a windfall for us.


Sunday, October 27, 2019

Machine Learning for Financial Crises


Below are the slides from my discussion of Helene Rey et al., "Answering the Queen: Machine Learning and Financial Crises", which I gave a few days ago at a fine NBER IFM meeting (program and clickable papers here). I also discussed it in June at the BIS annual research meeting in Zurich. The key development since the earlier mid-summer draft is that they implemented a real-time financial crisis prediction analysis for France using vintage data, as opposed to quasi-real-time analysis using final-revised data. Moving to real time of course somewhat degrades the quasi-real-time results, but they largely hold up. Very impressive. My discussion therefore offers suggestions for improving evaluation credibility in the remaining cases where vintage datasets are not yet available. On the other hand, I also note how subtle but important look-ahead biases can creep in even when vintage data are available and used. I conclude that the only fully convincing evaluation involves implementing their approach moving forward, recording the results, and building up a true track record.
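For readers unfamiliar with the vintage-data issue, here's a minimal sketch of point-in-time data selection, the mechanical heart of a genuine real-time evaluation: a forecast dated t may use only the data vintages published by t, not the subsequently revised values. The data and column names are purely hypothetical.

```python
# Minimal sketch of point-in-time ("vintage") data selection for real-time
# forecast evaluation. Data and column names are hypothetical.
import pandas as pd

# Each row: an observation for reference period `period`, as published in `vintage`.
vintages = pd.DataFrame({
    "period":  ["2007Q1", "2007Q1", "2007Q2", "2007Q2"],
    "vintage": ["2007-05", "2008-05", "2007-08", "2008-08"],
    "gdp_growth": [0.6, 0.1, 1.0, 0.8],   # later vintages revise earlier ones
})

def as_of(df, date):
    """Return the latest value of each reference period known at `date`."""
    known = df[df["vintage"] <= date]
    return known.sort_values("vintage").groupby("period").tail(1)

# A forecast made in late 2007 must use the 2007 vintages, not the 2008 revisions.
print(as_of(vintages, "2007-12"))
```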

Online Learning vs. TVP Forecast Combination

[This post is based on the first slide (below) of a discussion of Helene Rey et al., which I gave a few days ago at a fine NBER IFM meeting (program and clickable papers here). The paper is fascinating and impressive, and I'll blog on it separately next time. But the slide below is more of a side rant on general issues, and I skipped it in the discussion of Rey et al. to be sure to have time to address their particular issues.]

Quite a while ago I blogged here on the ex ante expected loss minimization that underlies traditional econometric/statistical forecast combination, vs. the ex post regret minimization that underlies "online learning" and related "machine learning" methods. Nothing has changed. That is, as regards ex post regret minimization, I'm still intrigued, but I'm still not persuaded.

And there's another thing that bothers me. As implemented, ML-style online learning and traditional econometric-style forecast combination with time-varying parameters (TVPs) are almost identical: just projection (regression) of realizations on forecasts, reading off the combining weights as the regression coefficients. OF COURSE we can generalize to allow for time-varying combining weights, non-linear combinations, regularization in high dimensions, etc., and hundreds of econometrics papers have addressed and explored those issues. Yet the ML types seem to think they invented everything, and too many economists are buying it. Rey et al., for example, don't so much as mention the econometric forecast combination literature, which by now occupies large chapters of leading textbooks, such as the Elliott and Timmermann volume cited at the bottom of the slide below.
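To make the near-equivalence concrete, here's a minimal sketch on simulated data: the combining weights are just the coefficients from regressing realizations on forecasts, and running the same regression over a rolling window gives a simple time-varying-weight version. The data and window length are purely illustrative.

```python
# Forecast combination as regression: weights are the coefficients from
# projecting realizations on forecasts; a rolling window gives TVP weights.
# Simulated data only.
import numpy as np

rng = np.random.default_rng(3)
T = 300
y = rng.normal(size=T)                        # realizations
F = np.column_stack([y + rng.normal(0, s, T)  # two noisy forecasts of y
                     for s in (0.5, 1.0)])

# Static combining weights: regress y on the forecasts (with an intercept).
X = np.column_stack([np.ones(T), F])
w_static, *_ = np.linalg.lstsq(X, y, rcond=None)

# Time-varying weights: the same regression over a rolling window.
window = 60
w_tv = np.array([
    np.linalg.lstsq(X[t - window:t], y[t - window:t], rcond=None)[0]
    for t in range(window, T)
])
print("static weights:", w_static.round(2))
print("rolling weights, last period:", w_tv[-1].round(2))
```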