Friday, December 27, 2019

Holiday Haze






Your blogger will be back in the New Year. 

Meanwhile, Happy Holidays to all!

Wednesday, December 4, 2019

Penn Econometrics Colloquium this Saturday

The annual Greater New York Metropolitan Area Econometrics Colloquium will be hosted at Penn, this Saturday 12/7/2019. The program is now available and appears below. 


The 14th Greater New York Metropolitan Area Econometrics Colloquium
Conference Venue
Forum 250, 2nd Floor,
133 South 36th Street, Philadelphia, PA, 19104
The Ronald O. Perelman Center for Political Science and Economics (PCPSE)
University of Pennsylvania
Organizing Committee
Karun Adusumilli, Xu Cheng, Frank Diebold, Wayne Gao, Frank Schorfheide
Sponsors
Department of Economics, University of Pennsylvania
Penn Institute for Economic Research
Warren Center for Network and Data Sciences
Program
Each presentation is 20 minutes plus 5 minutes discussion
8:30-9:00  Breakfast and Registration
9:00-10:15  Session 1. Chair: Wayne Gao
“Adaptation Bounds for Confidence Bands under Self-Similarity” by Timothy Armstrong
“Nonparametric Identification under Independent Component Restrictions” by Ivana Komunjer and Dennis Kristensen
“Local Projection Inference is Simpler and More Robust Than You Think” by José Luis Montiel Olea and Mikkel Plagborg-Møller
10:15-10:45  Break
10:45-12:00  Session 2. Chair: Karun Adusumilli
“Identification through Sparsity in Factor Models” by Simon Freyaldenhoven
“Predictive Properties of Forecast Combination, Ensemble Methods, and Bayesian Predictive Synthesis” by Kosaku Takanashi and Kenichiro McAlinn
“Learning Latent Factors from Diversified Projections and its Applications to Over-Estimated and Weak Factors” by Jianqing Fan and Yuan Liao
12:00-1:30  Lunch
1:30-2:45  Session 3. Chair: Xu Cheng
“Bootstrap with Cluster-dependence in Two or More Dimensions” by Konrad Menzel
“Robust Inference about Conditional Tail Features: A Panel Data Approach” by Yuya Sasaki and Yulong Wang
“On Binscatter” by Matias Cattaneo, Richard Crump, Max Farrell, and Yingjie Feng
2:45-3:15  Break
3:15-4:30  Session 4. Chair: Frank Schorfheide
“Estimation in Auction Models with Shape Restrictions” by Joris Pinkse and Karl Schurter
“Empirical Framework for Cournot Oligopoly with Private Information” by Gaurab Aryal and Federico Zincenko
“Identification of Structural and Counterfactual Parameters in a Large Class of Structural Econometric Models” by Lixiong Li
4:30-4:45  Break
4:45-6:00  Session 5. Chair: Frank Diebold
“A Short T Interactive Panel Data Model with Fixed Effects” by Jinyong Hahn and Nese Yildiz
“Salvaging Falsified Instrumental Variable Models” by Matthew Masten and Alexandre Poirier
“Bootstrap-Based Inference for Cube Root Asymptotics” by Matias Cattaneo, Michael Jansson, and Kenichi Nagasawa

Climate Change Economics and Finance

The latest issue of the IMF's Finance & Development focuses on the economics of climate change. I feared it would be a forgettable collection of green-washed puff pieces, but fortunately I was wrong. There's a lot of insightful material, of course a lot to discuss and debate, and it's an easy and enjoyable read. Thanks to Rahim Kanani at the IMF for alerting me.

FEATURED ARTICLES:
  • The Adaptive Age: No institution or individual can stand on the sidelines in the fight against climate change // by Kristalina Georgieva
  • The Greatest Balancing Act: When it comes to sustaining the vital symbiosis between the economic and the natural world, we all can do more // by David Attenborough and Christine Lagarde
  • Carbon Calculus: For deep greenhouse gas emission reductions, a long-term perspective on costs is essential // by Kenneth Gillingham
  • Fifty Shades of Green: The world needs a new, sustainable financial system to stop runaway climate change // by Mark Carney
  • Putting a Price on Pollution: Carbon pricing strategies could hold the key to meeting the world’s climate stabilization goals // by Ian Parry
  • Investing in Resilience: Disaster-prone countries are strengthening their ability to withstand climate events // by Bob Simison
  • Climate Change and Financial Risk: Central banks and financial regulators are starting to factor in climate change // by Pierpaolo Grippa, Jochen Schmittmann, and Felix Suntheim         
  • Reaping What We Sow: Smart changes to how we farm and eat can have a huge impact on our planet // by Nicoletta Batini                
  • A Greener Future for Finance: The successes and challenges of green bonds offer lessons for sustainable finance // by Afsaneh Beschloss and Mina Mashayekhi

Friday, November 22, 2019

Google's Quantum Computer

It's been a long time since I blogged on quantum computing.  Recently things have really heated up, with Google's new machine. Check out today's interesting interpretive Penn interview here. It's of special interest to us Penn folk, since modern computing started with Penn's ENIAC, and now a massive second revolution is in the wings. (If you want some of the deep science, see here.)




Saturday, November 16, 2019

George Tauchen 70th

Lots of good econometrics at Duke in honor of George Tauchen's 70th! Great conference. Check here for the first pages of some of George's greatest hits, as well as some fantastic "word clouds" summarizing the evolution of his research in the 80's, 90's, 00's, 10's, and all-time. Conference program and papers here. My slides for the Diebold-Rudebusch diurnal temperature range (DTR) paper are below. The link to George's work is the use of the daily range as a volatility measure, as for example in Gallant-Hsu-Tauchen, "Using Daily Range Data to Calibrate Volatility Diffusions." In George's case it's an asset return volatility context; in our DTR case it's a climate volatility context.
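For readers less familiar with range-based volatility estimation, the classic result behind that link is Parkinson (1980): for a driftless log price following Brownian motion, the squared daily log range, suitably scaled, is an unbiased estimator of the diffusion variance,

\hat{\sigma}_t^2 = \frac{(\ln H_t - \ln L_t)^2}{4\ln 2},

where H_t and L_t are the day-t high and low prices. Our DTR work transplants the same daily-range idea, with the daily temperature maximum and minimum playing the roles of H_t and L_t (without, of course, the log-price interpretation).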


Monday, November 11, 2019

FRBSF on Climate Change

Just back from the fantastic Federal Reserve Bank of San Francisco Conference on the Economics of Climate Change. Program and links to papers here. Every paper was a highlight. Check out, for example, Bansal-Kiku-Ochoa here on the more "structural empirical" side, or Pesaran et al. here on the more "reduced-form empirical" side (although they have theory too). Really good stuff. The slides for my discussion of Pesaran et al. follow.


Friday, November 8, 2019

Panel GLS w Arbitrary Cov Matrices

Check out the fine new paper, "Feasible Generalized Least Squares for Panel Data with Cross-sectional and Serial Correlations," by Jushan Bai, Sung Hoon Choi, and Yuan Liao.

Really nice feasible GLS (yes, GLS!) panel regression allowing for general disturbance heteroskedasticity, serial correlation, and/or spatial correlation.

It's an interesting move back to efficient GLS estimation instead of punting on efficiency and just White-washing s.e.'s with a HAC estimator, which I always found rather unsettling. (See here and here.)  The cool twist is that Bai et al. allow for very general disturbance covariance structures, just as with HAC, without giving away efficiency since they do GLS. The best of both worlds!
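For intuition, here is a minimal numpy sketch of the generic two-step feasible-GLS idea that the paper refines. This is emphatically not the Bai-Choi-Liao estimator: their covariance estimation is far more general (cross-sectional and serial correlation, heteroskedasticity) and comes with supporting theory. The toy version below assumes a panel stacked unit-by-unit and estimates only a common within-unit T x T disturbance covariance.

import numpy as np

def fgls_panel(y, X, n_units, n_periods, ridge=1e-8):
    # Toy two-step feasible GLS for a panel stacked unit-by-unit.
    # Step 1: OLS to get residuals.
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = (y - X @ beta_ols).reshape(n_units, n_periods)
    # Step 2: estimate a common T x T within-unit disturbance covariance
    # from the OLS residuals (small ridge term for invertibility).
    omega = resid.T @ resid / n_units + ridge * np.eye(n_periods)
    omega_inv = np.linalg.inv(omega)
    # Step 3: GLS with the implied block-diagonal covariance.
    k = X.shape[1]
    XtOX, XtOy = np.zeros((k, k)), np.zeros(k)
    for i in range(n_units):
        Xi = X[i * n_periods:(i + 1) * n_periods]
        yi = y[i * n_periods:(i + 1) * n_periods]
        XtOX += Xi.T @ omega_inv @ Xi
        XtOy += Xi.T @ omega_inv @ yi
    return beta_ols, np.linalg.solve(XtOX, XtOy)

# Simulated example: 200 units, 10 periods, AR(1) disturbances.
rng = np.random.default_rng(0)
N, T = 200, 10
X = rng.normal(size=(N * T, 3))
e = rng.normal(size=(N, T))
for t in range(1, T):
    e[:, t] += 0.7 * e[:, t - 1]
y = X @ np.array([1.0, -0.5, 0.25]) + e.ravel()
print(fgls_panel(y, X, N, T))

The point of the paper, as I read it, is precisely that the disturbance covariance can be left much freer than in this toy and still be estimated well enough for feasible GLS to deliver the efficiency gain.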

So nice to have one of the paper's authors, Yuan Liao, visiting Penn Economics this year. What a windfall for us.


Sunday, October 27, 2019

Machine Learning for Financial Crises


Below are the slides from my discussion of Helene Rey et al., "Answering the Queen: Machine Learning and Financial Crises", which I gave a few days ago at a fine NBER IFM meeting (program and clickable papers here). I also discussed it in June at the BIS annual research meeting in Zurich. The key development since the earlier mid-summer draft is that they actually implemented a real-time financial crisis prediction analysis for France using vintage data, as opposed to quasi-real-time using final-revised data. Moving to real time of course somewhat degrades the quasi-real-time results, but they largely hold up. Very impressive. Therefore I now offer suggestions for improving evaluation credibility in the remaining cases where vintage datasets are not yet available. On the other hand, I also note how subtle but important look-ahead biases can creep in even when vintage data are available and used. I conclude that the only fully-convincing evaluation involves implementing their approach moving forward, recording the results, and building up a true track record.
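The real-time vs. quasi-real-time distinction is easiest to see schematically. Here is a hypothetical sketch (the vintages dictionary, feature names, and logistic-regression choice are all illustrative assumptions, not their implementation):

import pandas as pd
from sklearn.linear_model import LogisticRegression

def pseudo_real_time_probs(vintages, labels, features):
    # vintages: dict mapping each forecast-origin date to the predictor
    #           DataFrame exactly as it was available on that date
    # labels:   0/1 crisis indicator by date (note: crisis dating is itself
    #           typically done ex post -- one channel for residual look-ahead bias)
    probs = {}
    for origin, data in sorted(vintages.items()):
        train = data.loc[data.index < origin, features].dropna()
        y = labels.reindex(train.index).fillna(0).astype(int)
        model = LogisticRegression(max_iter=1000).fit(train, y)
        probs[origin] = model.predict_proba(data.loc[[origin], features])[0, 1]
    return pd.Series(probs)

# A quasi-real-time evaluation replaces vintages[origin] with the single
# final-revised dataset, so later data revisions quietly leak into the past.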


Online Learning vs. TVP Forecast Combination

[This post is based on the first slide (below) of a discussion of Helene Rey et al., which I gave a few days ago at a fine NBER IFM meeting (program and clickable papers here). The paper is fascinating and impressive, and I'll blog on it separately next time. But the slide below is more of a side rant on general issues, and I skipped it in the discussion of Rey et al. to be sure to have time to address their particular issues.]

Quite a while ago I blogged here on the ex ante expected loss minimization that underlies traditional econometric/statistical forecast combination, vs. the ex post regret minimization that underlies "online learning" and related "machine learning" methods. Nothing has changed. That is, as regards ex post regret minimization, I'm still intrigued, but I'm still not persuaded.

And there's another thing that bothers me. As implemented, ML-style online learning and traditional econometric-style forecast combination with time-varying parameters (TVPs) are almost identical: just projection (regression) of realizations on forecasts, reading off the combining weights as the regression coefficients.  OF COURSE we can generalize to allow for time-varying combining weights, non-linear combinations, regularization in high dimensions, etc., and hundreds of econometrics papers have addressed and explored those issues. Yet the ML types seem to think they invented everything, and too many economists are buying it. Rey et al., for example, don't so much as mention the econometric forecast combination literature, which by now occupies large chapters of  leading textbooks, like Elliott and Timmermann at the bottom of the slide below.
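To make the identity concrete, here is a tiny simulated sketch of "combination as projection": rolling-window least squares of realizations on forecasts, with the combining weights read off as the regression coefficients and the rolling window supplying the time variation. Everything below is illustrative; the two "forecasts" are simply made up.

import numpy as np

rng = np.random.default_rng(1)
T = 500
y = 0.1 * rng.normal(size=T).cumsum() + rng.normal(size=T)   # realizations

# two fictitious competing forecasts of y
f1 = y + rng.normal(scale=1.0, size=T)        # unbiased but noisy
f2 = 0.7 * y + rng.normal(scale=0.3, size=T)  # smoother but biased
F = np.column_stack([np.ones(T), f1, f2])     # intercept plus forecasts

window = 60
combined = np.full(T, np.nan)
for t in range(window, T):
    # combining weights = coefficients from regressing realizations on
    # forecasts over the trailing window only, so the weights drift over time
    w, *_ = np.linalg.lstsq(F[t - window:t], y[t - window:t], rcond=None)
    combined[t] = F[t] @ w

print("combined-forecast MSE:", np.nanmean((combined - y) ** 2))
print("individual MSEs:", np.mean((f1 - y) ** 2), np.mean((f2 - y) ** 2))

An online-learning update (exponentially weighted averaging, online gradient steps on the combining weights, and so on) is doing essentially the same projection with a different weighting of the past -- which is exactly the point.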


Thursday, October 24, 2019

Volatility and Risk Institute

NYU's Volatility Institute is expanding into the Volatility and Risk Institute (VRI). The four key initiatives are Climate Risk (run by Johannes Stroebel), Cyber Risk (run by Randal Milch), Financial Risk (run by Viral Acharya), and Geopolitical Risk (run by Thomas Philippon). Details here. This is a big deal. Great to see climate given such obvious and appropriate prominence. And notice how interconnected are climate, financial, and geopolitical risks.
The following is adapted from an email from Rob Engle and Dick Berner:

The Volatility Institute and its V-lab have, for the past decade, assessed risk through the lens of financial volatility, providing real-time measurement and forecasts of volatility and correlations for a wide spectrum of financial assets, and SRISK, a powerful measure of the resilience of the global financial system. Adopting an interdisciplinary approach, the VRI will build on that foundation to better assess newly emerging nonfinancial and financial risks facing today’s business leaders and policymakers, including climate-related, cyber/operational and geopolitical risks, as well as the interplay among them. 

The VRI will be co-directed by two NYU Stern faculty: Nobel Laureate Robert Engle, Michael Armellino Professor of Management and Financial Services and creator of the V-lab; and Richard Berner, Professor of Management Practice and former Director of the Office of Financial Research, established by the Dodd–Frank Wall Street Reform and Consumer Protection Act to help promote financial stability by delivering high-quality financial data, standards and analysis to policymakers and the public. 

The VRI will serve as the designated hub to facilitate, support and promote risk-related research, and external and internal engagement among scholars, practitioners and policymakers. To realize its interdisciplinary potential, the VRI will engage the expertise of faculty across New York University, including at the Courant Institute of Mathematical Sciences, Law School, Tandon School of Engineering, Wagner School of Public Policy and Wilf Family Department of Politics in the Faculty of Arts & Science. 

Saturday, October 19, 2019

Missing data in Factor Models

Serena Ng's site is back. Her new paper with Jushan Bai, "Matrix Completion, Counterfactuals, and Factor Analysis of Missing Data", which I blogged about earlier, is now up on arXiv, here.  Closely related, Markus Pelger just sent me his new paper with Ruoxuan Xiong, "Large Dimensional Latent Factor Modeling with Missing Observations and Applications to Causal Inference", which I look forward to reading. One is dated Oct 15 and one is dated Oct 16. Science is rushing forward!

Saturday, October 12, 2019

Interval Prediction

Last time I blogged on Serena's amazing presentation from Per's Chicago meeting:
https://fxdiebold.blogspot.com/2019/10/large-dimensional-factor-analysis-with.html

But I was equally blown away by Rina's amazing "Predictive inference with the jackknife+", by Rina Foygel Barber, Emmanuel J. Candès, Aaditya Ramdas, and Ryan J. Tibshirani:
https://arxiv.org/pdf/1905.02928.pdf

Correctly calibrated prediction intervals despite arbitrary model misspecification!
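For concreteness, here is my own rough numpy sketch of the jackknife+ recipe as I read it from the paper: leave-one-out refits, with the interval built from leave-one-out residuals. The plain least-squares fit below is just a stand-in; any regression routine could replace it.

import numpy as np
from math import ceil, floor

def jackknife_plus_interval(X, y, x_new, alpha=0.1):
    # X: (n, p) training covariates, y: (n,) responses, x_new: (p,) test point.
    n = len(y)
    lo_vals, hi_vals = np.empty(n), np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        # leave-one-out fit (here: least squares with an intercept)
        Xi = np.column_stack([np.ones(n - 1), X[keep]])
        b, *_ = np.linalg.lstsq(Xi, y[keep], rcond=None)
        mu_i = np.r_[1.0, X[i]] @ b          # mu_{-i}(x_i)
        mu_new = np.r_[1.0, x_new] @ b       # mu_{-i}(x_new)
        r = abs(y[i] - mu_i)                 # leave-one-out residual
        lo_vals[i], hi_vals[i] = mu_new - r, mu_new + r
    lo = np.sort(lo_vals)[max(floor(alpha * (n + 1)) - 1, 0)]
    hi = np.sort(hi_vals)[min(ceil((1 - alpha) * (n + 1)) - 1, n - 1)]
    return lo, hi

# simulated example with heavy-tailed noise
rng = np.random.default_rng(0)
Xtr = rng.normal(size=(100, 3))
ytr = Xtr @ np.array([1.0, 2.0, -1.0]) + rng.standard_t(df=3, size=100)
print(jackknife_plus_interval(Xtr, ytr, x_new=np.zeros(3)))

If I have the construction right, the paper's worst-case guarantee for this interval is coverage of at least 1 - 2*alpha under exchangeability, for any fitting algorithm.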

Of course I'm left with lots of questions. They have nice correct-coverage theorems, but what about length? I would like theorems (not just simulations) on shortest-length intervals with guaranteed correct coverage. Their results also seem to require iid or similarly exchangeable environments. What about heteroskedastic environments, where the prediction-error variance depends on covariates? What about time series environments?

Then, quite amazingly, "Distributional conformal prediction" by Victor Chernozhukov et al. arrived in my mailbox.
https://arxiv.org/pdf/1909.07889.pdf
It is similarly motivated and may address some of my questions.

Anyway, great developments for interval prediction!

Monday, October 7, 2019

Carbon Offsets



At the end of a recently-received request that I submit my receipts from a conference trip last week:
... Let me know if you'd like us to purchase carbon offsets for your miles, and deduct the amount (or any amount) from your reimbursement. We’ll do it on your behalf. Thank You! For example: NYC-Chicago round trip = $5.72. Oxford-Chicago round trip = $35.77. Nantes-Chicago round trip = $37.08. Bergen-Chicago round trip = $35.44.
A first for me!  Ironically, I spoke on some of my new climate econometrics work with Glenn Rudebusch. (It was not a climate conference per se, and mine was the only climate paper.)

Sunday, October 6, 2019

Large Dimensional Factor Analysis with Missing Data

Back from the very strong Stevanovich meeting.  Program and abstracts here.  One among many  highlights was:

Large Dimensional Factor Analysis with Missing Data
Presented by Serena Ng (Columbia, Department of Economics)

Abstract:
This paper introduces two factor-based imputation procedures that will fill missing values with consistent estimates of the common component. The first method is applicable when the missing data are bunched. The second method is appropriate when the data are missing in a staggered or disorganized manner. Under the strong factor assumption, it is shown that the low rank component can be consistently estimated but there will be at least four convergence rates, and for some entries, re-estimation can accelerate convergence. We provide a complete characterization of the sampling error without requiring regularization or imposing the missing at random assumption as in the machine learning literature. The methodology can be used in a wide range of applications, including estimation of covariances and counterfactuals.

This paper just blew me away.  Re-arrange the X columns to get all the "complete cases across people" (tall block) in the leftmost columns, and re-arrange the X rows to get all the "complete cases across variables" (wide block) in the topmost rows.  The intersection is the "balanced" block in the upper left.  Then iterate on the tall and wide blocks to impute the missing data in the bottom right "missing data" block. The key figure that illustrates the procedure provided a real "eureka moment" for me.  Plus they have a full asymptotic theory as opposed to just worst-case bounds.

Kudos!
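At the risk of butchering the details, here is a stripped-down, non-iterative numpy sketch of how I understood the tall/wide logic from the talk: principal components on the tall block give factor estimates, each remaining column's loadings come from its observed rows, and the estimated common component fills the missing lower-right block. The actual procedure iterates and, as the abstract says, comes with a full characterization of the sampling error; none of that is captured here.

import numpy as np

def tall_wide_impute_once(X, r):
    # X: T x N panel with NaNs confined to a lower-right block, pre-arranged so
    # fully observed columns ("tall block") come first and fully observed rows
    # ("wide block") are on top. r: number of factors.
    T, N = X.shape
    tall_cols = ~np.isnan(X).any(axis=0)

    # factors from the tall block via principal components
    U, S, Vt = np.linalg.svd(X[:, tall_cols], full_matrices=False)
    F = np.sqrt(T) * U[:, :r]                    # T x r factor estimates

    # loadings: regress each column's observed rows on the matching factor rows
    Lam = np.empty((N, r))
    for j in range(N):
        obs = ~np.isnan(X[:, j])
        Lam[j], *_ = np.linalg.lstsq(F[obs], X[obs, j], rcond=None)

    common = F @ Lam.T                           # estimated common component
    return np.where(np.isnan(X), common, X)      # fill only the missing cells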

I'm not sure whether the paper is circulating yet, and Serena's web site vanished recently (not her fault -- evidently Google made a massive error), but you'll soon be able to get the paper one way or another.






Sunday, September 29, 2019

Krusell on Economics of Climate

In general I'm not a fan of podcasts -- listening takes annoyingly longer than reading -- but if you're interested in climate economics you must hear this Per Krusell gem. It's from 2017 but fresh as ever. He also gave a mini-course at Penn back in 2017; so sad I could not go.

I also like Marty Weitzman's Climate Shock (written with Gernot Wagner). Similarly informed and serious. Quirky and endearing writing style. Makes a strong case for a low discount rate.


Machine Learning and Big Data

Nice looking meeting coming up this week, "Big Data and Machine Learning in Econometrics, Finance, and Statistics" at U Chicago's Stevanovich Center for Financial Mathematics.  Preliminary program here.


Sunday, September 22, 2019

Temperature Volatility

Temperature level is of course heavily studied, and trending upward alarmingly quickly. In a new paper, Glenn Rudebusch and I study temperature volatility, which has been much less heavily studied. We show that temperature volatility is pervasively trending downward, and that its "twin peaks" seasonal pattern is also evolving, both of which have implications for agriculture and much else. Our analysis is based on the daily temperature range, in precise parallel with the time-honored use of the daily log price range as a volatility (quadratic variation) estimator in financial markets.
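Computationally the object could hardly be simpler. A hypothetical sketch (the DataFrame layout and the tmax/tmin column names are assumptions for illustration, not our actual data handling):

import numpy as np
import pandas as pd

def dtr_trend_and_seasonal(station):
    # station: DataFrame with a daily DatetimeIndex and columns 'tmax', 'tmin'
    dtr = (station["tmax"] - station["tmin"]).dropna()   # daily temperature range
    t = (dtr.index - dtr.index[0]).days.to_numpy()       # days since sample start
    slope, _ = np.polyfit(t, dtr.to_numpy(), 1)          # crude linear trend
    seasonal = dtr.groupby(dtr.index.month).mean()       # monthly seasonal pattern
    return 365.25 * slope, seasonal                      # trend in degrees per year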

Tuesday, September 17, 2019

Econometrics Meeting at Penn

Please consider submitting to this year's Greater New York Area (GNYA) econometrics meeting, hosted by Penn (for the third time, I think). Always a fine conference. GNYA is defined VERY broadly. Program and participant list from last year's meeting at Princeton here. Call for papers for this year's Penn meeting immediately below.

Dear friends,

We are pleased to announce that the University of Pennsylvania will host the Greater New York Metropolitan Area Econometrics Colloquium on Saturday, December 7, 2019.  If you would like to present your work at the colloquium, please send your paper or extended abstract by Sunday, October 27, 2019.  We plan to have the program selected by Friday, November 8th, 2019.

Further information about the colloquium will be posted at


Please feel free to forward this call for papers to your colleagues. As usual, we will not include presentations by graduate students in this short event. Each presentation should last about 30 minutes. We plan to include 8-12 presentations in the program.

Continental breakfast, lunch, and dinner will be provided. We cannot cover travel or accommodations. There are three hotels on campus: The Study at University City (newest, 0.3 mile), Sheraton Philadelphia University City (1 block), Hilton Inn at Penn (1 block). About two miles from campus, there are also many options in center city Philadelphia.

Please send submissions to gnyec19@gmail.com.
Best,
Karun Adusumilli
Xu Cheng
Frank Diebold
Wayne Gao
Frank Schorfheide

Monday, September 9, 2019

Environmental and Energy Policy and the Economy

The first volume from last May's NBER meeting is forthcoming; see https://www.nber.org/books/kotc-1. Marvelously, the meetings and volumes will be ongoing annually. See below for the 2020 CFP. Note that submissions from non-NBER researchers are welcome.


NBER Call for Papers / Proposals
2nd Annual NBER Environmental and Energy Policy and the Economy Conference

 
Dear Researchers,
 
We are seeking papers or proposals for the second annual NBER conference/publication on Environmental and Energy Policy and the Economy. We will accept six papers for presentation at the National Press Club in Washington, D.C. on May 21, 2020. The audience will include the professional staffs of government agencies, research institutions, and NGOs focused on energy and environmental policy. The contributed papers will then be published in an annual volume by the University of Chicago Press.
 
To view last year's agenda and papers for the forthcoming volume, please click HERE and HERE.
 
Papers should be relevant to current policy debates and accessible to a professional audience, yet following standard NBER protocol, they should avoid making policy recommendations. While standalone projects are specifically encouraged, we also welcome spinoff projects where authors intend to later submit a more extensive or technical version to a journal, or may have already done so. While no paper should be a duplicate of another paper, alternate versions that put results into a more general, policy relevant context and summarize them in more accessible language are encouraged. This is a great opportunity to communicate research to the policy community.
 
Submissions should be either complete papers or 2-3 page abstracts outlining the intended contribution. Submissions are due by October 14, 2019, and can be uploaded at
 
http://www.nber.org/confsubmit/backend/cfp?id=EEPEs20
 
Submissions from researchers who are not affiliated with the NBER, and from researchers who are from groups that have been historically under-represented in the economics profession, are welcome. The authors of each paper will share an $8,000 honorarium.
 
Decisions about accepted papers will be made by mid-November. Complete drafts of papers will be due in early April 2020.
 
We look forward to hearing from you.
 
Matthew Kotchen
James Stock
Catherine Wolfram