Wednesday, April 14, 2021

To my loyal email subscribers

Hi Email Subscribers,

Thanks so much for your No Hesitations support and email subscriptions over the years. The message below just arrived. Not sure what I'll be able to do -- we will see. In any event, your email subscription may be temporarily or even permanently interrupted or cancelled. You can of course always view the blog directly at https://fxdiebold.blogspot.com, or follow my Twitter posts (@FrancisDiebold), which link to the blog posts. I hope you will!

Cheers,

Frank


Hi Francis,
FeedBurner has been a part of Google for almost 14 years, and we're making several upcoming changes to support the product's next chapter. Here’s what you can expect to change and what you can do now to ensure you’re prepared.
Starting in July, we are transitioning FeedBurner onto a more stable, modern infrastructure. This will keep the product up and running for all users, but it also means that we will be turning down most non-core feed management features, including email subscriptions, at that time.
For those who use FeedBurner email subscriptions, we recommend downloading your email subscribers so that you can migrate to a new email subscription service.
For many users, no action is required. All existing feeds will continue to serve uninterrupted, and you can continue to create new accounts and burn new feeds. Core feed management functionality will continue to be supported, such as the ability to change the URL, source feed, title, and podcast metadata of your feed, along with basic analytics.

Tuesday, April 13, 2021

Practical Guide to Climate Econometrics

This tutorial site is very nice for climate data sources, computing environments for climate data manipulation, special climate data issues, etc., even if it's shallow on actual modeling.  Good for students.  Hats off to the lead authors, two of whom are indeed Ph.D. students: Azhar Hussain, James Rising, Kevin Schwarzwald, and Ana Trisovic.  Thanks to Glenn Rudebusch for alerting me to the site.

Wednesday, April 7, 2021

Economic Forecasting Mini-Course

 


DEADLINE: Friday 7 May 2021
Euro Area Business Cycle Network Training School

 

Recent Developments in Forecasting

By Graham Elliott (UC San Diego) and Allan Timmermann (UC San Diego)

Hosted online with Banca d’Italia, Italy

1-8 June 2021

Deadline: 6pm (UK time), Friday 7 May 2021

General Description

We are pleased to announce details of the latest EABCN Training School, a three-day course entitled “Recent Developments in Forecasting”. Professors Graham Elliott and Allan Timmermann will teach the course. It is primarily aimed at participants in the Euro Area Business Cycle Network, but applications will also be considered from doctoral students, post-doctoral researchers and economists working in central banks and government institutions outside of the network, as well as commercial organisations (fees applicable for non-network organisations).
 
Course Outline

The course introduces participants to a variety of advanced topics and recent developments in economic forecasting. The first part of the course examines the forecasting problem in general, showing that point forecasting is parameter estimation with a conditional model of the outcome and that density forecasting is estimation of a conditional density. We clarify what we mean by optimal forecasting and relate classical and Bayesian approaches.
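To make the "forecasting as conditional estimation" point concrete, here is a minimal Python sketch (illustrative only, not from the course materials): an AR(1) fit by OLS delivers the point forecast as the estimated conditional mean and, under an assumed Gaussian error, the density forecast as an estimated conditional density.

```python
# Minimal sketch: point forecast = estimated conditional mean,
# density forecast = estimated conditional density (Gaussian error assumed).
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1): y_t = 0.7 * y_{t-1} + e_t
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.7 * y[t - 1] + rng.normal()

# OLS estimate of the conditional-mean parameter
X, Y = y[:-1], y[1:]
phi_hat = X @ Y / (X @ X)
sigma2_hat = np.var(Y - phi_hat * X, ddof=1)

point_forecast = phi_hat * y[-1]   # one-step-ahead conditional mean
print(f"point forecast:   {point_forecast:.3f}")
print(f"density forecast: N({point_forecast:.3f}, {sigma2_hat:.3f})")
```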
 
Understanding these issues provides a foundation for all forecasting problems. Binary forecasting or classification is most closely related to decision making. The simplicity of the loss function allows many strong results. Parametric, semiparametric, and nonparametric methods will be discussed and the properties of the approaches examined.
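For instance (again an illustration, not course code), with an asymmetric loss the optimal binary forecast simply thresholds the predicted probability at a loss-dependent cutoff:

```python
# Minimal sketch: optimal binary forecasts under asymmetric loss.
import numpy as np

def optimal_binary_forecast(p, cost_false_pos=1.0, cost_false_neg=4.0):
    """Predict 1 when the expected loss of predicting 1 is below that of predicting 0.

    Expected loss of predicting 1: (1 - p) * cost_false_pos
    Expected loss of predicting 0:       p * cost_false_neg
    => predict 1 iff p > cost_false_pos / (cost_false_pos + cost_false_neg).
    """
    cutoff = cost_false_pos / (cost_false_pos + cost_false_neg)
    return (np.asarray(p) > cutoff).astype(int), cutoff

forecasts, cutoff = optimal_binary_forecast([0.1, 0.25, 0.6])
print(cutoff, forecasts)  # cutoff = 0.2, forecasts = [0 1 1]
```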
 
Often the difference between good and bad forecasting approaches hinges on how they deal with changes to the underlying data generating process. The course therefore next addresses the consequences of model instability for forecasting performance and discusses strategies for dealing with such instability, using empirical illustrations from macroeconomics and finance. We also discuss how one can use multivariate (panel) information to better deal with model instability in a forecasting context.
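As a toy illustration of the estimation-window issue (mine, not the course's): under a mid-sample mean shift, a short rolling window adapts quickly at the cost of noisier estimates, while an expanding window is smoother but slower to adapt.

```python
# Minimal sketch: rolling vs. expanding estimation windows under a mean shift.
import numpy as np

rng = np.random.default_rng(1)
T, break_point, window = 400, 200, 40
mu = np.where(np.arange(T) < break_point, 0.0, 2.0)  # mean shifts at t = 200
y = mu + rng.normal(size=T)

errors_roll, errors_expand = [], []
for t in range(window, T):
    f_roll = y[t - window:t].mean()   # rolling-window forecast of y[t]
    f_expand = y[:t].mean()           # expanding-window forecast of y[t]
    errors_roll.append(y[t] - f_roll)
    errors_expand.append(y[t] - f_expand)

print("rolling RMSE:  ", np.sqrt(np.mean(np.square(errors_roll))))
print("expanding RMSE:", np.sqrt(np.mean(np.square(errors_expand))))
```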
 
A major issue in modern forecasting is the large number of potential predictors. Much work has been undertaken in econometrics, statistics and computer science in recent years. We provide a framework for thinking about methods as well as explain how some of the popular machine learning methods work and their properties. With this in place, we cover a variety of variable selection and model aggregation methods.
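To fix ideas, here is a minimal sketch (illustrative only, with simulated data) of a sparse LASSO forecast with many candidate predictors, only a few of which matter:

```python
# Minimal sketch: LASSO forecasting with many candidate predictors.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(2)
T, K = 200, 100                      # 100 candidate predictors
X = rng.normal(size=(T, K))
beta = np.zeros(K)
beta[:3] = [1.0, -0.5, 0.8]          # only three predictors actually matter
y = X @ beta + rng.normal(size=T)

fit = LassoCV(cv=5).fit(X[:150], y[:150])   # estimate on the first 150 obs
forecast = fit.predict(X[150:])             # forecast the remaining 50

print("selected predictors:", np.flatnonzero(fit.coef_))
print("out-of-sample RMSE: ", np.sqrt(np.mean((y[150:] - forecast) ** 2)))
```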

The final part of the course covers how to choose among competing forecasts and formally compare forecasting performance across two or possibly large numbers of forecasts. Ignoring the search across multiple models for a good forecasting model can introduce data mining biases, and we discuss ways to handle this problem.
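For example, a pairwise comparison in the spirit of Diebold and Mariano (1995) boils down to a t-test on a loss differential. A minimal one-step-ahead, squared-error-loss sketch (illustrative only; longer horizons would call for a HAC variance estimate):

```python
# Minimal sketch: a Diebold-Mariano-type test on a squared-error loss differential.
import numpy as np
from scipy import stats

def dm_test(e1, e2):
    """t-test on d_t = e1_t^2 - e2_t^2 (one-step-ahead, no HAC correction)."""
    d = np.asarray(e1) ** 2 - np.asarray(e2) ** 2
    dm_stat = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))
    p_value = 2 * (1 - stats.norm.cdf(abs(dm_stat)))
    return dm_stat, p_value

rng = np.random.default_rng(3)
e_model_a = rng.normal(scale=1.0, size=200)   # forecast errors from model A
e_model_b = rng.normal(scale=1.2, size=200)   # forecast errors from model B (noisier)
print(dm_test(e_model_a, e_model_b))
```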
 
The course draws on material from the following book (referred to as ET):
G. Elliott and A. Timmermann, 2016, Economic Forecasting. Princeton University Press.

Part I: Foundations and the Binary Forecasting Problem

  1. The Forecasting Problem
    1. Economic loss functions and ‘optimal’ forecasting (ET chapters 2-3)
    2. Classical and Bayesian Forecasts (ET chapters 4-5)
  2. The Binary Forecasting Problem (ET chapter 12)
    1. Loss functions
    2. Point and Density Forecasting
    3. Methods for Classification/Binary Forecasting
Part II: Predictive Modelling Methods and Model Instability
  1. Forecasting under model instability
    1. Detection of breaks in time-series forecasting models (Rossi, 2013, Elliott and Müller, 2006)
    2. Choice of estimation window in the presence of instability (Pesaran and Timmermann, 2007)
    3. Ad-hoc Strategies vs. Parametric Models of the Change Process (ET chapter 19, Pettenuzzo and Timmermann, 2011, 2017)
    4. Exploring Panel Data for Detecting and Forecasting under Breaks (Smith and Timmermann, 2017)
  2. Forecasting with Many Regressors
    1. Sparse vs. Dense Models: PCA, PLS, LASSO and variants
    2. Machine Learning Methods: Trees and neural nets (Gu, Kelly, and Xiu, 2020, Coulombe et al., 2020)
  3. Model Selection and Forecast Combination Methods
    1. Model Selection Methods (ET, chapter 6)
    2. Model Aggregation approaches (Elliott, Gargano, and Timmermann, 2013).
Part III: Evaluating and Comparing Forecasting Performance
  1. Comparing forecasting performance: Horse races and p-hacking
    1. Comparisons of forecast performance (ET chapter 17, Clark and McCracken, 2001, Diebold and Mariano, 1995, Giacomini and White, 2006)
    2. Evaluating and comparing many forecasting models (White, 2000, Sullivan, Timmermann and White, 1999, Romano and Wolf, 2005, Hansen, Lunde, and Nason, 2011)
    3. Data mining and p-hacking (Harvey, Liu, and Zhu, 2016)
    4. Comparing forecasting performance in a single cross-section (Qu, Timmermann and Zhu, 2020)

Administrative information:

The course will take place online, in the evenings for Europe, from 5pm CEST:

  • June 1st lecture (3 hours)
  • June 3rd lecture (3 hours)
  • June 4th practice (1.5 hours)
  • June 7th lecture (3 hours)
  • June 8th practice (3 hours).

Candidates who have a CEPR profile should apply by submitting their CV online at portal.cepr.org/eabcn-training-school-recent-developments-forecasting by 6pm (UK time), 7 May, 2021. If you do not currently have a CEPR profile, please create a new one here and then click on the registration link.
 
PhD students should also send a statement that specifies the ways participating in the school will be useful for their current research (max 300 words).
 
Participants from non-academic institutions where the employer is not a member of the EABCN network are charged a course fee of €1000.
 
About the Instructors:
 
Graham Elliott is a professor of economics at UC San Diego. He works in the field of econometrics, developing statistical methods for economic and other applications. He is a fellow of the Center for Applied Macroeconomic Analysis (CAMA), author of the reference/text "Economic Forecasting" jointly with Allan Timmermann, former co-editor of the International Journal of Forecasting (IJF), and co-editor of Volumes 1 and 2 of the Handbook of Economic Forecasting.
 
Allan Timmermann holds an Atkinson/Epstein Chair in Management Leadership at the Rady School of Management and has also been a professor in UC San Diego's Department of Economics since 1994. He obtained his PhD from the University of Cambridge after initial economics training at the University of Copenhagen. Timmermann is a very productive scholar in finance and applied econometrics. He serves as an associate editor at leading journals in finance, economics, and forecasting, including the Journal of Business and Economic Statistics, Journal of Economic Dynamics and Control, Journal of Financial Econometrics, and Journal of Forecasting. He has published in journals such as the Journal of the American Statistical Association, Review of Economic Studies, Journal of Finance, and Journal of Econometrics.

For more information on EABCN, visit the website.

Monday, April 5, 2021

Local Projections vs. VARs

Interesting paper, showing a clear LP (higher variance) vs. VAR (higher bias) tradeoff.

The authors conclude that "Unless researchers are overwhelmingly concerned with bias, shrinkage via Bayesian VARs or penalized LPs is attractive."  

A key point regarding whether researchers are "overwhelmingly concerned with bias" is that it's not so much about researchers' preferences (innate feelings about bias) as it is about the data -- dynamic environments with large moving-average roots force concern with bias, because that's where low-order VARs are poor approximations, injecting large amounts of bias.

So: Much depends on how important large moving-average roots are in macroeconomic dynamics.  In principle they can be very important (so un-penalized LP may be attractive).  In practice, well, usually it seems not so much (in which case penalized LP may be attractive).
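For intuition, here is a minimal simulation sketch (mine, not the paper's code): in a univariate ARMA(1,1) DGP with a sizable MA root, a fitted AR(1) badly understates the impulse response, while simple local projections on the (here observed) shock stay close to the truth.

```python
# Minimal sketch: LP vs. a misspecified AR(1) for impulse responses
# when the DGP has a sizable moving-average root.
import numpy as np

rng = np.random.default_rng(4)
T, phi, theta, H = 1000, 0.5, 0.6, 6
eps = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t] + theta * eps[t - 1]   # ARMA(1,1)

# Local projections: regress y_{t+h} on eps_t (shock treated as observed for simplicity)
irf_lp = []
for h in range(H + 1):
    Y, X = y[h:], eps[:T - h]
    irf_lp.append(X @ Y / (X @ X))

# Misspecified AR(1): implied IRF is phi_hat^h, which ignores the MA component
X, Y = y[:-1], y[1:]
phi_hat = X @ Y / (X @ X)
irf_ar1 = [phi_hat ** h for h in range(H + 1)]

irf_true = [1.0] + [(phi + theta) * phi ** (h - 1) for h in range(1, H + 1)]
print("true: ", np.round(irf_true, 2))
print("LP:   ", np.round(irf_lp, 2))
print("AR(1):", np.round(irf_ar1, 2))
```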

Local Projections vs. VARs: Lessons From Thousands of DGPs

By: Dake Li, Mikkel Plagborg-Møller, and Christian K. Wolf
Abstract: We conduct a simulation study of Local Projection (LP) and Vector Autoregression (VAR) estimators of structural impulse responses across thousands of data generating processes (DGPs), designed to mimic the properties of the universe of U.S. macroeconomic data. Our analysis considers various structural identification schemes and several variants of LP and VAR estimators, and we pay particular attention to the role of the researcher's loss function. A clear bias-variance trade-off emerges: Because our DGPs are not exactly finite-order VAR models, LPs have lower bias than VAR estimators; however, the variance of LPs is substantially higher than that of VARs at intermediate or long horizons. Unless researchers are overwhelmingly concerned with bias, shrinkage via Bayesian VARs or penalized LPs is attractive.
Date: 2021-04
URL: http://d.repec.org/n?u=RePEc:arx:papers:2104.00655&r=ets

Sunday, March 28, 2021

COVID Modeling Update: Bayesian Analysis

I often shy away from papers by colleagues/coauthors, trying to maintain some semblance of objectivity.  But this one is too cool to let go: "Bayesian Estimation of Epidemiological Models: Methods, Causality, and Policy Trade-Offs," by Arias, Fernández-Villaverde, Rubio-Ramírez, and Shin.  Nonlinear, non-Gaussian state-space models with time-varying parameters are central.  See https://www.sas.upenn.edu/~jesusfv/Bayesian_Epidemiological.pdf.
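For readers who want a feel for the kind of structure involved, here is a toy sketch (mine, emphatically not the paper's model): a nonlinear SIR-type transition with a time-varying transmission rate as a latent state and non-Gaussian (Poisson) case counts as measurements.

```python
# Toy sketch: a nonlinear state-space structure with a time-varying parameter
# (random-walk transmission rate) and non-Gaussian (Poisson) measurements.
import numpy as np

rng = np.random.default_rng(5)
N, T = 1_000_000, 120
S, I, R = N - 100.0, 100.0, 0.0
gamma, beta = 0.1, 0.3

observed_cases = []
for t in range(T):
    beta = max(beta + 0.01 * rng.normal(), 0.0)   # latent state: drifting transmission rate
    new_inf = beta * S * I / N                    # nonlinear transition
    S, I, R = S - new_inf, I + new_inf - gamma * I, R + gamma * I
    observed_cases.append(rng.poisson(new_inf))   # noisy daily case counts

print("peak observed daily cases:", max(observed_cases))
```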

Sunday, March 21, 2021

The Latest in Probability Forecast Evaluation

VERY nice and useful paper in the Proceedings of the National Academy of Sciences by Tilmann Gneiting et al.: "Stable reliability diagrams for probabilistic classifiers", https://doi.org/10.1073/pnas.2016191118.  Supplement: https://www.pnas.org/content/pnas/suppl/2021/02/17/2016191118.DCSupplemental/pnas.2016191118.sapp.pdf.

(Significantly revised version of "Evaluating probabilistic classifiers: Reliability diagrams and score decompositions revisited", https://arxiv.org/pdf/2008.03033.pdf.)
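The key idea, as I understand it, is to replace ad hoc binning with isotonic (PAV) regression of outcomes on forecast probabilities. A minimal sketch of that idea (my own illustration, not the authors' code):

```python
# Minimal sketch: isotonic (PAV) recalibration behind a reliability diagram.
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(6)
n = 2000
p = rng.uniform(size=n)                  # forecast probabilities
y = rng.binomial(1, p ** 1.5)            # outcomes from a miscalibrated DGP

iso = IsotonicRegression(y_min=0.0, y_max=1.0, out_of_bounds="clip")
p_recal = iso.fit_transform(p, y)        # PAV fit of outcomes on forecast probabilities

# Plotting p against p_recal gives the reliability diagram;
# deviations from the 45-degree line indicate miscalibration.
order = np.argsort(p)
print(np.column_stack([p[order], p_recal[order]])[::400].round(2))
```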


Thursday, March 18, 2021

Machine Learning Panel Data

This looks very cool.  Great presenter and great discussant.  March 22.

SoFiE Seminar with Eric Ghysels and Max Farrell

Presenter: Eric Ghysels (UNC Chapel Hill)

Paper: "Machine Learning Panel Data Regressions with an Application to Nowcasting Price Earnings Ratios"

Discussant: Max Farrell (University of Chicago)

Date: March 22, 2021

Time: 11am New York / 8am San Diego / 3pm London / 4pm Paris / 11pm Beijing

Zoom Link: https://nyu.zoom.us/j/91944854496

Recording: A link to a video recording will be available soon after the event.

Sunday, March 14, 2021

The Finance Crowd Analysis Project

I have long been interested in crowdsourcing, from a forecast combination perspective. Fincap, described below, is related but different.  I look forward to seeing and pondering the fincap results.

The following material is adapted from the Fincap project site.  For details, including a really slick 2-minute video intro, see https://fincap.academy/index.html#schedule.

#fincap is the first crowd-sourced empirical paper in Economics/Finance.

More than 100 research teams (RTs) from around the world will test the same set of hypotheses on the same data. They will work independently and write a short academic paper based on their findings.

These reports will be evaluated by more than 30 distinguished academics whom we refer to as peer evaluators (PEs). Their feedback will be passed on to the RTs so that they can revise their papers. 

The project coordinators will study the #fincap results to learn about the scientific process. They have committed ex ante to a meta-science analysis, which was frozen before any instructions or data were given to the RTs and PEs.

Sunday, March 7, 2021

Network Cluster-Robust Inference

Interesting progression of HAC / cluster-robust inference, from serial correlation, to spatial correlation (e.g., Müller and Watson, 2021, here), and now network-induced correlation. 

Very cool new network result from Michael Leung at USC: Asymptotic independence (in the number of linking steps), the key to robust/clustered inference in network environments, holds iff network clusters have conductance (the ratio of edge boundary size to volume) approaching 0. (Yes, necessary and sufficient!)
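For concreteness, a cluster's conductance is easy to compute. A minimal sketch using networkx (my own illustration; the karate-club graph and the node set are just stand-ins):

```python
# Minimal sketch: conductance of a candidate cluster = edge boundary size / volume.
import networkx as nx

G = nx.karate_club_graph()          # stand-in network
cluster = {0, 1, 2, 3, 7, 13}       # hypothetical cluster of nodes

boundary = nx.cut_size(G, cluster)  # number of edges leaving the cluster
volume = nx.volume(G, cluster)      # sum of degrees of nodes in the cluster
print("conductance:", boundary / volume)
```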