Friday, November 16, 2018

Nearest-Neighbor Prediction

The beautiful idea has been around for ages: find the N closest H-histories to the current H-history (you choose/tune N and H), see what followed each of them, take an average, and use that as your forecast. Of course there are many variations and extensions. Interesting new work by Dendramis, Kapetanios, and Marcellino is in exactly that tradition, except that Dendramis et al. don't show much awareness of the tradition, or attempt to stand on its shoulders, which I find odd. I find myself hungry for tighter connections, for example to my favorite old nearest-neighbor prediction piece, Sid Yakowitz's well-known "Nearest-Neighbor Methods for Time Series Analysis," Journal of Time Series Analysis, 1987.
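For concreteness, here is a minimal sketch of the basic recipe in Python. The function name, the Euclidean distance metric, and the simulated AR(1) in the usage example are illustrative choices of mine, not anything taken from the papers mentioned above.

import numpy as np

def nn_forecast(y, H=4, k=5):
    """One-step-ahead forecast: average of what followed the k H-histories
    closest (in Euclidean distance) to the current H-history."""
    y = np.asarray(y, dtype=float)
    current = y[-H:]                                   # the current H-history
    # every past H-history that has an observed successor
    histories = np.array([y[t - H:t] for t in range(H, len(y))])
    successors = y[H:]                                 # y[t] follows y[t-H:t]
    dists = np.linalg.norm(histories - current, axis=1)
    nearest = np.argsort(dists)[:k]                    # indices of the k closest histories
    return successors[nearest].mean()                  # average of what followed them

# usage: forecast the next value of a simulated AR(1)
rng = np.random.default_rng(0)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * y[t - 1] + rng.normal(scale=0.5)
print(nn_forecast(y, H=4, k=10))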

Thursday, November 15, 2018

JFEC Special Issue for Peter Christoffersen

No, I have not gone into seclusion. Well actually I have, but not intentionally and certainly not for lack of interest in the blog. Just the usual crazy time of year, only worse this year for some reason. Anyway I'll be back very soon, with lots to say! But here's something important and timely, so it can't wait:

Journal of Financial Econometrics

Call for Papers

Special Issue in Honor of Peter Christoffersen

The Journal of Financial Econometrics is organizing a special issue in memory of Professor Peter Christoffersen, our friend and colleague, who passed away in June 2018. Peter held the TMX Chair in Capital Markets and a Bank of Canada Fellowship and had been a widely respected member of the Rotman School at the University of Toronto since 2010. Before that, Peter was a valued member of the Desautels Faculty of Management at McGill University. In addition to his transformative work in econometrics and volatility modeling, Peter's work in recent years focused on financial risk and financial innovation.

We invite paper submissions on topics related to Peter’s contributions to Finance and Econometrics. We are particularly interested in papers related to the following topics:

1)   The use of option-implied information for forecasting; rare disasters and portfolio management; factor structures in derivatives and futures markets.

2)   Volatility, correlation, extreme events, systemic risk, and Value-at-Risk modeling for financial market risk management.

3)   The econometrics of digital assets; big data and machine learning.

To submit a paper, authors should log in to the Journal of Financial Econometrics online submission system and follow the submission instructions as per journal policy. The due date for submissions is June 30, 2019. It is important to specify in the cover letter that the paper is submitted to the special issue in honor of Peter Christoffersen; otherwise your paper will not be assigned to the guest editors.

Guest Editors

•    Francis X. Diebold, University of Pennsylvania

•    René Garcia, Université de Montréal and Toulouse School of Economics

•    Kris Jacobs, University of Houston

Monday, October 29, 2018

Becker Friedman Expectations Conference

I just returned from a great BFI Conference at U Chicago, Developing and Using Business Expectations Data, organized by Nick Bloom and Steve Davis.

Wonderfully, density (as opposed to point) survey forecasts were featured throughout. There was the latest on central bank surveys (e.g., Binder et al.), but most informative (to me) was the emphasis on surveys that I'm less familiar with, typically soliciting density expectations from hundreds or thousands of C-suite types at major firms. Examples include Germany's important IFO survey (e.g., Bachmann et al.), the U.S. Census Management and Organizational Practices Survey (e.g., Bloom et al.), and fascinating work in progress at FRB Atlanta.

The Census survey is especially interesting due to its innovative structuring of histogram bins. There are no fixed bins. Instead, respondents give five bins of their own choosing, along with five corresponding probabilities (which sum to 1). This solves the problem in fixed-bin surveys of (lazy? behaviorally-biased?) respondents routinely and repeatedly assigning zero probability to subsequently-realized events.
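To make concrete what one can do with respondent-chosen bins, here is a minimal sketch that turns one respondent's five bins and probabilities into an implied subjective mean and variance. The midpoint approximation and the example numbers are my own illustrative assumptions, not the Census survey's actual processing.

import numpy as np

def subjective_moments(bins, probs):
    """Implied mean and variance, approximating each bin by its midpoint."""
    probs = np.asarray(probs, dtype=float)
    assert np.isclose(probs.sum(), 1.0), "probabilities must sum to 1"
    mids = np.array([(lo + hi) / 2.0 for lo, hi in bins])   # bin midpoints
    mean = float(mids @ probs)
    var = float(((mids - mean) ** 2) @ probs)
    return mean, var

# usage: one hypothetical respondent's sales-growth scenarios (percent)
bins  = [(-10, -5), (-5, 0), (0, 5), (5, 10), (10, 20)]
probs = [0.05, 0.15, 0.40, 0.30, 0.10]
print(subjective_moments(bins, probs))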

Sunday, October 28, 2018

Expansions Don't Die of Old Age

As the expansion ages, there's progressively more discussion of whether its advanced age makes it more likely to end. The answer is no. More formally, postwar U.S. expansion hazards are basically flat, in contrast to contraction hazards, which are sharply increasing. Of course the present expansion will eventually end, and it may even end soon, but its age is unrelated to its probability of ending.

All of this is very clear in Diebold, Rudebusch and Sichel (1992). See Figure 6.2 on p. 271. (Sorry for the poor photocopy quality.) The flat expansion hazard result has held up well (e.g., Rudebusch (2016)), and moreover it would only be strengthened by the current long expansion.
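For readers who want to see what a "flat hazard" means operationally, here is a minimal sketch of a discrete-time empirical hazard computed from a sample of completed expansion durations. The durations in the usage line are made up purely for illustration -- they are not the NBER chronology -- and this is not the parametric estimator used in the 1992 paper.

import numpy as np

def empirical_hazard(durations):
    """hazard[m] = P(expansion ends in month m | it survived to month m)."""
    d = np.asarray(durations)
    hazard = {}
    for m in range(1, int(d.max()) + 1):
        at_risk = int((d >= m).sum())     # expansions that reached month m
        ended = int((d == m).sum())       # of those, how many ended in month m
        if at_risk > 0:
            hazard[m] = ended / at_risk
    return hazard

# usage with made-up completed durations (months); a flat hazard means these
# conditional ending probabilities show no systematic rise with age
durations = [12, 24, 36, 45, 58, 73, 92, 106, 120]
print({m: round(h, 2) for m, h in empirical_hazard(durations).items() if h > 0})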

[I blogged on flat expansion hazards before, but the message bears repeating as the expansion continues to age.]

Thursday, October 4, 2018

In Memoriam Herman Stekler

I am sad to report that Herman Stekler passed away last month. I didn't know until now. He was a very early and important and colorful -- indeed unique -- personage in the forecasting community, making especially noteworthy contributions to forecast evaluation.  
https://forecasters.org/herman-stekler_oracle-oct-2018/

Tuesday, October 2, 2018

Tyranny of the Top 5 Econ Journals

Check out:

PUBLISHING AND PROMOTION IN ECONOMICS: THE TYRANNY OF THE TOP FIVE 
by
James J. Heckman and Sidharth Moktan 
NBER Working Paper 25093
http://www.nber.org/papers/w25093

Heckman and Moktan examine a range of data from a variety of perspectives, analyze them thoroughly, and pull no punches in describing their striking results.

It's a great paper. There's a lot I could add, maybe in a future post, but my blood pressure is already high enough for today. So I'll just leave you with a few choice quotes from the paper ["T5" means "top-5 economics journals"]:

"The results ... support the hypothesis that the T5 influence operates through channels that are independent of article quality."

"Reliance on the T5 to screen talent incentivizes careerism over creativity."

"Economists at highly ranked departments with established reputations are increasingly not publishing in T5 or field journals and more often post papers online in influential working paper series, which are highly cited, but not counted as T5s."

"Many non-T5 articles are better cited than many articles in T5 journals. ...  Indeed, many of the most important papers published in the past 50 years have been too innovative to survive the T5 gauntlet."

"The [list of] most cited non-T5 papers reads like an honor roll of economic analysis."

"The T5 ignores publication of books. Becker’s Human Capital
(1964) has more than 4 times the number of citations of any paper listed on RePEc. The exclusion of books from citation warps incentives against broad and integrated research and towards writing bite-sized fragments of ideas."

Saturday, September 29, 2018

RCT's vs. RDD's

Art Owen and Hal Varian have an eye-opening new paper, "Optimizing the Tie-Breaker Regression Discontinuity Design".

Randomized controlled trials (RCT's) are clearly the gold standard in terms of statistical efficiency for teasing out causal effects. Assume that you really can do an RCT. Why then would you ever want to do anything else?

Answer: There may be important considerations beyond statistical efficiency. Take the famous "scholarship example". (You want to know whether receipt of an academic scholarship causes enhanced academic performance among strong scholarship test performers.) In an RCT approach you're going to give lots of academic scholarships to lots of randomly-selected people, many of whom are not strong performers. That's wasteful. In a regression discontinuity design (RDD) approach ("give scholarships only to strong performers who score above X in the scholarship exam, and compare the performances of students who scored just above and below X"), you don't give any scholarships to weak performers. So it's not wasteful -- but the resulting inference is statistically inefficient. 

"Tie breakers" implement a middle ground: Definitely don't give scholarships to bottom performers, definitely do give scholarships to top performers, and randomize for a middle group. So you gain some efficiency relative to pure RDD (but you're a little wasteful), and you're less wasteful than a pure RCT (but you lose some efficiency).

Hence there's a trade-off, and your location on it depends on the size of your middle group. Owen and Varian characterize the trade-off and show how to optimize the size of the middle group. Really nice, clean, and useful.
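Here is a minimal sketch of the tie-breaker assignment rule just described. The cutoffs lo and hi (that is, the width of the randomized middle band) are precisely the design choice whose size Owen and Varian optimize; the names and numbers below are purely illustrative.

import numpy as np

def tie_breaker_assign(scores, lo, hi, rng=None):
    """1 = scholarship, 0 = none: always treat above hi, never below lo, randomize in between."""
    rng = rng or np.random.default_rng()
    scores = np.asarray(scores, dtype=float)
    treat = (scores >= hi).astype(int)            # top performers: always treated
    middle = (scores >= lo) & (scores < hi)       # middle band: coin flip
    treat[middle] = rng.integers(0, 2, size=int(middle.sum()))
    return treat                                  # scores below lo stay untreated (pure RDD there)

# usage: lo == hi collapses to a sharp RDD; widening [lo, hi] to the whole
# score range recovers a full RCT
rng = np.random.default_rng(1)
scores = rng.normal(70, 10, size=1000)
print(tie_breaker_assign(scores, lo=65, hi=75, rng=rng).mean())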

[Sorry but I'm running way behind. I saw Hal present this work a few months ago at a fine ECB meeting on predictive modeling.]

Sunday, September 23, 2018

NBER WP's Hit 25,000



A few weeks ago the NBER released WP25000. What a great NBER service -- there have been 7.6 million downloads of NBER WP's in the last year alone.

This milestone is of both current and historical interest. The history is especially interesting. As Jim Poterba notes in a recent communication:
This morning's "New this Week" email included the release of the 25000th NBER working paper, a study of the intergenerational transmission of human capital by David Card, Ciprian Domnisoru, and Lowell Taylor.  The NBER working paper series was launched in 1973, at the inspiration of Robert Michael, who sought a way for NBER-affiliated researchers to share their findings and obtain feedback prior to publication.  The first working paper was "Education, Information, and Efficiency" by Finis Welch.  The design for the working papers -- which many will recall appeared with yellow covers in the pre-digital age -- was created by H. Irving Forman, the NBER's long-serving chart-maker and graphic artist.
Initially there were only a few dozen working papers per year, but as the number of NBER-affiliated researchers grew, particularly after Martin Feldstein became NBER president in 1977, the NBER working paper series also expanded.  In recent years, there have been about 1150 papers per year.  Over the 45 year history of the working paper series, the Economic Fluctuations and Growth Program has accounted for nearly twenty percent (4916) of the papers, closely followed by Labor Studies (4891) and Public Economics (4877).

Wednesday, September 19, 2018

Wonderful Network Connectedness Piece

Very cool NYT graphics summarizing U.S. Facebook network connectedness. Check it out:
https://www.nytimes.com/interactive/2018/09/19/upshot/facebook-county-friendships.html


They get the same result that Kamil Yilmaz and I have gotten for years in our analyses of economic and financial network connectedness: There is a strong "gravity effect" -- that is, even in the electronic age, physical proximity is the key ingredient to network relationships. See for example:

Maybe not as surprising for Facebook friends as for financial institutions (say). But still...
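For concreteness, a minimal sketch of one way to check for a "gravity effect": regress log pairwise connectedness on log physical distance and look for a clearly negative slope. The variable names and made-up numbers below are mine, not from the NYT piece or from our papers.

import numpy as np

def gravity_slope(connectedness, distance):
    """OLS slope of log connectedness on log distance; strongly negative => gravity effect."""
    x = np.log(np.asarray(distance, dtype=float))
    y = np.log(np.asarray(connectedness, dtype=float))
    X = np.column_stack([np.ones_like(x), x])      # intercept and log distance
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

# usage with made-up pairwise data: connectedness falls with distance
distance      = [50, 100, 200, 400, 800, 1600]
connectedness = [0.30, 0.18, 0.10, 0.05, 0.03, 0.015]
print(gravity_slope(connectedness, distance))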

Sunday, September 16, 2018

Banque de France’s Open Data Room

See below for the announcement of a useful new product from the Banque de France and its Representative Office in New York.
Banque de France has for many years been at the forefront of disseminating statistical data to academics and other interested parties. Through Banque de France's dedicated public portal http://webstat.banque-france.fr/en/, we offer a large set of freely downloadable series (about 40,000 mainly aggregated series).

Banque de France has further expanded this service, launching an "Open Data Room" in Paris in November 2016 that provides researchers with free access to granular data. We are glad to announce that the "Open Data Room" service is now also available to US researchers through the Banque de France Representative Office in New York City.