Robert F. Engle
Nobel Prize for Economics, 2003
NAS
Michael Armellino Professor of Finance, New York University
Sunday, November 15, 2015
9:00am - 10:15am
"Dynamic Conditional Beta and Global Financial Instability"
Abstract: The risk that a financial institution will be undercapitalized in a financial crisis is modeled using a time-varying stock market beta and balance sheet data. The econometric theory of the dynamic conditional beta is explained and applied. The results are updated on VLAB to give a current view of global financial stability.
Robert F. Engle, American economist, corecipient of the Nobel Prize for Economics in 2003 for his development of methods for analyzing time series data with time-varying volatility. He shared the award with Clive W.J. Granger. Engle received an M.S. (1966) and Ph.D. (1969) from Cornell University. He taught at the Massachusetts Institute of Technology (1969–75) before joining the University of California at San Diego (UCSD), where he became a professor of economics in 1977 and served as chair of the department of economics from 1990 to 1994. In 1999 he began teaching at the Stern School of Business at New York University, where he was Michael Armellino Professor of Finance. He retired from UCSD as professor emeritus and research professor in 2003. Engle also held associate editorships on several academic journals, notably the Journal of Applied Econometrics, of which he was coeditor from 1985 to 1989.

Engle conducted much of his prizewinning work in the 1970s and ’80s, when he developed improved mathematical techniques for the evaluation and more accurate forecasting of risk, which enabled researchers to test whether and how volatility in one period was related to volatility in another period. This work had particular relevance in financial market analysis, in which the investment returns of an asset are assessed against its risks and in which stock prices and returns can exhibit extreme volatility. Periods of strong turbulence cause large fluctuations in stock market prices, and these are often followed by relative calm and slight fluctuations. Inherent in Engle’s autoregressive conditional heteroskedasticity (ARCH) model was the concept that, while most volatility is embedded in random error, the variance of that error depends on previously realized random errors, with large errors being followed by large errors and small by small. This contrasted with earlier models, in which the variance of the random error was assumed to be constant over time.
Engle’s methods and the ARCH model led to a proliferation of tools for analyzing stocks and enabled economists to make more accurate forecasts.
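The volatility clustering described above can be made concrete with a small simulation. The sketch below implements a textbook ARCH(1) process, in which the conditional variance of each period's shock depends on the previous period's squared shock; the parameter values are illustrative, not drawn from Engle's work.

```python
import math
import random

def simulate_arch1(omega=0.2, alpha=0.7, n=1000, seed=42):
    """Simulate an ARCH(1) process:
        eps_t = sigma_t * z_t,  z_t ~ N(0, 1)
        sigma_t^2 = omega + alpha * eps_{t-1}^2
    Parameter values here are illustrative only."""
    rng = random.Random(seed)
    eps_prev = 0.0
    returns, variances = [], []
    for _ in range(n):
        # Conditional variance depends on the previous realized shock:
        # a large shock yesterday raises today's expected volatility.
        var_t = omega + alpha * eps_prev ** 2
        eps_t = math.sqrt(var_t) * rng.gauss(0.0, 1.0)
        returns.append(eps_t)
        variances.append(var_t)
        eps_prev = eps_t
    return returns, variances

returns, variances = simulate_arch1()
```

Plotting `returns` from such a simulation shows the hallmark ARCH pattern: bursts of large fluctuations clustered together, separated by stretches of relative calm, even though every shock is drawn from a normal distribution.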
Michael I. Jordan
NAS NAE AAAS
Pehong Chen Distinguished Professor, University of California, Berkeley
Sunday, November 15, 2015
2:00pm - 3:15pm
"On Computational Thinking, Inferential Thinking and 'Big Data'"
Abstract: The rapid growth in the size and scope of datasets in science and technology has created a need for novel foundational perspectives on data analysis that blend the inferential and computational sciences. That classical perspectives from these fields are not adequate to address emerging problems in "Big Data" is apparent from their sharply divergent nature at an elementary level: in computer science, the growth of the number of data points is a source of "complexity" that must be tamed via algorithms or hardware, whereas in statistics, the growth of the number of data points is a source of "simplicity" in that inferences are generally stronger and asymptotic results can be invoked. On a formal level, the gap is made evident by the lack of a role for computational concepts such as "runtime" in core statistical theory and the lack of a role for statistical concepts such as "risk" in core computational theory. I discuss recent progress at the computation/statistics interface, including fundamental tradeoffs between inferential quality, communication, runtime and privacy constraints, and mechanisms for implementing these tradeoffs, such as algorithmic weakening, subsampling and concurrency control.
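The runtime/inference tradeoff that subsampling implements can be illustrated in a few lines. The sketch below is a generic example, not code from the talk: estimating a mean from a uniform subsample cuts runtime in proportion to the subsample size, while the standard error of the estimate grows roughly as one over the square root of the subsample size.

```python
import math
import random
import statistics

def mean_with_subsample(data, fraction, seed=0):
    """Estimate the mean of `data` from a uniform random subsample.
    Runtime scales with the subsample size m; the standard error of
    the estimate scales roughly as 1/sqrt(m)."""
    rng = random.Random(seed)
    m = max(2, int(len(data) * fraction))
    subsample = rng.sample(data, m)
    estimate = statistics.fmean(subsample)
    stderr = statistics.stdev(subsample) / math.sqrt(m)
    return estimate, stderr

# Synthetic data with a known mean of 5.0 (illustrative only).
rng = random.Random(1)
data = [rng.gauss(5.0, 2.0) for _ in range(100_000)]

full_est, full_se = mean_with_subsample(data, 1.0)
sub_est, sub_se = mean_with_subsample(data, 0.01)
# The 1% subsample is ~100x cheaper to process, but its standard
# error is ~10x larger: weaker inference bought with less runtime.
```

This is the elementary version of the tradeoff; the talk's point is that such exchanges between runtime, communication, privacy, and inferential quality can be characterized formally rather than ad hoc.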
Michael I. Jordan is the Pehong Chen Distinguished Professor in the Department of Electrical Engineering and Computer Science and the Department of Statistics at the University of California, Berkeley. His research interests bridge the computational, statistical, cognitive and biological sciences, and have focused in recent years on Bayesian nonparametric analysis, probabilistic graphical models, spectral methods, kernel machines and applications to problems in distributed computing systems, natural language processing, signal processing and statistical genetics. Prof. Jordan is a member of the National Academy of Sciences, a member of the National Academy of Engineering and a member of the American Academy of Arts and Sciences. He is a Fellow of the American Association for the Advancement of Science. He has been named a Neyman Lecturer and a Medallion Lecturer by the Institute of Mathematical Statistics, and has received the ACM/AAAI Allen Newell Award. He is a Fellow of the AAAI, ACM, ASA, CSS, IMS, IEEE and SIAM.
Data Scientist, Facebook, Inc.
Tuesday, November 17, 2015
9:00am - 10:15am
"Information in Social Networks"
Abstract: This talk will describe several studies characterizing how information propagates in social networks, ranging from how the information that reaches us depends on the social ties we have, to the diversity of that information, to the likelihood that the information seen will be reshared. Furthermore, as many people reshare the same information, large cascades can form. The early spread of the cascade can be used to predict long-term features of the cascade such as size and shape. Finally, the information itself may change in the course of propagation, revealing the evolutionary structure of memes.