
Applied Time Series Analysis: A Practical Guide to Modeling and Forecasting
Ebook: 687 pages, 7 hours


About this ebook

Written for those who need an introduction, Applied Time Series Analysis reviews applications of the popular econometric analysis technique across disciplines. Carefully balancing accessibility with rigor, it spans economics, finance, economic history, climatology, meteorology, and public health. Terence Mills provides a practical, step-by-step approach that emphasizes core theories and results without becoming bogged down by excessive technical details. Including univariate and multivariate techniques, Applied Time Series Analysis provides data sets and program files that support a broad range of multidisciplinary applications, distinguishing this book from others.

  • Focuses on practical application of time series analysis, using step-by-step techniques and without excessive technical detail
  • Supported by copious disciplinary examples, helping readers quickly adapt time series analysis to their area of study
  • Covers both univariate and multivariate techniques in one volume
  • Provides expert tips on, and helps mitigate common pitfalls of, powerful statistical software including EVIEWS and R
  • Written in jargon-free, clear English by a master educator with 30+ years' experience explaining time series to novices
  • Accompanied by a microsite with disciplinary data sets and files explaining how to build the calculations used in examples
Language: English
Release date: January 22, 2019
ISBN: 9780128131183
Author

Terence C. Mills

Terence Mills is Professor of Applied Statistics and Econometrics at Loughborough University and has well over 200 publications, beginning in 1977 with a paper in the European Economic Review. He has since published in most of the international economic, economic history, econometrics, finance and statistics journals and in a range of other journals, including Journal of Climate, Climatic Change, Journal of Cosmology, International Journal of Body Composition Research, Physica A, Energy and Buildings, and Journal of Public Health. He has also written or edited almost 20 books, including a range of introductory statistics and econometric texts, handbooks on econometrics, and histories of time series analysis.



    Applied Time Series Analysis

    A Practical Guide to Modeling and Forecasting

    Terence C. Mills

    Loughborough University, Loughborough, United Kingdom

    Table of Contents

    Cover image

    Title page

    Copyright

    Introduction

    Endnotes

    Chapter 1. Time Series and Their Features

    Abstract

    Autocorrelation and Periodic Movements

    Seasonality

    Stationarity and Nonstationarity

    Trends

    Volatility

    Common Features

    Time Series Having Natural Constraints

    Endnotes

    Chapter 2. Transforming Time Series

    Abstract

    Distributional Transformations

    Stationarity Inducing Transformations

    Decomposing a Time Series and Smoothing Transformations

    Endnotes

    Chapter 3. ARMA Models for Stationary Time Series

    Abstract

    Stochastic Processes and Stationarity

    Wold’s Decomposition and Autocorrelation

    First-Order Autoregressive Processes

    First-Order Moving Average Processes

    General AR and MA Processes

    Autoregressive-Moving Average Models

    ARMA Model Building and Estimation

    Endnotes

    Chapter 4. ARIMA Models for Nonstationary Time Series

    Abstract

    Nonstationarity

    ARIMA Processes

    ARIMA Modeling

    Endnotes

    Chapter 5. Unit Roots, Difference and Trend Stationarity, and Fractional Differencing

    Abstract

    Determining the Order of Integration of a Time Series

    Testing for a Unit Root

    Trend Versus Difference Stationarity

    Testing for More Than One Unit Root

    Other Approaches to Testing for a Unit Root

    Estimating Trends Robustly

    Fractional Differencing and Long Memory

    Testing for Fractional Differencing

    Estimating the Fractional Differencing Parameter

    Endnotes

    Chapter 6. Breaking and Nonlinear Trends

    Abstract

    Breaking Trend Models

    Breaking Trends and Unit Root Tests

    Unit Roots Tests When the Break Date Is Unknown

    Robust Tests for a Breaking Trend

    Confidence Intervals for the Break Date and Multiple Breaks

    Nonlinear Trends

    Endnotes

    Chapter 7. An Introduction to Forecasting With Univariate Models

    Abstract

    Forecasting With Autoregressive-Integrated-Moving Average (ARIMA) Models

    Forecasting a Trend Stationary Process

    Endnotes

    Chapter 8. Unobserved Component Models, Signal Extraction, and Filters

    Abstract

    Unobserved Component Models

    Signal Extraction

    Filters

    Endnotes

    Chapter 9. Seasonality and Exponential Smoothing

    Abstract

    Seasonal Patterns in Time Series

    Modeling Deterministic Seasonality

    Modeling Stochastic Seasonality

    Mixed Seasonal Models

    Seasonal Adjustment

    Exponential Smoothing

    Endnotes

    Chapter 10. Volatility and Generalized Autoregressive Conditional Heteroskedastic Processes

    Abstract

    Volatility

    Autoregressive Conditional Heteroskedastic Processes

    Testing for the Presence of ARCH Errors

    Forecasting From an ARMA-GARCH Model

    Endnotes

    Chapter 11. Nonlinear Stochastic Processes

    Abstract

    Martingales, Random Walks, and Nonlinearity

    Nonlinear Stochastic Models

    Bilinear Models

    Threshold and Smooth Transition Autoregressions

    Markov-Switching Models

    Neural Networks

    Nonlinear Dynamics and Chaos

    Testing for Nonlinearity

    Forecasting With Nonlinear Models

    Endnotes

    Chapter 12. Transfer Functions and Autoregressive Distributed Lag Modeling

    Abstract

    Transfer Function-Noise Models

    Autoregressive Distributed Lag Models

    Endnotes

    Chapter 13. Vector Autoregressions and Granger Causality

    Abstract

    Multivariate Dynamic Regression Models

    Vector Autoregressions

    Granger Causality

    Determining the Lag Order of a Vector Autoregression

    Variance Decompositions and Innovation Accounting

    Structural Vector Autoregressions

    Endnotes

    Chapter 14. Error Correction, Spurious Regressions, and Cointegration

    Abstract

    The Error Correction Form of an Autoregressive Distributed Lag Model

    Spurious Regressions

    Error Correction and Cointegration

    Testing for Cointegration

    Estimating Cointegrating Regressions

    Endnotes

    Chapter 15. Vector Autoregressions With Integrated Variables, Vector Error Correction Models, and Common Trends

    Abstract

    Vector Autoregressions With Integrated Variables

    Vector Autoregressions With Cointegrated Variables

    Estimation of Vector Error Correction Models and Tests of Cointegrating Rank

    Identification of Vector Error Correction Models

    Structural Vector Error Correction Models

    Causality Testing in Vector Error Correction Models

    Impulse Response Asymptotics in Nonstationary VARs

    Vector Error Correction Model-X Models

    Common Trends and Cycles

    Endnotes

    Chapter 16. Compositional and Count Time Series

    Abstract

    Constrained Time Series

    Modeling Compositional Data

    Forecasting Compositional Time Series

    Time Series Models for Counts: The IN-AR(1) Benchmark Model

    Other Integer-Valued ARMA Processes

    Estimation of Integer-Valued ARMA Models

    Testing for Serial Dependence in Count Time Series

    Forecasting Counts

    Intermittent and Nonnegative Time Series

    Endnotes

    Chapter 17. State Space Models

    Abstract

    Formulating State Space Models

    The Kalman Filter

    ML Estimation and the Prediction Error Decomposition

    Prediction and Smoothing

    Multivariate State Space Models

    Endnotes

    Chapter 18. Some Concluding Remarks

    Abstract

    Endnotes

    References

    Index

    Copyright

    Academic Press is an imprint of Elsevier

    125 London Wall, London EC2Y 5AS, United Kingdom

    525 B Street, Suite 1650, San Diego, CA 92101, United States

    50 Hampshire Street, 5th Floor, Cambridge, MA 02139, United States

    The Boulevard, Langford Lane, Kidlington, Oxford OX5 1GB, United Kingdom

    Copyright © 2019 Elsevier Inc. All rights reserved.

    No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or any information storage and retrieval system, without permission in writing from the publisher. Details on how to seek permission, further information about the Publisher’s permissions policies and our arrangements with organizations such as the Copyright Clearance Center and the Copyright Licensing Agency, can be found at our website: www.elsevier.com/permissions.

    This book and the individual contributions contained in it are protected under copyright by the Publisher (other than as may be noted herein).

    Notices

    Knowledge and best practice in this field are constantly changing. As new research and experience broaden our understanding, changes in research methods, professional practices, or medical treatment may become necessary.

    Practitioners and researchers must always rely on their own experience and knowledge in evaluating and using any information, methods, compounds, or experiments described herein. In using such information or methods they should be mindful of their own safety and the safety of others, including parties for whom they have a professional responsibility.

    To the fullest extent of the law, neither the Publisher nor the authors, contributors, or editors, assume any liability for any injury and/or damage to persons or property as a matter of products liability, negligence or otherwise, or from any use or operation of any methods, products, instructions, or ideas contained in the material herein.

    British Library Cataloguing-in-Publication Data

    A catalogue record for this book is available from the British Library

    Library of Congress Cataloging-in-Publication Data

    A catalog record for this book is available from the Library of Congress

    ISBN: 978-0-12-813117-6

    For Information on all Academic Press publications visit our website at https://www.elsevier.com/books-and-journals

    Publisher: Candice Janco

    Acquisition Editor: J. Scott Bentley

    Editorial Project Manager: Susan Ikeda

    Production Project Manager: Vijayaraj Purushothaman

    Cover Designer: Christian J. Bilbow

    Typeset by MPS Limited, Chennai, India

    Introduction

    0.1 Data taking the form of time series, where observations appear sequentially, usually with a fixed time interval between their appearance (every day, week, month, etc.), are ubiquitous. Many such series are followed avidly: for example, the Dow Jones Industrial stock market index opened 2017 with a value of 20,504 and closed the year on 24,719; a rise of 20.6%, one of the largest annual percentage increases on record. By January 26, 2018, the index had reached an intraday high of 26,617 before declining quickly to close on February 8 at 23,860; a value approximately equal to that of the index at the end of November 2017 and representing a fall of 10.4% from its peak. Five days later, it closed on February 13 at 24,640; 3.3% above this local minimum. By the end of May 2018, the index stood at 24,415; little changed over the ensuing 3 months.

    This volatility, which was the subject of great media and, of course, stock market attention, was surpassed by the behavior of the price of the crypto-currency bitcoin during a similar period. Bitcoin was priced at $995 at the start of 2017 and at $13,850 at the end of the year; an astonishing increase of almost 1300%. Yet on December 17, just a fortnight earlier, it had reached an even higher price of $19,871; an almost 1900% increase from the start of the year. The decline from this high point continued into the new year, the price falling to $5968 on February 6 (a 70% decline from the peak price less than 2 months earlier), before rebounding to close at $8545 on February 13. Since then the price has increased to $11,504 on March 4 before falling back to $6635 on April 6. At the end of May 2018, the price stood at $7393.
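All the percentage changes quoted in these two paragraphs come from the same calculation, 100 × (new − old)/old. A quick check in Python (values as quoted above):

```python
def pct_change(old, new):
    """Percentage change, in percent, from old to new."""
    return 100 * (new - old) / old

# Dow Jones over 2017: opened at 20,504, closed at 24,719
dow_2017 = pct_change(20504, 24719)   # about +20.6%

# Bitcoin from $995 at the start of 2017 to the $19,871 peak on December 17
btc_peak = pct_change(995, 19871)     # about +1897%, "an almost 1900% increase"

# The fall from that peak to $5968 on February 6, 2018
btc_fall = pct_change(19871, 5968)    # about -70%

print(round(dow_2017, 1), round(btc_peak), round(btc_fall))
```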

    0.2 While financial time series observed at high frequency often display such wildly fluctuating behavior, there are many other time series, often from the physical world, which display interesting movements over longer periods. Fig. I.1 shows the decadal averages of global temperatures from the 1850s onward. The behavior of temperature time series, whether global or regional, has become the subject of great interest and, in some quarters, great concern over the past few decades. Fig. I.1 shows why. Global temperatures were relatively constant from the 1850s to the 1910s before increasing over the next three decades. There was then a second hiatus between the 1940s and 1970s before temperatures began to increase rapidly from the 1980s onward. Such behavior, in which trends are interspersed with periods of relative stability, is a feature of many time series.

    Figure I.1 Average decadal global temperatures, 1850–2010.

    0.3 How to model time series statistically is the concern of this book, in which, as its title suggests, I emphasize the practical, applied aspects of statistical time series modeling. While a formal theoretical framework is, as we shall see, an unavoidable necessity, I do not overburden the reader with technical niceties that have little practical impact—my aim is always to provide methods that may be used to analyze and understand the time series that researchers face in the real world.¹ Examples are, therefore, drawn from a variety of fields that I am familiar with and have indeed researched in.

    0.4 Chapter 1, Time Series and Their Features, initiates our analysis by considering some of the features that may be exhibited by an individual time series or a group of time series, and this leads naturally to Chapter 2, Transforming Time Series, where a variety of transformations are introduced that enable observed time series to become more amenable to statistical analysis. Chapter 3, ARMA Models for Stationary Time Series, introduces the basic formal concepts of stochastic processes and stationarity that underpin all statistical time series models. It then develops the basic class of univariate time series models, the autoregressive-moving average (ARMA) process, which is the core model used throughout the book. Not every time series is stationary, however: indeed, many require transforming to stationarity. Integrated processes, which may be made stationary by differencing, are an important class of nonstationary processes and Chapter 4, ARIMA Models for Nonstationary Time Series, thus extends ARMA models to the AR-integrated-MA (or ARIMA) class of processes.²
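The differencing idea behind the ARIMA class can be illustrated with a small simulation (a sketch, not an example from the book): a random walk x_t = x_{t−1} + a_t is nonstationary, but taking first differences recovers the stationary innovations a_t exactly.

```python
import random

random.seed(42)

# Simulate a random walk x[t] = x[t-1] + a[t], with a[t] ~ N(0, 1)
a = [random.gauss(0, 1) for _ in range(1000)]
x = [0.0]
for shock in a:
    x.append(x[-1] + shock)

# First differencing recovers the stationary innovation series exactly
diffs = [x[t] - x[t - 1] for t in range(1, len(x))]
assert all(abs(d - s) < 1e-12 for d, s in zip(diffs, a))

# The level series wanders far from zero; its differences do not
print(max(abs(v) for v in x), max(abs(d) for d in diffs))
```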

    An important aspect of applied modeling is that of determining whether a time series is stationary or nonstationary and, if found to be the latter, of determining what form of nonstationarity it takes. Chapter 5, Unit Roots, Difference and Trend Stationarity, and Fractional Differencing, considers this problem within the context of testing for unit roots, discriminating between difference and trend stationarity, and investigating whether fractional differencing is required. The analysis is extended in Chapter 6, Breaking and Nonlinear Trends, to consider models having breaking and nonlinear trends.

    An important use of time series models is in forecasting future, and hence unobserved, values of a series. Chapter 7, An Introduction to Forecasting With Univariate Models, thus introduces the theory of univariate time series forecasting based on the range of models introduced so far. Chapter 8, Unobserved Component Models, Signal Extraction, and Filters and Chapter 9, Seasonality and Exponential Smoothing, both focus on other types of univariate models, the former on unobserved component models, the latter on exponential smoothing techniques, which are particularly useful when seasonality, which is also discussed in this chapter, is a feature of the time series.

    Chapter 10, Volatility and Generalized Autoregressive Conditional Heteroskedastic Processes, considers volatility, characterized by a time-varying variance, and consequently introduces the generalized autoregressive conditional heteroskedastic (GARCH) process to model such volatility. A time-varying variance may be regarded as a form of nonlinearity, but there are many other types of nonlinear processes and these are reviewed in Chapter 11, Nonlinear Stochastic Processes.

    The remainder of the book deals with multivariate models, beginning in Chapter 12, Transfer Functions and Autoregressive Distributed Lag Modeling, with an introduction to transfer functions and autoregressive-distributed lag (ARDL) models, in which an endogenous time series is influenced by one or more exogenous series. The endogenous/exogenous distinction may be relaxed to allow a group of time series to all be considered endogenous. This leads to the vector autoregressive (VAR) model and the related concept of (Granger) causality which is the topic of Chapter 13, Vector Autoregressions and Granger Causality.

    These two chapters are restricted to analyzing groups of stationary time series, but when integrated series are allowed, some important new concepts need to be introduced. The inclusion of integrated series in an ARDL model requires consideration of the related concepts of error correction and cointegration, as outlined in Chapter 14, Error Correction, Spurious Regressions, and Cointegration. Chapter 15, VARs With Integrated Variables, Vector Error Correction Models, and Common Trends, extends these concepts to a VAR setting, which leads to the vector error correction model (VECM) and the concept of common trends.

    Chapter 16, Compositional and Count Time Series, focuses on both the modeling of time series that come as counts, that is, small integer values for which the usual assumption of a continuous measurement scale is untenable, and the implications of modeling a set of time series that form a composition. These are series that are shares of a whole and must, therefore, satisfy the twin constraints of taking only values between zero and unity and of summing to unity for every observation, constraints which must, for example, be satisfied by forecasts of future shares. Chapter 17, State Space Models, introduces a general setup known as the state space form, which enables many of the models introduced in the book to be cast in a single framework, which may be analyzed using a powerful technique known as the Kalman filter. Chapter 18, Some Concluding Remarks, provides some concluding remarks about the nature of applied time series modeling.

    0.5 Each chapter contains applied examples, some of which are developed over several chapters. All computations use the Econometric Views, Version 10 (EViews 10) software package and full details on how the computations were performed are provided in the computing exercises that accompany each chapter. EViews 10 was used as it is an excellent and popular package developed specifically for the time series techniques discussed in the book. However, I am aware that many researchers outside the economics and finance area use the statistical programming language R, and EViews 10 contains links between EViews routines and R routines, which those researchers may wish to consult.

    0.6 It is assumed that the reader has a background in introductory statistics (at the level of Mills, 2014, say) and some basic knowledge of matrix algebra (Mills, 2013a, Chapter 2, provides a convenient presentation of the material required).

    0.7 A brief word on notation. As can be seen, chapter sections are denoted x.y, where x is the chapter and y is the section (this chapter is numbered 0). This enables the latter to be cross-referenced as §x.y. Matrices and vectors are always written in bold font, upper case for matrices, lower case for vectors wherever possible. The latter are regarded as column vectors unless otherwise stated: thus, A denotes a matrix while a is a vector.

    Endnotes


    ¹There are several highly technical treatments of time series analysis, most notably Brockwell and Davis (1991) and Hamilton (1994), which the interested reader may wish to consult.

    ²Acronyms abound in time series analysis and have even prompted a journal article on them (Granger, 1982), although in the almost four decades since its publication many, many more have been suggested!

    Chapter 1

    Time Series and Their Features

    Abstract

    Time series, although ubiquitous, require analysis with special statistical concepts and techniques, for without them erroneous inferences and conclusions may all too easily be drawn. Autocorrelation is the key feature of an individual time series or a group of time series. Related to this measure of internal correlation are further features, such as stationarity, nonstationarity, often involving the presence of trends, cycles, and periodic movements, seasonality, and volatility. Time series may also exhibit natural constraints, such as nonnegativity, integer values, and series that are defined as proportions, or compositions, of a whole. All these key features are introduced and discussed using examples of observed time series taken from a variety of fields.

    Keywords

    Autocorrelation; stationarity; non-stationarity; trends; volatility; common features

    Chapter Outline

    Autocorrelation and Periodic Movements

    Seasonality

    Stationarity and Nonstationarity

    Trends

    Volatility

    Common Features

    Time Series Having Natural Constraints

    Endnotes

    1.1 As stated in the Introduction, time series are indeed ubiquitous, appearing in almost every research field where data are analyzed. However, their formal study requires special statistical concepts and techniques without which erroneous inferences and conclusions may all too readily be drawn, a problem that statisticians have found necessary to confront since at least Udny Yule’s Presidential Address to the Royal Statistical Society in 1925, provocatively titled Why do we sometimes get nonsense-correlations between time series? A study in sampling and the nature of time series.¹

    1.2 In general, a time series on some variable x will be denoted x_t, where the subscript t indexes the time at which the observation is made; the period running from t = 1, the first observation available on x, to t = T, the last, will often be referred to as the observation period. The observations are typically measured at equally spaced intervals, say every minute, hour, or day, etc., so the order in which observations arrive is paramount. This is unlike, say, data on a cross section of a population taken at a given point in time, where the ordering of the data is usually irrelevant unless some form of spatial dependence exists between observations.²

    1.3 A basic aim of time series analysis is to forecast future, and hence unobserved, values of the series, say x_{T+h}, where h is referred to as the forecast horizon.

    Fig. 1.1 shows monthly observations of an index of the North Atlantic Oscillation (NAO) between 1950 and 2017. The NAO is a weather phenomenon in the North Atlantic Ocean and measures fluctuations in the difference of atmospheric pressure at sea level between two stable air pressure areas, the Subpolar low and the Subtropical (Azores) high. Strong positive phases of the NAO tend to be associated with above-normal temperatures in the eastern United States and across northern Europe and with below-normal temperatures in Greenland and across southern Europe and the Middle East. These positive phases are also associated with above-normal precipitation over northern Europe and Scandinavia and with below-normal precipitation over southern and central Europe. Opposite patterns of temperature and precipitation anomalies are typically observed during strong negative phases of the NAO (see Hurrell et al., 2003).

    Figure 1.1 NAO index: monthly, January 1950–December 2017. NAO, North Atlantic Oscillation. Data from Climate Prediction Center, NOAA Center for Weather and Climate Prediction.

    Clearly, being able to identify recurring patterns in the NAO would be very useful for medium- to long-range weather forecasting, but, as Fig. 1.1 illustrates, no readily discernible patterns seem to exist.

    Autocorrelation and Periodic Movements

    1.4 Such a conclusion may, however, be premature, for there might well be internal correlations within the index that could be useful for identifying interesting periodic movements and for forecasting future values of the index. These internal correlations are typically referred to as the autocorrelations of the series. The lag-k (sample) autocorrelation is defined as

    $$r_k = \frac{c_k}{c_0}\tag{1.1}$$

    where

    $$c_k = T^{-1}\sum_{t=k+1}^{T}(x_t-\bar{x})(x_{t-k}-\bar{x})\tag{1.2}$$

    and

    $$\bar{x} = T^{-1}\sum_{t=1}^{T}x_t\tag{1.3}$$

    are the lag-k sample autocovariance and the sample mean, respectively. The set of sample autocorrelations for various values of k is known as the sample autocorrelation function (SACF) and plays a key role in time series analysis. An examination of the SACF of the NAO index is provided in Example 3.1.
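Eqs. (1.1)–(1.3) translate directly into code. A minimal Python sketch (the book itself works in EViews), applied to a short artificial series whose strong alternation shows up as large sample autocorrelations of alternating sign:

```python
def sacf(x, max_lag):
    """Sample autocorrelation function r_k = c_k / c_0, as in Eqs. (1.1)-(1.3)."""
    T = len(x)
    xbar = sum(x) / T                                   # Eq. (1.3): sample mean
    def c(k):                                           # Eq. (1.2): lag-k autocovariance
        return sum((x[t] - xbar) * (x[t - k] - xbar)
                   for t in range(k, T)) / T
    c0 = c(0)
    return [c(k) / c0 for k in range(1, max_lag + 1)]   # Eq. (1.1)

# A strictly alternating series: lag-1 autocorrelation is strongly negative
x = [1, -1, 1, -1, 1, -1, 1, -1, 1, -1]
r = sacf(x, 3)
print([round(v, 2) for v in r])   # [-0.9, 0.8, -0.7]
```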

    1.5 A second physical time series that has a much more pronounced periodic movement is the annual sunspot number from 1700 to 2017 as shown in Fig. 1.2. As has been well-documented, sunspots display a periodic cycle (the elapsed time from one minimum (maximum) to the next) of approximately 11 years; see, for example, Hathaway (2010). The SACF can be used to calculate an estimate of the length of this cycle, as is done in Example 3.3.

    Figure 1.2 Annual sunspot numbers: 1700–2017. Data from WDC-SILSO, Royal Observatory of Belgium, Brussels.
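The idea of reading a cycle length off the SACF can be sketched on an artificial periodic series (a hypothetical illustration, not the sunspot data): the autocorrelation peaks at the lag equal to the period.

```python
import math

def sacf(x, max_lag):
    """Sample autocorrelations r_k = c_k / c_0."""
    T = len(x)
    xbar = sum(x) / T
    c = lambda k: sum((x[t] - xbar) * (x[t - k] - xbar)
                      for t in range(k, T)) / T
    c0 = c(0)
    return [c(k) / c0 for k in range(1, max_lag + 1)]

# Artificial "sunspot-like" series with an 11-period cycle
period = 11
x = [math.sin(2 * math.pi * t / period) for t in range(220)]

r = sacf(x, 20)
# The largest SACF value beyond lag 1 sits at the cycle length
peak_lag = max(range(2, 21), key=lambda k: r[k - 1])
print(peak_lag)   # 11
```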

    1.6 Fig. 1.3 shows the temperature of a hospital ward taken every hour for several months during 2011 and 2012 (see Iddon et al., 2015, for more details and a description of the data). Here there is a long cyclical movement—an annual swing through the seasons—superimposed upon which are short-term diurnal movements, a considerable amount of random fluctuation (known as noise), typically the consequence of windows being left open on the ward for short periods, and more persistent movements related to external temperatures and solar irradiation (sunshine).

    Figure 1.3 Hourly temperatures in °C of a ward in Bradford Royal Infirmary during 2011 and 2012. Data from Iddon, C.R., Mills, T.C., Giridharan, R., Lomas, K.J., 2015. The influence of ward design on resilience to heat waves: an exploration using distributed lag models. Energy Build., 86, 573–588.

    Seasonality

    1.7 When a time series is observed at monthly or quarterly intervals, an annual seasonal pattern is often an important feature. Fig. 1.4 shows quarterly United Kingdom beer sales between 1997 and 2017, which display a pronounced seasonal pattern that has evolved over the years: first quarter sales are always the lowest, fourth quarter sales are usually, but not always, the highest, while the relative size of second and third quarter sales seems to fluctuate over the observation period.

    Figure 1.4 United Kingdom beer sales (thousands of barrels): quarterly, 1997–2017. Data from the British Beer & Pub Association.
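A quarterly pattern of this kind can be summarized by averaging like quarters across years. A hypothetical sketch (the invented figures below only mimic the qualitative pattern described for the beer-sales series, Q1 lowest and Q4 highest):

```python
# Invented quarterly sales over five years (Q1, Q2, Q3, Q4 per row),
# mimicking the qualitative pattern described in the text
sales = [70, 90, 95, 110,
         68, 88, 96, 108,
         66, 91, 89, 107,
         64, 87, 92, 105,
         62, 85, 90, 103]

# Average each quarter across years to expose the seasonal pattern
n_years = len(sales) // 4
quarter_means = [sum(sales[q::4]) / n_years for q in range(4)]
print([round(m, 1) for m in quarter_means])   # [66.0, 88.2, 92.4, 106.6]

# Q1 lowest and Q4 highest, as in the pattern described in the text
assert min(quarter_means) == quarter_means[0]
assert max(quarter_means) == quarter_means[3]
```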

    Stationarity and Nonstationarity

    1.8 An important feature of Figs. 1.1–1.3 is the absence of any sustained increase or decline in the level of each series over the observation period; in other words, they fluctuate around a constant mean level which is clearly to be expected from the physical nature of each series. A constant mean level is one, but not the only, condition for a series to be stationary. In contrast, beer sales in Fig. 1.4 show a decline throughout the observation period, most noticeably from 2004 onwards. Clearly beer sales have not fluctuated around a constant mean level; rather, the mean level has fallen in recent years which has been a well-documented concern of UK brewers.

    1.9 If the mean level cannot be regarded as constant then a series is said to be nonstationary. Nonstationarity, however, can appear in many guises. Fig. 1.5 plots daily observations on the US dollar–UK sterling ($–£) exchange rate from 1975 to 2017, a period in which this exchange rate fluctuated widely. The rate fell from around $2.50 at the start of the 1980s to almost parity by the middle of that decade, before rebounding to about $2 at the beginning of the 1990s. In the past 25 years it has undergone three further steep declines, known as depreciations, corresponding to the United Kingdom’s exit in October 1992 from the European Exchange Rate Mechanism, the onset of the global financial crisis during 2008, and the Brexit referendum of June 2016, interspersed by a major appreciation from around $1.4 to over $2 between 2003 and 2008.

    Figure 1.5 $–£ Exchange rate: daily observations, January 1975–December 2017. Data from Bank of England.

    Clearly, the assumption of a constant mean level for the exchange rate would not be appropriate either statistically or, indeed, economically, for this would imply an equilibrium level for the exchange rate which would, consequently, provide a one-way bet for dealers whenever the rate got too far from this level, either above or below. The exchange rate thus exhibits a form of nonstationarity that can be termed random walk or unit root nonstationarity, terms that will be defined and discussed in detail in Chapter 4, ARIMA Models for Nonstationary Time Series, and Chapter 5, Unit Roots, Difference and Trend Stationarity, and Fractional Differencing.

    Trends

    1.10 Just as clearly, the exchange rate does not exhibit an overall trend throughout the observation period, this being informally thought of as a generally monotonic upward or downward movement, which would here imply either a perpetual appreciation or depreciation of the currency; a movement that could not happen in practice as it would, again, offer a one-way bet to dealers in what is perhaps the most efficient of financial markets.

    1.11 Trends, however, are to be found in many time series.³ Fig. 1.6 shows per capita wine and spirits consumption for the United Kingdom from 1950 to 2015. Both show positive trends, that for wine being stronger than that for spirits, and to a first approximation both trends have reasonably constant slopes. The two series thus appear to exhibit linear trends.

    Figure 1.6 Annual consumption of wines and spirits in the United Kingdom: liters of alcohol per capita, 1950–2015. Data from megafile_of_global_wine_data_1835_to_2016_031117.xlsx. Available from: <www.adelaide.edu.au/wine-econ/databases>.
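A linear trend of the kind described for the wine and spirits series is just an ordinary least squares regression of the series on time. A minimal sketch on a constructed series (not the consumption data), where the fitted slope recovers the true one:

```python
def linear_trend(x):
    """OLS fit of x_t = a + b*t (t = 0, 1, ...); returns (intercept, slope)."""
    T = len(x)
    t_mean = (T - 1) / 2
    x_mean = sum(x) / T
    b = (sum((t - t_mean) * (x[t] - x_mean) for t in range(T))
         / sum((t - t_mean) ** 2 for t in range(T)))
    return x_mean - b * t_mean, b

# An exact linear trend, 66 "annual" observations, is recovered perfectly
x = [2.0 + 0.5 * t for t in range(66)]
a, b = linear_trend(x)
print(round(a, 6), round(b, 6))   # 2.0 0.5
```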

    1.12 Many trends, though, do not have constant slopes and hence are nonlinear. Monthly observations from 1948 to 2017 on the UK retail price index are shown in Fig. 1.7, and the index clearly has a nonconstant slope. Since the slope measures the change in the index, it is related to, but is not identical with, inflation and the shifting slopes thus reveal the evolution of price increases in the United Kingdom during the postwar period (how measures of inflation might in practice be constructed from a price index is discussed in Chapter 2: Transforming Time Series). Note the sequence of falling prices in the latter half of 2008 at the height of the global financial crisis which produced a brief period of deflation; a rare event in developed economies in the postwar era.

    Figure 1.7 UK RPI: monthly, January 1948–December 2017. RPI, Retail price index. Data from U.K. Office of National Statistics.
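The relation between the index's slope and inflation can be made concrete: annual inflation at month t is usually computed as the year-on-year percentage change of the index, 100 × (P_t / P_{t−12} − 1). A sketch on invented index values (the construction of such measures is discussed in Chapter 2):

```python
def yoy_inflation(index, t):
    """Year-on-year inflation (%) at month t from a monthly price index."""
    return 100 * (index[t] / index[t - 12] - 1)

# Invented monthly index rising 0.25% per month, i.e. about 3.04% per year
index = [100 * 1.0025 ** t for t in range(36)]
print(round(yoy_inflation(index, 24), 2))   # 3.04
```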

    1.13 Ascertaining whether a time series contains a trend may be quite difficult. Consider Fig. 1.8 which shows monthly global temperatures from 1850 to 2017. While there is certainly a general upward movement during the observation period, it is by no means monotonic, with several sustained periods of relative constancy and even decline, thus seeming to rule out a linear trend for the entire observation period. For the anthropogenic global warming hypothesis to hold, this series must exhibit random walk nonstationarity with a positive drift. Whether the drift is significantly positive will be considered in Example 4.3.

    Figure 1.8 Global temperature: monthly, January 1850–December 2017. Data from Met Office Hadley Centre for Climate Science and Services.

    Volatility

    1.14 A second condition of stationarity is that of constant variance. Fig. 1.9 shows the daily percentage change in the $–£ exchange rate plotted in Fig. 1.5. Although the plot of the percentage changes is dominated by the 8% decline in sterling on June 24, 2016, after the announcement of the Brexit referendum result, the entire sequence of changes is characterized by periods of relative calm interspersed by bursts of volatility, so that the variance of the series changes continuously.
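This time-varying variance can be seen with a rolling standard deviation; a hypothetical sketch on artificial returns containing a volatility burst (not the actual $–£ data):

```python
import random

random.seed(1)

# Artificial daily returns: calm (sd = 0.5) with a burst of volatility (sd = 3)
returns = ([random.gauss(0, 0.5) for _ in range(200)]
           + [random.gauss(0, 3.0) for _ in range(50)]
           + [random.gauss(0, 0.5) for _ in range(200)])

def rolling_sd(x, window):
    """Standard deviation over a trailing window at each point."""
    out = []
    for t in range(window, len(x) + 1):
        w = x[t - window:t]
        m = sum(w) / window
        out.append((sum((v - m) ** 2 for v in w) / window) ** 0.5)
    return out

sd = rolling_sd(returns, 20)
# The rolling standard deviation spikes inside the burst (windows ending t = 200-249)
print(max(sd[180:230]) > 2 * max(sd[:150]))   # True
```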
