Modelling Stock Prices with Exponential Weighted Moving Average (EWMA)


Received 2 January 2016; accepted 23 February 2016; published 26 February 2016

1. Introduction

It is well established that, to estimate the volatility of a stock price empirically, the stock price is usually observed at fixed intervals of time. One particular objective of EWMA is to track changes in the volatility. For small λ values, recent observations affect the estimate promptly; for λ values closer to one, the estimate responds only slowly to recent changes in the returns of the underlying variable.
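The effect of the decay factor can be illustrated with a minimal sketch (our own Python illustration, not the paper's Excel VBA code; the function name and parameter values are ours):

```python
# One step of the EWMA variance recursion, to show how the decay factor
# lambda controls responsiveness. All numbers below are illustrative.

def ewma_update(prev_var, ret, lam):
    """Blend the previous variance estimate with the latest squared return."""
    return lam * prev_var + (1 - lam) * ret ** 2

prev_var = 0.0001   # yesterday's variance estimate (about 1% daily volatility)
shock = 0.05        # a large 5% return arrives

fast = ewma_update(prev_var, shock, lam=0.80)   # small lambda: reacts promptly
slow = ewma_update(prev_var, shock, lam=0.99)   # lambda near one: changes slowly
```

With λ = 0.80 the estimate jumps much more after the shock than with λ = 0.99, which is exactly the trade-off described above.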

The aim of this research paper is to study the daily returns of the FTSE 100 Stock Prices of the top 100 companies listed on the London Stock Exchange from 30th June 2009 to 1st December 2014, and thereby forecast the daily returns from 1st December 2014 to 5th February 2015 with the Exponential Weighted Moving Average (EWMA) Model. The data for this model will be obtained directly from the UK FTSE 100 Index. The paper will make use of Monte Carlo simulation, implemented in Excel VBA, to predict the future FTSE 100 stock prices.
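The Monte Carlo idea can be sketched as follows. This is a hedged Python illustration of the general technique (the paper's own implementation is in Excel VBA); the drift, volatility, starting price and horizon below are placeholders, not the paper's estimates:

```python
import math
import random

# Sketch of Monte Carlo price simulation: draw many geometric-Brownian-motion
# paths from an assumed drift and volatility, then average the outcomes.
# Parameter values are illustrative only.

def simulate_path(s0, mu, sigma, n_days, rng):
    """Simulate one GBM price path at daily steps."""
    path = [s0]
    for _ in range(n_days):
        z = rng.gauss(0.0, 1.0)
        path.append(path[-1] * math.exp((mu - 0.5 * sigma ** 2) + sigma * z))
    return path

rng = random.Random(42)                       # fixed seed for reproducibility
paths = [simulate_path(6500.0, 0.0002, 0.01, 45, rng) for _ in range(1000)]
mean_final = sum(p[-1] for p in paths) / len(paths)
```

Averaging over many simulated paths gives a distribution of possible future prices, from which point forecasts and ranges can be read off.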

The paper is structured into four sections. Section 1 introduces the paper and gives the benefits of modelling in detail. Section 2 reviews the literature briefly, while Section 3 deals with the data and its analysis. The last section gives the conclusion of the paper.

Benefits of Modelling

A model is a representation of real-world events. A model can be defined as “a simplified description of reality that is at least potentially useful in decision-making” [1]. It is an imitation of real-world events which gives us the opportunity to study events, and thereby make accurate decisions about the future, before the actual events take place. It gives us the ability to study a long-term investment in a compressed time frame, to decide whether the investment is profitable or not: with the support of models, systems with long time frames can be studied in compressed time. A stochastic model recognises the random nature of some input parameters, such as interest rates, exchange rates of countries’ currencies and the inflation rate, whereas standard mathematical and logical models are not capable of allowing for such randomness. Moreover, models make it possible to study and compare different possible future alternatives or actions, so as to choose the least expensive option that best suits the requirements of the user and thereby avoid the potential costs of trialling in real life. Another benefit is that setting up experimental conditions allows us to gain control over the model, so as to reduce the variance of the output without upsetting the mean values. Finally, models create a way of testing the sensitivity of the parameters used in a model, to measure their effect on the output. This opportunity is not available in the real-world system.

2. Literature Review

Modelling the Future Stock Price of Stock Market

Recently, modelling the time-varying nature of the volatility of emerging stock markets has attracted the focus of researchers. [2] centred their research on the stock market volatility of the 10 largest emerging markets in Asia and Latin America. They found that shifts in the volatility of emerging markets are driven by country-specific political, social and economic events. In addition, many authors have recently researched the volatility of Central and Eastern European stock markets. [3] - [5] discovered, inter alia, that significant autocorrelation, high volatility persistence, significant asymmetry, a lack of relationship between stock market volatility and expected return, and non-normality of the return distribution are basic characteristics of stock market volatility in transition countries.

Aside from all these, [6] equally forecasted the stock market volatility of fourteen stock markets. Using eleven models, and both symmetric and asymmetric loss functions to evaluate their performance, they found that, according to symmetric loss functions, the exponential smoothing model provides the best forecast; however, when asymmetric loss functions are applied, ARCH-type models provide the best forecast.

In practice, stock market prediction is the process of determining the future value of a stock or other financial instrument traded on a stock exchange. Any successful prediction of a stock’s future price usually results in high profit, because of the common problem associated with forecasting stock prices, which is uncertainty. According to the efficient-market hypothesis, stock price movements, which are quite unpredictable, are governed by the random walk hypothesis, implying that the best forecast of tomorrow’s price is today’s price. Some researchers have identified a large number of statistical models and financial variables that are useful for predicting the future price of a stock market.

3. The Exponentially Weighted Moving Average (EWMA) Model

The Exponentially Weighted Moving Average (EWMA) model was derived by JP Morgan in 1989 for their RiskMetrics framework [7] from a Gaussian distribution. The EWMA method of calculating volatility places more emphasis on more recent returns, the reasoning being that recent price movement is the best predictor of future movement. This was an improvement on the simple volatility method. The EWMA gives the volatility forecast for day n in the form of the following equation:

σ_{n}^{2} = λσ_{n − 1}^{2} + (1 − λ)u_{n − 1}^{2}

where σ_{n}^{2} is the variance estimate for day n, u_{n − 1} is the return observed on day n − 1, and λ (0 < λ < 1) is the parameter of the model, the so-called decay factor.
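Applied over a whole return series, the recursion can be sketched as follows (our own Python illustration with our own names; the seeding choice is an assumption, since the source does not specify one):

```python
def ewma_variance_forecasts(returns, lam=0.94):
    """Run the EWMA recursion sigma_n^2 = lam*sigma_{n-1}^2 + (1-lam)*u_{n-1}^2
    over a list of returns. Seeded with the first squared return (an assumption).
    Element i is the variance forecast formed from returns[0..i]."""
    var = returns[0] ** 2
    out = [var]
    for u in returns[1:]:
        var = lam * var + (1 - lam) * u ** 2
        out.append(var)
    return out
```

Each forecast reuses the previous one, so only the latest return and the latest variance need to be stored, which is one practical attraction of the model.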

According to [8], there are a number of different methods for calibrating the parameters of the EWMA model. An extensive overview of these approaches is given in [7], and [8] further summarises their main ideas. He also states that the parameter n refers to the number of historical observations used to produce the estimate, and that a different choice of this parameter does not significantly influence the accuracy of the forecast.

3.1. The Autoregressive Conditional Heteroskedasticity (ARCH), Generalized ARCH (or GARCH) Models

The Autoregressive Conditional Heteroskedasticity (ARCH) model was first introduced by Engle in 1982 [9] . According to [10] , Rob Engle’s seminal Nobel Prize winning 1982 Econometrica article on the Auto Regressive Conditional Heteroskedastic (ARCH) class of models spurred a virtual “arms race” into the development of new and better procedures for modelling and forecasting time-varying financial market volatility.

[11] gave a definition of the GARCH(p, q) model, where p is the order of the GARCH terms σ^{2} and q is the order of the ARCH terms ε^{2}:

σ_{t}^{2} = ω + α_{1}ε_{t − 1}^{2} + ⋯ + α_{q}ε_{t − q}^{2} + β_{1}σ_{t − 1}^{2} + ⋯ + β_{p}σ_{t − p}^{2}

Many researchers have worked on GARCH extensions, which have led to the development of EGARCH, IGARCH, etc. In essence, these models are the most popular for forecasting financial volatility and returns. In 1993, [12] introduced a new partially nonparametric model, the Nonlinear Asymmetric GARCH(1, 1), or NAGARCH.
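For intuition about the GARCH family, here is a minimal Python sketch (ours, not from the paper) that simulates a plain GARCH(1, 1) series σ_{t}^{2} = ω + αε_{t − 1}^{2} + βσ_{t − 1}^{2}; the parameter values are illustrative:

```python
import math
import random

# Simulate returns from a GARCH(1,1) process. Illustrative parameters only;
# alpha + beta < 1 is required so the unconditional variance exists.

def simulate_garch11(omega, alpha, beta, n, rng):
    var = omega / (1 - alpha - beta)   # start at the unconditional variance
    eps_prev = 0.0
    returns = []
    for _ in range(n):
        var = omega + alpha * eps_prev ** 2 + beta * var
        eps_prev = math.sqrt(var) * rng.gauss(0.0, 1.0)
        returns.append(eps_prev)
    return returns

rng = random.Random(0)
r = simulate_garch11(omega=1e-6, alpha=0.08, beta=0.90, n=5000, rng=rng)
sample_var = sum(x * x for x in r) / len(r)
```

The simulated series shows the volatility clustering that these models are designed to capture: quiet stretches punctuated by bursts of large returns.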

These models have been extended many times, but out of them we decided to use EWMA, because one of its main objectives is to estimate the next-day (or next-period) volatility of a time series and closely track the volatility as it changes; that is, the volatility of a market variable on day n, as estimated at the end of day n − 1.

3.2. Analysis of This Model and Its Output (EWMA)

It has been observed that one of the major advantages of EWMA is that it gives more weight to recent returns when calculating the volatility. In this paper, we will look at how volatility is calculated using EWMA.

Step 1: Calculate the log returns of the price series.

To determine the returns, we first calculate the daily lognormal returns, using the formula ln(P_{i}/P_{i − 1}), where P_{i} represents each day’s closing stock price. We use the natural log because we want the returns to be continuously compounded. We then have daily returns for the entire price series.
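Step 1 can be sketched in a few lines of Python (ours; the closing prices below are made-up placeholders, not FTSE 100 data):

```python
import math

# Daily log returns ln(P_i / P_{i-1}) from a list of closing prices.
prices = [100.0, 101.0, 99.5, 100.2]   # placeholder closing prices
log_returns = [math.log(p / q) for q, p in zip(prices, prices[1:])]
```

A series of n prices yields n − 1 returns, one for each consecutive pair of days.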

Step 2: The variance rate is the square of volatility.

The next step is to take the square of the log returns. This is the basis of the simple variance (or volatility) calculation, represented by the following formula:

σ^{2} = (u_{1}^{2} + u_{2}^{2} + ⋯ + u_{m}^{2})/m

Here, u_{i} represents the returns and m represents the number of days.
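Step 2 amounts to averaging the squared returns, as in this small sketch (ours, with made-up numbers; the zero-mean treatment of returns follows the formula above):

```python
# Simple variance as the average of squared returns (zero-mean assumption).
returns = [0.01, -0.02, 0.015]   # placeholder daily log returns
m = len(returns)
variance = sum(u ** 2 for u in returns) / m
```

This equal-weighted variance is the benchmark that EWMA improves on by weighting recent observations more heavily.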

Step 3: Assign weights.

In order to assign weights so that recent returns have a higher weight and older returns a lower weight, we need a factor called lambda (λ), which is a smoothing constant or persistence parameter; lambda must be less than 1. The weight assigned to the squared return from i days ago is (1 − λ)λ^{i}. The RiskMetrics database (produced by JP Morgan and made publicly available) uses the EWMA with λ = 0.94 for updating daily volatility. The first weight will be (1 − 0.94) = 6%, the second weight will be 6% × 0.94 = 5.64%, and so on. In EWMA all the weights sum to 1; however, they decline at a constant ratio of λ.
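The weighting scheme can be checked directly (our own Python sketch; the 100-term window is an arbitrary truncation of the infinite series):

```python
# EWMA weights (1 - lam) * lam**i for the return i days old, lam = 0.94.
lam = 0.94
weights = [(1 - lam) * lam ** i for i in range(100)]
# weights[0] is 6%, weights[1] is 5.64%, each subsequent weight is
# the previous one times lam; the infinite series sums to exactly 1.
```

Truncating at 100 terms already captures more than 99% of the total weight, which is why a finite history suffices in practice.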

Step 4: We now multiply the squared returns by the weights and sum the products R^{2} × w, which gives the final EWMA variance. The volatility is the square root of the variance.
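Putting Steps 3 and 4 together gives a complete hedged sketch (our own function; renormalising the finite set of weights so they sum to one is an implementation choice of ours, not stated in the source):

```python
import math

# EWMA volatility as the weighted sum of squared returns.
def ewma_volatility(returns, lam=0.94):
    """returns[0] is the MOST RECENT observation. Weights (1-lam)*lam**i are
    renormalised so a finite window still sums to one (our choice)."""
    weights = [(1 - lam) * lam ** i for i in range(len(returns))]
    total = sum(weights)
    variance = sum(w * u ** 2 for w, u in zip(weights, returns)) / total
    return math.sqrt(variance)
```

Because the weights decline with age, a large return observed yesterday raises the volatility estimate more than the same return observed a month ago.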

Table 1 below shows the simulated data.

Table 1. The simulated data.

Figure 1. Forecasted daily returns.

4. Conclusions

EWMA is basically a special form of an ARCH(m) model, with characteristics which include the fact that the ARCH order m is equal to the sample data size and the weights decline exponentially at rate λ through time. The main reason for using EWMA is that it is particularly useful for estimating the next-day (or next-period) volatility of a time series and closely tracking the volatility as it changes.

In this research paper, we have examined the daily returns of the FTSE 100 Stock Prices of the top 100 companies listed on the London Stock Exchange from 30th June 2009 to 1st December 2014, and forecasted the daily returns from 1st December 2014 to 5th February 2015 with the Exponential Weighted Moving Average (EWMA) Model. We found that there is a very high possibility that the stock prices will start to fall from 5th February 2015 onwards (Figure 1).

References

[1] Geweke, J. (2005) Contemporary Bayesian Econometrics and Statistics. John Wiley & Sons, Hoboken.

http://dx.doi.org/10.1002/0471744735

[2] Aggarwal, R., Inclan, C. and Leal, R. (1999) Volatility in Emerging Stock Markets. Journal of Financial and Quantitative Analysis, 34, 33-55.

http://dx.doi.org/10.2307/2676245

[3] Kasch-Haroutounian, M. and Price, S. (2001) Volatility in Transition Market of Central Europe. Applied Financial Economics, 11, 93-105.

http://dx.doi.org/10.1080/09603100150210309

[4] Gilmore, C.G. and McManus, G.M. (2001) Random-Walk and Efficiency of Central European Equity Markets. Presentation at the 2001 European Financial Management Association Annual Conference, June 2001, Lugano, Switzerland.

[5] Poshakwale, S. and Murinde, V. (2001) Modelling the Volatility in East European Emerging Stock Markets: Evidence on Hungary and Poland. Applied Financial Economics, 11, 445-456.

http://dx.doi.org/10.1080/096031001300314009

[6] Balaban, E., Bayar, A. and Faff, R. (2003) Forecasting Stock Market Volatility: Evidence from 14 Countries. 10th Global Finance Conference 2003, Frankfurt/Main, 15-17 June 2003.

[7] Morgan, J.P. (1996) Risk Metrics—Technical Document. J.P. Morgan/Reuters, New York.

[8] Ladokhin, S. (2009) Volatility Modelling in Financial Markets.

http://www.math.vu.nl/~sbhulai/theses/stageverslag-ladokhin.pdf

[9] Engle, R. (1982) Autoregressive Conditional Heteroscedasticity with Estimates of the Variance of United Kingdom Inflation. Econometrica, 50, 987-1007.

http://dx.doi.org/10.2307/1912773

[10] Bollerslev, T. (2007) Glossary to ARCH (GARCH). Duke University and NBER.

http://faculty.chicagobooth.edu/jeffrey.russell/teaching/Finecon/readings/glossary.pdf

[11] Bollerslev, T. (1986) Generalized Autoregressive Conditional Heteroskedasticity. Journal of Econometrics, 31, 307-327.

http://dx.doi.org/10.1016/0304-4076(86)90063-1

[12] Engle, R. and Ng, V. (1993) Measuring and Testing the Impact of News on Volatility. Journal of Finance, 48, 1749-1778.

http://dx.doi.org/10.1111/j.1540-6261.1993.tb05127.x