Moving Average Methods
Edward L. Boone
Department of Statistical Sciences and Operations Research
Virginia Commonwealth University

November 11, 2013

Simple Moving Average

We are considering time series data xt where t = 1, 2, ..., T .
The order of the observations matters.
A simple moving average attempts to find a local mean.
This can be done simply by taking the average of the points
around the time of interest.
For example, if we are interested in a window of width k , we
take the 2k + 1 points xt−k , ..., xt−1 , xt , xt+1 , ..., xt+k and
compute their average.

Example
Consider the following example:

x1   x2   x3   x4   x5   x6   x7   x8   x9   x10
1.2  1.3  1.1  1.2  1.4  1.7  1.6  1.8  1.5  1.6

If we want the moving average at time t = 3 with window 2:

x̄3,2 = (x1 + x2 + x3 + x4 + x5)/5 = (1.2 + 1.3 + 1.1 + 1.2 + 1.4)/5 = 1.24

If we want the moving average at time t = 7 with window 2:

x̄7,2 = (x5 + x6 + x7 + x8 + x9)/5 = (1.4 + 1.7 + 1.6 + 1.8 + 1.5)/5 = 1.6

Notice that the “local” means are not similar.
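As a quick check, the two averages above can be computed with a short function. This is an illustrative sketch, not part of the original slides; the name centered_ma is my own.

```python
# Illustrative sketch (not from the slides): a centered simple moving
# average with half-width k, averaging the 2k + 1 points
# x[t-k], ..., x[t], ..., x[t+k].
def centered_ma(x, t, k):
    # t is a 1-based time index, matching the slides' notation
    window = x[t - 1 - k : t + k]
    return sum(window) / len(window)

x = [1.2, 1.3, 1.1, 1.2, 1.4, 1.7, 1.6, 1.8, 1.5, 1.6]
print(round(centered_ma(x, 3, 2), 4))  # x̄3,2 = 1.24
print(round(centered_ma(x, 7, 2), 4))  # x̄7,2 = 1.6
```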
Trailing Moving Average
The problem with a standard moving average is that the mean at
time t requires xt+1 , xt+2 , ..., xt+k , which lie in the future.
In many useful cases we don’t know the future.
We want to use only past values.
This leads to the idea of the trailing moving average.
Only take the average of xt−k , xt−k+1 , ..., xt−1 , xt :

x̄t,k = (1 / (k + 1)) · Σ_{i = t−k}^{t} xi
Example
Again consider the following example.

x1   x2   x3   x4   x5   x6   x7   x8   x9   x10
1.2  1.3  1.1  1.2  1.4  1.7  1.6  1.8  1.5  1.6

Trailing moving average with window k = 2:

x̄3,2 = (x1 + x2 + x3)/3 = (1.2 + 1.3 + 1.1)/3 = 1.2
x̄4,2 = (x2 + x3 + x4)/3 = (1.3 + 1.1 + 1.2)/3 = 1.2
...
x̄7,2 = (x5 + x6 + x7)/3 = (1.4 + 1.7 + 1.6)/3 ≈ 1.57
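The trailing version only looks backward, so it can be sketched the same way. Again an illustrative sketch, not from the slides; trailing_ma is my own name.

```python
# Illustrative sketch (not from the slides): a trailing moving average
# using only the current and past k values, x[t-k], ..., x[t].
def trailing_ma(x, t, k):
    # t is a 1-based time index, matching the slides' notation
    window = x[t - 1 - k : t]
    return sum(window) / len(window)

x = [1.2, 1.3, 1.1, 1.2, 1.4, 1.7, 1.6, 1.8, 1.5, 1.6]
print(round(trailing_ma(x, 3, 2), 2))  # x̄3,2 = 1.2
print(round(trailing_ma(x, 7, 2), 2))  # x̄7,2 ≈ 1.57
```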
Simple vs. Trailing Moving Average

There are some issues that we will have to confront with all
time series methods.
How to handle the starting values?
Outliers?
Gaps?
Prediction?
Some of these are easier to deal with than others.

Simple vs. Trailing Moving Average

Consider the example to the right.

Red is the centered moving average.
Blue is the trailing moving average.

Notice that the red line is “smoother” than the blue line.

[Figure: a simulated series (“True”) with the centered (“Center”, red)
and trailing (“Trail”, blue) moving averages overlaid; x plotted
against t, t = 0 to 100.]
Issues with Moving Averages

Problems with simple moving average techniques:
In the previous methods, all observations in the window get the
same weight.
We may wish to downweight observations as they get farther in
the past, while still using all observations.
Gaps?
Prediction?
Some of these are easier to deal with than others.

Exponentially Weighted Moving Average
A “simple” way to address the weighting problem is to use a
weighted moving average.
There are several versions of these.
Each attempts to model the components of a time series
dataset.
If we just want to model the level then the Exponentially
Weighted Moving Average may be reasonable.
S1 = x1
St = αxt + (1 − α)St−1
These downweight the previous observations but still use
all observations.
Example
Again consider the following example using α = 0.3.

x1   x2   x3   x4   x5   x6   x7   x8   x9   x10
1.2  1.3  1.1  1.2  1.4  1.7  1.6  1.8  1.5  1.6

S1  = x1 = 1.2
S2  = αx2 + (1 − α)S1 = 0.3(1.3) + 0.7(1.2)    = 1.23
S3  = αx3 + (1 − α)S2 = 0.3(1.1) + 0.7(1.23)   = 1.191
S4  = αx4 + (1 − α)S3 = 0.3(1.2) + 0.7(1.191)  = 1.1937
S5  = αx5 + (1 − α)S4 = 0.3(1.4) + 0.7(1.1937) = 1.2559
S6  = αx6 + (1 − α)S5 = 0.3(1.7) + 0.7(1.2559) = 1.3891
S7  = αx7 + (1 − α)S6 = 0.3(1.6) + 0.7(1.3891) = 1.4523
S8  = αx8 + (1 − α)S7 = 0.3(1.8) + 0.7(1.4523) = 1.5566
S9  = αx9 + (1 − α)S8 = 0.3(1.5) + 0.7(1.5566) = 1.5396
S10 = ??
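The calculations above can be reproduced with a short recursion. This is an illustrative sketch, not from the slides; the name ewma is my own.

```python
# Illustrative sketch (not from the slides) of the exponentially
# weighted moving average recursion: S1 = x1,
# St = alpha * xt + (1 - alpha) * S(t-1).
def ewma(x, alpha):
    S = [x[0]]                       # S1 = x1
    for xt in x[1:]:
        S.append(alpha * xt + (1 - alpha) * S[-1])
    return S

x = [1.2, 1.3, 1.1, 1.2, 1.4, 1.7, 1.6, 1.8, 1.5, 1.6]
S = ewma(x, alpha=0.3)
print(round(S[8], 4))  # S9 = 1.5396, matching the table above
```

Running the recursion one more step gives S10, the value left as ?? above.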
Example

A picture of what the calculations give.

[Figure: the data and the smoothed values St plotted against time,
t = 1 to 10; x ranges from 1.1 to 1.8.]
Exponentially Weighted Moving Average

What we have looked at so far is only concerned with estimating a
level (mean).
It only models the level.
We also want to model the trend.
We want to model the seasonality as well.
In order to do this we will need to build the model from these
basic components.

Double Exponential Smoothing
Now we can add in a trend term bt .
Starting values:

S1 = x1
b1 = x2 − x1

Process smoothing:

St = αxt + (1 − α)(St−1 + bt−1 )
bt = β(St − St−1 ) + (1 − β)bt−1
Example
Again consider the following example using α = 0.3 and β = 0.2.

x1   x2   x3   x4   x5   x6   x7   x8   x9   x10
1.2  1.3  1.1  1.2  1.4  1.7  1.6  1.8  1.5  1.6

S1 = x1 = 1.2
b1 = x2 − x1 = 1.3 − 1.2 = 0.1
S2 = αx2 + (1 − α)(S1 + b1 ) = 0.3(1.3) + 0.7(1.2 + 0.1) = 1.3
b2 = β(S2 − S1 ) + (1 − β)b1 = 0.2(1.3 − 1.2) + 0.8(0.1) = 0.1
...
S10 = 1.7343
b10 = 0.0576
Example
Again consider the following example using α = 0.3 and β = 0.2.

x1   x2   x3   x4   x5   x6   x7   x8   x9   x10
1.2  1.3  1.1  1.2  1.4  1.7  1.6  1.8  1.5  1.6

Predict x11 and x12 .

S10 = 1.7343
b10 = 0.0576

x11 = S10 + b10 = 1.7343 + 0.0576 = 1.7919
x12 = S10 + 2b10 = 1.7343 + 2(0.0576) ≈ 1.8496 (using unrounded
values of S10 and b10 )
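The smoothing recursion and the forecasts can be sketched together. This is an illustrative sketch, not from the slides; the name double_exp_smooth is my own.

```python
# Illustrative sketch (not from the slides) of double exponential
# smoothing: S1 = x1, b1 = x2 - x1, then
#   St = alpha * xt + (1 - alpha) * (S(t-1) + b(t-1))
#   bt = beta * (St - S(t-1)) + (1 - beta) * b(t-1)
# m-step-ahead forecasts are S_T + m * b_T.
def double_exp_smooth(x, alpha, beta):
    S, b = x[0], x[1] - x[0]         # starting values S1, b1
    for xt in x[1:]:
        S_new = alpha * xt + (1 - alpha) * (S + b)
        b = beta * (S_new - S) + (1 - beta) * b
        S = S_new
    return S, b

x = [1.2, 1.3, 1.1, 1.2, 1.4, 1.7, 1.6, 1.8, 1.5, 1.6]
S10, b10 = double_exp_smooth(x, alpha=0.3, beta=0.2)
print(round(S10, 4), round(b10, 4))  # 1.7343 0.0576
print(round(S10 + 2 * b10, 4))       # x12 forecast: 1.8496
```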
Triple Exponential Smoothing

To have a level, trend, and season can get a bit complicated.
We need to know the period at which the seasonality manifests.
For quarterly data the seasonal “lag” L may be 4.
For monthly data the seasonal “lag” L may be 12.
For weekly data the seasonal “lag” L may be 52.
For daily data the seasonal “lag” L may be 365.
Think how complicated hourly data would be.
For simplicity we will consider quarterly data with L = 4.

Triple Exponential Smoothing
This is also known as the Holt-Winters method.
Process smoothing:

St = α(xt /Ct−L ) + (1 − α)(St−1 + bt−1 )
bt = β(St − St−1 ) + (1 − β)bt−1
Ct = γ(xt /St ) + (1 − γ)Ct−L

Starting values:
These are more difficult to get.
Some use the first few cycles and get means.
Some use regression to get S0 and b0 , then use those to get the
initial C’s.
We will let R do this for us so we don’t have to worry about it.
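To make the recursions concrete, here is a minimal sketch of multiplicative Holt-Winters smoothing with a simple first-cycle initialization. This is my own illustrative re-implementation, not the slides' R code; the names holt_winters and hw_forecast, the initialization scheme, and the toy data are all assumptions.

```python
# Illustrative sketch (not the slides' R code) of multiplicative
# Holt-Winters (triple exponential smoothing) with first-cycle
# initialization of level, trend, and seasonal factors.
def holt_winters(x, L, alpha, beta, gamma):
    mean1 = sum(x[:L]) / L                 # mean of the first cycle
    mean2 = sum(x[L:2 * L]) / L            # mean of the second cycle
    S = mean1                              # initial level
    b = (mean2 - mean1) / L                # initial per-period trend
    C = [x[i] / mean1 for i in range(L)]   # initial seasonal factors
    fitted = []
    for t in range(L, len(x)):
        c_old = C[t % L]                   # plays the role of C_{t-L}
        fitted.append((S + b) * c_old)     # one-step-ahead fit
        S_new = alpha * x[t] / c_old + (1 - alpha) * (S + b)
        b = beta * (S_new - S) + (1 - beta) * b
        C[t % L] = gamma * x[t] / S_new + (1 - gamma) * c_old
        S = S_new
    return S, b, C, fitted

def hw_forecast(S, b, C, n, L, m):
    # m-step-ahead forecast: level plus trend, rescaled by the
    # matching seasonal factor
    return (S + m * b) * C[(n + m - 1) % L]

# Quarterly toy data (L = 4) with an upward trend and a repeating
# within-year pattern
x = [10, 14, 8, 12, 11, 15, 9, 13, 12, 16, 10, 14]
S, b, C, fitted = holt_winters(x, L=4, alpha=0.3, beta=0.2, gamma=0.1)
x13 = hw_forecast(S, b, C, n=len(x), L=4, m=1)
```

In R one would instead call HoltWinters on a ts object and let it choose the starting values, as the slide notes.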
Example

Consider the following data: SeasonalSales.csv

[Figure: Sales (roughly 2000 to 6000) plotted against Time,
t = 0 to 40.]
HoltWinters Function in R

Using the HoltWinters function in R we can estimate:
the smoothing parameters
the fitted values
The fitted values also contain the level, trend, and seasonal
components.
Example

Consider the following data: SeasonalSales.csv

[Figure: Sales (roughly 2000 to 6000) plotted against Time,
t = 2 to 12.]
Example

A closer look:

[Figure: Sales vs. Time, zoomed to t = 8.0 to 10.0.]
Example

Predict into the future.

[Figure: Sales (roughly 2000 to 7000) vs. Time, t = 2 to 14, with
predictions extending beyond the observed data.]
Example

A closer look.

[Figure: Sales vs. Time, zoomed to t = 12.0 to 15.0.]
Example

An even closer look.

[Figure: Sales vs. Time, zoomed to t = 12.0 to 15.0; Sales range
roughly 2350 to 2600.]
Conclusion
Moving average and “smoothing” methods are ad hoc methods
for analyzing time series data.
MA methods require a user-chosen window k .
Smoothing methods downweight past observations.
These methods can directly model the level, trend, and season
components.
Since they are ad hoc they can produce odd results at times.
While these methods are useful we need to be careful because
they have no clear theory to back them up.
