Demand Forecasting Lecture Notes PDF
Document Details
Uploaded by UndauntedSaxhorn
Horus University
2014
Dr. Aliaa Ali Farag
Summary
This document contains lecture notes on demand forecasting for a business administration course, with practical exercises throughout. It covers the weighted moving average, exponential smoothing, and associative forecasting (linear regression).
Full Transcript
Forecasting 2
Dr. Aliaa Ali Farag, Lecturer at Faculty of Business Administration – Horus University
Chapter 2 - Lecture 4
© 2014 Pearson Education, Inc.

Weighted Moving Average
It is similar to a moving average, except that it assigns more weight to the most recent values in a time series.

Case 1: If weights are numbers

WEIGHTS APPLIED   PERIOD
3                 Last month
2                 Two months ago
1                 Three months ago
6                 Sum of the weights

Forecast for this month =
[3 x Sales last mo. + 2 x Sales 2 mos. ago + 1 x Sales 3 mos. ago] / Sum of the weights

MONTH       ACTUAL SHED SALES   3-MONTH WEIGHTED MOVING AVERAGE
January     10
February    12
March       13
April       16                  [(3 x 13) + (2 x 12) + (1 x 10)]/6 = 12 1/6
May         19                  [(3 x 16) + (2 x 13) + (1 x 12)]/6 = 14 1/3
June        23                  [(3 x 19) + (2 x 16) + (1 x 13)]/6 = 17
July        26                  [(3 x 23) + (2 x 19) + (1 x 16)]/6 = 20 1/2
August      30                  [(3 x 26) + (2 x 23) + (1 x 19)]/6 = 23 5/6
September   28                  [(3 x 30) + (2 x 26) + (1 x 23)]/6 = 27 1/2
October     18                  [(3 x 28) + (2 x 30) + (1 x 26)]/6 = 28 1/3
November    16                  [(3 x 18) + (2 x 28) + (1 x 30)]/6 = 23 1/3
December    14                  [(3 x 16) + (2 x 18) + (1 x 28)]/6 = 18 2/3

Exercise
▪ John’s House of Pancakes uses a weighted moving average method to forecast pancake sales. It assigns a weight of 5 to the previous month’s demand, 3 to demand two months ago, and 1 to demand three months ago. If sales amounted to 500 pancakes in April, 1,000 pancakes in May, 2,200 pancakes in June, and 3,000 pancakes in July, what should be the forecast for August?
a. 2400   b. 2511   c. 2067   d. 3767   e. 1622

Case 2: If weights are percentages
Compute the weighted moving average forecast for period 8 using weights of 0.5, 0.3, and 0.2.

Period   Demand
1        12
2        15
3        11
4        9
5        10
6        8
7        14
8        12

3-period weighted moving average forecast for period 8 = (0.5 x 14) + (0.3 x 8) + (0.2 x 10) = 11.4

Problem
Assuming W1 = 0.50, W2 = 0.30, and W3 = 0.20, use the weighted moving average method to forecast arrivals for month 5.

Month   Customer arrivals
1       800
2       740
3       810
4       790

F5 = W1 D4 + W2 D3 + W3 D2 = 0.50(790) + 0.30(810) + 0.20(740) = 786
The forecast for month 5 is 786 customer arrivals.
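The same arithmetic can be written as a short program. The following Python sketch is added purely for illustration (the function name and data layout are assumptions, not part of the lecture); it reproduces the April shed-sales forecast and the pancake exercise above.

def weighted_moving_average(demand, weights):
    # weights[0] applies to the most recent period, weights[1] to the one before, and so on
    recent = demand[-len(weights):][::-1]    # most recent observation first
    return sum(w * d for w, d in zip(weights, recent)) / sum(weights)

shed_sales = [10, 12, 13]                               # January-March actual sales
print(weighted_moving_average(shed_sales, [3, 2, 1]))   # April forecast: 12.166... (12 1/6)

pancakes = [500, 1000, 2200, 3000]                      # April-July pancake sales
print(weighted_moving_average(pancakes, [5, 3, 1]))     # August forecast: about 2511

The same function handles Case 2: weighted_moving_average([12, 15, 11, 9, 10, 8, 14], [0.5, 0.3, 0.2]) gives the 11.4 forecast for period 8, since the percentage weights already sum to 1.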
Exponential Smoothing
▪ Exponential smoothing is a time series forecasting technique often used to predict future data points based on past observations. It is particularly useful in fields like inventory management, sales forecasting, and economic data analysis. The method “smooths” out fluctuations in the data to create a more stable forecast, helping to predict trends and patterns.

In exponential smoothing, the smoothing constant α (ranging between 0 and 1) determines how sensitive the forecast is to recent changes in the actual data. Here is a breakdown of how it works:
1. Higher α values (close to 1): A high α means that the forecast places more weight on recent observations, making it more “responsive” or sensitive to recent changes in the actual data. When α is high, the forecast will adjust quickly if there is a sudden change in demand. This is useful in environments where demand fluctuates rapidly because it allows the forecast to “follow” recent trends closely. However, it also makes the forecast more volatile, as it may overreact to small changes.
2. Lower α values (close to 0): A low α means that the forecast relies more on the historical trend and less on the most recent data, making it less responsive to short-term fluctuations. Lower α values produce a “smoother” forecast that is more stable and less reactive to sudden changes in demand. This approach is useful when demand is relatively stable or when short-term fluctuations are insignificant, as it minimizes the influence of random noise.
In summary: a higher α creates a more responsive forecast, adapting to recent trends but potentially overreacting to random noise; a lower α creates a smoother, more stable forecast, which is beneficial for stable demand patterns but may lag behind if a genuine trend shift occurs. The choice of α depends on the nature of the data and the relative importance of responsiveness versus stability.

Under exponential smoothing, each new forecast is based on the previous forecast plus a percentage of the difference between that forecast and the actual demand.
⚫ Requires only three items of data:
◆ The last period’s forecast
◆ The demand for this period
◆ A smoothing parameter, alpha (α), where 0 ≤ α ≤ 1.0

Ft+1 = Ft + α(Dt – Ft)   or, equivalently,   Ft+1 = αDt + (1 – α)Ft
where
Ft+1 = new forecast
Ft = current or previous period’s forecast
α = smoothing (or weighting) constant (0 ≤ α ≤ 1)
Dt (or At) = previous period’s actual demand

Example
▪ In January, a car dealer predicted February demand of 142 units. Actual February demand was 153 units. Using a smoothing constant of α = 0.20, we can forecast March demand as follows:
FMarch = 142 + 0.20(153 – 142) = 144.2 units

Notes: Exponential Smoothing
▪ Exponential smoothing is simple and requires minimal data
▪ When α = 0, Ft+1 = Ft
▪ When α = 1, Ft+1 = Dt

Exercise
▪ Given the following:
▪ Actual demand in June = 160 units
▪ June forecast = 180 units, July forecast = 172 units
▪ The value of the smoothing constant is …?
172 = 180 + α(160 – 180)
–8 = –20α
α = 0.4

Exercise
▪ Which of the following smoothing constants would make an exponential smoothing forecast equivalent to a naive forecast?
A) 0   B) .01   C) .1   D) .5   E) 1.0

Exercise
▪ Simple exponential smoothing is being used to forecast demand. The previous forecast of 66 turned out to be four units less than actual demand. The next forecast is 66.6, implying a smoothing constant, alpha, equal to:
A) .01   B) .10   C) .15   D) .20   E) .60
66.6 = 66 + α(70 – 66)
0.6 = 4α
α = 0.15
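A minimal Python sketch (added for illustration; the function and variable names are assumptions, not from the lecture) of the smoothing formula Ft+1 = Ft + α(Dt – Ft), reproducing the car-dealer example and the June/July exercise above.

def exponential_smoothing(prev_forecast, actual_demand, alpha):
    # F(t+1) = F(t) + alpha * (D(t) - F(t))
    return prev_forecast + alpha * (actual_demand - prev_forecast)

# Car dealer: February forecast 142, actual February demand 153, alpha = 0.20
march_forecast = exponential_smoothing(142, 153, 0.20)
print(round(march_forecast, 1))        # 144.2

# June/July exercise: solve 172 = 180 + alpha * (160 - 180) for alpha
alpha = (172 - 180) / (160 - 180)
print(alpha)                           # 0.4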
Associative Forecasting
Associative forecasting is a forecasting method that uses associative models to predict future values based on the relationships between variables. Unlike time series forecasting, which relies purely on historical patterns, associative forecasting seeks to identify the factors that influence the outcome and uses these variables to make predictions.
▪ Used when changes in one or more independent variables can be used to predict the changes in the dependent variable
▪ The most common technique is linear regression analysis

Linear Regression
Linear regression is a statistical technique used to model and analyze the relationship between two or more variables by fitting a linear equation to observed data. In its simplest form, linear regression estimates the relationship between a dependent variable (also called the outcome or target variable) and one independent variable (the predictor). When more than one predictor is used, it is known as multiple linear regression.

Associative (Causal) Methods
Causal methods are used when historical data are available and the relationship between the factor to be forecasted and other external or internal factors can be identified.
Linear regression: a causal method in which one variable (the dependent variable) is related to one or more independent variables by a linear equation.
Dependent variable: the variable that one wants to forecast.
Independent variables: variables that are assumed to affect the dependent variable and thereby “cause” the results observed in the past.

Independent & Dependent Variable
Independent variable: the variable that is manipulated either by the researcher or by nature or circumstance.
▪ Also called a “stimulus,” “input,” or “predictor” variable.
Dependent variable: a variable that is observed or measured, and that is influenced or changed by the independent variable.
▪ Also called a “response,” “output,” “criterion,” or “predicted” variable.
The two are analogous to a cause-and-effect relationship.

Linear Regression
(Diagram: factors associated with our sales. Independent variables such as advertising, pricing, competitors, the economy, and timing feed into the dependent variable, sales.)
⚫ A dependent variable is related to one or more independent variables by a linear equation
⚫ The independent variables are assumed to “cause” the results observed in the past
⚫ The simple linear regression model is a straight line:
Y = a + bX
where
Y = dependent variable
X = independent variable
a = Y-intercept of the line (the value of Y when X = 0)
b = slope of the line

Least Squares Method
▪ If your data show a linear relationship between the X and Y variables, you will want to find the line that best fits that relationship.
▪ That line is called a regression line and has the equation ŷ = a + bx.
▪ The least squares regression line is the line that makes the vertical distances from the data points to the regression line as small as possible.
▪ It is called “least squares” because the best-fitting line is the one that minimizes the sum of the squared errors (deviations).
(Figure 4.4: the actual observations deviate from the trend line ŷ = a + bx; the least squares method minimizes the sum of these squared deviations over the time periods.)
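Before the worked example, here is a minimal Python sketch (illustrative only; the function name is an assumption, not from the slides) of the least-squares formulas b = (n∑XY – ∑X∑Y) / (n∑X² – (∑X)²) and a = Ȳ – bX̄.

def least_squares(x, y):
    # Fit y = a + b*x by the least squares method
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi * xi for xi in x)
    b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    a = sum_y / n - b * sum_x / n      # a = mean(y) - b * mean(x)
    return a, b

# Sales data from the example that follows (X = 1 for 2013, ..., X = 7 for 2019)
a, b = least_squares([1, 2, 3, 4, 5, 6, 7], [10, 8, 7, 9, 12, 14, 11])
print(round(a, 2), round(b, 2))        # 7.29 0.71 (the slides round a to 7.3)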
Finding the Linear Regression Equation
▪ We will use the following table to determine the values of both a and b, the coefficients in the regression equation.

Example: Given the following information, develop a sales forecast for years 2020 and 2022.

Year    2013  2014  2015  2016  2017  2018  2019
Sales   10    8     7     9     12    14    11

Solution

Year   X      Y      XY       X²       Y²
2013   1      10     10       1        100
2014   2      8      16       4        64
2015   3      7      21       9        49
2016   4      9      36       16       81
2017   5      12     60       25       144
2018   6      14     84       36       196
2019   7      11     77       49       121
SUM    ∑X=28  ∑Y=71  ∑XY=304  ∑X²=140  ∑Y²=755
Averages: X̄ = 28/7 = 4, Ȳ = 71/7 = 10.14

Different methods to calculate the regression equation

Method 1:
b = (∑XY – nX̄Ȳ) / (∑X² – nX̄²) = (304 – (7)(4)(10.14)) / (140 – 7(4)²) = 0.71
a = Ȳ – bX̄ = 10.14 – 0.71(4) = 7.3
Y = 7.3 + 0.71(X)

Method 2:
b = (n∑XY – ∑X∑Y) / (n∑X² – (∑X)²) = (7(304) – (28)(71)) / (7(140) – 28²) = 0.71
a = Ȳ – bX̄ = 10.14 – 0.71(4) = 7.3
Y = 7.3 + 0.71(X)

Method 3:
b = Var.XY / Var.X², where Var.XY = ∑XY – CF XY, CF XY = ∑X(Ȳ), Var.X² = ∑X² – CF X², CF X² = ∑X(X̄)
CF XY = 28(10.14) = 283.9      Var.XY = 304 – 283.9 = 20.1
CF X² = 28(4) = 112            Var.X² = 140 – 112 = 28
b = 20.1 / 28 = 0.71
Y = 7.3 + 0.71(X)

Method 4: Using two equations and solving them algebraically
∑Y = na + b∑X      →  71 = 7a + b(28)
∑XY = a∑X + b∑X²   →  304 = 28a + b(140)
Dividing the second equation by 4:  76 = 7a + b(35)
Subtracting the first equation:  5 = 7b, so b = 0.71 and a = 7.3

Forecasts, using Y = 7.3 + 0.71(X):
Forecast for year 2020 (X = 8):  Y = 7.3 + 0.71(8) = 12.98
Forecast for year 2022 (X = 10): Y = 7.3 + 0.71(10) = 14.4
Forecast for year 2015 (X = 3):  Y = 7.3 + 0.71(3) = 9.43

Correlation
► How strong is the linear relationship between the variables?
► The coefficient of correlation, r, measures the degree of association
► Values range from –1 to +1
(Figure 4.10: scatter plots showing (a) perfect negative, (b) negative, (c) no, (d) positive, and (e) perfect positive correlation, with the correlation coefficient scale running from –1.0 (high) through 0 (low) to +1.0 (high).)

Coefficient of Determination
► The coefficient of determination, r², measures the percent of the change in y predicted by the change in x
► Values range from 0 to 1
► Easy to interpret

Computation of r and r²
Using the column totals from the solution table above:
r = [n∑XY – ∑X∑Y] / √{[n∑X² – (∑X)²][n∑Y² – (∑Y)²]} = [7(304) – (28)(71)] / √{[7(140) – 28²][7(755) – 71²]} = 0.64
r² = 0.41

Alternatively, r² = (b × Var.XY) / Var.Y², where Var.Y² = ∑Y² – CF Y² and CF Y² = ∑Y(Ȳ):
CF Y² = 71(10.14) = 719.94
Var.Y² = 755 – 719.94 = 35.06
r² = (0.71 × 20.1) / 35.06 = 0.41, so r = 0.64

Best Luck
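Finally, a short Python sketch (illustrative only; not part of the lecture) that reproduces the r and r² values computed above from the same column totals.

import math

def correlation(x, y):
    # Pearson coefficient of correlation, r
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi * xi for xi in x)
    sum_y2 = sum(yi * yi for yi in y)
    numerator = n * sum_xy - sum_x * sum_y
    denominator = math.sqrt((n * sum_x2 - sum_x ** 2) * (n * sum_y2 - sum_y ** 2))
    return numerator / denominator

r = correlation([1, 2, 3, 4, 5, 6, 7], [10, 8, 7, 9, 12, 14, 11])
print(round(r, 2), round(r ** 2, 2))   # 0.64 0.41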