Econometrics Project
Econometrics II Project 2
October 30, 2013
Robby Barbieri, Kevin Caughman, Benedikt Holt
In this exercise we used the EViews program SIMARMA.PRG to simulate ARMA time series. With 89109 as our seed we generated 500 observations and then analyzed the resulting data using AR(1), MA(1), ARMA(1,1), and ARMA(2,2) specifications.
For the autoregressive process with a sample size of 500 we used a phi of 0.8 to provide a strong link between today's value and yesterday's. Based on this information we derived the following correlogram and reported the results in Appendix A.
As you can see from the ACF and PACF, the lag-1 statistic almost perfectly matches our phi of 0.8. In addition, the autocorrelation is significant and decreases as the lag increases. The partial autocorrelation is roughly equal to phi at lag 1 but quickly falls to around zero thereafter.
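The AR(1) behavior described above can be reproduced outside EViews. Below is a minimal sketch in Python with numpy (not the authors' SIMARMA.PRG); the seed 89109 is reused for illustration only and will not reproduce the EViews draws, but the lag-1 sample autocorrelation should still come out close to phi = 0.8.

```python
import numpy as np

def sample_acf(x, lag):
    """Sample autocorrelation of a series at the given lag."""
    x = np.asarray(x) - np.mean(x)
    return np.dot(x[lag:], x[:-lag]) / np.dot(x, x)

rng = np.random.default_rng(89109)  # seed reused for illustration only
n, phi = 500, 0.8
eps = rng.standard_normal(n)

# AR(1): y_t = phi * y_{t-1} + eps_t
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]

# The lag-1 sample ACF estimates phi; higher lags decay roughly like phi**k
print(round(sample_acf(y, 1), 2))
```

Because the ACF of an AR(1) decays geometrically at rate phi, the correlogram shows the slowly declining, significant autocorrelations reported above.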
For the moving average process with a sample size of 500 we used a theta of 0.8, so the observed series was a weighted (moving) average of white noise shocks. Our MA(1) correlogram yielded significantly different results from the AR(1)'s.
Because the data simply reflect white noise shocks, we observed significant autocorrelation only at lag 1. The partial autocorrelation shows an oscillating decay that is characteristic of the moving average process. In addition to these two observations, we noticed that the MA(1) coefficient from the least squares regression in Appendix A, Exhibit 3, matches our theta of 0.8.
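The MA(1) cutoff at lag 1 can be checked with the same kind of sketch, again in Python with numpy rather than EViews (illustrative only; the seed will not match the EViews draws). Note that for an MA(1) the theoretical lag-1 autocorrelation is theta / (1 + theta^2), about 0.49 when theta = 0.8, while the estimated MA coefficient itself is what recovers theta.

```python
import numpy as np

def sample_acf(x, lag):
    """Sample autocorrelation of a series at the given lag."""
    x = np.asarray(x) - np.mean(x)
    return np.dot(x[lag:], x[:-lag]) / np.dot(x, x)

rng = np.random.default_rng(89109)  # seed reused for illustration only
n, theta = 500, 0.8

# MA(1): y_t = eps_t + theta * eps_{t-1}
eps = rng.standard_normal(n + 1)
y = eps[1:] + theta * eps[:-1]

# Lag-1 ACF should be near theta/(1+theta**2) ~ 0.49; lag 2 near zero
print(round(sample_acf(y, 1), 2), round(sample_acf(y, 2), 2))
```

The sharp drop to zero after lag 1 is the MA(1) signature seen in the correlogram, in contrast to the slow geometric decay of the AR(1).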
When analyzing the data using ARMA(1,1) and ARMA(2,2) specifications we discovered some interesting results. As you can see below, the ARMA(1,1) and ARMA(2,2) correlograms had qualities of both the AR and MA processes.
ARMA(1,1) exhibited the significant but decreasing autocorrelation of the AR(1), while also following the same oscillating pattern in the partial autocorrelation as the MA(1). When examining the least squares regression of these data we found that the AR and MA coefficients both match the phi and theta of 0.8.
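The mixed behavior of the ARMA(1,1) can be sketched the same way, again in illustrative Python with numpy (not the EViews run, and the seed will not reproduce those draws). With phi = theta = 0.8, the theoretical lag-1 autocorrelation is close to 0.9, and subsequent autocorrelations decay geometrically at rate phi, which is the AR-like pattern noted above.

```python
import numpy as np

def sample_acf(x, lag):
    """Sample autocorrelation of a series at the given lag."""
    x = np.asarray(x) - np.mean(x)
    return np.dot(x[lag:], x[:-lag]) / np.dot(x, x)

rng = np.random.default_rng(89109)  # seed reused for illustration only
n, phi, theta = 500, 0.8, 0.8
eps = rng.standard_normal(n)

# ARMA(1,1): y_t = phi * y_{t-1} + eps_t + theta * eps_{t-1}
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t] + theta * eps[t - 1]

# ACF starts high and then decays geometrically at rate phi, like an AR(1)
print([round(sample_acf(y, k), 2) for k in (1, 2, 3)])
```

The first autocorrelation is inflated by the MA term, while the decay pattern from lag 1 onward is governed by the AR term, which is why the correlogram shows features of both processes.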
The ARMA(2,2) estimation, by contrast, yielded significantly different results from the ARMA(1,1).