Autoregressive Models
Definition
Autoregressive (AR) models are time series models that predict a variable using a linear combination of its own past values. They assume that past observations contain information useful for forecasting future values.
Key takeaways
- AR models forecast future values from past values of the same series.
- Common forms are AR(1), AR(2), … AR(p), where p is the number of lagged terms.
- They work well in stable systems but can fail when fundamental conditions change suddenly.
- AR models can be extended (for example, ARIMA) or combined with other methods for better performance.
How they work
An AR(p) model expresses the current value x_t as:
x_t = c + ϕ_1 x_{t−1} + ϕ_2 x_{t−2} + … + ϕ_p x_{t−p} + ε_t
where c is a constant, the ϕ_i are coefficients, and ε_t is white noise (unpredictable error).
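The recursion above can be sketched directly in code. This is a minimal NumPy simulation of an AR(p) process (the function name `simulate_ar` and the zero initial conditions are illustrative choices, not from the original text):

```python
import numpy as np

def simulate_ar(coeffs, c=0.0, n=200, sigma=1.0, seed=0):
    """Simulate an AR(p) series: x_t = c + sum(phi_i * x_{t-i}) + eps_t."""
    rng = np.random.default_rng(seed)
    p = len(coeffs)
    x = np.zeros(n + p)  # the first p entries act as zero initial conditions
    for t in range(p, n + p):
        lagged = x[t - p:t][::-1]          # x_{t-1}, x_{t-2}, ..., x_{t-p}
        x[t] = c + np.dot(coeffs, lagged) + rng.normal(0.0, sigma)
    return x[p:]

# A stationary AR(2) example with phi_1 = 0.6, phi_2 = 0.3
series = simulate_ar([0.6, 0.3], c=1.0)
```

Because ϕ_1 + ϕ_2 < 1 here, the simulated series fluctuates around a stable long-run mean rather than drifting off.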
Special cases:
* AR(0): no dependence on past values (pure white noise).
* AR(1): current value depends only on the immediately preceding value.
* AR(2): depends on the two most recent past values.
Estimating coefficients is typically done by least squares or maximum likelihood. Model selection (choosing p) often uses information criteria such as AIC or BIC. Stationarity is an important assumption—if the series is nonstationary, differencing or other transformations (e.g., the “I” in ARIMA) are used.
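The least-squares estimation mentioned above can be sketched with plain NumPy: regress x_t on a constant and its p lags. The helper name `fit_ar_ols` and the simulated test series are assumptions for illustration:

```python
import numpy as np

def fit_ar_ols(x, p):
    """Estimate AR(p) intercept and coefficients by ordinary least squares."""
    x = np.asarray(x, dtype=float)
    # Each design-matrix row holds [1, x_{t-1}, ..., x_{t-p}]
    rows = [np.concatenate(([1.0], x[t - p:t][::-1])) for t in range(p, len(x))]
    X = np.array(rows)
    y = x[p:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[0], beta[1:]   # intercept c, coefficients phi_1..phi_p

# Simulate an AR(1) with c = 2.0 and phi_1 = 0.7, then recover the parameters
rng = np.random.default_rng(1)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 2.0 + 0.7 * x[t - 1] + rng.normal()

c_hat, phi_hat = fit_ar_ols(x, p=1)  # estimates should land near 2.0 and 0.7
```

In practice, comparing fits at several values of p with AIC or BIC (which penalize extra lags) guards against overfitting.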
Extensions and related models
- ARIMA (Autoregressive Integrated Moving Average): adds differencing (integration) to remove trends and moving-average components to model autocorrelated errors; seasonal patterns call for the seasonal extension (SARIMA).
- ARMA: combines autoregressive and moving-average terms for stationary series.
- Seasonal and multivariate variants handle periodic patterns and multiple interrelated series.
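The "integration" step in ARIMA is just differencing. A minimal sketch (the trend slope of 0.5 is an illustrative assumption): a series with a linear trend is nonstationary, but its first differences fluctuate around a constant, the trend slope.

```python
import numpy as np

# A trending, nonstationary series: linear trend (slope 0.5) plus noise
rng = np.random.default_rng(0)
trend_series = 0.5 * np.arange(300) + rng.normal(0.0, 1.0, 300)

# First differencing (the "I" in ARIMA) removes the linear trend;
# the differenced series is stationary around the slope value
diffed = np.diff(trend_series)
```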
Example: financial markets and structural shifts
A trader using an AR model to forecast a stock price assumes recent prices reflect the forces that will continue to act. This works in stable periods. But during disruptive events—such as the 2008 financial crisis—fundamental risks that were previously ignored suddenly dominated investor decisions. Prices revalued rapidly in ways not predicted by past patterns, demonstrating how AR models can be confounded by regime changes or one-time shocks.
Notably, a shock in an AR model propagates into all future forecasts: in a stationary model it decays geometrically but never vanishes entirely, and with a unit root it persists undiminished, unless the model specification or data transformations account for structural breaks.
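The persistence of a shock is easy to see from an AR(1) impulse response: a unit shock at time 0 contributes ϕ^t to the value t steps later. A minimal sketch with an assumed ϕ = 0.9:

```python
# Impulse response of an AR(1) with phi = 0.9: a unit shock at t = 0
# contributes phi**t at horizon t -- it decays but never reaches zero.
phi = 0.9
impulse_response = [phi ** t for t in range(20)]

# Even 19 steps later, roughly 13% of the shock remains (0.9**19 ~ 0.135)
remaining = impulse_response[-1]
```

With ϕ = 1 (a unit root), every term would equal 1 and the shock would never fade at all.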
Limitations and when they fail
- Reliance on historical patterns: AR models implicitly assume future behavior mirrors past behavior.
- Sensitivity to nonstationarity and structural breaks.
- Linear by design: they may not capture nonlinear dynamics unless extended or combined with nonlinear techniques.
- Poor performance during regime changes, crises, or technological disruptions.
Practical usage tips
- Check stationarity (e.g., with the augmented Dickey–Fuller test) and difference the data if needed.
- Examine autocorrelation (ACF) and partial autocorrelation (PACF) plots to choose lags.
- Validate models on out-of-sample data and monitor residuals for remaining structure.
- Combine AR-based signals with other information (fundamental analysis, exogenous variables) when appropriate.
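The autocorrelation check in the tips above can be sketched without plotting libraries: compute the sample ACF directly and look for geometric decay. The helper name `sample_acf` and the AR(1) test series are illustrative assumptions:

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelation of x at lags 0..max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(max_lag + 1)])

# For an AR(1) with phi = 0.7, the ACF should decay roughly as 0.7**k
rng = np.random.default_rng(2)
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = 0.7 * x[t - 1] + rng.normal()

acf = sample_acf(x, 5)
```

A slowly decaying ACF combined with a PACF that cuts off after lag p is the classic signature used to pick the AR order.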
Explain like I’m five
An autoregressive model guesses tomorrow’s value by looking at the recent past of the same thing—like predicting tomorrow’s temperature from the last few days. It works when things change slowly, but if something big happens, the guess might be wrong.
Common FAQs
Q: Are autoregressive models only linear?
A: Standard AR models are linear, but there are nonlinear extensions and hybrid approaches that capture more complex behavior.
Q: What is ARIMA?
A: ARIMA augments AR models with differencing (to remove trends) and moving-average terms (to model autocorrelated errors), making it useful for nonstationary series.
Q: Can AR models predict during crises?
A: They often struggle during crises or sudden structural changes because past patterns may no longer apply.
Bottom line
Autoregressive models are a foundational, interpretable tool for forecasting time series by leveraging past values. They perform well in stable environments and serve as building blocks for more sophisticated models (like ARIMA). However, analysts must be cautious about stationarity, structural breaks, and regime changes—and should validate and combine AR methods with other approaches when necessary.