VAR is not a Good Risk Measure
Value at Risk (VAR) has taken a lot of heat lately, and deservedly so.
VAR, as banks are required to calculate it, relies solely on recent past data for calibration. The use of "recent" data means that following any period of low losses, the VAR measure will show low risk. That is just not the case: it fails to recognize the longer-term volatility that may exist. In other words, if there are problems with a periodicity longer than the usual one-year time frame of VAR, then VAR will ignore them most of the time and overemphasize them some of the time. It is like the stopped clock that is right twice a day, except that VAR might never be right.
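A small sketch can illustrate the lookback problem. All the numbers below are hypothetical: two simulated years of daily P&L, one calm and one turbulent, with a simple 99% historical VaR taken as the 1st percentile of the P&L history. Calibrated on the calm year alone, the measure reports far less risk than the full history would suggest.

```python
import numpy as np

# Illustrative data only: a calm year followed by a turbulent year of daily P&L.
rng = np.random.default_rng(7)
calm = rng.normal(loc=0.0, scale=0.5, size=250)       # low-volatility regime
turbulent = rng.normal(loc=0.0, scale=2.5, size=250)  # high-volatility regime

def var_99(pnl):
    """99% historical VaR: the loss exceeded on only 1% of past days,
    reported as a positive number."""
    return -np.percentile(pnl, 1)

# Calibrated on the calm year alone, VaR looks small ...
print(f"VaR calibrated on calm year only: {var_99(calm):.2f}")
# ... even though the longer history shows much larger risk.
print(f"VaR calibrated on both years:     {var_99(np.concatenate([calm, turbulent])):.2f}")
```

The point is not the specific numbers but the shape of the result: a one-year window calibrated after a quiet period simply cannot see risk with a longer periodicity.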
Risk models can be calibrated to history (long term or short term), to future expectations (long term or short term), or to assumptions consistent with market prices (spot or over some period of time). Of those six choices, VAR is calibrated from one of the least useful.
What VAR does is answer the question: what would the 1-in-100 loss have been had I held the current risk positions over the past year? The advantage of this definition is that you can be sure of consistency. However, the result is only consistently useful if you believe the world will always remain exactly as risky as it was in the past year.
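The calculation described above (historical-simulation VaR) can be sketched in a few lines. The P&L series here is hypothetical stand-in data; in practice it would come from revaluing today's positions against the past year's market moves.

```python
import numpy as np

# Hypothetical daily P&L (in $M) for the current positions, revalued
# over roughly one trading year (250 days) -- illustrative data only.
rng = np.random.default_rng(42)
daily_pnl = rng.normal(loc=0.0, scale=1.5, size=250)

def historical_var(pnl, confidence=0.99):
    """1-in-100 (99%) historical-simulation VaR: the loss that was
    exceeded on only (1 - confidence) of the past days."""
    # Losses are negative P&L; VaR is quoted as a positive number.
    return -np.percentile(pnl, 100 * (1 - confidence))

var_99 = historical_var(daily_pnl)
print(f"99% one-day historical VaR: ${var_99:.2f}M")
```

Note how the output depends entirely on the window supplied: feed it a quiet year and it reports a quiet risk number, which is exactly the criticism above.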
If you believe in equilibrium, then of course the next year will look much like the last, and risk management becomes a small task of keeping things in line with past variability. But the world and the markets show no fundamental belief in equilibrium. Some years are riskier than others.
So VAR is not a Good Risk Measure if it is taken alone. Used alongside other, more forward-looking risk measures, it can be part of a suite of information that leads to good risk management.
On the other hand, if you divorce the idea of VAR from its actual implementation in the banks, then you can conclude that VAR is not a Bad Risk Measure.