
# Value at Risk

Advanced Finance - Momenian

by Mohammad Hassan Momenian

31 December 2012

#### Transcript of Value at Risk

## Risk Measures

A risk measure is a mapping from a set of random variables to the real numbers. Statistical measures that serve as historical predictors of investment risk include alpha, beta, R-squared, standard deviation, the Sharpe ratio, expected shortfall, and Value at Risk. Risk assessment is the process of determining the likelihood that a specified negative event will occur.
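As a rough illustration of these statistical measures, here is a standard-library Python sketch; the simulated return series, the 95% level, and the zero risk-free rate are arbitrary choices for the example, not values from the presentation:

```python
import random
import statistics

def risk_measures(returns, confidence=0.95, risk_free=0.0):
    """Illustrative statistical risk measures for a list of per-period returns."""
    std = statistics.stdev(returns)                       # standard deviation (volatility)
    sharpe = (statistics.mean(returns) - risk_free) / std  # Sharpe ratio (per period)
    losses = sorted(returns)                              # worst outcomes first
    cutoff = int((1 - confidence) * len(returns))         # e.g. the worst 5% of days
    var = -losses[cutoff]                                 # historical Value at Risk
    tail = losses[:cutoff + 1]                            # outcomes at least as bad as the VaR
    es = -statistics.mean(tail)                           # expected shortfall (CVaR)
    return {"std": std, "sharpe": sharpe, "VaR": var, "ES": es}

random.seed(0)
daily = [random.gauss(0.0005, 0.01) for _ in range(1000)]  # made-up daily returns
m = risk_measures(daily)
```

Note how expected shortfall averages the losses beyond the VaR cutoff, so it is always at least as large as the VaR itself.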

Related measures and tools include CVaR, loan-to-value (LTV) ratios, credit analysis, scenario analysis, simulation, and decision trees. A risk is characterized by the magnitude of the potential loss (L) and the probability (p) that the loss will occur.

## Table of Contents

- What is VaR: general description
- Measuring Value at Risk
- Various estimation issues
- Extensions

## What is VaR?

Value at Risk measures the potential loss in value of a risky asset or portfolio over a defined period for a given confidence interval. For example, the VaR on an asset might be $100 million at a one-week, 95% confidence level. Any entity can compute VaR, but it is used most often by commercial and investment banks.

A closer look:

1. VaR rests on the probability distributions of individual risks, the correlations across these risks, and the effect of such risks on value.
2. The focus on downside risk and potential losses, and the use of VaR in banks, reflects their fear of a liquidity crisis.
3. VaR at investment banks is specified in terms of market risks, but there is no reason why the risks cannot be defined more broadly or narrowly in specific contexts.

## History

"Value at Risk" was not widely used prior to the mid-1990s, though its roots lie in Harry Markowitz's portfolio theory, with its emphasis on market risk and co-movements in risks. The impetus came from crises. The Securities and Exchange Commission (SEC) required financial firms to keep their borrowings below 2000% of their equity capital. In response to the increased risk created by the advent of derivative markets and floating exchange rates in the early 1970s, financial assets were divided into twelve classes based upon risk, with different capital requirements for each, reported through Financial and Operating Combined Uniform Single (FOCUS) reports. The SEC tied the capital requirements to the losses that could be incurred, with 95% confidence, over a thirty-day interval. By the early 1990s, the trading portfolios of investment and commercial banks were becoming larger and more volatile, and many financial service firms had developed rudimentary measures of Value at Risk, with wide variations in how it was measured. In 1995, J.P. Morgan provided public access to data on the variances of and covariances across various security and asset classes that it had used internally for almost a decade to manage risk,

and allowed software makers to develop software to measure risk.

## Measuring Value at Risk

There are three broad approaches:

1. Analytically, by making assumptions about return distributions for market risks (the variance-covariance approach).
2. By running hypothetical portfolios through historical data (historical simulation).
3. By Monte Carlo simulation.

Example with one asset: mean = $100 million, standard deviation = $10 million. With 95% probability (roughly two standard deviations), the value will lie between $80 million and $120 million.

Example with 100 assets: computing the variance of this portfolio directly requires estimating 100 variances plus 4,950 covariances (n(n - 1)/2). Instead, we map the risk in the individual investments in the portfolio to more general market risks.

### General Computation

1. Map the assets to simpler, standardized instruments (difficult for complex assets).
2. Estimate the variances of each of these instruments and the covariances across the instruments.
3. Combine the weights from step 1 with the variances and covariances from step 2 to obtain the VaR.

Example: a six-month dollar/euro forward contract.

1. Map it onto a six-month zero-coupon dollar bond, a six-month zero-coupon euro bond, and the spot $/euro exchange rate. The short position in the zero-coupon dollar bond is worth $12.7 million / (1.04)^(180/360) = $12.4534 million; the long position in the zero-coupon euro bond is valued in dollar terms holding the spot rate fixed; and the spot euro position is valued in dollar terms holding the euro rate fixed.
2. Compute the variances and covariances of these instruments.
3. The daily variance of the forward contract works out to 0.0111021 (in $ millions squared), so the daily standard deviation is (0.0111021)^(1/2) = $105,367, and VaR = $105,367 * 1.65 = $173,855 at 90% confidence.

Publications by J.P. Morgan in 1996 describe

the assumptions underlying their computation of VaR:

1. The standardized return (computed as the return divided by the forecasted standard deviation) is normally distributed.
2. Large outliers are more frequent than would be expected with a normal distribution.
3. The RiskMetrics approach was therefore extended to cover normal mixture distributions, which allow higher probabilities to be assigned to outliers.

### Assessment: Weaknesses of the Variance-Covariance Approach

1. Wrong distributional assumptions (remedy: use other distributions).
2. Input error.
3. Non-stationary variables (remedies: refinements in sampling methods; allowing the standard deviation to change over time).
4. The assumed linear relationship between risk and portfolio positions breaks down for option-like securities such as convertible bonds (remedy: Quadratic Value at Risk).

## Historical Simulation

In this approach, changes in the portfolio's value over time yield all the information you need to compute the Value at Risk. Separate the daily price changes into positive and negative numbers and read the loss at the desired confidence level off the empirical distribution; no underlying assumptions of normality drive the conclusion. Each day in the time series carries an equal weight, so the approach is based on the assumption of history repeating itself.

Weaknesses:

1. The past is not prologue (the estimate is derived entirely from historical data).
2. Trends in the data.
3. New assets or market risks (all data points are weighted equally, a problem for, say, a new online business).

Modifications:

1. Weight the recent past more heavily (a probability weight based on each observation's recency).
2. Combine historical simulation with time-series models: fit a time-series model (e.g., ARMA) through the historical data and use the parameters of that model to forecast the Value at Risk.
3. Volatility updating: scale the historical number to reflect the change in volatility (e.g., with GARCH).

## Monte Carlo Simulation

Convert individual assets into positions in standardized instruments and specify probability distributions for each of the market risk factors; the distinguishing feature is the freedom you have to pick alternate distributions for the variables. Then the simulation process starts: in each run, the market risk variables take on different outcomes, and the value of the portfolio reflects those outcomes, producing a distribution of portfolio values.

A Monte Carlo simulation is only as good as the probability distributions for the inputs that are fed into it, and as the number of market risk factors increases and their co-movements become more complex, the simulations become more difficult. On the other hand, you do not have to make unrealistic assumptions about normality in returns, and the method works for any type of portfolio and is flexible enough to cover options and option-like securities.

Modifications (with 15 key rates and four possible values for each, a full pass would require 4^15 = 1,073,741,824 simulations):

1. Scenario simulation reduces the computational burden of running Monte Carlo simulations by concentrating on likely combinations of the variables.
2. Monte Carlo simulation combined with the variance-covariance method: the strength of the variance-covariance method is its speed, and the strength of Monte Carlo simulation is its flexibility, so approximations from the variance-covariance approach can guide the sampling process in the Monte Carlo simulation.

## Comparing the Approaches

- Variance-covariance: strong assumptions about the return distributions, but fast and simple.
- Historical simulation: no assumptions about the nature of return distributions, but it assumes that the data used in the simulation is a representative sample of the risks looking forward.
- Monte Carlo simulation: the most flexible, but the most demanding computationally.

If the approaches give different answers, which yields the most reliable estimate of VaR? All three are a function of their inputs. If returns are normal, historical simulation = variance-covariance

= Monte Carlo simulation; if the distributions are based entirely upon historical data, Monte Carlo = historical simulation. All of the measures have trouble capturing extreme outcomes and shifts in underlying risk. In short: for portfolios that do not include options, over very short time periods (a day or a week), the variance-covariance approach does a reasonably good job; if the Value at Risk is being computed for a risk source that is stable and for which there is substantial historical data (commodity prices, for instance), historical simulation provides good estimates; otherwise, use Monte Carlo simulation.

## Limitations of VaR

1. VaR can be wrong.
   - Return distributions: every approach makes assumptions about return distributions which, if violated, result in incorrect estimates of the Value at Risk. In reality, outliers are not only more common than expected but also much larger.
   - History may not be a good predictor: all measures rely on history to some degree, and if the sample period was a relatively stable one, VaR will be a low number and will understate the risk looking forward.
2. Narrow focus.
   - Type of risks: VaR measures the likelihood of losses to an asset or portfolio due to market risk; should it not also estimate potential profits? And what about political risks, liquidity risks, and regulatory risks?
   - Short term: the horizon is usually a day, a week, or a few weeks. Banks focused on hedging these risks on a day-to-day basis are less concerned about long-term risk exposures, as are the regulatory authorities, and VaR is easiest to estimate (with the highest quality) for short periods.
   - Absolute value: reporting a single fixed value makes VaR an attractive measure for comparing investments with very different scales and returns, but it looks at only a small slice of the risk and throws away information.
3. Suboptimal decisions: even if Value at Risk is correctly measured, making investment decisions based upon it can lead to overexposure to risk; managers may invest in riskier portfolios, and the measure can be gamed.

## Extensions of VaR

Extensions either mitigate problems associated with the original measure or extend its use from financial service firms to the rest of the market:

- Component Value at Risk: breaking

down a firm's risk exposure to different market risks lets managers determine where their risk is coming from.
- Conditional Value at Risk (related to Extreme Value Theory): defined as a weighted average of the VaR and the losses exceeding the VaR.
- Cashflow at Risk (CFaR): computed over much longer periods, and useful because a firm's value can remain relatively stable while its cash flows plummet. Non-financial firms have also extended the idea to earnings (Earnings at Risk) and to stock prices (SPaR).

In the end, VaR takes a subset of the information that comes out of scenario analysis (the close-to-worst-case scenario) or simulations (the fifth or tenth percentile of the distribution) and throws the rest of it out.

## Why Do Banks Use VaR?

The huge nominal values of their leveraged portfolios created the demand; consultants and software firms then filled in the gaps and sold the measure as the magic bullet to stop runaway risk. Bank portfolios consist primarily of marketable securities, making it easier to break risks down into market risks and compute Value at Risk. Finally, the regulatory authorities have augmented the use of the measure by demanding regular reports on Value at Risk exposure.

Thank you!
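To make the comparison between the three estimation approaches concrete, here is a minimal standard-library sketch; the simulated normal return history, the 99% level, and the z-value of 2.326 are illustrative assumptions, not figures from the presentation:

```python
import random
import statistics

random.seed(42)
CONF = 0.99   # confidence level (illustrative)
Z_99 = 2.326  # one-tailed z-value for 99% under normality

# Simulated history of daily returns (mean 0, volatility 1% per day).
history = [random.gauss(0.0, 0.01) for _ in range(2000)]

# 1. Variance-covariance: assume normality and scale the estimated volatility.
sigma = statistics.stdev(history)
var_parametric = Z_99 * sigma

# 2. Historical simulation: read the loss off the empirical 1st percentile.
var_historical = -sorted(history)[int((1 - CONF) * len(history))]

# 3. Monte Carlo: draw fresh scenarios from a fitted distribution
#    (here the same normal, so all three estimates should roughly agree).
mu = statistics.mean(history)
draws = sorted(random.gauss(mu, sigma) for _ in range(100_000))
var_monte_carlo = -draws[int((1 - CONF) * len(draws))]
```

Because the inputs here really are normal, the three estimates converge, echoing the point above; feed the same code a fat-tailed history and the parametric number would understate the historical one.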
