Risk isn’t just about numbers. It’s also about human nature. Financial institutions must have the courage to resist the sense of overconfidence that comes with surviving the recent “near-disaster” in the financial system and the year-long bull market that’s followed.
This sentiment was a constant theme at last Wednesday's panel discussion on current risk management challenges in the OTC derivatives market, held at the Microsoft New York office and featuring a stellar line-up of speakers. The panel's consensus was that while it's important to make sure your financial modeling (the "science") is keeping pace with the complex instruments in use today, it's just as important to pay attention to the "art" of risk management.
Left to right: Serguei Issakov (SVP Quantitative R&D, Numerix), Adam Litke (Chief Risk Strategist, Bloomberg), David Askin (Co-Founder and Head of BVAL, Bloomberg), Sanjay Sharma (CRO – Global Arbitrage & Trading, RBC Capital Markets).
On the science side…
There's simply no choice but to rely on pricing models to interpolate or extrapolate from observable data, given that roughly 98% of the world's OTC trades won't trade today and won't trade tomorrow. This raises two issues: making sure you're using the appropriate model, and then applying old-fashioned common sense.
Moderator Gillian Christie (Director, Deloitte Consulting) enlightened and entertained
The first step is to make sure your model fits the characteristics of the individual instrument rather than taking a "one-size-fits-all" approach. For example, if the value of a particular trade depends heavily on the volatility smile, you should be using a stochastic volatility model that can calibrate to the entire volatility surface, and possibly even a model that simulates jumps in the underlying asset. This clearly won't be appropriate for all trades, but it's important to have the modeling flexibility to use the right tool for the job.
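To make the "right tool for the job" point concrete, here is a minimal Monte Carlo sketch comparing a plain lognormal (Black-Scholes-style) model with a Merton-style jump-diffusion when pricing an out-of-the-money option. All parameter values (`jump_intensity`, `jump_mean`, `jump_std`, and the trade itself) are purely illustrative, not taken from the panel; the point is only that adding jumps fattens the tails and changes the value of smile-sensitive trades.

```python
import numpy as np

def mc_call_price(s0, k, r, t, sigma, n_paths=200_000,
                  jump_intensity=0.0, jump_mean=-0.10, jump_std=0.15, seed=42):
    """Monte Carlo price of a European call under lognormal dynamics,
    optionally adding Merton-style lognormal jumps. Illustrative only."""
    rng = np.random.default_rng(seed)
    # Diffusion part of the terminal log-return
    z = rng.standard_normal(n_paths)
    log_ret = (r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z
    if jump_intensity > 0:
        # Poisson number of jumps per path; each jump size is Normal in log space
        n_jumps = rng.poisson(jump_intensity * t, n_paths)
        jump_sum = (jump_mean * n_jumps
                    + jump_std * np.sqrt(n_jumps) * rng.standard_normal(n_paths))
        # Compensator so the jump component adds no drift (risk-neutral)
        kappa = np.exp(jump_mean + 0.5 * jump_std**2) - 1.0
        log_ret += jump_sum - jump_intensity * kappa * t
    st = s0 * np.exp(log_ret)
    payoff = np.maximum(st - k, 0.0)
    return float(np.exp(-r * t) * payoff.mean())

# Out-of-the-money call, where tail behavior matters most
bs = mc_call_price(100, 120, 0.02, 1.0, 0.20)                      # no jumps
jd = mc_call_price(100, 120, 0.02, 1.0, 0.20, jump_intensity=0.5)  # with jumps
```

A one-size-fits-all lognormal model would report the lower of the two values for every trade; the jump model prices the out-of-the-money tail noticeably higher, which is exactly the kind of difference a smile-sensitive book needs captured.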
For risk reporting, a single number like Value-at-Risk (VaR) may be helpful in certain circumstances, but it's good practice to use the full Monte Carlo distribution for day-to-day risk management. One of the hot topics in recent quantitative research is the development of a one-step Monte Carlo for structured products, enabling Monte Carlo VaR, potential future exposure (PFE), and credit valuation adjustment (CVA) computations that take roughly the same amount of time as pricing calculations.
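The "full distribution, not a single number" idea can be sketched in a few lines. This is not the one-step Monte Carlo technique the panel mentions, just a minimal illustration of extracting VaR and a tail average (expected shortfall) from the same simulated P&L vector; the fat-tailed Student-t P&L and the scale are invented for the example.

```python
import numpy as np

def mc_var_es(pnl, level=0.99):
    """VaR and expected shortfall from a simulated P&L distribution.

    pnl: array of simulated one-day profit/loss values (losses negative).
    """
    pnl = np.sort(np.asarray(pnl))
    # VaR: loss threshold exceeded with probability (1 - level)
    var = -np.quantile(pnl, 1.0 - level)
    # Expected shortfall: average loss in the tail beyond the VaR threshold
    tail = pnl[pnl <= -var]
    es = -tail.mean()
    return var, es

rng = np.random.default_rng(7)
# Illustrative fat-tailed one-day P&L in dollars (Student-t, 100k scenarios)
pnl = 100_000 * rng.standard_t(df=4, size=100_000)
var99, es99 = mc_var_es(pnl, 0.99)
```

With the full distribution in hand, VaR at any confidence level, expected shortfall, and stress percentiles all come from the same simulation, whereas a single pre-computed VaR number answers only one question.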
But modeling should just be the starting point…
Mr. Askin recounted an example from the '80s, when he was handed the challenge of modeling mortgage prepayment risk before anyone had really dealt with it. After using all kinds of models to arrive at a solid computed risk number, it still felt low to him, so he just tripled it. His number still turned out to be conservative. In a world governed so much by precise calculations, there is still an important role for gut instinct.
The corollary is to make sure computations (and processes) are governed by common sense. It's possible to have a well-documented process that is still bad on its merits. For example, avoid over-bucketing to simplify valuations: underlying pools may look similar, but it's still important to evaluate each pool on its individual characteristics.
Looking toward a decade of uncertainty
In a topic as wide-ranging as "challenges in risk management," keep this in mind: given the trillions of dollars committed globally to the latest financial rescue, we are truly in uncharted waters, and nothing over the next 10 years is predictable. Beyond some obvious winners among the clearinghouses, who wins and who loses will depend on who can adapt the fastest to making transparency and flexibility an integral part of their processes.
To learn about Numerix solutions for risk managers, click here.