
Elie Ayache: "The trading of derivative products has nothing to do with probability distributions"

He was among the first volatility traders on the MATIF! Founder of ITO33 and former head of research at Dexia AM, Elie Ayache gives us his thoughts on the derivatives markets, which he defines as the technology of the future...


In the late 80s, you were one of the leading options traders on the MATIF. What do you remember of it?

I keep much more than a memory of it, because I have never stopped thinking about it. For me, the experience of trading options in the pit is the real beginning of my thinking about markets. It is both the beginning and the end, since the purpose of my thinking, and of my entire company, remains to this day to deliver tools to option traders and volatility arbitrageurs, not to theorize. Everything starts from the market, and everything should return to the market.

The derivatives market follows a logic of its own, different from that of the theory. However necessary the theory may be - it is indeed required in order to associate with the market price of a derivative product the rule of action to hedge it, and to ensure that the family of prices it generates is arbitrage-free - it is only part of the complete logic of the derivatives market.

Indeed, what I call the "theoretical episode" reduces to the following inference: if the stochastic process of the underlying is this, then the theoretical value of this derivative product is that. All the efforts of quantitative analysts - the famous quants - are spent in this theoretical derivation, including those of the authors who have successively given their names to the successive models in the literature, and the results of all the papers published on the theory of derivative pricing.
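
A minimal sketch of such a "theoretical episode", assuming the canonical Black-Scholes setting (geometric Brownian motion with constant volatility); the function name and parameter values are illustrative only:

```python
from math import log, sqrt, exp
from statistics import NormalDist

N = NormalDist().cdf  # standard normal CDF

def bs_call(S, K, T, r, sigma):
    """If the underlying follows dS = r*S*dt + sigma*S*dW (risk-neutral),
    then the theoretical value of the European call is this."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

# The inference: given the process (sigma = 20%), the value follows.
print(bs_call(S=100, K=100, T=1.0, r=0.03, sigma=0.20))
```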

By definition, none of the actors involved in this activity of theorizing and derivation has access to the full meaning of the derivatives market, which expresses itself, in turn, through the following truths:

  1. The parameters of the stochastic process are always calibrated against the market prices of derivatives, which means - by pushing the logic further than the theorist is able to push it - that these parameters themselves become stochastic under recalibration (see the sketch after this list).
  2. The theoretical model is calibrated to the prices of a few liquid derivative instruments in order to value a less liquid product, such as an exotic option. However, this new exotic product is valued only so that it may in turn be traded by its own trader, in its own market; as it becomes liquid, it acquires its autonomy and deviates from the theoretical value dictated by the model. It itself becomes a market benchmark. For example, out-of-the-money puts became so liquid after the 1987 crash that their prices today differ from the Black-Scholes model, which implies a single implied volatility for all options. Barrier options have become so liquid in the foreign exchange market that their prices no longer agree with most stochastic volatility models calibrated only to the vanillas. Variance swaps have become so liquid on the major equity indices that their prices, too, differ from the static replication formula based on vanillas.
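
To make the first axiom concrete, here is a toy recalibration loop (my own illustration: the quotes are invented, and the Black-Scholes pricer from the previous sketch is restated for completeness). Refitting the supposedly constant volatility to each day's option prices makes that parameter move from day to day, i.e. stochastic under recalibration:

```python
from math import log, sqrt, exp
from statistics import NormalDist
from scipy.optimize import minimize_scalar

N = NormalDist().cdf

def bs_call(S, K, T, r, sigma):  # Black-Scholes call, as in the first sketch
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return S * N(d1) - K * exp(-r * T) * N(d1 - sigma * sqrt(T))

def recalibrate(quotes, S, r):
    """Refit the 'constant' volatility to one day's market prices
    of vanilla options (each quote: strike, maturity, market price)."""
    sse = lambda sig: sum((bs_call(S, K, T, r, sig) - p) ** 2
                          for K, T, p in quotes)
    return minimize_scalar(sse, bounds=(0.01, 2.0), method="bounded").x

# Hypothetical quotes on two successive days: the 'constant' parameter moves.
day1 = [(100, 1.0, 9.40), (110, 1.0, 5.30)]
day2 = [(100, 1.0, 10.40), (110, 1.0, 6.20)]
print(recalibrate(day1, S=100, r=0.03), recalibrate(day2, S=100, r=0.03))
```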

These two truths, or axioms, grant first place to the market, which is the beginning and the end of any pricing tool (I no longer say "model"). In the beginning is (always) the market, and thus recalibration to market prices makes the model stochastic. And at the end is (always) the market, and thus the theoretical value of the derivative product becomes a market price outside the model. (This price in turn becomes a calibration input, and the two axioms feed each other.) It is useless to generalize the theory in order to incorporate this "perverse effect", or this market "irony" [1], because the problem will reappear at the next level.

This singular logic exceeds the context of the theory. It is specific to the market, and if we must make the fans of formalism happy, my suggestion is that we define the market as this new logic. When I personally took this last step, it had the effect of liberating my thought. Now I think of the market only in terms of this logic [2].

My position is a bit radical, I admit, and it suits the market-making trader of derivatives, fully immersed in his market, rather than the outside observer, or econometrician, who seeks to "measure" the probability distribution of the underlying (one does not know by what means, or of what phenomenon). Indeed, there are few market-makers who would not want to calibrate their pricing tool to the reference prices, or who would not admit that the tool in question allows them, ultimately, to come back to the market in order to trade the derivative product they have just valued.

As the name implies, these "market-makers" are at the heart of the market event. They are the main users of the pricing tool, yet, by the deliberate act of prior recalibration and by the deliberate act of subsequent trading, they contradict the assumption of fixity of the probability distribution on which their model is based. They constantly move their model. That is why I maintain that the trading of derivative products (in the strong sense of this immersion and this movement) has nothing to do with probability distributions! The trader at the heart of the market cannot be judged by the canon of "chance" (as suggested by Nassim Taleb [3]) and cannot be made the subject of a "stochastic process". His field, which is the field of the market, is precisely what lies beyond any stochastic process! And this is also why the whole debate about the "true" probability distribution of the market - whether extreme events are less rare than we think, whether it is invariant under a change of scale, whether the volatility process is auto-correlated, and so on - however interesting it may be in econometric terms, misses the trading logic of derivative products [4].

For me, probabilities are just an intermediary, or a step in a calculation. They serve to compute a temporary hedge ratio, and to ensure, temporarily, the absence of arbitrage. (And I am not only talking about the risk-neutral probability, but about the whole probabilistic assumption.) To those who still believe, nevertheless, that probability is a theory of knowledge or of prediction, and who would therefore apply it to the market considered as an indeterministic phenomenon, I propose to reverse the problem. Here is my challenge: what if, instead of being an object for probability, the derivatives market were itself a new technology intended to replace, as far as "predicting the future" is concerned, the entire category of probability and knowledge? A technology requiring the dynamic involvement and commitment of the trader, not his passive observation. Literally, the technology of the future [5].
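
In the Black-Scholes episode, for instance, the whole probabilistic apparatus condenses into one temporary hedge ratio, the delta (a standard formula, stated here for illustration):

```latex
\Delta \;=\; \frac{\partial C}{\partial S} \;=\; N(d_1),
\qquad
d_1 \;=\; \frac{\ln(S/K) + \left(r + \tfrac{1}{2}\sigma^2\right)T}{\sigma\sqrt{T}}
```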

What do you think were the most important developments in the financial markets over the past fifteen years?

Do not expect me to answer: "the advent of electronic exchanges" or "the securitization of credit risk through CDO-type structures (Collateralized Debt Obligations)". For I remain very attached to my personal experience of the open-outcry markets, which lose, in my opinion, a very important human dimension (that is to say, a whole spectrum of possibilities and capacities) when they become electronic [6]. And I remain very attached to the tradition stemming from Black-Scholes, the dynamic replication (perfect or otherwise) of the derivative product being valued, and I still do not see the CDO in that light.

So I will answer from my own narrow angle, which is the heart of my problem, and I will look, among the developments of the financial markets, for the one that goes in the direction of the logic outlined above. That is to say, I will retain any development in line with the emergence of the derivatives market as the technology of the future.

Now, what I presented above as the elements of this technology that comes to replace prediction and statistical inference, namely the repeated calibration to the market prices of derivatives, and the market takeover of the new product that has just been valued, has so far found its best expression in the concept of implied volatility. That is, the way the market took up the Black-Scholes formula in order to infer volatility from the single price of an option (instead of estimating it from a series of underlying prices), and to treat volatility as a tradable asset (whereas the theoretical model assumes it constant), is, in my view, the most important technological change to date. Except that I cannot answer your question by saying so, because implied volatility is well over fifteen years old.
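
A minimal sketch of that inversion (the quoted market price is invented; the Black-Scholes pricer is restated so the snippet stands alone):

```python
from math import log, sqrt, exp
from statistics import NormalDist
from scipy.optimize import brentq

N = NormalDist().cdf

def bs_call(S, K, T, r, sigma):  # Black-Scholes call, as in the first sketch
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return S * N(d1) - K * exp(-r * T) * N(d1 - sigma * sqrt(T))

def implied_vol(price, S, K, T, r):
    """Invert the Black-Scholes formula: find the volatility that
    reproduces the single market price of this option."""
    return brentq(lambda s: bs_call(S, K, T, r, s) - price, 1e-4, 5.0)

# One market price in, one implied volatility out.
print(implied_vol(price=10.45, S=100, K=100, T=1.0, r=0.03))
```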

In the same vein, you will understand that the attempts of the last fifteen years to generalize the concept of implied volatility, that is to say, precisely, the smile models, have in my view been a regression relative to implied volatility rather than a real advance. For instead of receiving the smile phenomenon as the need to upgrade the model with new risk factors (stochastic volatility, jumps in the underlying) and to generalize dynamic hedging beyond the single underlying, the practice has been to drown this important signal in the inhomogeneity of the models, and to make the diffusion parameter a function of the underlying price and of time!
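
This is the local volatility model of Dupire (1994), whose well-known formula reads, in the simplest case of zero rates and dividends (stated here for reference):

```latex
\sigma_{\mathrm{loc}}^2(K, T) \;=\;
\frac{\dfrac{\partial C}{\partial T}}
     {\dfrac{1}{2}\,K^2\,\dfrac{\partial^2 C}{\partial K^2}}
```

where C(K, T) is today's surface of vanilla call prices: the formula reads the entire diffusion parameter off the vanillas alone, which is exactly the attachment to vanillas criticized below.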

The local volatility model has blocked any attempt to calibrate to anything other than plain vanilla options, which means that neither recalibration nor the marketization of the risk factors (the two essential virtues of implied volatility) could be carried to the next level. And I am not even talking about interest rate models, where the solutions adopted are much worse than inhomogeneous, since they are not even Markovian, the diffusion parameters being path-dependent.

The important development, if it is to take place, will be a return to the main virtue of the Black-Scholes model, namely homogeneity, and an understanding of the true meaning of implied volatility [7].

But you were asking about developments in the markets, not in the minds of modelers. In my opinion, the market has indeed recently produced a very important one (well ahead, as usual, of the effort of conceptualization). It is the variance swap. With this instrument, volatility, or rather variance, is at last recognized as a tradable asset (whereas it was only by accident, or by irony, that volatility appeared as a variable in the Black-Scholes formula). Moreover, the variance swap is itself a homogeneous instrument! And the time is not far off when implied volatility surfaces of variance swaps, for different maturities and different starting dates, will serve as references for the calibration of models with jumps and stochastic volatility, which will then naturally be homogeneous [8].
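
The static replication formula mentioned earlier, from which liquid variance swap prices now deviate, is the standard log-contract decomposition (stated here for reference, under diffusion assumptions and zero rates, with F the forward and P, C the put and call prices):

```latex
K_{\mathrm{var}} \;=\; \frac{2}{T}\left[
\int_0^{F} \frac{P(K)}{K^2}\,dK \;+\;
\int_{F}^{\infty} \frac{C(K)}{K^2}\,dK \right]
```

The formula prices the variance swap off the vanillas alone, and it breaks down precisely when the underlying can jump; this is why variance swap quotes carry information beyond the vanillas.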

You were head of research at Dexia AM; what prompted you to quit your job and create ITO 33?

At that time (late 1998), it was my feeling that the problem of derivatives pricing should be approached as an engineering problem, and should eventually lead to an industrial solution. Neither of these two perspectives was available in an environment like that of Dexia AM, or any other banking environment. Or, at any rate, neither was sustainable there.

The teams of quants working in the big houses have neither the time nor the depth of field necessary to carry out such a technological project. Their mission is above all to solve quickly, case by case, pricing problems of varying complexity on behalf of the structuring desks. The latter will charge a sufficient margin on the price of the complex structured product recommended to their clients to cover (that is the word!) the pricing uncertainties. I barely exaggerate when I say that every pricing problem solved by a quant in a bank ends up as the Excel spreadsheet he finally hands to his trader, and that it is "lost" as soon as a new spreadsheet replaces the previous one, or a new quantitative analyst replaces the previous one. (Note that Dexia AM is still not an ITO 33 customer.)

This disparity of models and solutions is further aggravated today by the pricing libraries offered by various vendors. Their presumed advantage is to spare the quant from rewriting the Monte Carlo simulation or the finite-difference grid every time he has to value a new payoff. The showcase of these vendors is typically composed of a list - one imagines it as long as possible - of the payoffs covered by their libraries, and a parallel list of theoretical models, expected to be as "complete" as possible. The seduction can even be pushed as far as offering the quant a generic pricing engine and a symbolic language describing elementary events (coupon, strike, barrier, reset rule, etc.) so that he can structure the desired payoff himself.

I personally think that this little industry of pricing libraries (or pricing add-ins, or pricing analytics) only multiplies the "theoretical episodes" without making any significant step towards the solution of the full derivatives pricing problem, whose logic I have described above. (These libraries are ideal as a complement to the teaching of derivatives, and the approach of INRIA, as explained by Agnès Sulem in the interview she has given, seems to me to express most honestly both the horizon and the limitation of such programs: "The purpose of Premia", she says, "is to implement the new models as they appear, but also to produce reference implementations.")

Solving the most complex payoff under a given theoretical model, computing all kinds of Greeks and all kinds of stress scenarios, belongs purely to the field of applied mathematics and numerical analysis. There, I have no doubt that great progress can be made and good implementations produced to serve as references. But the technology of derivatives pricing is a different thing. To be effective in the hands of the trader, it must conform to the complete logic of the derivatives market, as I have identified it above.

The truth of pricing does not lie in any particular model, let alone in a list of disparate models, because the complete logic of the market consists in defeating every model, as we have seen. What I call a proper pricing engine must address the two crucial problems of recalibration and of the expansion of the calibration universe, as expressed in the two axioms above. That is the technology we are developing at ITO 33. Being the technology of the future, it requires development and maintenance over the long run, beyond the lifespan of a team of quants, and a host of advances in numerical analysis and mathematics, all converging towards the same goal. My favorite analogy is space technology, where seven full years (1962-69) were necessary for Grumman Engineering to develop the lunar module.

In numerous articles you discuss the volatility smile problem and its dynamics. How far has quantitative research got with the subject?

You speak as if the volatility smile problem and the problem of its dynamics were two separate things. In fact, any smile model, that is, any model able to account for the actual prices of the derivatives considered liquid and observed in the market, must be a model of the dynamics of the underlying if it is to preclude arbitrage opportunities (which excludes the phenomenological models of deformation of the implied volatility surface, and the so-called "mixture of models" or "ensemble of models" approaches), and as such it is a model of the smile dynamics.

The distinction between the smile problem and the problem of its dynamics is only due to an accident of history, which now gives the impression that, with the smile dynamics, we are discovering a new and exciting problem, whereas it is the same old problem as from the beginning: that of the smile or, more simply, of derivative pricing outside the Black-Scholes framework. This accident of history is the local volatility model (1994).

Because of its ability to match exactly, in its non-parametric form, any implied volatility surface of vanilla options, the local volatility model has long been considered the ultimate smile model, one that would have given full satisfaction had its practical implementation not been "marginally" hampered by the problem of the arbitrage-free interpolation/extrapolation of the missing data in the implied volatility surface, and by problems of numerical instability. For ten years, the efforts of applied mathematicians, interpolators, optimizers and other experts in ill-posed problems were focused on this part of the problem, while, on the other side, discontent was growing among traders with a model that did not provide the right hedge ratios for options - starting with the vanillas themselves - and could not explain the prices of barrier options or forward start options. Two important shortcomings, to be sure.

Thus the quantitative finance community discovered the problem of the smile dynamics, and came to realize the obvious, namely that it depends on the chosen model of the dynamics! We can only observe snapshots of market prices. As for their dynamics, it can only be an assumption, and therefore the consequence of a model. Different models may even match the same actual smile, yet the forward smile will depend on the model and will differ.
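
A standard illustration (my addition, not from the interview): the forward start option, whose strike is fixed relative to a future date t1 < T,

```latex
\text{payoff} \;=\; \left( \frac{S_T}{S_{t_1}} - k \right)^{+}
```

In the homogeneous Black-Scholes model its value today is simply that of a fresh vanilla option of moneyness k and maturity T - t1, independent of where the underlying will stand at t1; in an inhomogeneous model such as local volatility, its value depends on the future level of the underlying through the future local smile. Two models matching the same vanilla smile today can therefore price it differently.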

However, the observed snapshot of market prices may contain information that is rich enough, and fine enough, about the chosen dynamics, provided one is interested in derivatives of sufficiently varied structure. Nobody ever said that vanilla options should be the ultimate reference for the smile! Again, this attachment to vanillas is a relic of the famous accident in the history of local volatility (not to mention that the very word "smile" is an implicit reference to vanillas). Because of the inhomogeneity of its non-parametric form, local volatility is indeed tailored to match the pricing structure of vanilla options, which are themselves "local" and inhomogeneous because of their strikes. But in so doing, local volatility dispenses with the information reflected in the market prices of barrier options, forward start options, or other path-dependent structures such as variance swaps. (It must be said that the Dupire formula enshrined this fact.) Yet the market prices of exotic or path-dependent structures contain precisely the "missing" information about the smile dynamics:
a) Coarsely, if by "smile dynamics" we mean precisely the difference it makes to the prices of exotic options;
b) Effectively, if the goal is to determine the hedge ratios of the vanillas.

Within a model that is sufficiently flexible and robust (and there my preference goes to models endowed with explanatory power, which an "empty" model such as local volatility cannot be [9]), there is indeed a close connection between, for example, the prices of American digitals (or one-touches) and the hedge ratios of vanilla options. Consequently, my research program can be stated as follows:

  1. Adopt a homogeneous dynamics, rich enough - that is, incorporating at least jump-diffusion of the underlying and stochastic volatility - which I no longer call a "smile model", because it is meant to be calibrated to the market prices of any structure, however exotic, and not just to the vanillas.
  2. If an exotic option becomes liquid enough to escape the model that was supposed to price it relative to more liquid instruments, then expand the calibration universe and calibrate the dynamics to the price of this exotic option.
  3. Hope that calibration to the market prices of the more exotic instruments will pin down the dynamics enough to produce, among several possible solutions, the right hedging strategy for the less exotic instruments. (Here lies my main conjecture about the smile dynamics.) Obviously, this assumes we have agreed in advance on what "hedging" means in the presence of jumps and stochastic volatility.
  4. Defend the calibration with the hedging. This means that the exotic options used as calibration supports, and expected to determine the "optimal hedging strategy" of the less exotic options, will themselves be used in the generalized hedging strategy: alongside the underlying and the other basic instruments intended to hedge the derivative product with definite ratios, they will help "hedge" this hedge and "secure" these ratios (see the sketch after this list).
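
One common formalization of "hedging" in the presence of jumps (my own illustration, not necessarily the one used at ITO 33) is the variance-minimizing hedge: choose the ratios in the underlying and in a hedging exotic so as to minimize the variance of the hedged P&L across scenarios. A toy Monte Carlo sketch, with invented dynamics and intrinsic-value proxies standing in for full model revaluation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
S0, dt = 100.0, 1.0 / 52.0  # spot and a one-week rebalancing step

# Toy jump-diffusion scenarios for the underlying (all parameters invented).
sigma, jump_prob, jump_size = 0.20, 0.02, -0.15
diffusion = -0.5 * sigma**2 * dt + sigma * np.sqrt(dt) * rng.standard_normal(n)
jumps = np.where(rng.random(n) < jump_prob, jump_size, 0.0)
S1 = S0 * np.exp(diffusion + jumps)

# One-step P&L of the target option and of the hedge instruments.
dV = np.maximum(95.0 - S1, 0.0) - max(95.0 - S0, 0.0)  # OTM put to be hedged
dS = S1 - S0                                            # the underlying
dD = 10.0 * (S1 < 90.0) - 10.0 * (S0 < 90.0)            # toy 'one-touch' proxy

# Quadratic hedging: ratios (a, b) minimizing E[(dV - a*dS - b*dD)^2].
X = np.column_stack([dS, dD])
(a, b), *_ = np.linalg.lstsq(X, dV, rcond=None)
residual = dV - X @ np.array([a, b])
print(f"underlying ratio: {a:.3f}, exotic ratio: {b:.3f}")
print(f"unhedged std: {dV.std():.3f}, hedged std: {residual.std():.3f}")
```

The exotic picks up the jump risk that the position in the underlying alone cannot absorb; point 4 then says that such exotics are not only calibration supports but actual components of the hedge.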

To summarize:

The smile dynamics makes a difference when we observe that certain exotic prices are not matched. My suggestion is then to calibrate the model to those exotic prices. Since I do not differentiate between vanillas and exotics, this simply amounts to saying that the model is "properly" calibrated.

The smile dynamics also makes a difference in the dynamic hedging strategy of any derivative product (by definition of the "dynamics" of prices). But the only real dynamics is that of recalibration, which changes the model parameters - and thus the model - every day. So we cannot expect to demonstrate, a priori, that one hedging strategy will be more robust than another. We have to fall back on the previous point, make sure at least that today's exotic prices are matched, and hope that the price structures of the more complex options will move relatively less, in the real scenario, than those of the vanilla options, so that they end up "anchoring" the hedging strategies of those options through the successive recalibrations. (That is what a student will never learn in a derivatives course, and what a teacher can never tell him. This is the very space where technology must take over from theory.)

Ultimately, the smile problem and the problem of its dynamics can only be posed, and only be solved, in relative terms, with respect to a given calibration context. The key is to have a calibration/pricing/hedging tool that can be maintained across the change of contexts and their ascending hierarchy (and that is numerically tractable, naturally). This is the only prescription that can be stated a priori. We have reasons to believe, here at ITO 33, that the regime-switching model is the right candidate.
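
To fix ideas, here is a minimal Monte Carlo sketch of a regime-switching volatility model (a toy of my own; the interview does not disclose ITO 33's actual model): volatility follows a two-state Markov chain, and the same simulated dynamics price a vanilla and a one-touch consistently:

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps = 50_000, 252
S0, T = 100.0, 1.0
dt = T / n_steps

# Two volatility regimes and Markov switching intensities (invented numbers).
vols = np.array([0.15, 0.35])          # calm regime, stressed regime
switch = np.array([1.0, 2.0])          # intensity of leaving each regime

S = np.full(n_paths, S0)
regime = np.zeros(n_paths, dtype=int)  # start every path in the calm regime
barrier_hit = np.zeros(n_paths, dtype=bool)

for _ in range(n_steps):
    sigma = vols[regime]
    dW = np.sqrt(dt) * rng.standard_normal(n_paths)
    S *= np.exp(-0.5 * sigma**2 * dt + sigma * dW)    # martingale dynamics (r = 0)
    barrier_hit |= S <= 80.0                          # one-touch barrier at 80
    flip = rng.random(n_paths) < switch[regime] * dt  # regime switches
    regime = np.where(flip, 1 - regime, regime)

vanilla = np.maximum(S - 100.0, 0.0).mean()  # ATM call
one_touch = barrier_hit.mean()               # pays 1 if 80 is ever touched
print(f"vanilla: {vanilla:.3f}, one-touch: {one_touch:.3f}")
```

Calibrating the regime parameters jointly to vanilla and one-touch quotes is then the program of points 1-3 above.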

ITO 33 has the reputation of being a software package specialized in convertible bonds; what are the other strengths of ITO 33?

It is true that the pricing of convertible bonds has provided us with the perfect "cover" (or should I say the perfect hedge?) while we set up the technological revolution that I am announcing.

All joking aside, the convertible bond is among the most complex of structured products (it combines features of the American option, the barrier option, the forward start option, the Asian option, the Parisian option, etc., and of the credit derivative), and it is because we have tried from the beginning to price the convertible bond properly that we have been led to formulate the calibration/pricing/hedging program I have described to you, and to develop the most efficient numerical schemes and the fastest calibration routines.
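
A deliberately simplified sketch of the embedded optionality (my illustration, assuming a zero-coupon convertible with no issuer call and no credit spread, both of which matter in practice): at each node the holder takes the better of converting or holding, a small instance of the American-style feature mentioned above:

```python
import numpy as np

def convertible_binomial(S0, face, ratio, T, r, sigma, n=500):
    """Price a toy zero-coupon convertible bond on a CRR binomial tree.
    At maturity: max(conversion value, face). Before: max(convert, hold).
    Ignores coupons, issuer call, and default risk - a sketch, not a model."""
    dt = T / n
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    p = (np.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    disc = np.exp(-r * dt)

    # Underlying prices at maturity, then roll the bond value backwards.
    S = S0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
    V = np.maximum(ratio * S, face)
    for _ in range(n):
        S = S[:-1] * d                   # node prices one step earlier (u*d = 1)
        hold = disc * (p * V[:-1] + (1 - p) * V[1:])
        V = np.maximum(ratio * S, hold)  # convert now or hold the bond
    return V[0]

# Illustrative numbers only: face 100, conversion ratio 0.9, 5-year bond.
print(convertible_binomial(S0=100, face=100, ratio=0.9, T=5.0, r=0.04, sigma=0.25))
```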

Above all, we were lucky enough, with convertibles, to be confronted with the most demanding clients in terms of product technology, in particular hedge funds (based, for the most part, in New York and London) for whom convertible bond arbitrage was one of the favorite strategies. When that strategy declined (in 2005), the hedge funds concerned naturally turned to the more general arbitrage strategy, equity-to-credit, and then accompanied the evolution of our product, in which Credit Default Swaps (CDS) and variance swaps came to be treated as benchmark instruments alongside the vanillas.

Market-neutral hedge funds try to extract value from their quantitative models, and their first concern is the robustness of the hedging strategy. In this they differ from the flow traders of the banks' desks, who perpetually "rebalance" their portfolios. (That said, a technology such as the one we are developing is really designed for the market-maker, and the ideal situation would be one where he himself generates the exotic prices serving as calibration supports of a model like ours, thereby ensuring both the stability and the consistency of his operation. This is why I predict that the big hedge funds will become market-makers in derivatives.)

In summary, the strength of ITO 33 is the capacity, both technological and intellectual, to approach the derivatives pricing problem in its only significant dimension, the one that begins and ends in the market. This is not only a work of thought, but a change of tradition, even a "destruction of metaphysics" (in the words of Heidegger, the philosopher of the other beginning of thought [10]). I believe this thought of the market, once concretely embodied in products, inaugurates such a new beginning. That is why I do not intend to run ITO 33 as a mere software publisher, but also as a real think tank, to which I invite every thinker concerned by my challenge.

F.Y., January 2007


Footnotes

[1] Cf. Elie Ayache, “The Irony in the Variance Swaps”, Wilmott, September 2006: 16-23.

[2] Cf. Elie Ayache, “L’événement du marché ou la nécessité de l’ascension méta-contextuelle”, in Michel Bitbol (ed.), Théorie quantique et sciences humaines, Paris: Éditions du CNRS, forthcoming.

[3] Cf. Nassim Nicholas Taleb, Le Hasard sauvage: Des marchés boursiers à notre vie, le rôle caché de la chance (French translation of Fooled by Randomness), trans. Carine Chichereau, Paris: Les Belles Lettres, 2005.

[4] Cf. Elie Ayache, “Elie Ayache on Option Trading and Modeling”, in Espen Gaarder Haug, Derivatives Models on Models, Wiley, 2007.

[5] Cf. Elie Ayache, “Why 13 Can Only Succeed to 11, or, the End of Probability”, Wilmott, July 2006: 30-38.

[6] Cf. Caitlin Zaloom, Out of the Pits: Traders and Technology from Chicago to London, Chicago: University of Chicago Press, 2006.

[7] Cf. Elie Ayache, “What Is Implied by Implied Volatility?”, Wilmott, January 2006: 28-35.

[8] Cf. Philippe Henrotte, “How Exotic Is the Variance Swap?”, Wilmott, November 2006: 24-26.

[9] Cf. Elie Ayache, “Dial 33 for your Local Cleaner”, Wilmott, January 2007.

[10] Cf. Richard Polt, The Emergency of Being: On Heidegger’s Contributions to Philosophy, Ithaca: Cornell University Press, 2006.
