Abstract
We introduce an event-based framework that maps financial data onto a state-based discretisation of time series. The mapping is intrinsically multi-scale and naturally accommodates tick-by-tick data. Within this framework, we define an information-theoretic quantity that characterises the unlikeliness of price trajectories and, akin to a liquidity measure, detects and predicts stress in financial markets. In particular, we show empirical examples within the foreign exchange market where the new measure not only quantifies liquidity but also appears to act as an early warning signal.
Introduction
The notion of market liquidity is nowadays ubiquitous. It quantifies the ability of a financial market to match buyers and sellers efficiently, without causing a significant price move, thus delivering low transaction costs. It is the lifeblood of financial markets (Fernandez, 1999); without it, market dislocations can surface, as happened in the 2008 crisis (Brunnermeier, 2009), and in many other episodes that go unnoticed yet are potent candidates to become the next crisis. While omnipresent, liquidity is an elusive concept. For instance, the foreign exchange (FX) market, with its impressive daily turnover of $5.3 trillion (Bank for International Settlements, 2013), is mistakenly assumed to be always extremely liquid, since the generated volume is considered a proxy for liquidity; yet it regularly shows illiquid episodes, as illustrated through the selected examples below.
Despite the obvious importance of liquidity, there is little agreement on the best way to measure and define it (von Wyss, 2004; Sarr and Lybek, 2002; Kavajecz and Odders-White, 2004; Gabrielsen et al., 2011). Liquidity measures can be classified into different categories. Volume-based measures, such as the liquidity ratio, Martin index, Hui and Heubel ratio, turnover ratio and market-adjusted liquidity index (see Gabrielsen et al., 2011, for details), compare the volume exchanged over a fixed period of time to price changes. This class implies that non-trivial assumptions are made about the relation between volume and price moves. Other classes include price-based measures (Marsh and Rock ratio, variance ratio, vector autoregressive models); transaction-cost-based measures (spread, implied spread, absolute or relative spread); and time-based measures (number of transactions or orders per time unit). There exist plenty of studies that analyse measures of liquidity in various contexts (see von Wyss, 2004; Gabrielsen et al., 2011, and references therein) without reaching a true consensus. In addition, it is worth highlighting that some of the data required by these measures can be hard to obtain, or not available at all, as is the case for the full limit order book in the FX market, which makes the use of a majority of these measures impossible.
From our point of view, the aforementioned approaches suffer from a major drawback: they take a top-down view of financial markets, in which the impact of liquidity variations is analysed through macroscopic assumptions, rather than a bottom-up view in which illiquid times are identified and quantified from their possible constituents. The former requires one to make appropriate, and non-trivial, assumptions about the macroscopic system, while the latter requires identifying the right constituents of the system and quantifying their dynamics.
Hence this paper looks at liquidity from a different angle, where multi-scale price moves are analysed through an event-based framework. This framework allows us to track price moves occurring at different scales (see details below) and, as we shall see, to quantify the unlikeliness of these price moves, leading to a novel measure of illiquidity defined as the unlikeliness of the price trajectory with respect to a Brownian motion. We shall observe below the ability of our measure, which only requires asset prices as an input, to detect and predict stress in financial markets, illustrated by examples within the FX market.
The document is organised as follows. Section 2 describes the event-based framework. Section 3 defines the state-based discretisation of price trajectories termed the intrinsic network. In Section 4 we derive the transition probabilities of the Markov chain modelling transitions on the intrinsic network for the case of a Brownian motion. Section 5 defines the information-theoretic concept that characterises the unlikeliness of price trajectories and quantifies illiquidity. Finally, in Section 6 we demonstrate the measurement's ability to quantify liquidity during extreme price movements and illustrate its behaviour by focusing on well-documented financial crises.
The event-based framework
Traditional high-frequency finance models (Dacorogna et al., 2001) use equidistantly spaced data as their inputs, yet markets are known not to operate in a uniform fashion: during the weekend the markets come to a standstill, while unexpected news can trigger a burst of market activity. The non-uniformity is expressed in the markets through the so-called stylized facts, consisting of long-range memory in volatility (Poon and Granger, 2003), non-stationary fat-tailed distributions of returns (Mandelbrot, 1963), nonlinear serial dependencies in returns (LeBaron, 1994), volatility seasonality (Dacorogna et al., 2001) and scaling in financial time series (Glattfelder et al., 2011b). The idea of modelling financial series using a different time clock can be traced back to the seminal work of Mandelbrot and Taylor (1967) and Clark (1973), advocating the use of transaction- and volume-based clocks. Another area of research, which analyses high-frequency time series from the perspective of fractal theory, was initiated by Mandelbrot (1963). This seminal work has inspired others to search for empirical patterns in market data, namely scaling laws.1 One of the most reported scaling laws in financial markets (Müller et al., 1990; Galluccio et al., 1997; Dacorogna et al., 2001; Di Matteo et al., 2005) relates the average absolute price change ⟨Δx⟩ to the time interval Δt over which it occurs via a power law, ⟨Δx⟩ ∝ Δt^(1/E), with a scaling exponent E empirically close to 2.
The discovery of a scaling law relating the number of rising and falling price moves of a given size (threshold) led to an event-based time scale named intrinsic time, which ticks according to the evolution of price moves (Glattfelder et al., 1997). Intrinsic time dissects the time series at the market events where the direction of the trend alternates, see Fig. 1. These directional change events are identified by price reversals of a threshold value set ex ante. Once a directional change event is confirmed, an overshoot event begins and continues the trend identified by the directional change. An overshoot event ends when the opposite directional change occurs. With each directional change event, intrinsic time ticks one unit (Glattfelder et al., 2011a).
Figure 1 shows a price curve with its many peaks and valleys. We choose a threshold of δ > 0 percent of the data series. The detailed sampling rule is as follows. We start in the upward mode: we process the recorded prices one by one and keep in memory the highest price seen so far; as soon as the price drops by δ, this is the first intrinsic event and we sample this data point. We then discard the old queue and start a downward queue, keeping in memory the lowest observed price until we record an up move of δ; this is the second intrinsic time point. The data series thus determines the pace of sampling and itself generates the intrinsic, event-based time scale; the method is endogenous to the price.
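The sampling rule above can be sketched as a short function. This is a minimal illustration, not the authors' implementation; it assumes prices arrive as a plain sequence and that δ is expressed as a fraction (e.g. 0.01 for 1%):

```python
def directional_changes(prices, delta):
    """Return the indices of intrinsic events: price reversals of size >= delta.

    `delta` is a fraction, e.g. 0.01 for a 1% threshold. We start in the
    upward mode and track the running extreme, as described in the text.
    """
    events = []
    mode_up = True           # start in the upward mode
    extreme = prices[0]      # highest (up mode) or lowest (down mode) price seen
    for i, p in enumerate(prices):
        if mode_up:
            if p > extreme:
                extreme = p
            elif p <= extreme * (1.0 - delta):   # drop of delta: intrinsic event
                events.append(i)
                mode_up, extreme = False, p
        else:
            if p < extreme:
                extreme = p
            elif p >= extreme * (1.0 + delta):   # rise of delta: intrinsic event
                events.append(i)
                mode_up, extreme = True, p
    return events
```

Running multiple thresholds in parallel on the same tick stream then simply amounts to keeping one such state (mode and running extreme) per threshold δ.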
The benefits of this approach in the analysis of high-frequency data are threefold. Firstly, it can be applied to non-homogeneous time series without the need for further data transformations. Secondly, multiple directional change thresholds can be applied at the same time to the same tick-by-tick data. And thirdly, it captures the level of market activity at any one time.
Financial markets are noisy by nature and are therefore expected to oscillate around price levels. Such oscillations produce alternating directional changes whose thresholds reflect the noise amplitude. In between any two directional changes lies an overshoot, as previously described, whose length reflects the ability of buyers and sellers to agree upon prices. A long overshoot is thus likely to be the footprint of a lack of liquidity. We will later show that this intuition of illiquidity is indeed suitable: we shall observe in Section 6 that long overshoots appear during episodes of market liquidity crisis.
The intrinsic network
The concept of intrinsic time is self-similar, i.e. fractal, and described by a few scaling laws, as seen above. What occurs at a certain threshold occurs similarly at another. This activity happens simultaneously without, however, being synchronised: some thresholds may exhibit up moves while other scales are in down moves. Representing such rich activity is cumbersome when not handled in an appropriate framework. This is the subject of this section, where we introduce the so-called intrinsic network, which not only elegantly handles this activity but also precisely quantifies the unlikeliness of price trajectories.
We consider n ordered thresholds δ1 < δ2 < … < δn that dissect the price curve into directional changes of fixed length δi and overshoots ωi of varying length. For each directional change threshold δi we assign the state of the market to be either 1 or 0, depending on whether the corresponding overshoot is moving upwards or downwards. At each time we thus assign a binary vector b = (b1, …, bn) of 1s and 0s, describing the market over the various scales. The binary encoding b = (b1, …, bn) expresses the state of the market s in numeric terms as s = b1 · 2^0 + b2 · 2^1 + … + bn · 2^(n−1). We will use both notations interchangeably; it is straightforward to see that there is a total of 2^n possible states.
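As a small illustration of the encoding (with hypothetical helper names), the mapping between the binary vector and the state index, and its inverse, can be written as:

```python
def state_index(b):
    """Map the binary vector b = (b1, ..., bn) to s = sum_i b_i * 2^(i-1)."""
    return sum(bit << i for i, bit in enumerate(b))

def state_vector(s, n):
    """Inverse mapping: recover (b1, ..., bn) from the state index s."""
    return tuple((s >> i) & 1 for i in range(n))
```

For example, the state (1, 0, 1) has index 1 · 2^0 + 0 · 2^1 + 1 · 2^2 = 5.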
A large enough price move causes the market state to evolve in one of two ways. Firstly, in any state, a move of the time series in the opposite direction flips the smallest component b1: a down move flips b1 = 1 into b1 = 0, and an up move flips b1 = 0 into b1 = 1. Secondly, if the time series continues moving in the same direction, it flips the first component bi that shows the opposite direction: for an up move, bi = 0 flips to bi = 1, and likewise, for a down move, bi = 1 flips to bi = 0. The precise rule is then
b = (b1, …, bn) can transition to b′ = (b1, …, bk−1, 1 − bk, bk+1, …, bn), where k = min{i : bi = 0} for an upward move and k = min{i : bi = 1} for a downward move; each transition flips exactly one component.
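A minimal sketch of this one-component flip rule (with 1 encoding an upward overshoot and the convention, discussed below, that a blind spot leaves the state unchanged):

```python
def transition(b, up):
    """Flip the first component opposite to the move direction.

    `up` is True for an upward move, False for a downward move. In a blind
    spot, i.e. (1, ..., 1) for an up move or (0, ..., 0) for a down move,
    there is no component to flip and the state is returned unchanged.
    """
    target = 1 if up else 0
    b = list(b)
    for i, bit in enumerate(b):
        if bit != target:
            b[i] = target
            return tuple(b)
    return tuple(b)
```

For instance, a down move takes (1, 1) to (0, 1), and a further down move takes (0, 1) to (0, 0).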
Defining W as the transition probability matrix of the underlying stochastic process, we have created the so-called intrinsic network.
The intrinsic network exhibits a couple of peculiar states where the network is non-reactive: the downward blind spot (0, …, 0), where a further downward price move has no effect and the market can keep moving down without being traced, and, conversely, the upward blind spot (1, …, 1), where the market can keep moving up without being traced. From a blind spot, the available transition is unique: (1, 1, …, 1) can only transition to (0, 1, …, 1) and (0, 0, …, 0) to (1, 0, …, 0). Regardless of the dimension of the intrinsic network, blind spots will be present; they constitute a flaw that will be addressed in future work.
Figure 3 demonstrates an example of transitions on a 2-dimensional intrinsic network for a given time series.
Transition probabilities
In this section we compute the transition probabilities W of the intrinsic network. We assume that the price obeys a Brownian motion and that the transitions are modelled as a first-order Markov process. We will use these probabilities to compute the unlikeliness of a price trajectory mapped onto the intrinsic network, the Brownian motion being used as the reference model of a liquid market.
Firstly, we stress that, given a Brownian motion modelling the price, the distribution of overshoot lengths can be characterised analytically.
Secondly, we have conducted numerical simulations for price processes with time-varying volatility and concluded that the distribution of overshoot lengths does not change with the introduction of time-varying volatility. In what follows we shall therefore consider a Brownian motion with constant volatility, i.e. dx = σ dW with σ constant.
We now present the analytical expressions for transition probabilities.
It is remarkable that the transition probabilities of Theorem 4.1 do not depend on the volatility; the proof is given in Appendix C.
In addition we stress that it is certainly possible to assume different price generating processes at the cost of losing analytic tractability. Another possibility is to estimate the transition probabilities from empirical data. In what follows, however, we choose to use Theorem 4.1 corresponding to a Brownian motion with constant volatility.
We have numerically checked that the transition probabilities of Theorem 4.1 are in good agreement with Monte Carlo simulations using a Brownian motion as the underlying process. The small observed discrepancy is mainly due to the Markovian assumption made to derive Theorem 4.1: a Brownian motion mapped onto the network is in fact non-Markovian, it has memory. However, this assumption and its related error appear to affect our liquidity measure only marginally.
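Such a Monte Carlo check can be sketched as follows. This is a simplified illustration, not the authors' code: it uses a discretised Brownian motion with additive (rather than percentage) thresholds, counts empirical state-to-state transitions, and assumes the step size is small relative to the first threshold so that transitions flip roughly one component at a time:

```python
import random
from collections import Counter

def simulate_transitions(deltas, n_steps, sigma=0.05, seed=42):
    """Count state transitions of a simulated Brownian path on the network.

    Each threshold tracks its own directional-change state (1 = upward
    overshoot, 0 = downward) together with its running extreme. Thresholds
    are absolute price moves here, a simplification of the percentage
    thresholds used in the text.
    """
    rng = random.Random(seed)
    x = 0.0
    mode = [1] * len(deltas)
    ext = [0.0] * len(deltas)     # running extreme per threshold
    counts = Counter()
    state = tuple(mode)
    for _ in range(n_steps):
        x += rng.gauss(0.0, sigma)
        for i, d in enumerate(deltas):
            if mode[i] == 1:
                if x > ext[i]:
                    ext[i] = x
                elif ext[i] - x >= d:     # downward directional change
                    mode[i], ext[i] = 0, x
            else:
                if x < ext[i]:
                    ext[i] = x
                elif x - ext[i] >= d:     # upward directional change
                    mode[i], ext[i] = 1, x
        new_state = tuple(mode)
        if new_state != state:
            counts[(state, new_state)] += 1
            state = new_state
    return counts
```

Normalising the counts row by row (dividing each count by the total number of transitions out of its source state) yields empirical transition probabilities that can be compared against the analytical values.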
The liquidity measure
Here we introduce an information-theoretic quantity that measures the unlikeliness of price trajectories mapped onto the intrinsic network. We argue and demonstrate below that this quantity provides an alternative definition of liquidity.
We first consider the surprise γij (Cover and Thomas, 1991) of a transition from state si to state sj, defined as γij = −log2(pij), where pij is the transition probability from si to sj.
We further define the surprise of a price trajectory that has experienced K transitions within a time interval [0, T] as the sum of the transition surprises, γK = γ(i1, i2) + γ(i2, i3) + … + γ(iK, iK+1), over the sequence of visited states.
Since the number of transitions is variable, some time periods might exhibit a large surprise purely because of a large number of transitions. In order to remove this effect we centre the surprise by its expected value, the entropy rate multiplied by the number of transitions, K · H(1), and divide it by the square root of its variance, the second-order informativeness multiplied by the number of transitions, K · H(2) (Pfister et al., 2001). According to the central limit theorem (Pfister et al., 2001), the obtained expression, (γK − K · H(1)) / √(K · H(2)), converges to the standard normal distribution.
Following Pfister et al. (2001), the entropy rate of the Markov chain equals H(1) = −Σi μi Σj pij log2(pij), where μ denotes the stationary distribution of the chain.
The expression (11) allows us to introduce the liquidity, defined as L = 1 − Φ((γK − K · H(1)) / √(K · H(2))), where Φ denotes the standard normal cumulative distribution function.
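Putting the pieces together, the liquidity of a window can be computed directly from the observed transition surprises. A sketch under the assumption that the liquidity is one minus the standard normal CDF of the standardised surprise (consistent with values near zero flagging illiquid times); H(1) and H(2) are passed in as parameters since their values depend on the chosen network:

```python
import math

def liquidity(surprises, h1, h2):
    """Liquidity of a window from its K transition surprises.

    h1: entropy rate of the chain, h2: second-order informativeness.
    The total surprise is centred by K*h1, scaled by sqrt(K*h2), and mapped
    through the standard normal CDF; values near 1 indicate likely (liquid)
    trajectories, values near 0 surprising (illiquid) ones.
    """
    K = len(surprises)
    z = (sum(surprises) - K * h1) / math.sqrt(K * h2)
    phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    return 1.0 - phi
```

A trajectory whose surprises average exactly the entropy rate yields z = 0 and a liquidity of 0.5; highly surprising trajectories push the value towards zero.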
To summarize, we have mapped the price trajectory onto an intrinsic network, modelled the underlying process as a first-order Markov chain, and derived the transition probabilities for the case of a Brownian motion. We defined the surprise of a sequence of transitions and normalised it so that it follows a normal distribution. Finally, we quantified liquidity by assessing the likelihood of the surprise: values close to zero indicate illiquid times, while values close to one indicate liquid times.
Empirical results
In this section we present the liquidity measurement in an empirical setting. Firstly, we describe the dataset used in this study. Next, we evaluate the predictive capabilities of the liquidity measurement on extreme price movements. Then we present the measurement on well-known FX market crises. Finally, we present the intra-week seasonality of liquidity and compare it to the seasonality in spread and volatility.
Dataset
The data used in this study is quoted by Oanda (Oanda, 2015), one of the major market makers, which offered stable spreads until December 2012. The data set comprises the quotes of the major currency pairs from 2006-01-01 to 2014-11-01 at the finest resolution: tick by tick. Each tick contains a time-stamp and bid and offer prices for transactions up to $10 million. The dataset contains the following pairs: AUD/CAD, AUD/NZD, AUD/JPY, AUD/USD, CAD/JPY, CHF/JPY, EUR/AUD, EUR/CAD, EUR/CHF, EUR/GBP, EUR/JPY, EUR/NZD, EUR/USD, GBP/AUD, GBP/CAD, GBP/CHF, GBP/JPY, GBP/USD, NZD/CAD, NZD/JPY, NZD/USD, USD/CAD, USD/CHF and USD/JPY. Furthermore, for the event studies in Section 6.4, where we read spread information, we use tick data from Dukascopy (Dukascopy, 2015), a company gathering bids and offers from market participants through an order book; this data contains time-stamp, bid and offer price and, unlike the Oanda data, also the bid and offer quoted volume.
An intrinsic network
We now present the intrinsic network used in the following subsections. As we are concerned with high-frequency market conditions, we choose the first threshold δ1 to be 0.025% and take each subsequent threshold to be double its predecessor. We use a total of twelve thresholds, i.e. δi = 2^(i−1) · δ1 for i = 1, …, 12.
A proposal for setting the thresholds in an optimal manner can stem from the maximum entropy principle applied to the surprise which, when properly adjusted, is known to converge to a normal distribution for K ≫ 0. Rearranging the expression of the surprise, for large but fixed K the distribution is approximately normal, γK ∼ N(K · H(1), K · H(2)). The entropy of the surprise then equals ½ log2(2πe · K · H(2)), hence we conclude that the optimal choice of thresholds is the one that maximizes H(2). We note that this optimisation is a highly complex mathematical problem, which is intended to be solved using numerical procedures. Briefly, we remark that one should at least double consecutive thresholds, i.e. δi ≥ 2 · δi−1. A closed-form expression for the optimal thresholds is subject to further research. The Brownian motion assumption is indirectly expressed through the values of the transition probability matrix that feed into H(2). The transition probability matrix W is obtained from the analytical expressions of Theorem 4.1, hence we do not need a training set to obtain the transition probabilities. We numerically approximate, with a Monte Carlo simulation, the first-order informativeness H(1) = 0.4604 and the second-order informativeness H(2) = 0.70818, by running a path of one million transitions of the process defined by the transition probability matrix W on the intrinsic network and computing the average and standard deviation of the resulting surprises as well as the stationary distribution. We consider a sliding window for the analysis of the price trajectory, arbitrarily set to T = 1 day.
Extreme events
We start by systematically exploring the relationship between liquidity and price moves to assess its predictive power.
For all exchange rates in our dataset, we compute the daily absolute price changes |Rt| and compare them with the liquidity on the same day and up to five days before the observed absolute price change. We proceed by selecting all daily absolute price changes larger than an amplitude x and computing the average liquidity for the same day and for each of the five days preceding the price changes larger than x.
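The conditioning step can be sketched as follows (hypothetical function and array names; `liq` holds the daily liquidity values aligned with the daily absolute returns):

```python
def avg_liquidity_before_jumps(abs_returns, liq, x, k):
    """Average liquidity k days before days with |R_t| > x (k = 0 is same day)."""
    vals = [liq[t - k] for t, r in enumerate(abs_returns)
            if r > x and t - k >= 0]
    return sum(vals) / len(vals) if vals else float("nan")
```

Sweeping x over a grid of amplitudes and k over 0 to 5 produces the conditional averages compared in Fig. 4.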
Figure 4 shows the behaviour of the average liquidity for the five previous days. The right graph shows the relationship between the liquidity and the average absolute price change ⟨|Rt+k|⟩. We note that, as expected, the larger the amplitude x, the smaller the average liquidity. Likewise, the smaller the liquidity, the larger the average absolute price change, for up to five days into the future. This seems to indicate that the measure is capable of predicting large price moves at least five days ahead, suggesting that it might be a good early warning signal. Note that we did not investigate predictability over more than five days.
It is also instructive to examine how the measurement behaves during well-known crises. We therefore focus on two well-documented events: the August 2007 Yen carry trade collapse (Brunnermeier et al., 2008) and the Swiss National Bank implementation of the 1.20 floor on EUR/CHF (Dorgan, 2012).
First we present the liquidity measurement during the 2007 Yen carry trade unwind, when massive price drops in Yen-related pairs resulted from the unwinding of large positions by major market players; many hedge funds and banks with proprietary trading desks had large positions at risk and decided to buy back yen to pay back low-interest loans (Chaboud et al., 2014).
The upper graph in Fig. 5 shows the time evolution of the tick-by-tick USD/JPY exchange rate, as well as the evolution of the liquidity over the past 24-hour period, plotted every minute. We notice that notable shocks to market liquidity occurred in mid July, with an almost 2% drop in USD/JPY in a matter of hours. From then on, the measurement shows illiquid conditions. Liquidity decreases in the three weeks preceding the spectacular 6% drop which occurred on August 16th 2007.
To compare with alternative liquidity measurements, the lower part of Fig. 5 depicts the average price-weighted quoted volume over a 24-hour period, plotted every minute. It also shows a decrease in the price-weighted quoted volume in the three weeks preceding the carry trade unwind, reaching its lowest value on August 16th 2007. Allowing for a delay of around two weeks, comparable conclusions can be drawn with the two techniques, even though our measure only needs the time series of prices.
Next we focus on the Swiss National Bank (SNB) setting the floor on EUR/CHF (Schmidt, 2011). The upper graph in Fig. 6 shows the time evolution of the tick-by-tick EUR/CHF exchange rate and the minute-by-minute liquidity in the months around the SNB intervention. Our measurement shows a slow but steady deterioration of liquidity conditions during the Franc appreciation. The graph highlights that liquidity decreases during the week preceding the spectacular near-10% gain of the Franc against the Euro, which reached near parity on August 9th 2011. In addition, our measurement shows that illiquid market conditions continued in the weeks following an almost 20% reversal, with liquidity recovering after the SNB intervention on September 6th 2011. Again, for comparison, we show in the lower part of Fig. 6 the average price-weighted quoted volume over a 24-hour period. We note that the decrease in the volume measurement follows the decrease in our liquidity measure, with the difference that the volume stays reduced after the intervention, while our liquidity measure recovers. These examples tend to indicate that our liquidity measure signals illiquidity as early as, or earlier than, the volume-based measure, even though it uses only price information.
Intra-week seasonality
In this subsection we present the intra-week seasonality in liquidity, bid-ask spread and squared logarithmic price changes. Liquidity varies during the day, seasonally at the open and close of the major FX markets, and also around scheduled news events (Ito and Hashimoto, 2006). The FX trading hours move around the world as follows: New York opens at 13:00 and closes at 22:00; Asia opens at 22:00 and closes at 7:00; London opens at 8:00 and closes at 17:00. Market zones are denoted on the graphs as follows: Asia in orange, London in green, New York in purple. We demonstrate that our liquidity measure can identify these predictable liquidity events. We create a 5-minute time grid from Monday 00:00 to Friday 24:00 and compute the average liquidity, average bid-ask spread and average squared logarithmic price changes for each point of the grid, using the whole Oanda sample for the NZD/USD exchange rate, as this exchange rate displays clear seasonality patterns.
Figure 7 shows the intra-week (Monday to Friday) pattern of the liquidity, the bid-ask spread and the volatility, proxied by squared logarithmic price changes. We note that all three measurements exhibit seasonality patterns. Liquidity is high during the London and New York trading sessions, while during the Asian trading session it drops, indicating illiquid market conditions. The spread is tightest during the London and New York trading sessions and highest during the transitions to the Asian market. Volatility increases at the beginning of London trading and at the transition to the New York trading session; it is lowest during the transition from the Asian to the London trading session. Figure 7 reveals several interesting relationships among the presented measurements. In particular, a negative relationship between the liquidity and the bid-ask spread is found in the NZD/USD market: when the bid-ask spread increases, the liquidity drops.
Conclusion
Liquidity is often measured following top-down approaches that require one to make firm assumptions about market behaviour and often overlook the market micro-structure, which we believe has a rich content that is still largely unexploited. From our point of view, a bottom-up, multi-scale approach provides an alternative description of liquidity in which the market micro-structure is fully embraced and minimal assumptions have to be made.
We have presented above an alternative measure that only uses the price evolution of the asset, dissected by identifying directional changes of the price. After a directional change, the price can move further, forming a so-called overshoot, before exhibiting another directional change. Inspired by the idea that a long overshoot might be the signature of a lack of liquidity, we propose a new measure of liquidity by mapping the price trajectory onto a multi-scale Markov chain framework termed the intrinsic network. We compute the transition probabilities of the network for a benchmark Brownian motion, which allows us to define illiquidity by quantifying the unlikeliness of price trajectories.
The new measure is applied to empirical FX data, where we systematically analyse market events and observe that low liquidity is indeed correlated with large price moves. We then concentrate on a couple of well-known liquidity shocks and observe how our measure shows low liquidity during, but also before, these episodes. These empirical analyses therefore not only show the success of our approach but also indicate its potential as an early warning indicator, highly valuable for possibly announcing forthcoming crises.
Funding
The research leading to these results has received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement no 317534.
Footnotes
1. A scaling law establishes a mathematical relationship between two variables that holds true over multiple orders of magnitude.
