THE CONTAGION FROM THE 2007-09 US STOCK MARKET CRASH

The global financial crisis of 2007-09 had its most prominent manifestation in a general stock market crash, which can be studied from the perspective of financial contagion using a mathematical tool known as wavelets. This paper assesses the impact of the US stock market crash on the other stock markets of the world, starting from the assumption that the US market was the epicenter of the global financial crisis. A wavelet filtering technique is applied to stock market indexes in order to detect differentiated impacts that reveal the presence of inertial factors in the different stock exchanges. The data series are analyzed on different time scales so as to identify short- and long-term effects.


Introduction
In the summer of 2007 the first global financial panic of the 21st century was sparked: see the timeline of events in Ariff (2012). In January 2009, the asset losses of financial institutions were estimated at $700 billion. Global stock market losses exceeded thirty billion dollars. The S&P 500, the index of five hundred of the largest listed US companies, had fallen 39 percent in 2008; the NASDAQ fell 42 percent and the Dow Jones 35 percent.
The financing capacity of the global financial system declined significantly in 2008 relative to 2007. The issuance of debt securities fell 37.7 percent compared with the previous year, an unprecedented decline over the preceding twelve years, while stock issuance dropped dramatically by 73 percent over the same period. This implied a reduction of $269.86 billion in bond issues and of $148.1 billion in stock financing. The aim of this study is to evaluate the impact of this US-origin stock market crash on the other major stock exchanges of the world. The main hypothesis is that the US stock market crash was the epicenter of the global stock market crash. The analysis is conducted with a wavelet data-filtering technique.
The paper is organized as follows. The next section briefly reviews previous studies of stock markets using wavelets. Section 3 outlines the filtering of the stock series using discrete wavelet transform coefficients. Section 4 summarizes the regression results and Pearson correlation coefficients and draws inferences from them. Some concluding remarks are given in the final section.

Previous Studies Using Wavelets
Capobianco (2004) uses search algorithms with wavelet-shaped dictionaries to decompose the dynamics of stock returns by scale, identifying intraday periodicities at both the one- and five-minute frequencies. Fernandez (2004) applies wavelet analysis to detect spillover effects among stock markets at different time scales, finding spillovers from North American markets to Latin America, emerging Asia, the Far East and the Pacific markets, and also spillovers from Europe and Latin America to the North American markets.
Lee (2004) uses wavelets with a multiresolution technique at different scales to study the relationship between the South Korean and US markets, reporting strong evidence of spillovers in prices and volatility from the stock markets of developed countries to those of developing nations. Vuorenmaa (2004) studies the volatility of the Nokia share price using wavelet multiresolution analysis, finding that the wavelet variance and covariance reveal a considerable amount of stock market activity at intraday levels. Moreover, applying a local-scale and long-memory rule to volatility, he finds that variation in long memory is supported at the medium term (months).
Using high-frequency financial data and a Markov tree model, Gençay and Whitcher (2005) employed wavelets to establish a new stylized fact about volatility: low volatility at long time horizons is more likely to be followed by low volatility at short horizons, whereas the analogous statement does not hold for high volatility at short horizons. This phenomenon is called asymmetric vertical dependence.

What are Wavelets?
One way to introduce wavelets is through Fourier analysis, a process for analyzing the components of a function. Such a function could represent a sound, an electrical signal, a light beam, and so on. Consider time-dependent functions, as would be the case of a measuring device connected to the power meter of one's house to record voltage transients that vary with time. The Fourier transform is the mathematical procedure
$F(\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, dt,$
which changes one function $f$, called the signal, whose variable $t$ is time, into another function $F$, whose variable $\omega$ is the frequency of oscillation. In the case of a light signal, for instance, this procedure allows us to extract the frequencies (patterns) that make up the signal, together with their importance. However, if the signal is localized in time, the Fourier transform begins losing quality: it does not help much in locating the time interval in which a feature of the signal occurred, and so it tends to be blind to details confined to specific intervals.
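As a minimal illustration of this frequency extraction, the following sketch applies the discrete Fourier transform to a synthetic two-component signal (the signal, sampling rate and numpy routines are illustrative choices, not the paper's data or method):

```python
import numpy as np

# Illustrative signal: a sum of 3 Hz and 7 Hz oscillations sampled at 100 Hz.
fs = 100.0                       # sampling rate (samples per unit time)
t = np.arange(0, 10, 1 / fs)     # 10 time units of observations
f = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 7 * t)

F = np.fft.rfft(f)                       # transform to the frequency domain
freqs = np.fft.rfftfreq(len(f), 1 / fs)  # frequency attached to each bin

# The two largest magnitudes sit at the component frequencies
top = sorted(freqs[np.argsort(np.abs(F))[-2:]])
print(top)  # approximately [3.0, 7.0]
```

The transform reports *which* frequencies are present and with what weight, but, as noted above, not *when* they occur within the sample.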
One way to address this problem is to modify the Fourier transform into an expression like the following:
$G(\omega, t) = \int_{-\infty}^{\infty} f(u)\, g(u - t)\, e^{-i\omega u}\, du,$
in which a function $g$ that localizes the signal in a time interval around the instant $t$ is added. This windowed transform was introduced by Dennis Gabor in the 1940s. An improvement developed nearly forty years later is due to Morlet and Grossmann, who modified the Gabor transform by introducing the frequency as a factor multiplying the time difference, resulting in an expression of the form
$W(\omega, t) = \sqrt{\omega} \int_{-\infty}^{\infty} f(u)\, g\big(\omega (u - t)\big)\, du.$
Transformations of this kind are called wavelet transforms, and they admit a great variety of forms, depending on the function $g$ chosen.
The above approach is useful when an event can be observed at any given time, since it treats time as a continuous variable. In contrast, when the signals are collections of data taken at specific moments, which it is not convenient or not possible to modify, it is better to use discrete wavelet transforms. In these cases the time variable is a collection of moments, usually equally spaced, as is the case of the data used in economics.
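A discrete wavelet transform on equally spaced data can be sketched with the Haar wavelet, the simplest discrete wavelet. This is an illustrative choice: the paper does not specify which wavelet family it uses, and the series below is made up.

```python
import math

def haar_step(x):
    """One level of the Haar discrete wavelet transform.

    Splits an even-length series into approximation coefficients
    (smoothed trend) and detail coefficients (local fluctuations).
    """
    assert len(x) % 2 == 0
    half = len(x) // 2
    approx = [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2) for i in range(half)]
    detail = [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2) for i in range(half)]
    return approx, detail

# Applying the step repeatedly yields coefficients at coarser and coarser
# scales (the levels W1, W2, ... referred to later in the paper).
series = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
a, d1 = haar_step(series)  # level 1: details over pairs of observations
a, d2 = haar_step(a)       # level 2: details over groups of four
```

Because the Haar transform is orthonormal, the total energy (sum of squares) of the original series is preserved across the coefficients, which is what makes variance decompositions by scale meaningful.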
The acceptance or deletion of frequencies at different instants is performed by the simple method of multiplying a frequency by 1 to let it pass, or by 0 to remove it. The graphs below give a rough idea of how this procedure is carried out.
To sum up, a set of simple functions taking only the values one and zero can be arranged to pass one band of frequencies and eliminate the others. For example, Figure 1 presents a filter acting on the low frequencies: frequencies are counted on the horizontal axis, with weights computed on the vertical axis, and the shaded area on the left indicates that the lower frequencies are eliminated when multiplied by zero. Since the filters act over different sets of moments, from which one can be selected for a given frequency band, it is possible to locate the most important frequencies at different points in time.
Wavelets make it possible to locate the details of oscillation frequencies in time, which is difficult with the Fourier transform. To understand the importance of this development, it is worth posing the following question: is it feasible to write a computer program such that a musical signal received by a microphone, once converted into an electrical signal, can be analyzed to allocate frequencies and identify the musical notes played?
If one wants to transcribe the signal into written musical language, it is also necessary to determine how long each note lasted so that it can be written automatically on the score. Wavelets are designed to solve this problem because, by definition, they include a location in a time interval. For any signal, the filtering process can provide the frequencies present in specific time intervals, in the same way that a musician who listens to or imagines a musical piece writes down the score, indicating the note (frequency), its placement (time location) and its intensity (strong or soft sounds). However, it is not possible to achieve full precision simultaneously in both time and frequency. As a result of a theorem called the uncertainty principle, the more precision is achieved in frequency, the less accuracy is attained in time location, and vice versa. For this reason the spectrogram, one of the products of wavelet analysis, locates frequencies at the instants when they occurred, with a characteristic property: when many time periods are involved, knowledge of the frequency declines; conversely, when the number of periods decreases, the accuracy of the frequency knowledge increases. An analogous situation arises when looking at a digitized picture on a computer: zooming out, some details fade away.
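The trade-off just described can be stated precisely. For a signal whose spread in time is $\Delta t$ and whose spread in frequency is $\Delta \omega$, the uncertainty principle in its standard Heisenberg-Gabor form (added here for reference, not quoted from the paper) requires
$\Delta t \, \Delta \omega \ge \tfrac{1}{2},$
so that narrowing one spread necessarily widens the other.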

Pearson Parameter
The Pearson parameter computes the degree of correlation between two statistical variables. Given $n$ measures of a variable $X$ and $n$ measures of another variable $Y$, the question to answer is: what is the correlation between the two? The parameter is symbolized by $r$ and defined as
$r = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n} (x_i - \bar{x})^2}\,\sqrt{\sum_{i=1}^{n} (y_i - \bar{y})^2}}.$
For $r = 1$, the dots resemble a positively oriented line.
For $r = -1$, they resemble a negatively oriented line.
For $r = 0$, a cloud of unorganized dots shows up; for $r = 0.4$, a somewhat organized scatter is observed, while for $r = -0.4$ a similar image with the opposite orientation is perceived. This paper shows that if the values of the variable $X$ are the main oscillation frequencies of the New York Stock Exchange (the independent variable) and $Y$ represents the oscillation frequencies of some overseas stock exchange, then at the daily scale these frequencies are randomly distributed, with correlation coefficients close to zero and scatter plots resembling spherical distributions. On the contrary, for observation periods longer than daily, for instance weekly, monthly or bimonthly, the distributions appear much more organized, and the Pearson parameter takes values around 0.9.

The comparison of stock exchange indexes covers the period from March 15, 2006 to March 9, 2009. In order to synchronize physical time, the data for continents other than America were lagged one day with respect to the New York Stock Exchange (NYSE) values.
The discrete wavelet transform coefficients were obtained for each stock exchange.
A graphic example of the different levels of these coefficients for the American case is presented in Graph 3, whose bottom part shows the evolution of the Dow Jones Index; above it, eight levels of wavelet coefficients are displayed, identified as Wn. Once the discrete wavelet transform coefficients were obtained, the Pearson correlation coefficients were calculated in order to measure the synchrony, if there was any.
The applied method allows for the selection of the periods over which frequencies are measured.
Measurement was performed over six periods, each called a level. The studied periods roughly comprise: each working day (first level), every two working days (second level), every four working days (third level), every eight working days (fourth level, approximately two weeks), every sixteen working days (fifth level, approximately one month) and every thirty-two working days (sixth level, approximately two months). On the second level the correlations between frequencies are similar; as a result, the tune-up over every two days between the overseas stock exchanges and the NYSE has not grown significantly. The tune-up tends to grow for longer periods: at the third level, for instance, there are 21 stock exchanges whose Pearson parameter exceeds one tenth in absolute value. The procedure is similar to the one used to convert Fahrenheit to Celsius degrees and vice versa. The transformation is
$v = A e + r,$
where $A = 61199.2$ is a number calculated as follows: first, the difference between the maximum visible frequency (pattern 2.1) and the minimum visible frequency (pattern 2.7) is computed.
Next, the difference between the maximum and minimum economic frequencies is calculated. Patterns are allocated according to Table 2, which shows the standard deviations of the regressions of various world stock exchange indices against the U.S. Dow Jones, converted into variations in the pattern range. It is observed that the longer the time span, the greater the degree of homogeneity of the oscillations, which is clearly reflected in the predominance of a single pattern.
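The rescaling $v = A e + r$ is an affine map between two ranges, of exactly the same kind as the Fahrenheit-Celsius conversion invoked above. The sketch below uses the Fahrenheit-Celsius case itself, since the paper's frequency-range endpoints are not reproduced here; the function name and arguments are illustrative:

```python
def affine_map(e, e_min, e_max, v_min, v_max):
    """Map a value e from the range [e_min, e_max] onto [v_min, v_max].

    The scale factor A is the ratio of the two range widths and r is the
    offset, mirroring the v = A*e + r transformation in the text.
    """
    A = (v_max - v_min) / (e_max - e_min)  # scale factor
    r = v_min - A * e_min                  # offset
    return A * e + r

# Celsius -> Fahrenheit as a familiar instance of the same transformation:
print(affine_map(100, 0, 100, 32, 212))  # → 212.0 (boiling point)
print(affine_map(0, 0, 100, 32, 212))    # → 32.0 (freezing point)
```

In the paper's setting, the economic frequency range plays the role of the Celsius scale and the visible-pattern range that of the Fahrenheit scale.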

Concluding remarks
The first financial crisis of the 21st century, which took place during 2007-09, disclosed several weaknesses. The global stock market underwent considerable losses, and capital generation diminished to unprecedented levels. The main hypothesis of this paper is that the US stock market was the epicenter of the world market crash.
The influence of NYSE movements on other stock markets is studied using wavelets as the mathematical tool to assess the effects. The oscillation frequencies of the NYSE and of the stock exchanges of 40 other countries were calculated and filtered, and these frequencies were then correlated between the US and the other stock markets.
It was found that NYSE movements exerted a great influence, within the first two working days, on a limited group of countries: Mexico, Chile, Peru, Canada, Germany, Hong Kong, the Czech Republic and Austria. After eight working days, 21 stock exchanges (see the world map) were significantly affected.