Competing Neural Networks as Models for Non-Stationary Financial Time Series — Changepoint Analysis

The problem of structural changes plays a central role in many scientific fields. One of the most prominent current debates concerns climate change; politicians, environmentalists, scientists and others are involved in this debate, and almost everyone is concerned with its consequences. In this thesis, however, we do not pursue that direction, i.e. the study of climate change. Instead, we consider models for analyzing changes in the dynamics of observed time series, assuming these changes are driven by a non-observable stochastic process. To this end, we take a first-order stationary Markov chain as the hidden process and define the Generalized Mixture of AR-ARCH model (GMAR-ARCH), an extension of the classical ARCH model that accommodates dynamic structural changes. For this model we provide sufficient conditions that ensure geometric ergodicity. Further, we define a conditional likelihood given the hidden process and, in turn, a pseudo conditional likelihood. For the pseudo conditional likelihood we assume that at each time instant the autoregressive and volatility functions can be suitably approximated by given feedforward networks. Under this setting, the consistency of the parameter estimates is derived, and versions of the well-known Expectation-Maximization (EM) algorithm and the Viterbi algorithm are designed to solve the problem numerically. Moreover, taking the volatility functions to be constant, we establish the consistency of the estimates of the autoregressive functions for some general parametric classes of functions and, in particular, for some classes of single-layer feedforward networks. Besides this hidden-Markov-driven model, we define as an alternative a weighted least squares approach for estimating the time of change and the autoregressive functions.
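To make the model class concrete, the following is a minimal simulation sketch of a two-regime Markov-switching AR(1)-ARCH(1) process of the kind described above. All numerical values (transition probabilities, AR and ARCH coefficients) are arbitrary illustrative choices, not parameters from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hidden first-order Markov chain: transition matrix (rows sum to 1).
P = np.array([[0.98, 0.02],
              [0.03, 0.97]])
ar = [0.5, -0.3]   # regime-specific AR(1) coefficients (illustrative)
a0 = [0.1, 0.3]    # regime-specific ARCH intercepts (illustrative)
a1 = [0.2, 0.6]    # regime-specific ARCH(1) coefficients (illustrative)

T = 500
s = np.zeros(T, dtype=int)  # hidden regime path
x = np.zeros(T)             # observed series
for t in range(1, T):
    # Draw the next regime from the row of P indexed by the current regime.
    s[t] = rng.choice(2, p=P[s[t - 1]])
    # Regime-dependent conditional variance (ARCH part) ...
    sigma2 = a0[s[t]] + a1[s[t]] * x[t - 1] ** 2
    # ... and regime-dependent conditional mean (AR part) plus noise.
    x[t] = ar[s[t]] * x[t - 1] + np.sqrt(sigma2) * rng.standard_normal()
```

In an estimation setting the path `s` is unobserved; the EM algorithm iterates between smoothing the hidden regimes and updating the regime-specific parameters, while the Viterbi algorithm recovers the most likely regime path.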
For the latter formulation, we consider a mixture of independent nonlinear autoregressive processes and assume once more that the autoregressive functions can be approximated by given single-layer feedforward networks. We derive the consistency and asymptotic normality of the parameter estimates. Further, we prove the convergence of backpropagation in this setting under some regularity assumptions. Finally, we consider a mixture of nonlinear autoregressive processes with a single abrupt, unknown changepoint and design a statistical test that can validate such changes.
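The idea of estimating a single abrupt changepoint by least squares can be sketched as follows: scan all admissible split points and pick the one minimizing the combined residual sum of squares of separate AR(1) fits on the two segments. This is a simplified linear stand-in for the weighted least squares / network-based estimator of the thesis; the function name and all settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def estimate_changepoint(x, min_seg=20):
    """Return the split point minimizing the total AR(1) least-squares error.

    A simplified sketch: each segment gets its own ordinary least-squares
    AR(1) fit; the true estimator in the thesis uses weighted least squares
    with feedforward-network autoregressive functions.
    """
    best_tau, best_rss = None, np.inf
    for tau in range(min_seg, len(x) - min_seg):
        rss = 0.0
        for seg in (x[:tau], x[tau:]):
            y, z = seg[1:], seg[:-1]
            phi = z @ y / (z @ z)          # closed-form AR(1) LS estimate
            rss += np.sum((y - phi * z) ** 2)
        if rss < best_rss:
            best_tau, best_rss = tau, rss
    return best_tau

# Simulated AR(1) series whose coefficient switches abruptly at t = 150.
x = np.zeros(300)
for t in range(1, 300):
    phi = 0.8 if t < 150 else -0.5
    x[t] = phi * x[t - 1] + rng.standard_normal()

tau_hat = estimate_changepoint(x)
```

A changepoint test then compares the fit with an estimated break against the fit of a single global model, rejecting the no-change hypothesis when the improvement is too large to be explained by chance.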

Author: Joseph Tadjuidje Kamgaing
URN (permalink): urn:nbn:de:hbz:386-kluedo-18189
Advisor: Jürgen Franke
Language of publication: English
Year of completion: 2005
Year of publication: 2005
Publishing institution: Technische Universität Kaiserslautern
Degree-granting institution: Technische Universität Kaiserslautern
Date of thesis acceptance: 14.02.2005
Date of publication (server): 23.02.2005
Free keywords / tags: EM algorithm; feedforward neural networks; geometric ergodicity; hidden Markov models for financial time series; test for changepoint
Departments / organizational units: Department of Mathematics (Fachbereich Mathematik)
DDC subject groups: 5 Natural sciences and mathematics / 51 Mathematics / 510 Mathematics
MSC-Klassifikation (Mathematik):62-XX STATISTICS / 62Gxx Nonparametric inference / 62G05 Estimation
62-XX STATISTICS / 62Gxx Nonparametric inference / 62G08 Nonparametric regression
62-XX STATISTICS / 62Gxx Nonparametric inference / 62G10 Hypothesis testing
62-XX STATISTICS / 62Gxx Nonparametric inference / 62G20 Asymptotic properties
62-XX STATISTICS / 62Pxx Applications [See also 90-XX, 91-XX, 92-XX] / 62P20 Applications to economics [See also 91Bxx]
License (German): Standard according to the KLUEDO guidelines before 27.05.2011
