
Competing Neural Networks as Models for Non Stationary Financial Time Series -Changepoint Analysis-

  • The problem of structural changes (variations) plays a central role in many scientific fields. One of the most prominent current debates concerns climate change; politicians, environmentalists, scientists and many others take part in it, and almost everyone is concerned with its consequences. In this thesis, however, we do not move in that direction, i.e. the study of climate change. Instead, we consider models for analyzing changes in the dynamics of observed time series, assuming these changes are driven by a non-observable stochastic process. To this end, we take a first-order stationary Markov chain as the hidden process and define the Generalized Mixture of AR-ARCH model (GMAR-ARCH), an extension of the classical ARCH model suited to modeling dynamical changes. For this model we provide sufficient conditions that ensure geometric ergodicity. Further, we define a conditional likelihood given the hidden process and, in turn, a pseudo conditional likelihood. For the pseudo conditional likelihood we assume that at each time instant the autoregressive and volatility functions can be suitably approximated by given feedforward networks. Under this setting, the consistency of the parameter estimates is derived, and versions of the well-known Expectation Maximization algorithm and the Viterbi algorithm are designed to solve the problem numerically. Moreover, taking the volatility functions to be constant, we establish the consistency of the estimates of the autoregressive functions for some parametric classes of functions in general and for some classes of single-layer feedforward networks in particular. Besides this hidden-Markov-driven model, we define as an alternative a weighted least squares approach for estimating the time of change and the autoregressive functions. For the latter formulation, we consider a mixture of independent nonlinear autoregressive processes and assume once more that the autoregressive functions can be approximated by given single-layer feedforward networks. We derive the consistency and asymptotic normality of the parameter estimates. Further, we prove the convergence of backpropagation in this setting under some regularity assumptions. Last but not least, we consider a mixture of nonlinear autoregressive processes with only one abrupt, unknown changepoint and design a statistical test that can validate such changes.
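  • To make the model structure described above more concrete, the sketch below simulates a hidden-Markov-driven mixture of AR-ARCH processes. It is an illustrative assumption only: the two-regime AR(1)-ARCH(1) parameterization, the transition matrix and all coefficients are hypothetical and are not taken from the thesis, where the regime-specific autoregressive and volatility functions are instead approximated by feedforward networks.

```python
import numpy as np

# Minimal sketch (not the thesis' exact specification): a two-regime mixture of
# AR(1)-ARCH(1) processes whose regime S_t is driven by a hidden first-order
# Markov chain, with X_t = m_{S_t}(X_{t-1}) + sigma_{S_t}(X_{t-1}) * eps_t.

rng = np.random.default_rng(0)

# Hypothetical regime-specific parameters (chosen for illustration only).
ar_coef = np.array([0.5, -0.3])      # autoregressive slope per regime
arch_const = np.array([0.1, 0.4])    # volatility intercept per regime
arch_coef = np.array([0.2, 0.6])     # ARCH coefficient per regime

# Transition matrix of the hidden (stationary, first-order) Markov chain.
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])

T = 1000
states = np.zeros(T, dtype=int)
x = np.zeros(T)

for t in range(1, T):
    # Draw the next hidden state given the previous one.
    states[t] = rng.choice(2, p=P[states[t - 1]])
    s = states[t]
    # Regime-dependent conditional mean and conditional volatility.
    mean = ar_coef[s] * x[t - 1]
    vol = np.sqrt(arch_const[s] + arch_coef[s] * x[t - 1] ** 2)
    x[t] = mean + vol * rng.standard_normal()
```

    In the thesis the hidden path `states` is not observed; the EM and Viterbi algorithms mentioned in the abstract would be used to estimate the regime-specific functions and to reconstruct the most likely regime sequence from the observed series `x` alone.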

Metadata
Author: Joseph Tadjuidje Kamgaing
URN: urn:nbn:de:hbz:386-kluedo-18189
Advisor: Jürgen Franke
Document Type: Doctoral Thesis
Language of publication: English
Year of Completion: 2005
Year of first Publication: 2005
Publishing Institution: Technische Universität Kaiserslautern
Granting Institution: Technische Universität Kaiserslautern
Acceptance Date of the Thesis: 2005/02/14
Date of the Publication (Server): 2005/02/23
Tag: EM algorithm; Feedforward Neural Networks; Geometric Ergodicity; Hidden Markov models for Financial Time Series; Test for Changepoint
Faculties / Organisational entities: Kaiserslautern - Fachbereich Mathematik
DDC-Classification: 5 Natural sciences and mathematics / 510 Mathematics
MSC-Classification (mathematics):62-XX STATISTICS / 62Gxx Nonparametric inference / 62G05 Estimation
62-XX STATISTICS / 62Gxx Nonparametric inference / 62G08 Nonparametric regression
62-XX STATISTICS / 62Gxx Nonparametric inference / 62G10 Hypothesis testing
62-XX STATISTICS / 62Gxx Nonparametric inference / 62G20 Asymptotic properties
62-XX STATISTICS / 62Pxx Applications [See also 90-XX, 91-XX, 92-XX] / 62P20 Applications to economics [See also 91Bxx]
Licence (German): Standard according to the KLUEDO guidelines before 27.05.2011