KLUEDO RSS Feed
KLUEDO Dokumente/documents
https://kluedo.ub.uni-kl.de/index/index/
Wed, 29 Jan 2014 10:20:27 +0100

Monitoring time series based on estimating functions
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/3693
A large class of estimators, including maximum likelihood, least squares and M-estimators, is based on estimating functions. In sequential change point detection, related monitoring functions can be used to monitor new incoming observations based on an initial estimator; this is computationally efficient because any numeric optimization is restricted to the initial estimation. In this work, we give general regularity conditions under which we derive the asymptotic null behavior of the corresponding tests as well as their behavior under alternatives, where the conditions become particularly simple for sufficiently smooth estimating and monitoring functions. These regularity conditions unify and even extend a large number of existing procedures in the literature, while also allowing us to derive monitoring schemes for time series that have not yet been considered in the literature, including non-linear autoregressive time series and certain count time series such as binary or Poisson autoregressive models. We do not assume that the estimating and monitoring functions are equal or even of the same dimension, which allows, for example, a non-robust but more precise initial estimator to be combined with a robust monitoring scheme. Some simulations and data examples illustrate the usefulness of the described procedures.
Claudia Kirch; Joseph Tadjuidje Kamgaing (preprint, Wed, 29 Jan 2014 10:20:27 +0100)

Geometric Ergodicity of Binary Autoregressive Models with Exogenous Variables
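As a toy illustration of the estimating-function monitoring idea from the abstract above (not the authors' exact procedure): for a mean, the least-squares/ML estimating function is psi(x, theta) = x - theta. An initial estimator is computed once on a training sample, and new observations are monitored through cumulative sums of psi at that estimate. The boundary curve and the critical constant `c` below are ad hoc choices for illustration; in the paper they would be calibrated from the asymptotic null distribution.

```python
import numpy as np

def monitor_mean(x, m, c=3.0):
    """Monitor for a mean change using psi(x, theta) = x - theta.
    The first m observations form the training sample; monitoring uses
    cumulative sums of psi at the initial estimate, so no re-estimation
    (hence no numeric optimization) is needed during monitoring.
    Returns the first alarm time (1-based, counted after m) or None."""
    theta_hat = x[:m].mean()                   # initial estimator
    s = np.cumsum(x[m:] - theta_hat)           # detector: cumulated psi values
    k = np.arange(1, len(s) + 1)
    boundary = c * np.sqrt(m) * (1.0 + k / m)  # ad-hoc boundary curve
    hits = np.nonzero(np.abs(s) > boundary)[0]
    return int(hits[0]) + 1 if hits.size else None

rng = np.random.default_rng(0)
m = 200
no_change = rng.normal(0.0, 1.0, m + 400)
with_change = np.concatenate([rng.normal(0.0, 1.0, m + 200),
                              rng.normal(1.5, 1.0, 200)])
```

Here `monitor_mean(no_change, m)` should typically return no alarm, while `monitor_mean(with_change, m)` should alarm soon after the change; a robust monitoring function (e.g. the sign of x - theta) could be substituted without changing the structure, in the spirit of the mixed robust/non-robust combination the abstract mentions.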
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/3647
In this paper we introduce a binary autoregressive model. In contrast to the typical autoregression framework, we allow the conditional distribution of the observed process to depend on past values of the time series and on some exogenous variables. Such processes have potential applications in econometrics, medicine and the environmental sciences. We establish stationarity and geometric ergodicity of these processes under suitable conditions on the parameters of the model. Such properties are important for understanding the stability of the model as well as for deriving the asymptotic behavior of the parameter estimators.
Claudia Kirch; Joseph Tadjuidje Kamgaing (working paper, Wed, 13 Nov 2013 15:43:19 +0100)

Maximum Likelihood Estimators for Multivariate Hidden Markov Mixture Models
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/3480
In this paper we consider a multivariate switching model with constant state means and covariances, in which the switching mechanism between the basic states of the observed time series is controlled by a hidden Markov chain. As an illustration, under a Gaussian assumption on the innovations and some rather simple conditions, we prove the consistency and asymptotic normality of the maximum likelihood estimates of the model parameters.
Joseph Tadjuidje Kamgaing (preprint, Mon, 15 Apr 2013 17:29:52 +0200)

An online approach to detecting changes in nonlinear autoregressive models
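The model class of the hidden Markov mixture abstract above (observations whose means and covariances switch according to a hidden Markov chain) can be simulated in a few lines. The parameterization below is hypothetical and for illustration only; the maximum likelihood estimation analyzed in the paper (e.g. via the EM algorithm) is not shown.

```python
import numpy as np

def simulate_hmm_mixture(n, P, means, covs, rng):
    """Multivariate hidden Markov mixture: a hidden chain with transition
    matrix P selects the constant mean vector and covariance matrix of a
    Gaussian observation at each time step."""
    states = np.zeros(n, dtype=int)
    obs = np.zeros((n, len(means[0])))
    for t in range(n):
        if t > 0:
            states[t] = rng.choice(len(P), p=P[states[t - 1]])
        obs[t] = rng.multivariate_normal(means[states[t]], covs[states[t]])
    return states, obs

rng = np.random.default_rng(1)
P = np.array([[0.95, 0.05],    # hypothetical transition probabilities
              [0.10, 0.90]])
means = [np.zeros(2), np.array([3.0, -3.0])]
covs = [np.eye(2), 0.5 * np.eye(2)]
states, obs = simulate_hmm_mixture(1000, P, means, covs, rng)
```

Simulated data of this kind is the standard way to check a maximum likelihood routine for such models, since the true states and parameters are known.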
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/2772
In this paper we develop monitoring schemes for detecting structural changes in nonlinear autoregressive models. We approximate the regression function by a single-layer feedforward neural network. We show that CUSUM-type tests based on cumulative sums of estimated residuals, which have been intensively studied for linear regression in both offline and online settings, can be extended to this model. The proposed monitoring schemes reject the null hypothesis (asymptotically) only with a given probability but detect a large class of alternatives with probability one. In order to construct these sequential tests, the limit distribution under the null hypothesis is obtained.
Claudia Kirch; Joseph Tadjuidje Kamgaing (report, Mon, 24 Oct 2011 10:46:14 +0000)

Changepoint tests for INARCH time series
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/2725
In this paper, we discuss the problem of testing for a changepoint in the structure of an integer-valued time series. In particular, we consider a test statistic of cumulative sum (CUSUM) type for general Poisson autoregressions of order 1. We investigate the asymptotic behaviour of conditional least-squares estimates of the parameters in the presence of a changepoint. We then derive the asymptotic distribution of the test statistic under the hypothesis of no change, allowing for the calculation of critical values. We prove consistency of the test, i.e. asymptotic power 1, and consistency of the corresponding changepoint estimate. As an application, we consider changepoint detection in daily epileptic seizure counts from a clinical study.
Jürgen Franke; Claudia Kirch; Joseph Tadjuidje Kamgaing (preprint, Mon, 12 Sep 2011 09:49:44 +0200)

A uniform central limit theorem for neural network based autoregressive processes with applications to change-point analysis
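For intuition about the INARCH abstract above: a Poisson autoregression of order 1 has X_t | past ~ Poisson(a + b * X_{t-1}), and a CUSUM-type diagnostic can be built from the conditional least-squares score. The sketch below uses an ad-hoc normalization rather than the critical values derived in the paper, so it is a rough illustration of the approach, not the paper's exact statistic.

```python
import numpy as np

def simulate_inarch1(n, a, b, rng, x0=0):
    """Poisson INARCH(1): X_t | past ~ Poisson(a + b * X_{t-1})."""
    x = np.zeros(n)
    prev = x0
    for t in range(n):
        prev = rng.poisson(a + b * prev)
        x[t] = prev
    return x

def cls_score_cusum(x):
    """Maximal normalized CUSUM over the components of the conditional
    least-squares score (residual times regressor). The score sums to
    zero at the CLS fit, so its partial sums behave like a bridge under
    the no-change hypothesis and drift when the parameters change."""
    y, z = x[1:], x[:-1]
    Z = np.column_stack([np.ones_like(z), z])
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)   # CLS estimate of (a, b)
    score = (y - Z @ coef)[:, None] * Z            # per-time score terms
    partial = np.cumsum(score, axis=0)
    norms = np.std(score, axis=0) * np.sqrt(len(y))
    return float(np.max(np.abs(partial) / norms))

rng = np.random.default_rng(2)
stable = simulate_inarch1(600, 1.0, 0.5, rng)
broken = np.concatenate([simulate_inarch1(300, 1.0, 0.5, rng),
                         simulate_inarch1(300, 4.0, 0.5, rng)])
```

The statistic for `broken` (a change in the intercept a at mid-sample) should clearly exceed the one for `stable`; a proper test would compare it against asymptotic critical values.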
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/2302
We consider an autoregressive process with a nonlinear regression function that is modeled by a feedforward neural network. We derive a uniform central limit theorem which is useful in the context of change-point analysis. We propose a test for a change in the autoregression function which, by the uniform central limit theorem, has asymptotic power one for a large class of alternatives, including local alternatives.
Claudia Kirch; Joseph Tadjuidje Kamgaing (preprint, Fri, 25 Mar 2011 14:44:40 +0100)

Testing for parameter stability in nonlinear autoregressive models
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/2301
In this paper we develop testing procedures for the detection of structural changes in nonlinear autoregressive processes. For the detection procedure we model the regression function by a single-layer feedforward neural network. We show that CUSUM-type tests based on cumulative sums of estimated residuals, which have been intensively studied for linear regression, can be extended to this case. The limit distribution under the null hypothesis is obtained, which is needed to construct asymptotic tests. For a large class of alternatives it is shown that the tests have asymptotic power one, and in this case we obtain a consistent change-point estimator which is related to the test statistics. Power and size are further investigated in a small simulation study, with a particular emphasis on situations where the model is misspecified, i.e. the data are not generated by a neural network but by some other regression function. As an illustration, applications to the Nile data set as well as to S&P log-returns are given.
Claudia Kirch; Joseph Tadjuidje Kamgaing (preprint, Fri, 25 Mar 2011 14:44:06 +0100)

Maximum Likelihood Estimators for Markov Switching Autoregressive Processes with ARCH Component
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/2146
We consider a mixture of AR-ARCH models in which the switching between the basic states of the observed time series is controlled by a hidden Markov chain. Under simple conditions, we prove consistency and asymptotic normality of the maximum likelihood parameter estimates, combining the general asymptotic results of Douc et al. (2004) with the geometric ergodicity results of Franke et al. (2007).
Jürgen Franke; Joseph Tadjuidje Kamgaing (preprint, Mon, 19 Oct 2009 17:01:13 +0200)

A Class of Switching Regimes Autoregressive Driven Processes with Exogenous Components
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/2016
In this paper we develop a data-driven mixture of vector autoregressive models with exogenous components. The process is assumed to change regimes according to an underlying Markov process. In contrast to the hidden Markov setup, we allow the transition probabilities of the underlying Markov process to depend on past time series values and on exogenous variables. Such processes have potential applications to modeling brain signals. For example, brain activity at time t (measured by electroencephalograms) can be modeled as a function of both its past values and exogenous variables (such as visual or somatosensory stimuli). Furthermore, we establish stationarity, geometric ergodicity and the existence of moments for these processes under suitable conditions on the parameters of the model. Such properties are important for understanding the stability of the model as well as for deriving the asymptotic behavior of various statistics and model parameter estimators.
Joseph Tadjuidje Kamgaing; Hernando Ombao; Richard A. Davis (preprint, Wed, 23 Jul 2008 15:16:26 +0200)

A note on the identifiability of the conditional expectation for the mixtures of neural networks
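A minimal simulation of the switching-regimes model class from the preceding abstract, using a logistic link for the regime-switch probability. All parameter values here are hypothetical, chosen only to illustrate how past values and an exogenous covariate can drive the transitions, in contrast to a hidden Markov chain with constant transition probabilities.

```python
import numpy as np

def simulate_switching_ar(n, rng):
    """Two-regime AR(1) with exogenous input: the probability of switching
    regimes at time t depends, through a logistic link, on the previous
    observation and an exogenous covariate (hypothetical parameters)."""
    phi = [0.5, -0.4]        # regime-specific AR coefficients
    mu = [0.0, 2.0]          # regime-specific intercepts
    u = rng.normal(size=n)   # exogenous covariate, e.g. a stimulus signal
    x = np.zeros(n)
    s = np.zeros(n, dtype=int)
    for t in range(1, n):
        # switch probability driven by x[t-1] and u[t] via a logistic link
        p_switch = 1.0 / (1.0 + np.exp(1.5 - 0.8 * x[t - 1] - 0.5 * u[t]))
        s[t] = 1 - s[t - 1] if rng.random() < p_switch else s[t - 1]
        x[t] = mu[s[t]] + phi[s[t]] * x[t - 1] + rng.normal()
    return x, s, u

rng = np.random.default_rng(3)
x, s, u = simulate_switching_ar(500, rng)
```

The stability conditions established in the paper matter precisely because, in such feedback-driven switching, badly chosen parameters can make the process explode rather than remain ergodic.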
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/1832
We consider a generalized mixture of nonlinear AR models: a hidden Markov model whose autoregressive functions are single-layer feedforward neural networks. The nontrivial problem of identifiability, which is usually postulated for hidden Markov models, is addressed here.
Jürgen Franke; Jean-Pierre Stockis; Joseph Tadjuidje Kamgaing (preprint, Fri, 12 Jan 2007 20:16:48 +0100)

On Geometric Ergodicity of CHARME Models
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/1831
In this paper we consider the CHARME model, a class of generalized mixtures of nonlinear nonparametric AR-ARCH time series. We apply the theory of Markov models to derive asymptotic stability of this model; the goal is to provide sets of conditions under which the model is geometrically ergodic and therefore satisfies certain mixing conditions. This result can be considered a basis for an asymptotic theory of the model.
Jürgen Franke; Jean-Pierre Stockis; Joseph Tadjuidje Kamgaing (preprint, Fri, 12 Jan 2007 20:14:26 +0100)

Competing Neural Networks as Models for Non Stationary Financial Time Series -Changepoint Analysis-
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/1613
The problem of structural changes (variations) plays a central role in many scientific fields. One of the most current debates concerns climatic change, and politicians, environmentalists, scientists and many others are concerned with its consequences. In this thesis, however, we do not study climatic changes; instead, we consider models for analyzing changes in the dynamics of an observed time series, assuming these changes are driven by a non-observable stochastic process.

To this end, we take a first-order stationary Markov chain as the hidden process and define the Generalized Mixture of AR-ARCH model (GMAR-ARCH), an extension of the classical ARCH model suited to modeling dynamical changes. For this model we provide sufficient conditions that ensure geometric ergodicity. Further, we define a conditional likelihood given the hidden process and, in turn, a pseudo conditional likelihood. For the pseudo conditional likelihood we assume that at each time instant the autoregressive and volatility functions can be suitably approximated by given feedforward networks. In this setting the consistency of the parameter estimates is derived, and versions of the well-known Expectation Maximization and Viterbi algorithms are designed to solve the problem numerically. Moreover, taking the volatility functions to be constant, we establish the consistency of the autoregressive function estimates for some parametric classes of functions in general and for some classes of single-layer feedforward networks in particular.

Besides this hidden-Markov-driven model, we define as an alternative a weighted least squares approach for estimating the time of change and the autoregressive functions. For this formulation, we consider a mixture of independent nonlinear autoregressive processes and assume once more that the autoregressive functions can be approximated by given single-layer feedforward networks. We derive the consistency and asymptotic normality of the parameter estimates, and we prove the convergence of backpropagation in this setting under some regularity assumptions. Last but not least, we consider a mixture of nonlinear autoregressive processes with a single abrupt unknown changepoint and design a statistical test that can validate such changes.
Joseph Tadjuidje Kamgaing (doctoral thesis, Wed, 23 Feb 2005 12:52:52 +0100)
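The least-squares changepoint estimation idea from the last part of the thesis can be sketched with a linear AR(1) fit standing in for the single-layer feedforward networks (a deliberate simplification): choose the split point that minimizes the combined fit error of separate autoregressions on the two segments.

```python
import numpy as np

def ar1_rss(x):
    """Residual sum of squares of a least-squares AR(1) fit
    (intercept + one lag); a linear stand-in for the neural networks."""
    Z = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    coef, *_ = np.linalg.lstsq(Z, x[1:], rcond=None)
    r = x[1:] - Z @ coef
    return float(r @ r)

def estimate_changepoint(x, min_seg=30):
    """Changepoint estimate: the split point minimizing the combined RSS
    of separate AR(1) fits on the two segments."""
    n = len(x)
    cands = range(min_seg, n - min_seg)
    return min(cands, key=lambda k: ar1_rss(x[:k]) + ar1_rss(x[k:]))

rng = np.random.default_rng(4)
n1, n2 = 250, 250
seg1 = np.zeros(n1)
seg2 = np.zeros(n2)
for t in range(1, n1):                       # regime 1: AR(1), coefficient 0.6
    seg1[t] = 0.6 * seg1[t - 1] + rng.normal()
for t in range(1, n2):                       # regime 2: different dynamics
    seg2[t] = -0.6 * seg2[t - 1] + 2.0 + rng.normal()
x = np.concatenate([seg1, seg2])
k_hat = estimate_changepoint(x)              # should land near the true split
```

With two clearly different regimes, the estimate should land close to the true split at 250; validating that the detected change is real is then the job of a statistical test such as the one designed in the thesis.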