KLUEDO RSS Feed: KLUEDO documents
https://kluedo.ub.uni-kl.de/index/index/
Last build: Fri, 09 Dec 2011 09:49:44 +0200

Changepoint tests for INARCH time series
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/2725
In this paper, we discuss the problem of testing for a changepoint in the structure
of an integer-valued time series. In particular, we consider a test statistic
of cumulative sum (CUSUM) type for general Poisson autoregressions of order
1. We investigate the asymptotic behaviour of conditional least-squares estimates
of the parameters in the presence of a changepoint. Then, we derive the
asymptotic distribution of the test statistic under the hypothesis of no change,
allowing for the calculation of critical values. We prove consistency of the test,
i.e. asymptotic power 1, and consistency of the corresponding changepoint estimate.
As an application, we look at changepoint detection in daily
epileptic seizure counts from a clinical study.
Authors: Jürgen Franke; Claudia Kirch; Joseph Tadjuidje Kamgaing
Type: preprint
Published: Mon, 12 Sep 2011 09:49:44 +0200

Weak Dependence of Functional INGARCH Processes
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/2186
We introduce a class of models for time series of counts which includes INGARCH-type models as well as log-linear models for conditionally Poisson distributed data. For these processes, we formulate simple conditions for stationarity and weak dependence with a geometric rate. The coupling argument used in the proof serves as a template for a similar treatment of integer-valued time series models based on other types of thinning operations.
Authors: Jürgen Franke
Type: preprint
Published: Thu, 15 Apr 2010 07:56:55 +0200

Maximum Likelihood Estimators for Markov Switching Autoregressive Processes with ARCH Component
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/2146
We consider a mixture of AR-ARCH models in which the switching between the basic states of the observed time series is controlled by a hidden Markov chain. Under simple conditions, we prove consistency and asymptotic normality of the maximum likelihood parameter estimates, combining the general asymptotic results of Douc et al. (2004) with the geometric ergodicity results of Franke et al. (2007).
Authors: Jürgen Franke; Joseph Tadjuidje Kamgaing
Type: preprint
Published: Mon, 19 Oct 2009 17:01:13 +0200

Mixtures of Nonparametric Autoregression, revised
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/2115
We consider data generating mechanisms which can be represented as mixtures of finitely many regression or autoregression models. We propose nonparametric estimators for the functions characterizing the various mixture components, based on a local quasi maximum likelihood approach, and prove their consistency. We present an EM algorithm for calculating the estimates numerically, which is mainly based on iteratively applying common local smoothers, and discuss its convergence properties.
Authors: Jürgen Franke; Jean-Pierre Stockis; Joseph Tadjuidje; W.K. Li
Type: preprint
Published: Mon, 27 Jul 2009 08:47:55 +0200

Mixtures of Nonparametric Autoregressions
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/2102
We consider data generating mechanisms which can be represented as mixtures of finitely many regression or autoregression models. We propose nonparametric estimators for the functions characterizing the various mixture components, based on a local quasi maximum likelihood approach, and prove their consistency. We present an EM algorithm for calculating the estimates numerically, which is mainly based on iteratively applying common local smoothers, and discuss its convergence properties.
Authors: Jürgen Franke; Jean-Pierre Stockis; Joseph Tadjuidje; W.K. Li
Type: preprint
Published: Mon, 13 Jul 2009 15:52:26 +0200

Some asymptotics for local least-squares regression with regularization
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/1902
We derive some asymptotics for a new approach to curve estimation proposed by Mrázek et al. (MWB06) which combines localization and regularization. This methodology has been considered as the basis of a unified framework covering various different smoothing methods in the analogous two-dimensional problem of image denoising. As a first step towards understanding this approach theoretically, we restrict our discussion here to the least-squares distance, where we have explicit formulas for the function estimates and where we can derive a rather complete asymptotic theory from known results for the Priestley-Chao curve estimate. In this paper, we consider only the case where the bias dominates the mean-square error. Other situations are dealt with in subsequent papers.
Authors: Jürgen Franke; Joseph Tadjuidje; Stefan Didas; Joachim Weickert
Type: preprint
Published: Thu, 11 Oct 2007 12:37:44 +0200

Quantile Sieve Estimates for Time Series
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/1834
We consider the problem of estimating the conditional quantile of a time series at time \(t\) given observations of the same and perhaps other time series available at time \(t-1\). We discuss sieve estimates, which are nonparametric versions of the Koenker-Bassett regression quantiles and do not require the specification of the innovation law. We prove consistency of these estimates and illustrate their good performance for light- and heavy-tailed distributions of the innovations with a small simulation study. As an economic application, we use the estimates to calculate the value at risk of some stock price series.
Authors: Jürgen Franke; Jean-Pierre Stockis; Joseph Tadjuidje
Type: preprint
Published: Mon, 05 Feb 2007 14:01:57 +0100

A note on the identifiability of the conditional expectation for the mixtures of neural networks
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/1832
We consider a generalized mixture of nonlinear AR models, a hidden Markov model for which the autoregressive functions are single-layer feedforward neural networks. The nontrivial problem of identifiability, which is usually postulated for hidden Markov models, is addressed here.
Authors: Jürgen Franke; Jean-Pierre Stockis; Joseph Tadjuidje Kamgaing
Type: preprint
Published: Fri, 12 Jan 2007 20:16:48 +0100

On Geometric Ergodicity of CHARME Models
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/1831
In this paper we consider CHARME models, a class of generalized mixtures of nonlinear nonparametric AR-ARCH time series. We apply the theory of Markov models to derive asymptotic stability of this model class. The goal is to provide sets of conditions under which our model is geometrically ergodic and therefore satisfies certain mixing conditions. This result can be considered as the basis for an asymptotic theory for our model.
Authors: Jürgen Franke; Jean-Pierre Stockis; Joseph Tadjuidje Kamgaing
Type: preprint
Published: Fri, 12 Jan 2007 20:14:26 +0100

Nonparametric Estimates for Conditional Quantiles of Time Series
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/1454
We consider the problem of estimating the conditional quantile of a time series at time t given observations of the same and perhaps other time series available at time t-1. We discuss an estimate which we obtain by inverting a kernel estimate of the conditional distribution function, and prove its asymptotic normality and uniform strong consistency. We illustrate the good performance of the estimate for light- and heavy-tailed distributions of the innovations with a small simulation study.
Authors: Jürgen Franke; Peter Mwita
Type: preprint
Published: Wed, 19 Nov 2003 16:26:59 +0100

Multivariate First-Order Integer-Valued Autoregressions
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/734
Authors: Jürgen Franke; T. Rao Subba
Type: preprint
Published: Tue, 17 Oct 2000 00:00:00 +0200

Nonparametric Estimation in a Stochastic Volatility Model
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/1119
In this paper we derive nonparametric stochastic volatility models in discrete time. These models generalize parametric autoregressive random variance models, which have been applied quite successfully to financial time series. For the proposed models we investigate nonparametric kernel smoothers. It is seen that so-called nonparametric deconvolution estimators can be applied in this situation and that consistency results known for nonparametric errors-in-variables models carry over to the situation considered here.
Authors: Jürgen Franke; Wolfgang Härdle; Jens-Peter Kreiss
Type: preprint
Published: Mon, 28 Aug 2000 00:00:00 +0200

Portfolio management and market risk quantification using neural networks
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/1121
We discuss how neural networks may be used to estimate conditional means, variances and quantiles of financial time series nonparametrically. These estimates may be used to forecast, to derive trading rules and to measure market risk.
Authors: Jürgen Franke
Type: preprint
Published: Mon, 28 Aug 2000 00:00:00 +0200

Bootstrap of kernel smoothing in nonlinear time series
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/495
Kernel smoothing in nonparametric autoregressive schemes offers a powerful tool in modelling time series. In this paper it is shown that the bootstrap can be used for estimating the distribution of kernel smoothers. This can be done by mimicking the stochastic nature of the whole process in the bootstrap resampling or by generating a simple regression model. Consistency of these bootstrap procedures is shown.
Authors: Jürgen Franke; J.-P. Kreiss; E. Mammen
Type: preprint
Published: Mon, 03 Apr 2000 00:00:00 +0200

Bootstrapping neural networks
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/511
Knowledge about the distribution of a statistical estimator is important for various purposes, for example the construction of confidence intervals for model parameters or the determination of critical values of tests. A widely used method to estimate this distribution is the so-called bootstrap, which is based on an imitation of the probabilistic structure of the data generating process on the basis of the information provided by a given set of random observations. In this paper we investigate this classical method in the context of artificial neural networks used for estimating a mapping from input to output space. We establish consistency results for bootstrap estimates of the distribution of parameter estimates.
Authors: Jürgen Franke; Michael Neumann
Type: preprint
Published: Mon, 03 Apr 2000 00:00:00 +0200

General Kriging for Spatial-Temporal Processes with Random ARX-Regression Parameters
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/523
In the following, we discuss a procedure for interpolating a spatial-temporal stochastic process. We stick to a particular, moderately general model, but the approach can easily be transferred to other similar problems. The original data, which motivated this work, are measurements of gas concentrations (SO2, NO, O2) and several meteorological parameters (temperature, sun radiation, precipitation, wind speed etc.). These data have been and are still being recorded twice every hour at several irregularly located places in the forests of the state Rheinland-Pfalz as part of a program monitoring the air pollution in the forests.
Authors: Jürgen Franke; B. Gründer
Type: preprint
Published: Mon, 03 Apr 2000 00:00:00 +0200

Finanzinnovation (Grundlagen und Praxis der Optionspreisbestimmung)
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/525
Authors: Jürgen Franke; Klaus Schindler; Norbert Siedow
Type: preprint
Published: Mon, 03 Apr 2000 00:00:00 +0200

Nonlinear and Nonparametric Methods for Analyzing Financial Time Series
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/968
We consider nonparametric generalizations of various well-known financial time series models and study estimates of the trend and volatility functions and forecasts based on kernel smoothers as well as on neural networks.
Authors: Jürgen Franke
Type: preprint
Published: Fri, 18 Feb 2000 00:00:00 +0100

Bootstrap Autoregressive Order Selection
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/969
In this paper we deal with the problem of fitting an autoregression of order p to given data coming from a stationary autoregressive process of infinite order. The paper is mainly concerned with the selection of an appropriate order of the autoregressive model. Based on the so-called final prediction error (FPE), a bootstrap order selection can be proposed, because it turns out that one relevant expression occurring in the FPE is ready for the application of the bootstrap principle. Some asymptotic properties of the bootstrap order selection are proved. To carry out the bootstrap procedure, an autoregression with increasing but non-stochastic order is fitted to the given data. The paper is concluded by some simulations.
Authors: Jürgen Franke; Jens-Peter Kreiss; Martin Moser
Type: preprint
Published: Fri, 18 Feb 2000 00:00:00 +0100

Optimal portfolio management using neural networks - a case study
https://kluedo.ub.uni-kl.de/frontdoor/index/index/docId/970
Neural networks are now a well-established tool for solving classification and forecasting problems in financial applications (compare, e.g., Bol et al., 1996, Evans, 1997, Rehkugler and Zimmermann, 1994, Refenes, 1995, and Refenes et al., 1996a), though many practitioners are still suspicious of too-evident success stories. One reason may be that the construction of an appropriate network which provides a reasonable solution to a complex data-analytic problem is rarely made explicit in the literature. In this paper, we try to contribute to filling this gap by discussing in detail the problem of dynamically allocating capital to various components of a currency portfolio in such a manner that the average gain will be larger than for certain benchmark portfolios. We base our solution on feedforward neural networks which are constructed employing various statistical model selection procedures described in, e.g., Anders (1997) or Refenes et al. (1996b).

Neural networks which are used as the basis of trading strategies in finance should be assessed differently than in technical applications. The task is not to construct a network which provides good forecasts with respect to the mean-square error of some quantities of interest, or which provides a good approximation of some given target values, but to achieve a good performance in economic terms. For portfolio allocation, the main goal is to achieve on average a large return combined with a small risk. Therefore, we do not consider forecasts of the foreign exchange (FX) rate time series using neural networks; instead, we try to get the allocation directly as the output of a network. Furthermore, we do not minimize some estimation or prediction error, but try to maximize an economically meaningful performance measure, the risk-adjusted return, directly (compare also Heitkamp, 1996).

In the subsequent chapter, we describe the details of the portfolio allocation problem. The following two chapters provide some technical information on how the networks were fitted to the available data and how the network inputs and outputs were selected. In chapter 5, finally, we discuss the promising results.
Authors: Jürgen Franke; Matthias Klein
Type: preprint
Published: Fri, 18 Feb 2000 00:00:00 +0100
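The central idea of the last abstract, taking the allocation directly as the network output and maximizing risk-adjusted return instead of a forecasting error, can be sketched in a few lines. Everything below is an illustrative assumption rather than the paper's actual construction: the toy return data, the single softmax layer standing in for a full feedforward network, the particular objective (mean return minus a risk-aversion multiple of volatility), and the crude derivative-free training loop.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-ins for the FX return series and network inputs of the paper:
# 3 hypothetical assets over 500 periods, lagged returns as features.
T, k = 500, 3
asset_ret = rng.normal([0.001, 0.0002, -0.0001], 0.01, size=(T, k))
features = np.vstack([np.zeros((1, k)), asset_ret[:-1]])

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def allocation(theta):
    # The portfolio weights ARE the network output (one softmax layer
    # here; the paper employs richer feedforward architectures).
    W = theta[:k * k].reshape(k, k)
    b = theta[k * k:]
    return softmax(features @ W + b)

def risk_adjusted_return(theta, lam=5.0):
    # Economic objective: mean portfolio return minus lam * volatility,
    # maximized directly instead of any estimation or prediction error.
    port = (allocation(theta) * asset_ret).sum(axis=1)
    return port.mean() - lam * port.std()

# Greedy finite-difference ascent: accept a step only if it improves the
# objective (a placeholder for proper network training).
theta = np.zeros(k * k + k)
eps, lr = 1e-4, 5.0
for _ in range(100):
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        d = np.zeros_like(theta)
        d[i] = eps
        grad[i] = (risk_adjusted_return(theta + d)
                   - risk_adjusted_return(theta - d)) / (2 * eps)
    cand = theta + lr * grad
    if risk_adjusted_return(cand) > risk_adjusted_return(theta):
        theta = cand
```

The softmax output guarantees nonnegative weights summing to one, so the network directly emits a valid long-only allocation for every period; the benchmark comparison and the model selection steps described in the abstract are omitted here.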