Kaiserslautern - Fachbereich Mathematik
Year of publication: 2007
The main aim of this work was to obtain approximate solutions of seismic traveltime tomography problems with the help of splines based on reproducing kernel Sobolev spaces. In order to apply the spline approximation concept to surface wave as well as body wave tomography problems, the spherical spline approximation concept was extended to the case where the domain of the function to be approximated is an arbitrary compact set in R^n and a finite number of discontinuity points is allowed. We present applications of this spline method to seismic surface wave as well as body wave tomography and discuss the theoretical and numerical aspects of these applications. Moreover, we report numerous numerical tests that support the theoretical considerations.
In this paper we construct spline functions based on a reproducing kernel Hilbert space to interpolate/approximate the velocity field of earthquake waves inside the Earth from traveltime data for an inhomogeneous grid of sources (hypocenters) and receivers (seismic stations). Theoretical aspects, including error estimates and convergence results, as well as numerical results are presented.
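As a rough illustration of the kernel-spline interpolation idea described above (not the actual Sobolev-space kernel used in these works), the following sketch interpolates scattered data in R^2 with a Gaussian kernel standing in for the reproducing kernel; all point and value data are made up for the example.

```python
import numpy as np

def kernel(x, y, eps=1.0):
    # Gaussian kernel as a stand-in for a reproducing kernel of a Sobolev space
    return np.exp(-eps * np.sum((x - y) ** 2))

def fit_spline(points, values, eps=1.0):
    # Solve the interpolation system K c = y for the spline coefficients
    n = len(points)
    K = np.array([[kernel(points[i], points[j], eps) for j in range(n)]
                  for i in range(n)])
    return np.linalg.solve(K, values)

def evaluate(coeffs, points, x, eps=1.0):
    # Spline value: linear combination of kernels centred at the data points
    return sum(c * kernel(p, x, eps) for c, p in zip(coeffs, points))

# Scattered sample points in R^2 (e.g. station positions) with synthetic data
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
vals = np.array([1.0, 2.0, 2.0, 3.0])
c = fit_spline(pts, vals)
print(evaluate(c, pts, pts[0]))  # interpolation reproduces the data value 1.0
```

The interpolation property follows because the kernel matrix K is symmetric positive definite for distinct centres, so the linear system has a unique solution.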
The increasing number of fields in which large quantities of data are collected generates a growing demand for methods that extract relevant information from huge databases. Among the various existing data mining models, decision trees are widely used since they represent a good trade-off between accuracy and interpretability. However, one of their main problems is that they are very unstable, which complicates the knowledge discovery process because users are disturbed by the different decision trees generated from almost the same input learning samples. In the current work, binary tree classifiers are analyzed and partially improved. The analysis of tree classifiers ranges from their topology from the graph theory point of view to the creation of a new tree classification model that combines decision trees with soft comparison operators (Mlynski, 2003), with the purpose not only of overcoming the well-known instability problem of decision trees, but also of conferring the ability to deal with uncertainty. In order to study and compare the structural stability of tree classifiers, we propose an instability coefficient based on the notion of Lipschitz continuity and offer a metric to measure the proximity between decision trees. The main part of this thesis presents our model, the "Soft Operators Decision Tree" (SODT). We describe its construction, its application and the consistency of the mathematical formulation behind it. Finally, we show the results of the implementation of SODT and numerically compare the stability and accuracy of a SODT and a crisp DT. The numerical simulations support the stability hypothesis, and a smaller tendency to overfit the training data is observed with SODT than with a crisp DT.
A further aspect of this inclusion of soft operators is that we choose them such that the resulting goodness function (used by this method) is differentiable, which allows the best split points to be computed by gradient descent methods. The main drawback of SODT is the incorporation of the imprecision factor, which increases the complexity of the algorithm.
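A minimal sketch of the idea of a differentiable split criterion, under assumptions of our own (a sigmoid membership function and a within-branch squared-error goodness, not necessarily the operators or the goodness function used in the thesis):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def soft_goodness(s, x, y, beta=5.0):
    # Soft membership: degree to which each sample falls into the left branch
    left = sigmoid(beta * (s - x))
    wl, wr = left.sum(), (1.0 - left).sum()
    # Weighted within-branch squared error -- smooth in s, unlike a crisp split
    ml = (left * y).sum() / max(wl, 1e-12)
    mr = ((1.0 - left) * y).sum() / max(wr, 1e-12)
    return (left * (y - ml) ** 2).sum() + ((1.0 - left) * (y - mr) ** 2).sum()

def best_split(x, y, s0, lr=0.01, steps=200, h=1e-5):
    # Gradient descent on the split point, here with a numerical derivative
    s = s0
    for _ in range(steps):
        grad = (soft_goodness(s + h, x, y) - soft_goodness(s - h, x, y)) / (2 * h)
        s -= lr * grad
    return s

# Toy data: two well-separated clusters with labels 0 and 1
x = np.array([0.0, 0.1, 0.2, 0.8, 0.9, 1.0])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
s = best_split(x, y, s0=0.4)
print(s)  # a split point between the two clusters
```

Because the criterion is smooth in s, small perturbations of the data move the optimal split point only slightly, which is the intuition behind the improved stability.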
In this thesis we classify simple coherent sheaves on Kodaira fibers of types II, III and IV (cuspidal and tacnode cubic curves and a plane configuration of three concurrent lines). Indecomposable vector bundles on smooth elliptic curves were classified in 1957 by Atiyah. In works of Burban, Drozd and Greuel it was shown that the categories of vector bundles and coherent sheaves on cycles of projective lines are tame. It turns out that all other degenerations of elliptic curves are vector-bundle-wild. Nevertheless, we prove that the category of coherent sheaves on an arbitrary reduced plane cubic curve (including the mentioned Kodaira fibers) is brick-tame. The main technical tool of our approach is the representation theory of bocses. Although this technique has mainly been used for purely theoretical purposes, we illustrate its computational potential for investigating tame behavior in wild categories. In particular, it allows us to prove that a simple vector bundle on a reduced cubic curve is determined by its rank, multidegree and determinant, generalizing Atiyah's classification. Our approach leads to an interesting class of bocses, which can be wild but are brick-tame.
Given an undirected, connected network G = (V,E) with weights on the edges, the cut basis problem asks for a maximal number of linearly independent cuts such that the sum of the cut weights is minimized. Surprisingly, this problem has not attracted as much attention as its graph-theoretic counterpart, the cycle basis problem. We consider two versions of the problem, the unconstrained and the fundamental cut basis problem. In the unconstrained case, where the cuts in the basis can be of an arbitrary kind, the problem can be written as a multiterminal network flow problem and is thus solvable in strongly polynomial time. The complexity of this algorithm improves on that of the best algorithms for the cycle basis problem, so that it is preferable for cycle basis problems in planar graphs. In contrast, the fundamental cut basis problem, where all cuts in the basis are obtained by deleting one edge each from a spanning tree T, is shown to be NP-hard. We present heuristics and integer programming formulations and summarize first experiences with numerical tests.
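To make the notion of a fundamental cut basis concrete, the following sketch computes, for a given spanning tree, the cut induced by each tree edge; the small example graph and the choice of tree are our own illustration, not taken from the paper.

```python
from collections import defaultdict

def fundamental_cut_basis(edges, tree_edges):
    # edges: list of (u, v, w); tree_edges: a chosen spanning tree (|V|-1 edges).
    # Removing one tree edge splits the tree into two components; the induced
    # fundamental cut consists of all graph edges crossing that partition.
    adj = defaultdict(list)
    for u, v in tree_edges:
        adj[u].append(v)
        adj[v].append(u)
    cuts = []
    for (a, b) in tree_edges:
        # Collect the component containing a in the tree minus the edge (a, b)
        comp, stack = {a}, [a]
        while stack:
            u = stack.pop()
            for v in adj[u]:
                if (u, v) not in ((a, b), (b, a)) and v not in comp:
                    comp.add(v)
                    stack.append(v)
        cut = [(u, v, w) for (u, v, w) in edges if (u in comp) != (v in comp)]
        cuts.append(cut)
    return cuts

# 4-cycle with one chord, and a star spanning tree rooted at vertex 0
edges = [(0, 1, 1), (1, 2, 2), (2, 3, 1), (3, 0, 2), (0, 2, 3)]
tree = [(0, 1), (0, 2), (0, 3)]
basis = fundamental_cut_basis(edges, tree)
total = sum(w for cut in basis for (_, _, w) in cut)
print(total)  # total weight of this fundamental cut basis: 12
```

The fundamental cut basis problem then asks for the spanning tree whose induced basis minimizes this total weight, which is what makes the problem hard.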
The lattice Boltzmann method (LBM) is a numerical solver for the Navier-Stokes equations based on an underlying molecular dynamics model. Recently, it has been extended towards the simulation of complex fluids. We use the asymptotic expansion technique to investigate the standard scheme, the initialization problem and possible developments towards moving boundary and fluid-structure interaction problems. At the same time, it will be shown how the mathematical analysis can be used to understand and improve the algorithm. First of all, we elaborate the tool "asymptotic analysis", proposing a general formulation of the technique and explaining the methods and the strategy we use for the investigation. A first standard application to the LBM is described, which leads to the approximation of the Navier-Stokes solution starting from the lattice Boltzmann equation. Next, we extend the analysis to investigate the origin and dynamics of initial layers. A class of initialization algorithms to generate accurate initial values within the LB framework is described in detail. Starting from existing routines, we are able to improve the schemes in terms of efficiency and accuracy. Then we study the features of a simple moving boundary LBM. In particular, we concentrate on the initialization of new fluid nodes created by the variations of the computational fluid domain. An overview of existing possible choices is presented. Performing a careful analysis of the problem, we propose a modified algorithm which produces satisfactory results. Finally, to set up an LBM for fluid-structure interaction, efficient routines to evaluate forces are required. We describe the momentum exchange algorithm (MEA). Precise accuracy estimates are derived, and the analysis leads to the construction of an improved method to evaluate the interface stresses. In conclusion, we test the resulting code and validate the results of the analysis on several simple benchmarks.
From the theoretical point of view, we have developed in this thesis a general formulation of the asymptotic expansion, which is expected to offer a more flexible tool for the investigation of numerical methods. The main practical contribution of this work is the detailed analysis of the numerical method. It allows us to understand and improve the algorithms and to construct new routines, which can be considered as starting points for future research.
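For readers unfamiliar with the collide-and-stream structure of the LBM, here is a deliberately tiny toy scheme (a 1D D1Q3 BGK model for pure diffusion on a periodic domain), illustrating only the two basic steps; it is our own didactic example, not the scheme analyzed in the thesis.

```python
import numpy as np

# Minimal 1D lattice Boltzmann sketch (D1Q3, BGK collision) for diffusion:
# three velocities c = {0, +1, -1} with weights w = {2/3, 1/6, 1/6}.
nx, steps, omega = 50, 100, 1.0
w = np.array([2 / 3, 1 / 6, 1 / 6])   # rest, right-moving, left-moving
rho0 = np.ones(nx)
rho0[nx // 2] += 1.0                   # initial density bump
f = np.array([wi * rho0 for wi in w])  # equilibrium initialization

for _ in range(steps):
    rho = f.sum(axis=0)
    # BGK collision: relax each population towards the equilibrium w_i * rho
    for i in range(3):
        f[i] += omega * (w[i] * rho - f[i])
    # Streaming: shift populations along their velocities (periodic domain)
    f[1] = np.roll(f[1], 1)            # c = +1
    f[2] = np.roll(f[2], -1)           # c = -1

rho = f.sum(axis=0)
print(rho.sum())  # total mass 51.0 is conserved; the bump diffuses outwards
```

Collision conserves the density and streaming merely moves it, so mass conservation is exact; the initial bump smooths out, mimicking the diffusive behavior the asymptotic expansion makes precise.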
The scope of this diploma thesis is to examine the four generations of asset pricing models and the corresponding volatility dynamics that have been developed so far. We proceed as follows: In chapter 1 we give a brief review of the Black-Scholes first generation model, which assumes a constant volatility, and we show that volatility should not be modeled as constant by examining statistical data and introducing the notion of implied volatility. In chapter 2 we examine the simplest models that are able to produce smiles or skews: local volatility models. These are called second generation models. Local volatility models describe the volatility as a function of the stock price and time. We start with the work of Dupire, show how local volatility models can be calibrated and end with a detailed discussion of the constant elasticity of variance model. Chapter 3 focuses on the Heston model, which represents the class of stochastic volatility models, in which the volatility itself is driven by a stochastic process. These are called third generation models. We introduce the model structure, derive a partial differential pricing equation, give a closed-form solution for European calls by solving this equation and explain how the model is calibrated. The last part of chapter 3 then deals with the limits and the mis-specifications of the Heston model, in particular for recent exotic options like reverse cliquets, accumulators or Napoleons. In chapter 4 we then introduce the Bergomi forward variance model, called the fourth generation model, as a consequence of the limits of the Heston model explained in chapter 3. The Bergomi model is a stochastic local volatility model: the spot price is modeled as a constant elasticity of variance diffusion whose volatility parameters are functions of the so-called forward variances, which are specified as stochastic processes.
We start with the model specification, derive a partial differential pricing equation, show how the model has to be calibrated and end with pricing examples and a concluding discussion.
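To fix the two notions from chapter 1, the following sketch prices a European call under the constant-volatility Black-Scholes model and recovers the implied volatility by bisection; the parameter values are illustrative only.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    # Black-Scholes price of a European call under constant volatility
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def implied_vol(price, S, K, T, r, lo=1e-4, hi=5.0, tol=1e-8):
    # Implied volatility by bisection: bs_call is increasing in sigma
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

p = bs_call(100.0, 100.0, 1.0, 0.05, 0.2)
print(round(implied_vol(p, 100.0, 100.0, 1.0, 0.05), 4))  # recovers 0.2
```

Inverting market prices in this way across strikes and maturities produces the smiles and skews that motivate the second, third and fourth generation models.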
While in classical scheduling theory the locations of machines are assumed to be fixed, we show how to tackle location and scheduling problems simultaneously. Obviously, this integrated approach enhances the modeling power of scheduling for various real-life problems. In this paper we present, in an exemplary way, theory and a solution algorithm for a specific type of scheduling problem combined with a rather general planar location problem. More general results and a report on numerical tests will be presented in a subsequent paper.
This paper deals with the problem of determining the sea surface topography from the geostrophic flow of ocean currents on local domains of the spherical Earth. In mathematical terms, the problem amounts to the solution of a spherical differential equation relating the surface curl gradient of a scalar field (sea surface topography) to a surface divergence-free vector field (geostrophic ocean flow). At first, a continuous solution theory is presented in the framework of an integral formula involving the Green's function of the spherical Beltrami operator. Different criteria derived from spherical vector analysis are given to investigate uniqueness. Second, for practical applications the Green's function is replaced by a regularized counterpart. The solution is obtained by a convolution of the flow field with a scaled version of the regularized Green's function. Calculating locally without boundary correction would lead to errors near the boundary. To avoid these Gibbs phenomena, we additionally consider the boundary integral of the corresponding region on the sphere which occurs in the integral formula of the solution. For reasons of simplicity, we first discuss a spherical cap, that is, a region with a continuously differentiable (regular) boundary curve. In a second step we concentrate on a more complicated domain whose boundary curve is not continuously differentiable, namely a rectangular region. It turns out that the boundary integral contributes a major part to stabilizing and reconstructing the approximation of the solution in our multiscale procedure.
We consider the problem of estimating the conditional quantile of a time series at time \(t\) given observations of the same and perhaps other time series available at time \(t-1\). We discuss sieve estimates, which are nonparametric versions of the Koenker-Bassett regression quantiles and do not require the specification of the innovation law. We prove consistency of these estimates and illustrate their good performance for light- and heavy-tailed innovation distributions in a small simulation study. As an economic application, we use the estimates to calculate the value at risk of some stock price series.
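The connection between the Koenker-Bassett check loss, quantiles and value at risk can be sketched as follows; this toy example uses simulated heavy-tailed returns and a simple grid search rather than the sieve estimates of the paper.

```python
import numpy as np

def pinball_loss(q, y, alpha):
    # Koenker-Bassett check (pinball) loss: minimized by the alpha-quantile of y
    u = y - q
    return np.mean(np.maximum(alpha * u, (alpha - 1.0) * u))

def quantile_estimate(y, alpha, grid):
    # Minimize the pinball loss over a grid of candidate quantile values
    losses = [pinball_loss(q, y, alpha) for q in grid]
    return grid[int(np.argmin(losses))]

# Simulated heavy-tailed daily returns (Student t with 4 degrees of freedom)
rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=5000) * 0.01
grid = np.linspace(returns.min(), returns.max(), 2000)
q05 = quantile_estimate(returns, 0.05, grid)
var_95 = -q05  # 95% value at risk: loss exceeded with probability 5%
print(var_95)
```

The loss-minimization view is what generalizes to the conditional setting: replacing the constant candidate q by a function of the observations at time \(t-1\), chosen from a sieve of function classes, yields the regression quantile estimates studied in the paper.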