### Refine

#### Year of publication

- 2004 (57)

#### Document Type

- Report (22)
- Doctoral Thesis (19)
- Preprint (9)
- Working Paper (2)
- Article (1)
- Diploma Thesis (1)
- Habilitation (1)
- Lecture (1)
- Periodical Part (1)

#### Language

- English (57)

#### Keywords

- Inverse problem (4)
- Regularization (4)
- Wavelet (3)
- Gravitational field (2)
- Sphere (2)
- MAC type grid (2)
- Multiscale analysis (2)
- SOC (2)
- White noise (2)
- auditory brainstem (2)

A spectral theory for constituents of macroscopically homogeneous random microstructures, modeled as homogeneous random closed sets, is developed and provided with a sound mathematical basis; the spectrum obtained by Fourier methods corresponds to the angular intensity distribution of X-rays scattered by this constituent. It is shown that the fast Fourier transform applied to three-dimensional images of microstructures obtained by micro-tomography is a powerful tool of image processing. The applicability of this technique is demonstrated in the analysis of images of porous media.
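
The FFT approach described in this abstract can be sketched in a few lines: compute the radially averaged power spectrum of a 3D indicator image of one constituent, which plays the role of the angular scattering intensity. The image below is synthetic random data, not material from the paper.

```python
import numpy as np

# Synthetic 3D indicator image of a "constituent" (illustrative, not real data)
rng = np.random.default_rng(0)
n = 32
img = (rng.random((n, n, n)) < 0.3).astype(float)  # 1 inside the constituent

f = np.fft.fftn(img - img.mean())          # fluctuation part -> no zero-frequency peak
power = np.abs(f) ** 2 / img.size          # periodogram of the microstructure

# Average the periodogram over spherical shells in frequency space
# to obtain a 1D (angularly averaged) spectrum:
k = np.fft.fftfreq(n) * n                  # integer frequencies
kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
shell = np.sqrt(kx**2 + ky**2 + kz**2).astype(int)
counts = np.bincount(shell.ravel())
spectrum = np.bincount(shell.ravel(), weights=power.ravel()) / counts
```

For a real micro-tomography image one would replace `img` by the binarized tomogram of the constituent; the radial averaging step is unchanged.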

No doubt: mathematics has become a technology in its own right, maybe even a key technology. Technology may be defined as the application of science to the problems of commerce and industry. And science? Science may be defined as developing, testing, and improving models for the prediction of system behavior; the language used to describe these models is mathematics, and mathematics provides methods to evaluate these models. Here we are! Why has mathematics become a technology only recently? Because it got a tool to evaluate complex, "near to reality" models: the computer! The model may be quite old - the Navier-Stokes equations describe flow behavior rather well - but solving these equations for realistic geometries and higher Reynolds numbers with sufficient precision remains a real challenge even for powerful parallel computers. Make the models as simple as possible and as complex as necessary - and then evaluate them with the help of efficient and reliable algorithms: these are genuine mathematical tasks.

Based on the well-known results of classical potential theory, viz. the limit and jump relations for layer integrals, a numerically viable and efficient multiscale method of approximating the disturbing potential from gravity anomalies is established on regular surfaces, i.e., on telluroids of ellipsoidal or even more structured geometric shape. The essential idea is to use scale dependent regularizations of the layer potentials occurring in the integral formulation of the linearized Molodensky problem to introduce scaling functions and wavelets on the telluroid. As an application of our multiscale approach some numerical examples are presented on an ellipsoidal telluroid.
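
The idea of scale-dependent regularization can be sketched with an illustrative kernel (not the paper's exact construction on the telluroid): the singular layer kernel is smoothed at each scale, and wavelet kernels arise as differences between consecutive scaling kernels, so that coarse kernel plus details telescopes to the fine-scale kernel.

```python
import numpy as np

# Illustrative regularization of the singular single-layer kernel 1/|x-y|:
# at scale tau it is replaced by a smooth version (my choice of smoothing,
# for demonstration only).
def scaling_kernel(dist, tau):
    return 1.0 / np.sqrt(dist**2 + tau**2)

def wavelet_kernel(dist, j):
    # Detail added when refining the scale from 2^-j to 2^-(j+1)
    return scaling_kernel(dist, 2.0 ** -(j + 1)) - scaling_kernel(dist, 2.0 ** -j)

d = np.linspace(0.01, 2.0, 200)   # distances |x - y|

# Telescoping multiscale reconstruction: coarse kernel + details = fine kernel
approx = scaling_kernel(d, 2.0 ** 0) + sum(wavelet_kernel(d, j) for j in range(8))
```

Letting the scale tend to zero recovers the original singular kernel; truncating the sum gives the regularized approximation at a chosen resolution.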

Inappropriate speed is the most common cause of road traffic accidents worldwide; thus, there is a need for speed management. The so-called SUNflower states - Sweden, the United Kingdom and the Netherlands - each of which invests considerable effort in traffic safety policy, have had great success in reducing mean road speeds and speed variances through speed management. However, the effect is still insufficient to achieve real traffic safety, so the use of technical in-vehicle devices is under discussion. One of these technologies, called Intelligent Speed Adaptation (ISA), reduces vehicle speeds either by warning the driver that he is speeding, by activating the accelerator pedal with a counterforce, or by reducing the fuel supply to the motor. These three ways of reducing speed are called versions 1-3. The EC project for research on speed adaptation policies on European roads (PROSPER) deals with strategic proposals for the implementation of the different ISA versions; this thesis includes selected results of PROSPER. Two empirical surveys were conducted to give an overview of the basic conditions (e.g. social, economic and technical aspects) for an ISA implementation in Germany: on the one hand, a stakeholder analysis and questionnaire using the Delphi method, carried out in two rounds; on the other hand, a questionnaire with speed offenders, likewise in two rounds. In addition, the author created an expert pool of 23 experts representing the most important fields of science and practice in which ISA is involved, and interviewed most of them by phone or in person; 12 experts also produced a detailed publication on their professional view of ISA. The two surveys and the professional comments on ISA led to four possible implementation scenarios for ISA in Germany.
However, due to strong political opposition against ISA it is also conceivable that ISA is not implemented at all, or that the implementation process starts only after 2015 (i.e. outside the period considered). The scenarios are as follows: A) implementation of version 1 by market forces with governmental subsidies; B) implementation of version 2 by market forces, supported by traffic safety institutions and image-making processes; C) implementation of a modified version 3, prescribed by law for speed offenders instead of cancellation of the driving licence; D) implementation of various versions in Germany as a result of a broad implementation of ISA in the SUNflower states; X) non-implementation of ISA, leading to the necessity of alternative speed management measures. The author prefers scenario B because - ceteris paribus - it seems to be the most likely way to implement the technology. As soon as ISA reaches technical maturity, the implementation process has to be accomplished in three steps: 1) marketing and image making, 2) market introduction, 3) market penetration. This implementation process for ISA by market forces could result in at least 15% of all vehicles being equipped with ISA before the year 2015.

In traditional portfolio optimization under the threat of a crash, the investment horizon or time to maturity is neglected. In developing the so-called crash hedging strategies (portfolio strategies which make an investor indifferent to the occurrence of an uncertain (down) jump of the price of the risky asset), the time to maturity turns out to be essential. The crash hedging strategies are derived as solutions of non-linear differential equations which are themselves consequences of an equilibrium strategy. Here the situation of changing market coefficients after a possible crash is considered, both for the case of logarithmic utility and for general utility functions. A benefit-cost analysis of the crash hedging strategy is carried out, as well as a comparison of the crash hedging strategy with the optimal portfolio strategies of traditional crash models. Moreover, it is shown that the crash hedging strategies optimize the worst-case bound for the expected utility from final wealth subject to some restrictions. Another application is to model crash hedging strategies in situations where both the number and the height of the crashes are uncertain but bounded. Taking into account the additional information of the probability of a possible crash leads to the development of the q-quantile crash hedging strategy.
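
The worst-case flavor of this problem can be illustrated numerically for logarithmic utility, restricted for simplicity to constant-in-time strategies (the thesis works with time-dependent equilibrium strategies). All parameter values below are hypothetical.

```python
import numpy as np

# Hypothetical market parameters: drift mu, volatility sigma, horizon T,
# maximal crash size k_max (fraction lost by the risky asset in a crash).
mu, sigma, T, k_max = 0.08, 0.25, 5.0, 0.2

pis = np.linspace(0.0, 1.2, 1201)   # candidate constant fractions in the risky asset

def u_no_crash(pi):
    # Expected log-utility of terminal wealth in a Black-Scholes market, no crash
    return (pi * mu - 0.5 * pi**2 * sigma**2) * T

def u_with_crash(pi, k):
    # A single crash of size k multiplies wealth by (1 - pi * k)
    return u_no_crash(pi) + np.log(1.0 - pi * k)

# Worst case over "no crash" and crash sizes up to k_max
# (for pi > 0 the worst scenario is a crash of maximal size):
worst = np.minimum(u_no_crash(pis), u_with_crash(pis, k_max))

pi_wc = pis[np.argmax(worst)]       # worst-case optimal constant strategy
pi_merton = mu / sigma**2           # classical crash-free (Merton) optimum
```

The worst-case optimizer `pi_wc` is strictly smaller than the Merton fraction: the investor gives up some crash-free return to bound the damage of a maximal crash, which is exactly the trade-off the crash hedging strategies resolve in equilibrium.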

Algebraic Systems Theory
(2004)

Control systems are usually described by differential equations, but their properties of interest are most naturally expressed in terms of the system trajectories, i.e., the set of all solutions to the equations. This is the central idea behind the so-called "behavioral approach" to systems and control theory. On the other hand, the manipulation of linear systems of differential equations can be formalized using algebra, more precisely, module theory and homological methods ("algebraic analysis"). The relationship between modules and systems is very rich, in fact, it is a categorical duality in many cases of practical interest. This leads to algebraic characterizations of structural systems properties such as autonomy, controllability, and observability. The aim of these lecture notes is to investigate this module-system correspondence. Particular emphasis is put on the application areas of one-dimensional rational systems (linear ODE with rational coefficients), and multi-dimensional constant systems (linear PDE with constant coefficients).
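
The algebraic characterization of controllability can be made concrete with a small symbolic computation: a behavior given as the kernel of a polynomial matrix R(d/dt) is controllable iff R(lam) has full generic rank for every complex lam, i.e. the maximal minors of R have no common root. The two matrices below are toy examples of mine, not taken from the lecture notes.

```python
import sympy as sp
from itertools import combinations

lam = sp.symbols('lam')

def uncontrollable_modes(R):
    """Frequencies lam where R(lam) loses rank (common roots of the maximal minors)."""
    r = R.rank()                      # generic rank over rational functions in lam
    g = sp.Integer(0)
    for ri in combinations(range(R.rows), r):
        for ci in combinations(range(R.cols), r):
            # gcd of all r x r minors; gcd(0, f) = f starts the accumulation
            g = sp.gcd(g, R.extract(list(ri), list(ci)).det())
    return sp.solve(g, lam)           # empty list <=> controllable

# w1' = w2 as a kernel representation: R = [lam, -1] never loses rank -> controllable
R1 = sp.Matrix([[lam, -1]])

# Two decoupled autonomous equations w_i' = w_i: rank drops at lam = 1 -> not controllable
R2 = sp.Matrix([[lam - 1, 0], [0, lam - 1]])
```

The same minor/gcd computation underlies the module-theoretic test: the rank-drop frequencies are precisely the support of the torsion part of the system module.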

Porous media flow of polymers with Carreau-law viscosities and its application to enhanced oil recovery (EOR) is considered. Applying the homogenization method leads to a nonlinear two-scale problem. In the case of a small difference between the Carreau and the Newtonian behavior, an asymptotic expansion based on the small deviation of the viscosity from the Newtonian case is introduced. For uni-directional pressure gradients, which is a reasonable assumption in applications like EOR, auxiliary problems to decouple the micro- from the macrovariables are derived. The microscopic flow field obtained by the proposed approach is compared to the solution of the two-scale problem. Finite element calculations for an isotropic and an anisotropic pore cell geometry are used to validate the accuracy and speed-up of the proposed approach. The order of accuracy has been studied by performing the simulations up to the third-order expansion for the isotropic geometry.
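
The Carreau law itself is standard; a short sketch with illustrative parameter values (not those of the paper) shows the shear-thinning viscosity and the Newtonian limit n -> 1 around which such an asymptotic expansion is built.

```python
import numpy as np

def carreau(gamma_dot, eta0=1.0, eta_inf=0.1, lam=2.0, n=0.6):
    """Carreau-law viscosity as a function of the shear rate gamma_dot:
    eta = eta_inf + (eta0 - eta_inf) * (1 + (lam * gamma_dot)^2)^((n-1)/2).
    Parameter values here are illustrative, not fitted to any polymer."""
    return eta_inf + (eta0 - eta_inf) * (1.0 + (lam * gamma_dot) ** 2) ** ((n - 1) / 2)

g = np.logspace(-3, 3, 7)            # shear rates over six decades
eta = carreau(g)                     # shear-thinning profile for n < 1

# Small deviation from the Newtonian case (n close to 1), the regime in which
# the expansion in the viscosity deviation is justified:
eta_nearly_newtonian = carreau(g, n=1.0 - 1e-3)
```

For n = 1 the power-law factor equals 1 and the viscosity is the constant eta0, recovering the Newtonian fluid; the expansion treats 1 - n (equivalently, the viscosity deviation) as the small parameter.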

The hypoxia inducible factor-1 (HIF-1), a heterodimer composed of HIF-1alpha and HIF-1beta, is activated in response to low oxygen tension and serves as the master regulator for cells to adapt to hypoxia. HIF-1 is usually considered to be regulated via degradation of its alpha-subunit. Recent findings, however, point to the existence of alternative mechanisms of HIF-1 regulation which appear to be important for down-regulating HIF-1 under prolonged and severe oxygen depletion. The aims of my Ph.D. thesis, therefore, were to further elucidate mechanisms involved in such down-regulation of HIF-1. The first part of the thesis addresses the impact of the severity and duration of oxygen depletion on HIF-1alpha protein accumulation and HIF-1 transcriptional activity. A special focus was put on the influence of the transcription factor p53 on HIF-1. I found that p53 only accumulates under prolonged anoxia (but not hypoxia), thus limiting its influence on HIF-1 to severe hypoxic conditions. At low expression levels, p53 inhibits HIF-1 transactivity. I attributed this effect to a competition between p53 and HIF-1alpha for binding to the transcriptional co-factor p300, since p300 overexpression reverses this inhibition. This assumption is corroborated by competitive binding of IVTT-generated p53 and HIF-1alpha to the CH1-domain of p300 in vitro. High p53 expression, on the other hand, affects HIF-1alpha protein negatively, i.e., p53 provokes pVHL-independent degradation of HIF-1alpha. Therefore, I conclude that low p53 expression attenuates HIF-1 transactivation by competing for p300, while high p53 expression negatively affects HIF-1alpha protein, thereby eliminating HIF-1 transactivity. Thus, once p53 becomes activated under prolonged anoxia, it contributes to terminating HIF-1 responses. 
In the second part of my study, I intended to further characterize the effects induced by prolonged periods of low oxygen, i.e., hypoxia, as compared to anoxia, with respect to alterations in HIF-1alpha mRNA. Prolonged anoxia, but not hypoxia, showed pronounced effects on HIF-1alpha mRNA. Long-term anoxia induced destabilization of HIF-1alpha mRNA, which manifests itself in a dramatic reduction of the half-life. The mechanistic background points to natural anti-sense HIF-1alpha mRNA, which is induced in a HIF-1-dependent manner, and additional factors, which most likely influence HIF-1alpha mRNA indirectly via anti-sense HIF-1alpha mRNA mediated trans-effects. In summary, the data provide new information concerning the impact of p53 on HIF-1, which might be of importance for the decision between pro- and anti-apoptotic mechanisms depending upon the severity and duration of hypoxia. Furthermore, the results of this project give further insights into a novel mechanism of HIF-1 regulation, namely mRNA down-regulation under prolonged anoxic incubations. These mechanisms appear to be activated only in response to prolonged anoxia, but not to hypoxia. These considerations regarding HIF-1 regulation should be taken into account when prolonged incubations under hypoxic or anoxic conditions are analyzed at the level of HIF-1 stability regulation.

Nowadays one of the major objectives in the geosciences is the determination of the gravitational field of our planet, the Earth. A precise knowledge of this quantity is not just interesting in its own right; it is a key point for a vast number of applications. The important question is how to obtain a good model for the gravitational field on a global scale. The only applicable solution - both in cost and in data coverage - is the use of satellite data. We concentrate on the highly precise measurements which will be obtained by GOCE (Gravity Field and Steady-State Ocean Circulation Explorer, launch expected 2006). This satellite carries a gradiometer which returns the second derivatives of the gravitational potential. Mathematically, we have to deal with several obstacles. The first is that the noise in the different components of these second derivatives differs over several orders of magnitude, i.e. a straightforward solution of this outer boundary value problem will not work properly. Furthermore, we are not interested in the data at satellite height but want to know the field at the Earth's surface; thus we need a regularization (downward continuation) of the data. These two problems are tackled in the thesis and are now described briefly. Split operators: We have to solve an outer boundary value problem at the height of the satellite track. Classically, one can handle first-order side conditions which are not tangential to the surface, and second derivatives pointing in the radial direction, employing integral and pseudo-differential equation methods. We present a different approach: we classify all first- and purely second-order operators with the property that a harmonic function stays harmonic under their application. This is done using modern algebraic methods for solving systems of partial differential equations symbolically. Now we can treat the problem with oblique side conditions as if we had ordinary, i.e. non-derived, side conditions.
The only additional work which has to be done is an inversion of the differential operator, i.e. integration. In particular, we are capable of dealing with derivatives which are tangential to the boundary. Auto-regularization: The second obstacle is finding a proper regularization procedure. This is complicated by the fact that we are facing stochastic rather than deterministic noise. The main question is how to find an optimal regularization parameter, which is impossible without additional knowledge. However, we can show that with a very limited amount of additional information, which is also obtainable in practice, we can regularize in an asymptotically optimal way. In particular, we show that the knowledge of two input data sets allows an order-optimal regularization procedure even under the hard conditions of Gaussian white noise and an exponentially ill-posed problem. A last but rather simple task is combining data from different derivatives, which can be done by a weighted least-squares approach using the information obtained from the regularization procedure. A practical application to the downward-continuation problem for simulated gravitational data is shown.
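
A generic illustration of the regularization issue (standard Tikhonov regularization with the discrepancy principle, not the thesis's auto-regularization scheme) for an exponentially ill-posed diagonal problem under Gaussian white noise:

```python
import numpy as np

# Diagonal model problem: A has rapidly decaying singular values s_j, mimicking
# the exponential ill-posedness of downward continuation. All numbers here are
# illustrative, not simulated gravitational data.
rng = np.random.default_rng(1)
J = 30
s = np.exp(-0.3 * np.arange(1, J + 1))       # exponentially decaying singular values
x_true = 1.0 / (1.0 + np.arange(J)) ** 2     # smooth true coefficients
delta = 1e-3                                 # noise level
y = s * x_true + delta * rng.standard_normal(J)   # noisy data at "satellite height"

def tikhonov(alpha):
    # Tikhonov-regularized inverse applied to y, componentwise for a diagonal A
    return s * y / (s**2 + alpha)

# Morozov's discrepancy principle: take the largest alpha whose residual
# matches the noise level (tau is a safety factor slightly above 1).
tau, alphas = 1.5, np.logspace(0, -12, 200)
for alpha in alphas:
    if np.linalg.norm(s * tikhonov(alpha) - y) <= tau * delta * np.sqrt(J):
        break
```

The regularized solution is dramatically more accurate than naive inversion `y / s`, which amplifies the noise in the small-singular-value components; the thesis's point is that with suitable extra information (e.g. two input data sets) the parameter can even be chosen in an order-optimal way.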

Compared to our current knowledge of neuronal excitation, little is known about the development and maturation of inhibitory circuits. Recent studies show that inhibitory circuits develop and mature in a similar way to excitatory circuits; one such similarity is development through excitation, irrespective of their inhibitory nature. In the current study, I used the inhibitory projection between the medial nucleus of the trapezoid body (MNTB) and the lateral superior olive (LSO) as a model system to unravel some aspects of the development of inhibitory synapses. In LSO neurons of the rat auditory brainstem, glycine receptor-mediated responses change from depolarizing to hyperpolarizing during the first two postnatal weeks (Kandler and Friauf 1995, J. Neurosci. 15:6890-6904). The depolarizing effect of glycine is due to a high intracellular chloride concentration ([Cl-]i), which induces a reversal potential of glycine (EGly) more positive than the resting membrane potential (Vrest). In older LSO neurons, the hyperpolarizing effect is due to a low [Cl-]i (Ehrlich et al., 1999, J. Physiol. 520:121-137). The aim of the present study was to elucidate the molecular mechanism behind Cl- homeostasis in LSO neurons, which determines the polarity of the glycine response. To this end, the role and developmental expression of Cl- cotransporters such as NKCC1 and KCC2 were investigated. Molecular biological and gramicidin perforated patch-clamp experiments revealed the role of KCC2 as an outward Cl- cotransporter in mature LSO neurons (Balakrishnan et al., 2003, J Neurosci. 23:4134-4145). However, NKCC1 does not appear to be involved in accumulating chloride in immature LSO neurons. Further experiments indicated a role of the GABA and glycine transporters (GAT1 and GLYT2) in accumulating Cl- in immature LSO neurons. Finally, experiments with hypothyroid animals suggest a possible role of thyroid hormone in the maturation of inhibitory synapses.
Altogether, this thesis addressed the molecular mechanisms underlying Cl- regulation in LSO neurons and deciphered them to some extent.