
The many uses of fiber-reinforced composites, for example in the enclosures of planes, boats, and cars, generate the demand for a detailed analysis of these materials. The final goal is to optimize fibrous materials by means of "virtual material design". New fibrous materials are virtually created as realizations of a stochastic model and evaluated with physical simulations. In this way, materials can be optimized for specific use cases without constructing expensive prototypes or performing mechanical experiments. In order to design a practically fabricable material, the stochastic model is first adapted to an existing material and then slightly modified. The virtual reconstruction of the existing material requires precise knowledge of the geometry of its microstructure. The first part of this thesis describes a fiber quantification method based on local measurements of the fiber radius and orientation. The combination of a sparse chord length transform and inertia moments leads to an efficient and precise new algorithm. It outperforms existing approaches: it can treat different fiber radii within one sample, achieves high precision in continuous space, and has comparably fast computing time. This local quantification method can be applied directly to gray-value images by adapting the directional distance transforms to gray values. In this work, several approaches of this kind are developed and evaluated. Further characterization of the fiber system requires a segmentation of each single fiber. Using basic morphological operators with specific structuring elements, it is possible to derive, for each pixel, a probability that the pixel belongs to a fiber core in a region without overlapping fibers. Tracking high probabilities leads to a partial reconstruction of the fiber cores in non-crossing regions.
These core parts are then reconnected across critical regions if they fulfill certain conditions ensuring that they belong to the same fiber. In the second part of this work, we develop a new stochastic model for dense systems of non-overlapping fibers with a controllable level of bending. Existing approaches in the literature have at least one weakness: achieving high volume fractions, producing non-overlapping fibers, or controlling the bending or the orientation distribution. This gap is bridged by our stochastic model, which operates in two steps. First, a random walk with the multivariate von Mises-Fisher orientation distribution defines bent fibers. Second, a force-biased packing approach arranges them in a non-overlapping configuration. Furthermore, we show how to estimate all parameters needed to fit this model to a real microstructure. Finally, we simulate the macroscopic behavior of different microstructures to derive their mechanical and thermal properties. This part is mostly supported by existing software and serves as a summary of physical simulation applied to random fiber systems. Application to a glass-fiber-reinforced polymer confirms the quality of the reconstruction by our stochastic model, as the effective properties of the real microstructure and of realizations of the fitted model match. This thesis includes all steps needed to successfully perform virtual material design on various data sets. With novel and efficient algorithms it contributes to the science of analysis and modeling of fiber-reinforced materials.
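As a rough illustration of the first modeling step, here is a minimal sketch of generating a single bent fiber as a random walk. This is hypothetical code, not from the thesis: the bending is controlled by a Gaussian perturbation of the step direction (a simple stand-in for sampling from the multivariate von Mises-Fisher distribution), and the force-biased packing step is omitted entirely.

```python
import numpy as np

def random_bent_fiber(n_steps=100, step=1.0, kappa=50.0, seed=None):
    """Generate a bent fiber as a random walk in 3-D: each step direction
    is a small random perturbation of the previous one, so larger kappa
    means stiffer (less bent) fibers. Hypothetical stand-in for the
    von Mises-Fisher sampling used in the stochastic fiber model."""
    rng = np.random.default_rng(seed)
    points = [np.zeros(3)]
    direction = np.array([0.0, 0.0, 1.0])
    for _ in range(n_steps):
        # Perturb the direction; the noise scale 1/sqrt(kappa) mimics the
        # concentration parameter of a von Mises-Fisher distribution.
        direction = direction + rng.normal(scale=1.0 / np.sqrt(kappa), size=3)
        direction /= np.linalg.norm(direction)
        points.append(points[-1] + step * direction)
    return np.array(points)  # polyline of shape (n_steps + 1, 3)
```

A full model realization would draw many such fibers and then resolve overlaps with a force-biased packing step, which is beyond this sketch.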

For many years, real-time task models have focused on timing constraints in the form of execution windows, defined by earliest start times and deadlines, to ensure feasibility.
However, the utility of some applications varies among scenarios that all yield correct behavior, and maximizing this utility improves resource utilization.
For example, target-sensitive applications have a target point where execution yields maximum utility, and an execution window for feasibility.
Execution around this point and within the execution window is allowed, albeit at lower utility.
The steepness of the utility decay reflects the importance of the application.
Examples of such applications include multimedia and control; multimedia applications are very popular nowadays, and control applications are present in every automated system.
In this thesis, we present a novel real-time task model that provides simple abstractions to express the timing constraints of target-sensitive RT applications: the gravitational task model.
This model uses a simple gravity pendulum (or bob pendulum) system to visualize the trade-offs among target-sensitive RT applications.
We regard jobs as objects in a pendulum system, with the target points as the central point.
The equilibrium state of the physical system then corresponds to the best compromise among jobs with conflicting targets.
Analogies with well-known systems help bridge the gap between application requirements and the theoretical abstractions used in task models.
For instance, so-called nature-inspired algorithms use key elements of physical processes as the basis of optimization algorithms; examples include ant colony optimization and simulated annealing, which have been applied to problems such as the knapsack and traveling salesman problems.
We also present a few scheduling algorithms designed for the gravitational task model that fulfill the requirements for on-line adaptivity.
The scheduling of target-sensitive RT applications must account for timing constraints and for the trade-off among tasks with conflicting targets.
Our proposed scheduling algorithms use the equilibrium-state concept to order the execution of jobs and compute the deviation of jobs from their target points to increase system utility.
The execution sequence of jobs has a significant impact on the equilibrium and dominates the complexity of the problem: finding the optimal sequence is NP-hard.
We show the efficacy of our approach through simulation results and three target-sensitive RT applications enhanced with the gravitational task model.
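The equilibrium idea can be illustrated with a toy computation. Assuming a quadratic utility decay around each target (a simplification for illustration; this is not necessarily the decay used in the thesis), the equilibrium start of a block of jobs executed back-to-back has a closed form: a weighted average of the targets shifted by each job's offset within the block.

```python
def equilibrium_start(jobs):
    """Toy equilibrium for a fixed execution order. Each job is a tuple
    (target start t, execution time c, weight w); jobs run back-to-back,
    so job i starts at s + offset_i. Minimizing the total weighted squared
    deviation sum_i w_i * (s + offset_i - t_i)**2 over the block start s
    gives a weighted average -- the 'resting point' of the pendulum
    analogy. Hypothetical sketch, not the thesis algorithm."""
    offset, num, den = 0.0, 0.0, 0.0
    for t, c, w in jobs:
        num += w * (t - offset)  # each job pulls the block toward t - offset_i
        den += w
        offset += c
    return num / den
```

For two equal-weight jobs that both want to start at time 0 but cannot overlap, the block settles halfway between the two individually optimal placements, which is the "best compromise" the abstract refers to.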

A Multi-Phase Flow Model Incorporated with Population Balance Equation in a Meshfree Framework
(2011)

This study deals with the numerical solution of a meshfree coupled model of Computational Fluid Dynamics (CFD) and Population Balance Equation (PBE) for liquid-liquid extraction columns. Modeling the coupled hydrodynamics and mass transfer in liquid extraction columns leads to a multidimensional population balance equation that cannot be fully resolved numerically within a time reasonable for steady-state or dynamic simulations. There is therefore an obvious need for a new liquid extraction model that captures all the essential physical phenomena yet remains computationally tractable. This thesis presents such a model, based on a discretization of the external (spatial) and internal coordinates that drastically reduces the computational time. For the internal coordinates, the multi-primary-particle concept, a special case of the Sectional Quadrature Method of Moments (SQMOM), is used to represent the droplet internal properties. This approach conserves the most important integral properties of the distribution, namely the total number, solute, and volume concentrations, and reduces the computational time compared to classical finite difference methods, which require many grid points to conserve the desired physical quantities. On the other hand, due to the discrete nature of the dispersed phase, a meshfree Lagrangian particle method, the Finite Pointset Method (FPM), is used to discretize the spatial domain (the extraction column height). This method avoids the difficult discretization of the convective term required by classical finite volume methods, which need many grid points to capture the moving fronts propagating along the column height.
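To make the conservation idea behind the multi-primary-particle representation concrete, the following hypothetical sketch (not the SQMOM implementation) bins a droplet population into a few sections and keeps only a droplet count and a mean volume per section. By construction this conserves the total number and total volume of the population exactly, which is the kind of integral property the abstract refers to.

```python
import numpy as np

def primary_particles(diameters, n_sections=2):
    """Represent a droplet population by a few 'primary particles':
    split the diameter range into sections and keep, per section, the
    droplet count and the mean droplet volume. Total number and total
    volume are then conserved exactly. Illustrative sketch only."""
    d = np.asarray(diameters, dtype=float)
    vol = np.pi / 6.0 * d ** 3  # sphere volume per droplet
    edges = np.linspace(d.min(), d.max(), n_sections + 1)
    idx = np.clip(np.searchsorted(edges, d, side="right") - 1, 0, n_sections - 1)
    particles = []
    for s in range(n_sections):
        mask = idx == s
        if mask.any():
            particles.append((int(mask.sum()), vol[mask].mean()))
    return particles  # list of (count, mean volume) per section
```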

This paper presents a new similarity measure and nonlocal filters for images corrupted by multiplicative noise. The considered filters are generalizations of the nonlocal means filter of Buades et al., which is known to be well suited for removing additive Gaussian noise. To adapt to different noise models, the patch comparison involved in this filter must first be performed with a suitable noise-dependent similarity measure. To this end, we start by studying a probabilistic measure recently proposed for general noise models by Deledalle et al. We analyze this measure in the context of conditional density functions and examine its properties for images corrupted by additive and multiplicative noise. Since it turns out to have unfavorable properties for multiplicative noise, we derive a new similarity measure consisting of a probability density function specially chosen for this type of noise. The properties of our new measure are studied theoretically as well as by numerical experiments. To obtain the final nonlocal filters, we apply a weighted maximum likelihood estimation framework that also incorporates the noise statistics. Moreover, we define the weights occurring in these filters using our new similarity measure and propose different adaptations to further improve the results. Finally, restoration results for images corrupted by multiplicative Gamma and Rayleigh noise demonstrate the very good performance of our nonlocal filters.
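A toy 1-D analogue may clarify the structure of such filters. The sketch below is hypothetical and not the paper's method: it compares patches by a log-ratio distance (a ratio-based measure being natural for multiplicative noise, unlike the squared difference of classical nonlocal means) and restores each pixel by a weighted mean of all pixels, standing in for the weighted maximum likelihood step.

```python
import numpy as np

def nonlocal_filter_1d(signal, patch=1, h=0.5):
    """Toy 1-D nonlocal filter for positive-valued signals with
    multiplicative noise: patch similarity via a log-ratio distance,
    restoration via a weighted mean over all pixels. Illustrative only."""
    f = np.asarray(signal, dtype=float)
    n = len(f)
    padded = np.pad(f, patch, mode="edge")  # replicate borders for patches
    out = np.empty(n)
    for i in range(n):
        p_i = padded[i : i + 2 * patch + 1]  # patch centered at pixel i
        w = np.empty(n)
        for j in range(n):
            p_j = padded[j : j + 2 * patch + 1]
            dist = np.mean(np.log(p_i / p_j) ** 2)  # ratio-based distance
            w[j] = np.exp(-dist / h ** 2)
        out[i] = np.sum(w * f) / np.sum(w)  # weighted mean as ML stand-in
    return out
```

For Gamma or Rayleigh noise the weighted maximum likelihood estimate is generally not a plain weighted mean; the paper derives the appropriate estimators from the noise statistics.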

This work presents the dynamic capillary pressure model (Hassanizadeh, Gray, 1990, 1993a) adapted to the needs of paper manufacturing process simulations. The dynamic capillary pressure-saturation relation is included in a one-dimensional simulation model for the pressing section of a paper machine. The one-dimensional model is derived from a two-dimensional model by averaging with respect to the vertical direction. The model is then discretized by the finite volume method and solved by Newton's method. The numerical experiments are carried out with parameters typical of the paper layer. The dynamic capillary pressure-saturation relation has a significant influence on the distribution of water pressure. The behaviour of the solution agrees with laboratory experiments (Beck, 1983).
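The Newton step for a discretized nonlinear system of this kind can be sketched generically. The following is illustrative code with a finite-difference Jacobian, not the thesis solver:

```python
import numpy as np

def newton(F, x0, tol=1e-10, max_iter=50):
    """Generic Newton's method for a nonlinear system F: R^n -> R^n,
    with a forward-difference Jacobian. A finite-volume discretization
    yields such a system at each time step. Illustrative sketch."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = F(x)
        if np.linalg.norm(fx) < tol:
            break
        J = np.empty((len(x), len(x)))
        eps = 1e-7
        for j in range(len(x)):  # build the Jacobian column by column
            xp = x.copy()
            xp[j] += eps
            J[:, j] = (F(xp) - fx) / eps
        x = x - np.linalg.solve(J, fx)  # Newton update
    return x
```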

Interest in the exploration of new hydrocarbon fields and deep geothermal reservoirs is growing steadily. The analysis of seismic data for such exploration projects is very complex and requires deep knowledge of geology, geophysics, petrology, and related fields from interpreters, as well as advanced tools able to recover particular properties. Wavelet techniques have been highly successful in signal processing, data compression, and noise reduction: they break complicated functions into many simple pieces at different scales and positions, which makes the detection and interpretation of local events significantly easier.
In this thesis, mathematical methods and tools are presented that are applicable to seismic data postprocessing in regions with non-smooth boundaries. We provide wavelet techniques related to the solutions of the Helmholtz equation, with seismic data analysis as the intended application. A similar idea, constructing wavelet functions from the limit and jump relations of the layer potentials, was first suggested by Freeden and his Geomathematics Group. The particular difficulty in such approaches is the formulation of the limit and jump relations for the surfaces used in seismic data processing, i.e., non-smooth surfaces in various topologies (for example, uniform and quadratic). The essential idea is to replace the concept of parallel surfaces, known for smooth regular surfaces, by appropriate substitutes for non-smooth surfaces. Using the jump and limit relations formulated for regular surfaces, Helmholtz wavelets can be introduced that recursively approximate functions on surfaces with edges and corners. A key point is that the construction of the wavelets allows an efficient implementation in the form of a tree algorithm for the fast numerical computation of functions on the boundary.
To demonstrate the applicability of the Helmholtz fast wavelet transform (FWT), we study a seismic image obtained by reverse time migration based on a finite-difference implementation. Given the filtering and denoising requirements of such migration algorithms, the wavelet decomposition is successfully applied to this image to attenuate low-frequency artifacts and noise. An essential feature is the space localization of the Helmholtz wavelets, which makes it numerically possible to analyze the velocity field pointwise. Moreover, the multiscale analysis reveals additional geological information from optical features.
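The idea of attenuating low-frequency artifacts by discarding the coarse scales of a multiscale decomposition can be caricatured in one dimension. The sketch below is purely illustrative: it uses repeated smoothing of a periodic line signal to form a coarse trend and subtracts it, whereas the thesis uses Helmholtz wavelets on non-smooth surfaces.

```python
import numpy as np

def lowfreq_attenuation(signal, levels=3):
    """Caricature of multiscale low-frequency attenuation: smooth the
    signal `levels` times with a [1, 2, 1]/4 kernel (periodic boundary)
    to estimate the slowly varying trend, then subtract it so that only
    localized detail remains. Illustrative stand-in only."""
    f = np.asarray(signal, dtype=float)
    approx = f
    for _ in range(levels):
        approx = 0.25 * (np.roll(approx, 1) + 2 * approx + np.roll(approx, -1))
    return f - approx  # detail = signal minus coarse trend
```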

We consider an autoregressive process with a nonlinear regression function that is modeled by a feedforward neural network. We derive a uniform central limit theorem which is useful in the context of change-point analysis. We propose a test for a change in the autoregression function which, by the uniform central limit theorem, has asymptotic power one for a large class of alternatives, including local alternatives.

In the first part of the thesis we develop the theory of standard bases in free modules over (localized) polynomial rings. Provided that linear equations are solvable over the coefficient ring, we introduce an algorithm to compute standard bases with respect to arbitrary (module) monomial orderings. Moreover, we pay special attention to principal ideal rings, possibly containing zero divisors. For these rings we design modified algorithms which are new and much faster than the general ones. These algorithms were motivated by current limitations in the formal verification of microelectronic System-on-Chip designs. We show that our novel approach using computational algebra is able to overcome these limitations in important classes of applications arising from industrial challenges.
The second part is based on research in collaboration with Jason Morton, Bernd Sturmfels, and Anne Shiu. We devise a general method to describe and compute a certain class of rank tests motivated by statistics. This class of rank tests may loosely be described as being based on counting the linear extensions of given partial orders. In order to apply these tests to actual data, we developed two algorithms and used our implementations to apply the methodology to gene expression data created at the Stowers Institute for Medical Research. The dataset is concerned with the development of the vertebrae. Our rankings proved valuable to the biologists.
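To make the central quantity concrete: the number of linear extensions of a partial order can be counted by brute force for tiny examples (the thesis develops far more efficient algorithms; this sketch is illustrative only).

```python
from itertools import permutations

def count_linear_extensions(n, relations):
    """Count linear extensions of a partial order on 0..n-1 by brute
    force: permutations in which a appears before b for every relation
    (a, b). Exponential in n -- only for illustration."""
    count = 0
    for perm in permutations(range(n)):
        pos = {v: i for i, v in enumerate(perm)}
        if all(pos[a] < pos[b] for a, b in relations):
            count += 1
    return count
```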

In this paper we develop monitoring schemes for detecting structural changes in nonlinear autoregressive models. We approximate the regression function by a single-layer feedforward neural network. We show that CUSUM-type tests based on cumulative sums of estimated residuals, which have been studied intensively for linear regression in both offline and online settings, can be extended to this model. The proposed monitoring schemes reject the null hypothesis (asymptotically) only with a given probability but detect a large class of alternatives with probability one. In order to construct these sequential tests of a given size, the limit distribution under the null hypothesis is derived.
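A stylized version of such an online monitoring scheme, with an ad hoc alarm boundary in place of the limit distribution derived in the paper, might look like this (hypothetical code):

```python
import numpy as np

def cusum_monitor(residuals, training_len, threshold=3.0):
    """Toy online CUSUM monitor: the first `training_len` residuals are
    the historical (stable) period, used to estimate the noise scale.
    After that, the cumulative sum of residuals is tracked and an alarm
    is raised once it crosses threshold * sigma * sqrt(training_len) --
    an ad hoc boundary standing in for the calibrated one."""
    r = np.asarray(residuals, dtype=float)
    sigma = r[:training_len].std(ddof=1)  # noise scale from stable period
    csum = 0.0
    for k in range(training_len, len(r)):
        csum += r[k]
        if abs(csum) > threshold * sigma * np.sqrt(training_len):
            return k  # first alarm time
    return None  # no change detected
```

A properly sized scheme would choose the boundary from the null limit distribution so that the false alarm probability equals a prescribed level; the constant boundary above is only for illustration.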

Annual Report 2010
(2011)

Annual Report, Jahrbuch AG Magnetismus