## Doctoral Thesis

### Refine

#### Faculty / Organisational entity

- Fachbereich Mathematik (183)
- Fachbereich Informatik (90)
- Fachbereich Maschinenbau und Verfahrenstechnik (56)
- Fachbereich Chemie (38)
- Fachbereich Elektrotechnik und Informationstechnik (37)
- Fachbereich Biologie (20)
- Fachbereich Sozialwissenschaften (12)
- Fachbereich ARUBI (5)
- Fachbereich Physik (4)
- Fraunhofer (ITWM) (4)

#### Year of publication

#### Document Type

- Doctoral Thesis (450)

#### Language

- English (450)

#### Keywords

- Visualisierung (10)
- finite element method (5)
- Algebraische Geometrie (4)
- Finite-Elemente-Methode (4)
- Navier-Stokes-Gleichung (4)
- Numerische Strömungssimulation (4)
- Optimization (4)
- Computeralgebra (3)
- Computergraphik (3)
- Finanzmathematik (3)

- Morphology and Morphology Formation of Injection Molded PP-based Nanocomposites (2016)
- The mechanical properties of semi-crystalline polymers depend strongly on their morphology, which in turn is determined by crystallization during processing. The aim of this research is to determine the effect of various nanoparticles on the morphology formation and tensile mechanical properties of polypropylene (PP) under conditions relevant to polymer processing, and ultimately to contribute to the understanding of this influence. Based on thermal analyses of samples during fast cooling, it is found that the presence of nanoparticles enhances the overall crystallization process of PP. The results suggest that an increase of the nucleation density/rate is the dominant process controlling the crystallization of PP in this work, which can help to reduce the cycle time in the injection process. Moreover, the analysis of the melting behavior obtained after each undercooling reveals that crystal perfection increases significantly with the incorporation of TiO2 nanoparticles, while it is not influenced by SiO2 nanoparticles. This work also comprises an analysis of the influence of nanoparticles on the microstructure of injection-molded parts. The results clearly show multiple layers along the wall thickness. The spherulite size and the degree of crystallinity continuously decrease from the center to the edge. Generally, both the spherulite size and the degree of crystallinity decrease with higher SiO2 loading. In contrast, an increase in the degree of crystallinity with increasing TiO2 nanoparticle loading was detected. The tensile properties exhibit a tendency toward increasing tensile strength as the core is reached. The tensile strength decreases with the addition of nanoparticles, while the elongation at break of nanoparticle-filled PP decreases from the skin to the core. With increasing TiO2 loading, the elongation at break decreases.

- Worst-Case Performance Analysis of Feed-Forward Networks – An Efficient and Accurate Network Calculus (2016)
- Distributed systems are omnipresent nowadays, and networking them is fundamental for the continuous dissemination and thus availability of data. Provision of data in real time is one of the most important non-functional aspects that safety-critical networks must guarantee. Formal verification of data communication against worst-case deadline requirements is key to the certification of emerging x-by-wire systems. Verification allows aircraft to take off, cars to steer by wire, and safety-critical industrial facilities to operate. Therefore, different methodologies for worst-case modeling and analysis of real-time systems have been established. Among them is deterministic Network Calculus (NC), a versatile technique that is applicable across multiple domains such as packet switching, task scheduling, systems on chip, software-defined networking, data center networking, and network virtualization. NC is a methodology to derive deterministic bounds on two crucial performance metrics of communication systems: (a) the end-to-end delay data flows experience and (b) the buffer space required by a server to queue all incoming data. NC has already seen application in industry; for instance, basic results have been used to certify the backbone network of the Airbus A380 aircraft. The NC methodology for worst-case performance analysis of distributed real-time systems consists of two branches. Both share the NC network model but diverge regarding their respective derivation of performance bounds, i.e., their analysis principle. NC was created as a deterministic system theory for queueing analysis, and its operations were later cast in a (min,+)-algebraic framework. This branch is known as algebraic Network Calculus (algNC). While algNC can efficiently compute bounds on delay and backlog, the algebraic manipulations do not allow NC to attain the most accurate bounds achievable for the given network model.
These tight performance bounds can only be attained with the other, newly established branch of NC, the optimization-based analysis (optNC). However, the only optNC analysis that can currently derive tight bounds was proven to be computationally infeasible even for the analysis of moderately sized networks other than simple sequences of servers. This thesis makes several contributions in the area of algNC: accuracy within the existing framework is improved, distributivity of the sensor network calculus analysis is established, and, most significantly, algNC is extended with optimization principles that allow it to derive performance bounds competitive with optNC. Moreover, the computational efficiency of the new NC approach is improved such that this thesis presents the first NC analysis that is both accurate and computationally feasible. It allows NC to scale to larger, more complex systems that require formal verification of their real-time capabilities.
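The flavor of the two bound types mentioned above, (a) delay and (b) backlog, can be illustrated with the textbook single-server case rather than the thesis's feed-forward analysis: a token-bucket flow with arrival curve α(t) = b + rt crossing a rate-latency server with service curve β(t) = R·max(0, t − T) admits the classic (min,+) bounds sketched below. All parameter values are purely illustrative.

```python
def delay_bound(b, r, R, T):
    """Worst-case delay h(alpha, beta) = T + b/R for a token-bucket
    flow alpha(t) = b + r*t crossing a rate-latency server
    beta(t) = R*max(0, t - T); requires the stability condition r <= R."""
    assert 0 <= r <= R, "stability requires the flow rate not to exceed R"
    return T + b / R

def backlog_bound(b, r, R, T):
    """Worst-case buffer requirement v(alpha, beta) = b + r*T."""
    assert 0 <= r <= R
    return b + r * T

# illustrative numbers: burst 2 Mb, rate 1 Mb/s, server 4 Mb/s, latency 0.5 s
print(delay_bound(2.0, 1.0, 4.0, 0.5), backlog_bound(2.0, 1.0, 4.0, 0.5))
```

Tight bounds for whole feed-forward networks, the subject of the thesis, require composing and optimizing over many such curves rather than applying these closed forms per server.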

- Gröbner Bases over Extension Fields of \(\mathbb{Q}\) (2016)
- Gröbner bases are one of the most powerful tools in computer algebra and commutative algebra, with applications in algebraic geometry and singularity theory. From the theoretical point of view, these bases can be computed over any field using Buchberger's algorithm. In practice, however, the computational efficiency depends on the arithmetic of the coefficient field. In this thesis, we consider Gröbner basis computations over two types of coefficient fields. First, we consider a simple extension \(K=\mathbb{Q}(\alpha)\) of \(\mathbb{Q}\), where \(\alpha\) is an algebraic number, and let \(f\in \mathbb{Q}[t]\) be the minimal polynomial of \(\alpha\). Second, let \(K'\) be the algebraic function field over \(\mathbb{Q}\) with transcendental parameters \(t_1,\ldots,t_m\), that is, \(K' = \mathbb{Q}(t_1,\ldots,t_m)\). In particular, we present efficient algorithms for computing Gröbner bases over \(K\) and \(K'\). Moreover, we present an efficient method for computing syzygy modules over \(K\). To compute Gröbner bases over \(K\), starting from the ideas of Noro [35], we proceed by joining \(f\) to the ideal to be considered, adding \(t\) as an extra variable. But instead of avoiding superfluous S-pair reductions by inverting algebraic numbers, we achieve the same goal by applying modular methods as in [2,4,27], that is, by inferring information in characteristic zero from information in characteristic \(p > 0\). For suitable primes \(p\), the minimal polynomial \(f\) is reducible over \(\mathbb{F}_p\). This allows us to apply modular methods once again, on a second level, with respect to the modular factors of \(f\). The algorithm thus resembles a divide-and-conquer strategy and is, in particular, easily parallelizable. Moreover, using a similar approach, we present an algorithm for computing syzygy modules over \(K\).
On the other hand, to compute Gröbner bases over \(K'\), our new algorithm first specializes the parameters \(t_1,\ldots,t_m\) to reduce the problem from \(K'[x_1,\ldots,x_n]\) to \(\mathbb{Q}[x_1,\ldots,x_n]\). The algorithm then computes a set of Gröbner bases of specialized ideals. From this set of Gröbner bases with coefficients in \(\mathbb{Q}\), it obtains a Gröbner basis of the input ideal using sparse multivariate rational interpolation. In their current state, these algorithms are probabilistic in the sense that, as for other modular Gröbner basis computations, an effective final verification test is only known for homogeneous ideals or for local monomial orderings. The presented timings show that for most examples our algorithms, which have been implemented in SINGULAR [17], are considerably faster than other known methods.
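The lifting construction described above, joining the minimal polynomial \(f\) to the ideal and adding \(t\) as an extra variable, can be tried on a toy instance with any general-purpose computer algebra system. The sketch below uses SymPy rather than SINGULAR, and the example ideal \(\langle x^2-\alpha\rangle\) over \(K=\mathbb{Q}(\sqrt{2})\) is an invented illustration, not an example from the thesis.

```python
from sympy import symbols, groebner

# Toy instance of the lifting: K = Q(alpha) with alpha = sqrt(2),
# minimal polynomial f(t) = t^2 - 2.  We adjoin f to the ideal
# <x^2 - alpha> and add t as an extra variable, then compute a
# Groebner basis over Q with a lex order eliminating t.
x, t = symbols('x t')
f = t**2 - 2
G = groebner([x**2 - t, f], t, x, order='lex')
print(list(G.exprs))   # contains x**4 - 2, the relation holding over Q
```

The basis element free of \(t\) generates the contraction of the lifted ideal to \(\mathbb{Q}[x]\); the modular, divide-and-conquer machinery of the thesis accelerates exactly this kind of computation for large inputs.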

- Interest Rate Modeling - The Potential Approach and Multi-Curve Potential Models (2016)
- This thesis is concerned with interest rate modeling by means of the potential approach. The contribution of this work is twofold. First, by making use of the potential approach and the theory of affine Markov processes, we develop a general class of rational models for the term structure of interest rates, which we refer to as "the affine rational potential model". These models feature positive interest rates and analytical pricing formulae for zero-coupon bonds, caps, swaptions, and European currency options. We present some concrete models to illustrate the scope of the affine rational potential model and calibrate one model specification to real-world market data. Second, we develop a general family of "multi-curve potential models" for post-crisis interest rates. Our models feature positive stochastic basis spreads, positive term structures, and analytic pricing formulae for interest rate derivatives. This modeling framework is also flexible enough to accommodate negative interest rates and positive basis spreads.
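In the potential approach, zero-coupon bond prices arise as conditional expectations of a positive supermartingale (the pricing kernel). A minimal rational specification in the spirit of such models, with illustrative, uncalibrated choices of the decreasing functions a, b and of the martingale level Z_t, might look as follows; it is a sketch of the general mechanism, not the thesis's affine rational potential model.

```python
import math

# Toy rational pricing kernel zeta_t = a(t) + b(t) * Z_t, with a, b
# positive and decreasing and Z a positive martingale.  All numbers
# are illustrative assumptions.

def a(t):
    return math.exp(-0.03 * t)

def b(t):
    return 0.5 * math.exp(-0.05 * t)

def bond_price(t, T, Z_t):
    """Zero-coupon price P(t,T) = (a(T) + b(T) Z_t) / (a(t) + b(t) Z_t).
    Positivity and monotone decrease of a, b guarantee 0 < P(t,T) <= 1
    for T >= t, i.e. positive interest rates."""
    return (a(T) + b(T) * Z_t) / (a(t) + b(t) * Z_t)

print(bond_price(0.0, 5.0, Z_t=1.0))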

- Plants, herbivores, and their interactions in human-modified landscapes (2016)
- Human forest modification is among the largest global drivers of terrestrial degradation of biodiversity, species interactions, and ecosystem functioning. One of its most pertinent components, forest fragmentation, has a long history in ecological research across the globe, particularly at lower latitudes. However, we still know little about how fragmentation shapes temperate ecosystems, notwithstanding the ancient status quo of European deforestation. Furthermore, its interaction with another pivotal component of European forests, silvicultural management, is practically unexplored. Hence, answering the question of how anthropogenic modification of temperate forests affects fundamental components of forest ecosystems is essential basic research that has been neglected thus far. The most basal ecosystem elements are plants and their insect herbivores, as they form the energetic basis of the trophic pyramid. Furthermore, their respective biodiversity, functional traits, and the networks of interactions they establish are key for a multitude of ecosystem functions, not least ecosystem stability. Hence, the thesis at hand aimed to disentangle this complex system of interdependencies of human impacts, biodiversity, species traits, and inter-species interactions. The first step lay in understanding how woody plant assemblages are shaped by human forest modification. For this purpose, field investigations in 57 plots in the hyperfragmented cultural landscape of the Northern Palatinate highlands (SW Germany) were conducted, censusing > 4,000 tree/shrub individuals from 34 species. The use of novel, integrative indices for different types of land use allowed an accurate quantification of biotic responses. Intriguingly, woody tree/shrub communities reacted strikingly positively to forest fragmentation, with increases in alpha and beta diversity, as well as proliferation of heat-, drought-, and light-adapted pioneer species.
In contrast, managed interior forests were homogenized and constrained in biodiversity, dominated by shade- and cold-adapted commercial tree species. Comparisons with recently unmanaged stands (> 40 a) revealed first indications of nascent conversion to old-growth conditions, with larger variability in light conditions and subsequent community composition. Reactions to microclimatic conditions, the relationship between associated species traits and the corresponding species pool, as well as facilitative/constraining effects by foresters were discussed as underlying mechanisms. Reactions of herbivore assemblages to forest fragmentation and the subsequent changes in host plant communities were assessed by comprehensive sampling of > 1,000 live herbivores from 134 species in the forest understory. Diversity was, similarly to plant communities, higher in fragmentation-affected habitats, particularly at edges of continuous control forests. Furthermore, average trophic specialization showed an identical pattern. Mechanistically, benefits from microclimatic conditions, host availability, as well as pronounced niche differentiation are deemed responsible. While communities were heterogeneous, with no segregation across habitats (small forest fragments, edges, and interior of control forests), vegetation diversity, herbivore diversity, as well as trophic specialization were identified to shape community composition. This probably reflected a gradient from generalistic/species-poor to specialized/species-rich herbivore assemblages. Insect studies conducted in forest systems are doomed to incompleteness without considering ‘the last biological frontier’, the tree canopies. To assess their biodiversity, their relationship to edge effects, and their conservational value, the arboricolous arthropod fauna of 24 beech (Fagus sylvatica) canopies was sampled via insecticidal knockdown (‘fogging’).
This resulted in an exhaustive collection of > 46,000 specimens from 24 major taxonomic/functional groups. Abundance distributions were markedly negative exponential, indicating high abundance variability in tree crowns. Individuals of six pertinent orders were identified to species level, returning > 3,100 individuals from 175 species and 52 families. This high diversity differed only marginally across habitats, with slightly higher species richness in edge canopies. However, communities in edge crowns were noticeably more heterogeneous than those in the forest interior, possibly due to higher variability in environmental edge conditions. In total, 49 species with protective value were identified, of which only one showed habitat preferences (for near-natural interior forests). Among them, six species (all beetles, Coleoptera) were classified as ‘priority species’ for conservation efforts. Hence, beech canopies of the Northern Palatinate highlands can be considered strongholds of insect biodiversity, incorporating many species of particular protective value. The intricacy of plant-herbivore interaction networks and their relationship to forest fragmentation is largely unexplored, particularly in Central Europe. Illuminating this matter is all the more important, as ecological networks are highly relevant for ecosystem stability, particularly in the face of additional anthropogenic disturbances such as climate change. Hence, plant-herbivore interaction networks (PHNs) were constructed from woody plants and their associated herbivores, sampled alive in the understory. Herbivory was verified using no-choice feeding assays as well as literature references. In total, networks across small forest fragments, edges, and the forest interior consisted of 696 interactions. Network complexity and trophic niche redundancy were compared across habitats using a rarefaction-like resampling procedure.
PHNs in fragmentation-affected forest habitats were significantly more complex, as well as more redundant in their realized niches, despite being composed of relatively more specialist species. Furthermore, network robustness to climate change was quantified utilizing four different scenarios for the climate change susceptibility of the plants involved. In this procedure, the herbivores remaining in the network were counted upon successive loss of their host plant species. Consistently, PHNs in edges (and, to a smaller degree, in small fragments) withstood primary extinction of plant species longer, making them more robust. This was attributed to the high prevalence of heat/drought-adapted species, as well as to beneficial effects of network topology (complexity and redundancy). Consequently, strong correlative relationships were found between realized niche redundancy and the climate change robustness of PHNs. This was both the first time that biologically realistic extinctions (instead of, e.g., random extinctions) were used to measure network robustness, and the first time that topological network parameters were identified as potential indicators of network robustness against climate change. In synthesis, in the light of global biotic degradation due to human forest modification, the necessity to differentiate must be stressed. Ecosystems react differently to anthropogenic disturbances, and it seems the particular features present in Central European forests (ancient deforestation, extensive management and, most importantly, high richness in open-forest plant species) cause patterns partly opposed to those in other biomes. Lenient microclimates and diverse plant communities facilitate equally diverse herbivore assemblages, and hence complex and robust networks, in contrast to the forest interior.
Therefore, in the reality of extensively used cultural landscapes, fragmentation-affected forest ecosystems, particularly forest edges, can be perceived as reservoirs of biodiversity and ecosystem functionality. Nevertheless, as practically all forest habitats considered in this thesis are under human cultivation, recommendations for the ecological enhancement of all forest habitats are discussed.
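The extinction-scenario procedure described above can be sketched in a few lines: plants are removed in order of climate-change susceptibility, and a herbivore survives as long as at least one of its host plants remains. The toy network, species names, and susceptibility ranking below are invented for illustration and do not reproduce the thesis's data or its rarefaction-based resampling.

```python
def robustness_curve(network, removal_order):
    """network: dict mapping plant -> set of herbivore species feeding on it.
    Plants are removed in the given susceptibility order; a herbivore
    survives while at least one host remains.  Returns the fraction of
    herbivores still present after each removal (the extinction curve)."""
    herbivores = set().union(*network.values())
    remaining = dict(network)
    curve = []
    for plant in removal_order:
        remaining.pop(plant, None)
        alive = set().union(*remaining.values()) if remaining else set()
        curve.append(len(alive) / len(herbivores))
    return curve

phn = {                         # toy plant-herbivore network
    "Fagus":   {"h1", "h2"},
    "Quercus": {"h2", "h3"},
    "Betula":  {"h3", "h4"},
}
# hypothetical susceptibility ranking: most climate-susceptible plant first
print(robustness_curve(phn, ["Fagus", "Quercus", "Betula"]))
```

A network whose curve stays high for longer under such biologically ordered removals is, in the sense used above, more robust; redundancy (herbivores with several hosts) is what flattens the curve.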

- Interactive Visualizations Supporting Minimal Cut Set Analysis II (2016)
- The Context and Its Importance: In safety and reliability analysis, the amount of information generated by Minimal Cut Set (MCS) analysis is large. The top-level event (TLE) at the root of the fault tree (FT) represents a hazardous state of the system being analyzed. MCS analysis helps in analyzing the fault tree qualitatively, and quantitatively when accompanied by quantitative measures. This information reveals the bottlenecks in the fault tree design and leads to identifying weaknesses of the system being examined. Safety analysis (containing MCS analysis) is especially important for critical systems, where harm can be done to the environment or to humans, causing injuries or even death, during system usage. MCS analysis is performed using computers and generates a large amount of information; this phase is called MCS analysis I in this thesis. The information is then analyzed by analysts to determine possible issues and to improve the design of the system regarding its safety as early as possible; this phase is called MCS analysis II in this thesis. The goal of my thesis was to develop interactive visualizations supporting MCS analysis II of a single fault tree. The Methodology: As safety visualization (in this thesis, MCS analysis II visualization) is an emerging field, and no complete checklist of MCS analysis II requirements and gaps was available from the perspective of visualization and interaction capabilities, I conducted multiple studies using different methods with different data sources (i.e., triangulation of methods and data) to determine these requirements and gaps before developing and evaluating visualizations and interactions supporting MCS analysis II. Thus, the following approach was taken in my thesis: 1. First, a triangulation of mixed methods and data sources was conducted. 2. Then, four novel interactive visualizations and one novel interaction widget were developed.
3. Finally, these interactive visualizations were evaluated both objectively and subjectively (compared to multiple safety tools): from the point of view of users and developers of safety tools that perform MCS analysis I, with respect to their degree of support for MCS analysis II, and from the point of view of non-domain users, using empirical strategies. The Spiral tool supports analysts with different color visions, i.e., full vision as well as the color deficiencies protanopia, deuteranopia, and tritanopia. It supports 100 out of 103 (97%) requirements obtained from the triangulation and fills 37 out of 39 (95%) gaps. Its usability was rated high (better than their best currently used tools) by users of the safety and reliability tools RiskSpectrum, ESSaRel, FaultTree+, and a self-developed tool, and at least similar to the best currently used tools from the point of view of the CAFTA tool developers. Its quality regarding its degree of support for MCS analysis II was rated higher than that of the FaultTree+ tool. The time spent discovering the critical MCSs in a problem of 540 MCSs (with a worst case of all MCSs being of equal order) was less than a minute, at 99.5% accuracy, and the Spiral visualization scaled above 4,000 MCSs for a comparison task. The Dynamic Slider reduces the interaction movements by up to 85.71% compared to previous sliders and solves their overlapping-thumb issues. In addition, the tool provides a 3D model view of the system being analyzed; the ability to change the coloring of MCSs according to the color vision of the user; selection of a BE (i.e., multi-selection of MCSs), so that the BEs' NoO and quality can be observed; two interaction speeds for panning and zooming in the MCS, BE, and model views; and an MCS tab, a BE tab, and a physical tab for supporting analyses starting from the MCSs, the BEs, or the physical parts.
It combines MCS analysis results with the model of an embedded system, enabling analysts to directly relate safety information to the corresponding parts of the system being analyzed, and provides an interactive mapping between the textual information of the BEs and MCSs and the parts related to the BEs. Verifications and Assessments: I evaluated all visualizations and the interaction widget both objectively and subjectively, and finally evaluated the resulting Spiral visualization tool, likewise both objectively and subjectively, regarding its perceived quality and its degree of support for MCS analysis II.

- The Bootstrap for the Functional Autoregressive Model FAR(1) (2016)
- Functional data analysis is a branch of statistics that deals with observations \(X_1,\ldots,X_n\) which are curves. We are interested in particular in time series of dependent curves and, specifically, consider the functional autoregressive process of order one (FAR(1)), which is defined as \(X_{n+1}=\Psi(X_{n})+\epsilon_{n+1}\) with independent innovations \(\epsilon_t\). Estimates \(\hat{\Psi}\) for the autoregressive operator \(\Psi\) have been investigated extensively during the last two decades, and their asymptotic properties are well understood. Particularly difficult, and different from scalar- or vector-valued autoregressions, are the weak convergence properties, which also form the basis of the bootstrap theory. Although the asymptotics for \(\hat{\Psi}(X_{n})\) are still tractable, they are only useful for large enough samples. In applications, however, frequently only small samples of data are available, such that an alternative method for approximating the distribution of \(\hat{\Psi}(X_{n})\) is welcome. As a motivation, we discuss a real-data example in which we investigate a change-point detection problem for a stimulus response dataset obtained from the animal physiology group at the Technical University of Kaiserslautern. As an alternative to asymptotic approximations, we employ the naive or residual-based bootstrap procedure. In this thesis, we prove theoretically and show via simulations that the bootstrap provides asymptotically valid and practically useful approximations of the distributions of certain functions of the data. Such results may be used to calculate approximate confidence bands or critical bounds for tests.
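For intuition, the residual-based bootstrap can be sketched with curves discretized to d-dimensional vectors and Ψ estimated by ridge-regularized least squares, a simple stand-in for the operator estimators analyzed in the thesis; the toy operator 0.5·I, the noise scale, and all dimensions below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_far1(X, ridge=1e-2):
    """Least-squares estimate of Psi from discretized curves X (n, d);
    the small ridge term stands in for the regularization that
    operator estimation in function space requires."""
    Y, Z = X[1:], X[:-1]
    d = X.shape[1]
    return Y.T @ Z @ np.linalg.inv(Z.T @ Z + ridge * np.eye(d))

def bootstrap_far1(X, B=200, ridge=1e-2):
    """Residual-based bootstrap: fit Psi, resample centered residuals,
    rebuild B bootstrap series, and refit Psi on each of them."""
    Psi = fit_far1(X, ridge)
    eps = X[1:] - X[:-1] @ Psi.T
    eps = eps - eps.mean(axis=0)          # center the residuals
    n = X.shape[0]
    Psi_stars = []
    for _ in range(B):
        e = eps[rng.integers(0, n - 1, size=n - 1)]
        Xb = np.empty_like(X)
        Xb[0] = X[0]
        for s in range(1, n):
            Xb[s] = Psi @ Xb[s - 1] + e[s - 1]
        Psi_stars.append(fit_far1(Xb, ridge))
    return Psi, np.array(Psi_stars)

# toy FAR(1) data: d grid points per curve, contraction operator 0.5*I
d, n = 5, 300
Psi_true = 0.5 * np.eye(d)
X = np.zeros((n, d))
for s in range(1, n):
    X[s] = Psi_true @ X[s - 1] + rng.normal(scale=0.1, size=d)

Psi_hat, Psi_boot = bootstrap_far1(X, B=50)
```

The empirical spread of the `Psi_boot` estimates around `Psi_hat` is the bootstrap approximation to the sampling distribution, from which approximate confidence bands or critical values can be read off.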

- Integrality of representations of finite groups (2016)
- Since the early days of the representation theory of finite groups in the 19th century, it has been known that complex linear representations of finite groups live over number fields, that is, over finite extensions of the field of rational numbers. While the related question of the integrality of representations was answered negatively by the work of Cliff, Ritter and Weiss, as well as by Serre and Feit, it was not known how to decide the integrality of a given representation. In this thesis we show that there exists an algorithm that, given a representation of a finite group over a number field, decides whether this representation can be made integral. Moreover, we provide theoretical and numerical evidence for a conjecture which predicts the existence of splitting fields of irreducible characters with integrality properties. In the first part, we describe two algorithms for the pseudo-Hermite normal form, which is crucial when handling modules over rings of integers. Using a newly developed computational model for ideal and element arithmetic in number fields, we show that our pseudo-Hermite normal form algorithms have polynomial running time. Furthermore, we address a range of algorithmic questions related to orders and lattices over Dedekind domains, including the computation of genera, testing local isomorphism, the computation of various homomorphism rings, and the computation of Solomon zeta functions. In the second part we turn to the integrality of representations of finite groups and show that an important ingredient is a thorough understanding of the reduction of lattices at almost all prime ideals. By employing class field theory and tools from representation theory, we solve this problem and eventually describe an algorithm for testing integrality. After running the algorithm on a large set of examples, we are led to a conjecture on the existence of integral and nonintegral splitting fields of characters.
By extending techniques of Serre we prove the conjecture for characters with rational character field and Schur index two.

- Development of nano/micro hybrid susceptor sheet for induction heating applications (2016)
- Thermoplastic composite materials are widely used in the automotive and aerospace industries. Due to limitations on shape complexity, different components need to be joined. They can be joined by mechanical fasteners, adhesive bonding, or both. However, these methods have several limitations. Components can instead be joined by fusion bonding, exploiting a property of thermoplastics: they can be melted on heating and regain their shape on cooling. This property makes them ideal for joining through fusion bonding by induction heating. Joining of non-conductive or non-magnetic thermoplastic composites needs an additional material that can generate heat under induction. Polymers are neither conductive nor magnetic, so they have no inherent potential for induction heating. A susceptor sheet containing conductive materials (e.g., carbon fiber) or magnetic materials (e.g., nickel) can generate heat during induction. The main issues related to induction heating are non-homogeneous and uncontrolled heating. In this work, it was observed that heat generation with a susceptor sheet depends on its filler, the filler concentration, and its dispersion. It also depends on the coil, the magnetic field strength, and the coupling distance. The combination of different fillers not only increased the heating rate but also changed the heating mechanism. A heating rate of 40 °C/s was achieved with 15 wt.-% nickel-coated short carbon fibers and 3 wt.-% multiwalled carbon nanotubes, whereas nickel-coated short carbon fibers alone (15 wt.-%) attained a heating rate of 24 °C/s. In this study, electrical conductivity, thermal conductivity, and magnetic property tests were also performed. The results also showed that electrical percolation was achieved around 15 wt.-% with fibers and (13- 6) wt.-% with hybrid fillers. Induction heating tests were also performed with susceptor sheets oriented parallel and perpendicular, as the fibers were unidirectionally aligned.
The susceptor sheet was also tested with perforations. The susceptor sheet showed homogeneous and fast heating and can be used for joining non-conductive or non-magnetic thermoplastic composites.

- Verification & Performance Measurement for Transport Protocol Parallel Routing of an AUTOSAR Gateway System (2016)
- A wide range of methods and techniques has been developed over the years to manage the increasing complexity of automotive Electrical/Electronic systems. Standardization is an example of such complexity-managing techniques; it aims to minimize costs, avoid compatibility problems, and improve the efficiency of development processes. A well-known and widely practiced standard in the automotive industry is AUTOSAR (Automotive Open System Architecture). AUTOSAR is a common standard among OEMs (Original Equipment Manufacturers), suppliers, and other involved companies. It was originally developed with the goal of simplifying the overall development and integration process of Electrical/Electronic artifacts from different functional domains, such as hardware, software, and vehicle communication. However, the AUTOSAR standard, in its current state, is not able to manage the problems in some areas of system development. The validation and optimization of system configurations handled in this thesis are examples of such areas, in which the AUTOSAR standard so far offers no mature solutions. Generally, systems developed on the basis of AUTOSAR must be configured in a way that meets all defined requirements. In most cases, the number of configuration parameters and their possible settings in AUTOSAR systems is large, especially if the developed system is complex, with modules from various knowledge domains. The verification process can consume considerable resources to test all possible combinations of configuration settings, and ideally find the optimal configuration variant, since the number of test cases can be very high. This problem is referred to in the literature as the combinatorial explosion problem. Combinatorial testing is an active and promising area of functional testing that offers ideas to solve the combinatorial explosion problem.
Thereby, the focus is on covering interaction errors by selecting a sample of system input parameters or configuration settings for test case generation. However, the industrial acceptance of combinatorial testing is still weak because of the lack of real industrial examples. This thesis attempts to fill this gap between industry and academia in the area of combinatorial testing and to emphasize the effectiveness of combinatorial testing in verifying complex configurable systems. The particular intention of the thesis is to provide a new, applicable approach to combinatorial testing to fight the combinatorial explosion problem that emerged during the verification and performance measurement of transport protocol parallel routing of an AUTOSAR gateway. The proposed approach has been validated and evaluated by means of two real industrial examples of AUTOSAR gateways with multiple communication buses and two different degrees of complexity to illustrate its applicability.
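The core idea of covering interaction errors with a small sample can be sketched as greedy pairwise (2-way) test selection: instead of running the full Cartesian product of settings, pick test cases until every pair of parameter values has appeared together at least once. The parameters and values below are invented stand-ins, not the AUTOSAR gateway's real configuration space, and real tools use far more sophisticated covering-array generation.

```python
from itertools import combinations, product

def all_pairs(params):
    """Every (parameter, value) pair combination that 2-way testing must cover."""
    names = list(params)
    return {((a, va), (b, vb))
            for a, b in combinations(names, 2)
            for va in params[a] for vb in params[b]}

def pairs_of(names, case):
    """The value pairs exercised by one concrete test case."""
    return {((a, case[a]), (b, case[b])) for a, b in combinations(names, 2)}

def greedy_pairwise(params):
    """Greedily pick the case covering the most uncovered pairs until
    all pairs are covered -- usually far fewer cases than the product."""
    names = list(params)
    todo = all_pairs(params)
    candidates = [dict(zip(names, vs)) for vs in product(*params.values())]
    suite = []
    while todo:
        best = max(candidates, key=lambda c: len(pairs_of(names, c) & todo))
        suite.append(best)
        todo -= pairs_of(names, best)
    return suite

params = {"bus": ["CAN", "FlexRay"],        # invented configuration space
          "buffer": ["small", "large"],
          "routing": ["static", "dynamic"]}
suite = greedy_pairwise(params)
print(len(suite), "of", 2 * 2 * 2, "possible test cases")
```

Even in this tiny binary example the suite is smaller than the exhaustive eight cases; for realistic AUTOSAR configuration spaces with dozens of multi-valued parameters, the reduction is what makes verification tractable.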