Intelligent formal methods
(2022)
Information technology has become an indispensable part of our daily lives, with a significant proportion of our everyday activities relying on the safe and reliable operation of computer systems. One promising approach to ensuring these critical properties is the use of so-called formal methods, a broad range of rigorous, mathematical techniques for specifying, developing, and verifying hardware, software, cyber-physical systems, and artificial intelligence. Unlike traditional quality assurance approaches, such as testing, formal methods offer the unique ability to provide formal proof of the absence of errors, a trait particularly desirable in the context of today's ubiquitous safety-critical systems. However, this advantage comes at a cost: formal methods require extensive training, often assume idealized or limited settings, and typically demand substantial computational resources.
Inspired by the vision of artificial intelligence, this work seeks to automate formal methods and dramatically expand their applicability. To achieve this goal, we develop a novel type of formal method that combines inductive techniques from machine learning with deductive techniques from logic. We name this new approach "intelligent formal methods" and apply it to three fundamental areas: software verification, hardware and software synthesis, and the generation of formal specifications.
Objective. Gradient-based optimization using algorithmic derivatives can be a useful technique for improving engineering designs with respect to a computer-implemented objective function. Likewise, uncertainty quantification through computer simulations can be carried out by means of derivatives of the computer simulation. However, the effectiveness of these techniques depends on how 'well-linearizable' the software is. In this study, we assess how promising derivative information of a typical proton computed tomography (pCT) scan computer simulation is for the aforementioned applications. Approach. This study is mainly based on numerical experiments, in which we repeatedly evaluate three representative computational steps with perturbed input values. We support our observations with a review of the algorithmic steps and arithmetic operations performed by the software, using debugging techniques. Main results. The model-based iterative reconstruction (MBIR) subprocedure (at the end of the software pipeline) and the Monte Carlo (MC) simulation (at the beginning) were piecewise differentiable. However, the observed high density and magnitude of jumps were likely to preclude most meaningful uses of the derivatives. Jumps in the MBIR function arose from the discrete computation of the set of voxels intersected by a proton path and could be reduced in magnitude by a 'fuzzy voxels' approach. The investigated jumps in the MC function arose from local changes in the control flow that affected the number of random numbers consumed. The tracking algorithm solves an inherently non-differentiable problem. Significance. Besides the technical challenges of merely applying algorithmic differentiation (AD) to existing software projects, the MC and MBIR codes must be adapted to compute smoother functions. For the MBIR code, we presented one possible approach for this, while for the MC code this will be the subject of further research. For the tracking subprocedure, further research on surrogate models is necessary.
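The perturbation-based probing described in the abstract can be illustrated with a generic sketch (not the authors' pCT code): evaluate a scalar function along a one-dimensional slice around a nominal input and flag secant slopes that dwarf the typical slope on that slice, a heuristic sign of a jump rather than mere steepness. All names and thresholds here are illustrative assumptions.

```python
import numpy as np

def probe_smoothness(f, x0, eps=1e-12, n=201, span=1e-3):
    """Evaluate f along a 1-D slice around x0 and flag jump candidates.

    Heuristic: a secant slope far larger than the typical slope magnitude
    on the slice hints at a discontinuity rather than a steep smooth region.
    Returns the slice offsets at which jump candidates were found.
    """
    ts = np.linspace(-span, span, n)
    ys = np.array([f(x0 + t) for t in ts])
    slopes = np.diff(ys) / np.diff(ts)
    typical = np.median(np.abs(slopes)) + eps  # robust scale of the slopes
    return ts[:-1][np.abs(slopes) > 50 * typical]

# Toy example: a piecewise-differentiable function with a jump at x = 0.
f = lambda x: np.sin(x) + (0.5 if x > 0 else 0.0)
jumps = probe_smoothness(f, 0.0)
```

On the toy function, the probe flags the jump at the origin while reporting nothing for a smooth function such as the sine alone.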
We discuss the dynamics of the formation of a Bose polaron when an impurity is injected into a weakly interacting one-dimensional Bose condensate. While for small impurity-boson couplings this process can be described within the Froehlich model as generation, emission and binding of Bogoliubov phonons, this is no longer adequate if the coupling becomes strong. To treat this regime we consider a mean-field approach beyond the Froehlich model, which accounts for the backaction on the condensate, complemented with Truncated Wigner simulations to include quantum fluctuations. For the stationary polaron we find a periodic energy-momentum relation and a non-monotonic relation between impurity velocity and polaron momentum, including regions of negative impurity velocity. Studying the polaron formation after turning on the impurity-boson coupling quasi-adiabatically and in a sudden quench, we find a very rich scenario of dynamical regimes. Due to the build-up of an effective mass, the impurity is slowed down even if its initial velocity is below the Landau critical value. For larger initial velocities we find deceleration and even backscattering, caused by the emission of density waves or grey solitons and the subsequent formation of stationary polaron states in different momentum sectors. In order to analyze the effect of quantum fluctuations we consider a trapped condensate to avoid 1D infrared divergences. Using Truncated Wigner simulations in this case, we show under what conditions the influence of quantum fluctuations is small.
European crayfish species are considered keystone species in freshwater ecosystems. As such, their conservation is of paramount importance to prevent biodiversity decline and loss of ecosystem function. Unfortunately, European crayfish species are today among the most threatened crayfish species worldwide. An especially relevant threat is the invasive pathogen Aphanomyces astaci. This oomycete, native to North America, has been one of the main causes of crayfish population declines across Europe since its first introduction 150 years ago, to the point of causing the local extinction of many populations. Over the years, several introductions of A. astaci strains into Europe took place through translocation of infected North American crayfish and were followed by mass mortalities across European crayfish populations. However, in the past 20 years, more and more reports have emerged of European crayfish populations surviving A. astaci infections or being latently infected with the pathogen. The survival of infected crayfish can be ascribed to both increased resistance of some crayfish populations and decreased virulence of some A. astaci strains. As the relationship between host and pathogen in Europe is changing, it is imperative to gain insights into what shapes these changes in order to understand the implications for the long-term coexistence of crayfish and A. astaci in Europe. With this thesis, I focused on the virulence of A. astaci, looking for mechanisms, patterns and determinants underlying the pathogen's virulence variability. In particular, by characterising the virulence of several A. astaci strains, I identified two different possible mechanisms of virulence loss. I revealed that A. astaci's virulence variability is not linked to variation in in vitro growth and sporulation, traits classically associated with a pathogen's virulence. Based on these results, I suggest that the pathogen's virulence determinants are likely its "virulence effectors", in which the A. astaci genome is enriched. Additionally, with the present work I provided transcriptomic evidence of coevolution between A. astaci and European crayfish. I showed that the haplogroups based on the canonical mitochondrial markers, often used to assess A. astaci's virulence to inform management actions, do not differ in several of their characteristic phenotypic traits, including virulence. Finally, after experimental characterisation of virulence and assessment of its likely phenotypic determinants, i.e., sporulation and growth, the next and more comprehensive step in studying the pathogen's virulence is through genomic approaches. To this aim, I provided key data for future comparative genomic studies, i.e., highly complete genome assemblies based on Nanopore (3) and Illumina (11) reads. These data can be exploited in several ways, from building a pangenome of the species to a genome-wide association study (GWAS), which can offer a much deeper understanding of A. astaci's virulence and adaptability. In particular, the identification of the loci associated with virulence through a GWAS has the potential to be revolutionary for the management of A. astaci, as it can become the basis for a genomic tool to quickly and accurately assess the virulence of newly introduced strains, directing management actions towards the more dangerous strains.
We report the experimental implementation of dynamical decoupling on a small, non-interacting ensemble of up to 25 optically trapped, neutral Cs atoms. The qubit consists of the two magnetic-insensitive Cs clock states \(\vert F = 3, m_F = 0\rangle\) and \(\vert F = 4, m_F = 0\rangle\), which are coupled by microwave radiation. We observe a significant enhancement of the coherence time when employing Carr-Purcell-Meiboom-Gill (CPMG) dynamical decoupling. A CPMG sequence with ten refocusing pulses increases the coherence time from 16.2(9) ms by more than one order of magnitude to 178(2) ms. In addition, we make use of the filter function formalism and utilise the CPMG sequence to measure the background noise floor affecting the qubit coherence, finding a power-law noise spectrum \(1/\omega^\alpha\) with \(\mathit{\alpha} = 0.89(2)\). This finding is in very good agreement with an independent measurement of the noise in the intensity of the trapping laser. Moreover, the measured coherence evolutions also exhibit signatures of low-frequency noise originating at distinct frequencies. Our findings point toward noise spectroscopy of engineered atomic baths through single-atom dynamical decoupling in a system of individual Cs impurities immersed in an ultracold 87Rb bath.
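The filter-function formalism mentioned above can be sketched numerically. The function below is an illustrative reimplementation, not the authors' analysis code: it computes the CPMG filter function \(|F(\omega)|^2\) directly from the sign-switching function of an idealised pulse sequence (instantaneous \(\pi\) pulses at the standard CPMG times).

```python
import numpy as np

def cpmg_filter(omega, T, n_pulses):
    """|F(omega)|^2 for an ideal CPMG sequence of total duration T with
    n_pulses instantaneous pi pulses at times t_j = (j - 1/2) * T / n.

    F(omega) is the Fourier integral of the switching function f(t), which
    starts at +1 and changes sign at every pi pulse.
    """
    t = np.linspace(0.0, T, 8001)
    dt = t[1] - t[0]
    pulse_times = (np.arange(1, n_pulses + 1) - 0.5) * T / n_pulses
    f = (-1.0) ** np.searchsorted(pulse_times, t)  # +/-1 switching function
    # Riemann-sum approximation of F(omega) = int_0^T f(t) e^{i omega t} dt
    F = (f[None, :] * np.exp(1j * np.outer(np.atleast_1d(omega), t))).sum(axis=1) * dt
    return np.abs(F) ** 2

# Example: a 10-pulse CPMG sequence of unit duration passes noise near
# omega ~ pi * n / T, i.e. more pulses push the passband to higher frequency.
F2 = cpmg_filter(np.linspace(0.5, 60.0, 400), 1.0, 10)
```

Because the filter acts as a narrow passband near \(\omega \approx \pi n / T\), sweeping the pulse number (or the sequence duration) scans the probe frequency across the noise spectrum, which is the basic idea behind extracting a power-law exponent such as the \(\alpha\) quoted above.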
Olive mill wastewater (OMW) is a by-product of olive oil extraction, and its disposal on soil has been associated with significant environmental challenges, including toxic effects on soil organisms and impaired groundwater quality due to its high phenolic content. Recent studies on the dynamics of OMW degradation in soil treat the environmental conditions as the main factors influencing the fate and transport of polyphenols in the soil-water system. An understanding of season-dependent phenol leaching from OMW-treated soil has remained elusive, however, as field studies are hindered by spatial variability and complex environmental dynamics. Therefore, controlled lysimeter experiments were conducted to investigate the leaching and transport mechanisms of OMW-derived phenolic compounds in soil.
This thesis presents the results of an 18-week lysimeter experiment conducted in a laboratory setting, aimed at monitoring and understanding the distribution and leaching of OMW-derived phenolic compounds in soil after OMW application. The experiment spanned four seasonal simulation phases, comprising two winter phases, one spring phase, and one summer phase, under semi-arid Tunisian climate conditions. The effects of OMW on soil leachate properties, soil water repellency, and soil water retention capacity were assessed.
The soil leachates exhibited varying degrees of recovery across the different simulation phases. However, persistent salinity in the leachates and high soil water repellency at the top of the OMW-treated soils were recorded. The findings also revealed that OMW application changed the pore size distribution in the OMW-treated soils. Most of the OMW-derived phenols were immobilized in the upper 5 cm of the soil. Notably, soluble phenolic compounds promoted the formation of coarser pores at the expense of fine pores, suggesting that OMW-derived organic carbon played a crucial role in controlling the depth-dependent transport mechanisms of OMW within the soil matrix.
In conclusion, this study provides valuable insights into the fate and impact of OMW-derived phenolic compounds in soil. It emphasizes the significance of conducting OMW applications with careful irrigation practices and thorough phenol leaching surveys to minimize the risk of potential groundwater contamination. Additionally, further experiments are warranted to investigate the sorption capacity of the soil during and after OMW application and its influence on the stability of soluble phenolic compounds in soils.
Thermo-optic interaction significantly differs from the usual particle-particle interactions in physics, as it is retarded in time. A prominent platform for realising this kind of interaction is the photon Bose–Einstein condensate, created in a dye-filled microcavity. The dye solution continually absorbs and re-emits the cavity photons, causing the photon gas to thermalize and to form a Bose–Einstein condensate. Because of a non-ideal quantum efficiency, these absorption-emission cycles heat the dye solution, creating a medium that provides an effective thermo-optic photon–photon interaction. So far, only a mean-field description of this process exists. This paper goes beyond it by working out a quantum mechanical description of the effective thermo-optic photon–photon interaction. To this end, a self-consistent modelling of the temperature diffusion forms the backbone of the theory. Furthermore, the widely separated experimental timescales allow for deriving an approximate Hamiltonian. The resulting quantum theory is applied in the perturbative regime to both a harmonic and a box potential for investigating its prospects for precise measurements of the effective photon–photon interaction strength.
Although photon Bose–Einstein condensates have already been used for studying many interesting effects, the precise role of the photon–photon interaction has not been fully clarified up to now. In view of this, it is advantageous that these systems allow measuring both the intensity of the light leaking out of the cavity and its spectrum at the same time. Therefore, the photon–photon interaction strength can be determined once via analysing the condensate broadening and once via examining the interaction-induced modifications of the cavity modes. As the former method depends crucially on the concrete shape of the trapping potential and the spatial resolution of the camera used, interferometric methods promise more precise measurements. To this end, the present paper works out the impact of the photon–photon interaction upon the cavity modes. A quantum mechanical description of the photon–photon interaction, including the thermal cloud, builds the theoretical backbone of the method. An exact diagonalisation approach introduced here exposes how the effective photon–photon interaction modifies both the spectrum and the width of the photon gas. A comparison with a variational approach based on the Gross–Pitaevskii equation quantifies the contribution of the thermal cloud in the respective applications.
Surface roughness plays a critical role in, e.g., fluid dynamics and contact mechanics. For example, real-world or numerical experiments are performed to evaluate fluid behavior for different roughness properties. Numerical simulations of rough surfaces can speed up these studies because they help collect more relevant information. However, current methods make it hard to simulate rough surfaces with deterministic or structured components. In this work, we present a novel approach to simulating rough surfaces with a Gaussian process (GP) and a noise model, because GPs can model structured and periodic elements. GPs generalize traditional methods and are not restricted to stationarity, so they can simulate a wider range of rough surfaces. In this paper, we summarize the theoretical similarities between GPs and auto-regressive moving-average processes and introduce a linear-process view of GPs. We also show examples of ground and honed surfaces simulated by a predefined model. The proposed method can also be used to fit a model to measurement data of a rough surface. In particular, we demonstrate this by modeling turned profiles and surfaces that are inherently periodic.
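The kernel view described above can be illustrated with a toy sketch (made-up kernel parameters, not the paper's fitted models): a rough profile combining random waviness, a periodic component such as turning marks, and fine-scale noise can be drawn from a GP whose covariance is a sum of a squared-exponential kernel, a periodic kernel, and white noise.

```python
import numpy as np

def sample_rough_profile(x, ell=0.5, sigma_se=1.0, period=2.0,
                         sigma_per=0.5, noise=0.05, seed=0):
    """Draw one profile realisation from a zero-mean GP whose kernel is a
    sum of a squared-exponential term (random waviness), a periodic term
    (structured, e.g. turning marks), and white noise (fine roughness).

    All kernel parameters are illustrative, not fitted to real surfaces.
    """
    d = np.abs(x[:, None] - x[None, :])  # pairwise distances
    k_se = sigma_se**2 * np.exp(-0.5 * (d / ell) ** 2)
    k_per = sigma_per**2 * np.exp(-2.0 * np.sin(np.pi * d / period) ** 2 / 0.3**2)
    K = k_se + k_per + noise**2 * np.eye(len(x))  # jitter keeps K positive definite
    rng = np.random.default_rng(seed)
    return rng.multivariate_normal(np.zeros(len(x)), K)

x = np.linspace(0.0, 10.0, 400)
z = sample_rough_profile(x)
```

Dropping the periodic term recovers a purely stochastic stationary surface, while increasing `sigma_per` makes the deterministic, structured component dominate, which is the flexibility the GP view adds over classical ARMA-style simulation.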
Envelope estimation techniques are an important tool for the functional characterization of technical surfaces. This paper describes a new method for generating profile envelope lines based on a simplified beam-surface contact model with intuitive parameterization. The method is closely related to spline filters and shares some of their positive characteristics, such as smoothness and robustness against isolated outliers. Unlike spline filters, however, the proposed method calculates envelope lines rather than mean lines. Several examples of calculated profile envelopes of sintered surfaces are shown, and a comparison with morphological methods, the current state of the art for envelope estimation, is presented.
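For context, the morphological baseline that the beam-contact method is compared against can be sketched in a few lines (an illustrative example, not the paper's implementation): a grey-value closing with a flat structuring element yields an upper envelope that fills valleys narrower than the element while always staying on or above the measured profile.

```python
import numpy as np
from scipy.ndimage import grey_closing

def morphological_envelope(profile, width):
    """Upper envelope of a 1-D profile via morphological closing with a
    flat structuring element of `width` samples.

    Closing is extensive (result >= input) and removes valleys narrower
    than the structuring element while leaving peaks untouched.
    """
    return grey_closing(profile, size=width)

# Toy profile: a wavy surface with superimposed fine roughness.
x = np.linspace(0.0, 4.0 * np.pi, 500)
profile = np.sin(x) + 0.1 * np.random.default_rng(0).standard_normal(500)
env = morphological_envelope(profile, 25)
```

The element width plays the role of the contact scale: wider elements bridge wider valleys, analogous to a stiffer beam in the contact model, though the morphological result lacks the smoothness that the spline-related approach provides.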