Adaptive Extraction and Representation of Geometric Structures from Unorganized 3D Point Sets
(2009)
The primary emphasis of this thesis concerns the extraction and representation of intrinsic properties of three-dimensional (3D) unorganized point clouds. The points establishing a point cloud, as it mainly emerges from LiDAR (Light Detection and Ranging) scan devices or from reconstruction from two-dimensional (2D) image series, represent discrete samples of real-world objects. Depending on the type of scenery the data is generated from, the resulting point cloud may exhibit a variety of different structures. Especially in the case of environmental LiDAR scans, the complexity of the corresponding point clouds is relatively high. Hence, finding new techniques allowing the efficient extraction and representation of the underlying structural entities has become an important research issue. This thesis introduces new methods for the extraction and visualization of structural features like surfaces and curves (e.g. ridge lines, creases) from 3D (environmental) point clouds. One main part concerns the extraction of curve-like features from environmental point data sets. It provides a new method supporting stable feature extraction by incorporating a probability-based point classification scheme that characterizes individual points regarding their affiliation with surface-, curve- and volume-like structures. Another part is concerned with surface reconstruction from (environmental) point clouds exhibiting objects of varying complexity. A new method providing multi-resolution surface representations from regular point clouds is discussed. Following the principles of this approach, a volumetric surface reconstruction method based on the proposed classification scheme is introduced. It allows the reconstruction of surfaces from highly unstructured and noisy point data sets. Furthermore, contributions in the field of reconstructing 3D point clouds from 2D image series are provided.
In addition, a discussion of the most important properties of (environmental) point clouds with respect to feature extraction is presented.
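Classifying points as surface-, curve- or volume-like is commonly done by eigenvalue analysis of the local covariance matrix of a point's neighborhood. The following minimal sketch illustrates that standard idea (it is an illustration under assumed conventions, not the probability-based scheme of the thesis itself):

```python
import numpy as np

def classify_point(neighbors):
    """Label a point by the eigenvalue spectrum of its local covariance.

    neighbors: (k, 3) array of the point's k nearest neighbors.
    Returns 'curve', 'surface' or 'volume'.
    """
    centered = neighbors - neighbors.mean(axis=0)
    cov = centered.T @ centered / len(neighbors)
    # Eigenvalues sorted in decreasing order: l1 >= l2 >= l3 >= 0.
    l1, l2, l3 = np.sort(np.linalg.eigvalsh(cov))[::-1]
    if l1 == 0.0:
        return 'volume'  # degenerate neighborhood, no dominant direction
    # Dimensionality features: linearity, planarity, scattering.
    linearity = (l1 - l2) / l1
    planarity = (l2 - l3) / l1
    scattering = l3 / l1
    # The largest feature decides the (hard) label; a probabilistic scheme
    # would instead interpret the three features as class likelihoods.
    return max((linearity, 'curve'),
               (planarity, 'surface'),
               (scattering, 'volume'))[1]
```

Points sampled along a straight edge yield one dominant eigenvalue (curve-like), points on a plane yield two (surface-like), and volumetric noise yields three of similar size.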
Photonic crystals are inhomogeneous dielectric media with a periodic variation of the refractive index. A photonic crystal gives us new tools for the manipulation of photons and has thus received great interest in a variety of fields. Photonic crystals are expected to be used in novel optical devices such as thresholdless laser diodes, single-mode light-emitting diodes, small waveguides with low-loss sharp bends, small prisms, and small integrated optical circuits. In some respects they can operate as "left-handed materials", which are capable of focusing transmitted waves into a sub-wavelength spot due to negative refraction. The thesis focuses on the applications of photonic crystals in communications and optical imaging:
- Photonic crystal structures for potential dispersion management in optical telecommunication systems
- 2D non-uniform photonic crystal waveguides with a square lattice for wide-angle beam refocusing using negative refraction
- 2D non-uniform photonic crystal slabs with a triangular lattice for all-angle beam refocusing
- A compact phase-shifted band-pass transmission filter based on photonic crystals
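The defining property of a photonic crystal, a frequency band in which propagation is forbidden, already appears in the simplest 1D case and can be computed with the standard characteristic (transfer) matrix method. The sketch below (a generic textbook calculation, not code from the thesis) shows a quarter-wave Bragg stack blocking light at its design wavelength while transmitting away from the gap; the indices and wavelength are arbitrary example values:

```python
import numpy as np

def transmission(wavelength, layers, n_in=1.0, n_out=1.0):
    """Power transmission through a 1D layer stack at normal incidence,
    via the characteristic (transfer) matrix method.

    layers: list of (refractive_index, thickness) tuples.
    """
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * n * d / wavelength  # phase thickness of the layer
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    # Transmission amplitude between semi-infinite media n_in and n_out.
    t = 2 * n_in / (n_in * M[0, 0] + n_in * n_out * M[0, 1]
                    + M[1, 0] + n_out * M[1, 1])
    return (n_out / n_in) * abs(t) ** 2

# Quarter-wave Bragg stack (a 1D photonic crystal) designed for lam0:
lam0, n_hi, n_lo = 1.55, 2.5, 1.5
stack = [(n_hi, lam0 / (4 * n_hi)), (n_lo, lam0 / (4 * n_lo))] * 10
```

At `lam0` the ten-period stack reflects almost everything (the photonic band gap); at twice the design wavelength most of the light passes through.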
Photochemical reactions are of great interest due to their importance in chemical and biological processes. Highly sensitive IR/UV double and triple resonance spectroscopy in molecular beam experiments, in combination with ab initio and DFT calculations, yields information on reaction coordinates and intersystem crossing (ISC) processes subsequent to photoexcitation. In general, molecular beam experiments enable the investigation of isolated, cold molecules without any influence of the environment. Furthermore, small aggregates can be analyzed in a supersonic jet by gradually adding solvent molecules like water. Conclusions concerning the interactions in solution can be derived by investigating and fully understanding small systems with a defined number of solvent molecules. In this work the first applications of combined IR/UV spectroscopy to reactive isolated molecules and triplet states in molecular beams without using any messenger molecules are presented. Special focus was placed on excited-state proton transfer reactions, which can also be described as keto-enol tautomerisms. Various molecules such as 3-hydroxyflavone, 2-(2-naphthyl)-3-hydroxychromone and 2,5-dihydroxybenzoic acid have been investigated with regard to this question. In the case of 3-hydroxyflavone and 2-(2-naphthyl)-3-hydroxychromone, the IR spectra have been recorded subsequent to an excited-state proton transfer. Furthermore, the dihydrate of 3-hydroxyflavone has been analyzed concerning a possible proton transfer in the excited state: the proton transfer reaction along the water molecules (proton wire) has to be induced by raising the excitation energy. However, photoinduced reactions involve not only singlet but also triplet states. Xanthone has been analyzed as an archetype molecule. After excitation to the S2 state, ISC occurs into the triplet manifold, leading to a population of the T1 state.
The IR spectrum of the T1 state has been recorded for the first time using the UV/IR/UV technique without any messenger molecules. Altogether, it is shown that IR/UV double and triple resonance techniques are suitable tools to analyze reaction coordinates of photochemical processes.
The manuscript is divided into seven chapters. Chapter 2 briefly introduces the reader to the elementary measures of classical continuum mechanics and thus familiarizes the reader with the employed notation. Furthermore, deeper insight into the proposed first-order computational homogenization strategy is presented. Based on the need for a discrete representative volume element (RVE), Chapter 3 focuses on a proper RVE generation algorithm. Therein, the algorithm itself is described in detail. Additionally, we introduce the concept of periodicity. The chapter concludes with multiple representative examples. A potential-based soft particle contact method, used for the computations on the microscale level, is defined in Chapter 4. Included are a description of the discrete element method (DEM) used, as well as the applied macroscopically driven Dirichlet boundary conditions. The chapter closes with the proposition of a proper solution algorithm as well as illustrative representative examples. Homogenization of the discrete microscopic quantities is discussed in Chapter 5. Therein, the focus is on the upscaling of the aggregate energy as well as on the derivation of related macroscopic stress measures. Quantities necessary for the coupling between a standard finite element method and the proposed discrete microscale are presented in Chapter 6. Therein, we turn to the derivation of the macroscopic tangent, necessary for inclusion into standard finite element programs. Chapter 7 focuses on the incorporation of inter-particle friction. We choose to derive a variational formulation of inter-particle friction forces, founded on a proposed reduced incremental potential. This contribution closes with a discussion as well as an outlook.
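The upscaling of discrete microscopic quantities to a macroscopic stress measure is classically done with a volume average over the particle contacts (the Love-Weber formula). The sketch below illustrates that generic formula, not the thesis' specific energy-based derivation:

```python
import numpy as np

def homogenized_stress(contacts, volume):
    """Volume-averaged (Love-Weber) stress of a granular RVE.

    contacts: iterable of (force, branch_vector) pairs, where 'force' is the
    contact force and the branch vector connects the two particle centres.
    Returns sigma = (1/V) * sum_c outer(f_c, l_c) as a 3x3 array.
    """
    sigma = np.zeros((3, 3))
    for f, l in contacts:
        sigma += np.outer(np.asarray(f, float), np.asarray(l, float))
    return sigma / volume
```

For example, two opposite contact forces along the x-axis produce a purely uniaxial macroscopic stress; in a coupled FE/DEM scheme this average would be evaluated once per RVE and returned to the macroscopic quadrature point.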
This PhD thesis aims at finding a global robot navigation strategy for rugged off-road terrain which is robust against inaccurate self-localization, scalable to large environments, and also cost-efficient, i.e. able to generate navigation paths which optimize a cost measure closely related to terrain traversability. In order to meet this goal, aspects of both metrical and topological navigation techniques are combined. A primarily topological map is extended with the previously lacking capability of cost-efficient path planning and map extension. Further innovations include a multi-dimensional cost measure for topological edges, a method to learn these costs based on live feedback from the robot, and a set of extrapolation methods to predict the traversability costs of untraversed edges. The thesis presents two sophisticated new image analysis techniques to optimize cost prediction based on the shape and appearance of the surrounding terrain. Experimental results indicate that the proposed global navigation system is indeed able to perform cost-efficient, large-scale path planning. At the same time, it avoids the need to maintain a fine-grained global world model, which would reduce the scalability of the approach.
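Cost-efficient planning on a topological map with multi-dimensional edge costs can be sketched as a shortest-path search in which the cost vector is collapsed into a scalar. The snippet below is a minimal illustration; the weighting and the graph are invented stand-ins for the thesis' learned traversability measure:

```python
import heapq

def plan_path(graph, start, goal, weights=(1.0, 1.0)):
    """Dijkstra search on a topological map.

    graph: {node: [(neighbour, (distance, traversability_cost)), ...]}
    The multi-dimensional edge cost is reduced to a scalar with 'weights',
    a stand-in for the learned cost measure.  Returns (cost, path).
    """
    frontier = [(0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if cost > best.get(node, float('inf')):
            continue  # stale queue entry
        for nbr, edge_cost in graph[node]:
            c = cost + sum(w * e for w, e in zip(weights, edge_cost))
            if c < best.get(nbr, float('inf')):
                best[nbr] = c
                heapq.heappush(frontier, (c, nbr, path + [nbr]))
    return float('inf'), []
```

With both cost components weighted equally, the planner prefers a slightly longer but easily traversable route over a short edge with a high traversability penalty.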
Proprietary polyurea-based thermosets (3P resins) were produced from polymeric methylene diphenyl isocyanate (PMDI) and water glass (WG) using a phosphate emulsifier. Polyisocyanates, when combined with WG in the presence of a suitable emulsifier, result in very versatile products. WG acts in the resulting polyurea, through a special sol-gel route, as a cheap precursor of the silicate (xerogel) filler produced in situ. The particle size of the silicate is coarse and its distribution very broad, which affects the mechanical properties of the 3P systems negatively. The research strategy was initially to achieve a fine water-in-oil type (W/O = WG/PMDI) emulsion by "hybridising" the polyisocyanate with suitable thermosetting resins (such as vinyl ester (VE), melamine/formaldehyde (MF) or epoxy resin (EP)). As the presently used phosphate emulsifiers may leak into the environment, the research work was directed at finding "reactive" emulsifiers which can be chemically built into the final polyurea-based thermosets. The progressive elimination of the organic phosphate, following the European Community Regulation on chemicals and their safe use (REACH), was studied, and alternative emulsifiers for the PMDI/WG systems were found. The new hybrid systems in which the role of the phosphate emulsifier has been taken over by suitable resins (VE, EP) or additives (MF) are designated 2P resins. Further, the cure behaviour (DSC, ATR-IR), chemorheology (plate/plate rheometer), morphology (SEM, AFM) and mechanical properties (flexure, fracture mechanics) were studied accordingly. The property upgrade targeted not only the mechanical performance but also the thermal and flame resistance. Therefore, emphasis was placed on improving the thermal and fire resistance (e.g. TGA, UL-94 flammability test) of the in-situ filled hybrid resins. Improvements in the fracture-mechanical as well as the flexural properties of the novel 3P and 2P hybrids were obtained.
This was accompanied in most cases by a pronounced reduction of the polysilicate particle size as well as by a finer dispersion. Further, the complex reaction kinetics of the reference 3P system were studied, and some of the main reactions taking place during the curing process were established. The pot life of the hybrid resins was in most cases prolonged, which facilitates the subsequent processing of such resins. The thermal resistance was also enhanced for all the novel hybrids. However, the hybridization strategy (mostly with EP and VE) did not yield satisfactory results with respect to fire resistance. Efforts will be made in the future to overcome this problem. Finally, it was confirmed that the elimination of the organic phosphate emulsifier is feasible, yielding the so-called 2P hybrids. These, in many cases, showed improved fracture-mechanical, flexural and thermal resistance properties as well as a finer and more homogeneous morphology. The novel hybrid resins with unusual characteristics (e.g. curing under wet conditions and even in water) are promising matrix materials for composites in various application fields such as infrastructure (rehabilitation of sewers), building and construction (refilling), and transportation (coating of vessels, pipes of improved chemical resistance)…
Elastomers and their various composites and blends are frequently used as engineering parts subjected to rolling friction. This fact alone substantiates the importance of a study addressing the rolling tribological properties of elastomers and their compounds. It is worth noting that, until now, research and development work on the friction and wear of rubber materials has mostly focused on abrasion and, to a lesser extent, on sliding types of loading. As the tribological knowledge acquired with various other counterparts can hardly be adopted for rubbers, there is a substantial need to study the latter. Therefore, the present work was aimed at investigating the rolling friction and wear properties of different kinds of elastomers against steel under unlubricated conditions. The rolling friction and wear properties of various rubber materials were studied in home-made rolling ball-on-plate test configurations under dry conditions. The materials inspected were ethylene/propylene/diene rubber (EPDM) without and with carbon black (EPDM_CB), hydrogenated acrylonitrile/butadiene rubber (HNBR) without and with carbon black/silica/multiwall carbon nanotubes (HNBR_CB/silica/MWCNT), a rubber-rubber hybrid (HNBR and fluororubber (HNBR-FKM)) and a rubber-thermoplastic blend (HNBR and cyclic butylene terephthalate oligomers (HNBR-CBT)). The dominant wear mechanisms were investigated by scanning electron microscopy (SEM) and analyzed as a function of composition and testing conditions. Differential scanning calorimetry (DSC), dynamic-mechanical thermal analysis (DMTA), atomic force microscopy (AFM) and transmission electron microscopy (TEM), along with other auxiliary measurements, were adopted to determine the phase structure and network-related properties of the rubber systems. The changes in friction and wear as a function of the type and amount of additives were explored.
The friction process of selected rubbers was also modelled by making use of the finite element method (FEM). The results show that the incorporation of filler generally enhanced the wear resistance, hardness, stiffness (storage modulus) and apparent crosslinking of the related rubbers (EPDM-, HNBR- and HNBR-FKM-based ones), but did not affect their glass transition temperature. Filling of rubbers usually reduced the coefficient of friction (COF). However, the tribological parameters also depended strongly on the test set-up and test duration. High wear loss was noticed for systems showing the occurrence of a Schallamach-type wavy pattern. The blends HNBR-FKM and HNBR-CBT were two-phase structured. In HNBR-FKM, the FKM was dispersed in the form of large microscaled domains in the HNBR matrix. This phase structure was not changed by the incorporation of MWCNT. It was established that the MWCNT were preferentially embedded in the HNBR matrix. Blending HNBR with FKM reduced the stiffness and degree of apparent crosslinking of the blend, which was traced to the dilution of the cure recipe with FKM. Contrary to expectation, the coefficient of friction increased with increasing FKM content. On the other hand, the specific wear rate (Ws) changed only marginally with increasing FKM content. In the HNBR-CBT hybrids, HNBR was the matrix, irrespective of the rather high CBT content. Both the partly and the mostly polymerized CBT ((p)CBT and pCBT, respectively) in the hybrids worked as active fillers and thus increased the stiffness and hardness. The COF and Ws decreased with increasing CBT content. The FEM results with respect to COF, achieved on systems possessing very different structures and thus properties (EPDM_30CB, HNBR-FKM 100-100 and HNBR-(p)CBT 100-100, respectively), were in accordance with the experimental results. This verifies that FEM can properly be used to capture the complex viscoelastic behaviour of rubber materials under dry rolling conditions.
It was recently reported that imatinib causes cell death in neonatal rat ventricular cardiomyocytes (NRVCM) by triggering endoplasmic reticulum (ER) stress and collapse of the mitochondrial membrane potential. Retroviral gene transfer of an imatinib-resistant mutant c-Abl into NRVCM appeared to alleviate imatinib-induced cell death, and it was concluded that the observed imatinib-induced cytotoxicity is mediated through direct interactions of imatinib with c-Abl. The imatinib effects were described as being specific for cardiomyocytes only and as relevant also for the in vivo situation in man [Kerkelä et al. 2006]. The goal of the present study was to reproduce the published experiments and to further explore the dose-response relationship of imatinib-induced cell death in cardiomyocytes. Additional markers of toxicity were investigated. The following biochemical assays were applied: LDH release (membrane leakage marker), MTS reduction (marker of mitochondrial integrity), cellular ATP content (energy homoeostasis) and caspase 3/7 activity (apoptosis). The endoplasmic reticulum (ER) stress markers eIF2α (eukaryotic initiation factor 2α), XBP1 (X-box binding protein 1) and CHOP (cAMP response element-binding transcription factor (C/EBP) homologous protein) were determined at the transcriptional and protein level. Online monitoring of cell attachment, oxygen consumption and acidification of the medium by rat heart cells (H9c2) seeded on chips (Bionas) allowed the determination of the onset and reversibility of changes in cellular functions. Image analysis measured the spontaneous beating rates after imatinib treatment. The role of imatinib-induced reactive oxygen species was evaluated directly by 2',7'-dichlorofluorescein fluorescence and indirectly by means of interference experiments with antioxidants. Whether the imatinib-induced effects were specific to cardiomyocytes was evaluated in fibroblasts derived from rat heart, lung and skin.
The specific role of c-Abl in imatinib-induced cellular toxicity was investigated by specific gene silencing of c-Abl in NRVCM. The results demonstrated that imatinib caused concentration-dependent cytotoxicity, apoptosis and ER stress in heart, skin and lung fibroblasts, similar to or stronger than those observed in cardiomyocytes. As in cardiomyocytes, ER stress markers in fibroblasts were only increased at cytotoxic concentrations of imatinib. This effect was not reversible; also, reactive oxygen species did not participate in the mechanism of imatinib-induced cytotoxicity in NRVCM. Small interfering RNA (siRNA)-mediated reduction of c-Abl mRNA levels by 51 % and c-Abl protein levels by 70 % neither had an effect on the spontaneous beating frequency of cardiomyocytes nor induced cytotoxicity, apoptosis, mitochondrial dysfunction or ER stress in NRVCM. Incubation of c-Abl siRNA-transfected NRVCM with imatinib suggested that reduced c-Abl protein levels did not rescue cardiomyocytes from imatinib-induced cytotoxicity. In conclusion, the results from this study do not support a specific c-Abl-mediated mechanism of cytotoxicity in NRVCM.
Most software systems are described in high-level modelling or programming languages. Their runtime behavior, however, is determined by the compiled code. For non-critical software, it may be sufficient to test the runtime behavior of the code. For safety-critical software, there is an additional aggravating factor: the code must satisfy the formal specification which reflects the safety policy of the software consumer, and the software producer is obliged to demonstrate that the code is correct with respect to this specification using formal verification techniques. In this scenario, it is of great importance that static analyses and formal methods can be applied at the source code level, because this level is more abstract and better suited for such techniques. However, the results of the analyses and the verification can only be carried over to the machine code level if we can establish the correctness of the translation. Thus, compilation is a crucial step in the development of software systems, and formally verified translation correctness is essential to close the formalization chain from high-level formal methods to the machine-code level. In this thesis, I propose an approach to certifying compilers which achieves this aim by applying techniques from mathematical logic and programming language semantics. In the proposed approach, called foundational translation validation (FTV), the software producer implements an FTV system comprising a compiler and a specification and verification framework (SVF) implemented in higher-order logic (HOL). The most important part of the SVF is an explicit translation contract which comprises the formalizations of the source and target languages of the compiler and the formalization of a binary translation correctness predicate corrTrans(S,T) for source programs S and target programs T.
The formalizations of the languages are realized as deep embeddings in HOL. This makes it possible to declare a whole program in a formalized language as a HOL constant. The predicate formally specifies when T is considered to be a correct translation of S. Its definition is explicitly based on the program semantics definitions provided by the translation contract. Subsequent to the translation, the compiler translates the source and target programs into their syntactic representations as HOL constants, S and T, and generates a proof of corrTrans(S,T). We call a compiler which follows the FTV approach a proof-generating compiler. Our approach borrows the idea of representing programs in correctness proofs as logic constants from the foundational proof-carrying code (FPCC) approach. The novel features that distinguish our approach from other approaches to certifying compilers, such as proof-carrying code (PCC) and translation validation (TV), are the following. Firstly, the presence of an explicit translation contract formalized in HOL: PCC and TV do not formalize a translation contract explicitly; instead, they incorporate the operational semantics and the translation correctness criterion in translation validation tools at the programming language level. Secondly, the representation of programs in correctness proofs as logic constants: PCC and TV translate programs into representations as semantic abstractions that serve as inputs to translation validation tools. Thirdly, the certification of program transformation chains: unlike the TV approach, which certifies single program transformations, the FTV approach certifies whole chains of program transformations.
This is possible because the translation contract provides, for all programming languages involved in the program transformation chain, definitions of program semantics functions which map programs to mathematical objects that are elements of a set with an (at least) partial order "<=". The proof then makes use of the fact that the relation "<=" is transitive. In this thesis, the feasibility of the FTV approach is demonstrated by the implementation of an FTV system. The system comprises a compiler front-end that certifies its optimization phase and an accompanying SVF implemented in the theorem prover Isabelle/HOL. The compiler front-end translates programs in a small C-like programming language, performs three optimizations (constant folding, dead assignment elimination and loop invariant hoisting), and generates translation certificates in the form of Isabelle/HOL theories. The main focus of the thesis is on the description of the SVF and its translation verification techniques.
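The shape of such a system can be sketched in miniature: an optimizer transforms a program, and a correctness predicate relates source and target under a common semantics. The toy below illustrates constant folding and a corrTrans-style check by evaluation on sample environments; the real FTV system instead proves the predicate once and for all in Isabelle/HOL, so this is only an analogy, not the thesis' implementation:

```python
# A toy expression language: ('const', n) | ('var', name) | ('add', e1, e2)

def evaluate(expr, env):
    """The (toy) program semantics a translation contract would formalize."""
    tag = expr[0]
    if tag == 'const':
        return expr[1]
    if tag == 'var':
        return env[expr[1]]
    return evaluate(expr[1], env) + evaluate(expr[2], env)

def fold(expr):
    """Constant folding -- one of the optimizations the front-end certifies."""
    if expr[0] != 'add':
        return expr
    a, b = fold(expr[1]), fold(expr[2])
    if a[0] == 'const' and b[0] == 'const':
        return ('const', a[1] + b[1])
    return ('add', a, b)

def corr_trans(source, target, envs):
    """Stand-in for corrTrans(S, T): checked here by running both programs
    on sample environments; in FTV this is a machine-checked HOL proof."""
    return all(evaluate(source, e) == evaluate(target, e) for e in envs)
```

Because semantic equivalence is transitive, certificates for individual transformations compose into a certificate for the whole chain, which is exactly the point made above about the partial order "<=".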
This dissertation is intended to give a systematic treatment of hypersurface singularities in arbitrary characteristic which provides the necessary tools, theoretical and computational, for the purpose of classification. The thesis consists of five chapters. In Chapter 1, we introduce the background on isolated hypersurface singularities needed for our work. In Chapter 2, we formalize the notion of piecewise-homogeneous grading and discuss non-degeneracy in arbitrary characteristic thoroughly. Chapter 3 is devoted to determinacy and normal forms of isolated hypersurface singularities. In the first part, we give finite determinacy theorems in arbitrary characteristic with respect to right and contact equivalence, respectively. Furthermore, we show that the "isolated" and finite determinacy properties are equivalent. In the second part, we formalize Arnol'd's key ideas for the computation of normal forms and define the conditions (AA) and (AAC). The last part of Chapter 3 is devoted to the study of normal forms in the general setting of hypersurface singularities, imposing neither condition (A) nor Newton non-degeneracy. In Chapter 4, we present algorithms, implemented in Singular, for the explicit computation of regular bases and normal forms. In Chapter 5, we transfer some classical results on invariants over the field C of complex numbers to algebraically closed fields of characteristic zero via the Lefschetz principle.
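A basic building block of such classifications is the Milnor number of an isolated hypersurface singularity, the dimension of the local algebra modulo the Jacobian ideal, which Gröbner-basis systems like Singular compute by counting standard monomials. The sketch below reproduces that standard computation in SymPy over the rationals (a generic illustration, not the thesis' arbitrary-characteristic algorithms):

```python
from itertools import product
from sympy import symbols, diff, groebner, Poly

def milnor_number(f, variables):
    """dim k[x]/Jacobian(f) for an isolated singularity, obtained by counting
    the standard monomials of a Groebner basis of the Jacobian ideal."""
    jac = [diff(f, v) for v in variables]
    G = groebner(jac, *variables, order='grevlex')
    # Leading exponent tuple of each basis element.
    leads = [Poly(g, *variables).monoms(order='grevlex')[0] for g in G.exprs]
    # For a zero-dimensional ideal, pure powers of every variable occur among
    # the leading monomials, so their max total degree bounds the enumeration.
    bound = max(sum(m) for m in leads) + 1
    divisible = lambda m, l: all(mi >= li for mi, li in zip(m, l))
    return sum(1 for m in product(range(bound), repeat=len(variables))
               if not any(divisible(m, l) for l in leads))

x, y = symbols('x y')
mu = milnor_number(x**3 + y**4, [x, y])  # Brieskorn: (3-1)*(4-1) = 6
```

For the plane-curve singularity x^3 + y^4 the Jacobian ideal is generated by x^2 and y^3, leaving the six standard monomials x^i y^j with i < 2, j < 3.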