Epoxy resins have achieved acceptance as adhesives, coatings, and potting compounds, but their main application is as a matrix for producing reinforced composites. However, their usefulness in this field is still limited by their brittle nature. Several studies have addressed the toughness of epoxy composites; the most successful approach is the modification of the polymer matrix with a second toughening phase.
Resin Transfer Molding (RTM) is one of the most important technologies for manufacturing fiber-reinforced composites. In the last decade it has experienced renewed impetus, owing to its suitability for producing large-surface composites with good technical properties at relatively low cost.
This research work focuses on the development of novel modified epoxy matrices with enhanced mechanical and thermal properties, suitable for processing by resin transfer molding, in order to manufacture Glass Fiber Reinforced Composites (GFRCs) with improved performance compared to commercially available ones.
In the first stage of the project, a neat epoxy resin (EP) was modified using two different nano-sized ceramics, silicon dioxide (SiO2) and zirconium dioxide (ZrO2), and micro-sized particles of silicone rubber (SR) as a second filler. Series of nanocomposites and hybrid modified epoxy resins were obtained by systematic variation of the filler contents. The rheology and curing behavior of the modified epoxy resins were determined in order to assess their suitability for RTM processing. The resulting matrices were extensively characterized, qualitatively and quantitatively, to determine precisely the effect of each filler on the polymer properties.
It was shown that the nanoparticles confer better mechanical properties on the epoxy resin, including modulus and toughness. It was possible to improve the tensile modulus and toughness of the epoxy matrix simultaneously by more than 30% and 50%, respectively, using only 8 vol.-% nano-SiO2 as filler. A similar performance was obtained with nanocomposites containing zirconia: the epoxy matrix modified with 8 vol.-% ZrO2 showed tensile modulus and toughness improvements of up to 36% and 45%, respectively, relative to EP.
On the other hand, the addition of silicone rubber to EP and to the nanocomposites results in superior toughness but has a slightly negative effect on modulus and strength. The addition of 3 vol.-% SR to the neat epoxy and to the nanocomposites increases their toughness between 1.5- and 2.5-fold, but it also implies a reduction of their tensile modulus and strength in the range of 5-10%. Therefore, when the right proportions of nanoceramic and rubber were added to the epoxy resin, hybrid epoxy matrices were obtained with a fracture toughness threefold higher than EP and, at the same time, a modulus improved by up to 20%.
Extensive investigations were carried out to identify the structural mechanisms responsible for these improvements. It was found that each type of filler induces specific energy-dissipating mechanisms during mechanical loading and fracture, which are closely related to its nature, its morphology and, of course, its bonding with the epoxy matrix. When both nanoceramic and silicone rubber are involved in the epoxy formulation, a superposition of their corresponding energy-release mechanisms is generated, which provides the matrix with an unusual balance of properties.
Glass fiber reinforced RTM plates were produced from the modified matrices. The structure of the resulting composites was analyzed microscopically to determine their impregnation quality. In all cases, composites with no structural defects (i.e. voids, delaminations) and a good surface finish were obtained. The composites were also thoroughly characterized. As expected, the final performance of the GFRCs is strongly determined by the matrix properties; the enhancement achieved in the epoxy matrices thus translates into better macroscopic properties of the GFRCs. Composites with strength enhanced by up to 15% and toughness improved by up to 50% were obtained from the modified epoxy matrices.
2,3,7,8-Tetrachlorodibenzo-p-dioxin (TCDD) is a highly toxic and persistent organic pollutant which is found ubiquitously in the environment. This prototype dioxin compound has been classified as a human carcinogen by the International Agency for Research on Cancer. TCDD acts as a potent liver tumor promoter in rats, which is one of the major concerns related to TCDD exposure. There is extensive evidence that TCDD exerts anti-estrogenic effects via arylhydrocarbon receptor (AhR)-mediated induction of cytochromes P450 and interferes with the estrogen receptor alpha (ERalpha)-mediated signaling pathway. The present work was conducted to test the hypothesis that enhanced activation of estradiol metabolism by TCDD-induced enzymes, mainly CYP1A1 and CYP1B1, leads to oxidative DNA damage in liver cells. Furthermore, possible modulation by 17beta-estradiol (E2) was investigated. The effects were examined using four different AhR-responsive, species- and sex-specific liver cell models: the rat H4II2 and human HepG2 hepatoma cell lines as well as primary hepatocytes from male and female Wistar rats. Effective induction of CYP1A1 and CYP1B1 by TCDD was demonstrated in all liver cell models. Basal and TCDD-induced expression of CYP1B1, a key enzyme in stimulating E2 metabolism via formation of the more reactive, genotoxic 4-hydroxyestradiol, was most pronounced in rat primary hepatocytes. CYP-dependent induction of reactive oxygen species (ROS) was observed only in rodent cells. E2 induced ROS only in primary rat hepatocytes, which was associated with a weak induction of CYP1B1 mRNA. Thus, E2 itself was suggested to induce its own metabolism in primary rat hepatocytes, resulting in redox cycling of catechol estradiol metabolites and, thereby, ROS formation. In this study, the role of TCDD and E2 in oxidative DNA damage was investigated for the first time in vitro in the comet assay using liver cells.
Both TCDD and E2 were shown to induce oxidative DNA base modifications only in rat hepatocytes. Additionally, direct oxidative DNA-damaging effects of the two main E2 metabolites, 4-hydroxyestradiol and 2-hydroxyestradiol, were observed only in rat hepatocytes, and both damaged the DNA to the same extent. However, the induction of oxidative DNA damage by E2 could not be completely explained by the metabolic conversion of E2 via CYP1A1 and CYP1B1 and has to be investigated further. The expression of low levels of endogenous ERalpha mRNA in primary rat hepatocytes and the lack of ERalpha in the hepatoma cell lines were identified as crucial. Therefore, the effects of the interference of ERalpha with AhR were examined in HepG2 cells transiently transfected with ERalpha. The over-expression of ERalpha led to enhanced AhR-mediated transcriptional activity in response to E2, suggesting a possible regulation of E2 levels. In turn, TCDD reduced E2-mediated ERalpha signaling, confirming the anti-estrogenic action of TCDD. Such a modulation of the combined effects of TCDD and E2 was not observed in any of the other experiments; thus, the role of low endogenous ERalpha levels has to be investigated further in transfection experiments using rat primary hepatocytes. Overall, the rat primary hepatocyte culture turned out to be the more suitable cell model for investigating metabolism in the liver, reflecting the situation in liver tissue more realistically. Nevertheless, during this work a crosstalk between ERalpha and AhR was shown for the first time in the human hepatoma cell line HepG2 by transient transfection of ERalpha.
We study the extension of techniques from Inductive Logic Programming (ILP) to temporal logic programming languages. To this end, we present two temporal logic programming languages and analyse the learnability of programs in these languages from finite sets of examples. For first-order temporal logic, the following topics are analysed:
- How can we characterize the denotational semantics of programs?
- Which proof techniques are best suited?
- How complex is the learning task?
For propositional temporal logic, we analyse the following topics:
- How can we use well-known techniques from model checking in order to refine programs?
- How complex is the learning task?
In both cases we present estimates of the VC-dimension of selected classes of programs.
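The thesis's VC-dimension estimates concern classes of temporal logic programs, which are not reproduced in this abstract. As a generic illustration of what a VC-dimension statement asserts, the following sketch brute-forces the VC-dimension of a toy hypothesis class (one-sided threshold functions on a small integer domain); the class and all names are illustrative, not taken from the thesis.

```python
from itertools import combinations

def shatters(hypotheses, points):
    """True if the hypothesis set realizes every +/- labeling of `points`."""
    labelings = {tuple(h(x) for x in points) for h in hypotheses}
    return len(labelings) == 2 ** len(points)

def vc_dimension(hypotheses, domain, max_d=5):
    """Largest d such that some d-subset of `domain` is shattered (brute force)."""
    d = 0
    for k in range(1, max_d + 1):
        if any(shatters(hypotheses, s) for s in combinations(domain, k)):
            d = k
    return d

# Toy class: one-sided thresholds h_t(x) = [x >= t].  Any single point can be
# labeled both ways, but for x1 < x2 the labeling (+, -) is impossible because
# the hypotheses are monotone, so the VC-dimension is 1.
domain = range(10)
thresholds = [lambda x, t=t: x >= t for t in range(11)]
print(vc_dimension(thresholds, domain))  # → 1
```

The brute-force search is exponential in the domain size; for the program classes studied in the thesis, such bounds are derived analytically rather than enumerated.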
Proteins of the intermembrane space of mitochondria are generally encoded by nuclear genes and synthesized in the cytosol. A group of small intermembrane space proteins lack classical mitochondrial targeting sequences; these proteins are imported in an oxidation-driven reaction that relies on the activity of two components, Mia40 and Erv1, which together constitute the mitochondrial disulfide relay system. Mia40 functions as an import receptor that interacts with incoming polypeptides via transient, intermolecular disulfide bonds. Erv1 is an FAD-binding sulfhydryl oxidase that activates Mia40 by re-oxidation, but how Erv1 itself is re-oxidized has been poorly understood. Here, I show that Erv1 interacts with cytochrome c, which provides a functional link between the mitochondrial disulfide relay system and the respiratory chain. This mechanism not only increases the efficiency of mitochondrial import through the re-oxidation of Erv1 and Mia40, but also prevents the formation of deleterious hydrogen peroxide within the intermembrane space. Thus, the mitochondrial disulfide relay system is, analogous to that of the bacterial periplasm, connected to the electron transport chain of the inner membrane, which possibly allows an oxygen-dependent regulation of mitochondrial import rates. In addition, I modeled the structure of Erv1 on the basis of the Saccharomyces cerevisiae Erv2 crystal structure in order to gain insight into the molecular mechanism of Erv1. Given the high degree of sequence homology, various characteristics found for Erv2 are also valid for Erv1. Finally, I propose a regulatory function of the disulfide relay system on the respiratory chain: the disulfide relay system senses the molecular oxygen level in mitochondria and is thus able to adapt respiratory chain activity in order to prevent wastage of NADH and production of ROS.
This dissertation is intended to give a systematic treatment of hypersurface singularities in arbitrary characteristic which provides the necessary tools, theoretical and computational, for the purpose of classification. The thesis consists of five chapters. In Chapter 1, we introduce the background on isolated hypersurface singularities needed for our work. In Chapter 2, we formalize the notion of a piecewise-homogeneous grading and discuss non-degeneracy in arbitrary characteristic thoroughly. Chapter 3 is devoted to determinacy and normal forms of isolated hypersurface singularities. In the first part, we give finite determinacy theorems in arbitrary characteristic with respect to right and contact equivalence, respectively; furthermore, we show that the properties of being isolated and being finitely determined are equivalent. In the second part, we formalize Arnol'd's key ideas for the computation of normal forms and define the conditions (AA) and (AAC). The last part of Chapter 3 is devoted to the study of normal forms in the general setting of hypersurface singularities, imposing neither condition (A) nor Newton non-degeneracy. In Chapter 4, we present algorithms, implemented in Singular, for the explicit computation of regular bases and normal forms. In Chapter 5, we transfer some classical results on invariants over the field C of complex numbers to algebraically closed fields of characteristic zero, using what is known as the Lefschetz principle.
The subject of this book is an epistemological consideration which could be characterised as a main theme - maybe the main theme - of that part of philosophy we all know as epistemology: the nature of knowledge. But unlike most essays on the subject of knowledge, I am here going to deal with a largely overlooked approach to answering the epistemological question of knowledge: the mental state account of knowledge (Price, in his 'Belief', uses the formulation "mental acts", and Williamson talks about a "state of mind"). Or, to put it in the question I chose as the title: is knowledge a mental state? We have to concede, first, that only a small group of philosophers has explained knowledge in terms of a mental state, particularly the 'Oxford Realists'; and secondly, that acceptance of the MS thesis is low. There is an interesting detail here: unlike the poor interest in an epistemic theory such as the MS thesis, philosophers like Prichard or Austin (and their philosophical thinking) are not really living in the shadows of philosophical consideration. Indeed, their philosophical impact is considerable, if we consider for instance Prichard's moral writings or Austin's theory of speech acts. I think we can conclude from this that the neglect of their epistemological point of view was not caused by a poor quality of their philosophy. Now, the question we are faced with (and that should be answered here) is: what is wrong with the MS thesis even though it is held by first-class philosophers? Why is the epistemic thinking of Cook Wilson, Prichard and Austin met with such neglect? I will try to explain this later on with the notion of an unreflected Platonic heritage during 2000 years of epistemic thinking - a notion which is similar to a point Hetherington has called "epistemic absolutism".
So, there are three main purposes which I pursue in this consideration: 1. To explain the reasons why there is such neglect of the MS thesis; I pursue this through an analysis of knowledge which demonstrates the inadequacy of the JTB thesis as an analysis of knowledge. 2. To show that it is a mistake to ignore, or at least to underestimate, the MS thesis in the discussion of an appropriate definition of knowledge, and to maintain that the MS thesis is the key to a general theory of knowledge. 3. Conclusion: if the first two steps are correct, the JTB thesis is insufficient to give an account of the nature of knowledge in general. A consequence of this is that all the epistemic theories which build on the JTB thesis are based on deficient assumptions; hence their results - notably the well-known externalism/internalism debate - are insufficient, too. So there is a need for a new theory of knowledge based on the MS thesis. In the course of my consideration I justify the following three theses: i) The JTB thesis as a definition of knowledge in general is deficient, since it describes the propositional aspect of knowledge only. But propositional knowledge - the so-called 'knowledge that' - is merely one element among others that has to be recognized in the search for a theory of knowledge. ii) The status of 'knowledge that' is derivative, not ultimate. It is derived from non-propositional knowledge in order to make the non-propositional knowledge communicable to others. The mode of 'knowledge that' is indirect and thus can be stated from the third-person point of view only. The ultimate kind of knowledge - the knowledge from which 'knowledge that' is derived - is non-propositional knowledge. Its mode is direct and hence it is restricted to the first-person point of view. Therefore, the basis for a theory of knowledge in general has to be this non-propositional aspect of knowledge. iii) Hence, taking the first two theses for granted, an appropriate theory of knowledge needs an account of non-propositional knowledge. The MS thesis will accomplish this task.
Limit theorems constitute a classical and important field in probability theory. In several applications, in particular in demographic or medical contexts, killed Markov processes suggest themselves as models for populations undergoing culling by mortality or other processes. In these situations, mathematical interest centers on the observable distribution of survivors, known as the Yaglom limit or quasi-stationary distribution. Previous work often focuses on discrete state spaces, commonly birth-death processes (or processes with a somewhat more flexible localization of the transitions), with killing only on the boundary. The central concerns of this thesis are to describe, for a given class of one-dimensional diffusion processes, the quasi-stationary distributions (if any), and to describe the convergence (or not) of the process conditioned on survival to one of these quasi-stationary distributions. Rather general diffusion processes on the half-line are considered, where 0 is allowed to be a regular or an exit boundary. Very similar techniques are applied in this work in order to derive results on the large-time behavior of an exotic measure-valued process closely related to so-called point interactions, which have been widely studied in the mathematical physics literature.
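The Yaglom limit described above can be illustrated numerically: simulate many paths of a diffusion killed at the boundary and look at the empirical law of the survivors at a large time. The following Monte Carlo sketch uses Brownian motion with negative drift killed at 0 as a toy example; the thesis treats much more general diffusions analytically, and all parameter values here are illustrative assumptions.

```python
import math
import random

def simulate_killed_paths(n_paths=5000, T=5.0, dt=0.01, x0=1.0, mu=-0.5, sigma=1.0, seed=1):
    """Euler-Maruyama simulation of dX = mu*dt + sigma*dW on the half-line,
    with each path killed when it first reaches 0.  Returns the positions of
    the paths still alive at time T."""
    rng = random.Random(seed)
    steps = int(T / dt)
    noise = sigma * math.sqrt(dt)
    survivors = []
    for _ in range(n_paths):
        x = x0
        for _ in range(steps):
            x += mu * dt + noise * rng.gauss(0.0, 1.0)
            if x <= 0.0:
                break  # path is killed at the boundary
        else:
            survivors.append(x)
    return survivors

# The empirical law of X_T conditioned on survival approximates the Yaglom
# limit (quasi-stationary distribution) when T is large.
alive = simulate_killed_paths()
print("survival probability ~", len(alive) / 5000)
print("conditioned mean     ~", sum(alive) / len(alive))
```

With drift toward the boundary, the unconditioned survival probability decays to 0, while the conditioned law stabilizes; this separation of scales is exactly what quasi-stationarity captures.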
This dissertation provides insights into the influences of individual and contextual factors on the learning and professional development of Technical and Vocational Education and Training (TVET) teachers in Ethiopia. Specifically, the research focuses on identifying and determining the influences of the teachers' self-perception as learners and professionals, and investigates the impact of the context, process and content of their learning and experiences on their professional development. Knowledge of these factors and their impacts helps to improve the learning, professional development and professionalization of TVET teachers. The research addresses the following five questions. (1) How do TVET teachers perceive themselves as active learners and as professionals, and what are the implications of their perceptions for their learning and development? (2) How do TVET teachers engage in learning and professional development activities? (3) What contextual factors facilitate or hinder the TVET teachers' learning and professional development? (4) Which competencies are critical for the TVET teachers' learning and professional development? (5) What actions need to be considered to enhance and sustain TVET teachers' learning and professional development in their context? The research results are believed to be significant not only for the TVET teachers, but also for school leaders, TVET teacher training institutions, education experts and policy makers, researchers and other stakeholders in the TVET sector. The theoretical perspective adopted in this research is based on a systemic constructivist approach to professional development. An integrated approach to professional development requires that the teachers' learning and development activities be treated as adult education based on the principles of constructivism.
Professional development is considered a context-specific, long-term process in which teachers are trusted, respected and empowered as professionals. Teachers' development activities are viewed as largely collaborative, reflecting the social nature of learning. Schools that facilitate the learning and development of teachers exhibit the characteristics of a learning-organisation culture in which professional collaboration, collegiality and shared leadership are practiced. The research also draws on relevant studies and reports on vocational education and on TVET teacher education programs and practices at the international, continental and national levels. The research objectives and the types of research questions in this study implied a qualitative, inductive research strategy. Primary data were collected from TVET teachers in four schools using one-on-one qualitative in-depth interviews. These data were analyzed using qualitative content analysis based on the inductive category development procedure; the ATLAS.ti software supported the coding and categorization process. The findings showed that most of the TVET teachers perceive themselves neither as professionals nor as active learners. These perceptions were found to be among the major barriers to their learning and development. Professional collaboration in the schools is minimal, and teaching is treated as an isolated individual activity, a secluded task for the teacher. Self-directed learning initiatives and individual learning projects are not strongly evident. The predominantly teacher-centered approach used in TVET teacher education and professional development programs puts the emphasis mainly on the development of technical competences and has limited the development of a range of competences essential to teachers' professional development.
Moreover, factors such as the TVET school culture, society's perception of the teaching profession, economic conditions, and weak links with industry and the business sector are among the major contextual factors that hinder the TVET teachers' learning and professional development. A number of recommendations are put forward to improve the professional development of the TVET teachers. These include a change in the TVET school culture, a paradigm shift in the approach to and practice of TVET teacher education, and the development of educational policies that support the professionalization of TVET teachers. Areas for further theoretical research and empirical enquiry are also suggested to support the learning and professional development of TVET teachers in Ethiopia.
Most software systems are described in high-level model or programming languages. Their runtime behavior, however, is determined by the compiled code. For uncritical software, it may be sufficient to test the runtime behavior of the code. For safety-critical software, there is the additional requirement that the code satisfy the formal specification reflecting the safety policy of the software consumer, and the software producer is obliged to demonstrate, using formal verification techniques, that the code is correct with respect to this specification. In this scenario, it is of great importance that static analyses and formal methods can be applied at the source code level, because this level is more abstract and better suited to such techniques. However, the results of the analyses and the verification can only be carried over to the machine code level if we can establish the correctness of the translation. Thus, compilation is a crucial step in the development of software systems, and formally verified translation correctness is essential to close the formalization chain from high-level formal methods to the machine code level. In this thesis, I propose an approach to certifying compilers which closes this formalization chain by applying techniques from mathematical logic and programming language semantics. In this approach, called foundational translation validation (FTV), the software producer implements an FTV system comprising a compiler and a specification and verification framework (SVF) implemented in higher-order logic (HOL). The most important part of the SVF is an explicit translation contract, which comprises the formalizations of the source and target languages of the compiler and the formalization of a binary translation correctness predicate corrTrans(S,T) for source programs S and target programs T.
The formalizations of the languages are realized as deep embeddings in HOL. This makes it possible to declare a whole program of a formalized language as a HOL constant. The predicate corrTrans formally specifies when T is considered a correct translation of S; its definition is explicitly based on the program semantics definitions provided by the translation contract. Subsequent to the translation, the compiler translates the source and target programs into their syntactic representations as HOL constants, S and T, and generates a proof of corrTrans(S,T). We call a compiler which follows the FTV approach a proof-generating compiler. Our approach borrows the idea of representing programs in correctness proofs as logic constants from the foundational proof-carrying code (FPCC) approach. The novel features that distinguish our approach from other approaches to certifying compilers, such as proof-carrying code (PCC) and translation validation (TV), are the following. Firstly, the presence of an explicit translation contract formalized in HOL: PCC and TV do not formalize a translation contract explicitly; instead, they incorporate the operational semantics and the translation correctness criterion into translation validation tools at the programming language level. Secondly, the representation of programs in correctness proofs as logic constants: PCC and TV translate programs into representations as semantic abstractions that serve as inputs for translation validation tools. Thirdly, the certification of program transformation chains: unlike the TV approach, which certifies single program transformations, the FTV approach certifies whole chains of program transformations.
This is possible because the translation contract provides, for all programming languages involved in the program transformation chain, definitions of program semantics functions which map programs to mathematical objects that are elements of a set with an (at least) partial order "<="; the proof then makes use of the fact that the relation "<=" is transitive. In this thesis, the feasibility of the FTV approach is demonstrated by the implementation of an FTV system. The system comprises a compiler front-end that certifies its optimization phase and an accompanying SVF implemented in the theorem prover Isabelle/HOL. The compiler front-end translates programs in a small C-like programming language, performs three optimizations - constant folding, dead assignment elimination, and loop invariant hoisting - and generates translation certificates in the form of Isabelle/HOL theories. The main focus of the thesis is on the description of the SVF and its translation verification techniques.
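To make the translation-validation idea concrete, the following toy Python sketch shows constant folding (one of the three optimizations named above) on a minimal expression language, together with a runtime check that source and target agree on sample environments. This check is only a lightweight stand-in for the thesis's formal corrTrans proofs in Isabelle/HOL, which quantify over all inputs; the expression encoding and all names are illustrative assumptions.

```python
# A toy expression language: ('const', n), ('var', name), ('add', e1, e2).

def fold(e):
    """Constant folding: evaluate additions whose operands are both literals."""
    if e[0] == 'add':
        a, b = fold(e[1]), fold(e[2])
        if a[0] == 'const' and b[0] == 'const':
            return ('const', a[1] + b[1])
        return ('add', a, b)
    return e  # constants and variables are left unchanged

def eval_expr(e, env):
    """Denotational semantics: map an expression and environment to an integer."""
    if e[0] == 'const':
        return e[1]
    if e[0] == 'var':
        return env[e[1]]
    return eval_expr(e[1], env) + eval_expr(e[2], env)

# Translation-validation style check: the folded program must agree with the
# source on the tested environments (a formal proof would cover all of them).
src = ('add', ('add', ('const', 2), ('const', 3)), ('var', 'x'))
tgt = fold(src)
print(tgt)  # → ('add', ('const', 5), ('var', 'x'))
assert all(eval_expr(src, {'x': v}) == eval_expr(tgt, {'x': v}) for v in range(-5, 6))
```

Because both programs are interpreted by the same semantics function, chaining further transformations (as in the FTV certification of transformation chains) reduces to composing such agreement facts, mirroring the transitivity argument for "<=" described above.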
Photochemical reactions are of great interest due to their importance in chemical and biological processes. Highly sensitive IR/UV double and triple resonance spectroscopy in molecular beam experiments, in combination with ab initio and DFT calculations, yields information on reaction coordinates and intersystem crossing (ISC) processes subsequent to photoexcitation. In general, molecular beam experiments enable the investigation of isolated, cold molecules without any influence of the environment. Furthermore, small aggregates can be analyzed in a supersonic jet by gradually adding solvent molecules such as water. Conclusions concerning the interactions in solution can be derived by investigating and fully understanding small systems with a defined number of solvent molecules. In this work, the first applications of combined IR/UV spectroscopy to reactive isolated molecules and triplet states in molecular beams, without the use of any messenger molecules, are presented. Special focus was placed on excited-state proton transfer reactions, which can also be described as keto-enol tautomerism. Various molecules such as 3-hydroxyflavone, 2-(2-naphthyl)-3-hydroxychromone and 2,5-dihydroxybenzoic acid have been investigated with regard to this question. In the cases of 3-hydroxyflavone and 2-(2-naphthyl)-3-hydroxychromone, the IR spectra have been recorded subsequent to an excited-state proton transfer. Furthermore, the dihydrate of 3-hydroxyflavone has been analyzed concerning a possible proton transfer in the excited state: the proton transfer reaction along the water molecules (proton wire) has to be induced by raising the excitation energy. However, photoinduced reactions involve not only singlet but also triplet states. As an archetypal molecule, xanthone has been analyzed. After excitation to the S2 state, ISC occurs into the triplet manifold, leading to a population of the T1 state.
The IR spectrum of the T1 state has been recorded for the first time using the UV/IR/UV technique without the use of any messenger molecules. Altogether, it is shown that IR/UV double and triple resonance techniques are suitable tools for analyzing the reaction coordinates of photochemical processes.