Congress Report 2013.01-03
(2013)
Data integration aims at providing uniform access to heterogeneous data managed by distributed source systems. Data sources range from legacy systems, databases, and enterprise applications to web-scale data management systems. The materialized approach to data integration extracts data from the sources, transforms and consolidates it, and loads it into an integration system, where it is persistently stored and can be queried and analyzed.
To support materialized data integration, so-called Extract-Transform-Load (ETL) systems have been built and are widely used today to populate data warehouses. While ETL is considered state of the art in enterprise data warehousing, a new paradigm known as MapReduce has recently gained popularity for web-scale data transformations such as web indexing or PageRank computation.
The input data of both ETL and MapReduce programs keeps changing over time, for instance while business transactions are processed or the web is crawled. Hence, the results of ETL and MapReduce programs become stale and need to be recomputed from time to time. Recurrent computations over changing input data can be performed in two ways: the result may either be recomputed from scratch or recomputed incrementally. The idea behind the latter approach is to update the existing result in response to incremental changes in the input data. This is typically more efficient than full recomputation, because reprocessing unchanged portions of the input data can often be avoided.
Incremental recomputation techniques have been studied by the database research community mainly in the context of materialized view maintenance and have been adopted by all major commercial database systems. However, neither today's ETL tools nor MapReduce support incremental recomputation. The situation of ETL and MapReduce programmers today is thus comparable to that of database programmers in the early 1990s. This thesis makes an effort to transfer incremental recomputation techniques to the ETL and MapReduce environments. This poses interesting research challenges, because these environments differ fundamentally from the relational world with regard to query and programming models, change data capture, transactional guarantees, and consistency models. However, as this thesis will show, incremental recomputation is feasible in ETL and MapReduce and can lead to considerable efficiency improvements.
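The contrast between full and incremental recomputation can be illustrated with a toy example (my own sketch, not code from the thesis): a word-count result that is maintained either by rescanning all documents or by applying only the inserted and deleted documents as a delta.

```python
from collections import Counter

def full_recompute(docs):
    """Recompute the word-count result from scratch over all documents."""
    result = Counter()
    for doc in docs:
        result.update(doc.split())
    return result

def incremental_update(result, inserted_docs, deleted_docs):
    """Update an existing result in response to incremental input changes:
    add counts for inserted documents, subtract counts for deleted ones."""
    for doc in inserted_docs:
        result.update(doc.split())
    for doc in deleted_docs:
        result.subtract(doc.split())
    # Unary + drops entries whose count fell to zero or below,
    # keeping the maintained result minimal.
    return +result

docs = ["a b a", "b c"]
result = full_recompute(docs)
result = incremental_update(result, inserted_docs=["c d"], deleted_docs=["b c"])
# result now equals full_recompute(["a b a", "c d"]) without rescanning "a b a"
```

The incremental path touches only the changed documents, which is exactly the saving the thesis aims for when the unchanged portion of the input dominates.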
Hardware prototyping is an essential part of the hardware design flow. It usually relies on system-level design and hardware-in-the-loop simulations to develop, test, and evaluate intellectual property cores. One common task in this process consists of interfacing cores with different port specifications; data width conversion is used to overcome this issue. This work presents two open-source hardware cores compliant with the AXI4-Stream bus protocol, each performing upsizing or downsizing data width conversion.
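As a behavioral illustration (a software model of my own, not the actual hardware cores), width conversion amounts to packing several narrow beats into one wide beat and unpacking them again; the real cores additionally implement the AXI4-Stream handshake and sideband signals such as TKEEP and TLAST.

```python
def upsize(beats, width, ratio):
    """Pack `ratio` consecutive narrow beats of `width` bits into one wide
    beat of width*ratio bits; the first narrow beat lands in the low lanes."""
    assert len(beats) % ratio == 0, "expects whole wide beats"
    mask = (1 << width) - 1
    wide = []
    for i in range(0, len(beats), ratio):
        word = 0
        for j, b in enumerate(beats[i:i + ratio]):
            word |= (b & mask) << (j * width)
        wide.append(word)
    return wide

def downsize(beats, width, ratio):
    """Inverse operation: split each wide beat into `ratio` narrow beats,
    low-order lanes first."""
    mask = (1 << width) - 1
    return [(w >> (j * width)) & mask for w in beats for j in range(ratio)]

narrow = [0x11, 0x22, 0x33, 0x44]        # four 8-bit beats
wide = upsize(narrow, width=8, ratio=2)  # two 16-bit beats
assert downsize(wide, width=8, ratio=2) == narrow
```

The round-trip assertion captures the key property of a matched upsizer/downsizer pair: no data is reordered or lost across the conversion.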
Hydrogels are covalently or ionically cross-linked, hydrophilic three-dimensional polymer networks; they occur in our bodies as biological gels such as the vitreous humour that fills the interior of the eye. Poly(N-isopropylacrylamide) (poly(NIPAAm)) hydrogels are attracting growing interest in biomedical applications because, among other reasons, they exhibit a well-defined lower critical solution temperature (LCST) in water, around 31–34 °C, which is close to body temperature. This makes them attractive for drug delivery, cell encapsulation, and tissue engineering applications. In this work, the poly(NIPAAm) hydrogel is synthesized by free radical polymerization. Hydrogel properties and the dimensional changes accompanying the volume phase transition of the thermosensitive poly(NIPAAm) hydrogel were investigated in terms of Raman spectra, swelling ratio, and hydration. The thermal swelling/deswelling changes that occur at different equilibrium temperatures and in different solutions (phenol, ethanol, propanol, and sodium chloride) were investigated on the basis of Raman spectra. In addition, Raman spectroscopy was employed to evaluate the diffusion of bovine serum albumin (BSA) and phenol through the poly(NIPAAm) network. The mutual diffusion coefficient \(D_{mut}\) for the hydrogel/solvent system was successfully determined using Raman spectroscopy at different solute concentrations. Moreover, the mechanical properties of the hydrogel, investigated by uniaxial compression tests, were used to characterize the hydrogel and to determine the collective diffusion coefficient through it. The solute release coupled with shrinking of the hydrogel particles was modelled with a two-dimensional diffusion model with moving boundary conditions. Including a variable diffusion coefficient leads to a better description of the kinetic curve in the case of large deformation around the LCST. Good agreement between experimental and calculated data was obtained.
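As a loose illustration of how a diffusion coefficient can be extracted from release data, the sketch below fits the classic Fickian short-time approximation for a thin slab, \(M_t/M_\infty \approx 4\sqrt{Dt/(\pi L^2)}\). This is a textbook simplification of my own choosing, not the two-dimensional moving-boundary model with variable diffusion coefficient used in this work.

```python
import math

def estimate_D(times, release_fractions, thickness):
    """Estimate a diffusion coefficient D from early-time release data via
    the thin-slab short-time law  M_t/M_inf = 4*sqrt(D*t/(pi*L^2)):
    fit the slope of M_t/M_inf against sqrt(t), then invert for D.
    `thickness` is the slab thickness L (same length unit as D)."""
    xs = [math.sqrt(t) for t in times]
    ys = release_fractions
    # Zero-intercept least-squares slope: sum(x*y) / sum(x*x)
    slope = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    # slope = 4*sqrt(D/(pi*L^2))  =>  D = (slope/4)^2 * pi * L^2
    return (slope / 4.0) ** 2 * math.pi * thickness ** 2

# Synthetic self-check: data generated with a known D is recovered exactly.
D_true, L = 1e-10, 1e-3                      # m^2/s, m (illustrative values)
times = [10, 20, 40, 80]                     # early times, in seconds
data = [4 * math.sqrt(D_true * t / (math.pi * L * L)) for t in times]
D_est = estimate_D(times, data, L)
```

On real data this only applies while \(M_t/M_\infty\) is small (roughly below 0.6) and the geometry is slab-like; outside that regime a full model such as the one in this work is needed.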
Recently, a new Quicksort variant due to Yaroslavskiy was chosen as the standard sorting method for Oracle's Java 7 runtime library. The decision for the change was based on empirical studies showing that, on average, the new algorithm is faster than the formerly used classic Quicksort. Surprisingly, the improvement was achieved by using a dual-pivot approach, an idea considered unpromising by several theoretical studies in the past. In this thesis, I try to find the reason for this unexpected success.
My focus is on precise and detailed average-case analysis, aiming at the flavor of Knuth's series “The Art of Computer Programming”. In particular, I go beyond abstract measures like counting key comparisons and try to understand the efficiency of the algorithms at different levels of abstraction. Whenever possible, precise expected values are preferred to asymptotic approximations. This rigor ensures that (a) the sorting methods discussed here are actually usable in practice and (b) the analysis results contribute to a sound comparison of the Quicksort variants.
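For concreteness, the dual-pivot partitioning scheme can be sketched as follows. This is a simplified Python rendering in the spirit of Yaroslavskiy's algorithm; the JDK implementation adds insertion sort for small subranges, careful pivot selection, and special handling of equal keys.

```python
def dual_pivot_quicksort(a, lo=0, hi=None):
    """Sort list `a` in place using two pivots p <= q that split the range
    into three parts (< p, between p and q, > q), each sorted recursively."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    if a[lo] > a[hi]:                  # ensure p <= q
        a[lo], a[hi] = a[hi], a[lo]
    p, q = a[lo], a[hi]
    lt, gt, i = lo + 1, hi - 1, lo + 1
    while i <= gt:
        if a[i] < p:                   # grow the "< p" section
            a[i], a[lt] = a[lt], a[i]
            lt += 1
        elif a[i] > q:                 # grow the "> q" section from the right
            while a[gt] > q and i < gt:
                gt -= 1
            a[i], a[gt] = a[gt], a[i]
            gt -= 1
            if a[i] < p:               # swapped-in element may belong left
                a[i], a[lt] = a[lt], a[i]
                lt += 1
        i += 1
    lt -= 1
    gt += 1
    a[lo], a[lt] = a[lt], a[lo]        # place pivot p
    a[hi], a[gt] = a[gt], a[hi]        # place pivot q
    dual_pivot_quicksort(a, lo, lt - 1)
    dual_pivot_quicksort(a, lt + 1, gt - 1)
    dual_pivot_quicksort(a, gt + 1, hi)

data = [5, 2, 9, 1, 7, 3, 8, 2]
dual_pivot_quicksort(data)
```

The three-way split is the structural difference the thesis analyzes: each element is classified against one or two pivots per pass, so the comparison count alone does not capture the memory-access behavior that matters in practice.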
Palladium-Catalyzed C–C Bond Formations via Activation of Carboxylic Acids and Their Derivatives
(2013)
Applications of carboxylic acids and their derivatives in transition metal-catalyzed cross-coupling reactions regioselectively forming Csp3–Csp2 and Csp2–Csp2 bonds were explored in this thesis. Several important organic building blocks, such as aryl acetates, diaryl acetates, imines, ketones, biaryls, styrenes, and polysubstituted alkenes, were successfully accessed from carboxylic acids and their derivatives by means of C–H activation and decarboxylative cross-coupling.
An efficient and practical protocol for the synthesis of biologically important ethyl 2-arylacetates through the dealkoxycarbonylative cross-coupling of aryl halides with malonates was developed. Activation of the alpha-proton of alkyl esters by a copper catalyst allowed the deprotonation of the esters even in the presence of mild bases, leading to a straightforward and efficient approach to alkyl alpha-diarylacetates from simple alkyl acetates and aryl halides.
The addition of a primary amine to the coupling reaction of alpha-oxocarboxylic acids and aryl halides led to an unprecedented low-temperature, redox-neutral decarboxylative coupling process, providing a green and efficient method for the preparation of azomethines in which all three substituents can be varied independently. A minor modification of this protocol allowed easy access to the corresponding ketones.
The decarboxylative coupling of robust aryl mesylates as well as polysubstituted alkenyl mesylates using our customized imidazolyl phosphine ligands was realized, further expanding the scope of carbon electrophiles in decarboxylative coupling reactions. Variation of the ligands led to two complementary protocols, providing the corresponding biaryls and polysubstituted olefins in high yields.
The use of a new class of pyrimidinyl phosphine ligands dramatically reduced the reaction temperatures of decarboxylative cross-coupling reactions between aromatic carboxylic acids and aryl or alkenyl triflates. The new catalyst system for the first time allowed efficient decarboxylative biaryl synthesis at only 100 °C, representing a significant achievement in redox-neutral decarboxylative coupling reactions.
In this paper we consider a multivariate switching model with constant state means and covariances. In this model, the switching mechanism between the basic states of the observed time series is controlled by a hidden Markov chain. As an illustration, under a Gaussian assumption on the innovations and some rather simple conditions, we prove the consistency and asymptotic normality of the maximum likelihood estimates of the model parameters.
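A minimal numerical sketch of such a switching model (my own toy version: univariate and two-state for brevity, whereas the paper treats the multivariate case) shows the setup: a hidden Markov chain selects which Gaussian generates each observation, and when the state path is observed, the per-state sample means recover the constant state means.

```python
import random

def simulate_switching(n, trans, means, stds, seed=0):
    """Simulate a Markov-switching series: a hidden 2-state Markov chain
    with transition matrix `trans` picks, at each step, which Gaussian
    (means[s], stds[s]) the observation is drawn from.
    Returns (states, observations)."""
    rng = random.Random(seed)
    s = 0
    states, obs = [], []
    for _ in range(n):
        states.append(s)
        obs.append(rng.gauss(means[s], stds[s]))
        # trans[s][0] is the probability of moving to state 0 from state s
        s = 0 if rng.random() < trans[s][0] else 1
    return states, obs

trans = [[0.95, 0.05], [0.10, 0.90]]   # "sticky" two-state chain
states, obs = simulate_switching(5000, trans, means=[0.0, 3.0], stds=[1.0, 1.0])

# Complete-data MLE of the state means: per-state sample averages.
m0 = sum(x for s, x in zip(states, obs) if s == 0) / states.count(0)
m1 = sum(x for s, x in zip(states, obs) if s == 1) / states.count(1)
```

With hidden (rather than observed) states, the maximum likelihood estimates are typically computed via the EM (Baum-Welch) algorithm, and the paper's result concerns the asymptotic behavior of exactly such estimates.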
There is growing international concern about the need to rethink the university so that it remains relevant in modern society. In the traditional division of tasks at universities, knowledge is the main resource. Universities make use of both the cognitive and the informational approach, and each approach can be used to improve overall university performance. To use the informational approach effectively, universities should apply tools from knowledge management. To use the cognitive approach effectively, universities must update their teaching-learning strategies to incorporate recent advances in the neuroscience and biology of knowledge, specifically from neurobiology and autopoiesis. Within this frame, the main contribution of this work is the result of merging pedagogy and biology towards an ideal future university. This goal was pursued through an exploratory study conducted to identify opportunities and difficulties in improving the teaching-learning process for the future of higher education in Honduras. A Delphi study was used as the predictive method. Nineteen Honduran experts participated, and two rounds were necessary to reach consensus.
The multi-disciplinary approach of this research addresses three different fields whose core element is knowledge. First, input from the present state of higher education is used to speak about the future. Second, input is taken from the biology of knowledge and its contributions from neurobiology and autopoiesis, which allow modifying and completing existing learning theories on a biological basis. Third, input is taken from the knowledge process, traditionally used as an organizational tool and here translated to the individual level. The exploration shows that experts are concerned about all the missions and responsibilities of universities, but they agree that changes should primarily take place in the teaching dimension. Even though they are not aware of the possible contributions of biology, they suggest new forms of teaching that favor skills development, values, pertinent knowledge, and personal development over short-term content. The resulting BRAIN Model encompasses the ideal future of higher education regarding teaching and learning, according to the experts' answers. It provides a useful guide that any teaching reform should take into account for a holistic, integral, and therefore more effective learning task.
Congress Report 2012.11-12
(2012)
Congress Report 2012.09-10
(2012)