Doctoral Theses
Year of publication: 2014 (78 documents, fulltext available)
In the manufacture of highly loaded structural components made of fiber-reinforced polymer composites (FRP), textile semi-finished products such as woven fabrics or stitched biaxial non-crimp fabrics are widely used. In combination with polymer matrices, these textiles exhibit periodic out-of-plane roving waviness. The magnitude of the waviness depends, among other factors, on the manufacturing parameters of the dry textiles. By understanding the effective causal relationships between roving waviness and mechanical behavior, a better computational estimate of the material properties is to be achieved.
In this work, the influence of the waviness parameters amplitude and wavelength on the fiber-parallel properties was investigated on unidirectionally reinforced specimens. For this purpose, different amplitudes and wavelengths were deliberately introduced into the specimens using unidirectional fabrics. The influence of roving waviness on the stiffnesses was smaller than on the strengths. For the latter, the compressive strengths were affected more strongly by the undulations than the tensile strengths. In addition, the waviness and the mechanical properties of textile FRP woven and non-crimp fabrics were determined. An important criterion in selecting the textiles studied was that they are also used in industrial practice.
In the next step, a simplified finite-element waviness model was developed that makes it possible to determine the fiber-parallel properties without time-consuming and costly material tests. Especially for woven materials with large roving waviness, this model provides considerably better estimates than existing methods. On this basis, a regression model for unidirectionally reinforced materials was derived, which enables designers without experience in finite-element programs to apply the waviness model.
The benefit of the developed waviness model was demonstrated in a three-stage validation program, which includes the transition to multidirectional laminates as well as more complex component geometries. The improved prediction provided by the waviness model was most evident for materials with large roving waviness and for components that failed by fiber fracture under compressive loading.
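For intuition on why amplitude and wavelength matter jointly, a standard geometric relation (not the thesis's finite-element or regression model) links a sinusoidal roving path to its maximum fiber misalignment angle, which is known to drive compressive-strength knockdown:

```python
import math

def max_misalignment_deg(amplitude: float, wavelength: float) -> float:
    """Maximum fiber misalignment angle (degrees) of a sinusoidal
    roving path y(x) = A * sin(2*pi*x / L); the slope is largest at
    the zero crossings, where dy/dx = 2*pi*A / L."""
    return math.degrees(math.atan(2.0 * math.pi * amplitude / wavelength))

# Larger amplitude or shorter wavelength -> larger misalignment,
# which mainly degrades the compressive strength.
for amp, wl in [(0.05, 10.0), (0.10, 10.0), (0.10, 5.0)]:
    print(f"A={amp} mm, L={wl} mm -> {max_misalignment_deg(amp, wl):.2f} deg")
```

The ratio A/L, not either parameter alone, controls the misalignment, which matches the experimental finding that both waviness parameters must be varied together.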
Continuum Mechanical Modeling of Dry Granular Systems: From Dilute Flow to Solid-Like Behavior
(2014)
In this thesis, we develop a granular hydrodynamic model which covers the three principal regimes observed in granular systems: the dilute flow, the dense flow, and the solid-like regime. We start from a kinetic model valid at low density and extend its validity to granular solid-like behavior. Analytical and numerical results show that this model reproduces many complex phenomena, for instance slow viscoplastic motion, critical states, and the pressure dip in sand piles. Finally, we formulate a 1D version of the full model and develop a numerical method to solve it. We present two numerical examples, a filling simulation and the flow on an inclined plane, in which all three regimes occur.
The objective of this thesis is to develop systematic event-triggered control designs for specified event generators, an important alternative to traditional periodic sampling control. The sporadic sampling inherent in event-triggered control is determined by the event-triggering conditions. This feature calls for a new control theory, analogous to the traditional sampled-data theory in computer control.
Designing a controller coupled with the applied event-triggering condition so as to maximize control performance is the essence of event-triggered control design. In this design, the stability of the control system must be ensured with first priority. The various control aims should be clearly incorporated into the design procedures. With applications in embedded control systems in mind, efficient implementation requires embedded software architectures of low complexity. This thesis aims to offer such a design and thereby further complete the theory of event-triggered control.
In the presented work, I evaluate whether and how Virtual Reality (VR) technologies can be used to support researchers working in the geosciences by providing immersive, collaborative visualization systems as well as virtual tools for data analysis. Technical challenges encountered in the development of these systems are identified, and solutions are provided.
To enable geologists to explore large digital terrain models (DTMs) in an immersive, explorative fashion within a VR environment, a suitable terrain rendering algorithm is required. For realistic perception of planetary curvature at large viewer altitudes, spherical rendering of the surface is necessary. Furthermore, rendering must sustain interactive frame rates of about 30 frames per second to avoid sensory confusion of the user. At the same time, the data structures used for visualization should also be suitable for efficiently computing spatial properties such as height profiles or volumes in order to implement virtual analysis tools. To address these requirements, I have developed a novel terrain rendering algorithm based on tiled quadtree hierarchies using the HEALPix parametrization of a sphere. For evaluation purposes, the system is applied to a 500 GiB dataset representing the surface of Mars.
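The tiled quadtree addressing that HEALPix-based schemes use can be sketched in a few lines. In the NESTED scheme, each of the 12 HEALPix base pixels is subdivided into four children per level, so a tile is addressed by appending two bits per level. The function names below are mine, a minimal illustration; real HEALPix libraries such as healpy provide full pixel arithmetic:

```python
def tile_id(base_pixel, path):
    """NESTED-style tile index: starting from one of the 12 HEALPix
    base pixels, each quadtree descent step (child in {0,1,2,3})
    appends two bits.  Returns (level, index within that level)."""
    idx = base_pixel
    for child in path:
        idx = idx * 4 + child
    return len(path), idx

def parent(idx):
    """Parent tile of a NESTED index: drop the last two bits."""
    return idx // 4

def tiles_at_level(level):
    """Total number of tiles covering the sphere at a given level."""
    return 12 * 4 ** level
```

This flat integer addressing is what makes the same structure usable both for streaming render tiles and for looking up data in spatial analysis tools.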
Considering the current development of inexpensive remote surveillance equipment such as quadcopters, it seems inevitable that these devices will play a major role in future disaster management applications. Virtual reality installations in disaster management headquarters which provide an immersive visualization of near-live, three-dimensional situational data could then be a valuable asset for rapid, collaborative decision making. Most terrain visualization algorithms, however, require a computationally expensive pre-processing step to construct a terrain database.
To address this problem, I present an on-the-fly pre-processing system for cartographic data. The system consists of a frontend for rendering and interaction as well as a distributed processing backend executing on a small cluster which produces tiled data in the format required by the frontend on demand. The backend employs a CUDA based algorithm on graphics cards to perform efficient conversion from cartographic standard projections to the HEALPix-based grid used by the frontend.
Measurement of spatial properties is an important step in quantifying geological phenomena. When performing these tasks in a VR environment, a suitable input device and abstraction for the interaction (a “virtual tool”) must be provided. This tool should enable the user to precisely select the location of the measurement even under a perspective projection. Furthermore, the measurement process should be accurate to the resolution of the data available and should not have a large impact on the frame rate in order to not violate interactivity requirements.
I have implemented virtual tools based on the HEALPix data structure for measurement of height profiles as well as volumes. For interaction, a ray-based picking metaphor was employed, using a virtual selection ray extending from the user’s hand holding a VR interaction device. To provide maximum accuracy, the algorithms access the quad-tree terrain database at the highest available resolution level while at the same time maintaining interactivity in rendering.
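The ray-based picking metaphor can be illustrated with the standard ray-sphere intersection test. This sketch treats the planet as a perfect sphere centred at the origin, whereas the actual tools intersect the quadtree terrain database at full resolution:

```python
import math

def pick_on_sphere(origin, direction, radius=1.0):
    """Nearest intersection of a selection ray (from the user's hand)
    with a sphere of given radius centred at the origin, via the
    quadratic |o + t*d|^2 = r^2.  Returns None on a miss."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                     # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    if t < 0.0:
        return None                     # sphere is behind the viewer
    return (ox + t * dx, oy + t * dy, oz + t * dz)

# A ray pointing straight down at the north pole from altitude 1:
print(pick_on_sphere((0.0, 0.0, 2.0), (0.0, 0.0, -1.0)))
```

The picked surface point would then be refined against the highest-resolution terrain tiles before a measurement is recorded.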
Geological faults are cracks in the earth’s crust along which a differential movement of rock volumes can be observed. Quantifying the direction and magnitude of such translations is an essential requirement in understanding earth’s geological history. For this purpose, geologists traditionally use maps in top-down projection which are cut (e.g. using image editing software) along the suspected fault trace. The two resulting pieces of the map are then translated in parallel against each other until surface features which have been cut by the fault motion come back into alignment. The amount of translation applied is then used as a hypothesis for the magnitude of the fault action. In the scope of this work it is shown, however, that performing this study in a top-down perspective can lead to the acceptance of faulty reconstructions, since the three-dimensional structure of topography is not considered.
To address this problem, I present a novel terrain deformation algorithm which allows the user to trace a fault line directly within a 3D terrain visualization system and interactively deform the terrain model while inspecting the resulting reconstruction from arbitrary perspectives. I demonstrate that the application of 3D visualization allows for a more informed interpretation of fault reconstruction hypotheses. The algorithm is implemented on graphics cards and performs real-time geometric deformation of the terrain model, guaranteeing interactivity with respect to all parameters.
Paleoceanography is the study of the prehistoric evolution of the ocean. One of the key data sources used in this research are coring experiments which provide point samples of layered sediment depositions at the ocean floor. The samples obtained in these experiments document the time-varying sediment concentrations within the ocean water at the point of measurement. The task of recovering the ocean flow patterns based on these deposition records is a challenging inverse numerical problem, however.
To support domain scientists working on this problem, I have developed a VR visualization tool to aid in the verification of model parameters by providing simultaneous visualization of experimental data from coring as well as the resulting predicted flow field obtained from numerical simulation. Earth is visualized as a globe in the VR environment, with coring data presented using a billboard rendering technique while the time-variant flow field is indicated using Line Integral Convolution (LIC). To study individual sediment transport pathways and their correlation with the depositional record, interactive particle injection and real-time advection are supported.
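Real-time particle advection through a time-varying flow field amounts to one numerical integration step per frame. A minimal midpoint-rule (RK2) sketch, with a hypothetical uniform flow as the usage example:

```python
def advect(pos, velocity, t, dt):
    """One midpoint (RK2) step of particle advection through a
    time-varying velocity field velocity(x, y, t) -> (u, v):
    sample the field halfway along the Euler step, then move."""
    x, y = pos
    u1, v1 = velocity(x, y, t)
    u2, v2 = velocity(x + 0.5 * dt * u1, y + 0.5 * dt * v1, t + 0.5 * dt)
    return (x + dt * u2, y + dt * v2)

# Hypothetical uniform eastward flow: an injected particle simply drifts.
flow = lambda x, y, t: (1.0, 0.0)
p = (0.0, 0.0)
for step in range(10):
    p = advect(p, flow, t=step * 0.1, dt=0.1)
print(p)
```

In the actual tool the velocity samples come from the simulated, time-variant ocean flow field rather than an analytic function.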
The present work investigated three important constructs in the field of psychology: creativity, intelligence, and giftedness. The major objective was to clarify some aspects of each of these three constructs, as well as possible correlations between them. Of special interest were: (1) the relationship between creativity and intelligence, particularly the validity of the threshold theory; (2) the development of these constructs in average and above-average intelligent children and across grade levels; and (3) the comparison of the development of intelligence and creativity in above-average intelligent primary school children who participated in a special program for children classified as "gifted", called Entdeckertag (ET), against an age-, class-, and IQ-matched control group. The ET is a pilot program implemented in 2004 by the Ministry for Education, Science, Youth and Culture of the state of Rhineland-Palatinate, Germany. The central goals of this program are the early recognition of gifted children and early intervention, based on the areas of German language, general science, and mathematics, as well as fostering the development of a child's creativity, social ability, and more. Five hypotheses were proposed and analyzed, and are reported separately in five chapters. To test these hypotheses, a sample of 217 children recruited from first to fourth grade, aged between six and ten years, was assessed for intelligence and creativity. The children performed three tests: the Standard Progressive Matrices (SPM) for the assessment of classical intelligence, the Test of Creative Thinking - Drawing Production (TCT-DP) for the measurement of classical creativity, and the Creative Reasoning Task (CRT) for the evaluation of convergent and divergent thinking, both in open problem spaces.
Participants were divided into two general cohorts: an intervention group (N = 43), composed of children participating in the Entdeckertag program, and a non-intervention group (N = 174), composed of children from regular primary schools. For the testing of the hypotheses, children were placed into more specific groups according to the particular hypothesis being tested. It could be concluded that creativity and intelligence were not significantly related, and the threshold theory was not confirmed. Additionally, intelligence accounted for less than 1% of the variance in creativity; moreover, intelligence scores were unable to predict later creativity scores. The development of classical intelligence and classical creativity across grade levels also showed different patterns: intelligence increased continually, whereas creativity stagnated after the third grade. Finally, the ET program proved beneficial for classical intelligence after two years of attendance, but no effect was found for creativity. Overall, the results indicate that organizations and institutions such as schools should not look solely at intelligence performance, especially when aiming to identify and foster gifted or creative individuals.
For many decades, one of the major areas of research in formal language theory has been the search for language classes that extend the context-free languages enough to include various languages that arise in practice, while still keeping as many of the useful properties of context-free grammars as possible, most notably cubic parsing time. In this thesis we add a new family of classes to this field, namely position-and-length-dependent context-free grammars. Our classes use the approach of regulated rewriting, where derivations in a context-free base grammar are allowed or forbidden based on, e.g., the sequence of rules used in a derivation or the sentential forms to which each rule is applied. For our new classes we look at the yield of each rule application, i.e. the subword of the final word that is eventually derived from the symbols introduced by the rule application. The position and length of the yield in the final word define the position and length of the rule application, and each rule is associated with a set of positions and lengths at which it may be applied.
We show that, unless the sets of allowed positions and lengths are highly complex, the languages in our classes can be parsed in the same asymptotic time as context-free languages, using slight adaptations of well-known parsing algorithms. We also show that our classes form a proper hierarchy above the context-free languages, and we examine their relation to language classes defined by other types of regulated rewriting.
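One way such a slight adaptation can look is a CYK recognizer with an extra admissibility check per rule application. This is a sketch under the assumptions that the base grammar is in Chomsky normal form and that membership in the allowed position/length sets is testable in constant time; the thesis's actual algorithms differ in detail:

```python
def cyk_pl(word, rules, start, allowed):
    """CYK recognition for a CNF grammar whose rules carry
    position-and-length restrictions: a rule may derive the subword
    at position i of length l only if allowed(rule, i, l) holds.
    With an O(1) `allowed` test this keeps the usual O(n^3 * |rules|)
    running time of plain CYK."""
    n = len(word)
    # table[i][l] = set of nonterminals deriving word[i : i + l]
    table = [[set() for _ in range(n + 1)] for _ in range(n)]
    for i, ch in enumerate(word):
        for rule in rules:
            lhs, rhs = rule
            if rhs == (ch,) and allowed(rule, i, 1):
                table[i][1].add(lhs)
    for l in range(2, n + 1):
        for i in range(0, n - l + 1):
            for split in range(1, l):
                for rule in rules:
                    lhs, rhs = rule
                    if (len(rhs) == 2
                            and rhs[0] in table[i][split]
                            and rhs[1] in table[i + split][l - split]
                            and allowed(rule, i, l)):
                        table[i][l].add(lhs)
    return start in table[0][n]

# Toy grammar for a+ : S -> S S | 'a', with the terminal rule
# restricted to positions 0 and 1, so only words of length <= 2 parse.
rules = [("S", ("S", "S")), ("S", ("a",))]
restricted = lambda rule, i, l: rule[1] != ("a",) or i < 2
```

The only change relative to textbook CYK is the `allowed(rule, i, l)` guard, which is exactly where the position and length of the rule application become visible during parsing.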
We complete the treatment of the language classes by introducing pushdown automata with position counter, an extension of traditional pushdown automata that recognizes the languages generated by position-and-length-dependent context-free grammars, and we examine various closure and decidability properties of our classes. Additionally, we gather the corresponding results for the subclasses that use right-linear and left-linear base grammars, respectively, and for the corresponding class of automata, finite automata with position counter.
Finally, as an application of our idea, we introduce length-dependent stochastic context-free grammars and show how they can be employed to improve the quality of predictions for RNA secondary structures.
The work presented in this thesis discusses the thermal and power management of multi-core processors (MCPs) with both two-dimensional (2D) and three-dimensional (3D) package chips. Thermal and power management/balancing is of increasing concern, poses a technological challenge to MCP development, and will be a main performance bottleneck for future MCPs. This thesis develops optimal thermal and power management policies for MCPs. The system thermal behavior of both 2D and 3D package chips is analyzed and mathematical models are developed; optimal thermal and power management methods are then introduced.
Nowadays, chips are generally packaged using 2D techniques, meaning that there is only one layer of dies in the chip. The chip's thermal behavior can be described by a 3D heat-conduction partial differential equation (PDE). Since the goal is to balance the thermal behavior and power consumption among the cores, a group of one-dimensional (1D) PDEs, derived from the 3D heat-conduction PDE, is proposed to describe the thermal behavior of each core. The thermal behavior of the MCP is therefore described by a group of 1D PDEs, and an optimal controller is designed to manage the power consumption and balance the temperature among the cores based on the proposed 1D model.
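A per-core 1D heat-conduction PDE of this kind can be illustrated with a standard explicit finite-difference scheme. This toy version (uniform heat source, fixed ambient boundaries, all constants illustrative) stands in for, but is not, the thesis's model:

```python
def simulate_core(n=50, steps=5000, dt=1e-4, alpha=1.0, power=None):
    """Explicit finite-difference solution of the 1D heat equation
    T_t = alpha * T_xx + q(x) on [0, 1], with fixed (ambient)
    temperatures at both ends, as a toy per-core thermal model."""
    dx = 1.0 / (n - 1)
    assert alpha * dt / dx**2 <= 0.5, "explicit scheme stability limit"
    T = [0.0] * n                       # temperature rise above ambient
    q = power or (lambda x: 1.0)        # uniform heat source by default
    for _ in range(steps):
        Tn = T[:]
        for i in range(1, n - 1):
            lap = (T[i - 1] - 2.0 * T[i] + T[i + 1]) / dx**2
            Tn[i] = T[i] + dt * (alpha * lap + q(i * dx))
        T = Tn
    return T

T = simulate_core()
print(f"peak temperature rise: {max(T):.3f}")  # near the analytic 0.125
```

For a uniform source the steady state is T(x) = x(1 - x)/2, peaking at 0.125 in these units, so the simulation can be checked against that closed form.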
3D packaging is an advanced technology in which at least two layers of dies are stacked in one chip. Unlike a 2D package, a cooling system must be installed between the layers to reduce the internal temperature of the chip. In this thesis, micro-channel liquid cooling is considered; the heat-transfer characteristics of the micro-channel are analyzed and modeled as an ordinary differential equation (ODE). The dies are discretized into blocks based on the chip layout, with each block modeled as a thermal resistance-capacitance (R-C) circuit, and the micro-channels are discretized in the same way. The thermal behavior of the whole system is modeled as an ODE system. The micro-channel liquid velocity is set according to the workload and the temperature of the dies; under each velocity the system is described by a linear ODE model, so the whole system is a switched linear system. An H-infinity observer is designed to estimate the states, and the model predictive control (MPC) method is employed to design the thermal and power management/balancing controller for each submodel.
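The lumped R-C thermal network behind such block models can be sketched with a simple explicit Euler integrator. The two-block instance below is illustrative only, with made-up conductances and capacitances:

```python
def rc_step(T, P, G, g_amb, T_amb, C, dt):
    """One explicit Euler step of a lumped R-C thermal network:
    C_i dT_i/dt = P_i + sum_j G_ij (T_j - T_i) + g_amb_i (T_amb - T_i),
    where P is the power dissipated in each block, G the block-to-block
    thermal conductances, and g_amb the coupling to ambient."""
    n = len(T)
    out = []
    for i in range(n):
        q = P[i] + g_amb[i] * (T_amb - T[i])
        q += sum(G[i][j] * (T[j] - T[i]) for j in range(n) if j != i)
        out.append(T[i] + dt * q / C[i])
    return out

# Toy instance: two coupled blocks, only block 0 is heated.
T = [0.0, 0.0]
for _ in range(10000):
    T = rc_step(T, P=[1.0, 0.0], G=[[0.0, 1.0], [1.0, 0.0]],
                g_amb=[1.0, 1.0], T_amb=0.0, C=[1.0, 1.0], dt=0.01)
print(T)  # the heated block settles hotter, roughly [0.667, 0.333]
```

In the thesis the same kind of network is coupled to the micro-channel ODE and, per coolant velocity, yields one linear submodel of the switched system.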
The models and controllers developed in this thesis are verified by simulation experiments in MATLAB. The IBM Cell eight-core processor and the water micro-channel cooling system developed by IBM Research in collaboration with EPFL and ETHZ serve as the experimental subjects.
In the present work, the phase transitions in different Fe/Fe-C systems were studied using molecular dynamics simulations with the Meyer-Entel interaction potential (and the Johnson potential for the Fe-C interaction). Fe bicrystal, thin-film, Fe-C bulk, and Fe-C nanowire systems were investigated to study the phase-transition behaviour, analysing the energetics, dynamics, and transformation pathways.
Technology is ubiquitous today. Amid all this omnipresence, however, it is easily overlooked that the question of technology itself, i.e. the question of what exactly is to be understood by "technology" in the first place, has so far remained largely unclear.
For philosophy, this gives rise to the task of intervening here to clarify the concept.
The present work aims to contribute to a better understanding of technology and technical artifacts. The argument proceeds in two steps: first, it is shown that technology can only be understood through its demarcation from nature and from life, and a corresponding definition of the concept of technology is proposed. From this, an understanding of technical artifacts in the sense of an artifactual technology is then derived.
The work is essentially structured in three parts:
1. The first chapter introduces the problem of the concept of technology:
A first section points to the historical dimension of the concept of technology (1.1); the current discussion of the concept is then summarized and critically evaluated (1.2), and classifications and criteria for a possible definition of the concept are proposed (1.3).
2. The second chapter establishes a concept of technology that proves to be semantically dependent on the concepts of "life" and "nature":
A first section distinguishes such a semantic relationship between the concepts from other possibilities of mutual demarcation (2.1). This demarcation is then given substance by means of so-called forms of constitution (2.2). After a detailed explanation of these forms of constitution in their pairwise relations, a definition of "technology" based on them is proposed. A third section extends the model of the forms of constitution by so-called forms of disclosure, understood as the horizons of questioning by means of which an internal differentiation into different kinds of technology is achieved (2.3). Subsequently, a definition of each "specific technology" is proposed.
3. The third chapter examines the ontological status of technical artifacts:
Technical artifacts are concretized in the sense of a "specific technology" and thus interpreted as an artifactual technology (3.1). It is then examined to what extent such an interpretation holds up with respect to a) the question of whether technical artifacts constitute natural kinds, and b) the problem of the coincidence of objects. Finally, the insights gained from these considerations are applied to borderline cases of technical artifacts (3.2).
This thesis focuses on several new aspects of continuous-time portfolio optimization, treated with the stochastic control method.
First, we extend the Busch-Korn-Seifried model for a large investor by using the Vasicek model for the short rate, and solve the resulting problem explicitly for two types of intensity functions.
Next, we justify the existence of the constant proportion portfolio insurance (CPPI) strategy in a framework containing a stochastic short rate and a Markov switching parameter. The effect of a Vasicek short rate on the CPPI strategy was studied by Horsky (2012). This part of the thesis extends his research by including a Markov switching parameter; the generalization is based on the Bäuerle-Rieder investment problem. Explicit solutions are obtained for the portfolio problem both with and without the money market account.
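For orientation, the basic discrete-time CPPI rule (without the stochastic short rate or Markov switching treated in the thesis) can be sketched as:

```python
def cppi_path(returns, floor=80.0, wealth=100.0, multiplier=3.0, rf=0.0):
    """Textbook discrete-time CPPI: each period the risky exposure is
    multiplier * (wealth - floor), capped at current wealth (no
    leverage); the remainder earns the risk-free rate.  The floor is
    preserved as long as per-period losses stay below 1/multiplier."""
    path = [wealth]
    for r in returns:
        cushion = max(wealth - floor, 0.0)
        exposure = min(multiplier * cushion, wealth)
        wealth = exposure * (1.0 + r) + (wealth - exposure) * (1.0 + rf)
        path.append(wealth)
    return path

# Even a run of 30% crashes keeps wealth above the floor of 80,
# because the exposure shrinks with the cushion.
print(cppi_path([-0.3] * 3))
```

In continuous time with a stochastic short rate and regime switching, establishing that this floor guarantee still holds is exactly the existence question the thesis addresses.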
Finally, we apply the method used in the Busch-Korn-Seifried investment problem to explicitly solve the portfolio optimization problem with a stochastic benchmark.