There are a number of designs for an online advertising system that allow for behavioral targeting without revealing user online behavior or user interest profiles to the ad network. Although these designs purport to be practical solutions, none of them adequately consider the role of ad auctions, which today are central to the operation of online advertising systems. Moreover, none of the proposed designs have been deployed in real-life settings. In this thesis, we present an effort to fill this gap. First, we address the challenge of running ad auctions that leverage user profiles while keeping the profile information private. We define the problem, broadly explore the solution space, and discuss the pros and cons of these solutions. We analyze the performance of our solutions using data from Microsoft Bing advertising auctions. We conclude that, while none of our auctions are ideal in all respects, they are adequate and practical solutions. Second, we build and evaluate a fully functional prototype of a practical privacy-preserving ad system at a reasonably large scale. With more than 13K opted-in users, our system was in operation for over two months serving an average of 4800 active users daily. During the last month alone, we registered 790K ad views, 417 clicks, and even a small number of product purchases. Our system obtained click-through rates comparable with those for Google display ads. In addition, our prototype is equipped with a differentially private analytics mechanism, which we used as the primary means for gathering experimental data. In this thesis, we describe our first-hand experience and lessons learned in running the world's first fully operational “private-by-design” behavioral advertising and analytics system.
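The differentially private analytics mechanism is only mentioned above, not described. As a hedged illustration of the general idea behind such mechanisms (the classic Laplace mechanism, not necessarily the thesis's actual design), the following Python sketch adds noise calibrated to a query's sensitivity before a count is released; all function names and parameter values are illustrative assumptions.

```python
import math
import random

def laplace_noise(scale):
    """Sample from a Laplace(0, scale) distribution via inverse transform."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count, sensitivity=1.0, epsilon=0.5):
    """Report a count with Laplace noise calibrated to sensitivity/epsilon.

    sensitivity: how much one user can change the count (1 for a simple count).
    epsilon: privacy budget; smaller values mean more noise and more privacy.
    """
    return true_count + laplace_noise(sensitivity / epsilon)

# Example: release an aggregate click count without exposing any single user's contribution.
print(private_count(417))
```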
Optimal Multilevel Monte Carlo Algorithms for Parametric Integration and Initial Value Problems
(2015)
We intend to find optimal deterministic and randomized algorithms for three related problems: multivariate integration, parametric multivariate integration, and parametric initial value problems. The main interest lies in the question of how far randomization affects the precision of an approximation. We want to understand when, and to what extent, randomized algorithms are superior to deterministic ones.
All problems are studied for Banach space valued input functions. The analysis of Banach space valued problems is motivated by the investigation of scalar parametric problems; these can be understood as particular cases of Banach space valued problems. The gain achieved by randomization depends on the underlying Banach space.
For each problem, we introduce deterministic and randomized algorithms and provide the corresponding convergence analysis.
Moreover, we provide lower bounds for the general Banach space valued settings and thus determine the complexity of the problems. It turns out that the obtained algorithms are order optimal in the deterministic setting. In the randomized setting, they are order optimal for certain classes of Banach spaces, which include the L_p spaces and all finite-dimensional Banach spaces. For general Banach spaces, they are optimal up to an arbitrarily small gap in the order of convergence.
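For readers unfamiliar with the multilevel Monte Carlo idea named in the title, the basic decomposition can be sketched as follows; this is the generic textbook form, not the specific Banach space valued estimators constructed in the thesis. The expectation of a fine-level approximation S_L of the quantity of interest is written as a telescoping sum over levels,

E[S_L] = E[S_0] + \sum_{l=1}^{L} E[S_l - S_{l-1}],

and each term is estimated by an independent Monte Carlo average, with many cheap samples on the coarse levels and few expensive samples on the fine levels, so that accuracy and cost can be balanced across levels.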
This dissertation focuses on the visualization of urban microclimate data sets, which describe the atmospheric impact of individual urban features. The application and adaptation of visualization and analysis concepts to enhance the insight into observational data sets used in this specialized area are explored, motivated by application problems encountered during active involvement in urban microclimate research at Arizona State University in Tempe, Arizona. Besides two smaller projects dealing with the analysis of thermographs recorded with a hand-held device and with visualization techniques for building performance simulation results, the main focus of the work described in this document is the development of a prototypic tool for the visualization and analysis of mobile transect measurements. This observation technique involves a sensor platform mounted on a vehicle, which is then used to traverse a heterogeneous neighborhood to investigate the relationships between urban form and microclimate. The resulting data sets are among the most complex modes of in-situ observations due to their spatio-temporal dependence and their multivariate nature, but also due to the various error sources associated with moving-platform observations.
The prototype enables urban climate researchers to preprocess their data, to explore a single transect in detail, and to aggregate observations from multiple traverses conducted over diverse routes for a visual delineation of climatic microenvironments. Extending traditional analysis methods, the suggested visualization tool provides techniques to relate the measured attributes to each other and to the surrounding land cover structure. In addition, an improved method for sensor lag correction is described, which shows the potential to increase the spatial resolution of measurements conducted with slow air temperature sensors.
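The sensor lag correction is only summarized above. As a hedged sketch of the standard first-order response-time correction that such methods typically build on (not necessarily the improved method developed in this work), a slow sensor's reading can be deconvolved using its time constant; all parameter values below are illustrative assumptions.

```python
def correct_sensor_lag(times, readings, tau):
    """First-order response-time correction for a slow sensor.

    Assumes the sensor behaves like a first-order low-pass filter with time
    constant tau (seconds): T_true ~= T_measured + tau * dT_measured/dt.
    times    -- sample timestamps in seconds
    readings -- measured air temperatures
    tau      -- assumed, sensor-specific time constant
    """
    corrected = [readings[0]]
    for i in range(1, len(readings)):
        dt = times[i] - times[i - 1]
        derivative = (readings[i] - readings[i - 1]) / dt
        corrected.append(readings[i] + tau * derivative)
    return corrected

# Example with a 10 s time constant on 1 Hz transect samples (illustrative values).
print(correct_sensor_lag([0, 1, 2, 3], [20.0, 20.2, 20.5, 20.9], tau=10.0))
```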
In summary, the interdisciplinary approach followed in this thesis yields contributions to geospatial visualization and visual analytics, as well as to urban climatology. The solutions developed in the course of this dissertation are meant to support domain experts in their research tasks, providing means to gain a qualitative overview of their specific data sets and to detect patterns, which can then be further analyzed using domain-specific tools and methods.
Since their invention in the 1980s, behaviour-based systems have become very popular among roboticists. Their component-based nature facilitates the distributed implementation of systems, fosters reuse, and allows for early testing and integration. However, the distributed approach necessitates the interconnection of many components into a network in order to realise complex functionalities. This network is crucial to the correct operation of the robotic system. There are few sound design techniques for behaviour networks, especially if the systems are to realise task sequences. Therefore, the quality of the resulting behaviour-based systems is often highly dependent on the experience of their developers.
This dissertation presents a novel integrated concept for the design and verification of behaviour-based systems that realise task sequences. Part of this concept is a technique for encoding task sequences in behaviour networks. Furthermore, the concept provides guidance to developers of such networks. Based on a thorough analysis of methods for defining sequences, Moore machines have been selected for representing complex tasks. With the help of the structured workflow proposed in this work and the developed accompanying tool support, Moore machines defining task sequences can be transferred automatically into corresponding behaviour networks, resulting in less work for the developer and a lower risk of failure.
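A Moore machine, as chosen here for representing task sequences, produces its output per state rather than per transition. The following minimal Python sketch is purely illustrative; the states, events, and outputs are hypothetical and the dissertation's actual encoding into behaviour networks is not reproduced.

```python
# Minimal Moore machine: the output depends only on the current state.
class MooreMachine:
    def __init__(self, initial, transitions, outputs):
        self.state = initial
        self.transitions = transitions  # (state, event) -> next state
        self.outputs = outputs          # state -> output (activated behaviour)

    def step(self, event):
        self.state = self.transitions.get((self.state, event), self.state)
        return self.outputs[self.state]

machine = MooreMachine(
    initial="idle",
    transitions={
        ("idle", "start"): "approach",
        ("approach", "reached"): "grasp",
        ("grasp", "grasped"): "transport",
        ("transport", "arrived"): "idle",
    },
    outputs={
        "idle": "no_behaviour",
        "approach": "drive_to_object",
        "grasp": "close_gripper",
        "transport": "drive_to_target",
    },
)

for event in ["start", "reached", "grasped", "arrived"]:
    print(event, "->", machine.step(event))
```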
Due to the common integration of automatically and manually created behaviour-based components, a formal analysis of the final behaviour network is reasonable. For this purpose, the dissertation at hand presents two verification techniques and justifies the selection of model checking. A novel concept for applying model checking to behaviour-based systems is proposed according to which behaviour networks are modelled as synchronised automata. Based on such automata, properties of behaviour networks that realise task sequences can be verified or falsified. Extensive graphical tool support has been developed in order to assist the developer during the verification process.
Several examples are provided in order to illustrate the soundness of the presented design and verification techniques. The applicability of the integrated overall concept to real-world tasks is demonstrated using the control system of an autonomous bucket excavator. It can be shown that the proposed design concept is suitable for developing complex, sophisticated behaviour networks and that the presented verification technique allows for verifying real-world behaviour-based systems.
The last couple of years have marked the entire field of information technology with the introduction of a new global resource, called data. Certainly, one can argue that large amounts of information and highly interconnected, complex datasets have been available since the dawn of the computer and even centuries before. However, it has been only a few years since digital data has exponentially expanded, diversified, and become interconnected across an overwhelming range of domains, generating an entire universe of zeros and ones. This universe represents a source of information with the potential of advancing a multitude of fields and sparking valuable insights. In order to obtain this information, the data needs to be explored, analyzed and interpreted.
While a large set of problems can be addressed through automatic techniques from fields like artificial intelligence, machine learning or computer vision, various datasets and domains still rely on human intuition and experience in order to parse and discover hidden information. In such instances, the data is usually structured and represented in the form of an interactive visual representation that allows users to efficiently explore the data space and reach valuable insights. However, the experience, knowledge and intuition of a single person also has its limits. To address this, collaborative visualizations allow multiple users to communicate, interact and explore a visual representation by building on the different views and knowledge blocks contributed by each person.
In this dissertation, we explore the potential of subjective measurements and user emotional awareness in collaborative scenarios as well as support flexible and user-centered collaboration in information visualization systems running on tabletop displays. We commence by introducing the concept of user-centered collaborative visualization (UCCV) and highlighting the context in which it applies. We continue with a thorough overview of the state-of-the-art in the areas of collaborative information visualization, subjectivity measurement and emotion visualization, combinable tabletop tangibles, as well as browsing history visualizations. Based on a new web browser history visualization for exploring user parallel browsing behavior, we introduce two novel user-centered techniques for supporting collaboration in co-located visualization systems. To begin with, we inspect the particularities of detecting user subjectivity through brain-computer interfaces, and present two emotion visualization techniques for touch and desktop interfaces. These visualizations offer real-time or post-task feedback about the users’ affective states, both in single-user and collaborative settings, thus increasing emotional self-awareness and the awareness of other users’ emotions. For supporting collaborative interaction, a novel design for tabletop tangibles is described together with a set of specifically developed interactions for supporting tabletop collaboration. These ring-shaped tangibles minimize occlusion, support touch interaction, can act as interaction lenses, and describe logical operations through nesting operations. The visualization and the two UCCV techniques are each evaluated individually, capturing the advantages and limitations of each approach. Additionally, the collaborative visualization supported by the two UCCV techniques is collectively evaluated in three user studies that offer insight into the specifics of interpersonal interaction and task transition in collaborative visualization. The results show that the proposed collaboration support techniques not only improve the efficiency of the visualization, but also help maintain the collaboration process and aid balanced social interaction.
Component fault trees that contain safety basic events as well as security basic events cannot be analyzed like normal CFTs. Safety basic events are rated with probabilities in the interval [0,1]; for security basic events, simpler scales such as {low, medium, high} make more sense. In this paper, an approach is described for handling a quantitative safety analysis with different rating schemes for safety and security basic events. By doing so, it is possible to take security causes for safety failures into account and to rate their effect on system safety.
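One plausible way to combine the two rating schemes, sketched here purely for illustration (the paper's actual mapping and gate semantics may differ), is to map each qualitative security rating to a representative probability interval and then propagate intervals through the fault tree gates.

```python
# Illustrative mapping of qualitative security ratings to probability intervals.
# The interval bounds are assumptions, not values taken from the paper.
SECURITY_INTERVALS = {
    "low": (1e-6, 1e-4),
    "medium": (1e-4, 1e-2),
    "high": (1e-2, 1e-1),
}

def or_gate(intervals):
    """Probability interval of an OR gate over independent basic events."""
    lower_complement = 1.0
    upper_complement = 1.0
    for (low, high) in intervals:
        lower_complement *= (1.0 - low)
        upper_complement *= (1.0 - high)
    return (1.0 - lower_complement, 1.0 - upper_complement)

# Top event caused by a random hardware failure OR a 'medium'-rated attack.
safety_event = (1e-5, 1e-5)                  # classical probability as a degenerate interval
security_event = SECURITY_INTERVALS["medium"]
print(or_gate([safety_event, security_event]))
```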
In a networked system, the communication system is indispensable but often the weakest link w.r.t. performance and reliability. This holds particularly for wireless communication systems, where the error- and interference-prone medium and the character of the network topologies pose special challenges. However, there are many wireless network scenarios in which a certain quality of service has to be provided despite these conditions. In this regard, distributed real-time systems, whose realization by wireless multi-hop networks is becoming increasingly popular, are a particular challenge. For such systems, it is of crucial importance that communication protocols are deterministic and come with the required amount of efficiency and predictability, while additionally considering scarce hardware resources that are a major limiting factor of wireless sensor nodes. This, in turn, places demands not only on the behavior of a protocol but also on its implementation, which has to comply with timing and resource constraints.
The first part of this thesis presents a deterministic protocol for wireless multi-hop networks with time-critical behavior. The protocol is referred to as Arbitrating and Cooperative Transfer Protocol (ACTP), and is an instance of a binary countdown protocol. It enables the reliable transfer of bit sequences of adjustable length and deterministically resolves contention among nodes based on a flexible priority assignment, with constant delays, and within configurable arbitration radii. The protocol's key requirement is the collision-resistant encoding of bits, which is achieved by the incorporation of black bursts. Besides revisiting black bursts and proposing measures to optimize their detection, robustness, and implementation on wireless sensor nodes, the first part of this thesis presents the mode of operation and time behavior of ACTP. In addition, possible applications of ACTP are illustrated, presenting solutions to well-known problems of distributed systems like leader election and data dissemination. Furthermore, results of experimental evaluations with off-the-shelf wireless transceivers are outlined to provide evidence of the protocol's implementability and benefits.
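ACTP's arbitration builds on the binary countdown principle, which the following simplified Python sketch illustrates in the abstract. The bit convention (1 as the dominant value) is an assumption for illustration; the actual black-burst encoding, timing, and multi-hop arbitration radii of ACTP are not modelled here.

```python
def binary_countdown(priorities, width=8):
    """Return the winning node of a single arbitration round.

    Each node transmits its priority MSB-first; a node withdraws as soon as it
    sends a recessive bit (0) while another contender sends a dominant bit (1).
    This is the generic principle only; ACTP realises dominant bits with black
    bursts and adds configurable arbitration radii.
    """
    contenders = dict(priorities)  # node id -> priority value
    for bit in reversed(range(width)):
        dominant_seen = any((p >> bit) & 1 for p in contenders.values())
        if dominant_seen:
            contenders = {n: p for n, p in contenders.items() if (p >> bit) & 1}
    # All remaining contenders share the highest transmitted priority value.
    return max(contenders, key=contenders.get)

print(binary_countdown({"node_a": 0b0101, "node_b": 0b0110, "node_c": 0b0011}, width=4))
```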
In the second part of this thesis, the focus is shifted from concrete deterministic protocols to their model-driven development with the Specification and Description Language (SDL). Though SDL is well-established in the domain of telecommunication and distributed systems, the predictability of its implementations is often insufficient, as previous projects have shown. To increase this predictability and to improve SDL's applicability to time-critical systems, real-time tasks, a proven concept in the design of real-time systems, are transferred to SDL and extended to cover node-spanning system tasks. In this regard, a priority-based execution and suspension model is introduced in SDL, which enables task-specific priority assignments in the SDL specification that are orthogonal to the static structure of SDL systems and control transition execution orders at design as well as implementation level. Both the formal incorporation of real-time tasks into SDL and their implementation in a novel scheduling strategy are discussed in this context. By means of evaluations on wireless sensor nodes, evidence is provided that these extensions reduce worst-case execution times substantially, and improve the predictability of SDL implementations and the language's applicability to real-time systems.
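As a rough, purely illustrative sketch of a priority-based execution model of this kind (the concrete SDL semantics, suspension rules, and scheduling strategy of the thesis are not reproduced, and all names below are hypothetical), ready transitions can be ordered by the priority of the task they belong to.

```python
import heapq

class TransitionScheduler:
    """Execute ready transitions in order of task priority (lower value = higher priority).

    A deliberately simplified model: real SDL task scheduling also has to deal
    with signal queues, suspension, and node-spanning tasks.
    """
    def __init__(self):
        self._ready = []
        self._counter = 0  # tie-breaker preserving FIFO order within a priority

    def make_ready(self, priority, transition):
        heapq.heappush(self._ready, (priority, self._counter, transition))
        self._counter += 1

    def run(self):
        while self._ready:
            _, _, transition = heapq.heappop(self._ready)
            transition()

scheduler = TransitionScheduler()
scheduler.make_ready(2, lambda: print("background logging transition"))
scheduler.make_ready(0, lambda: print("time-critical sensor transition"))
scheduler.make_ready(1, lambda: print("protocol timer transition"))
scheduler.run()
```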
In the digital era we live in, users can access an abundance of digital resources in their daily life. These digital resources can be located on the user's devices, in traditional repositories such as intranets or digital libraries, but also in open environments such as the World Wide Web.
To be able to efficiently work with this abundance of information, users need support to get access to the resources that are relevant to them. Access to digital resources can be supported in various ways. Whether we talk about technologies for browsing, searching, filtering, ranking, or recommending resources: what they all have in common is that they depend on the available information (i.e., resources and metadata). The accessibility of digital resources that meet a user's information need, and the existence and quality of metadata is crucial for the success of any information system.
This work focuses on how social media technologies can support the access to digital resources. In contrast to closed and controlled environments where only selected users have the rights to contribute digital resources and metadata, and where this contribution involves a social process of formal agreement of the relevant stakeholders, potentially any user can easily create and provide information in social media environments. This usually leads to a larger variety of resources and metadata, and allows for dynamics that would otherwise hardly be possible.
Most information systems still mainly rely on traditional top-down approaches where only selected stakeholders can contribute information. The main idea of this thesis is an approach that allows for introducing the characteristics of social media environments into such traditional contexts. The requirements for such an approach are examined, as well as the benefits and potential it can provide.
The ALOE infrastructure was developed according to the identified requirements and realises a Social Resource and Metadata Hub. Case studies and evaluation results are provided to show the impact of the approach on the user's behaviours and the creation of digital resources and metadata, and to justify the presented approach.
Maintaining complex software systems tends to be a costly activity where software engineers spend a significant amount of time trying to understand the system's structure and behavior. As early as the 1980s, operation and maintenance costs were already twice as high as the initial development costs. Since then these costs have steadily increased. The focus of this thesis is to reduce these costs through novel interactive exploratory visualization concepts and to apply these modern techniques in the context of services offered by software quality analysis.
Costs associated with the understanding of software are governed by specific features of the system in terms of different domains, including re-engineering, maintenance, and evolution. These features are reflected in software measurements or inner qualities such as extensibility, reusability, modifiability, testability, compatibility, or adaptability. The presence or absence of these qualities determines how easily a software system can conform or be customized to meet new requirements. Consequently, the need arises to monitor and evaluate the qualitative state of a software system in terms of these qualities. Using metrics-based analysis, production costs and quality defects of the software can be recorded objectively and analyzed.
In practice, there are a number of free and commercial tools that analyze the inner quality of a software system through the use of software metrics. However, most of these tools focus on software data mining and metrics (computational analysis) and only a few support visual analytical reasoning. Typically, computational analysis tools generate data and software visualization tools facilitate the exploration and explanation of this data through static or interactive visual representations. Tools that combine these two approaches focus only on well-known metrics and lack the ability to examine user-defined metrics. Further, they are often confined to simple visualization methods and metaphors, including charts, histograms, scatter plots, and node-link diagrams.
The goal of this thesis is to develop methodologies that combine computational analysis methods with sophisticated visualization methods and metaphors through an interactive visual analysis approach. This approach promotes an iterative knowledge discovery process through multiple views of the data where analysts select features of interest in one of the views and inspect data items of the selected subset in all of the views. On the one hand, we introduce a novel approach for the visual analysis of software measurement data that captures complete facts of the system, employs a flow-based visual paradigm for the specification of software measurement queries, and presents measurement results through integrated software visualizations. This approach facilitates the on-demand computation of desired features and supports interactive knowledge discovery - the analyst can gain more insight into the data through activities that involve: building a mental model of the system; exploring expected and unexpected features and relations; and generating, verifying, or rejecting hypotheses with visual tools. On the other hand, we have also extended existing tools with additional views of the data for the presentation and interactive exploration of system artifacts and their inter-relations.
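The flow-based specification of software measurement queries is only characterised abstractly above. The following sketch shows what a comparable pipeline over a fact base could look like in Python; all fact fields, operator names, and thresholds are hypothetical and are not taken from the thesis's actual query language.

```python
# Hypothetical fact base: one record per method of the analysed system.
facts = [
    {"module": "parser", "method": "parse", "loc": 210, "cyclomatic": 18},
    {"module": "parser", "method": "tokenize", "loc": 80, "cyclomatic": 6},
    {"module": "ui", "method": "render", "loc": 340, "cyclomatic": 25},
]

# A flow-based query assembled from small operators: filter -> derive -> rank.
def flow(records, *operators):
    for op in operators:
        records = op(records)
    return list(records)

complex_methods = flow(
    facts,
    lambda rs: (r for r in rs if r["cyclomatic"] > 10),                   # filter on a metric
    lambda rs: ({**r, "risk": r["loc"] * r["cyclomatic"]} for r in rs),   # derive a new measure
    lambda rs: sorted(rs, key=lambda r: r["risk"], reverse=True),         # rank for inspection
)

for record in complex_methods:
    print(record["module"], record["method"], record["risk"])
```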
Contributions of this thesis have been integrated into two different prototype tools. First evaluations of these tools show that they can indeed improve the understanding of large and complex software systems.
Industrial design has a long history. With the introduction of Computer-Aided Engineering, industrial design was revolutionised. Due to the newly found support, the design workflow changed, and with the introduction of virtual prototyping, new challenges arose. These new engineering problems have triggered new basic research questions in computer science.
In this dissertation, I present a range of methods which support different components of the virtual design cycle, from modifications of a virtual prototype and optimisation of said prototype, to analysis of simulation results.
Starting with a virtual prototype, I support engineers by supplying intuitive discrete normal vectors which can be used to interactively deform the control mesh of a surface. I provide and compare a variety of different normal definitions which have different strengths and weaknesses. The best choice depends on the specific model and on an engineer’s priorities. Some methods have higher accuracy, whereas other methods are faster.
I further provide an automatic means of surface optimisation in the form of minimising total curvature. This minimisation reduces surface bending, and therefore, it reduces material expenses. The best results are obtained for analytic surfaces; however, the technique can also be applied to real-world examples.
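In one common formulation (the precise definition and discretisation used in this work are not reproduced here), the total curvature of a surface S with principal curvatures \kappa_1 and \kappa_2 is the bending energy

E(S) = \int_S \left( \kappa_1^2 + \kappa_2^2 \right) \, dA,

and minimising E flattens unnecessary bending of the surface while the design constraints keep its overall shape.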
Moreover, I provide engineers with a curvature-aware technique to optimise mesh quality. This helps to avoid degenerate triangles which can cause numerical issues. It can be applied to any component of the virtual design cycle: as a direct modification of the virtual prototype (depending on the surface definition), during optimisation, or dynamically during simulation.
Finally, I have developed two different particle relaxation techniques that both support two components of the virtual design cycle. The first component for which they can be used is discretisation. To run computer simulations on a model, it has to be discretised. Particle relaxation takes an initial sampling and improves it with the goal of uniform distances or curvature-awareness. The second component for which they can be used is the analysis of simulation results. Flow visualisation is a powerful tool in supporting the analysis of flow fields through the insertion of particles into the flow, and through tracing their movements. The particle seeding is usually uniform, e.g. for an integral surface, one could seed on a square. Integral surfaces undergo strong deformations, and they can have highly varying curvature. Particle relaxation redistributes the seeds on the surface depending on surface properties like local deformation or curvature.
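As a hedged illustration of the general relaxation idea (a generic uniform-spacing variant in one dimension, not either of the two specific techniques developed in this work), particles can be moved repeatedly toward the midpoint of their neighbours until the distances even out.

```python
def relax_uniform(points, iterations=50, step=0.5):
    """Simple 1D particle relaxation toward uniform spacing.

    points: sorted particle positions along a curve parameterisation; the two
    end points stay fixed. A curvature-aware variant would scale the step by a
    local target density derived from curvature instead of aiming for uniform
    spacing.
    """
    pts = list(points)
    for _ in range(iterations):
        for i in range(1, len(pts) - 1):
            midpoint = 0.5 * (pts[i - 1] + pts[i + 1])
            pts[i] += step * (midpoint - pts[i])
    return pts

# An initially clustered sampling of [0, 1] relaxes toward uniform spacing.
print(relax_uniform([0.0, 0.05, 0.1, 0.2, 0.9, 1.0]))
```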