Algorithms increasingly govern people's lives, including through rapidly spreading applications in the public sector. This paper sheds light on the acceptance of algorithms used by the public sector, emphasizing that algorithms, as parts of socio-technical systems, are always embedded in a specific social context. We show that citizens' acceptance of an algorithm is strongly shaped by how they evaluate aspects of this context, namely the personal importance of the specific problems an algorithm is supposed to help address and their trust in the organizations deploying the algorithm. The objective performance of the presented algorithms has a much smaller effect on acceptance. These findings are based on an original dataset from a survey covering two real-world applications, predictive policing and skin cancer prediction, with a sample of 2661 respondents from a representative German online panel. The results have important implications for the conditions under which citizens will accept algorithms in the public sector.
The development of algorithmic differentiation (AD) tools focuses mostly on handling floating point types in the target language. Taping optimizations in these tools mostly target specific operations like matrix-vector products. Aggregated types like std::complex are usually handled by specifying the AD type as a template argument. This approach provides exact results, but prevents the use of expression templates. If AD tools are extended and specialized so that aggregated types can be added to the expression framework, this results in reduced memory utilization and improved run times for applications that use aggregated types such as complex-number or matrix-vector operations. Such an integration requires a reformulation of the stored data per expression and a rework of the tape evaluation process. We demonstrate the overheads on a synthetic benchmark and show the improvement when aggregated types are handled properly by the expression framework of the AD tool.
A stereoselective synthesis of isoindolo[2,1-a]quinolin-11(5H)-ones containing three contiguous stereogenic centers is described. This Lewis-acid mediated reaction of enamides with N-aryl-acylimines affords the desired fused heterocyclic isoindolinones in high yields and diastereoselectivities. Scope and limitations of this method are discussed. The stereochemical outcome of this transformation indicates a stepwise reaction pathway.
The measurement of self-diffusion coefficients using pulsed-field gradient (PFG) nuclear magnetic resonance (NMR) spectroscopy is a well-established method. Recently, benchtop NMR spectrometers with gradient coils have also been used, which greatly simplify these measurements. However, a disadvantage of benchtop NMR spectrometers is the lower resolution of the acquired NMR signals compared to high-field NMR spectrometers, which requires sophisticated analysis methods. In this work, we use a recently developed quantum mechanical (QM) model-based approach for the estimation of self-diffusion coefficients from complex benchtop NMR data. With the knowledge of the species present in the mixture, signatures for each species are created and adjusted to the measured NMR signal. With this model-based approach, the self-diffusion coefficients of all species in the mixtures were estimated with a discrepancy of less than 2 % compared to self-diffusion coefficients estimated from high-field NMR data sets of the same mixtures. These results suggest benchtop NMR is a reliable tool for quantitative analysis of self-diffusion coefficients, even in complex mixtures.
We compute three-dimensional displacement vector fields to estimate the deformation of microstructural data sets in mechanical tests. For this, we extend the well-known optical flow method by Brox et al. to three dimensions, with special focus on the discretization of the nonlinear terms. We evaluate our method first by synthetically deforming foams and comparing against this ground truth, and second with data sets of samples that underwent real mechanical tests. Our results are compared to those from state-of-the-art algorithms in materials science and medical image registration. A thorough evaluation shows that our proposed method resolves the displacements best among all chosen comparison methods.
We study the sensor fault estimation and accommodation problems in a data-driven \(\mathcal{H}_\infty\) setting, leading to a data-driven sensor fault-tolerant control scheme. First, we formulate the fault estimation problem as a finite-horizon minimax \(\mathcal{H}_\infty\)-optimization problem in a data-driven setup, whose solution yields the fault estimate. The estimated fault is then used for output compensation. This compensated output and the experimental input are used to achieve certain control objectives in a data-driven \(\mathcal{H}_\infty\) setting. Next, the data-driven \(\mathcal{H}_\infty\) fault estimation and control problems are solved using a subspace predictor-based approach. Finally, the proposed algorithm is applied to the steering subsystem of a remotely operated underwater vehicle.
Opposition parties under minority governments find themselves in a fundamental dilemma. They are competing with other parties, including the government, for electoral support while also having a common responsibility to make stable government work. This dilemma is especially pronounced for opposition parties signing support agreements with the government. While not formally in a coalition, they nonetheless publicly commit to supporting a government. They may thus be concerned about losing distinctiveness and have an interest in strategically timing cooperation with the minority government. The present paper tests whether this is the case using data on opposition party voting on committee proposals from 23 years of Swedish minority governments between 1991 and 2018. The findings indicate that support parties are less likely to support the government towards the beginning and end of the election cycle, that is, when public attention is intense – a pattern that is not observable for other opposition parties.
With direct laser writing, microstructures can be manufactured by solidifying a photoresist when the laser beam triggers a photochemical reaction in the focal voxel. We have used direct laser writing to fabricate a thermally actuated microgripper, which can move its two cantilever-like arms to grip micro-objects. Each cantilever consists of two strips with different coefficients of thermal expansion, so that the two cantilevers bend towards each other with increasing temperature, like a welded bimetal. This work theoretically investigates the impact of each cantilever's geometry on the gripping performance of the microgripper. The tip deflection of the gripper is calculated with the analytical model of Timoshenko's theory of elasticity. After fabrication of the microgripper, its gripping performance is observed under the microscope while it is heated by a heating element.
The quality of risk reports: Integrating requirement levels of standard setters into text analysis
(2021)
The intention of this paper is to shed light on the analysis of financial disclosure through the integration of requirement levels. This in turn leads to the development of a generally applicable evaluation methodology based on Bloom's taxonomy system, making it possible to explicitly consider the relevance of the given information. To underline the appropriateness of our method, we combine the requirement levels with a qualitative content analysis. Based on the German accounting standard DRS 20, we clarify the respective application of the requirement levels in the context of the qualitative content analysis. We also discuss the limitations of our developed approach and analyze further areas of application in the context of qualitative analysis of financial disclosure. All things considered, it is evident that our chosen approach, through the integration of a taxonomy system, contributes to the validity of established text analysis methods.
Firn describes the intermediate product between snow and ice in cold regions of the Earth, where annual snowfall exceeds the amount of snow melting. The continuing accumulation of snow leads to its densification under overburden stress until it becomes ice. In the field of glaciology, various attempts at simulating firn densification have been made and new models are still being developed, as knowledge of the firn column's density structure allows important inferences.
The presented study reassesses a model description for low-density firn based on the process of grain boundary sliding presented by Alley in 1987 [1], using an optimisation approach. By comparing simulation results to 159 measured firn density profiles from Greenland and Antarctica, it finds a possible additional dependency of the constitutive relation on the mean surface mass balance. This result is interpreted as indicating an insufficient description of the stress regime.