Refine
Year of publication
- 1997 (97)
Document Type
- Preprint (66)
- Article (17)
- Report (9)
- Doctoral Thesis (2)
- Diploma Thesis (1)
- Master's Thesis (1)
- Periodical (1)
Keywords
- AG-RESY (7)
- PARO (7)
- SKALP (2)
- Anisotropic smoothness classes (1)
- Bayesrisiko (1)
- Bewegungsplanung (1)
- Brownian motion (1)
- C (1)
- CAx-Anwendungen (1)
- CODET (1)
- CoMo-Kit (1)
- Dense gas (1)
- Diffusionsprozess (1)
- Elliptic-parabolic equation (1)
- Enskog equation (1)
- Function of bounded variation (1)
- Integral transform (1)
- Intelligent Object Fusion (1)
- Internet knowledge base (1)
- Internet knowledge reuse (1)
- Jacobian (1)
- Java (1)
- Kohonen's SOM (1)
- Laplace transform (1)
- Locally stationary processes (1)
- Moment sequence (1)
- Netz-Architekturen (1)
- Netzwerkmanagement (1)
- Neural networks (1)
- PVM (1)
- Panel clustering (1)
- Parallel Virtual Machines (1)
- Robotik (1)
- Scalar-type operator (1)
- Software Agents (1)
- Stieltjes transform (1)
- Suchve (1)
- Tcl (1)
- Workstation-Cluster (1)
- adaptive estimation (1)
- asymptotic analysis (1)
- authentication (1)
- automated theorem proving (1)
- autonomous systems (1)
- average density (1)
- business process reengineering (1)
- byte code (1)
- compact operator equation (1)
- density distribution (1)
- drift-diffusion limit (1)
- dynamical systems (1)
- entropy (1)
- finite pointset method (1)
- finite-difference methods (1)
- higher-order calculi (1)
- interpreter (1)
- kinetic semiconductor equations (1)
- kinetic theory (1)
- lacunarity distribution (1)
- local stationarity (1)
- localization (1)
- logarithmic averages (1)
- migration (1)
- minimax estimation (1)
- motion planning (1)
- multi-language (1)
- multiresolution (1)
- non-linear wavelet thresholding (1)
- non-stationary time series (1)
- numerical integration (1)
- numerical methods for stiff equations (1)
- object-oriented software modeling (1)
- occupation measure (1)
- one-dimensional self-organization (1)
- optimal rate of convergence (1)
- order-three density (1)
- parallel algorithms (1)
- parallel numerical algorithms (1)
- parallel processing (1)
- parallelism and concurrency (1)
- particle method (1)
- persistence (1)
- phase-space (1)
- porous media (1)
- quantum chaos (1)
- quantum mechanics (1)
- quantum tunneling (1)
- regularization wavelets (1)
- review (1)
- robot control (1)
- robot kinematics (1)
- robotics (1)
- security domain (1)
- semiclassical quantisation (1)
- shock wave (1)
- software reuse (1)
- spline and wavelet based determination of the geoid and the gravitational potential (1)
- stationarity (1)
- tensor product basis (1)
- test (1)
- threshold choice (1)
- time series (1)
- time-frequency plane (1)
- time-varying covariance (1)
- wavelet thresholding (1)
- wavelets (1)
- winner definition (1)
Faculty / Organisational entity
- Kaiserslautern - Fachbereich Mathematik (36)
- Kaiserslautern - Fachbereich Informatik (35)
- Kaiserslautern - Fachbereich Physik (18)
- Kaiserslautern - Fachbereich Maschinenbau und Verfahrenstechnik (4)
- Kaiserslautern - Fachbereich Wirtschaftswissenschaften (3)
- Kaiserslautern - Fachbereich Elektrotechnik und Informationstechnik (1)
Retrieving multiple cases is supposed to be an adequate retrieval strategy for guiding partial-order planners because of the recognized flexibility of these planners to interleave steps in their plans. Cases are combined by merging them. In this paper, we examine two different ways of merging cases in the context of partial-order planning. We will see that merging cases can be very difficult if the cases are merged eagerly. On the other hand, if cases are merged so as to avoid redundant steps, the guidance provided by the additional cases tends to decrease with the number of covered goals and retrieved cases in domains having a certain kind of interaction. Thus, in these domains, retrieving a single case covering many of the goals of the problem, or retrieving fewer cases covering many of the goals, is at least as effective as retrieving several cases covering all goals.
Estelle is an internationally standardized formal description technique (FDT) designed for the specification of distributed systems, in particular communication protocols. An Estelle specification describes a system of communicating components (module instances). The specified system is closed in a topological sense, i.e. it has no ability to interact with an environment. Because of this restriction, open systems can only be specified together with, and incorporated into, an environment. To overcome this restriction, we introduce a compatible extension of Estelle, called "Open Estelle". It allows the specification of (topologically) open systems, i.e. systems that have the ability to communicate with any environment through a well-defined external interface. We define a formal syntax and a formal semantics for Open Estelle, both based on and extending the syntax and semantics of Estelle. The extension is compatible both syntactically and semantically, i.e. Estelle is a subset of Open Estelle. In particular, the formal semantics of Open Estelle reduces to the Estelle semantics in the special case of a closed system. Furthermore, we present a tool for the textual integration of open systems into environments specified in Open Estelle, and a compiler for the automatic generation of implementations directly from Open Estelle specifications.
It is of basic interest to assess the quality of the decisions of a statistician, based on the data resulting from a statistical experiment, in the context of a given model class P of probability distributions. The statistician picks a particular distribution P, suffering a loss by not picking the 'true' distribution P'. There are several relevant loss functions, one being based on the relative entropy function or Kullback-Leibler information distance. In this paper we prove a general 'minimax risk equals maximin (Bayes) risk' theorem for the Kullback-Leibler loss under the hypothesis of a dominated and compact family of distributions over a Polish observation space with suitably integrable densities. We also find that there is always an optimal Bayes strategy (i.e. a suitable prior) achieving the minimax value. Further, we see that every such minimax optimal strategy leads to the same distribution P in the convex closure of the model class. Finally, we give some examples to illustrate the results and to indicate how the minimax result is reflected in the structure of least favorable priors. This paper is mainly based on parts of the author's doctoral thesis.
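In schematic form (suppressing the role of the observed data and all regularity conditions; this is a paraphrase for orientation, not the paper's exact statement), the central identity reads:

```latex
% Minimax risk equals maximin (Bayes) risk for Kullback-Leibler loss,
% over a model class \mathcal{P} and priors \pi on \mathcal{P}:
\[
  \inf_{Q}\,\sup_{P \in \mathcal{P}} \mathrm{KL}(P \,\|\, Q)
  \;=\;
  \sup_{\pi}\,\inf_{Q} \int_{\mathcal{P}} \mathrm{KL}(P \,\|\, Q)\,\pi(\mathrm{d}P) .
\]
```

A least favorable prior attains the outer supremum, and the corresponding minimax optimal distribution lies in the convex closure of the model class.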
The Multiple Objective Median Problem involves locating a new facility so that a vector of performance criteria is optimized over a given set of existing facilities. A variation of this problem is obtained if the existing facilities are situated on two sides of a linear barrier. Such barriers, like rivers, highways, borders, or mountain ranges, are frequently encountered in practice. In this paper, the theory of the Multiple Objective Median Problem with line barriers is developed. As this problem is nonconvex but specially structured, a reduction to a series of convex optimization problems is proposed. The general results lead to a polynomial algorithm for finding the set of efficient solutions. The algorithm is presented for bi-criteria problems with different measures of distance.
An asymptotic-induced scheme for kinetic semiconductor equations with the diffusion scaling is developed. The scheme is based on the asymptotic analysis of the kinetic semiconductor equation. It works uniformly for all ranges of mean free paths. The velocity discretization is done using quadrature points equivalent to a moment expansion method. Numerical results for different physical situations are presented.
Due to continuously increasing demands in the area of advanced robot control, it has become necessary to speed up the computation. One way to reduce the computation time is to distribute the computation onto several processing units. In this survey we present different approaches to the parallel computation of robot kinematics and the Jacobian, discussing both the forward and the inverse problem. We introduce a classification scheme and classify the references by this scheme.
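For reference, a minimal sketch of what such a computation looks like for a planar two-link arm (an illustrative example, not taken from the survey): the Jacobian is built column by column, and since the columns are mutually independent, they are natural tasks for distribution onto separate processing units.

```python
import math

def forward_kinematics(theta, l1=1.0, l2=1.0):
    """End-effector position of a planar two-link arm with joint angles theta."""
    x = l1 * math.cos(theta[0]) + l2 * math.cos(theta[0] + theta[1])
    y = l1 * math.sin(theta[0]) + l2 * math.sin(theta[0] + theta[1])
    return (x, y)

def jacobian(theta, h=1e-6):
    """Finite-difference Jacobian of the forward kinematics.
    Each column perturbs one joint only, so the columns are independent
    and could be computed in parallel on separate processing units."""
    base = forward_kinematics(theta)
    cols = []
    for j in range(len(theta)):
        pert = list(theta)
        pert[j] += h
        fx, fy = forward_kinematics(pert)
        cols.append(((fx - base[0]) / h, (fy - base[1]) / h))
    # transpose the list of columns into a 2 x n matrix (rows = x, y)
    return [[cols[j][i] for j in range(len(theta))] for i in range(2)]
```

For this geometry the result can be checked against the analytic Jacobian, e.g. ∂x/∂θ₁ = -l1·sin θ₁ - l2·sin(θ₁+θ₂).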
We develop a test for stationarity of a time series against the alternative of a time-changing covariance structure. Using localized versions of the periodogram, we obtain empirical versions of a reasonable notion of a time-varying spectral density. Coefficients w.r.t. a Haar wavelet series expansion of such a time-varying periodogram are a possible indicator of whether there is some deviation from covariance stationarity. We propose a test based on the limit distribution of these empirical coefficients.
We report on Brillouin light scattering investigations of the elastic properties in Co/Ni superlattices which exhibit localized electronic eigenstates near the Fermi level, causing an oscillation of the resistivity as a function of the superlattice periodicity A. No oscillations of the Rayleigh and Sezawa modes as a function of A could be observed within an error margin of ±2%, indicating that the localized electronic states do not contribute to the elastic constants.
This paper shows an approach to profit from type information about planning objects in a partial-order planner. The approach turns out to combine representational and computational advantages. On the one hand, type hierarchies allow better structuring of domain specifications. On the other hand, operators contain type constraints which reduce the search space of the planner as they partially achieve the functionality of filter conditions.
This paper is a continuation of a joint paper with B. Martin [MS] dealing with the problem of direct sum decompositions. The techniques of that paper are used to decide whether two modules are isomorphic or not. A positive answer to this question has many applications, for example for the classification of maximal Cohen-Macaulay modules over local algebras as well as for the study of projective modules. Up to now, computer algebra normally deals with equality of ideals or modules, which depends on chosen embeddings. The present algorithm allows one to switch to isomorphism classes, which is more natural in the sense of commutative algebra and algebraic geometry.
An asymptotic-induced scheme for nonstationary transport equations with the diffusion scaling is developed. The scheme works uniformly for all ranges of mean free paths. It is based on the asymptotic analysis of the diffusion limit of the transport equation. A theoretical investigation of the behaviour of the scheme in the diffusion limit is given and an approximation property is proven. Moreover, numerical results for different physical situations are shown and the uniform convergence of the scheme is established numerically.
This paper describes an Internet-scalable knowledge base infrastructure for managing the knowledge used by an intelligent software productivity infrastructure system. The infrastructure provides workable solutions for several significant issues: (1) Internet-unique names for pieces of knowledge; (2) multi-platform, multi-language support; (3) distributed knowledge base synchronization mechanisms; (4) support for extensive customized variations in knowledge content; and (5) knowledge caching mechanisms for improved system performance. The infrastructure described here is a workable example of the kind of infrastructure that will be required to manage the evolution and reuse of millions of pieces of knowledge in the future.
Static magnetic and spin wave properties of square lattices of permalloy micron dots with thicknesses of 500 Å and 1000 Å and with varying dot separations have been investigated. A magnetic fourfold anisotropy was found for the lattice with dot diameters of 1 micrometer and a dot separation of 0.1 micrometer. The anisotropy is attributed to an anisotropic dipole-dipole interaction between magnetically unsaturated parts of the dots. The anisotropy strength (order of 10^5 erg/cm^3) decreases with increasing in-plane applied magnetic field.
Annual Report
(1997)
Fabric reinforced thermoplastic composites, suitable for the production of thin-walled, high-strength structural parts, are available on the market today with various fibre/matrix combinations. However, further market penetration and series production are inhibited as long as forming technologies are not well understood. In this thesis, the potential for series production of different forming technologies is evaluated. Stamp forming is an efficient way to produce parts in very short cycle times. A limiting factor to part complexity is undesired wrinkle formation as a consequence of insufficient fabric shear. Fabric shear and other important deformations of impregnated fabrics were examined by means of new test devices. Evidence was found that membrane tension is the crucial factor to avoid wrinkle formation. New tool concepts and processing windows were developed to produce fabric reinforced thermoplastic parts free of wrinkles and distortions.
An unusual interlayer coupling, recently discovered in layered magnetic systems, is analysed from the experimental and theoretical points of view. This coupling favours a 90° orientation of the magnetizations of the adjacent magnetic films. It can be phenomenologically described by a term in the energy expression which is biquadratic with respect to the magnetizations of the two films. The main experimental findings, as well as the theoretical models explaining the phenomenon, are discussed.
Brillouin light scattering investigations of exchange biased (110)-oriented NiFe/FeMn bilayers
(1997)
All contributing magnetic anisotropies in (110)-oriented exchange biased Ni80Fe20/Fe50Mn50 double layers prepared by molecular beam epitaxy on Cu(110) single crystals have been determined by means of Brillouin light scattering. Upon covering the Ni80Fe20 films with Fe50Mn50, a unidirectional anisotropy contribution appears, which is consistent with the measured exchange bias field. The uniaxial and fourfold in-plane anisotropy contributions are largely modified by an amount which scales with the Ni80Fe20 thickness, indicating an interface effect. The strong uniaxial anisotropy contribution shows an in-plane switching of the easy axis from [110] to [001] with increasing Ni80Fe20 layer thickness. The large mode width of the spin wave excitations, which exceeds the linewidth of uncovered Ni80Fe20 films by a factor of more than six, indicates large spatial variations of the exchange coupling constant. (C) 1998 American Institute of Physics.
The tunneling splitting of the energy levels of a ferromagnetic particle in the presence of an applied magnetic field - previously derived only for the ground state with the path integral method - is obtained in a simple way from Schrödinger theory. The origin of the factors entering the result is clearly understood, in particular the effect of the asymmetry of the barriers of the potential. The method should appeal particularly to experimentalists searching for evidence of macroscopic spin tunneling.
An analogue of the classical Riemann-Siegel integral formula for Dirichlet series associated to cusp forms is developed. As an application of the formula, we give a comparatively simple proof of the approximate functional equation for this type of Dirichlet series.
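For orientation, the classical prototype for the Riemann zeta function (the cusp-form analogue treated here is structurally similar, with the Dirichlet coefficients of the cusp form in place of 1) is the Hardy-Littlewood approximate functional equation:

```latex
% Approximate functional equation for \zeta(s), s = \sigma + i t,
% valid for 0 \le \sigma \le 1, x, y \ge 1, 2\pi x y = |t|:
\[
  \zeta(s) \;=\; \sum_{n \le x} n^{-s}
  \;+\; \chi(s) \sum_{n \le y} n^{s-1}
  \;+\; O\!\bigl( x^{-\sigma} + |t|^{1/2-\sigma}\, y^{\sigma-1} \bigr),
\]
% where \chi(s) = 2^s \pi^{s-1} \sin(\pi s / 2)\, \Gamma(1-s) is the factor
% in the functional equation \zeta(s) = \chi(s)\, \zeta(1-s).
```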
We compare different notions of differentiability of a measure along a vector field on a locally convex space. We consider in the \(L^2\)-space of a differentiable measure the analogues of the classical concepts of gradient, divergence and Laplacian (which coincides with the Ornstein-Uhlenbeck operator in the Gaussian case). We use these operators to extend the basic results of Malliavin and Stroock on the smoothness of finite dimensional image measures under certain nonsmooth mappings to the case of non-Gaussian measures. The proof of this extension is quite direct and does not use any chaos decomposition. Finally, the role of this Laplacian in the procedure of quantization of anharmonic oscillators is discussed.
The first observation of self-focusing of dipolar spin waves in garnet film media is reported. In particular, we show that the quasi-stationary diffraction of a finite-aperture spin wave beam in a focusing medium leads to the concentration of the wave power in one focal point rather than along a certain line (channel). The obtained results demonstrate the wide applicability of non-linear spin wave media to study non-linear wave phenomena using an advanced combined microwave-Brillouin light scattering technique for a two-dimensional mapping of the spin wave amplitudes.
Motion planning for industrial robots is a necessary prerequisite for autonomous systems to move through their environment without collisions. Taking dynamic obstacles into account at runtime, however, requires powerful algorithms to solve this task in real time. One way to speed up these algorithms is the efficient use of scalable parallel processing. The software implementation can only succeed, though, if a parallel computer is available that offers high data throughput at low latency. Moreover, this parallel computer must be operable with reasonable effort and offer a good price/performance ratio if parallel processing is to see wider industrial use. In this article, we present a workstation cluster based on nine standard PCs interconnected by a special communication card. The individual sections report the experience gathered during commissioning, system administration, and application. As an example application on this cluster, a parallel motion planner for industrial robots is described.
A formula suitable for a quantitative evaluation of the tunneling effect in a ferromagnetic particle is derived with the help of the instanton method. The tunneling between the n-th degenerate states of neighboring wells is dominated by a periodic pseudoparticle configuration. The low-lying level splitting previously obtained with the LSZ method in field theory, in which the tunneling is viewed as the transition of n bosons induced by the usual (vacuum) instanton, is recovered. The observation made with our new result is that the tunneling effect increases at excited states. The results should be useful in analyzing results of experimental tests of macroscopic quantum coherence in ferromagnetic particles.
One of the many capabilities needed to support the activities of autonomous systems is motion planning. It enables robots to move safely in their environment and to accomplish given tasks. Unfortunately, the control loop comprising sensing, planning, and acting has not yet been closed for robots in dynamic environments. One reason is the long execution time of the motion planning component. A solution to this problem is offered by the use of highly parallel computation. Thus, an important task is the parallelization of existing motion planning algorithms for robots so that they are suitable for highly parallel execution. In several cases, completely new algorithms have to be designed so that a parallelization is feasible. In this survey, we review recent approaches to motion planning using parallel computation. As a classification scheme, we use the structure given by the different approaches to the robot's motion planning. For each approach, the available parallel processing methods are discussed, and each approach is uniquely assigned a class. Finally, for each referenced research work, a list of keywords is given.
We present a general framework for developing search heuristics for automated theorem provers. This framework allows for the construction of heuristics that are on the one hand able to replay (parts of) a given proof found in the past, but are on the other hand flexible enough to deviate from the given proof path in order to solve similar proof problems. We substantiate the abstract framework by presenting three distinct techniques for learning appropriate search heuristics based on so-called features. We demonstrate the usefulness of these techniques in the area of equational deduction. Comparisons with the renowned theorem prover Otter validate the applicability and strength of our approach.
We present a method for making use of past proof experience called flexible re-enactment (FR). FR is actually a search-guiding heuristic that uses past proof experience to create a search bias. Given a proof P of a problem solved previously that is assumed to be similar to the current problem A, FR searches for P and in the "neighborhood" of P in order to find a proof of A. This heuristic use of past experience has certain advantages that make FR quite profitable and give it a wide range of applicability. Experimental studies substantiate and illustrate this claim. This work was supported by the Deutsche Forschungsgemeinschaft (DFG).
\(C^0\)-scalar-type spectrality criteria for operators \(A\) whose resolvent set contains the negative reals are provided. The criteria are given in terms of growth conditions on the resolvent of \(A\) and the semigroup generated by \(A\). These criteria characterize scalar-type operators on the Banach space \(X\) if and only if \(X\) has no subspace isomorphic to the space of complex null-sequences.
The Fock space of bosons and fermions and its underlying superalgebra are represented by algebras of functions on a superspace. We define Gaussian integration on infinite dimensional superspaces, and construct superanalogs of the classical function spaces with a reproducing kernel - including the Bargmann-Fock representation - and of the Wiener-Segal representation. The latter representation requires the investigation of Wick ordering on \(\mathbb{Z}_2\)-graded algebras. As an application we derive a Mehler formula for the Ornstein-Uhlenbeck semigroup on the Fock space.
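For comparison, the classical (non-super) Mehler formula expresses the Ornstein-Uhlenbeck semigroup on Gaussian \(L^2\) as an integral against the Gaussian measure; the superspace result is an analogue of this identity:

```latex
% Classical Mehler formula: N is the number operator (Ornstein-Uhlenbeck
% generator), \gamma the standard Gaussian measure.
\[
  \bigl( e^{-tN} F \bigr)(x)
  \;=\; \int F\!\bigl( e^{-t}\, x + \sqrt{1 - e^{-2t}}\; y \bigr)\, \gamma(\mathrm{d}y),
  \qquad t \ge 0 .
\]
```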
In the Banach space \(c_0\) there exists a continuous function of bounded semivariation which does not correspond to a countably additive vector measure. This result is in contrast to the scalar case, and it has consequences for the characterization of scalar-type operators. Besides this negative result, we introduce the notion of functions of unconditionally bounded variation, which are exactly the generators of countably additive vector measures.
Starting from the mollified version of the Enskog equation for a hard-sphere fluid, a grid-free algorithm to obtain the solution is proposed. The algorithm is based on the finite pointset method. For illustration, it is applied to a Riemann problem. The shock-wave solution is compared to the results of Frezzotti and Sgarra where a good agreement is found.
In this note, answering a question of N. Maslova, we give a two-dimensional elementary example of the phenomenon indicated in the title. Perhaps this simple example may serve as an object of comparison for more refined models like in the theory of kinetic differential equations where similar questions still seem to be unsettled.
This thesis considers the problems posed by rapidly changing industrial CAx applications. The introduction of feature technology appears to make some problems of process parallelization, of simultaneous and concurrent engineering, and of outsourcing surmountable. However, feature technology has so far developed without sufficient reference to design practice, which has led to considerable deficits in industrial use. Studies in the automotive industry (AIFEM initiative) show that this can often be traced back to a lack of communication between designers and CAx experts. Given the current approach of feature technology, combined with the extreme time pressure in product development, there is a danger of optimizing product definition processes solely by the criteria of development time, cost, and product quality, with features serving merely as specially adapted tools. This impedes genuine product innovation. We show how feature technology must be extended to foster designers' creativity and thereby enable novel products. The aspects of user-defined features, data standardization, processing of incomplete information, and dynamic process support are discussed in detail.
Instant Radiosity
(1997)
We present a fundamental procedure for instant rendering from the radiance equation. Operating directly on the textured scene description, the very efficient and simple algorithm produces photorealistic images without any kernel or solution discretization of the underlying integral equation. Rendering rates of a few seconds are obtained by exploiting graphics hardware, the deterministic technique of the quasi-random walk for the solution of the global illumination problem, and the new method of jittered low discrepancy sampling.
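A minimal sketch of the quasi-random walk that drives the method, under illustrative assumptions (a unit-box scene with a ceiling area light and a single diffuse reflectivity rho; the scene geometry, prime choices, and light placement are all assumptions for the sketch, not the authors' implementation). Each particle position becomes a virtual point light that the graphics hardware renders in one shadowed pass; since a deterministic fraction of walks continues at each bounce, every surviving particle carries the same weight.

```python
def halton(i, base):
    """Radical-inverse (van der Corput) of i in the given prime base."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

PRIMES = [2, 3, 5, 7, 11, 13, 17, 19]

def instant_radiosity_vpls(n, rho=0.5, depth=3):
    """Quasi-random walk: start n particles on a unit-square ceiling light
    (z = 1) and reflect them diffusely. About n * rho**j walks survive to
    bounce j, so each virtual point light carries the constant weight 1/n."""
    vpls = []
    for i in range(n):
        # emission point on the ceiling light
        pos = (halton(i, 2), halton(i, 3), 1.0)
        vpls.append((pos, 1.0 / n))
        j = 1
        # deterministic fraction of the walks continues to the next bounce
        while i < n * rho ** j and j <= depth:
            # next diffuse bounce point (schematic: a point on the floor, z = 0)
            pos = (halton(i, PRIMES[2 * j]), halton(i, PRIMES[2 * j + 1]), 0.0)
            vpls.append((pos, 1.0 / n))
            j += 1
    return vpls
```

With n = 8 and rho = 0.5, the walk produces 8 + 4 + 2 + 1 = 15 virtual point lights of equal weight.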
This thesis presents an integration of the temporal-logic processing concepts of the programming language ExTeLL into the object-oriented host language C++. Our goal was to develop an interface for convenient communication between the language components, such that the language synthesis forms a homogeneous overall language. Particular emphasis was placed on exploiting the capabilities of each added language component and on a syntactically uniform structure of the overall language. This required, in particular, the integration of the type concept of C++, as well as of the mechanisms for overloading functions and procedures, into ExTeLL and into the underlying temporal logic EITeL.
We investigate to what extent interpolation mechanisms based on the nearest-neighbor rule (NNR) can support cancer research. The main objective is to use the NNR to predict the likelihood of tumorigenesis based on given risk factors. By using a genetic algorithm to optimize the parameters of the nearest-neighbor prediction, the performance of this interpolation method can be improved substantially. Furthermore, it is possible to detect risk factors which are hardly or not at all relevant to tumorigenesis. Our preliminary studies demonstrate that NNR-based interpolation is a simple tool that nevertheless has enough potential to be seriously considered for cancer research or related research.
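A minimal sketch of such a weighted nearest-neighbor predictor, assuming numeric risk-factor vectors and 0/1 tumorigenesis labels (the data layout, function names, and parameters are illustrative; the genetic algorithm mentioned above would tune the weight vector w, with a near-zero weight flagging a factor as irrelevant):

```python
def weighted_dist(a, b, w):
    """Euclidean distance with one weight per risk factor; a genetic
    algorithm can optimize w, and a near-zero weight marks the
    corresponding factor as hardly relevant to the prediction."""
    return sum(wi * (ai - bi) ** 2 for wi, ai, bi in zip(w, a, b)) ** 0.5

def nnr_predict(query, cases, labels, w, k=3):
    """Predict the likelihood for the query case as the mean label
    (0 = no tumor, 1 = tumor) of the k nearest recorded cases."""
    order = sorted(range(len(cases)),
                   key=lambda i: weighted_dist(query, cases[i], w))
    return sum(labels[i] for i in order[:k]) / k
```

On well-separated synthetic data, a query near the label-0 cluster yields a likelihood of 0.0 and one near the label-1 cluster yields 1.0.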
We report on the observation of quantized surface spin waves in periodic arrays of magnetic Ni81Fe19 wires by means of Brillouin light scattering spectroscopy. At small wavevectors (q_1 = 0 - 0.9 x 10^5 cm^-1) several discrete, dispersionless modes with a frequency splitting of up to 0.9 GHz were observed for the wavevector oriented perpendicular to the wires. From the frequencies of the modes and the wavevector interval where each mode is observed, the modes are identified as dipole-exchange surface spin wave modes of the film with quantized wavevector values determined by the boundary conditions at the lateral edges of the wires. With increasing wavevector the separation of the modes becomes smaller, and the frequencies of the discrete modes converge to the dispersion of the dipole-exchange surface mode of a continuous film.
Thanks to their formal syntax and semantics, formal description techniques (FDTs) permit a precise system description and form the basis for formal verification. When it comes to implementation, however, systems are still implemented by hand, even though mature tools exist for automatically generating code directly from the formal specification. The reason lies in these tools' reputation for producing code with extremely poor performance. Yet there are hardly any quantitative performance comparisons between manually and automatically generated implementations that could support or refute this prejudice. This contribution presents such a performance comparison, based on the high-performance protocol XTP and the FDT Estelle. It provides an assessment of the current state of development in automatic code generation from Estelle specifications, in direct comparison with well-optimized hand implementations. It turns out that, in the considered case of a complex protocol, the hand implementation does perform noticeably better. This performance advantage, however, is bought at the price of a very high implementation effort and the difficulty of ensuring correctness with respect to the specification. In individual applications it can therefore be advantageous, despite the performance losses, to generate code automatically, especially since the assessment found that automatically generated implementations sometimes perform better than expected. Moreover, unlike the already extensively optimized hand implementation, the automatically generated implementation still holds considerable untapped potential for performance improvement.
Liegruppen
(1997)
In this paper we provide a semantical meta-theory to support the development of higher-order calculi for automated theorem proving, just as the corresponding methodology has supported the development of calculi in first-order logic. To reach this goal, we establish classes of models that adequately characterize the existing theorem-proving calculi, that is, classes with respect to which these calculi are sound and complete, and a standard methodology of abstract consistency methods (by providing the necessary model existence theorems) needed to analyze the completeness of machine-oriented calculi.
We study the problem of global solution of Fredholm integral equations. This means that we seek to approximate the full solution function (as opposed to the local problem, where only the value of the solution at a single point or a functional of the solution is sought). We analyze the Monte Carlo complexity, i.e. the complexity of stochastic solution of this problem. The framework for this analysis is provided by information-based complexity theory. Our investigations complement previous ones on the stochastic complexity of local solution and on the deterministic complexity of both local and global solution. The results show that even in the global case Monte Carlo algorithms can perform better than deterministic ones, although the difference is not as large as in the local case.
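As an illustration of why Monte Carlo applies here, a sketch of the textbook Neumann-series random-walk estimator for a Fredholm equation of the second kind, u(x) = f(x) + ∫₀¹ K(x, y) u(y) dy, evaluated at all grid points at once (the global problem). This is the classical estimator, not one of the algorithms analyzed in the paper; all parameters are illustrative.

```python
import random

def mc_fredholm(f, K, xs, n_walks=2000, max_len=20, q=0.5, seed=0):
    """Estimate u on all points xs (the global problem) for
    u(x) = f(x) + integral_0^1 K(x, y) u(y) dy by summing the Neumann
    series with random walks: uniform transitions on [0, 1], survival
    probability q per step, importance weight K/q per step."""
    rng = random.Random(seed)
    est = [0.0] * len(xs)
    for _ in range(n_walks):
        for i, x in enumerate(xs):
            total, weight, cur = f(x), 1.0, x
            for _ in range(max_len):
                if rng.random() > q:      # walk absorbed
                    break
                y = rng.random()          # next state, transition density p(y) = 1
                weight *= K(cur, y) / q   # unbiased importance weight
                total += weight * f(y)
                cur = y
            est[i] += total
    return [e / n_walks for e in est]
```

With the constant kernel K ≡ 1/2 and f ≡ 1, the exact solution is the constant u ≡ 2, which the estimator reproduces up to Monte Carlo error.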
MP Prototype Specification
(1997)