### Refine

#### Year of publication

- 1999 (73)

#### Document Type

- Article (73)

#### Keywords

- AG-RESY (6)
- HANDFLEX (5)
- PARO (5)
- Network Protocols (2)
- Requirements/Specifications (2)
- Wannier-Stark systems (2)
- entropy (2)
- localization (2)
- quantum mechanics (2)
- resonances (2)
- theorem proving (2)
- 90° orientation (1)
- Ablagestruktur (1)
- Access System (1)
- Ad-hoc workflow (1)
- Adaption (1)
- Banach lattice (1)
- Case Study Erfahrungsdatenbank (1)
- Case-Based Reasoning (1)
- Classification Tasks (1)
- Computer supported cooperative work (1)
- Curie temperature (1)
- Distributed Multimedia Applications (1)
- Experience Base (1)
- Experience Database (1)
- Experiment (1)
- Extensibility (1)
- Fallstu (1)
- Ferromagnetism (1)
- Filter-Diagonalization (1)
- Generic Methods (1)
- Global Software Highway (1)
- HTML (1)
- Hardy space (1)
- INRECA (1)
- Interacting Magnetic Dots and Wires (1)
- Java (1)
- Language Constructs (1)
- MHEG (1)
- Mn-Si-C alloy films (1)
- Object-Relational DataBase Management Systems (ORDBMS) (1)
- Object-Relational Database Systems (1)
- Partial functions (1)
- Pullen Edmonds system (1)
- Quality Improvement Paradigm (QIP) (1)
- Quantum Chaos (1)
- Quantum mechanics (1)
- Repositories (1)
- Reuse (1)
- SDL-pattern a (1)
- SKALP (1)
- SQUID magnetometry (1)
- Sandercock-type multipath tandem Fabry-Perot interferometer (1)
- Scalar type operator (1)
- Software Engineering (1)
- Software development environment (1)
- Stark systems (1)
- Structural Adaptation (1)
- Structure (1)
- Structuring Approach (1)
- Vector-valued holomorphic function (1)
- Wannier-Bloch states (1)
- World Wide Web (1)
- World-Wide Web (1)
- Zugriffstruktur (1)
- Zugriffsystem (1)
- abstract description (1)
- analogical reasoning (1)
- analogy (1)
- anisotropic coupling mechanism (1)
- bcc-Fe(001) (1)
- biquadratic interlayer coupling (1)
- brillouin light scattering (1)
- chaotic dynamics (1)
- completeness (1)
- complex energ (1)
- complex energy resonances (1)
- comprehensive reuse (1)
- computer control (1)
- conservative extension (1)
- consistency (1)
- critical thickness (1)
- cross-correlation (1)
- dependency management (1)
- deposition temperature (1)
- dipole-exchange surface (1)
- dynamical systems (1)
- epitaxial growth (1)
- evolutionary spectrum (1)
- exchange-coupled rare-earth (1)
- flexible workflows (1)
- frames (1)
- function of bounded variation (1)
- higher order logic (1)
- hybrid knowledge representation (1)
- layered magnetic systems (1)
- lifetime statistics (1)
- lifetimes (1)
- locally stationary process (1)
- magnetization reversal process (1)
- magneto-optical Kerr effect (1)
- many-valued logic (1)
- mathematical concept (1)
- metastable Pd(001) (1)
- minimax rate (1)
- morphism (1)
- motion planning (1)
- nonlinear wavelet thresholding (1)
- numerical computation (1)
- patterned magnetic permalloy films (1)
- phase space (1)
- phase-space (1)
- problem formulation (1)
- project coordination (1)
- proof plans (1)
- quantum mechanics (1)
- quasienergy (1)
- resolution (1)
- reuse repositories (1)
- search algorithms (1)
- second order logic (1)
- short-time periodogram (1)
- shortest sequence (1)
- sorted logic (1)
- soundness (1)
- spinwaves (1)
- structured permalloy films (1)
- tableau (1)
- tactics (1)
- temperature dependence (1)
- traceability (1)
- transition-metal (1)
- translation (1)
- traveling salesman problem (1)
- triple layer stacks (1)
- typical instance (1)
- vector measure (1)
- wall energy (1)
- wall thickness (1)

#### Abstracts

We present an entropy concept measuring quantum localization in dynamical systems based on time averaged probability densities. The suggested entropy concept is a generalization of a recently introduced [PRL 75, 326 (1995)] phase-space entropy to any representation chosen according to the system and the physical question under consideration. In this paper we inspect the main characteristics of the entropy and the relation to other measures of localization. In particular the classical correspondence is discussed and the statistical properties are evaluated within the framework of random vector theory. In this way we show that the suggested entropy is a suitable method to detect quantum localization phenomena in dynamical systems.
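As a minimal illustration of the kind of entropy measure the abstract describes, the sketch below computes a Shannon entropy of a time-averaged probability density over a chosen basis. The function name and the toy distributions are illustrative, not the paper's construction.

```python
import numpy as np

def localization_entropy(p):
    """Shannon entropy -sum p ln p of a normalized probability density.

    Toy stand-in for the representation-dependent localization entropy
    discussed in the abstract; `p` is a time-averaged probability
    distribution over some chosen basis.
    """
    p = np.asarray(p, dtype=float)
    p = p / p.sum()                      # normalize
    nz = p[p > 0]                        # avoid log(0)
    return float(-(nz * np.log(nz)).sum())

# A strongly localized state has low entropy ...
peaked = np.zeros(64)
peaked[10] = 1.0
# ... while a fully delocalized state saturates at ln(dimension).
uniform = np.full(64, 1.0 / 64)
```

The entropy ranges from 0 (all probability on one basis state) to ln N (uniform over N states), which is what makes it a convenient localization diagnostic to compare against random-vector statistics.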

The semantics of everyday language and the semantics of its naive translation into classical first-order language differ considerably. An important discrepancy addressed in this paper concerns the implicit assumptions about what exists. For instance, in the case of universal quantification, natural language uses restrictions and presupposes that these restrictions are non-empty, while in classical logic it is only assumed that the whole universe is non-empty. On the other hand, all constants mentioned in classical logic are presupposed to exist, while it poses no problems to speak about hypothetical objects in everyday language. These problems have been discussed in philosophical logic, and some adequate many-valued logics were developed to model these phenomena much better than classical first-order logic can. An adequate calculus, however, has not yet been given. Recent years have seen a thorough investigation of the framework of many-valued truth-functional logics. Unfortunately, restricted quantifications are not truth-functional, hence they do not fit the framework directly. We solve this problem by applying recent methods from sorted logics.
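The contrast between the two readings of restricted quantification can be made concrete in a few lines. This is only a toy three-valued evaluator, not the paper's sorted-logic calculus; the function names are hypothetical.

```python
def forall_restricted(domain, restriction, pred):
    """Three-valued restricted universal quantifier.

    The natural-language reading of 'every R is P' presupposes that some
    R exists; an empty restriction therefore yields 'undefined' rather
    than the classical vacuous truth.
    """
    restricted = [x for x in domain if restriction(x)]
    if not restricted:
        return "undefined"          # presupposition failure
    return all(pred(x) for x in restricted)

def forall_classical(domain, restriction, pred):
    """Naive first-order translation: forall x. R(x) -> P(x)."""
    return all((not restriction(x)) or pred(x) for x in domain)
```

With an empty restriction (say, "every number above 100 in range(10)"), the classical translation is vacuously true while the presuppositional reading is undefined, which is exactly the discrepancy described above.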

In the scalar case one knows that a complex normalized function of bounded variation \(\phi\) on \([0,1]\) defines a unique complex regular Borel measure \(\mu\) on \([0,1]\). In this note we show that this is no longer true in general in the vector-valued case, even if \(\phi\) is assumed to be continuous. Moreover, the functions \(\phi\) which determine a countably additive vector measure \(\mu\) are characterized.
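For orientation, the scalar correspondence that the note generalizes can be stated via the Riemann-Stieltjes integral (standard material, with the usual normalization that \(\phi(0)=0\) and \(\phi\) is right-continuous on \((0,1)\)):

```latex
% Scalar case recalled in the abstract: a normalized function of bounded
% variation \phi on [0,1] induces a unique regular Borel measure \mu via
\[
  \int_0^1 f \, d\mu \;=\; \int_0^1 f \, d\phi
  \qquad \text{for all } f \in C[0,1],
\]
% so that on half-open intervals \mu\bigl((s,t]\bigr) = \phi(t) - \phi(s).
% The note shows this one-to-one correspondence breaks down when \phi
% takes values in a Banach space.
```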

Even though it is not very often admitted, partial functions do play a significant role in many practical applications of deduction systems. Kleene already gave a semantic account of partial functions using a three-valued logic decades ago, but there has not been a satisfactory mechanization. Recent years have seen a thorough investigation of the framework of many-valued truth-functional logics. However, strong Kleene logic, where quantification is restricted and therefore not truth-functional, does not fit the framework directly. We solve this problem by applying recent methods from sorted logics. This paper presents a tableau calculus that combines the proper treatment of partial functions with the efficiency of sorted calculi.
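The propositional part of strong Kleene logic is easy to write down; the sketch below uses `None` for the third truth value "undefined", the value a formula takes when a partial function's result is undefined. This illustrates only the semantics, not the tableau calculus the paper contributes.

```python
# Strong Kleene three-valued connectives; None models 'undefined'.
def k_not(a):
    return None if a is None else (not a)

def k_and(a, b):
    if a is False or b is False:
        return False                # a definite False dominates 'undefined'
    if a is None or b is None:
        return None
    return True

def k_or(a, b):
    # Defined by De Morgan duality from k_and.
    return k_not(k_and(k_not(a), k_not(b)))
```

Note that `k_and(False, None)` is `False`: strong Kleene connectives return a classical value whenever the defined arguments already determine it, which is what distinguishes them from a naive "undefined is contagious" treatment.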

A novel method is presented which allows a fast computation of complex energy resonance states in Stark systems, i.e. systems in a homogeneous field. The technique is based on the truncation of a shift-operator in momentum space. Numerical results for space periodic and non-periodic systems illustrate the extreme simplicity of the method.
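The general pattern behind such computations, complex resonance energies as eigenvalues of a truncated, non-Hermitian matrix, can be sketched generically. The toy below uses a tilted (Stark) ladder with an absorbing boundary term standing in for the paper's shift-operator truncation, which is not reproduced here; all parameters are illustrative.

```python
import numpy as np

# Tilted ladder H_mn = F*m on the diagonal plus nearest-neighbour
# hopping t, truncated to N basis states. The term -i*eta on the last
# site makes the matrix non-Hermitian, so eigenvalues acquire negative
# imaginary parts that play the role of resonance widths.
N, F, t, eta = 40, 0.5, 1.0, 0.3
H = np.diag(F * np.arange(N)).astype(complex)
H += t * (np.eye(N, k=1) + np.eye(N, k=-1))   # hopping
H[-1, -1] -= 1j * eta                          # absorbing boundary

energies = np.linalg.eigvals(H)
# Resonances: eigenvalues with Im(E) < 0; Gamma = -2*Im(E) is the width.
widths = -2 * energies.imag
```

Since the trace of `H` has imaginary part `-eta`, the imaginary parts of the eigenvalues must sum to `-eta`, guaranteeing that some states acquire a finite width.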

INRECA offers tools and methods for developing, validating, and maintaining classification, diagnosis and decision support systems. INRECA's basic technologies are inductive and case-based reasoning [9]. INRECA fully integrates [2] both techniques within one environment and uses the respective advantages of both technologies. Its object-oriented representation language CASUEL [10, 3] allows the definition of complex case structures, relations, similarity measures, as well as background knowledge to be used for adaptation. The object-oriented representation language makes INRECA a domain-independent tool for its intended class of tasks. When problems are solved via case-based reasoning, the primary kind of knowledge that is used during problem solving is the very specific knowledge contained in the cases. However, in many situations this specific knowledge by itself is not sufficient or appropriate to cope with all requirements of an application. Very often, background knowledge is available and/or necessary to better explore and interpret the available cases [1]. Such general knowledge may state dependencies between certain case features and can be used to infer additional, previously unknown features from the known ones.
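The interplay of case retrieval and background knowledge can be sketched in a few lines. This is a toy, not CASUEL: the attribute names, the weights, and the rule are all hypothetical.

```python
# Toy case base: each case maps attributes to values plus a diagnosis.
CASES = [
    {"temp": "high", "pressure": "low",  "valve": "open",   "fault": "leak"},
    {"temp": "low",  "pressure": "high", "valve": "closed", "fault": "blockage"},
]

WEIGHTS = {"temp": 2.0, "pressure": 2.0, "valve": 1.0}

def complete(query):
    """Background-knowledge rule (hypothetical): a closed valve implies
    high pressure, so the feature can be inferred when missing."""
    if query.get("valve") == "closed" and "pressure" not in query:
        query = dict(query, pressure="high")
    return query

def similarity(query, case):
    """Weighted count of matching attributes."""
    return sum(w for a, w in WEIGHTS.items()
               if a in query and query[a] == case[a])

def retrieve(query):
    query = complete(query)          # infer missing features first
    return max(CASES, key=lambda c: similarity(query, c))
```

With the query `{"valve": "closed"}` alone, the inferred pressure feature raises the similarity to the blockage case, illustrating how general knowledge helps interpret sparse cases.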

In this paper we generalize the notion of method for proof planning. While we adopt the general structure of methods introduced by Alan Bundy, we make an essential advancement in that we strictly separate the declarative knowledge from the procedural knowledge. This change of paradigm not only leads to representations that are easier to understand, it also enables modeling the important activity of formulating meta-methods, that is, operators that adapt the declarative part of existing methods to suit novel situations. Thus this change of representation leads to a considerably strengthened planning mechanism. After presenting our declarative approach towards methods, we describe the basic proof planning process with them. Then we define the notion of meta-method, provide an overview of practical examples, and illustrate how meta-methods can be integrated into the planning process.
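The point of the declarative/procedural split can be shown in miniature: when a method's declarative part is plain data, a meta-method is just a program that rewrites that data. The field names and the `generalize` operation below are illustrative, not the paper's actual method language.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Method:
    name: str
    preconditions: tuple   # declarative: goal patterns the method needs
    conclusion: str        # declarative: schema of what it proves
    tactic: str            # procedural: name of the code executing it

def generalize(method, old, new):
    """Meta-method sketch: abstract a symbol out of the declarative part.

    Only the data is rewritten; the procedural tactic stays untouched,
    which is what the strict separation makes possible.
    """
    return replace(
        method,
        name=method.name + "-gen",
        preconditions=tuple(p.replace(old, new) for p in method.preconditions),
        conclusion=method.conclusion.replace(old, new),
    )

nat_ind = Method("nat-induction",
                 ("P(0)", "P(n) => P(s(n))"),
                 "forall n. P(n)",
                 "induct_tac")
list_ind = generalize(nat_ind, "n", "xs")  # hypothetical reformulation
```

Because the tactic is referenced by name rather than embedded, the derived method reuses the original procedure while its declarative schema is adapted to the new situation.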

Extending the plan-based paradigm for automated theorem proving, we developed in previous work a declarative approach towards representing methods in a proof planning framework to support their mechanical modification. This paper presents a detailed study of a class of particular methods, embodying variations of a mathematical technique called diagonalization. The purpose of this paper is mainly twofold. First we demonstrate that typical mathematical methods can be represented in our framework in a natural way. Second we illustrate our philosophy of proof planning: besides planning with a fixed repertoire of methods, meta-methods create new methods by modifying existing ones. With the help of three different diagonalization problems we present an example trace protocol of the evolution of methods: an initial method is extracted from a particular successful proof. This initial method is then reformulated for the subsequent problems, and more general methods can be obtained by abstracting existing methods. Finally we come up with a fairly abstract method capable of dealing with all three problems, since it captures the very key idea of diagonalization.
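The "very key idea" the final abstract method captures is Cantor's diagonal construction: given any enumeration of 0/1 sequences, flipping the n-th digit of the n-th sequence yields a sequence the enumeration misses. A finite-prefix toy version:

```python
def diagonalize(table):
    """table[i][j] = j-th digit of the i-th enumerated 0/1 sequence.

    Returns a sequence differing from row i at position i, hence absent
    from the enumeration (restricted here to finite prefixes).
    """
    return [1 - table[i][i] for i in range(len(table))]

table = [
    [0, 1, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
    [1, 0, 1, 1],
]
diag = diagonalize(table)  # differs from row i at position i
```

Each of the three diagonalization problems mentioned above instantiates this same scheme over a different carrier (sets, functions, sequences), which is why a single abstract method can cover them all.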

The development of complex software systems is driven by many diverse and sometimes contradictory requirements such as correctness and maintainability of resulting products, development costs, and time-to-market. To alleviate these difficulties, we propose a development method for distributed systems that integrates different basic approaches. First, it combines the use of the formal description technique SDL with software reuse concepts. This results in the definition of a use-case driven, incremental development method with SDL-patterns as the main reusable artifacts. Experience with this approach has shown that there are several other factors of influence, such as the quality of reuse artifacts or the experience of the development team. Therefore, we further combined our SDL-pattern approach with an improvement methodology known from the area of experimental software engineering. In order to demonstrate the validity of this integrating approach, we sketch some representative outcomes of a case study.