Refine
Year of publication
- 1999 (351)
Document Type
- Preprint (351)
Language
- English (351)
Keywords
- Case-Based Reasoning (5)
- Location Theory (5)
- case-based problem solving (5)
- Abstraction (4)
- Knowledge Acquisition (4)
- Fallbasiertes Schließen (3)
- Internet (3)
- Knowledge acquisition (3)
- Multicriteria Optimization (3)
- case-based reasoning (3)
- distributed software development (3)
- distributed software development process (3)
- explanation-based learning (3)
- problem solving (3)
- resolution (3)
- Algebraic Optimization (2)
- Brillouin light scattering spectroscopy (2)
- CAPlan (2)
- Combinatorial Optimization (2)
- Deduction (2)
- Fallbasiertes Schliessen (2)
- Geometrical Algorithms (2)
- Kinetic Schemes (2)
- MOLTKE-Projekt (2)
- SDL (2)
- Wissensakquisition (2)
- application (2)
- average density (2)
- building automation (2)
- case based reasoning (2)
- design patterns (2)
- formal specification (2)
- incompressible Navier-Stokes equations (2)
- lattice Boltzmann method (2)
- learning system (2)
- low Mach number limit (2)
- requirements engineering (2)
- reuse (2)
- spin wave quantization (2)
- Abelian groups (1)
- Agents (1)
- Algebraic optimization (1)
- Analysis (1)
- Analytic semigroup (1)
- Applications (1)
- Approximation (1)
- Approximation Algorithms (1)
- Automated Reasoning (1)
- Automated theorem proving (1)
- Autonomous mobile robots (1)
- Autoregression (1)
- Bayes risk (1)
- Bisector (1)
- Blackboard architecture (1)
- Brillouin light scattering (1)
- Brownian motion (1)
- CNC-Maschine (1)
- COMOKIT (1)
- Case Study (1)
- Case-based problem solving (1)
- Causal Ordering (1)
- Causality (1)
- Chapman Enskog distributions (1)
- Chorin's projection scheme (1)
- Classification (1)
- CoMo-Kit (1)
- Collocation Method plus (1)
- Complexity and performance of numerical algorithms (1)
- Computational Fluid Dynamics (1)
- Computer Assisted Tomograp (1)
- Concept mapping (1)
- Concept maps (1)
- Constraint Graphs (1)
- Contract net (1)
- Control Design Styles (1)
- Convexity (1)
- Cooperative decision making (1)
- Correlation (1)
- Cosine function (1)
- Coxeter groups (1)
- Crofton's intersection formulae (1)
- Damon-Eshbach spin wave modes (1)
- Decision Making (1)
- Declarative and Procedural Knowledge (1)
- Design Patterns (1)
- Design Styles (1)
- Diagnosesystem (1)
- Difference Reduction (1)
- Differential Cross-Sections (1)
- Discrete decision problems (1)
- Discrete velocity models (1)
- Distributed Computation (1)
- Distributed Deb (1)
- Distributed Software Development (1)
- Distributed Software Development Projects (1)
- Distributed System (1)
- Distributed software development support (1)
- Distributed systems (1)
- EBG (1)
- Ecommerce (1)
- Elastic properties (1)
- Equality reasoning (1)
- Equational Reasoning (1)
- Experimental Data (1)
- Feature Technology (1)
- Forbidden Regions (1)
- Fredholm integral equation of the second kind (1)
- Fuzzy Programming (1)
- GPS-satellite-to-satellite tracking (1)
- Global Optimization (1)
- Global Predicate Detection (1)
- Global optimization (1)
- HOT (1)
- HTE (1)
- Hadwiger's recursive definition of the Euler number (1)
- Hamiltonian groups (1)
- Helmholtz decomposition (1)
- High frequency switching (1)
- Homogeneous Relaxation (1)
- Ill-posed Problems (1)
- Improperly posed problems (1)
- Impulse control (1)
- Intelligent Agents (1)
- Intelligent agents (1)
- Interleaved Planning (1)
- Iterative Methods (1)
- Jeffreys' prior (1)
- Kinetic Schemes (1)
- Kinetic theory (1)
- Knuth-Bendix completion algorithm (1)
- Kullback Leibler distance (1)
- Lagrangian Functions (1)
- Lattice Boltzmann Method (1)
- Lattice Boltzmann methods (1)
- Lexicographic Order (1)
- Lexicographic max-ordering (1)
- Linear membership function (1)
- Local completeness (1)
- Location theory (1)
- Logic Design (1)
- Logical Time (1)
- MOO (1)
- Map Building (1)
- Markov process (1)
- Maturity of Software Engineering (1)
- Max-Ordering (1)
- Mechanical Engineering (1)
- Methods (1)
- Mie representation (1)
- Minkowski space (1)
- Multicriteria Location (1)
- Multicriteria optimization (1)
- Multiple Criteria (1)
- Multiple Objective Programs (1)
- NP-completeness (1)
- Navier-Stokes equations (1)
- Nonstationary processes (1)
- Numerical Simulation (1)
- Open-Source (1)
- PATDEX (1)
- Palm distribution (1)
- Palm distributions (1)
- Pareto Optimality (1)
- Pareto Points (1)
- Partial functions (1)
- Planning and Verification (1)
- Polynomial Eigenfunctions (1)
- Position- and Orientation Estimation (1)
- Potential transform (1)
- Problem Solvers (1)
- Process Management (1)
- Process support (1)
- Produktionsdesign (1)
- Project Management (1)
- Random Errors (1)
- Rarefied Polyatomic Gases (1)
- Rectifiability (1)
- Requirements engineering (1)
- Requirements/Specifications (1)
- Resonant tunneling diode (1)
- Saddle Points (1)
- Self-Referencing (1)
- Semantics of Programming Languages (1)
- Shannon capacity (1)
- Shannon optimal priors (1)
- Shock Wave Problem (1)
- Similarity Assessment (1)
- Smalltalk (1)
- Software Agents (1)
- Software Engineering (1)
- Software development (1)
- Software engineering (1)
- Spatial Binary Images (1)
- Spectral Analysis (1)
- Square-mean Convergence (1)
- Stoner-like magnetic particles (1)
- Tactics (1)
- Term rewriting systems (1)
- Theorem of Plemelj-Privalov (1)
- Topology Preserving Networks (1)
- Translation planes (1)
- Triangular fuzzy number (1)
- Vector Time (1)
- Vector optimization (1)
- Virtual Corporation (1)
- Virtual Software Projects (1)
- Voronoi diagram (1)
- Wide Area Multimedia Group Interaction (1)
- Wissenserwerb (1)
- Word problem (1)
- Workflow Replication (1)
- World Wide Web (1)
- adaption (1)
- analogy (1)
- anisotropic coupling between magnetic i (1)
- approximation methods (1)
- arbitrary function (1)
- arrays of magnetic dots and wires (1)
- artificial intelligence (1)
- assembly sequence design (1)
- automated code generation (1)
- automated computer learning (1)
- automated synchronization (1)
- autonomes Lernen (1)
- autonomous learning (1)
- average densities (1)
- bicriterion path problems (1)
- bipolar quantum drift diffusion model (1)
- bootstrap (1)
- business process modelling (1)
- cancer (1)
- case-based planner (1)
- case-based planning (1)
- cash management (1)
- center hyperplane (1)
- centrally symmetric polytope (1)
- co-learning (1)
- combined systems with sha (1)
- common transversal (1)
- communication architectures (1)
- communication protocols (1)
- communication subsystem (1)
- compilation (1)
- complete presentations (1)
- compressible Navier Stokes equations (1)
- computer aided planning (1)
- computer-supported cooperative work (1)
- concept representation (1)
- conceptual representation (1)
- concurrent software (1)
- confluence (1)
- conservative extension (1)
- consistency (1)
- constraint satisfaction problem (CSP) (1)
- continuous media (1)
- convex distance function (1)
- convex operator (1)
- cooperative problem solving (1)
- customization of communication protocols (1)
- decision support (1)
- decrease direction (1)
- deficiency (1)
- density distribution (1)
- design processes (1)
- diagnostic problems (1)
- direct product (1)
- directional derivative (1)
- discrete element method (1)
- discrete equilibrium distributions (1)
- discrete velocity models (1)
- discretization (1)
- disjoint union (1)
- distributed (1)
- distributed c (1)
- distributed deduction (1)
- distributed document management (1)
- distributed enterprise (1)
- distributed groupware environment (1)
- distributed multi-platform software development (1)
- distributed multi-platform software development projects (1)
- distributed software configuration management (1)
- distributed software development tools (1)
- enhanced coercivity (1)
- exchange coupling (1)
- exchange rate (1)
- exchange-bias bilayer Fe/MnPd (1)
- experience base (1)
- experimental software engineering (1)
- exponential rate (1)
- f-dissimilarity (1)
- fallbasiertes Schliessen (1)
- fallbasiertes planen (1)
- final prediction error (1)
- finite difference method (1)
- formal description techniques (1)
- formal reasoning (1)
- formulation as integral equation (1)
- frames (1)
- frequency splitting betwe (1)
- gauge (1)
- general multidimensional moment problem (1)
- generalized Gummel itera (1)
- generic design of a customized communication subsystem (1)
- geodetic (1)
- geomagnetic field modelling from MAGSAT data (1)
- geometric measure theory (1)
- geometrical algorithms (1)
- geopotential determination (1)
- global optimization (1)
- goal oriented completion (1)
- granular flow (1)
- growth optimal portfolios (1)
- harmonic WFT (1)
- head-on collisions (1)
- heterogeneous large-scale distributed DBMS (1)
- high-level caching of potentially shared networked documents (1)
- higher-order anisotropies (1)
- higher-order tableaux calculus (1)
- higher-order theorem prover (1)
- hyperbolic systems of conservation laws (1)
- hyperplane transversal (1)
- industrial supervision (1)
- inelastic light scattering (1)
- information (1)
- information systems engineering (1)
- innermost termination (1)
- instanton method (1)
- intelligent agents (1)
- interest oriented portfolios (1)
- internal approximation (1)
- internet event synchronizer (1)
- intersection local time (1)
- inverse Fourier transform (1)
- inverse mathematical models (1)
- isochronous streams (1)
- knowledge space (1)
- lacunarity distribution (1)
- large deviations (1)
- learning (1)
- level splitting (1)
- linked abstraction workflows (1)
- locally maximal clone (1)
- location (1)
- location problem (1)
- logarithmic average (1)
- logarithmic utility (1)
- macroscopic quantum coherence (1)
- magnetic Ni80Fe20 wires (1)
- magnetostatic surface spin waves (1)
- many-valued logic (1)
- martingale measu (1)
- maximum-entropy (1)
- middleware (1)
- minimal paths (1)
- minimax risk (1)
- mobile agents (1)
- mobile agents approach (1)
- modelling time (1)
- modularity (1)
- moment realizability (1)
- monitoring and managing distributed development processes (1)
- multi-agent architecture (1)
- multicriteria minimal path problem is presented (1)
- multidimensional Kohonen algorithm (1)
- multimedia (1)
- multiple objective linear programming problem (1)
- multiple-view product modeling (1)
- multiresolution analysis (1)
- narrowing (1)
- negotiation (1)
- neural networks (1)
- non-convex optimization (1)
- noninformative prior (1)
- nonlinear thresholding (1)
- norm (1)
- normal cone (1)
- numeraire portfolios (1)
- object frameworks (1)
- order selection (1)
- order-sorted logic (1)
- order-two densities (1)
- order-two density (1)
- ovoids (1)
- paramodulation (1)
- plan enactment (1)
- polyhedral norm (1)
- portfolio optimisation (1)
- preservation of relations (1)
- problem formulation (1)
- problem solvers (1)
- process model (1)
- process modelling (1)
- process support system (PROSYT) (1)
- process-centred environments (1)
- profiles (1)
- programmable client-server systems (1)
- projected quasi-gradient method (1)
- protocol (1)
- pseudo-compressibility method (1)
- quadratic forms (1)
- quasi-one-dimensional spin wave envelope solitons (1)
- radiation therapy (1)
- rate control (1)
- reactive systems (1)
- real time (1)
- real-time (1)
- real-time temporal logic (1)
- receptive safety properties (1)
- reference prior (1)
- regularization by wavelets (1)
- rela (1)
- reliability (1)
- requirements (1)
- robustness (1)
- scaled translates (1)
- shape anisotropies (1)
- shear flow (1)
- short magnetic field pulses (1)
- similarity measure (1)
- single domain uniaxial magnetic particles (1)
- software agents (1)
- software project (1)
- software project management (1)
- spin wave excitations (1)
- squares (1)
- statistical experiment (1)
- stochastic stability (1)
- switching properties (1)
- system behaviour (1)
- tactics (1)
- tangent measure distributions (1)
- temporal logic (1)
- termination (1)
- theorem prover (1)
- theorem proving (1)
- thin h-BN films (1)
- time series (1)
- time-varying autoregression (1)
- topology preserving maps (1)
- transition rates (1)
- transverse bias field (1)
- treatment planning (1)
- trial systems (1)
- two-dimensional self-focused spin wave packets (1)
- typical examples (1)
- uncertainty principle (1)
- uniform ergodicity (1)
- uniqueness (1)
- value preserving portfolios (1)
- vector wavelets (1)
- virtual market place (1)
- viscosity solutions (1)
- visual process modelling environment (1)
- wavelet estimators (1)
- wavelet transform (1)
- weak termination (1)
- windowed Fourier transform (1)
- work coordination (1)
- yttrium-iron garnet (YIG) fi (1)
Contrary to symbolic learning approaches, which represent a learned concept explicitly, case-based approaches describe concepts implicitly by a pair (CB, sim), i.e. by a measure of similarity sim and a set CB of cases. This poses the question of whether there are any differences concerning the learning power of the two approaches. In this article we study the relationship between the case base, the measure of similarity, and the target concept of the learning process. To do so, we transform a simple symbolic learning algorithm (the version space algorithm) into an equivalent case-based variant. The results strengthen the hypothesis that the learning power of symbolic and case-based methods is equivalent, and show the interdependency between the measure used by a case-based algorithm and the target concept.
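To make the implicit (CB, sim) representation concrete, the following minimal Python sketch (not from the paper) decides concept membership by the most similar stored case; the case base, the similarity measure, and the test points are illustrative assumptions.

# Minimal sketch: a concept represented implicitly by a pair (CB, sim).
# The cases, the similarity measure, and the decision rule are illustrative only.

def sim(x, y):
    """Similarity between two feature vectors (here: inverse L1 distance)."""
    return 1.0 / (1.0 + sum(abs(a - b) for a, b in zip(x, y)))

# Case base: (feature vector, class label) pairs.
CB = [((0.0, 0.0), True), ((0.2, 0.1), True), ((1.0, 1.0), False)]

def classify(x):
    """Predict membership in the target concept via the most similar case."""
    best_case, best_label = max(CB, key=lambda c: sim(x, c[0]))
    return best_label

print(classify((0.1, 0.0)))  # -> True  (close to the positive cases)
print(classify((0.9, 1.1)))  # -> False (close to the negative case)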
A large set of criteria for evaluating formal methods for reactive systems is presented. To make this set more comprehensible, it is structured according to a Concept-Model of formal methods. We make clear that the catalogue has to be made more specific before it can be applied, and we explain some of the steps needed to do so. As an example, the catalogue is applied, within the application domain of building automation systems, to three different formal methods: SDL, statecharts, and a temporal logic.
In this paper we give the definition of a solution concept in multicriteria combinatorial optimization. We show how Pareto, max-ordering, and lexicographically optimal solutions can be incorporated in this framework. Furthermore, we state some properties of lexicographic max-ordering solutions, which combine features of these three kinds of optimal solutions. Two of these properties, which are desirable from a decision maker's point of view, are satisfied if and only if the solution concept is that of lexicographic max-ordering.
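As a small illustration of the lexicographic max-ordering idea (a sketch, not the paper's formal definition), the following Python fragment compares objective vectors of a minimization problem by sorting each vector in non-increasing order and then comparing lexicographically; the solution set and objective values are hypothetical.

# Sketch of the lexicographic max-ordering comparison (minimization setting):
# sort each objective vector in non-increasing order, then compare the sorted
# vectors lexicographically.  Example data is illustrative.

def lex_max_ordering_key(objectives):
    """Key under which smaller means 'better' in lex max-ordering."""
    return tuple(sorted(objectives, reverse=True))

solutions = {            # hypothetical feasible solutions -> objective vectors
    "a": (4, 1, 3),
    "b": (3, 3, 2),
    "c": (4, 2, 1),
}

best = min(solutions, key=lambda s: lex_max_ordering_key(solutions[s]))
print(best)  # -> 'b': sorted vectors are a:(4,3,1), b:(3,3,2), c:(4,2,1)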
A new approach for modelling time that does not rely on the concept of a clock is proposed. In order to establish a notion of time, system behaviour is represented as a joint progression of multiple threads of control, which satisfies a certain set of axioms. We show that the clock-independent time model is related to the well-known concept of a global clock and argue that both approaches establish the same notion of time.
Coloring terms (rippling) is a technique developed for inductive theorem proving which uses syntactic differences of terms to guide the proof search. Annotations (colors) to terms are used to maintain this information. This technique has several advantages, e.g. it is highly goal oriented and involves little search. In this paper we give a general formalization of coloring terms in a higher-order setting. We introduce a simply-typed lambda calculus with color annotations and present an appropriate (pre-)unification algorithm. Our work is a formal basis to the implementation of rippling in a higher-order setting which is required e.g. in case of middle-out reasoning. Another application is in the construction of natural language semantics, where the color annotations rule out linguistically invalid readings that are possible using standard higher-order unification.
This paper develops a sound and complete transformation-based algorithm for unification in an extensional order-sorted combinatory logic supporting constant overloading and a higher-order sort concept. Appropriate notions of order-sorted weak equality and extensionality - reflecting order-sorted βη-equality in the corresponding lambda calculus given by Johann and Kohlhase - are defined, and the typed combinator-based higher-order unification techniques of Dougherty are modified to accommodate unification with respect to the theory they generate. The algorithm presented here can thus be viewed as a combinatory logic counterpart to that of Johann and Kohlhase, as well as a refinement of that of Dougherty, and provides evidence that combinatory logic is well-suited to serve as a framework for incorporating order-sorted higher-order reasoning into deduction systems aiming to capitalize on both the expressiveness of extensional higher-order logic and the efficiency of order-sorted calculi.
This paper describes a system that supports software development processes in virtual software corporations. A virtual software corporation consists of a set of enterprises that cooperate in projects to fulfill customer needs. Contracts are negotiated throughout the whole lifecycle of a software development project, and these negotiations strongly influence the performance of a company. It is therefore useful to support negotiations and planning decisions with software agents. Our approach integrates software agent approaches for negotiation support with flexible multi-server workflow engines.
The concept of the Virtual Software Corporation ( VSC) has recently become a practical reality as a result of advances in communication and distributed technologies. However, there are significant difficulties with the management of the software development process within a VSC. The main problem is the significantly increased communicational complexity of the process model for such developments. The more classic managerial hierarchy is generally replaced by a "flatter" network of commitments. Therefore new solution approaches are required to provide the necessary process support. The purpose of this paper is to present a solution approach which models the process based on deontic logic. The approach has been validated against a case study where it was used to model commitments and inter-human communications within the software development process of a VSC. The use of the formalism is exemplified through a prototype system using a layered multi-agent architecture.
Many discrepancy principles are known for choosing the parameter \(\alpha\) in the regularized operator equation \((T^*T+ \alpha I)x_\alpha^\delta = T^*y^\delta\), \(||y-y^\delta||\leq \delta\), in order to approximate the minimal norm least-squares solution of the operator equation \(Tx=y\). In this paper we consider a class of discrepancy principles for choosing the regularization parameter when \(T^*T\) and \(T^*y^\delta\) are approximated by \(A_n\) and \(z_n^\delta\) respectively, with \(A_n\) not necessarily self-adjoint. This procedure generalizes the work of Engl and Neubauer (1985), and particular cases of the results are applicable to the regularized projection method as well as to a degenerate kernel method considered by Groetsch (1990).
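A minimal numerical sketch of a Morozov-type discrepancy principle for Tikhonov regularization (not the specific class of principles analysed in the paper): the parameter alpha is decreased until the residual drops below tau times the noise level. The test matrix, noise level, and parameters are synthetic assumptions.

# Sketch: choose alpha in (T^T T + alpha I) x_alpha = T^T y_delta by decreasing
# alpha until ||T x_alpha - y_delta|| <= tau * delta.  Problem data is synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 50
T = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])  # ill-conditioned Hilbert matrix
x_true = np.ones(n)
delta = 1e-3
y_delta = T @ x_true + delta * rng.standard_normal(n) / np.sqrt(n)

def tikhonov(alpha):
    return np.linalg.solve(T.T @ T + alpha * np.eye(n), T.T @ y_delta)

tau, alpha = 1.5, 1.0
while alpha > 1e-14:
    x_alpha = tikhonov(alpha)
    if np.linalg.norm(T @ x_alpha - y_delta) <= tau * delta:
        break
    alpha *= 0.5

print(f"chosen alpha = {alpha:.2e}, reconstruction error = {np.linalg.norm(x_alpha - x_true):.3f}")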
This paper investigates the suitability of the mobile agents approach to the problem of integrating a collection of local DBMS into a single heterogeneous large-scale distributed DBMS. The paper proposes a model of distributed transactions as a set of mobile agents and presents the relevant execution semantics. In addition, the mechanisms which are needed to guarantee the ACID properties in the considered environment are discussed.
Compared to conventional techniques in computational fluid dynamics, the lattice Boltzmann method (LBM) seems to be a completely different approach to solve the incompressible Navier-Stokes equations. The aim of this article is to correct this impression by showing the close relation of LBM to two standard methods: relaxation schemes and explicit finite difference discretizations. As a side effect, new starting points for a discretization of the incompressible Navier-Stokes equations are obtained.
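The connection stated above can be made visible with the standard BGK-type lattice Boltzmann update (an illustrative form, not quoted from the paper): a relaxation towards a local equilibrium combined with exact advection along the lattice velocities, i.e. an explicit finite-difference-like stencil.

% Standard BGK-type lattice Boltzmann update (illustrative, not quoted from the paper):
% a relaxation step towards a local equilibrium combined with exact advection along
% the lattice velocities c_i, which reads like an explicit finite-difference stencil.
\[
  f_i(x + c_i\,\Delta t,\; t + \Delta t)
  \;=\; f_i(x,t) \;-\; \frac{\Delta t}{\tau}\Bigl(f_i(x,t) - f_i^{\mathrm{eq}}(\rho,u)\Bigr),
  \qquad
  \rho = \sum_i f_i, \quad \rho u = \sum_i c_i f_i .
\]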
Information technology support for complex, dynamic, and distributed business processes as they occur in engineering domains requires an advanced process management system which enhances currently available workflow management services with respect to integration, flexibility, and adaptation. We present a uniform and flexible framework for advanced process management on an abstract level which uses and adapts agent technology from distributed artificial intelligence for both modelling and enacting processes. We identify two different frameworks for applying agent technology to process management: first, as a multi-agent system with the domain of process management; second, as a key infrastructure technology for building a process management system. We then follow the latter approach and introduce different agent types for managing activities, products, and resources which capture specific views on the process.
In continuous location problems we are given a set of existing facilities and we are looking for the location of one or several new facilities. In the classical approaches, weights are assigned to the existing facilities, expressing the importance of the new facilities for the existing ones. In this paper, we consider a pointwise defined objective function where the weights assigned to the existing facilities depend on the location of the new facility. This approach is shown to be a generalization of the median, center and centdian objective functions. In addition, this approach allows us to formulate completely new location models. Efficient algorithms as well as structural results for this algebraic approach to location problems are presented. Extensions to the multifacility and restricted cases are also considered.
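One concrete instance of such a pointwise-defined objective, used here purely as an illustration (the "ordered median" form, in which the weight applied to an existing facility depends on the rank of its distance from the new facility): choosing the rank weights recovers the median, center, and centdian objectives. The facilities, candidate location, and distance are hypothetical.

# Sketch: an ordered-median-type objective; the weight applied to each existing
# facility depends on the new facility's position via the rank of the distances.
# Rank weights lam recover classical objectives:
#   median:   lam = (1, 1, ..., 1)
#   center:   lam = (0, ..., 0, 1)
#   centdian: a convex combination of the two.

def ordered_objective(x, facilities, lam):
    dists = sorted(abs(x[0] - a) + abs(x[1] - b) for a, b in facilities)  # l1 distances, ascending
    return sum(l * d for l, d in zip(lam, dists))

facilities = [(0, 0), (4, 0), (1, 3)]               # hypothetical existing facilities
x = (1, 1)                                          # candidate new location
print(ordered_objective(x, facilities, (1, 1, 1)))  # median-type value: 8
print(ordered_objective(x, facilities, (0, 0, 1)))  # center-type value (max distance): 4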
This paper describes the architecture and concept of operation of a Framework for Adaptive Process Modeling and Execution (FAME). The research addresses the absence of robust methods for supporting the software process management life cycle. FAME employs a novel, model-based approach in providing automated support for different activities in the software development life cycle including project definition, process design, process analysis, process enactment, process execution status monitoring, and execution status-triggered process redesign. FAME applications extend beyond the software development domain to areas such as agile manufacturing, project management, logistics planning, and business process reengineering.
Facility Location Problems are concerned with the optimal location of one or several new facilities, with respect to a set of existing ones. The objectives involve the distance between new and existing facilities, usually a weighted sum or weighted maximum. Since the various stakeholders (decision makers) will have different opinions of the importance of the existing facilities, a multicriteria problem with several sets of weights, and thus several objectives, arises. In our approach, we assume the decision makers to make only fuzzy comparisons of the different existing facilities. A geometric mean method is used to obtain the fuzzy weights for each facility and each decision maker. The resulting multicriteria facility location problem is solved using fuzzy techniques again. We prove that the final compromise solution is weakly Pareto optimal and Pareto optimal, if it is unique, or under certain assumptions on the estimates of the Nadir point. A numerical example is considered to illustrate the methodology.
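The geometric-mean weighting step can be sketched as follows for a single decision maker; the paper works with fuzzy pairwise comparisons, whereas this illustration uses crisp numbers and an invented comparison matrix.

# Sketch of the geometric-mean weighting step for one decision maker:
# from a (here crisp, in the paper fuzzy) pairwise comparison matrix of the
# existing facilities, each weight is the normalized geometric mean of a row.
import math

# A[i][j] ~ how much more important facility i is than facility j (illustrative)
A = [
    [1.0, 3.0, 0.5],
    [1/3, 1.0, 0.25],
    [2.0, 4.0, 1.0],
]

geo_means = [math.prod(row) ** (1.0 / len(row)) for row in A]
total = sum(geo_means)
weights = [g / total for g in geo_means]
print([round(w, 3) for w in weights])   # one weight per existing facility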
A General Hilbert Space Approach to Wavelets and Its Application in Geopotential Determination
(1999)
A general approach to wavelets is presented within the framework of a separable functional Hilbert space H. The basic tool is the construction of H-product kernels by use of Fourier analysis with respect to an orthonormal basis in H. Scaling function and wavelet are defined in terms of H-product kernels. Wavelets are shown to be 'building blocks' that decorrelate the data. A pyramid scheme provides fast computation. Finally, the determination of the earth's gravitational potential from single and multipole expressions is organized as an example of wavelet approximation in Hilbert space structure.
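One possible way to make this construction concrete (the notation is illustrative and not quoted from the paper): product kernels are built from an orthonormal basis {U_n} of H, a scaling function is fixed by its symbol, and the wavelet symbol fills the gap between two consecutive scales.

% Sketch of the kind of construction described above (notation illustrative):
% product kernels from an orthonormal basis {U_n} of H, a scaling function with
% symbol phi_J(n), and a wavelet symbol bridging consecutive scales.
\[
  \Gamma(x,y) \;=\; \sum_{n} \Gamma^{\wedge}(n)\, U_n(x)\, U_n(y), \qquad
  \Phi_J^{\wedge}(n) = \varphi_J(n), \qquad
  \bigl(\Psi_J^{\wedge}(n)\bigr)^2 = \varphi_{J+1}(n)^2 - \varphi_J(n)^2 .
\]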
In this paper we consider the problem of optimizing a piecewise-linear objective function over a non-convex domain. In particular, we do not allow the solution to lie in the interior of a prespecified region R. We discuss the geometrical properties of this problem and present algorithms based on combinatorial arguments. In addition, we show how we can construct sets R of quite complicated shape while maintaining the combinatorial properties.
We provide an overview of UNICOM, an inductive theorem prover for equational logic which is based on refined rewriting and completion techniques. The architecture of the system as well as its functionality are described. Moreover, an insight into the most important aspects of the internal proof process is provided. This knowledge about how the central inductive proof component of the system essentially works is crucial for human users who want to solve non-trivial proof tasks with UNICOM and thoroughly analyse potential failures. The presentation is focussed on practical aspects of understanding and using UNICOM. A brief but complete description of the command interface, an installation guide, an example session, a detailed extended example illustrating various special features and a collection of successfully handled examples are also included.
Cooperative decision making involves a continuous process of assessing the validity of data, information and knowledge acquired and inferred by colleagues; that is, the shared knowledge space must be transparent. The ACCORD methodology provides an interpretation framework for the mapping of domain facts - constituting the world model of the expert - onto conceptual models, which can be expressed in formal representations. The ACCORD-BPM framework allows a stepwise and non-arbitrary reconstruction of the problem solving competence of BPM experts as a prerequisite for an appropriate architecture of both BPM knowledge bases and the BPM "reasoning device".
A map for an autonomous mobile robot (AMR) in an indoor environment, built for the purpose of continuous position and orientation estimation, is discussed. Unlike many other approaches, this map is not based on geometrical primitives like lines and polygons. An algorithm is shown where the sensor data of a laser range finder can be used to establish this map without a geometrical interpretation of the data. This is done by converting single laser radar scans to statistical representations of the environment, so that a cross-correlation of an actual converted scan and this representation yields the actual position and orientation in a global coordinate system. The map itself is built of representative scans for the positions where the AMR has been, so that the robot is able to find its position and orientation by comparing the actual scan with a scan stored in the map.
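A toy illustration of the correlation idea (not the paper's actual statistical representation): a scan is reduced to an angle histogram, and a rotation offset is recovered as the circular shift that maximizes the cross-correlation. The discretization and the synthetic data are assumptions.

# Toy illustration: represent a range scan by an angle histogram and recover a
# rotation offset as the shift maximizing the circular cross-correlation.
import numpy as np

def angle_histogram(angles, bins=72):
    hist, _ = np.histogram(np.mod(angles, 2 * np.pi), bins=bins, range=(0, 2 * np.pi))
    return hist.astype(float)

def estimate_rotation(scan_a, scan_b, bins=72):
    ha, hb = angle_histogram(scan_a, bins), angle_histogram(scan_b, bins)
    scores = [np.dot(ha, np.roll(hb, s)) for s in range(bins)]
    return (2 * np.pi / bins) * int(np.argmax(scores))

rng = np.random.default_rng(1)
scan = rng.uniform(0, 2 * np.pi, 500)        # synthetic bearing angles of one scan
rotated = scan + np.deg2rad(35)              # the same scene seen 35 degrees rotated
print(np.rad2deg(estimate_rotation(rotated, scan)))   # ~35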
We present a mathematical knowledge base containing the factual knowledge of the first of three parts of a textbook on semi-groups and automata, namely "P. Deussen: Halbgruppen und Automaten". Like almost all mathematical textbooks, this textbook is not self-contained; there are some algebraic and set-theoretical concepts that are not explained. These concepts are added to the knowledge base. Furthermore there is knowledge about the natural numbers, which is formalized following the first paragraph of "E. Landau: Grundlagen der Analysis". The knowledge base is written in a sorted higher-order logic, a variant of POST, the working language of the proof development environment Omega-MKRP. We distinguish three different types of knowledge: axioms, definitions, and theorems. Up to now, there are only 2 axioms (natural numbers and cardinality), 149 definitions (like that of a semi-group), and 165 theorems. The consistency of such knowledge bases cannot be proved in general, but inconsistencies may be imported only by the axioms. Definitions and theorems should not lead to any inconsistency, since definitions form conservative extensions and theorems are proved to be consequences.
This paper addresses a model of analogy-driven theorem proving that is more general and cognitively more adequate than previous approaches. The model works at the level of proof-plans. More precisely, we consider analogy as a control strategy in proof planning that employs a source proof-plan to guide the construction of a proof-plan for the target problem. Our approach includes a reformulation of the source proof-plan. This is in accordance with the well-known fact that constructing an analogy in maths often amounts to first finding the appropriate representation which brings out the similarity of two problems, i.e., finding the right concepts and the right level of abstraction. Several well-known theorems were processed by our analogy-driven proof-plan construction that could not be proven analogically by previous approaches.
The relation between the Lattice Boltzmann Method, which has recently become popular, and the Kinetic Schemes, which are routinely used in Computational Fluid Dynamics, is explored. A new discrete velocity model for the numerical solution of the Navier-Stokes equations for incompressible fluid flow is presented by combining both approaches. The new scheme can be interpreted as a pseudo-compressibility method and, for a particular choice of parameters, this interpretation carries over to the Lattice Boltzmann Method.
Compared to standard numerical methods for hyperbolic systems of conservation laws, Kinetic Schemes model propagation of information by particles instead of waves. In this article, the wave and the particle concept are shown to be closely related. Moreover, a general approach to the construction of Kinetic Schemes for hyperbolic conservation laws is given which summarizes several approaches discussed by other authors. The approach also demonstrates why Kinetic Schemes are particularly well suited for scalar conservation laws and why extensions to general systems are less natural.
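The particle viewpoint can be summarized by the generic first-order kinetic step for a scalar conservation law u_t + f(u)_x = 0 (an illustrative form; the particular equilibrium density M is an assumption, not taken from the paper): particles fly freely for one time step and are then projected back onto an equilibrium density whose moments reproduce the unknown and its flux.

% Generic first-order kinetic step for u_t + f(u)_x = 0 (illustrative form):
% free flight with velocity v for one time step, then projection onto an
% equilibrium density M whose moments reproduce u and the flux f(u).
\[
  u^{n+1}(x) \;=\; \int_{\mathbb{R}} M\bigl(v;\, u^{n}(x - v\,\Delta t)\bigr)\, dv,
  \qquad
  \int_{\mathbb{R}} M(v;u)\, dv = u, \quad
  \int_{\mathbb{R}} v\, M(v;u)\, dv = f(u).
\]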
The problem of finding an optimal location X* minimizing the maximum Euclidean distance to existing facilities is well solved by, e.g., the Elzinga-Hearn algorithm. In practical situations, however, X* will often not be feasible. We therefore suggest in this note a polynomial algorithm which finds an optimal location X^F in a feasible subset F of the plane R^2.
The problem of providing connectivity for a collection of applications is largely one of data integration: the communicating parties must agree on the semantics and syntax of the data being exchanged. In earlier papers, it was proposed that dictionaries of definitions for operators, functions, and symbolic constants can effectively address the problem of semantic data integration. In this paper we extend that earlier work to discuss the important issues in data integration at the syntactic level and propose a set of solutions that are both general, supporting a wide range of data objects with typing information, and efficient, supporting fast transmission and parsing.
We consider a scale discrete wavelet approach on the sphere based on spherical radial basis functions. If the generators of the wavelets have a compact support, the scale and detail spaces are finite-dimensional, so that the detail information of a function is determined by only finitely many wavelet coefficients for each scale. We describe a pyramid scheme for the recursive determination of the wavelet coefficients from level to level, starting from an initial approximation of a given function. Basic tools are integration formulas which are exact for functions up to a given polynomial degree and spherical convolutions.
We consider a multiple objective linear program (MOLP) max{Cx | Ax = b, x in N_0^n}, where C = (c_ij) is the p x n matrix of p different objective functions z_i(x) = c_{i1}x_1 + ... + c_{in}x_n, i = 1,...,p, and A is the m x n matrix of a system of m linear equations a_{k1}x_1 + ... + a_{kn}x_n = b_k, k = 1,...,m, which form the set of constraints of the problem. All coefficients are assumed to be natural numbers or zero. Let M denote the set of admissible solutions. An efficient solution x̂ is an admissible solution such that there exists no other admissible solution x' with Cx̂ ≤ Cx' and Cx̂ ≠ Cx'. The efficient solutions play the role of optimal solutions for the MOLP, and it is our aim to determine the set of all efficient solutions.
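A small sketch of the efficiency notion used above (the objective matrix and candidate solutions are illustrative and assumed admissible): a candidate is efficient exactly when no other candidate's objective vector dominates its own componentwise.

# Sketch: filtering efficient solutions of a MOLP by pairwise dominance of the
# objective vectors Cx (maximization).  Candidates are illustrative only.
import numpy as np

C = np.array([[1, 0, 2],
              [0, 3, 1]])                     # p x n objective matrix (example)
candidates = [np.array(x) for x in ([2, 0, 1], [0, 1, 2], [1, 1, 1])]

def dominates(za, zb):
    """za dominates zb if za >= zb componentwise and za != zb."""
    return np.all(za >= zb) and np.any(za > zb)

values = [C @ x for x in candidates]
efficient = [x for x, z in zip(candidates, values)
             if not any(dominates(z2, z) for z2 in values)]
print([list(x) for x in efficient])           # -> [[0, 1, 2]]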
We investigate one of the classical problems of the theory of term rewriting, namely termination. We present an ordering for comparing higher-order terms that can be utilized for testing termination and decreasingness of higher-order conditional term rewriting systems. The ordering relies on a first-order interpretation of higher-order terms and a suitable extension of the RPO.
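For readers unfamiliar with precedence-based path orderings, here is a simplified first-order lexicographic path ordering (LPO) in Python; it is only an illustration of the general idea, whereas the paper's ordering additionally handles higher-order terms via a first-order interpretation. The precedence and example terms are invented.

# First-order LPO as a simplified illustration of a precedence-based path ordering.
# Terms: variables are strings, applications are tuples (fun_symbol, arg1, ...).

PREC = {"f": 2, "g": 1, "a": 0}     # illustrative precedence f > g > a

def is_var(t):
    return isinstance(t, str)

def occurs(x, t):
    if is_var(t):
        return t == x
    return any(occurs(x, arg) for arg in t[1:])

def lpo_gt(s, t):
    """True iff s >_lpo t (same-arity function symbols assumed)."""
    if is_var(s):
        return False                               # a variable is never greater
    if is_var(t):
        return occurs(t, s) and s != t             # s > x if the variable x occurs in s
    f, *ss = s
    g, *ts = t
    if any(si == t or lpo_gt(si, t) for si in ss):            # some argument s_i >= t
        return True
    if PREC[f] > PREC[g]:                                     # head symbol of s is bigger
        return all(lpo_gt(s, tj) for tj in ts)
    if f == g and all(lpo_gt(s, tj) for tj in ts):            # equal heads: lexicographic args
        for si, ti in zip(ss, ts):
            if si != ti:
                return lpo_gt(si, ti)
    return False

# The rule f(x) -> g(x) is decreasing under this precedence:
print(lpo_gt(("f", "x"), ("g", "x")))    # True
print(lpo_gt(("g", "x"), ("f", "x")))    # False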
In this paper we consider the problem of locating one new facility in the plane with respect to a given set of existing facilities, where a set of polygonal barriers restricts traveling. This non-convex optimization problem can be reduced to a finite set of convex subproblems if the objective function is a convex function of the travel distances between the new and the existing facilities (like, e.g., the median and center objective functions). An exact algorithm and a heuristic solution procedure based on this reduction result are developed.
A compact subset E of the complex plane is called removable if all bounded analytic functions on its complement are constant or, equivalently, if its analytic capacity vanishes. The problem of finding a geometric characterization of the removable sets is more than a hundred years old and still not completely solved.
The asymptotic behaviour of a singular-perturbed two-phase Stefan problem due to slow diffusion in one of the two phases is investigated. In the limit the model equations reduce to a one-phase Stefan problem. A boundary layer at the moving interface makes it necessary to use a corrected interface condition obtained from matched asymptotic expansions. The approach is validated by numerical experiments using a front-tracking method.
This report presents the properties of a specification of the domain of process planning for rotary symmetrical workpieces. The specification results from a model for problem solving in this domain that involves different reasoners, one of which is an AI planner that achieves goals corresponding to machining workpieces by considering certain operational restrictions of the domain. When planning with SNLP (McAllester and Rosenblitt, 1991), we will show that the resulting plans have the property of minimizing the use of certain key operations. Further, we will show that, for elastic protected plans (Kambhampati et al., 1996) such as the ones produced by SNLP, the goals corresponding to machining parts of a workpiece are OE-constrained trivial serializable, a special form of trivial serializability (Barrett and Weld, 1994). However, we will show that planning with SNLP in this domain can be very difficult: elastic protected plans for machining parts of a workpiece are nonmergeable. Finally, we will show that, for suffix, prefix, or suffix-and-prefix plans such as the ones produced by state-space planners, it is not possible to have both properties, being OE-constrained trivial serializable and minimizing the use of the key operations, at the same time.
A Tailored Real Time Temporal Logic for Specifying Requirements of Building Automation Systems
(1999)
A tailored real time temporal logic for specifying requirements of building automation systems is introduced and analyzed. The logic features several new real time operators, which are chosen with regard to the application area. The new operators improve the conciseness and readability of requirements as compared to a general-purpose real time temporal logic. In addition, some of the operators also enhance the expressiveness of the logic. A number of properties of the new operators are presented and proven.
Concept mapping is a simple and intuitive visual form of knowledge representation. Concept maps can be categorized as informal or formal, where the latter is characterized by implementing a semantics model constraining their components. Software engineering is a domain that has successfully adopted formal concept maps to visualize and specify complex systems. Automated tools have been implemented to support these models although their semantic constraints are hardcoded within the systems and hidden from users. This paper presents the Constraint Graphs and jKSImapper systems. Constraint Graphs is a flexible and powerful graphical system interface for specifying concept mapping notations. In addition, jKSImapper is a multi-user concept mapping editor for the Internet and the World Wide Web. Together, these systems aim to support user-definable formal concept mapping notations and distributed collaboration on the Internet and the World Wide Web.
In this paper we introduce a new type of single facility location problem on networks which includes as special cases most of the classical criteria in the literature. Structural results as well as a finite dominating set for the optimal locations are developed. Also the extension to the multi-facility case is discussed.
The CBR team of the LISA is involved in several applied research projects based on the CBR paradigm. These applications use adaptation to solve the specific problems they face, so we have accumulated some experience about how adaptation processes can be expressed and formalized. The literature on the subject is quite extensive but demonstrates a lack of formalism; at most, there exist some classifications of different types of adaptation.
We present a way to describe Reason Maintenance Systems using the same formalism for justification based as well as for assumption based approaches. This formalism uses labelled formulae and thus is a special case of Gabbay's labelled deductive systems. Since our approach is logic based, we are able to get a semantics oriented description of the systems in question. Instead of restricting ourselves to e.g. propositional Horn formulae, as was done in the past, we admit arbitrary logics. This enables us to characterize systems as a whole, including both the reason maintenance component and the problem solver, nevertheless maintaining a separation between the basic logic and the part that describes the label propagation. The possibility to freely vary the basic logic enables us not only to describe various existing systems, but can also help in the design of completely new ones. We also show that it is possible to implement systems based directly on our labelled logic, and plead for "incremental calculi" crafted to attack undecidable logics. Furthermore it is shown that the same approach can be used to handle default reasoning, if the propositional labels are upgraded to first order.
Caching has long been used to reduce average access latency, from registers and memory pages cached by hardware, to the application level such as a web browser retaining retrieved documents. We focus here on the high-level caching of potentially shared networked documents and define two terms in relation to this type of caching: Zero latency refers to the condition where access to a document produces a cache hit on the local machine, that is, there is little or no latency due to the network (we assume that latency due to local disk and memory access is insignificant in comparison to network latency). A document with zero latency usually has been placed in the cache after a previous access, or has been pulled there through some prefetching mechanism. Negative latency refers to automatic presentation, or push, of a document to a user based on a prediction that the user will want that document. With an ideal system, a user would be presented with documents either that she was about to request, or that she would not know to request but that would be immediately useful to her.
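The two notions defined above can be made concrete with a toy Python cache (hypothetical class and prediction rule, not the paper's system): a local hit on a previously accessed document is zero latency, while a hit on a document that was pushed by prediction before any request is negative latency.

# Toy illustration of zero vs. negative latency.  The "push" predictor
# (prefetch the first linked document) is a deliberately naive stand-in.

class DocumentCache:
    def __init__(self):
        self.store = {}          # url -> document body
        self.pushed = set()      # urls placed here by prediction, not by access

    def fetch(self, url, network):
        if url in self.store:
            kind = "negative latency" if url in self.pushed else "zero latency"
            print(f"{url}: cache hit ({kind})")
            self.pushed.discard(url)
            return self.store[url]
        print(f"{url}: cache miss (network latency)")
        body = network[url]
        self.store[url] = body
        for linked in body.get("links", []):   # naive prediction: push one linked document
            self.store[linked] = network[linked]
            self.pushed.add(linked)
            break
        return body

network = {                                   # hypothetical remote documents
    "/index": {"links": ["/news"]},
    "/news": {"links": []},
}
cache = DocumentCache()
cache.fetch("/index", network)                # miss -> network latency
cache.fetch("/news", network)                 # hit, pushed earlier -> negative latency
cache.fetch("/index", network)                # hit after earlier access -> zero latency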
We will answer a question posed in [DJK91], and will show that Huet's completion algorithm [Hu81] becomes incomplete, i.e. it may generate a term rewriting system that is not confluent, if it is modified in such a way that the reduction ordering used for completion can be changed during completion, provided that the new ordering is compatible with the actual rules. In particular, we will show that this problem may not only arise if the modified completion algorithm does not terminate: even if the algorithm terminates without failure, the generated finite noetherian term rewriting system may be non-confluent. Most existing implementations of the Knuth-Bendix algorithm provide the user with help in choosing a reduction ordering: if an unorientable equation is encountered, the user has many options, in particular the option to orient the equation manually. The integration of this feature is based on the widespread assumption that, if equations are oriented by hand during completion and the completion process terminates with success, then the generated finite system is a possibly non-terminating but locally confluent system (see e.g. [KZ89]). Our examples will show that this assumption is not true.
The reasoning power of human-oriented plan-based reasoning systems is primarily derived from their domain-specific problem solving knowledge. Such knowledge is, however, intrinsically incomplete. In order to model the human ability of adapting existing methods to new situations we present in this work a declarative approach for representing methods, which can be adapted by so-called meta-methods. Since apparently the success of this approach relies on the existence of general and strong meta-methods, we describe several meta-methods of general interest in detail by presenting the problem solving process of two familiar classes of mathematical problems. These examples should illustrate our philosophy of proof planning as well: besides planning with the current repertoire of methods, the repertoire of methods evolves with experience in that new ones are created by meta-methods which modify existing ones.
Accelerating the maturation process within the software engineering discipline may result in boosts of development productivity. One way to enable this acceleration is to develop tools and processes to mimic evolution of traditional engineering disciplines. Principles established in traditional engineering disciplines represent high-level guidance to constructing these tools and processes. This paper discusses two principles found in the traditional engineering disciplines and how these principles can apply to mature the software engineering discipline. The discussion is concretized through description of the Collaborative Management Environment, a software system under collaborative development among several national laboratories.
Integrated project management means that design and planning are interleaved with plan execution, allowing both the design and plan to be changed as necessary. This requires that the right effects of change are propagated through the plan and design. When this is distributed among designers and planners, no one may have all of the information to perform such propagation and it is important to identify what effects should be propagated to whom when. We describe a set of dependencies among plan and design elements that allow such notification by a set of message-passing software agents. The result is to provide a novel level of computer support for complex projects.
Algorithmic ideal theory
(1999)
Algebraic geometers have used Gröbner bases as the main computational tool for many years, either to prove a theorem or to disprove a conjecture or just to experiment with examples in order to obtain a feeling about the structure of an algebraic variety. Non-trivial mathematical problems usually lead to non-trivial Gröbner basis computations, which is the reason why several improvements and efficient implementations have been provided by algebraic geometers (for example, the systems CoCoA, Macaulay and SINGULAR). The present paper starts with an introduction to some concepts of algebraic geometry which should be understood by people with (almost) no knowledge in this field. In the second chapter we introduce standard bases (a generalization of Gröbner bases to non-well-orderings), which are needed for applications to local algebraic geometry (singularity theory), and a method for computing syzygies and free resolutions. In the third chapter several algorithms for primary decomposition of polynomial ideals are presented, together with a discussion of improvements and preferable choices. We also describe a newly invented algorithm for computing the normalization of a reduced affine ring. The last chapter gives an elementary introduction to singularity theory and then describes algorithms, using standard bases, to compute infinitesimal deformations and obstructions, which are basic for the deformation theory of isolated singularities. It is impossible to list all papers where Gröbner bases have been used in local and global algebraic geometry, and even more impossible to give an overview of these contributions. We have, therefore, included only a few references to papers which contain interesting applications and which are not mentioned in this tutorial paper. The interested reader will find many more in the other contributions of this volume and in the literature cited there.
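As a small worked illustration of a Gröbner basis computation (an example of our own, not taken from the paper), Buchberger's algorithm completes a two-element generating set in one step.

% A small worked example (not from the paper): completing a generating set to a
% Gröbner basis with Buchberger's algorithm, lexicographic order with x > y.
\[
  I = \langle \underbrace{x^2 + y^2}_{f_1},\; \underbrace{xy}_{f_2} \rangle, \qquad
  S(f_1, f_2) = y\,f_1 - x\,f_2 = y^3 \;=:\; f_3 ,
\]
\[
  S(f_2, f_3) = y^2 f_2 - x f_3 = 0, \qquad
  S(f_1, f_3) = y^3 f_1 - x^2 f_3 = y^5 \xrightarrow{\;f_3\;} 0,
\]
so $\{x^2 + y^2,\; xy,\; y^3\}$ is a Gröbner basis of $I$.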