Kaiserslautern - Fachbereich Informatik
Year of publication: 1999 (267)
Document Type: Preprint (206), Article (52), Report (5), Master's Thesis (3), Study Thesis (1)
Has Fulltext: yes (267)
A fundamental variance reduction technique for Monte Carlo integration in the framework of integro-approximation problems is
presented. Using the method of dependent tests a successive hierarchical function approximation algorithm is developed, which
captures discontinuities and exploits smoothness in the target function. The general mathematical scheme and its highly efficient
implementation are illustrated for image generation by ray tracing,
yielding new and much faster image synthesis algorithms.
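The key property of the method of dependent tests is that one and the same set of random samples is reused for every parameter value, so the estimation error varies smoothly with the parameter — exactly what a hierarchical function approximation can exploit. A minimal Python sketch (the toy integrand is assumed for illustration, not taken from the paper):

```python
import random

def dependent_tests_estimate(g, xs, n_samples, seed=0):
    """Estimate I(x) = integral_0^1 g(x, y) dy for every x in xs,
    reusing the SAME y-samples for all x (method of dependent tests).
    Shared samples make the estimation error a smooth function of x,
    which a hierarchical function approximation can then exploit."""
    rng = random.Random(seed)
    ys = [rng.random() for _ in range(n_samples)]
    return [sum(g(x, y) for y in ys) / n_samples for x in xs]

# Toy integrand (assumed for the example): g(x, y) = x * y, so I(x) = x / 2.
xs = [i / 10 for i in range(11)]
est = dependent_tests_estimate(lambda x, y: x * y, xs, n_samples=4000)
```

Because all estimates share the same samples, the pointwise errors are perfectly correlated here and the estimated curve is itself smooth, rather than a noisy cloud of independent estimates.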
We study the global solution of Fredholm integral equations of the second kind with the help of Monte Carlo methods. Global solution means that we seek to approximate the full solution function, as opposed to the usual applications of Monte Carlo, where one only wants to approximate a functional of the solution. In recent years several researchers have developed Monte Carlo methods for the global problem as well. In this paper we present a new Monte Carlo algorithm for the global solution of integral equations. We use multiwavelet expansions to approximate the solution. We study the behaviour of the variance on increasing levels and, based on this, develop a new variance reduction technique. For classes of smooth kernels and right-hand sides we determine the convergence rate of this algorithm and show that it is higher than that of previously developed algorithms for the global problem. Moreover, an information-based complexity analysis shows that our algorithm is optimal among all stochastic algorithms of the same computational cost and that no deterministic algorithm of the same cost can reach its convergence rate.
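The pointwise estimator that such global methods build on can be sketched as a random walk evaluating the Neumann series of the equation u(x) = f(x) + ∫₀¹ k(x, y) u(y) dy. The sketch below is a generic illustration in Python (the uniform transition density, stopping probability, and toy kernel are assumptions; the paper's algorithm additionally expands u in multiwavelets to approximate it globally):

```python
import random

def mc_fredholm(f, k, x0, n_walks=20000, p_stop=0.5, seed=1):
    """Pointwise Monte Carlo solution of u(x) = f(x) + integral_0^1 k(x,y) u(y) dy
    via its Neumann series: simulate random walks x0 -> y1 -> y2 -> ... with
    uniform transitions on [0, 1] and stopping probability p_stop, summing the
    importance-weighted contributions of f along each walk."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walks):
        x, weight, acc = x0, 1.0, f(x0)
        while rng.random() > p_stop:       # survive with probability 1 - p_stop
            y = rng.random()               # uniform transition density 1 on [0, 1]
            weight *= k(x, y) / (1.0 - p_stop)
            acc += weight * f(y)
            x = y
        total += acc
    return total / n_walks

# Toy problem (assumed): k = 0.5, f = 1, whose exact solution is u(x) = 2.
u_est = mc_fredholm(lambda x: 1.0, lambda x, y: 0.5, x0=0.3)
```

Each walk contributes an unbiased sample of the Neumann series at x0; averaging many walks gives the pointwise estimate.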
Approximation properties of the underlying estimator are used to improve the efficiency of the method of dependent tests. A multilevel approximation procedure is developed such that in each level the number of samples is balanced with the level-dependent variance, resulting in a considerable reduction of the overall computational cost. The new technique is applied to the Monte Carlo estimation of integrals depending on a parameter.
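Balancing the number of samples per level against the level-dependent variance can be sketched with the classical optimal-allocation rule n_l ∝ sqrt(V_l / C_l); the rule and the toy numbers below are a generic multilevel illustration, not the paper's exact procedure:

```python
import math

def multilevel_allocation(variances, costs, budget):
    """Sample counts per level balancing per-sample cost against the
    level-dependent variance: n_l proportional to sqrt(V_l / C_l),
    scaled so the total cost roughly meets the budget."""
    weights = [math.sqrt(v / c) for v, c in zip(variances, costs)]
    scale = budget / sum(w * c for w, c in zip(weights, costs))
    return [max(1, round(scale * w)) for w in weights]

# Illustrative numbers: variance shrinks while per-sample cost grows per level.
counts = multilevel_allocation([1.0, 0.25, 0.0625], [1.0, 4.0, 16.0], budget=1000)
```

Coarse, cheap levels receive many samples and fine, expensive levels few, which is what keeps the overall computational cost low at a given accuracy.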
Software engineering deals with complex software development projects. Within such projects, large amounts of information are processed. This information is recorded in software artifacts (e.g., in project plans or development documents such as requirements specifications). The artifacts are changed frequently during the development and maintenance of a software system. Changing a piece of information in one artifact often entails changes in the same and in other artifacts, because relationships exist within and between the pieces of information recorded in the artifacts. These relationships are usually not explicit, so the consequences of a change are hard to foresee. In this thesis, a traceability approach was selected that supports the user in carrying out changes to artifacts. Support here means that the effort required to carry out a change is reduced and fewer errors are made in the process.
The thesis formulated requirements for the traceability approach to be selected. One requirement was that it should be applicable to different areas of software engineering, such as system design or measurement planning, each with very different artifacts. The literature survey and the subsequent evaluation against these requirements showed that the principle of metamodelling, combined with knowledge base management systems, is a suitable traceability approach. An evaluation based on case studies from the areas "object-oriented design with UML" and "measurement planning with GQM" showed that the knowledge base management system ConceptBase, which is based on the knowledge representation language O-Telos, is a suitable tool for supporting the traceability approach.
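The core service such a traceability approach provides — following explicit trace links to find everything a change may affect — can be sketched as reachability over a small artifact graph. The element names and link structure below are illustrative assumptions, not taken from the thesis:

```python
from collections import deque

def impact_set(trace_links, changed):
    """All artifact elements potentially affected by a change: everything
    reachable from the changed element via recorded trace links.
    trace_links maps an element to the elements that depend on it."""
    affected, queue = set(), deque([changed])
    while queue:
        element = queue.popleft()
        for dependent in trace_links.get(element, []):
            if dependent not in affected:
                affected.add(dependent)
                queue.append(dependent)
    return affected

# Illustrative links between a requirement, a design element, code, and a test.
links = {"Req1": ["Design3"], "Design3": ["Code7", "Test2"]}
```

With explicit links, the consequences of changing "Req1" are computable rather than something the developer must remember.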
Version and configuration management are central instruments for intellectually mastering complex software developments. In strongly reuse-oriented software development approaches, such as the one provided by the SFB, the notion of configuration must be extended from traditionally product-oriented artifacts to processes and other development experience. This publication presents such an extended configuration model. In addition, it discusses an extension of traditional project planning information that makes it possible to derive tailored version and configuration management mechanisms before a project starts.
The development of complex software systems is driven by many diverse and sometimes contradictory requirements such as correctness and maintainability of resulting products, development costs, and time-to-market. To alleviate these difficulties, we propose a development method for distributed systems that integrates different basic approaches. First, it combines the use of the formal description technique SDL with software reuse concepts. This results in the definition of a use-case driven, incremental development method with SDL-patterns as the main reusable artifacts. Experience with this approach has shown that there are several other factors of influence, such as the quality of reuse artifacts or the experience of the development team. Therefore, we further combined our SDL-pattern approach with an improvement methodology known from the area of experimental software engineering. In order to demonstrate the validity of this integrating approach, we sketch some representative outcomes of a case study.
Human society is developing ever further toward an information and media society. Not least because of worldwide networking, almost any conceivable information can be delivered to our screens at home within minutes. Everyone thus finds themselves in a certain protective anonymity, yet also in a transparency that is as deliberate as it is alarming. Everyone classifies, in a way, the information they disclose, for instance into public, personal, and confidential messages. Precisely here, techniques and methods must be available to protect information intended only for specific recipients from unauthorized access amid this anonymous transparency, and to make it accessible only to those who are authorized. This wish is held not only by society in general; development in this field is demanded and promoted in particular by governmental and military institutions. The methods of cryptology are thus frequently used tools, but as long as there are secret messages, there will be attackers trying to gain unauthorized access to this information. Since the steadily growing power of computer systems favours the "cracking" of encryption methods, ever more secure cipher techniques must be adopted. This circumstance makes cryptology a highly topical subject at the moment and, in the long term, a timeless research area of mathematics and computer science.
Using an experience factory is one possible concept for supporting and improving reuse in software development (i.e., reuse of products, processes, quality models, etc.). In the context of the Sonderforschungsbereich 501 "Development of Large Systems with Generic Methods" (SFB 501), the Software Engineering Laboratory (SE Lab) runs such an experience factory as part of the infrastructure services it offers. The SE Lab also provides several tools to support the planning, developing, measuring, and analyzing activities of software development processes. Among these tools, the SE Lab runs and maintains an experience base, the SFB-EB. When an experience factory is utilized, support for experience base maintenance is an important issue. Furthermore, it might be interesting to evaluate experience base usage with regard to the number of accesses to certain experience elements stored in the database. The same holds for the usage of the tools provided by the SE Lab. This report presents a set of supporting tools that were designed to aid in these tasks. These supporting tools check the experience base's consistency and gather information on the usage of SFB-EB and the tools installed in the SE Lab. The results are processed periodically and displayed as HTML result reports (consistency checking) or bar charts (usage profiles).
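The usage-profile part of such tooling amounts to counting accesses per experience element and rendering the counts as a bar chart. A minimal sketch, assuming a log format of one element id per line (the format and element names are illustrative, not the SE Lab's actual log):

```python
from collections import Counter

def usage_profile(log_lines):
    """Aggregate accesses per experience element and render a plain-text
    bar chart, sorted by frequency. The log format (one element id per
    line) is an assumption for this sketch."""
    counts = Counter(line.strip() for line in log_lines if line.strip())
    width = max((len(e) for e in counts), default=0)
    return "\n".join(f"{element.ljust(width)} {'#' * n}"
                     for element, n in counts.most_common())

report = usage_profile(["SDL-pattern", "QIP-guide", "SDL-pattern"])
```

A periodic job over the real access logs could produce exactly this kind of report, or feed the counts into an HTML bar-chart renderer instead.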
Manipulating deformable linear objects: vision-based recognition of contact state transitions (1999)
A new and systematic approach to machine vision-based robot manipulation of deformable (non-rigid) linear objects is introduced. This approach reduces the computational effort by using a simple state-oriented model of the objects. These states describe the relation of the object with respect to an obstacle and are derived from the object image and its features. To this end, the object is segmented from a standard video frame using a fast segmentation algorithm. Several object features are presented which allow recognition of the object's state while it is being manipulated by the robot.
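The state-oriented model boils down to a small classifier from extracted image features to discrete contact states. A sketch of that idea (the feature names and the three-state scheme are illustrative assumptions, not the paper's actual model):

```python
def contact_state(features):
    """Map segmented image features of a deformable linear object to a
    discrete contact state relative to an obstacle. Feature names and the
    three-state scheme are illustrative assumptions for this sketch."""
    if features["num_contact_points"] == 0:
        return "free"                       # no contact with the obstacle
    if features["deflection_angle"] < 10.0:  # degrees; object nearly straight
        return "point_contact"
    return "bent_around_obstacle"
```

Because each video frame is reduced to a handful of features and a discrete state, tracking state *transitions* during manipulation becomes cheap compared with reasoning about the full deformable geometry.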
Comprehensive reuse and systematic evolution of reuse artifacts as proposed by the Quality Improvement Paradigm (QIP) do not only require tool support for mere storage and retrieval. Rather, an integrated management of (potentially reusable) experience data as well as project-related data is needed. This paper presents an approach exploiting object-relational database technology to implement the QIP-driven reuse repository of the SFB 501. Requirements, concepts, and implementation aspects are discussed and illustrated through a running example, namely the reuse and continuous improvement of SDL patterns for developing distributed systems. Based on this discussion, we argue that object-relational database management systems (ORDBMS) are best suited to implement such a comprehensive reuse repository. It is demonstrated how this technology can be used to support all phases of a reuse process and the accompanying improvement cycle. Although the discussions of this paper are strongly related to the requirements of the SFB 501 experience base, the basic realization concepts, and thereby the applicability of ORDBMS, can easily be extended to similar applications, i.e., reuse repositories in general.