  • Publication
    Towards safety-awareness and dynamic safety management
    Future safety-critical systems will be highly automated or even autonomous and will dynamically cooperate with other systems as part of a comprehensive ecosystem. Together with the increasing use of artificial intelligence, this introduces uncertainties on different levels, which undermine the application of established safety engineering methods and standards. These uncertainties might be tackled by making systems safety-aware and enabling them to manage themselves accordingly. This paper introduces a corresponding conceptual dynamic safety management framework that incorporates monitoring facilities and runtime safety models to create safety-awareness. Based on this, planning and execution of safe system optimizations can be carried out by means of self-adaptation. We illustrate our approach by applying it to the dynamic safety assurance of a single car.
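    The abstract describes a monitor-analyze-plan-execute style safety management cycle over a runtime safety model. The following is a minimal, hypothetical sketch of that idea; the names (RuntimeSafetyModel, safety_management_cycle, the speed-based constraint) are illustrative assumptions, not the framework from the paper.
```python
# Hypothetical sketch of one dynamic safety management cycle:
# monitor system state, evaluate a runtime safety model, and adapt.
from dataclasses import dataclass


@dataclass
class RuntimeSafetyModel:
    """Runtime safety model: relates monitored conditions to a safety constraint."""
    max_speed_kmh: float           # constraint derived from the current safety case
    sensor_degraded: bool = False  # state updated by the monitoring facilities

    def max_safe_speed(self) -> float:
        # Analysis step: a degraded sensor tightens the allowed operating envelope.
        return self.max_speed_kmh * (0.5 if self.sensor_degraded else 1.0)


def safety_management_cycle(model: RuntimeSafetyModel, requested_speed: float,
                            sensor_ok: bool) -> float:
    """One monitor-analyze-plan-execute cycle; returns the commanded speed limit."""
    # Monitor: update the runtime model with the observed system state.
    model.sensor_degraded = not sensor_ok
    # Analyze + plan: derive the currently guaranteed safe speed.
    limit = model.max_safe_speed()
    # Execute: adapt the system (here: cap the speed) to stay within the guarantee.
    return min(requested_speed, limit)


if __name__ == "__main__":
    model = RuntimeSafetyModel(max_speed_kmh=130.0)
    print(safety_management_cycle(model, requested_speed=120.0, sensor_ok=False))  # 65.0
```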
  • Publication
    Towards integrating undependable self-adaptive systems in safety-critical environments
    Modern cyber-physical systems (CPS) integrate more and more computing power to master novel applications and adapt to changing situations. A striking example is the recent progression in the automotive market towards autonomous driving. Powerful artificial intelligence algorithms must be executed on highly performant, parallelized platforms. However, this cannot yet be done in a safe way, as the platforms stemming from the consumer electronics (CE) world still lack the required dependability and safety mechanisms. In this paper, we present a concept for integrating undependable self-adaptive subsystems into safety-critical environments. To this end, we introduce self-adaptation envelopes, which manage undependable system parts and integrate them into a dependable system. We evaluate our approach with a comprehensive case study on autonomous driving, showing that potential failures of the AUTOSAR Adaptive platform, as an exemplary undependable system, can be handled by our concept. Overall, we outline a way of integrating inherently undependable adaptive systems into safety-critical CPS.
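    To make the envelope idea concrete, here is a hedged sketch of a supervisor that wraps an undependable subsystem and degrades to a dependable fallback when the wrapped part fails or times out. All names (Envelope, flaky_trajectory_planner, the timeout value) are invented for illustration and are not the mechanism defined in the paper.
```python
# Hypothetical "self-adaptation envelope": contain failures of an undependable
# subsystem and fall back to a dependable degraded behavior.
import random
import time
from typing import Callable, Optional


class Envelope:
    def __init__(self, undependable: Callable[[], Optional[str]],
                 fallback: Callable[[], str], timeout_s: float = 0.1):
        self.undependable = undependable
        self.fallback = fallback
        self.timeout_s = timeout_s

    def step(self) -> str:
        """Run the undependable part under supervision; degrade safely on failure."""
        start = time.monotonic()
        try:
            result = self.undependable()
        except Exception:
            result = None  # a crash of the undependable subsystem is contained here
        elapsed = time.monotonic() - start
        if result is None or elapsed > self.timeout_s:
            return self.fallback()  # dependable degraded behavior, e.g. a safe stop
        return result


def flaky_trajectory_planner() -> Optional[str]:
    # Stand-in for a high-performance but undependable planner on a CE platform.
    return "optimized trajectory" if random.random() > 0.3 else None


envelope = Envelope(flaky_trajectory_planner, fallback=lambda: "safe-stop trajectory")
print(envelope.step())
```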
  • Publication
    WAP: Digital dependability identities
    (2015) ; ; Papadopoulos, Yiannis; Armengaud, Eric; Zeller, Marc; Höfig, Kai
    Cyber-Physical Systems (CPS) provide enormous potential for innovation, but a precondition for realizing this potential is that dependability is adequately addressed. This paper presents the concept of a Digital Dependability Identity (DDI) of a component or system as a foundation for assuring the dependability of CPS. A DDI is an analyzable and potentially executable model of information about the dependability of a component or system. We argue that DDIs must fulfill a number of properties, including being universally useful across supply chains, enabling off-line certification of systems where possible, and providing capabilities for in-field safety certification of CPS. In this paper, we focus on system safety as one integral part of dependability and, as a practical demonstration of the concept, present an initial implementation of DDIs in the form of Conditional Safety Certificates (also known as ConSerts). We explain ConSerts and their practical operationalization based on an illustrative example.
  • Publication
    A safety engineering framework for open adaptive systems
    In recent years it has become more and more evident that openness and adaptivity are key characteristics of next-generation distributed systems. This is not least due to the advent of computing trends like Ubiquitous Computing, Ambient Intelligence, and Cyber-Physical Systems, where systems are usually open for dynamic integration and able to react adaptively to changing situations. Despite being open and adaptive, such systems are commonly required to be safe. However, traditional safety assurance techniques, both state-of-the-practice and state-of-the-art, are not sufficient in this context. We recently developed initial solution concepts based on conditional safety certificates and corresponding runtime analyses. In this paper we show how to operationalize these concepts. To this end, we present in detail how to specify conditional safety certificates, how to transform them into suitable runtime models, and how these models finally support dynamic safety evaluations.
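    The core runtime evaluation can be pictured as matching conditional guarantees against what the current integration context provides. The sketch below is a simplified assumption of how such a check might look; the class names, fields, and the ASIL wording are illustrative and do not reproduce the ConSert notation from the paper.
```python
# Minimal ConSert-style evaluation sketch: a conditional guarantee becomes
# valid only if all of its runtime demands are matched by guarantees
# currently provided by the environment.
from dataclasses import dataclass
from typing import FrozenSet, Set


@dataclass(frozen=True)
class ConditionalGuarantee:
    guarantee: str                         # safety guarantee offered by the system
    demands: FrozenSet[str] = frozenset()  # demands on the integration context


def evaluate(conserts: Set[ConditionalGuarantee],
             provided_by_environment: Set[str]) -> Set[str]:
    """Dynamic safety evaluation: return the guarantees that are valid right now."""
    return {c.guarantee for c in conserts
            if c.demands <= provided_by_environment}


certificate = {
    ConditionalGuarantee("collision warning with high integrity",
                         frozenset({"distance sensor available",
                                    "sensor integrity sufficient"})),
    ConditionalGuarantee("collision warning with low integrity"),  # unconditional
}
print(evaluate(certificate, {"distance sensor available"}))
# Only the unconditional (low-integrity) guarantee is valid in this context.
```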
  • Publication
    Approaching runtime trust assurance in open adaptive systems
    In recent years it has become more and more evident that the ability of systems to adapt themselves is an increasingly important requirement. This is not least driven by emerging computing trends like Ubiquitous Computing, Ambient Intelligence, and Cyber-Physical Systems, where systems have to react to changing user needs, service/device availability, and resource situations. Despite being open and adaptive, such systems are commonly required to be trustworthy, yet traditional assurance techniques for related system properties like safety, reliability, and security are not sufficient in this context. We recently developed the Plug&Safe approach for composition-time safety assurance in systems of systems. In this position paper we provide an overview of Plug&Safe, elaborate on the different facets of trust, and discuss how our approach can be augmented to enable trust assurance in open adaptive systems.
  • Publication
    Conditional safety certificates in open systems
    In the wake of current computing trends like Ubiquitous Computing, Ambient Intelligence, and Cyber-Physical Systems, new application domains like Car2Car have emerged. One key characteristic of these new application domains is their openness with respect to the dynamic integration of devices and components. It is obvious that traditional safety assurance techniques, both state-of-the-practice and state-of-the-art, are not sufficient in this context. A possible solution approach is to shift portions of the safety assurance process to runtime. This can be achieved by integrating appropriate runtime safety models and corresponding dynamic evaluation mechanisms. In this paper we sketch out our recent work on conditional safety certificates, which facilitate such dynamic safety evaluation. We conclude with a brief discussion and outline promising research directions for the future.
  • Publication
    Engineering dynamic adaptation for achieving cost-efficient resilience in software-intensive embedded systems
    Resilience has been successfully realized in automotive systems to increase system reliability at reasonable cost. Using dynamic adaptation, the system adapts to runtime errors - caused by internal system faults or adverse environmental situations such as critical driving situations - in order to provide the best possible functionality and to guarantee system safety in any given system and environmental state. This paper introduces an engineering approach for developing resilient systems using dynamic adaptation. The approach is based on component-oriented modeling and on analyses of component compositions. We describe how component-oriented modeling and compositional analyses enable the use of dynamic adaptation to achieve a trade-off between availability and cost in safety-critical, resilient systems, and how they help to manage the complexity inherent in component composition.
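    The availability-versus-cost trade-off mentioned above can be illustrated as selecting, among analyzed component configurations, the most useful one that is both safe and affordable. This is a hedged toy sketch; the configuration names, numbers, and the selection rule are invented for illustration and are not the paper's analysis method.
```python
# Illustrative trade-off: pick the highest-utility configuration that passed
# the (compositional) safety analysis and fits the resource budget.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Configuration:
    name: str
    utility: float  # how much functionality / availability it provides
    cost: float     # resource or hardware cost
    safe: bool      # result of the safety analysis of this composition


def select_configuration(configs: List[Configuration],
                         budget: float) -> Optional[Configuration]:
    """Return the best safe and affordable configuration, or None if none exists."""
    feasible = [c for c in configs if c.safe and c.cost <= budget]
    return max(feasible, key=lambda c: c.utility, default=None)


configs = [
    Configuration("full function, dual redundancy", utility=1.0, cost=3.0, safe=True),
    Configuration("full function, no redundancy",   utility=1.0, cost=1.0, safe=False),
    Configuration("degraded function",              utility=0.6, cost=1.0, safe=True),
]
print(select_configuration(configs, budget=2.0).name)  # "degraded function"
```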
  • Publication
    Runtime safety models in open systems of systems
    Upcoming application domains, from ambient assisted living to Car2Car, show the need for openness, flexibility, and safety in next-generation embedded systems. While there are several approaches tackling the dynamic reconfiguration and integration of components, not much research has been done on the safety of such systems. As many of these application domains are inherently safety-critical, this prevents open systems of systems from unfolding their full potential. Models at runtime have been shown to foster dynamic adaptation of software systems. In a similar way, the integration of appropriate runtime safety models and dynamic evaluation mechanisms into systems seems to be a viable approach to enable safety management at runtime. In this paper we sketch out our modeling approach for adaptive ad hoc systems and present first results with respect to the integration and usage of safety models at runtime.
  • Publication
    Development of safe and reliable embedded systems using dynamic adaptation
    A major application of dynamic adaptation is the development of safe and reliable embedded systems. In contrast to classical redundancy approaches dynamic adaptation can react much more flexible to different kinds of errors including changes in the environment. Moreover dynamic adaptation can usually be realized much more cost-efficient than classical redundancy or faulttolerance mechanisms. Using dynamic adaptation for developing dependable systems requires means to explicitly specify the adaptation behavior and to analyze the effects of dynamic adaptation on system reliability and particularly safety. However, these activities are very complex and error prone and hence pose the need for a sound and seamless engineering support. For this reason, this position paper points out some of the lessons we have learned over the last years of applying and advancing dynamic adaptation for the development of safe and reliable adaptive systems. We furthermore discuss and classify current achievements in research and practice and derive corresponding future research challenges.