  • Publication
    Computer Scientist's and Programmer's View on Quantum Algorithms: Mapping Functions' APIs and Inputs to Oracles
    (2022) ; Tcholtchev, Nikolay Vassilev
    Quantum Computing (QC) is a promising approach that is expected to boost the development of new services and applications. Specific addressable problems can be tackled through acceleration in computational time and advances with respect to problem complexity, for which QC algorithms can support the solution search. However, QC currently remains a domain that is strongly dominated by a physics perspective. Indeed, in order to bring QC to industrial-grade applications we need to consider multiple perspectives, especially that of software engineering and software application/service programming. Following this line of thought, the current paper presents our computer scientist's view on the aspect of black-box oracles, which are a key construct for the majority of currently available QC algorithms. Thereby, we observe the need for the input of API functions from the traditional world of software engineering and (web) services to be mapped to the above-mentioned black-box oracles. Hence, there is a clear requirement for automatically generating oracles for specific types of problems/algorithms based on the concrete input to the corresponding APIs. In this paper, we discuss the above aspects and illustrate them on two QC algorithms, namely Deutsch-Jozsa and Grover's algorithm.
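    As an informal illustration of the oracle-generation idea (not code from the paper; the function names and the numpy-based construction are this summary's own), the sketch below derives a Grover phase oracle directly from an ordinary Python predicate standing in for an API function, and runs a few Grover iterations on it:

      import numpy as np

      def phase_oracle(predicate, n_qubits):
          """Build the 2^n x 2^n diagonal unitary U_f with U_f|x> = (-1)^f(x) |x>,
          where f is an ordinary Python predicate over n-bit integers."""
          dim = 2 ** n_qubits
          signs = [(-1) ** int(bool(predicate(x))) for x in range(dim)]
          return np.diag(signs).astype(complex)

      def grover_iteration(oracle, state):
          """One Grover step: apply the oracle, then invert about the mean."""
          state = oracle @ state
          return 2 * np.mean(state) - state

      # Usage: search for the single marked item 5 among 2^3 = 8 basis states.
      n = 3
      oracle = phase_oracle(lambda x: x == 5, n)
      state = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)  # uniform superposition
      for _ in range(2):  # roughly (pi/4) * sqrt(N) iterations for one marked item
          state = grover_iteration(oracle, state)
      print(np.argmax(np.abs(state) ** 2))  # most probable measurement outcome: 5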
  • Publication
    Model-based Method to Utilize a Catalogue of Quality Requirements in Software Development
    Complex software-based systems must comply with both functional and non-functional requirements (NFRs) to be useful. This paper presents a structured catalogue of quality requirements and a model-based approach to collect NFRs from the catalogue in a given project context. The NFR catalogue is structured according to the quality criteria from the ISO 25000 series of standards and can be further extended. This catalogue can be applied in specific software development or modernization projects and in the preparation of tenders. This application to a specific project context is achieved by using the BPMN-NFR method presented in this paper. In this method, pattern recognition in system models is used to build a soft-goal model that serves as a filter mechanism for selecting relevant quality requirements from the catalogue. Through enrichment with context information, concrete system-related non-functional requirements are derived, which can be used for the system development. This model-based method was developed and applied in the context of the modernization of the budgetary procedures of Germany's federal government.
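    A hypothetical Python sketch of the filtering idea follows (catalogue entries, pattern names and data structures are invented for illustration and are not the paper's artefacts): patterns detected in a system model yield soft-goals, which select matching generic requirements from an ISO-25010-structured catalogue and are then enriched with context information.

      # Generic NFR templates grouped by ISO 25010 quality characteristic.
      CATALOGUE = {
          "security":    ["All data exchanged with external systems shall be encrypted."],
          "reliability": ["The system shall recover from a failed task within 5 minutes."],
          "usability":   ["Form-based user tasks shall give immediate input validation feedback."],
      }

      # Mapping from patterns recognized in a system model (e.g. BPMN) to soft-goals.
      PATTERN_TO_SOFTGOALS = {
          "message_flow_to_external_participant": {"security", "reliability"},
          "user_task_with_form": {"usability"},
      }

      def derive_nfrs(detected_patterns, context):
          """Collect soft-goals for the detected patterns, use them as a filter over
          the catalogue, and enrich the generic text with context information."""
          softgoals = set()
          for pattern in detected_patterns:
              softgoals |= PATTERN_TO_SOFTGOALS.get(pattern, set())
          return [f"[{context}] {template}"
                  for goal in sorted(softgoals)
                  for template in CATALOGUE.get(goal, [])]

      print(derive_nfrs(["message_flow_to_external_participant"], "Budget data interface"))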
  • Publication
    Quantum DevOps: Towards reliable and applicable NISQ Quantum Computing
    Quantum Computing is emerging as one of the great hopes for boosting current computational resources and enabling the application of ICT for optimizing processes and solving complex and challenging domain-specific problems. However, Quantum Computing technology has not yet matured to a level where it can provide a clear advantage over high performance computing. Towards achieving this "quantum advantage", a larger number of Qubits is required, leading inevitably to a more complex topology of the computing Qubits. This raises additional difficulties with decoherence times and implies higher Qubit error rates. Nevertheless, the current Noisy Intermediate-Scale Quantum (NISQ) computers can prove useful despite the intrinsic uncertainties on the quantum hardware layer. In order to utilize such error-prone computing resources, various concepts are required to address Qubit errors and to deliver successful computations. In this paper we describe and motivate the need for the novel concept of Quantum DevOps, which entails regular checking of the reliability of NISQ Quantum Computing (QC) instances. By testing the computational reliability of basic quantum gates and computations (C-NOT, Hadamard, etc.), it consequently estimates the likelihood that a large-scale critical computation (e.g., calculating hourly traffic flow models for a city) will provide results of sufficient quality. Following this approach to select the best matching (cloud) QC instance, and integrating it directly with the processes of development, testing and finally the operations of quantum-based algorithms and systems, enables the Quantum DevOps concept.
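    The selection idea can be sketched as follows (a simplified illustration; the backends, error rates and gate counts are invented): per-gate error rates estimated from small test circuits are combined into a success estimate for the production workload, and the QC instance with the best estimate is chosen.

      # Per-gate error rates, as they might be estimated from small test circuits
      # (repeated Hadamard, C-NOT, ... runs) on each candidate QC instance.
      MEASURED_ERROR_RATES = {
          "cloud_qc_a": {"h": 0.001, "cx": 0.012},
          "cloud_qc_b": {"h": 0.002, "cx": 0.007},
      }

      # Gate counts of the production circuit (e.g. a traffic-flow computation).
      WORKLOAD_GATE_COUNTS = {"h": 40, "cx": 120}

      def success_estimate(error_rates, gate_counts):
          """Probability that every gate succeeds, assuming independent errors:
          the product over gate types of (1 - error_rate) ** count."""
          p = 1.0
          for gate, count in gate_counts.items():
              p *= (1.0 - error_rates.get(gate, 0.0)) ** count
          return p

      best = max(MEASURED_ERROR_RATES,
                 key=lambda qc: success_estimate(MEASURED_ERROR_RATES[qc], WORKLOAD_GATE_COUNTS))
      print(best, success_estimate(MEASURED_ERROR_RATES[best], WORKLOAD_GATE_COUNTS))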
  • Publication
    Modellbasierte Methode zur Ableitung nicht-funktionaler Anforderungen im Kontext der Softwaremodernisierung
    (2020) ; Knauer, Christian ; Ganz, Angelika
    Complex software-based systems must fulfil both functional and non-functional requirements (NFRs) in order to be accepted by users and adopted as support for their work tasks. In this article, we present a model-based method for the systematic derivation of non-functional requirements from a harmonized catalogue of quality criteria. This method was developed and applied in the context of the modernization of the budgetary procedures of Germany's federal government. First, a catalogue of properties was created that lists generic non-functional requirements. The NFR catalogue is structured according to the quality criteria of the ISO 25000 series of standards and is extensible. This catalogue is available for concrete development or modernization projects, for example for the preparation of tenders. In a methodical procedure, pattern recognition in system models is used to build a soft-goal model that serves as a filter mechanism for selecting the relevant non-functional properties. Through enrichment with system context information, concrete system-related requirements are derived, which can be used for the system development.
  • Publication
    Advanced Software Engineering
    (2019) ; Schieferdecker, Ina
    Software rules them all! In every industry now, software plays a dominant role in technical and business innovations, in improving functional safety, and also in increasing convenience. Nevertheless, software is not always designed, (re)developed, and/or secured with the necessary professionalism, and there are unnecessary interruptions in the development, maintenance, and operating chains that adversely affect reliable, secure, powerful, and trustworthy systems. Current surveys such as the annual World Quality Report state this bluntly, and it correlates directly with the now well-known failures of large-scale, important and/or safety-critical infrastructures caused by software. It is thus high time that software development be left to the experts and that space be created for the use of current methods and technologies. The present article sheds light on current and future software engineering approaches that can also and especially be found in the Fraunhofer portfolio.
  • Publication
    Enabling the interoperability of the Modelica DSL and Matlab Simulink towards the development of self-adaptive dynamic systems
    (2018) ; Tcholtchev, Nikolay ; Wagner, Michael
    Domain Specific Languages (DSL) are an important concept used in industry in order to enable the fast and cost-efficient design of specific functions/components, and/or to target particular aspects of systems' development and operation. In the current article, the authors describe their experiences with the integration of the Modelica DSL into a platform that enables the integration and interoperability of model-based tools across the various phases of the system development process. Furthermore, it is illustrated how Matlab Simulink can be used in parallel in the course of the same system design undertaking. Thereby, the authors present their approach, compare the different tools that were used to efficiently complete the integration, and finally exemplify the outcome on a case study related to a self-adaptive dynamic system from the automotive domain.
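    A hypothetical Python sketch of such an integration follows (the adapter interface and command lines are invented for illustration and are not the platform API described in the article): each tool, whether Modelica-based or Simulink-based, is wrapped behind one common adapter interface that the integration platform can invoke uniformly.

      import subprocess
      from abc import ABC, abstractmethod

      class ToolAdapter(ABC):
          """Common interface the integration platform calls for every tool."""
          @abstractmethod
          def simulate(self, model_path: str, stop_time: float) -> str:
              """Run a simulation and return the path of the result file."""

      class ModelicaAdapter(ToolAdapter):
          """Wraps a command-line Modelica tool chain (placeholder command)."""
          def simulate(self, model_path, stop_time):
              subprocess.run(["simulate_modelica", model_path, f"--stopTime={stop_time}"],
                             check=True)  # placeholder; the real call depends on the tool chain
              return model_path.replace(".mo", "_res.csv")

      class SimulinkAdapter(ToolAdapter):
          """Wraps Matlab Simulink, e.g. by invoking MATLAB in batch mode."""
          def simulate(self, model_path, stop_time):
              subprocess.run(["matlab", "-batch", f"sim('{model_path.removesuffix('.slx')}')"],
                             check=True)  # illustrative invocation only
              return model_path.replace(".slx", "_res.mat")

      def run_heterogeneous_simulation(jobs, stop_time=10.0):
          """The platform drives both tools through the same adapter interface."""
          return [adapter.simulate(model, stop_time) for adapter, model in jobs]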
  • Publication
    Approaches to automation and interoperability in systems engineering
    Creating large and complex systems typically involves a number of potentially geographically separated development teams and a number of different tools. These are two important dimensions of complexity that should be considered when planning large system engineering efforts. The importance of these aspects has become increasingly evident, and recent approaches try to handle these issues. Among them are OSLC (Open Services for Lifecycle Collaboration) and ModelBus®. These approaches address the platform aspect of system engineering at complementary levels of abstraction and are a step forward with respect to integration and interoperability challenges. They set de facto standards, which is important to increase efficiency in systems engineering. Based on these considerations, big European initiatives like the ARTEMIS Joint Undertaking projects CRYSTAL and VARIES have started to work on a Reference Technology Platform with the goal of improving interoperability. The talk will present challenges and solutions in large-scale system engineering efforts and focuses on the aspects of collaboration and standardisation.
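    As a minimal illustration of the kind of tool-to-tool access OSLC standardizes (the URL and credentials below are placeholders, not a real service), lifecycle artefacts are addressed as plain HTTP resources with RDF representations:

      import requests

      RESOURCE_URL = "https://alm.example.org/oslc/requirements/42"  # placeholder

      response = requests.get(
          RESOURCE_URL,
          headers={"Accept": "application/rdf+xml", "OSLC-Core-Version": "2.0"},
          auth=("user", "password"),
      )
      response.raise_for_status()
      print(response.headers.get("Content-Type"))
      print(response.text[:200])  # start of the RDF description of the requirement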
  • Publication
    Integrating the Modelica DSL into a platform for model-based tool interoperability
    (2014) ; Tcholtchev, Nikolay ; Wagner, Michael
    Domain Specific Languages (DSL) are an important concept used in industry in order to enable the fast and cost-efficient design of specific functions/components, and/or to target particular aspects of systems' development and operation. In the current paper, we describe our experiences with the integration of the Modelica DSL into a platform that enables the integration and interoperability of model-based tools across the various phases of the system development process. Thereby, we present our approach, compare the different tools that were used to efficiently complete the integration, and finally exemplify the outcome on a case study from the automotive domain.
  • Publication
    Model-based testing in legacy software modernization: An experience report
    (2013) ; Kranz, Marco ; García Flaquer, Ana
    With the advent of cloud computing, more and more vendors strive to modernize legacy applications and deploy them into the cloud. In particular, when the legacy system is still applied in the field, the vendor must ensure a seamless change to the modernized system so as not to lose any economic assets and to keep the business running. As with normal development processes, testing is also indispensable in a modernization process to gain confidence that the modernized system behaves correctly. This paper is an experience report from the FP7 research project REMICS, which deals with model-driven modernization of legacy systems to the cloud. We employed a model-based testing process for safeguarding the correct migration of the modernized system's functionality. As the test modeling language, the UML Testing Profile was applied. The modernized system, called DOME, was one of the case studies contributed by one of the business partners of the project.
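    A small Python sketch of the general model-based testing idea follows (a toy state machine invented for illustration, not the project's UML Testing Profile models): abstract test sequences are derived from a behaviour model of the system under test so that every transition is covered.

      from collections import deque

      # Behaviour model: state -> {stimulus: next_state}; states/stimuli are invented.
      MODEL = {
          "LoggedOut": {"login": "LoggedIn"},
          "LoggedIn":  {"open_case": "CaseOpen", "logout": "LoggedOut"},
          "CaseOpen":  {"close_case": "LoggedIn"},
      }

      def transition_covering_tests(model, initial):
          """Breadth-first search from the initial state; every transition reached
          for the first time yields one abstract test case (a stimulus sequence)."""
          tests, seen = [], set()
          queue = deque([(initial, [])])
          while queue:
              state, path = queue.popleft()
              for stimulus, target in model.get(state, {}).items():
                  if (state, stimulus) in seen:
                      continue
                  seen.add((state, stimulus))
                  tests.append(path + [stimulus])
                  queue.append((target, path + [stimulus]))
          return tests

      for test in transition_covering_tests(MODEL, "LoggedOut"):
          print(" -> ".join(test))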