  • Publication
    Analysis of functional software dependencies through supervised execution
    (2014)
    Jahic, Jasmin
    Knowing the functional interferences between system components is imperative when developing safety-critical systems. In this paper, we describe an approach for reliably detecting functional dependencies of software components on other system entities through supervised execution of software using simulation techniques. Supervised execution enables monitoring of the internal state of system components and therefore permits rapid detection of component behavior changes across simulation contexts. Credibility of the results is achieved through collected test-coverage metrics.
  • Publication
    A pattern-based approach to DSL development
    (2012)
    Schäfer, Christian
    Tool support for the development of domain-specific languages (DSLs) is continuously increasing. This reduces the implementation effort for DSLs and enables the development of rather complex languages within a reasonable amount of time. However, the lack of commonly agreed and applied language-engineering processes often turns DSL development into a set of creative activities whose outcomes depend on the experience of the developers involved. Consequently, the outcomes of language-engineering activities are unpredictable with respect to their quality and are often not maintainable either. We have therefore developed an approach that transfers the concept of architecture and design patterns from software engineering to language development. In this paper, we propose this approach and evaluate its applicability in a case study.
  • Publication
    Extensible and automated model-evaluations with INProVE
    (2011)
    Kemmann, Sören
    Model-based development is gaining more and more importance in the creation of software-intensive embedded systems. One important aspect of software models is model quality. This does not mean functional correctness, but non-functional properties such as maintainability, scalability, and extensibility. Much effort has been put into the development of metrics for control-flow models. In the embedded systems domain, however, domain-specific and data-flow languages are commonly applied for model creation. For these languages, existing metrics are not applicable. Domain- and project-specific quality metrics are therefore defined only informally, and tracking conformance to them is a manual, effort-consuming task. To resolve this situation, we developed INProVE, a model-based framework that supports the definition of quality metrics in an intuitive, yet formal notation and provides automated evaluation of design models through its indicators. Applied to complex models in several industry projects, INProVE has proven its applicability for the quality assessment of data-flow-oriented design models not only in research, but also in practice.