
1 Introduction

Software quality is an essential competitive factor for the success of software companies today. Increasing the software quality levels of software products and services requires adequately integrating quality requirements (QRs) into the software life-cycle. However, QR management is problematic in software development in general [1] and in rapid software development (RSD) in particular [2]. In order to support decision-makers in QR management in RSD, the Q-Rapids (Quality-aware Rapid Software Development) method defines an evidence-based, data-driven, quality-aware rapid software development approach in which QRs are incrementally elicited, refined and improved. Q-Rapids builds upon data gathered from several heterogeneous sources. Data is analyzed and aggregated into quality-related strategic indicators (e.g., customer satisfaction, product quality), which are presented to decision-makers through a highly informative dashboard.

In this paper, we present the current status and first evaluation of the tool support for the Q-Rapids method, which we call the Q-Rapids Tool. Currently, the tool gathers and aggregates data about system quality (e.g., from SonarQube and Jenkins) and process productivity (e.g., from GitLab and Jira) and visualizes it from both current and historical perspectives.

The rest of the paper is structured as follows. Section 2 briefly presents the Q-Rapids method. Section 3 introduces the architecture of the tool and describes each of its modules. Section 4 discusses the evaluation of the first release of the tool performed in the use cases of the Q-Rapids project, and Sect. 5 presents a roadmap for the following releases. Finally, Sect. 6 sketches some conclusions.

2 Q-Rapids Method

Q-Rapids is a data-driven, quality-aware rapid software development method that is being developed in the context of an EU H2020 project with the same name (Footnote 1). In Q-Rapids, quality requirements will be identified from available data and evaluated with respect to selected indicators [3].

Q-Rapids aims to increase software quality through the following goals (see Fig. 1(a)):

  • Gathering and analyzing data from project management tools, software repositories, quality of service and system usage. The analysis of this data makes it possible to assess software quality systematically and continuously using a set of quality-related indicators (e.g., customer satisfaction).

  • Providing decision-makers with a highly informative dashboard to help them make data-driven, requirements-related strategic decisions in rapid cycles.

  • Extending the rapid software development process with the comprehensive integration of quality requirements and their management, in a way that favors software quality and brings a significant productivity increase to the software lifecycle.

Fig. 1. (a) The Q-Rapids method and (b) quality model.

In order to characterize quality-related strategic indicators, we define a quality model based on the Quamoco approach [4]. Quamoco addresses the limitation of traditional software quality models, which provide either abstract quality characteristics or concrete quality measurements, by integrating both aspects. The added value of our quality model is that it enables the aggregation of the gathered raw data into useful company-level strategic indicators rendered in the dashboard. Concretely, metrics are computed from the data gathered from the data sources and aggregated into product/process factors, and these factors are ultimately aggregated into strategic indicators (see Fig. 1(b)). The generic quality model, including the aggregations, used for the Q-Rapids Tool evaluation is reported in [5]. Concrete results of adopting the Q-Rapids method to characterize code quality in one of the Q-Rapids project use cases are reported in [6].
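
To make this aggregation chain concrete, the following minimal Python sketch walks through an invented quality-model fragment; the metric names, factor/indicator structure, and weights are illustrative only, not the model used in the evaluation (which is reported in [5]).

```python
# Illustrative quality-model fragment: names and weights are invented for
# this example; metric values are assumed to be already normalized to [0, 1].
metrics = {"code_complexity": 0.8, "code_duplication": 0.6, "test_success": 0.9}

factors = {
    "code_quality":   [("code_complexity", 0.5), ("code_duplication", 0.5)],
    "testing_status": [("test_success", 1.0)],
}

indicators = {"product_quality": [("code_quality", 0.6), ("testing_status", 0.4)]}

def weighted(children, values):
    """Weighted average of the (already normalized) child values."""
    return sum(values[name] * w for name, w in children) / sum(w for _, w in children)

factor_values = {f: weighted(ch, metrics) for f, ch in factors.items()}
indicator_values = {i: weighted(ch, factor_values) for i, ch in indicators.items()}

print(factor_values)     # code_quality ≈ 0.7, testing_status = 0.9
print(indicator_values)  # product_quality ≈ 0.78
```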

One of the Q-Rapids project outcomes is a software tool supporting the development life-cycle presented in Fig. 1(a) and covering the first two project goals. The Q-Rapids Tool is being developed iteratively and its current version includes the following functionality:

  • Gather information from several data sources.

  • Aggregate the data from the data sources into strategic indicators.

  • Visualize the current assessment of the strategic indicators, allowing decision-makers to analyze the current status of the project.

  • Visualize historical data, allowing decision-makers to perform trend analysis to anticipate risks.

  • Allow decision-makers to drill down through the different levels of data to understand the rationale behind the current status.

3 Q-Rapids Tool

The architecture of the Q-Rapids Tool is depicted in Fig. 2. The components are grouped in two packages: Data Gathering and Analysis and Strategic Decision Making.

Fig. 2. Q-Rapids Tool architecture

The Data Gathering and Analysis package includes three modules grouping the different phases of the data gathering and analysis process. The Data Ingestion module is responsible for gathering the raw data from the different tools (Data Producers). Having this independent module helps us to integrate data from several data providers, making the heterogeneity of the data transparent to the other modules. Once the data is in the system (Distributed Data Sink), the Data Analysis and Processing module is responsible for executing the quality model assessment. This assessment consists of aggregating the gathered data into metrics, product and process factors, and strategic indicators (see Fig. 1b). The Strategic Decision Making package includes the Strategic Dashboard component, which is responsible for the interaction with decision-makers.

The current version (hereafter called the Q-Rapids prototype) was released in December 2017. This prototype was extensively tested, validated, and evaluated against real conditions in software development projects run by the companies providing use cases to the Q-Rapids project (four different evaluation use cases).

Next, we report the status of the two packages of the Q-Rapids tool prototype.

3.1 Data Gathering and Analysis

Data Producers.

The supported heterogeneous sources provide data about static code analysis (e.g., SonarQube), tests executed during development (e.g., Jenkins), code repositories (e.g., SVN, Git, GitLab), and issue tracking tools (e.g., Redmine, GitLab, JIRA, Mantis).

Data Ingestion.

This module consists of several Apache Kafka (Footnote 2) connectors that gather data from the data producers. These connectors query the API of each data producer to ingest the data into Kafka. For instance, the Jira connector reads all the features of each issue (e.g., description, assignee, due date) from the JSON document obtained from the Jira API (Footnote 3). Apache Kafka is a Big Data technology serving as the primary ingestion layer and messaging platform, offering scalability via cluster capabilities. This has been the most challenging module from a technical point of view. The diversity of data producers has been the main challenge, not only because of the number of tools but also because of the different versions of the same tool. We also faced the fact that some tools are used differently in the four companies where the tool has been evaluated, e.g., different metadata for issues.
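
As an illustration of what such a connector does, the sketch below polls the standard Jira REST search endpoint and forwards each issue to a Kafka topic. It is a simplified stand-alone script, not the actual connector implementation: the endpoint URL, credentials, JQL filter, topic name, and the use of the requests and kafka-python libraries are assumptions for the example.

```python
import json

import requests                   # HTTP client for the Jira REST API
from kafka import KafkaProducer   # kafka-python

# Hypothetical endpoint, credentials, JQL filter and topic name.
JIRA_SEARCH = "https://jira.example.com/rest/api/2/search"
TOPIC = "jira.issues"

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

start, page_size = 0, 50
while True:
    resp = requests.get(
        JIRA_SEARCH,
        params={"jql": "project = QR", "startAt": start, "maxResults": page_size},
        auth=("user", "api-token"),
        timeout=30,
    )
    resp.raise_for_status()
    issues = resp.json().get("issues", [])
    if not issues:
        break
    for issue in issues:
        # Each issue document carries the features read by the connector
        # (description, assignee, due date, ...).
        producer.send(TOPIC, issue)
    start += page_size

producer.flush()
```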

Distributed Data Sink.

This module is used for data storage, indexing, and analysis purposes. Both the raw data (i.e., the collected data) and the quality model assessment (i.e., the aggregations) are stored in a search engine, namely Elasticsearch from the Elastic Stack (Footnote 4). This allows us to define four types of indexes: three for the quality model assessment elements (strategic indicators, product and process factors, and metrics) and a fourth for the raw data. Like Apache Kafka, the Elastic Stack offers scalability via cluster capabilities, which is required by the multinational IT company in the Q-Rapids project.
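
A minimal sketch of the sink side, assuming kafka-python and the official Elasticsearch Python client (8.x API): raw documents read from an ingestion topic are indexed for later analysis. The topic and index names are illustrative, not the project's actual naming.

```python
import json

from kafka import KafkaConsumer          # kafka-python
from elasticsearch import Elasticsearch  # official Elasticsearch client (8.x)

es = Elasticsearch("http://localhost:9200")

# Four index types: one per quality-model level plus one for raw data
# (illustrative names).
INDEXES = ["strategic_indicators", "factors", "metrics", "raw_data"]

consumer = KafkaConsumer(
    "jira.issues",                        # topic fed by the ingestion layer
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    # Store each raw document; the assessment results produced by the
    # Data Analysis and Processing module go to the other three indexes.
    es.index(index="raw_data", document=message.value)
```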

Data Analysis and Processing.

This module performs the quality model assessment on the gathered raw data following a bottom-up approach. First, the raw data is used to calculate the metrics, whose values are normalized and interpreted: each metric ranges from 0 to 1, where 0 is the worst and 1 the best value with respect to quality. These values come from a utility function [4], which interprets the raw data according to either expert preferences or learned data. Once the metrics are calculated, they are aggregated into product and process factors, and then into strategic indicators. The aggregations are computed as weighted combinations of the child elements and then stored in the distributed data sink.
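
The sketch below illustrates the normalization step with a piecewise-linear utility function whose breakpoints could be elicited from experts; the example metric and its breakpoints are invented for illustration, and the subsequent weighted aggregation follows the scheme sketched in Sect. 2.

```python
def utility(raw, breakpoints):
    """Map a raw measurement to [0, 1] by linear interpolation between
    (raw_value, utility) breakpoints, e.g. elicited from experts."""
    pts = sorted(breakpoints)
    if raw <= pts[0][0]:
        return pts[0][1]
    if raw >= pts[-1][0]:
        return pts[-1][1]
    for (x0, u0), (x1, u1) in zip(pts, pts[1:]):
        if x0 <= raw <= x1:
            return u0 + (u1 - u0) * (raw - x0) / (x1 - x0)

# Hypothetical metric: duplicated-lines density in percent (lower is better);
# 5% or less is considered ideal, 30% or more unacceptable.
duplication_metric = utility(12.0, [(5.0, 1.0), (15.0, 0.5), (30.0, 0.0)])
print(round(duplication_metric, 2))  # 0.65
```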

3.2 Strategic Decision Making

The Strategic Decision Making package includes the Strategic Dashboard component, which provides the user interface of the Q-Rapids Tool. It is a web application that consumes data from the Distributed Data Sink module.

The main purpose of this component is to provide an easy, attractive, yet informative interface that allows decision-makers to access the different features of the tool. Figure 3 shows the landing page of the Q-Rapids Dashboard.

Fig. 3. Q-Rapids Dashboard landing page: Strategic Indicators View (Color figure online)

The Q-Rapids Dashboard includes four views; for each view, the user can choose between the current assessment and the historical data, displayed either graphically or textually. The four views correspond to:

  • Strategic Indicators View: general strategic indicators status (see Fig. 3).

  • Detailed Strategic Indicators View: for each strategic indicator, the dashboard visualizes the status of the factors affecting the strategic indicator.

  • Factors View: for each factor, the dashboard displays the status of its metrics.

  • Metrics View: the dashboard visualizes the metrics status.
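
To illustrate how such a view can retrieve the current assessment versus the historical data from the Distributed Data Sink, the following sketch queries Elasticsearch with the official Python client (8.x API); the index name, field names, and time window are assumptions for the example.

```python
from datetime import date, timedelta

from elasticsearch import Elasticsearch  # official Elasticsearch client (8.x)

es = Elasticsearch("http://localhost:9200")

# Current assessment: the most recent document of one strategic indicator
# (index and field names are assumptions for this sketch).
current = es.search(
    index="strategic_indicators",
    query={"term": {"name": "product_quality"}},
    sort=[{"evaluationDate": {"order": "desc"}}],
    size=1,
)

# Historical view: the assessments of the last 30 days, oldest first,
# as plotted in the trend charts.
since = (date.today() - timedelta(days=30)).isoformat()
history = es.search(
    index="strategic_indicators",
    query={
        "bool": {
            "must": [{"term": {"name": "product_quality"}}],
            "filter": [{"range": {"evaluationDate": {"gte": since}}}],
        }
    },
    sort=[{"evaluationDate": {"order": "asc"}}],
    size=500,
)
```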

The key feature of this tool is the aggregation of heterogeneous elements. In order to aggregate different kinds of data, the tool works with normalized and interpreted data (see Sect. 3.1). Therefore, the values shown by the tool are in the range 0 to 1, where 0 indicates bad quality and 1 good quality.

Figure 3 visualizes strategic indicators using gauge charts, which provide decision-makers with a quick visual mechanism to identify trouble. A needle in the red zone indicates a potential risk, while the green zone indicates a strength. The black mark on the gauge indicates the target value to reach. Figure 4 shows alternative ways to visualize strategic indicators. From left to right: a radar chart showing all the factors impacting a strategic indicator, a chart showing the historical evolution of the strategic indicators, and a chart showing the evolution of the factors impacting a given indicator.

Fig. 4. Alternative views to visualize strategic indicators

In order to facilitate the analysis and understanding of the strategic indicators assessment, the user can navigate forwards and backwards through the different levels of abstraction in the following order:

Strategic Indicators View ↔ Detailed Strategic Indicators View ↔ Factors View ↔ Metrics View

A complete description of the dashboard functionality is available in the User’s Guide, which can be downloaded from the Q-Rapids project website (downloads section) together with a video tutorial (Footnote 5) of the dashboard.

4 Tool Evaluation

We designed a semi-structured interview to evaluate the Q-Rapids prototype in January 2018. We aimed at understanding, among other aspects, its usability, ease of use, and relevance from the perspective of product owners, and at identifying needs for improvement. We measured usability, ease of use, and relevance using the Likert scales defined in [7, 8]. Each Likert scale includes up to four statements to be rated on a response scale from 1 (strongly disagree) to 5 (strongly agree).

Before each evaluation, we selected one project per industrial partner, configured and installed the Q-Rapids prototype, and collected the corresponding project data for a period of not less than 2 weeks. Then, we performed individual evaluations with eight product owners from the four companies involved in the Q-Rapids project. Each evaluation session included four steps. After explaining the study goals and procedures, we trained each participant in the Q-Rapids prototype using the video mentioned above. Then, we asked the participant to analyze the status of the project’s strategic indicators, quality factors, and metrics using the Q-Rapids prototype. We encouraged the participant to think aloud and mention both positive and negative aspects of the Q-Rapids prototype. Finally, we asked the participant to answer a feedback questionnaire on the usability, ease of use, and relevance of the Q-Rapids prototype.

More than half of the participants (n = 5) considered the Q-Rapids prototype moderately usable (Mdn = 3.25, Mode = 3, Min = 2.5, Max = 5). They perceived the information provided by the Q-Rapids prototype as useful. However, they claimed there is a need to link the strategic indicators, quality factors, and metrics with other information sources (e.g., source code, user stories, and the list of issues) in order to better support the decision-making process. The participants agreed that integrating several data sources is an added value for supporting the decision-making process in their companies. The majority of the participants (n = 7) considered the Q-Rapids prototype easy to use (Mdn = 4, Mode = 4, Min = 3, Max = 5). They suggested that adding functionalities for sorting values and filtering information by time period or project milestone would further increase the ease of use of the Q-Rapids prototype. Furthermore, more than half of the participants (n = 5) considered the Q-Rapids tool relevant (Mdn = 4, Mode = 4, Min = 3, Max = 4). They commented that the prototype has high potential to support closer collaboration between managers and developers.

The evaluation results are only an indication and cannot be generalized, because a convenience sample of participants used the Q-Rapids prototype to solve a few tasks in a controlled environment.

5 Roadmap

There are several tools on the market for aggregating and visualizing data graphically: for example, software quality tools (SonarQube, Black Duck, Bitergia) and Business Intelligence tools providing dashboards (Tableau, Microsoft Power BI, Pentaho) and reports (ReportServer, JasperReports, BIRT). These tools typically require the adopting organization to customize its own visualizations depending on its data. The Q-Rapids method and tool instead address this customization at the level of the data, i.e., by designing the quality model, which is then visualized through a generic dashboard. This gives us the opportunity to add analysis capabilities on top of the quality model. Additionally, we envisage the Q-Rapids Tool as a more powerful tool with specific capabilities to support decision-making in managing quality in rapid software development. The next releases of the tool are planned for August 2018 and August 2019.

The new features planned for the next release are: (1) the use of Bayesian networks [9] to estimate the strategic indicators assessment, (2) what-if analysis techniques, (3) candidate QR suggestions, and (4) collection of data at run-time.

Besides the new features, we will include some improvements suggested by the industrial partners during the evaluation. One of the most demanded improvements has been access to the raw data. We will address this request by allowing decision-makers to drill down to the raw data, giving them the option to perform a deeper analysis and reach the source of a problem.

6 Conclusions

The Q-Rapids Tool is a data-driven tool that allows decision-makers to manage the quality of their products. The Q-Rapids prototype provides four sets of functionality: (1) data gathering from several heterogeneous data source tools: project management (GitLab, Jira, Mantis, Redmine), software repositories (Git, GitLab, SVN), code quality (SonarQube), and continuous integration (Jenkins); (2) calculation and aggregation of data into three levels of abstraction (metrics, product and process factors, and strategic indicators), shaping a quality model that encodes expert preferences or learned data; (3) visualization of the aggregated data (current and historical); and (4) navigation through the aggregated data. The different levels of abstraction in the quality model support decisions at different levels in organizations. The visualization functionalities cover current and historical data, displayed graphically or textually. The historical data supports decision-makers in performing trend analysis to anticipate risks. The dashboard includes drill-down capabilities, making it possible to inspect the behavior of strategic indicators and to visualize the reasons behind a bad assessment (i.e., which metric is affecting it negatively).

The evaluation results of the first Q-Rapids prototype indicate that product owners perceive it as easy to use and relevant. However, strategic indicators, quality factors, and metrics have to be linked with further information (e.g., source code and product backlog) to better support the decision-making process. We plan to evaluate subsequent versions of the Q-Rapids prototype by performing case studies.