
Live Aggregate Test Results In Grafana

  • Writer: John Chitiris
  • Dec 18, 2020
  • 3 min read

Quite some time ago, we decided as a QA Automation Team to split our regression automation suites per business area. This move gave us two main benefits:

  1. Better organization of the tests per suite, based on the business areas of the product

  2. Flexibility to run a specific set of tests when time is limited, or when there is an ongoing hotfix or an update that affects a specific area of the product.


Problems to Solve

  1. With the regression automation suites divided per business area and running in our CI tool (Jenkins), there was no way to aggregate the test results (pass, fail, skip) across all suites.

  2. Finding a way to see live results while a suite is running, instead of waiting hours for it to finish and produce a report, was also a challenge for us.


Solution


This is when I realized that we needed a common dashboard where we would be able to see a results summary of all test steps across suites in one place.


In order to implement it, there were two things that needed to be done:

  1. Save the results in the DB (InfluxDB) after each test step execution - live results

  2. Connect InfluxDB with Grafana
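The second step needs no code: in Grafana you add InfluxDB as a data source pointing at the same database the tests write into, and the dashboard queries shown later in this post run against that data source.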


Prerequisite


The TestNG framework is used for test execution, ReportNG provides the @KnownDefect implementation, and the test classes you want to run are included in a testng.xml file.

The @KnownDefect annotation is used to point out a bug that exists from a previous release and makes the test step fail.


Tools


Why InfluxDB, you may wonder? Because this DB is designed to store large volumes of time-series data and, at the same time, perform real-time analysis quickly - which is exactly what I needed.

Grafana, on the other hand, has become one of the most popular technologies for composing observability dashboards, and it gives me a clear way to present my data.


Structure - Model


The next step in the process is to design the model. The DTO (Data Transfer Object) is designed with the following fields:

  • Test Case name

  • Test Step name

  • Test Step Status

  • Suite Name

  • Project Name

  • Known Defect (if one exists, annotated on top of the TestNG @Test annotation)

  • Description (TestNG - @Test description)


That was my model; now I needed to figure out a way to get all the above info and save it in the DB. The way to achieve this was to create a TestNG listener - a service-like implementation that gets the needed model data from the ITestResult interface of the TestNG framework.
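For illustration, a minimal sketch of such a DTO could look like the following (the class and field names here are my own assumptions; the actual model class is not shown in this post):

public class TestStepResult {
    private String testCaseName;   // Test Case name
    private String testStepName;   // Test Step name
    private String testStepStatus; // PASS / FAIL / SKIPPED / KNOWN
    private String suiteName;      // Suite name
    private String projectName;    // Project name
    private String knownDefect;    // Known-defect link, if annotated with @KnownDefect
    private String description;    // TestNG @Test description

    // getters/setters omitted for brevity
}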


Example

  1. Create a class (named ResultSender) that is responsible for sending the data to the DB, into the results measurement (table).

  2. Then, create a TestNG listener (named GrafanaListener) that is invoked during the execution and actually saves the data in the DB upon each step execution.


import org.influxdb.InfluxDB;
import org.influxdb.InfluxDBFactory;
import org.influxdb.dto.Point;

public class ResultSender {
    // Single shared connection to InfluxDB, created once for the whole run.
    private static final InfluxDB INFLUXDB = InfluxDBFactory.connect("<INFLUXDB_URL>", "<INFLUXDB_USERNAME>", "<INFLUXDB_PWD>");
    private static final String DATABASE = "<DB_NAME>";

    static {
        INFLUXDB.setDatabase(DATABASE);
    }

    // Writes a single data point (one test step result) to the database.
    static void send(final Point point) {
        INFLUXDB.write(point);
    }
}
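A note on the design: the connection is deliberately created once, in a static initializer, and reused for every write, so each test step costs only a single write call. If one point per step ever becomes too chatty, the influxdb-java client also supports batching writes via its enableBatch option.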

import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.TimeUnit;

import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang3.StringUtils;
import org.influxdb.dto.Point;
import org.testng.ISuite;
import org.testng.ISuiteListener;
import org.testng.ITestContext;
import org.testng.ITestListener;
import org.testng.ITestResult;
// @KnownDefect comes from the ReportNG setup mentioned in the prerequisites

@Slf4j
public class GrafanaListener implements ITestListener, ISuiteListener {
    static final String PASS = "PASS";
    static final String FAIL = "FAIL";
    static final String SKIPPED = "SKIPPED";
    static final String KNOWN = "KNOWN";
    private String project;

    // Maps "<TestClass>_<method>" to the known-defect description (e.g. a Jira link).
    private final Map<String, String> known = new HashMap<>();

    public void onStart(ISuite suite) {
        project = findProjectName(suite);
    }

    public void onTestStart(ITestResult iTestResult) {
    }

    public void onTestSuccess(ITestResult iTestResult) {
        this.sendTestMethodStatus(iTestResult, PASS, null);
    }

    public void onTestFailure(ITestResult iTestResult) {
        // A failing step annotated with @KnownDefect is reported as KNOWN instead of FAIL.
        String key = getTestName(iTestResult) + "_" + iTestResult.getName();
        String status = known.containsKey(key) ? KNOWN : FAIL;
        this.sendTestMethodStatus(iTestResult, status, known.get(key));
    }

    public void onTestSkipped(ITestResult iTestResult) {
        this.sendTestMethodStatus(iTestResult, SKIPPED, null);
    }

    public void onTestFailedButWithinSuccessPercentage(ITestResult iTestResult) {
    }

    public void onFinish(ISuite suite) {
    }

    public void onStart(ITestContext iTestContext) {
        // Collect every @KnownDefect description up front, keyed by "<TestClass>_<method>".
        Arrays.stream(iTestContext.getAllTestMethods()).forEach(it -> {
            KnownDefect defect = it.getMethod().getAnnotation(KnownDefect.class);
            if (defect != null) {
                known.put(simpleName(it.getTestClass().getName()) + "_" + it.getMethod().getName(),
                        defect.description());
            }
        });
    }

    public void onFinish(ITestContext iTestContext) {
    }

    private void sendTestMethodStatus(ITestResult iTestResult, String status, String jira) {
        String bug = (!StringUtils.isEmpty(jira)) ? jira : "No";
        Point point = Point.measurement("results")
                .time(System.currentTimeMillis(), TimeUnit.MILLISECONDS)
                .tag("test", getTestName(iTestResult))
                .tag("result", status)
                .tag("suite", findSuiteNameFromPackage(iTestResult.getTestClass().getName()))
                .tag("project", project)
                .addField("name", iTestResult.getName())
                .addField("bug", bug) // known-defect link, or "No"
                .addField("description", iTestResult.getMethod().getDescription())
                .addField("duration", (iTestResult.getEndMillis() - iTestResult.getStartMillis()))
                .build();
        ResultSender.send(point);
    }

    private String getTestName(ITestResult iTestResult) {
        return simpleName(iTestResult.getTestClass().getName());
    }

    // Strips the package from a fully qualified class name.
    private String simpleName(String className) {
        return className.substring(className.lastIndexOf('.') + 1);
    }

    private String findSuiteNameFromPackage(String packageName) {
        // Implementation to find the suite name via the testng.xml
        return "<Suite_Name>";
    }

    private String findProjectName(ISuite suite) {
        // Implementation to find the project name via the testng.xml
        return "<Project_Name>";
    }
}

An automated TestNG-based test with the @KnownDefect annotation looks like this:


import org.testng.annotations.Test;
// @KnownDefect comes from the ReportNG setup mentioned in the prerequisites

public class TestDemo {

    @Test(alwaysRun = true, description = "Verify")
    public void testStep_1() {
    }

    // This step fails due to an already reported bug, so the listener reports it as KNOWN.
    @KnownDefect(description = "https://<JIRA_URL>/browse/XXX")
    @Test(alwaysRun = true, description = "Verify 2", dependsOnMethods = "testStep_1")
    public void testStep_2() {
    }
}
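For the listener to fire, it also has to be registered with TestNG. A minimal testng.xml sketch could look like this (the package names below are placeholders; adjust them to wherever GrafanaListener and your test classes actually live):

<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Regression">
    <listeners>
        <listener class-name="com.example.listeners.GrafanaListener"/>
    </listeners>
    <test name="Demo">
        <classes>
            <class name="com.example.tests.TestDemo"/>
        </classes>
    </test>
</suite>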

Finally, I created a dashboard in Grafana and used this InfluxDB query to get all the results (counting the name field, since InfluxQL aggregations work on fields, while the status lives in the result tag):


SELECT count("name") FROM "results" WHERE ("project" = '<YOUR_PROJECT>') GROUP BY "result"

Of course, you can write a corresponding query to also get results per suite.
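For example, grouping by the suite tag as well:

SELECT count("name") FROM "results" WHERE ("project" = '<YOUR_PROJECT>') GROUP BY "suite", "result"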

As a result, the Grafana dashboard now looks like this:



