
Mailing List - Entries of 2012



Re: [QF-Test] How to perform QF-Test suite (formal) reviews? - Looking at test coverage...


  • Subject: Re: [QF-Test] How to perform QF-Test suite (formal) reviews? - Looking at test coverage...
  • From: "Berg, Klaus-Peter" <klaus-peter.berg@?.com>
  • Date: Wed, 1 Feb 2012 10:24:20 +0100

Hi all,
 
When it comes to unit testing, we have to look at test coverage too, as part of every test review.
Here are two ideas:
 
First of all, I think "unit testing" is not the domain of QF-Test; it is best suited to end-user acceptance testing or integration testing.
However, QF-Test suites can *trigger* the execution of specific parts of the SUT code, so this amounts only to *indirect* "unit testing".
Still, we can measure which parts of the code are executed, and in doing so we can measure at least "line coverage", just as with pure Java unit tests (e.g., JUnit or TestNG together with EMMA, JCoverage, Cobertura, Clover, etc.).
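To sketch that first idea: on-the-fly coverage tools instrument the SUT's JVM, so collecting line coverage during a QF-Test run mostly comes down to starting the SUT under the coverage tool and then letting QF-Test drive the UI. A minimal, hypothetical launch command using EMMA's emmarun mode (my-sut.jar is a placeholder for your actual application jar, not a real artifact):

```shell
# Hypothetical example: run the SUT under EMMA's on-the-fly instrumentation.
# QF-Test then attaches to and drives the running application as usual;
# EMMA writes its coverage report when the SUT exits.
java -cp emma.jar emmarun -jar my-sut.jar
```

The same pattern applies to the other tools mentioned above; only the way the agent or instrumenter is attached to the SUT's JVM differs.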
 
The second thing I see is measuring "requirements coverage": an indication of how much of the requirements are covered by test cases written with QF-Test.
This process would require a "tool" (or a visual review) that can map test specs (e.g., written down in Quality Center, Test Director, etc.) that express requirements to the corresponding QF-Test test cases. In this environment, "coverage" is the degree to which test specs are linked to concrete QF-Test test cases across all available QF-Test suites.
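Such a mapping tool could be quite simple in principle. A minimal sketch, assuming requirement IDs have been exported from the requirements tool and the links to QF-Test test cases are maintained somewhere machine-readable (all IDs and test-case names below are made up for illustration):

```java
import java.util.*;

public class RequirementsCoverage {
    public static void main(String[] args) {
        // Hypothetical requirement IDs, e.g., exported from Quality Center.
        List<String> requirements =
            Arrays.asList("REQ-1", "REQ-2", "REQ-3", "REQ-4");

        // Hypothetical mapping: requirement ID -> linked QF-Test test cases
        // (collected, say, from a naming convention or test-case comments).
        Map<String, List<String>> linked = new HashMap<String, List<String>>();
        linked.put("REQ-1", Arrays.asList("suiteA.Login"));
        linked.put("REQ-3", Arrays.asList("suiteB.Checkout", "suiteB.CheckoutError"));

        // Count requirements that have at least one linked test case.
        int covered = 0;
        for (String req : requirements) {
            if (linked.containsKey(req) && !linked.get(req).isEmpty()) {
                covered++;
            }
        }
        System.out.println("Requirements coverage: " + covered + "/"
            + requirements.size());
    }
}
```

Running this prints "Requirements coverage: 2/4"; the real work, of course, lies in keeping the requirement-to-test-case links up to date.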
 
Best regards,
Klaus Peter Berg