
Mailing List - Entries of 2007


Re: [QF-Test] How to manage the Description of smoke tests and their actual implementation

  • Subject: Re: [QF-Test] How to manage the Description of smoke tests and their actual implementation
  • From: Martin Moser <martin.moser@?.de>
  • Date: Mon, 16 Jul 2007 17:13:55 +0200

Hello Stephan,

It seems that nobody wants to share their knowledge with you.

I have visited some customers who integrated QualityCenter with QF-Test
and started QF-Test directly out of QualityCenter, so they also had the
linkage between QF-Test and the test-case description. Some of them used
the direct batch call, others the daemon mode.
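To make the batch-call variant concrete, here is a minimal sketch of how an external tool might assemble such an invocation. The `-batch` and `-report` options are QF-Test command-line options; the suite name and report directory are invented placeholders, not taken from the mail.

```python
def build_qftest_batch_cmd(suite, report_dir):
    # Assemble a QF-Test batch-mode command line. -batch runs without
    # the GUI; -report writes the run report to the given directory.
    # "suite" and "report_dir" are illustrative placeholders.
    return ["qftest", "-batch", "-report", report_dir, suite]

cmd = build_qftest_batch_cmd("smoketests.qft", "reports")
print(" ".join(cmd))
# A test-management tool such as QualityCenter would then execute this
# command, e.g. via subprocess.run(cmd, check=True).
```

The daemon mode mentioned above avoids the startup cost of a fresh QF-Test process per run: QF-Test is started once as a daemon and individual test runs are dispatched against it.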

I personally like the idea of combining data drivers with a modularized
implementation of test cases. So what I do is create very low-level
procedures via Capture&Replay, each performing only a small step in the
GUI. Then I read the test data via a data-driver node from a CSV or
Excel file (an example should be on the mailing list; otherwise contact
me). Some of the test variables contain the names of those low-level
procedures, so in effect I use the test-data source to structure the
test commands as well. The cost of this approach is that you have to
create the low-level procedures and a proper test-data file structure
first; I am investigating some smarter solutions for future versions.
Changes in the UI still have to be applied to the corresponding
procedures later, but with a proper naming scheme you can avoid touching
the data files.
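The dispatch idea above, where a test-data column names the procedure to execute, can be sketched as follows. All procedure names, column names, and data values here are invented for illustration; in QF-Test the "procedures" would be recorded Capture&Replay sequences, not Python lambdas.

```python
import csv
import io

# Invented low-level "procedures"; stand-ins for recorded GUI sequences.
PROCEDURES = {
    "login":        lambda row: "login as " + row["user"],
    "open_account": lambda row: "open account " + row["account"],
}

# Invented test data: the "procedure" column drives which step runs.
CSV_DATA = """procedure,user,account
login,alice,
open_account,,4711
"""

def run_data_driven(csv_text):
    # Iterate over the data rows and dispatch on the procedure column,
    # passing the whole row as parameters to the chosen procedure.
    results = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        proc = PROCEDURES[row["procedure"]]
        results.append(proc(row))
    return results

print(run_data_driven(CSV_DATA))
# → ['login as alice', 'open account 4711']
```

The design pay-off is the one described above: when the UI changes, only the procedure implementations are touched, while the data files keep working as long as the naming scheme is stable.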

Perhaps some other people want to share their approaches, ...


--On Friday, July 13, 2007 07:47:35 +0200 Wiesner Stephan
<stephan.wiesner@?.ch> wrote:

Hi list,
I would like to present how we are planning to implement our smoke-test
process with QF-Test and get some feedback from you. How do you manage
test design, test descriptions, maintenance, and the actual tests?

I work in the banking business. We have product owners, people with
banking knowledge but little technical skill. They are responsible for
the requirements of our products and they write the test cases. They
love Word and Excel, and for the "real" test cases we use Quality
Center.
We execute a lot of smoke tests while developing our products, and as
the software is still under development, we have to change those tests:
fast, often, and often only after the code has changed.
As we have a lot of products and only a small team for test automation,
we do not have deep knowledge of the business needs and cannot follow
all changes. Therefore we need our product owners to talk to us and to
hand us updated smoke tests (preferably in Word or Excel), which we can
then automate or update.

Our smoke tests in QF-Test consist mainly of simple capture-and-replay
tests, which we simply re-record if something changes. That is fast and
simple, but it means the tests themselves are hardly documented. We do,
however, reference the Word document with the textual description by
giving both unique IDs. Combined, the documentation is quite good.
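The unique-ID linkage makes one useful check almost free: comparing the IDs in the description document against the IDs of the recorded tests reveals gaps on either side. A minimal sketch, with invented IDs and no real Word/QF-Test parsing:

```python
def coverage_gaps(documented_ids, automated_ids):
    # Returns a pair: IDs described but not yet automated, and IDs
    # automated but missing from the textual description.
    return (sorted(documented_ids - automated_ids),
            sorted(automated_ids - documented_ids))

# Invented example IDs, standing in for those extracted from the Word
# document and from the recorded QF-Test test cases.
documented = {"SMK-001", "SMK-002", "SMK-003"}
automated = {"SMK-001", "SMK-003"}

print(coverage_gaps(documented, automated))
# → (['SMK-002'], [])
```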

That is how we currently implement our process, and it seems to work.
As we are still in a pilot project, though, I would really appreciate
feedback: how do you make sure that your tests are documented, and how
do you communicate with the providers of your test cases?

Greetings from Berne,
Stephan Wiesner

Testmanager ESTM
Telefon +41 (0)31 551 78 68

RTC Real-Time Center AG
Schwarzenburgstr. 160
3097 Liebefeld


Martin Moser                           martin.moser@?.de
Quality First Software GmbH            http://www.qfs.de
Tulpenstr. 41                          Tel: +49 8171 919874
DE-82538 Geretsried                    Fax: +49 8171 919876
GF: Gregor Schmid, Karlheinz Kellerer  HRB München 14083
