
Tuesday, July 26, 2011

Guidelines for Automated Tests

·         Concise: As simple as possible.
·         Self-Checking: Test reports its own results; needs no human interpretation.
·         Repeatable: Test can be run many times in a row without human intervention.
·         Robust: Test produces same result now and forever. Tests are not affected by changes in the external environment.
·         Sufficient: Tests verify all the requirements of the software being tested.
·         Necessary: Everything in each test contributes to the specification of desired behavior.
·         Clear: Every statement is easy to understand.
·         Efficient: Tests run in a reasonable amount of time.
·         Specific: Each test failure points to a specific piece of broken functionality; unit test failures provide "defect triangulation".
·         Independent: Each test can be run by itself or in a suite with an arbitrary set of other tests in any order.
·         Maintainable: Tests should be easy to understand, modify, and extend.
·         Traceable: Each test can be traced to and from the code it tests, and to and from the requirements.
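Several of these qualities are easiest to see in a concrete test. Below is a minimal sketch using Python's built-in unittest framework; the `discount_price` function is invented purely for illustration. Each test is self-checking (the assertions report pass/fail on their own), specific (a failure points at one behavior), and independent (the tests can run in any order):

```python
import unittest

# Hypothetical function under test -- invented for this illustration.
def discount_price(price, percent):
    """Return the price reduced by the given percent, rounded to cents."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100.0), 2)

class DiscountPriceTest(unittest.TestCase):
    # Self-checking: the assertion reports its own result.
    def test_applies_percentage(self):
        self.assertEqual(discount_price(100.0, 25), 75.0)

    # Specific: a failure here points straight at boundary handling.
    def test_zero_percent_leaves_price_unchanged(self):
        self.assertEqual(discount_price(80.0, 0), 80.0)

    # Necessary and clear: one behavior per test, nothing extra.
    def test_rejects_invalid_percent(self):
        with self.assertRaises(ValueError):
            discount_price(50.0, 150)

# Run the suite; each test is independent and can run by itself or in a suite.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountPriceTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```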

The Software Testing Life Cycle – Automation Software Testing

The Software Testing Life Cycle (STLC) is the road map to automation success. It consists of a set of phases that define what testing activities to do and when to do them. It also enables communication and synchronization between the various groups that have input to the overall testing process. Ideally, the STLC parallels the Software Development Life Cycle, coordinating activities and providing the vehicle for a close working relationship between the testing and development departments.
The following is a "meat-and-potatoes" list of names for the phases of the STLC:
  1. Planning
  2. Analysis
  3. Design
  4. Construction
  5. Testing – Initial test cycles, bug fixes and re-testing
  6. Final Testing and Implementation
  7. Post Implementation
Each phase defines five to twenty high-level testing tasks or processes to prepare for and execute both manual and automated testing. A few examples are in order:
  1. Planning
    • Marketing group writes a document defining the product
    • Define problem reporting procedures
    • High level test plan
    • Begin analyzing scope of project
    • Identify acceptance criteria
    • Set up the automated test environment

  2. Analysis
    • Marketing and Development groups work together to write a product requirements document
    • Develop functional validation matrices based on business requirements
    • Identify which test cases make sense to automate
    • Set up a database to track components of the automated testing system, e.g. reusable modules
    • Map baseline data to test cases

  3. Design
    • Development group writes a detailed document defining the architecture of the product
    • Revise test plan and cases based on changes
    • Revisit test cycle matrices and timelines
    • Develop risk assessment criteria (McCabe tools help here)
    • Formalize details of the automated testing system, e.g. file naming conventions and variables
    • Decide if any set of test cases to be automated lends itself to a data-driven/template model
    • Begin scripting automated test cases and building reusable modules
As the STLC is continually refined, it spells out the organization of the testing process: what steps need to be taken, and when, to ensure that when the software is ready to test, both the manual and automated testing systems are in place and ready to go. The idea here is to start early and be ready to respond to change.
One of the biggest reasons automation fails is a lack of preparation early in the process. This is due in part to a lack of understanding of what needs to be done and when. The steps are not difficult; it is just a matter of understanding how the STLC works. It does not take any more time and effort to succeed than it does to fail.
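The data-driven/template model mentioned in the Design phase can be sketched simply: test data lives in a table, one row per case, and a single generic script walks the rows. New cases are then added by appending rows, not by writing new scripts. A minimal Python illustration (the username rules and data here are invented for the example):

```python
# Data-driven testing sketch: test data is separate from test logic.

# Hypothetical function under test.
def validate_username(name):
    """A username is valid if it is 3-12 alphanumeric characters."""
    return name.isalnum() and 3 <= len(name) <= 12

# The "template": one row per case -> (input, expected result, description).
TEST_DATA = [
    ("alice",     True,  "simple valid name"),
    ("ab",        False, "too short"),
    ("a" * 13,    False, "too long"),
    ("bob_smith", False, "underscore not allowed"),
    ("User123",   True,  "mixed letters and digits"),
]

def run_data_driven_suite(data):
    """Generic driver: executes every row and collects any failures."""
    failures = []
    for value, expected, description in data:
        actual = validate_username(value)
        if actual != expected:
            failures.append((description, value, expected, actual))
    return failures

failures = run_data_driven_suite(TEST_DATA)
print("FAILED:" if failures else "ALL PASSED", failures)
```

The same driver serves every future case; only the data table grows as new requirements arrive.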

Advantages of Automation Software Testing

·         Reliable
·         Repeatable
·         Comprehensive
·         Reusable
·         Better Quality Software
·         Fast
·         Cost Reduction
·         Saves Time and Money
·         Improves Accuracy
·         Increases Test Coverage
·         Does What Manual Testing Cannot
·         Helps Developers and Testers
·         Improves Team Morale

Manual Testing and Automation Testing

Manual software testing is performed by a human sitting in front of a computer carefully going through application screens, trying various usage and input combinations, comparing the results to the expected behavior and recording their observations. Manual tests are repeated often during development cycles for source code changes and other situations like multiple operating environments and hardware configurations.

An automated software testing tool is able to play back pre-recorded and predefined actions, compare the results to the expected behavior, and report the success or failure of these tests to a test engineer. Once automated tests are created they can easily be repeated, and they can be extended to perform tasks impossible with manual testing.
 
Practical features of automated software testing systems:
  • Run all day and night in unattended mode
  • System continues running even if a test case fails
  • Keep the automated system up and running at all costs
  • Recognize the difference between hard and soft errors
  • Write out meaningful logs
  • One point maintenance
  • Easy to update reusable modules
  • Store text strings in variables so they are easy to find and update
  • Written in an English-like language easy to understand
  • Automate the most important business functions first
  • Quickly add scripts and modules to the system for new features
  • Don’t waste time with very complex features, keep it simple
  • Collect other useful information such as operating system and CASE tool messages
  • Track components of the automated testing system in a database
  • Track reusable modules to prevent redundancy
  • Carefully test the testing system!
  • Keep track of the test coverage provided by automated test suites
  • Track which test cases are automated and which are manual
  • Use same architecture for Web or GUI based application testing
  • Make sure baseline data is defined and a process is in place to refresh it
  • Keep test environment clean and up-to-date
  • Test case management - store test cases in a database for maintenance purposes
  • Track tests that pass, as well as tests that fail

Tuesday, July 19, 2011

Tips to write a good bug report

Report the problem immediately:

If you find a bug while testing, do not wait to write a detailed bug report later; write it immediately. This ensures a good, reproducible bug report. If you decide to write the report later, chances are high that you will miss important steps.

Reproduce the bug three times before writing the bug report:

Your bug should be reproducible. Make sure your steps are robust enough to reproduce the bug without any ambiguity. If the bug is not reproducible every time, you can still file it, mentioning its intermittent nature.

Test the same bug occurrence in other similar modules:

Sometimes developers use the same code for different, similar modules, so chances are high that a bug in one module will occur in other similar modules as well. You can even try to find a more severe version of the bug you found.

Write a good bug summary:

The bug summary helps developers quickly analyze the nature of the bug. A poor-quality report unnecessarily increases development and testing time, so communicate well through your bug report summary. Keep in mind that the summary is used as a reference when searching for the bug in the bug inventory.
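As a concrete illustration, compare two summaries for the same hypothetical defect (both invented for this example):

```text
Poor: "Save doesn't work"
Good: "Customer form: clicking Save with a blank phone field silently
       discards the entered data instead of showing a validation message"
```

The second names the screen, the trigger, and the observed versus expected behavior, and can later be found by searching for "Save", "phone", or "validation".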

Read the bug report before hitting the Submit button:

Read all the sentences, wording, and steps used in the bug report. See if any sentence creates ambiguity that could lead to misinterpretation. Misleading words or sentences should be avoided so the bug report stays clear.

Do not use abusive language:

It’s nice that you did good work and found a bug, but do not use this credit to criticize the developer or attack any individual.