OLE QA Guide - Testing and Documenting Results

 

Testing

This article covers how to execute tests and document the results in the OLETS Jira project, with a focus on testing a new Test Case. If you are retesting a previously failed case, revising the test steps may not be necessary.

Testing Process

  • User Testing consists of the following steps:
    • Revising the Test Case
      • You will need to revisit the Test Case and ensure that the steps, if they exist, adequately match the current functionality of the OLE system.
      • If testing steps do not yet exist, you will need to work out the steps necessary to fulfill the purpose statement of the Test Case.
    • Executing the Steps
      • You will need to execute the testing steps by hand.
    • Providing Feedback
      • Pass or fail the Test Case to advance it through the workflow, and if there were complications or a failure, provide a comment describing the results. In general, a comment on a passed Test Case is helpful, but not strictly necessary.

Determining Success or Failure

The main point on which success or failure hinges during user testing is the fulfillment of the Acceptance Criteria statement, written out as a statement of purpose on the Test Case. The AC statement is meant to give the QA Team and testers a clear, well-defined goal to meet in testing.

There may be cases in which the function is fulfilled, but not in the exact way requested by the Functional Specification. Such details are secondary, but it is sometimes necessary to fail a Test Case on the finer points of functionality. If, for example, information needs to be displayed, but is not displayed in a way that is clear and useful to the users, a Test Case might fulfill its stated purpose yet still fail on those finer points. If, however, the information is displayed in a way that is usable and clear, just not as clear as it could be, the Test Case should be passed, with feedback indicating how it could be improved. That feedback could later become an Enhancement (non-PP) to be reviewed by the Functional Specification teams for possible resolution.

Whether to fail a Test Case over such nuances can be a difficult call. The QA Team encourages you to think in terms of the big picture: does the software accomplish its task in a reasonably useful way? If so, the best course of action is generally to pass the Test Case, but share your insight in a comment so that the issue can be addressed in a later release of the software.

Advance the Test Case in the Workflow

This should always be the first step you take in providing feedback on the outcome of your Test Case. Click either "Test Passed" or "Test Failed" in the menu bar at the top of the Test Case.

A dialog box will open, allowing you to enter the date tested and a comment elaborating on the test results. The heading at the top of the dialog box and the confirmation button in the bottom right corner will both show either "Test Passed" or "Test Failed," depending on your selection. Although it was previously necessary to also select the outcome of the test ("None," "Passed," or "Failed") within this dialog box, this is no longer the case.
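Testers who script bulk updates rather than clicking through the web interface can reach the same workflow step through Jira's standard REST API: a POST to the issue's `transitions` resource performs the transition, and a comment can ride along in the same request. This is a minimal sketch only, not part of the official OLE QA process; the transition id and comment text below are placeholders, and the real ids for "Test Passed" and "Test Failed" in the OLETS workflow would have to be looked up with a GET on the same endpoint.

```python
import json


def transition_payload(transition_id, comment=None):
    """Build the JSON body for POST /rest/api/2/issue/{key}/transitions.

    transition_id identifies the workflow step (e.g. "Test Passed" or
    "Test Failed"); "21" below is purely a placeholder, since the actual
    ids vary by project and must be discovered via GET on this endpoint.
    """
    payload = {"transition": {"id": str(transition_id)}}
    if comment:
        # Jira allows a comment to accompany the transition itself.
        payload["update"] = {"comment": [{"add": {"body": comment}}]}
    return payload


# Example: fail a Test Case with an explanatory comment.
body = transition_payload("21", "FAILED: error message on save; screenshot attached.")
print(json.dumps(body, indent=2))
```

Building the payload separately from sending it keeps the sketch testable without a live Jira server; in practice the dict would be POSTed with your HTTP client of choice.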

Note that there is an "Attachment" option. If you want to add an attachment, you can do so through this dialog box. The different kinds of attachments most used in OLE testing will be discussed in detail a bit later on in this document, with links to more specific instructions on how to create and attach them.

Document Your Findings with a Comment

There are two things to include when writing a comment to document your test results. The first, and most important, is an explicit statement of whether the test passed or failed. Once posted, your comment will appear in a long list of comments on the issue, with no particular connection to the result you selected in the test results dialog box.

If you need to add further information or correct a mistake in your comment, scroll down to your comment at the bottom of the screen, then click the pencil icon that appears when you hover your mouse over it. If you need to add an additional comment at any time (to reply to a discussion that develops on a particular Test Case, for example), use the "Comment" button at the bottom of the screen.
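For completeness, the "Comment" button and the pencil icon correspond to two standard Jira REST calls: a POST to the issue's `comment` resource adds a comment, and a PUT to a specific comment id edits it. The sketch below only assembles the method, URL, and JSON body; the server URL and issue key are hypothetical.

```python
def comment_request(base_url, issue_key, body, comment_id=None):
    """Build the REST call for adding or editing a Jira comment.

    POST .../issue/{key}/comment       adds a new comment ("Comment" button);
    PUT  .../issue/{key}/comment/{id}  edits an existing one (pencil icon).
    """
    url = f"{base_url}/rest/api/2/issue/{issue_key}/comment"
    method = "POST"
    if comment_id is not None:
        url = f"{url}/{comment_id}"
        method = "PUT"
    return method, url, {"body": body}


# Example: state the outcome explicitly, since the comment list does not
# show which workflow result a comment accompanied. Server and issue key
# are placeholders.
method, url, payload = comment_request(
    "https://jira.example.org",
    "OLETS-123",
    "PASSED: acceptance criteria met; display could be clearer (see notes).",
)
```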

The second thing to include is any feedback you have on the particular Test Case. For example, if you feel that a Test Case should pass because it accomplishes exactly what the acceptance criteria specified, but does not do so in a particularly satisfactory way, we really want to hear about it. Kuali software is meant to be designed by its users, and we want to craft the application so that it can be used efficiently and effectively.

Attach a Screenshot of OLE

Sometimes you will want to show a particular error message you have received, or to show where some element is missing from the layout of a particular screen. In these cases, include a screenshot of the OLE application along with your test results.

Detailed instructions are available for taking screenshots and attaching files to Jira issues.
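If you upload attachments by script instead of through the dialog box, Jira's REST API accepts a multipart POST to the issue's `attachments` resource, and requires an `X-Atlassian-Token` header to bypass its XSRF check on file uploads. The sketch below only describes that request; the server URL, issue key, and file name are placeholders, and some Jira versions spell the header value "nocheck" rather than "no-check".

```python
def attachment_request(base_url, issue_key, file_path):
    """Describe the multipart POST that attaches a file to a Jira issue.

    Jira rejects multipart uploads unless the X-Atlassian-Token header
    is present to disable XSRF checking; the file itself is sent in a
    form field named "file".
    """
    return {
        "method": "POST",
        "url": f"{base_url}/rest/api/2/issue/{issue_key}/attachments",
        "headers": {"X-Atlassian-Token": "no-check"},
        "file_field": {"file": file_path},
    }


# Example: attach a screenshot to a (placeholder) Test Case issue.
req = attachment_request("https://jira.example.org", "OLETS-123", "error-screen.png")
```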


Operated as a Community Resource by the Open Library Foundation