1 TQS - Teste e Qualidade de Software (Software Testing and Quality)
Software Inspections, Reviews and Static Analysis
João Pascoal Faria
[email protected] www.fe.up.pt/~jpf

2 Index
Introduction
Software inspections
Personal design and code reviews according to the PSP
Automated static analysis techniques
More checklists
References and further reading

3 Introduction

4 Reviews along the software lifecycle
[V-model diagram: each specify/design/code phase (requirements, design, code, unit/integration/system/acceptance test plans and test cases) is paired with a review or audit, performed before the corresponding tests are executed] (source: I. Burnstein, pg. 15)

5 Types of reviews
Target / Review Item (What): requirements review, design review, code review, user documentation review; [project management | configuration management | QA | V&V | test | ...] [plan | report] reviews (not the focus here)
Formality (How and Who): self review, peer review, walkthrough, inspection, audit
Purpose / Goals (Why): detect errors and problems; check conformity with specification and fitness for purpose (V&V); check quality attributes and detect quality faults (QA); check adherence to standards; check progress (not the focus here)

6 Types of reviews by formality (1)
Self review (desk-check) - Informal review performed by the author (SWEBOK)
Walkthrough - The designer or programmer leads members of the development team and other interested parties through a software product; participants ask questions and make comments about possible errors, violations of development standards, and other problems (IEEE Std for Software Reviews and Audits)
Peer review - "I show you mine and you show me yours" (SWEBOK)

7 Types of reviews by formality (2)
Inspection - A visual examination of a software product to detect and identify anomalies, including errors and deviations from standards and specifications (…) Peer examination led by impartial facilitators who are trained in inspection techniques (IEEE Std for Software Reviews and Audits)
Audit - An independent examination of a software product, software process, or set of software processes to assess compliance with specifications, standards, contractual agreements, or other criteria (IEEE Std for Software Reviews and Audits)

8 Reviews versus testing
A software system is more than the code: it is a set of related artifacts, any of which may contain defects or problem areas that should be reworked or removed; the quality-related attributes of these artifacts should be evaluated
Reviews allow us to detect and eliminate errors/defects early in the software life cycle (even before any code is available for testing), when they are less costly to repair
Most problems have their origin in requirements and design; requirements and design artifacts can be reviewed but not executed and tested
Early prototyping is equally important to reveal problems in requirements and high-level architectural design
A code review usually reveals the location of a bug directly, while testing requires a debugging step to locate the origin of a bug
Adherence to coding standards cannot be checked by testing

9 Self versus independent reviews
Both important!
Self reviews and tests: performed by the author; should find most of the defects; more efficient in discovering and removing defects; demonstrate professionalism and respect for "those who come next"
Independent reviews and tests: performed by peers or specialized personnel; independence and distance allow finding other defects

10 Efficacy of defect removal techniques (defect detection and removal)
[chart comparing the efficacy of self and independent defect removal techniques] Source: Xerox data (using PSP), Personal Software Process course materials, SEI, 2006

11 Technical versus management reviews
Technical reviews - examine work products of the software project (code, requirements specifications, software design documents, test documentation, user documentation, installation procedures) for V&V and QA purposes; different degrees of formality; covered here
Management reviews - determine the adequacy of plans, schedules and requirements, and monitor progress or inconsistencies against them; may be exercised on plans and reports of many types (risk management plans, project management plans, software configuration management plans, audit reports, progress reports, V&V reports, etc.); not covered here

12 Software inspections

13 Definitions of software inspection
A visual examination of a software product to detect and identify software anomalies, including errors and deviations from standards and specifications (…) Inspections are peer examinations led by impartial facilitators who are trained in inspection techniques [IEEE Std for Software Reviews and Audits]
A formal evaluation technique in which software requirements, design, or code are examined in detail by a person or group other than the author to detect faults, violations of development standards, and other problems [SWEBOK]
Inspection (in software engineering) refers to peer review of any work product by trained individuals who look for defects using a well-defined process. An inspection might also be referred to as a Fagan inspection, after Michael Fagan, the inventor of the process. [http://en.wikipedia.org/wiki/Software_inspection]

14 Roles, responsibilities and attendance
[table of inspection roles and responsibilities, including a facilitator (or moderator) and a presenter (may be the author or an "advocate")] (source: I. Burnstein)

15 Components of a formal review plan
Review goals
Items being reviewed
Preconditions for the review
Roles, team size, participants
Training requirements
Review steps and procedures
Checklists and other related documents to be distributed to participants
Time requirements
Nature of the review log and summary report
Rework and follow-up criteria and procedures
(source: I. Burnstein)

16 Components of a formal review report
Checklist with all items checked and commented
List of defects found (w/ description, type, frequency, location, severity, …)
List of attendees
Review metrics: time & effort (# participants, meeting duration, total preparation time for the review team); size (size of the item being reviewed, usually LOC or number of pages); defects (number of defects found); ratios (Defects/Time, Defects/Size, Size/Time)
Status of the reviewed item: Accept (item accepted in its present form or with minor rework; no further verification); Conditional accept (item needs rework, to be checked and verified by the moderator); Re-inspect (considerable rework must be done; the inspection needs to be repeated)
Estimate of rework effort and date for completion
Signatures and date

17 Personal design and code reviews according to the PSP

18 What is the PSP?
A methodology developed by Watts Humphrey at the Software Engineering Institute (SEI) in 1993, and subsequently enhanced, with the purpose of helping individual software engineers deliver quality products on predictable schedules (source: Philip Miller, An SEI Process Improvement Path to Software Quality, QUATIC 2007)

19 Quality and productivity impacts
"System test time is cut to the bone" (source: Philip Miller, An SEI Process Improvement Path to Software Quality, QUATIC 2007)

20 Basic PSP principles

21 Basic PSP Process (*)
1. Planning: produce a plan to do the work (PLAN)
2. Development: perform the work
   a. design the program (DLD)
   b. review the design (DLDR)
   c. code the program (CODE)
   d. review the code (CR)
   e. compile and fix all defects (COMPILE)
   f. test the program and fix all defects (UT)
3. Postmortem: compare actual performance against the plan, record process data, produce a summary report, and document all ideas for process improvement (PM)
(*) For software development at the module level

22 PSP Design Review Script (1/2)
Purpose - To guide you in reviewing detailed designs
Entry Criteria - Completed program design documented with the PSP Design templates; Design Review checklist (see example later); design standard (note: based on sound design principles); Defect Type standard (see example later); Time and Defect Recording logs (see example later)
General - Where the design was previously verified, check that the analyses covered all of the design, were updated for all design changes, are correct, and are clear and complete
Steps - See next slide
Exit Criteria - A fully reviewed detailed design; one or more Design Review checklists for every design reviewed; documented design analysis results; all identified defects fixed and all fixes checked; completed Time and Defect Recording logs

23 PSP Design Review Script (2/2)
Step 1 - Preparation: Examine the program and checklist and decide on a review strategy. Examine the program to identify its state machines, internal loops, and variable and system limits. Use a trace table or other analytical method to verify the correctness of the design.
Step 2 - Review: Follow the Design Review checklist. Review the entire program for each checklist category; do not try to review for more than one category at a time! Check off each item as you complete it. Complete a separate checklist for each product or product segment reviewed.
Step 3 - Fix Check: Check each defect fix for correctness. Re-review all changes. Record any fix defects as new defects and, where you know the defective defect number, enter it in the fix defect space.

24 PSP Code Review Script (1/2)
Purpose - To guide you in reviewing programs
Entry Criteria - A completed and reviewed program design; source program listing; Code Review checklist (see example later); coding standard (see example later); Defect Type standard; Time and Defect Recording logs
General - Do the code review with a source-code listing; do not review on the screen!
Steps - See next slide
Exit Criteria - A fully reviewed source program; one or more Code Review checklists for every program reviewed; all identified defects fixed; completed Time and Defect Recording logs

25 PSP Code Review Script (2/2)
Step 1 - Review: Follow the Code Review checklist. Review the entire program for each checklist category; do not try to review for more than one category at a time! Check off each item as it is completed. For multiple procedures or programs, complete a separate checklist for each.
Step 2 - Correct: Correct all defects. If the correction cannot be completed, abort the review and return to the prior process phase. To facilitate defect analysis, record all of the data specified in the Defect Recording log instructions for every defect.
Step 3 - Check: Check each defect fix for correctness. Re-review all design changes. Record any fix defects as new defects and, where you know the number of the defect with the incorrect fix, enter it in the fix defect space.

26 Example Design Review Checklist (1)
Engineer: ______ Date: ___ Program: ______ Language: ________
Complete - Verify that the design covers all of the applicable requirements: all specified outputs are produced; all needed inputs are furnished; all required includes are stated.
External Limits - Where the design assumes or relies upon external limits, determine if behavior is correct at nominal values, at limits, and beyond limits.
Logic - Use a trace table, mathematical proof, or similar method to verify the logic. Verify that program sequencing is proper: stacks, lists, and so on are in the proper order; recursion unwinds properly. Verify that all loops are properly initiated, incremented, and terminated. Examine each conditional statement and verify all cases.
State Analysis - For each state machine, verify that the state transitions are all complete and orthogonal.
Internal Limits - Where the design assumes or relies upon internal limits, determine if behavior is correct at nominal values, at limits, and beyond limits.

27 Example Design Review Checklist (2)
Special Cases - Check all special cases: ensure proper operation with empty, full, minimum, maximum, negative, and zero values for all variables; protect against out-of-limits, overflow, and underflow conditions; ensure "impossible" conditions are absolutely impossible; handle all possible incorrect or error conditions.
Functional Use - Verify that all functions, procedures, or methods are fully understood and properly used. Verify that all externally referenced abstractions are precisely defined.
System Considerations - Verify that the program does not cause system limits to be exceeded. Verify that all security-sensitive data are from trusted sources. Verify that all safety conditions conform to the safety specifications.
Names - Verify that all special names are clear, defined, and authenticated; that the scopes of all variables and parameters are self-evident or defined; and that all named items are used within their declared scopes.
Standards - Ensure that the design conforms to all applicable design standards.

28 Example C++ Coding Standard (1)
Purpose - To guide the implementation of C++ programs
Program Headers - Begin all programs with a descriptive header.
Header Format:
/******************************************************************/
/* Program Assignment: the program number */
/* Name: your name */
/* Date: the date you started developing the program */
/* Description: a short description of the program and what it does */
Listing Contents - Provide a summary of the listing contents.
Contents Example:
/* Listing Contents: */
/*   Reuse instructions */
/*   Modification instructions */
/*   Compilation instructions */
/*   Includes */
/*   Class declarations: */
/*     CData */
/*     ASet */
/*   Source code in c:/classes/CData.cpp: */
/*     CData() */
/*     Empty() */

29 Example C++ Coding Standard (2)
Reuse Instructions - Describe how the program is used: declaration format, parameter values, types, and formats. Provide warnings of illegal values, overflow conditions, or other conditions that could potentially result in improper operation.
Reuse Instruction Example:
/******************************************************************/
/* Reuse instructions */
/* int PrintLine(char *line_of_character) */
/* Purpose: to print string 'line_of_character' on one print line */
/* Limitations: the line length must not exceed LINE_LENGTH */
/* Return 0 if printer not ready to print, else */
Identifiers - Use descriptive names for all variables, function names, constants, and other identifiers. Avoid abbreviations and single-letter variables.
Identifier Example:
int number_of_students; /* This is GOOD */
float x4, j, ftave;     /* This is BAD */
Comments - Document the code so the reader can understand its operation. Comments should explain both the purpose and behavior of the code. Comment variable declarations to indicate their purpose.
Good Comment: if (record_count > limit) /* have all records been processed? */
Bad Comment:  if (record_count > limit) /* check if record count exceeds limit */
Major Sections - Precede each major program section with a block comment that describes the processing done in that section.

30 Example C++ Coding Standard (3)
Section Comment Example:
/******************************************************************/
/* This program section examines the contents of the array 'grades' */
/* and calculates the average class grade */
Blank Spaces - Write programs with sufficient spacing so they do not appear crowded. Separate every program construct with at least one space.
Indenting - Indent each brace level from the preceding level. Open and close braces should be on lines by themselves and aligned.
Indenting Example:
while (miss_distance > threshold)
{
    success_code = move_robot(target_location);
    if (success_code == MOVE_FAILED)
        printf("The robot move has failed.\n");
}
Capitalization - Capitalize all defines. Lowercase all other identifiers and reserved words. User messages may use mixed case to make them readable.
Capitalization Examples:
#define DEFAULT_NUMBER_OF_STUDENTS 15
int class_size = DEFAULT_NUMBER_OF_STUDENTS;

31 Example C++ Code Review Checklist (1)
Engineer: ______ Date: ___ Program: ______ Language: ________
Complete - Verify that the code covers all of the design.
Includes - Verify that the includes are complete.
Initialization - Check variable and parameter initialization: at program initiation; at the start of every loop; at class/function/procedure entry.
Calls - Check function call formats: pointers; parameters; use of '&'.
Names - Check name spelling and use: is it consistent? Is it within the declared scope? Do all structures and classes use '.' reference?
Strings - Check that all strings are identified by pointers and terminated by NULL.

32 Example C++ Code Review Checklist (2)
Pointers - Check that all pointers are initialized NULL; are deleted only after new; and that new pointers are always deleted after use.
Output Format - Check the output format: line stepping is proper; spacing is proper.
() Pairs - Ensure that parentheses are proper and matched.
Logic Operators - Verify the proper use of ==, =, ||, and so on. Check every logic function for parentheses.
Line-by-Line Check - Check every line of code for instruction syntax and proper punctuation.
Standards - Ensure that the code conforms to the coding standards.
File Open and Close - Verify that all files are properly declared, opened, and closed.
Should be continuously improved based on personal defect data!

33 Time Recording Log
Engineer: ______ Date: ___ Program: ______ Language: ________
Columns: Project | Phase | Start Date and Time | Int. Time | Stop Date and Time | Delta Time | Comments
Purpose - Use this form to record the time you spend on each project activity. For the PSP, phases often have only one activity; larger projects usually have multiple activities in a single process phase. These data are used to complete the Project Plan Summary. Keep separate logs for each program.
General - Record all of the time you spend on the project. Record the time in minutes. Be as accurate as possible. If you need additional space, use another copy of the form. If you forget to record the starting, stopping, or interruption time for an activity, promptly enter your best estimate.

34 Time Recording Log Instructions
Project - Enter the program name or number.
Phase - Enter the name of the phase for the activity you worked on, e.g. Planning, Design, Test.
Start Date and Time - Enter the date and time when you start working on a process activity.
Interruption Time - Record any interruption time that was not spent on the process activity. If you have several interruptions, enter their total time. You may enter the reason for the interruption in Comments.
Stop Date and Time - Enter the date and time when you stop working on that process activity.
Delta Time - Enter the clock time you actually spent working on the process activity, less the interruption time.
Comments - Enter any other pertinent comments that might later remind you of any unusual circumstances regarding this activity.

35 Defect Recording Log
Engineer: ______ Date: ___ Program: ______ Language: ________
Columns: Project | Date | Number | Type | Inject | Remove | Fix Time | Fix Ref. | Description
Purpose - Use this form to hold data on the defects that you find and correct. These data are used to complete the Project Plan Summary form.
General - Record each defect separately and completely. If you need additional space, use another copy of the form.

36 Defect Recording Log Instructions
Project - Give each program a different name or number. For example, record test program defects against the test program.
Date - Enter the date on which you found the defect.
Number - Enter the defect number. For each program or module, use a sequential number starting with 1 (or 001, etc.).
Type - Enter the defect type from the defect type standard (see next slide). Use your best judgment in selecting which type applies.
Inject - Enter the phase in which this defect was injected. Use your best judgment.
Remove - Enter the phase during which you fixed the defect. (This will generally be the phase in which you found the defect.)
Fix Time - Enter the time you took to find and fix the defect. This time can be determined by stopwatch or by judgment.
Fix Ref. - If you or someone else injected this defect while fixing another defect, record the number of the improperly fixed defect. If you cannot identify the defect number, enter an X.
Description - Write a succinct description of the defect that is clear enough to later remind you about the error and help you remember why you made it.

37 PSP Defect Type Standard
10 Documentation - Comments, messages
20 Syntax - Spelling, punctuation, typos, instruction formats
30 Build, Package - Change management, library, version control
40 Assignment - Declaration, duplicate names, scope, limits
50 Interface - Procedure calls and references, I/O, user formats
60 Checking - Error messages, inadequate checks
70 Data - Structure, content
80 Function - Logic, pointers, loops, recursion, computation, function defects
90 System - Configuration, timing, memory
100 Environment - Design, compile, test, or other support system problems

38 Time and defect data in the project summary (based on time and defect logs)(MS Access & Excel tools)

39 Other measures in the project summary (based on time, defect and size counters)
Review rate - Higher rates generally give lower-yield reviews. Suggested code review rate: < 200 LOC/hour. Suggested document review rate: < 4 pages/hour.
Defect removal efficiency - Typical defect removal rates in design reviews: 3 to 5 defects/hour. Typical defect removal rates in code reviews: 5 to 10 defects/hour.
Defect removal leverage (DRL) - Measures the effectiveness of a process step at removing defects relative to a base process. DRL for phase X with respect to unit test (UT): DRL(X/UT) = (defects removed/hour in phase X) / (defects removed/hour in unit test).
Yield - Measures (a posteriori) the effectiveness of a process step/phase at removing defects. Yield = 100 * (defects found) / (defects found + defects not found at this phase).
Defect density - defects / KLOC

40 Automated static analysis techniques

41 Static code analysis tools
Rule based - perform checks that result in observations on coding practices; look for constructs that "look dangerous"
Metric based - perform checks that result in observations on code quality metric values, such as cyclomatic complexity and nesting depth
Usually applied to source code, but can also be applied to intermediate code and object code
Tools:

42 Formal proofs of program correctness
Prove the Hoare triple: {P} S {Q}
P, Q - pre-condition and post-condition (specification); S - program (implementation)
If the program S is executed from an initial state (of instance variables and arguments) that meets the pre-condition P, then, at the end of execution, the final state (result value and instance variables) meets the post-condition Q
Weak correctness: if S terminates, then Q holds
Strong correctness: S terminates and Q holds
May be partially automated (or at least supported by tools that check the internal consistency of the proof)
Tools: Isabelle, etc.
See: Formal Methods in Software Engineering

43 Symbolic execution
Also called abstract interpretation or abstract execution
Program execution is simulated with expressions (functions of the initial values) instead of concrete values for the program variables
Example program (swaps X and Y without a temporary variable):
begin
  X := X + Y;
  Y := X - Y;
  X := X - Y;
end
Analysis (with initial values X = x0, Y = y0):
  after X := X + Y:  X = x0 + y0, Y = y0
  after Y := X - Y:  X = x0 + y0, Y = x0
  after X := X - Y:  X = y0, Y = x0

44 Model checking
Check a finite state machine model of the system against high-level properties (such as safety properties like reachability and absence of cycles), usually expressed in temporal logic
Particularly useful to check the designs of (hardware/software) systems based on state machines
A very successful technique
Tools: SPIN, SMV, etc.

45 Program slicing
Also called code slicing
An auxiliary technique that extracts all statements relevant to the computation of a given variable
Useful in program debugging, software maintenance and program understanding
Program slices can be used to reduce the effort of examining software by allowing a software auditor to focus attention on one computation at a time

46 More checklists

47 A sample general checklist for reviewing software documents
Coverage and Completeness - Are all essential items completed? Have all irrelevant items been omitted? Is the technical level of each topic addressed properly for this document? Is there a clear statement of goals for this document? (Don't forget: more documentation does not mean better documentation.)
Correctness - Are there incorrect items? Are there any contradictions? Are there any ambiguities?
Clarity and Consistency - Are the material and statements in the document clear? Are the examples clear, useful, relevant and correct? Are the diagrams, graphs and illustrations clear and correct, do they use the proper notation, and are they effective and in the proper place? Is the terminology clear and correct? Is there a glossary of technical terms that is complete and correct? Is the writing style clear (unambiguous)?
References and Aids to Document Comprehension - Is there an abstract or introduction? Is there a well-placed table of contents? Are the topics or items broken down in a manner that is easy to follow and understandable? Is there a bibliography that is clear, complete and correct? Is there an index that is clear, complete and correct? Is the page and figure numbering correct and consistent?
(adapted from Ilene Burnstein, Practical Software Testing, pg. 327)

48 A sample specification/requirements attributes checklist
Complete - Is anything missing or forgotten? Is it thorough? Does it include everything necessary to make it stand alone?
Accurate - Is the proposed solution correct? Does it properly define the goal? Are there any errors?
Precise, Unambiguous and Clear - Is the description exact and not vague? Is there a single interpretation? Is it easy to read and understandable?
Consistent - Is the description of the feature written so that it doesn't conflict with itself or other items in the specification?
Relevant - Is the statement necessary to specify the feature? Is there extra information that should be left out? Is the feature traceable to an original customer need?
Feasible - Can the feature be implemented with the available personnel, tools, and resources within the specified budget and schedule?
Code-free - Does the specification stick to defining the product rather than the underlying software design, architecture, and code?
Testable - Can the feature be tested? Is enough information provided that a tester could create tests to verify its operation?
(adapted from: Ron Patton, Software Testing)

49 A sample supplementary checklist for design reviews (for high-level architectural design and detailed design)
Are the high-level and detailed designs consistent with the requirements? Do they address all the functional and quality requirements?
Is the detailed design consistent with the high-level design?
Are design decisions properly highlighted, justified, and traced back to requirements?
Are design alternatives identified and evaluated?
Are design notations (e.g. UML), methods (e.g. OOD, ATAM) and standards chosen and used adequately?
Are naming conventions being followed appropriately?
Is the system structuring (partitioning into sub-systems, modules, layers, etc.) well defined and explained?
Are the responsibilities of each module and the relationships between modules well defined and explained?
Do modules exhibit strong cohesion and weak coupling?
Is there a clear and rigorous description of each module interface, both at the syntactic and semantic level? Are dependencies identified?
Have user interface design issues, including standardization, been addressed properly?
Is there a clear description of the interfaces between this system and other software and hardware systems?
Have reuse issues been properly addressed, namely the possible reuse of COTS (commercial off-the-shelf) components (buy-or-build decision) and in-house reusable components?
Is the system designed so that it can be tested at various levels (unit, integration and system)?
(adapted from: Ilene Burnstein, pg )

50 A sample general code review checklist (1)
Design Issues - Does each unit implement a single function? Are there instances where the unit should be partitioned? Is the code consistent with the detailed design? Does the code cover the detailed design?
Data Items - Is there an input validity check? Arrays: check array dimensions, boundaries, indices. Variables: are they all defined and initialized? Have correct types and scopes been checked? Are all variables used?
Computations - Are there computations using variables with inconsistent data types? Are there mixed-mode computations? Is the target value of an assignment smaller than the right-hand expression? Is overflow or underflow a possibility (division by zero)? Are there invalid uses of integer or floating-point arithmetic? Are there comparisons between floating-point numbers? Are there assumptions about the evaluation order in Boolean expressions? Are the comparison operators correct?

51 A sample general code review checklist (2)
Control Flow Issues - Will the program, module, or unit eventually terminate? Is there a possibility of an infinite loop, a loop with a premature exit, or a loop that never executes?
Interface Issues - Do the number and attributes of the parameters used by a caller match those of the called routine? Is the order of parameters also correct and consistent in caller and callee? Does a function or procedure alter a parameter that is only meant as an input parameter? If there are global variables, do they have corresponding definitions and attributes in all the modules that use them?
Input/Output Issues - Have all files been opened for use? Are all files properly closed at termination? If files are declared, are their attributes correct? Are EOF and I/O error conditions handled correctly? Are I/O buffer size and record size compatible?

52 A sample general code review checklist (3)
Portability Issues - Is there an assumed character set, or an assumed integer or floating-point representation? Are there service calls that may need to be modified?
Error Messages - Have all warnings and informational messages been checked and used appropriately?
Comments/Code Documentation - Has the code been properly documented? Are there global, procedure, and line comments where appropriate? Is the documentation clear and correct, and does it support understanding?
Code Layout and White Space - Have white space and indentation been used to support understanding of the code logic and intent?
Maintenance - Does each module have a single exit point? Are the modules easy to change (low coupling and high cohesion)?
(adapted from: Ilene Burnstein, pg. 331)

53 A sample code review checklist for C programs (1)
Data Items - Are all variables lowercase? Are all variables initialized? Are variable names consistent, and do they reflect usage? Are all declarations documented (except for those that are very simple to understand)? Is each name used for a single function (except for loop variable names)? Is the scope of each variable as intended?
Constants - Are all constants in uppercase? Are all constants defined with a "#define"? Are all constants used in multiple files defined in an INCLUDE header file?
Pointers - Are pointers declared properly as pointers? Are the pointers initialized properly?

54 A sample code review checklist for C programs (2)
Control - Are if/then, else, and switch statements used clearly and properly?
Strings - Strings should have proper pointers. Strings should end with a NULL.
Brackets - All curly brackets should have appropriate indentation and be matched.
Logic Operators - Do all initializations use an "=" and not an "=="? Check that all logic operators are correct, for example the use of = vs. ==, and ||.
Computations - Are parentheses used in complex expressions, and are they used properly to specify precedence? Are shifts used properly?
(adapted from: Ilene Burnstein, pg. 331)

55 Types of (end-user) software documentation (1)
- Packaging text and graphics. Box, carton, wrapping, and so on. Might contain screen shots from the software, lists of features, system requirements, and copyright information.
- Marketing material, ads, and other inserts. These are all the pieces of paper you usually throw away, but they are important tools used to promote the sale of related software, add-on content, service contracts, and so on. The information in them must be correct for a customer to take them seriously.
- Warranty/registration. This is the card that the customer fills out and sends in to register the software. It can also be part of the software and display onscreen for the user to read, acknowledge, and even complete online.
- EULA. Pronounced "you-la," it stands for End User License Agreement. This is the legal document the customer agrees to that says, among other things, that he won't copy the software or sue the manufacturer if he's harmed by a bug. The EULA is sometimes printed on the envelope containing the media (the floppy or CD). It may also pop up onscreen during the software's installation.
- Labels and stickers. These may appear on the media, on the box, or on the printed material. There may also be serial number stickers and labels that seal the EULA envelope. See in a following slide an example of a disk label and all the information that needs to be checked.
- Installation and setup instructions. Sometimes this information is printed on the media, but it can also be included as a separate sheet of paper or, for complex software, as an entire manual.

56 Types of (end-user) software documentation (2)
- User's manual. The usefulness and flexibility of online manuals has made printed manuals much less common than they once were. Most software now comes with a small, concise "getting started"-type manual, with the detailed information moved to online format. The online manuals can be distributed on the software's media, on a Web site, or a combination of both.
- Online help. Online help often gets intertwined with the user's manual, sometimes even replacing it. Online help is indexed and searchable, making it much easier for users to find the information they're looking for. Many online help systems allow natural language queries, so users can type "Tell me how to copy text from one program to another" and receive an appropriate response.
- Tutorials, wizards, and CBT (Computer-Based Training). These tools blend programming code and written documentation. They're often a mixture of both content and high-level, macro-like programming, and are often tied in with the online help system. A user can ask a question and the software then guides him through the steps to complete the task. Microsoft's Office Assistant, sometimes referred to as the "paper clip guy," is an example of such a system.
- Samples, examples, and templates. An example of these would be a word processor with forms or samples that a user can simply fill in to quickly create professional-looking results. A compiler could have snippets of code that demonstrate how to use certain aspects of the language.
- Error messages. Often neglected, but they ultimately fall under the category of documentation.
(adapted from: Ron Patton, Software Testing)

57 Information to check in a sample disk label(source: Ron Patton, Software Testing)

58 A sample (end-user) documentation review checklist
General Areas
- Audience: Does the documentation speak to the correct level of audience, not too novice, not too advanced?
- Terminology: Is the terminology proper for the audience? Are the terms used consistently? If acronyms or abbreviations are used, are they standard ones or do they need to be defined? Make sure that your company's acronyms don't accidentally make it through. Are all the terms indexed and cross-referenced correctly?
- Content and subject matter: Are the appropriate topics covered? Are any topics missing? How about topics that shouldn't be included, such as a feature that was cut from the product without anyone telling the manual writer? Is the material covered in the proper depth?
Correctness
- Just the facts: Is all the information factually and technically correct? Look for mistakes caused by writers working from outdated specs or salespeople inflating the truth. Check the table of contents, the index, and chapter references. Try the Web site URLs. Is the product support phone number correct? Try it.
- Step by step: Read all the text carefully and slowly. Follow the instructions exactly. Assume nothing! Resist the temptation to fill in missing steps; your customers won't know what's missing. Compare your results to the ones shown in the documentation.
- Figures and screen captures: Check figures for accuracy and precision. Are they of the correct image, and is the image correct? Make sure that any screen captures aren't from prerelease software that has since changed. Are the figure captions correct?
- Samples and examples: Load and use every sample just as a customer would. If it's code, type or copy it in and run it. There's nothing more embarrassing than samples that don't work, and it happens all the time!
- Spelling and grammar: In an ideal world, these types of bugs wouldn't make it through to you. Spelling and grammar checkers are too commonplace not to be used. It's possible, though, that someone forgot to perform the check, or that a specialized or technical term slipped through. It's also possible that the checking had to be done manually, such as in a screen capture or a drawn figure. Don't take it for granted.
(adapted from: Ron Patton, Software Testing, pg. 195)

59 Quality attributes (or dimensions) to check in technical informationThese can be checked by asking probing questions, like:
- Is the information appropriate for the intended audience?
- Is the information presented from a user's point of view?
- Is there a focus on real tasks?
- Is the reason for the information evident?
- Do titles and headings reveal real tasks?
Build your own checklist! Adapt it to your needs!
This applies not only to software, and not only to end-user documentation (also to documentation for developers and maintainers).
(source: Developing Quality Technical Information (DQTI), Hargis, IBM, 1997)

60 References and further reading
- Practical Software Testing, Ilene Burnstein, Springer-Verlag, 2003 (Chapter 10 – Reviews as a Testing Activity)
- Software Testing, Ron Patton, SAMS, 2001 (Chapters 4 – Examining the Specification, 6 – Examining the Code, and 12 – Testing the Documentation)
- Guide to the Software Engineering Body of Knowledge (SWEBOK), IEEE Computer Society
- IEEE Standard for User Documentation (IEEE Std )
- IEEE Recommended Practices for Software Requirements Specification (IEEE Std )
- IEEE Recommended Practices for Software Design Descriptions (ANSI/IEEE Std )
- IEEE Standard for Software Reviews and Audits (IEEE Std ) – available from IEEE Xplore via FEUP
- IEEE Standard for Software Quality Assurance Plans (IEEE Std ) – available on the web
- PSP: A Self-Improvement Process for Software Engineers, Watts S. Humphrey, 2005
- An SEI Process Improvement Path to Software Quality, Philip Miller, SEI, QUATIC 2007
- Developing Quality Technical Information (DQTI), G. Hargis, Prentice-Hall, 1997 (first edition), 2004 (second edition)