STATIC TESTING
• set of testing methods and techniques in which the component or system under test IS NOT RUN/EXECUTED
• can be applied to non-executable work products other than software, like
- DESIGN
- DOCUMENTATION
- SPECIFICATION, etc
GOALS OF STATIC TESTING
• QUALITY IMPROVEMENT
• DEFECT DETECTION
• EVALUATION OF CHARACTERISTICS LIKE:
- readability, completeness, correctness, testability and consistency of the work products under review
BOTH VERIFICATION AND VALIDATION
• in agile software development, during requirements development the whole team makes sure that requirements and related work products meet THE DEFINITION OF READY
• ENSURING THAT REQUIREMENTS ARE
COMPLETE
UNDERSTANDABLE
TESTABLE
USER STORIES CONTAIN TESTABLE ACCEPTANCE CRITERIA
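As an illustration, a testable acceptance criterion can be expressed directly as an executable check. Everything below (the user story, the `reset_password` function, the token values) is a hypothetical example, not from any real system:

```python
# Hypothetical user story: "As a user, I can reset my password with a valid token."
# A testable acceptance criterion becomes an executable assertion.

def reset_password(token: str, new_password: str) -> bool:
    """Toy stand-in for the system under test: accepts only the known-valid token
    and passwords of at least 8 characters."""
    return token == "valid-token" and len(new_password) >= 8

# Acceptance criterion: valid token + sufficiently long password succeeds;
# an unknown/expired token is rejected.
assert reset_password("valid-token", "s3cretPass") is True
assert reset_password("expired-token", "s3cretPass") is False
```

Writing the criterion this way makes it unambiguous and directly checkable, which is the point of "testable" acceptance criteria.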
STATIC ANALYSIS - TECHNIQUES
Evaluating the work product under test (usually code, requirements or design documents) using tools
Examples of static analysis techniques:
1. CODE MEASUREMENTS (e.g. measuring its size or cyclomatic complexity)
2. CONTROL FLOW ANALYSIS
3. DATA FLOW ANALYSIS
4. CHECKING THE COMPATIBILITY OF VARIABLE TYPES, VERIFICATION OF THE CORRECT APPLICATION OF CODING STANDARDS
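As an illustration of the first technique, here is a minimal sketch (in Python, using the standard `ast` module) of a code-measurement check that estimates cyclomatic complexity. It is a simplified approximation, not a full McCabe implementation:

```python
import ast

def cyclomatic_complexity(source: str) -> int:
    """Rough McCabe estimate: 1 + number of decision points.

    Counts if/elif, loops, exception handlers, conditional expressions
    and extra boolean operands -- a sketch, not a production analyzer.
    """
    tree = ast.parse(source)
    complexity = 1
    for node in ast.walk(tree):
        if isinstance(node, (ast.If, ast.For, ast.While,
                             ast.ExceptHandler, ast.IfExp)):
            complexity += 1
        elif isinstance(node, ast.BoolOp):
            # each extra operand in `a and b and c` adds one branch
            complexity += len(node.values) - 1
    return complexity

code = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(code))  # -> 3 (one `if` plus one `elif`)
```

Note that the code being measured is never executed; the tool only inspects its syntax tree, which is what makes this a static technique.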
Static analysis - included in CI frameworks as one step of the automated deployment pipeline
• assessing MAINTAINABILITY, PERFORMANCE AND VULNERABILITY OF CODE TO SECURITY ATTACKS
WORK PRODUCTS SUBJECTED TO STATIC ANALYSIS (any work product can be analysed; the most common are those with a formal structure to check against)
• SOURCE CODE (against standards and grammar of certain language)
• MODELS (e.g. UML diagrams)
• TEXT DOCUMENTS
VALUE OF STATIC TESTING
• EFFECTIVE AND EFFICIENT, though not free of cost
• helps catching DEFECTS EARLY
- ESPECIALLY WITH DESIGN DEFECTS
• building confidence in the product
• shared understanding with stakeholders
• IDENTIFYING DEFECTS THAT ARE DIFFICULT TO DETECT IN SUBSEQUENT DYNAMIC TESTING
• identifying defects impossible to detect in dynamic testing
- INFEASIBLE CODE (unreachable)
- UNUSED CODE
- INCORRECT USE OR LACK OF USE OF DESIGN PATTERNS IN CODE
- DEFECTS IN NON-EXECUTABLE PRODUCTS, like documentation
- detecting ambiguities, contradictions, omissions, oversights, redundant info or inconsistencies in documentation (requirement specification or architecture design) -> preventing defects this way
- increased efficiency of programming by:
• improving the design and code maintainability by imposing uniform standards
- reducing cost and time of software development
- reducing the cost of quality throughout the software development cycle by reducing costs of the maintenance phase
- improving communication among team members by conducting reviews
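The infeasible (unreachable) and unused code mentioned above can often be found without ever running the program. A minimal sketch, again using Python's `ast` module, that flags statements following a return/raise/break/continue in the same block (real analyzers build a full control-flow graph):

```python
import ast

def find_unreachable(source: str) -> list:
    """Report line numbers of statements that can never execute because they
    follow a return/raise/break/continue in the same block. Simplified sketch."""
    terminators = (ast.Return, ast.Raise, ast.Break, ast.Continue)
    unreachable = []
    for node in ast.walk(ast.parse(source)):
        body = getattr(node, "body", None)
        if not isinstance(body, list):
            continue  # skip nodes whose `body` is a single expression
        seen_terminator = False
        for stmt in body:
            if seen_terminator:
                unreachable.append(stmt.lineno)
            elif isinstance(stmt, terminators):
                seen_terminator = True
    return unreachable

code = """
def f():
    return 1
    print("never runs")
"""
print(find_unreachable(code))  # -> [4]: the print() can never execute
```

Dynamic testing could never flag that `print` call as a defect, because no test input can make it execute; only inspecting the code itself reveals it.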
COST OF QUALITY
TOTAL COST INCURRED FOR QUALITY ACTIVITIES, THAT IS COST OF:
- PREVENTATIVE ACTIVITIES (e.g. cost of training)
- DETECTION (e.g. cost of testing)
- INTERNAL FAILURES (e.g. cost of fixing defects found before release)
- EXTERNAL FAILURES (e.g. cost of fixing field defects found by users)
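A tiny worked example (all figures hypothetical) showing how the four categories above add up to the total cost of quality:

```python
# Illustrative cost-of-quality breakdown; every figure here is made up.
# Total CoQ = prevention + detection + internal failures + external failures.
costs = {
    "prevention": 10_000,        # e.g. training, coding standards
    "detection": 25_000,         # e.g. testing, reviews
    "internal_failure": 15_000,  # fixing defects found before release
    "external_failure": 50_000,  # fixing field defects reported by users
}
total_cost_of_quality = sum(costs.values())
print(total_cost_of_quality)  # -> 100000
```

The usual argument for static testing is that spending more on prevention and detection shrinks the (typically much larger) failure costs.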
DIFFERENCE BETWEEN STATIC AND DYNAMIC TESTING
• same goal (evaluate product quality and identify defects) but different types of defects
• static testing - DEFECTS DIRECTLY IN THE WORK PRODUCT
we do not find failures because the software is not executed
• dynamic testing- first sign of malfunction is FAILURE
• static - INTERNAL QUALITY AND CONSISTENCY OF WORK PRODUCTS
• dynamic - EXTERNAL, VISIBLE BEHAVIOUR
• static -> applied to non-executable work products
Dynamic - performed against a running work product; can measure performance, e.g. response time
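To illustrate the contrast: a property like response time only becomes observable when the code actually runs. A minimal sketch, where `slow_operation` is a hypothetical stand-in for the system under test:

```python
import time

def slow_operation() -> int:
    """Hypothetical stand-in for the system under test."""
    time.sleep(0.01)  # simulate real work taking time
    return 42

# Dynamic testing executes the code and observes its external behaviour:
# both the result and the response time.
start = time.perf_counter()
result = slow_operation()
elapsed = time.perf_counter() - start
assert result == 42
assert elapsed >= 0.01  # response time is only measurable at run time
```

No amount of reading the source tells you the measured response time on real hardware; that is inherently a dynamic-testing observation.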
TYPICAL DEFECTS THAT ARE EASIER AND CHEAPER TO DETECT AND FIX WITH STATIC TESTING
REVIEWS
Form of providing early feedback to the team
• can be done early in the SDLC
BENEFITS OF EARLY AND FREQUENT STAKEHOLDER FEEDBACK
• info about potential quality issues
• ensuring the product meets their vision and avoiding costly rework
• delivering what’s of most value to stakeholders
TYPES OF REVIEWS - DEPENDING ON FORMALITY
• GENERIC REVIEW - structured but flexible framework
Planning —> review initiation —> individual review —> communication and analysis —> fixing and reporting
5 GENERIC ACTIVITIES IN THE WORK PRODUCT REVIEW PROCESS
STATUSES FOR ANOMALIES - MOST COMMON CATEGORIZATION
PLANNING - REVIEW PROCESS #1
REVIEW INITIATION - REVIEW PROCESS #2
INDIVIDUAL REVIEW - REVIEW PROCESS #3
• CENTRAL PHASE
• review activities using chosen techniques
• taking notes - any comments, questions, recommendations, concerns, relevant observations
- documenting everything in a problem log - often supported by a defect management or review support tool
COMMUNICATION AND ANALYSIS - #4 REVIEW PROCESS
• meetings, calls
• analysis of ANOMALIES (found problems) reported by reviewers
• categorizing ANOMALIES AS DEFECTS OR FALSE POSITIVES
• if an anomaly is confirmed as a defect - ASSIGNING PEOPLE TO FIX IT, DEFINING ATTRIBUTES SUCH AS STATUS, PRIORITY, SEVERITY
• evaluating and documenting the level of quality characteristics that were defined in the planning phase as those being reviewed
• conclusions of the review are evaluated against exit criteria to decide what to do next
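The anomaly attributes mentioned above (status, priority, severity) and the defect / false-positive triage can be sketched as a simple problem-log structure. All names here are illustrative, not taken from any real review or defect-management tool:

```python
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    REPORTED = "reported"               # logged during individual review
    DEFECT = "defect"                   # confirmed during analysis
    FALSE_POSITIVE = "false positive"   # rejected during analysis

@dataclass
class Anomaly:
    """One entry in the review problem log (illustrative structure)."""
    description: str
    location: str
    status: Status = Status.REPORTED
    priority: str = "medium"
    severity: str = "minor"

# Individual review: reviewers log anomalies with their location.
log = [Anomaly("ambiguous requirement", "spec section 2.1")]

# Communication and analysis: each anomaly is triaged as a defect
# or a false positive, and its attributes are set.
log[0].status = Status.DEFECT
log[0].priority = "high"
print(log[0].status.value)  # -> defect
```

Keeping one structured record per anomaly is what lets the team evaluate results against the exit criteria defined during planning.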
FIXING AND REPORTING - #5 REVIEW PROCESS
• final stage
• creates defect reports on detected defects that require changes
• the author of the work product carries out defect removal
• changes are confirmed
• Review report created
MODERN CODE REVIEW (MCR)
• quality control technique - improves software quality and customer satisfaction by identifying defects, improving code, and speeding up the development process
• asynchronous and lightweight review process with tools like Gerrit
ROLES AND RESPONSIBILITIES IN REVIEWS
MANAGER
•responsible for scheduling the review
•decides to conduct the review
• designates staff and sets a budget and timeframe
• monitors the cost-effectiveness of the review on an ongoing basis
• executes control decisions in case of unsatisfactory results
AUTHOR
• creates the work product under review
• removes defects in the work product under review (if necessary)
• prepares the materials for review, though they may be distributed by the review leader
• may provide technical explanations
• can evaluate the reviewers' work in terms of the merit of their comments
MODERATOR / FACILITATOR
• ensures the smooth running of review meetings (if they take place)
• acts as a mediator when necessary
• ensures that a safe atmosphere of mutual trust and respect is created