GTDigital
_______________________________________________
Product Design Testing and QA Testing
by Jerry Bellott, MSEE
(V5.0, ©2022)
Testing of products is performed to find design errors and manufacturing defects. The amount and depth of testing planned often depend on the number of available groups, staff members, and budgets. This document describes the testing to be performed in the ideal, full-scale test scenario for a product that is large, has many components, and is very expensive. The same concepts can be applied in a practical way to smaller, less complicated products.
1. Engineering – Each individual engineer is responsible for verifying the correctness of their design components or design changes before releasing them to other engineers, or to the engineering test groups that exist for additional follow-on testing in a large organization.
2. QA Test Groups – These often exist in the engineering division of a large company to add testing capacity that can operate in parallel with design engineering. Another advantage is that tests planned by a separate organization may find mistakes precisely because its testers have no preconceived notions or assumptions about how the product functions.
3. Factory Testing – The goal of factory testing is to verify that each product is free of manufacturing defects. By contrast, the goal of Engineering QA (or “design verification”) is to make sure that designs are free of design errors (non-conformance to product requirements, specifications, and quality goals/checklists).
Design and QA engineers take many technologies and quality criteria into account, including:
1. Hardware Design Technologies (hardware design implementation engineers) – Examples: current standards, best practices, logic correctness, performance, analog circuit simulation, sensitivity analysis, impedance matching, design for EMI, thermal considerations, worst-case component timing, de-rating of components, pre-layout and post-layout simulation where applicable, post-verification against simulation, component availability over the manufacturing life, vendor reputation and longevity of the vendor’s business, a bill of materials that meets the budget, second sources for critical components, design completed on time, etc.
2. Design Quality Goals and Criteria – Quality areas of expertise that come into play for hardware designs: DFSS techniques (see the Circuit Design and QA Test article), reliability of components, MTBF of the overall system, testability by automated diagnostics, design for controllability and observability of the behavior of all system components by central processor diagnostic routines (this makes testing simpler and also facilitates accurate automated testing in the field), use of user and software event activity logs, storage of error codes with timestamps (a sketch follows), parts vendor reputation, second sourcing of parts, parts availability, inclusion of test points for troubleshooting, design for automated factory testing, and meeting PCBA layout specifications or other layout and component requirements for manufacturability (including proper specification of PCB manufacturing tolerances for net impedances, and use of the manufacturer’s PCB precision and accuracy specifications when planning the locations of components and vias).
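To illustrate the “error codes with timestamps” criterion above, here is a minimal sketch in Python; all names (ErrorLogEntry, log_error) and the example error code are hypothetical, not taken from any real product:

# Minimal sketch of a timestamped error-log entry.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ErrorLogEntry:
    timestamp: datetime   # UTC time the error was recorded
    error_code: int       # numeric code defined by the design team
    component: str        # subsystem reporting the error
    detail: str           # free-form context for troubleshooting

def log_error(code: int, component: str, detail: str) -> ErrorLogEntry:
    """Create a log entry stamped with the current UTC time."""
    return ErrorLogEntry(datetime.now(timezone.utc), code, component, detail)

# Example: record a hypothetical power-supply over-voltage event.
entry = log_error(0x2001, "PS", "rail +12V measured 13.1V")
print(entry)

Entries like these, kept in nonvolatile storage, give field technicians and automated diagnostics a concrete trail to follow.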
Typical engineering test steps include:
1. Unit Testing

a. Hardware Unit Testing – Using diagnostic test firmware (if the design uses a processor) and lab instruments: signal generators, storage scopes, spectrum analyzers, logic analyzers, and protocol analyzers (many of these analysis capabilities are combined in modern units from Tektronix and other vendors). MATLAB can be used to analyze downloaded signal information. Other common equipment: network analyzers and BER testers. A minimal instrument-control sketch follows.
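As a sketch of driving lab instruments from a test script, the following uses the PyVISA library and standard SCPI commands. The VISA resource address is a placeholder, and measurement command syntax varies by vendor and model, so treat this as an outline rather than a recipe for any particular instrument:

# Sketch: querying a lab instrument over SCPI using PyVISA.
import pyvisa

rm = pyvisa.ResourceManager()
# Hypothetical LAN-attached oscilloscope; substitute your instrument's address.
scope = rm.open_resource("TCPIP0::192.168.1.50::INSTR")

print(scope.query("*IDN?"))  # "*IDN?" is the standard SCPI identity query

scope.close()
rm.close()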
b. Software Unit Testing – Using code test utilities, custom test scripts, and test code for analysis of behavior and returned values. A minimal example follows.
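A minimal software unit test sketch using Python’s standard unittest module; checksum8() is a hypothetical unit under test, included only to make the example self-contained:

import unittest

def checksum8(data: bytes) -> int:
    """Return the 8-bit sum of all bytes (hypothetical unit under test)."""
    return sum(data) & 0xFF

class TestChecksum8(unittest.TestCase):
    def test_empty_input(self):
        self.assertEqual(checksum8(b""), 0)

    def test_known_value(self):
        # 0x01 + 0x02 + 0xFF = 0x102, truncated to 8 bits -> 0x02
        self.assertEqual(checksum8(b"\x01\x02\xff"), 0x02)

if __name__ == "__main__":
    unittest.main()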
2. Software System Testing of the Build – To the extent possible without running on the system hardware.
3. System Integration Testing of Hardware – Using system hardware diagnostic test firmware. System hardware diagnostics can also be used later to regression test the system hardware when hardware problems are suspected. This capability is invaluable and can save time; a sketch of such a regression run follows.
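A sketch of re-running a diagnostic suite as a regression test. The diagnostic functions here are hypothetical placeholders for firmware routines that would exercise real hardware blocks:

def diag_memory() -> bool:
    return True  # placeholder: walk address/data patterns through RAM

def diag_sata() -> bool:
    return True  # placeholder: write/read/verify a scratch disk sector

DIAGNOSTICS = {"MEM": diag_memory, "SATA": diag_sata}

def run_regression() -> bool:
    """Run every diagnostic, print PASS/FAIL, and return True if all pass."""
    all_passed = True
    for name, test in DIAGNOSTICS.items():
        result = test()
        print(f"{name}: {'PASS' if result else 'FAIL'}")
        all_passed = all_passed and result
    return all_passed

if __name__ == "__main__":
    run_regression()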
4. System Integration Testing of Software and Hardware – Often the goal is to boot the OS and get basic features to work, including a console and the ability to access each piece of hardware. Ideally, the hardware has been exhaustively pre-tested with diagnostics that can also be used for regression testing during the integration of hardware and software, to help identify the source of problems.
5. Full System Testing – All product features.
Additional Product Design tests typically include the following steps, which help verify that the product will work in the field:
1. Interface Standards Compliance – Sometimes a third-party test facility or a purchased test suite is required to verify compliance of complex PC networking, telecommunication, and wireless interfaces.
2. Environmental – Temperature, humidity, altitude, and salt-air/corrosion
3. Shock (sudden impact) and Vibration
4. EMI/EMC Compliance
5. Connected Product Compatibility Testing – Interface specification and standards verification do not guarantee 100% that two products from two vendors will work together properly. If advertising makes claims about compatibility, an actual interworking test of the products should be performed by your company to guarantee results for customers.
Typical tasks performed by QA test groups may include:
1. System testing of a new product. This involves double-checking the design with the QA group’s own system test. This can be beneficial because the test designers devise test cases without knowing the details of the design, so a thorough test by a separate group is likely to catch additional areas to correct.
2. Regression testing when a new product release introduces bug fixes. When a project is large and integrates many designs into one product, the QA test group is able to repeat a full or partial system test to ensure that the modified modules or components work correctly and that the overall system still operates properly.
3. Reproducing problems that are escalated from the factory or the field to the engineering division. Often, finding a way to reproduce a problem is half the battle in solving it, because the conditions under which the problem manifests itself usually point to specific areas to study in more detail to find the underlying root cause. QA test engineers are often able to pinpoint root causes after reproducing problems. Identification of root causes may also involve additional analysis by the designers after a problem has been reproduced.
4. Use of Factory QA Data – QA engineering groups can study factory yield data to prioritize areas for improvement. Other organizations often become involved, including parts procurement, manufacturing engineers, and design engineers, to work together to improve yield (the percentage of units that PASS all QA factory processes before delivery).
5. Contribute to Design Reviews – This not only allows QA testers to become familiar with the designs and changes being implemented, but also lets them contribute comments based on their expertise.
A typical system test plan document contains the following sections:

1. Introduction
Overview of Goals and Purpose
Change History
2. Applicable Documents and Reference Info – Requirements, specifications, and quality criteria documents
3. System Version to Be Tested – Which hardware and software versions are to be tested:
a. Hardware Versions

b. Software Versions

c. Firmware Versions

d. FPGA HDL Code Versions
4. Test Configurations

a. Diagrams – Show the DUT, main test equipment, and other systems connected during testing.

b. Lab Test Equipment (serial numbers, calibration status)

c. Diagnostic or Other Custom Software/Scripts Used with Test Configurations
5. Overview of Test Strategy
Testing is accomplished by several means (b–d are often described in their own documents):
a. Test Cases

b. Compliance Certification – EMI/EMC agency certification, safety, and environmental certification per product requirements.
c. Standards Certification – e.g., verification of complex protocol stacks supporting networking, telecom, or wireless interfaces. May require a third-party test house with a good industry reputation.
d. Verification by Analysis – paper analysis of QA factors, e.g., a table of MTBF info or a DFMECA. (A worked MTBF example follows.)
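For the MTBF analysis mentioned in (d), one common relationship for a non-redundant (series) system is that component failure rates add, so the system MTBF is the reciprocal of the sum of the reciprocal component MTBFs. A brief sketch in Python with made-up component values:

# Worked MTBF example for a series (non-redundant) system:
# MTBF_system = 1 / sum(1 / MTBF_i). Component values are illustrative.
component_mtbf_hours = {
    "power supply": 200_000,
    "main board": 500_000,
    "disk drive": 1_000_000,
}

failure_rate = sum(1.0 / m for m in component_mtbf_hours.values())
system_mtbf = 1.0 / failure_rate
print(f"System MTBF: {system_mtbf:,.0f} hours")  # prints about 125,000 hours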
Test cases are run in the engineering lab and often fall into
these categories:
· Functional tests – try each major feature in representative ways

· Hardware tests include test cases that exercise specific components within the system and cover the product requirements

· Software/firmware tests cover feature requirements

· Performance tests introduce loading (I/O traffic or other real-time-demanding background activity)

· Test Scripts, when applicable, feed in simulated controls and monitor the correctness of system actions, using input permutations across the feature set (see the sketch after this list).
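As a sketch of feeding input permutations across a feature set, the following uses itertools.product to enumerate combinations of serial-port settings. The settings and check_system_response() are hypothetical placeholders for real DUT controls and checks:

import itertools

baud_rates = [9600, 115200]
parities = ["none", "even", "odd"]
flow_control = [False, True]

def check_system_response(baud, parity, flow) -> bool:
    """Placeholder: apply the settings to the DUT and verify loopback data."""
    return True

# Try every permutation of the settings and report PASS/FAIL for each.
for baud, parity, flow in itertools.product(baud_rates, parities, flow_control):
    ok = check_system_response(baud, parity, flow)
    print(f"baud={baud} parity={parity} flow={flow}: {'PASS' if ok else 'FAIL'}")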
Compliance and interface standards certification tests are typically described in separate documents, and may re-use the test cases from the system test document.
6. Test Descriptions for Each Test Case – These can be reviewed before test procedures are planned in detail, which is a very efficient approach.
a. 1–3 brief paragraphs describing the approach to be taken to test each area. (This section can be submitted for peer review before the Test Steps are planned in detail.)
b. Test Cases are numbered as [CATEGORY]-[SUBCATEGORY]-[NUMBER].
Example test case numbers: HW-Memory-001, SW-DiskAccess-001, or SYS-UserInterface-001
· Prefix is the category name. Make up your own prefixes; make them easy to remember.

· Subcategory is the portion of the category to be tested (e.g., PS for Power Supply, MEM for Memory, SATA for a type of disk interface, PCDVD for a DVD player/recorder, etc.)

· Number is several digits, starting with 001.
As engineers work with the test plan, this system of naming makes it easy to remember or look up the tests in their records. (A small validation sketch follows.)
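A sketch of validating test case numbers of this form with a regular expression; the category prefixes here are just the examples from the text, since the scheme lets you define your own:

import re

# [CATEGORY]-[SUBCATEGORY]-[NUMBER], using the example categories HW/SW/SYS.
TEST_ID = re.compile(r"^(HW|SW|SYS)-[A-Za-z]+-\d{3}$")

for case_id in ["HW-Memory-001", "SW-DiskAccess-001", "SYS-UserInterface-001",
                "HW-Memory-1"]:  # the last one is malformed (number not 3 digits)
    status = "valid" if TEST_ID.match(case_id) else "invalid"
    print(f"{case_id}: {status}")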
7. Test Procedure Steps – Lists of lab steps for each test summarized in the Test Descriptions section.

a. Configuration

b. Equipment Required

c. Setup

d. Test Steps
8. Traceability Table (list of requirements and the test case numbers that provide coverage)

· The table lists requirements (line items in any documents that drive the product design) and shows which test cases in the system test plan provide coverage for each requirement. This helps ensure that the product has been adequately tested for conformance to the project goals.
· A multi-column traceability table can be created in Word or Excel using cut and paste, and can be prepared in about half a day after the test cases are listed; it is an easy, convenient method. (A sketch of generating such a table appears after this list.)
· More complex approaches, such as hierarchical document linking, can sometimes be cumbersome and time consuming without improving results.
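A sketch of generating a simple two-column traceability table as a CSV file that Excel can open; the requirement IDs and covering test case numbers are made-up examples:

import csv

# Hypothetical mapping of requirements to the test cases that cover them.
coverage = {
    "REQ-001 (boot time under 30 s)": ["SYS-Boot-001"],
    "REQ-002 (SATA disk access)": ["HW-SATA-001", "SW-DiskAccess-001"],
}

with open("traceability.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Requirement", "Covering Test Cases"])
    for req, cases in coverage.items():
        writer.writerow([req, ", ".join(cases)])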
9. “Test PASS Criteria” List and Test Results Log
It is sometimes practical to print a copy of the test plan and write results and observations directly in this section for each test case; the copy can then be filed. Final results can be summarized in a brief Word or Excel table with comments about the testing. (A sketch of such a results table appears at the end of this section.)
· Test PASS criteria reflect the appropriate requirements, specifications, or quality criteria covered by the test cases, as listed in the Traceability Table.
· Note: When an organization lacks project requirements and specifications, testers are sometimes asked to proceed anyway to help the organization. In this situation, testers can best proceed by documenting the test pass criteria they will use in terms of the product data sheets or other product feature and performance information available to them. The test pass criteria can be made part of the test plan document for the record, and are ideally reviewed by engineers prior to testing.
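A sketch of writing the final results summary as a CSV table; the test case numbers, results, and comments are illustrative:

import csv

# Hypothetical final results; one row per test case.
results = [
    ("HW-Memory-001", "PASS", "all patterns verified"),
    ("SYS-UserInterface-001", "FAIL", "console hang; defect report filed"),
]

with open("test_results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Test Case", "Result", "Comments"])
    writer.writerows(results)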
_______________________________________________