product development and maintenance. CMMI is the designated successor of the CMM. [CMMI] See also Capability Maturity Model (CMM).
capture/playback tool: A type of test execution tool where inputs are recorded during manual testing in order to generate automated test scripts that can be executed later (i.e. replayed). These tools are often used to support automated regression testing.
capture/replay tool: See capture/playback tool.
CASE: Acronym for Computer Aided Software Engineering.
CAST: Acronym for Computer Aided Software Testing. See also test automation.
cause-effect graph: A graphical representation of inputs and/or stimuli (causes) with their associated outputs (effects), which can be used to design test cases.
cause-effect graphing: A black box test design technique in which test cases are designed from cause-effect graphs. [BS 7925/2]
cause-effect analysis: See cause-effect graphing.
cause-effect decision table: See decision table.
certification: The process of confirming that a component, system or person complies with its specified requirements, e.g. by passing an exam.
changeability: The capability of the software product to enable specified modifications to be implemented. [ISO 9126] See also maintainability.
change control: See configuration control.
change control board: See configuration control board.
checker: See reviewer.
Chow's coverage metrics: See N-switch coverage. [Chow]
classification tree: A tree showing equivalence partitions hierarchically ordered, which is used to design test cases in the classification tree method. See also classification tree method.
classification tree method: A black box test design technique in which test cases, described by means of a classification tree, are designed to execute combinations of representatives of input and/or output domains. [Grochtmann]
code: Computer instructions and data definitions expressed in a programming language or in a form output by an assembler, compiler or other translator. [IEEE 610]
code analyzer: See static code analyzer.
code coverage: An analysis method that determines which parts of the software have been executed (covered) by the test suite and which parts have not been executed, e.g. statement coverage, decision coverage or condition coverage.
code-based testing: See white box testing.
co-existence: The capability of the software product to co-exist with other independent software in a common environment sharing common resources. [ISO 9126] See also portability.
commercial off-the-shelf software: See off-the-shelf software.
comparator: See test comparator.
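The code coverage entry above can be illustrated with a minimal sketch. The `grade` function and the hand-rolled line tracer are hypothetical; real projects would use a dedicated coverage tool (e.g. coverage.py) rather than `sys.settrace` directly.

```python
import sys

def grade(score):
    if score >= 90:
        return "A"
    if score >= 60:
        return "pass"
    return "fail"

def run_with_line_trace(func, *args):
    """Record which body lines of `func` (relative to its def line) execute."""
    executed = set()
    target = func.__code__

    def tracer(frame, event, arg):
        if frame.f_code is target and event == "line":
            executed.add(frame.f_lineno - target.co_firstlineno)
        return tracer

    old = sys.gettrace()
    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(old)
    return executed

# One test case covers only part of the function:
covered_one = run_with_line_trace(grade, 95)
# Adding test cases for the other branches raises statement coverage to 100%:
covered_all = covered_one | run_with_line_trace(grade, 70) | run_with_line_trace(grade, 10)
print(sorted(covered_one))  # [1, 2]: only the first branch was executed
print(sorted(covered_all))  # [1, 2, 3, 4, 5]: every statement was executed
```

The set difference `covered_all - covered_one` is exactly the report a coverage tool gives: the statements the first test suite never reached.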
compatibility testing: See interoperability testing.
compiler: A software tool that translates programs expressed in a high order language into their machine language equivalents. [IEEE 610]
complete testing: See exhaustive testing.
completion criteria: See exit criteria.
complexity: The degree to which a component or system has a design and/or internal structure that is difficult to understand, maintain and verify. See also cyclomatic complexity.
compliance: The capability of the software product to adhere to standards, conventions or regulations in laws and similar prescriptions. [ISO 9126]
compliance testing: The process of testing to determine the compliance of the component or system.
component: A minimal software item that can be tested in isolation.
component integration testing: Testing performed to expose defects in the interfaces and interaction between integrated components.
component specification: A description of a component's function in terms of its output values for specified input values under specified conditions, and required non-functional behavior (e.g. resource-utilization).
component testing: The testing of individual software components. [After IEEE 610]
compound condition: Two or more single conditions joined by means of a logical operator (AND, OR or XOR), e.g. 'A>B AND C>1000'.
concrete test case: See low level test case.
concurrency testing: Testing to determine how the occurrence of two or more activities within the same interval of time, achieved either by interleaving the activities or by simultaneous execution, is handled by the component or system. [After IEEE 610]
condition: A logical expression that can be evaluated as True or False, e.g. A>B. See also test condition.
condition combination coverage: See multiple condition coverage.
condition combination testing: See multiple condition testing.
condition coverage: The percentage of condition outcomes that have been exercised by a test suite. 100% condition coverage requires each single condition in every decision statement to be tested as True and False.
condition determination coverage: The percentage of all single condition outcomes that independently affect a decision outcome that have been exercised by a test case suite. 100% condition determination coverage implies 100% decision condition coverage.
condition determination testing: A white box test design technique in which test cases are designed to execute single condition outcomes that independently affect a decision outcome.
condition outcome: The evaluation of a condition to True or False.
condition testing: A white box test design technique in which test cases are designed to execute condition outcomes.
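A small sketch of condition coverage, using the compound condition 'A>B AND C>1000' from the glossary. The `decision` function and the two test cases are hypothetical; the point is that two cases suffice to evaluate each single condition as both True and False.

```python
# Decision with two single conditions: A > B AND C > 1000.
def decision(a, b, c):
    return a > b and c > 1000

# Two test cases that achieve 100% condition coverage:
cases = [
    (5, 3, 2000),  # A>B True,  C>1000 True  -> decision True
    (1, 3, 500),   # A>B False, C>1000 False -> decision False
]
cond1_outcomes = {a > b for a, b, c in cases}
cond2_outcomes = {c > 1000 for a, b, c in cases}
decision_outcomes = {decision(a, b, c) for a, b, c in cases}
print(cond1_outcomes == {True, False})     # True: A>B seen as True and False
print(cond2_outcomes == {True, False})     # True: C>1000 seen as True and False
print(decision_outcomes == {True, False})  # True: here decision coverage holds too
```

Note that 100% condition coverage does not guarantee 100% decision coverage in general: the pair (5, 3, 500) and (1, 3, 2000) also gives each condition both outcomes, yet the decision evaluates to False both times.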
confidence test: See smoke test.
configuration: The composition of a component or system as defined by the number, nature, and interconnections of its constituent parts.
configuration auditing: The function to check on the contents of libraries of configuration items, e.g. for standards compliance. [IEEE 610]
configuration control: An element of configuration management, consisting of the evaluation, co-ordination, approval or disapproval, and implementation of changes to configuration items after formal establishment of their configuration identification. [IEEE 610]
configuration control board (CCB): A group of people responsible for evaluating and approving or disapproving proposed changes to configuration items, and for ensuring implementation of approved changes. [IEEE 610]
configuration identification: An element of configuration management, consisting of selecting the configuration items for a system and recording their functional and physical characteristics in technical documentation. [IEEE 610]
configuration item: An aggregation of hardware, software or both, that is designated for configuration management and treated as a single entity in the configuration management process. [IEEE 610]
configuration management: A discipline applying technical and administrative direction and surveillance to: identify and document the functional and physical characteristics of a configuration item, control changes to those characteristics, record and report change processing and implementation status, and verify compliance with specified requirements. [IEEE 610]
configuration management tool: A tool that provides support for the identification and control of configuration items, their status over changes and versions, and the release of baselines consisting of configuration items.
configuration testing: See portability testing.
confirmation testing: See re-testing.
conformance testing: See compliance testing.
consistency: The degree of uniformity, standardization, and freedom from contradiction among the documents or parts of a component or system. [IEEE 610]
continuous representation: A capability maturity model structure wherein capability levels provide a recommended order for approaching process improvement within specified process areas. [CMMI]
control flow: A sequence of events (paths) in the execution through a component or system.
control flow analysis: A form of static analysis based on a representation of sequences of events (paths) in the execution through a component or system.
control flow graph: An abstract representation of all possible sequences of events (paths) in the execution through a component or system.
control flow path: See path.
conversion testing: Testing of software used to convert data from existing systems for use in replacement systems.
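The control flow graph and control flow path entries can be made concrete with a minimal sketch. The node names and adjacency mapping below are hypothetical: a single if/else decision, represented as a graph whose edges are the possible transfers of control.

```python
# Control flow graph as an adjacency mapping; 'decision' has two outgoing
# branches, matching the glossary's notion of alternative routes.
cfg = {
    "entry":    ["decision"],
    "decision": ["then", "else"],
    "then":     ["join"],
    "else":     ["join"],
    "join":     ["exit"],
    "exit":     [],
}

def all_paths(graph, node, goal, path=()):
    """Enumerate every control flow path from node to goal (acyclic graph)."""
    path = path + (node,)
    if node == goal:
        return [path]
    paths = []
    for nxt in graph[node]:
        paths.extend(all_paths(graph, nxt, goal, path))
    return paths

paths = all_paths(cfg, "entry", "exit")
print(len(paths))  # 2: one control flow path per decision outcome
```

Control flow analysis tools build essentially this structure from source code and then reason about reachability, paths, and complexity over it.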
cost of quality: The total costs incurred on quality activities and issues and often split into prevention costs, appraisal costs, internal failure costs and external failure costs.
COTS: Acronym for Commercial Off-The-Shelf software. See off-the-shelf software.
coverage: The degree, expressed as a percentage, to which a specified coverage item has been exercised by a test suite.
coverage analysis: Measurement of achieved coverage to a specified coverage item during test execution referring to predetermined criteria to determine whether additional testing is required and if so, which test cases are needed.
coverage item: An entity or property used as a basis for test coverage, e.g. equivalence partitions or code statements.
coverage measurement tool: See coverage tool.
coverage tool: A tool that provides objective measures of what structural elements, e.g. statements, branches have been exercised by a test suite.
custom software: See bespoke software.
cyclomatic complexity: The number of independent paths through a program. Cyclomatic complexity is defined as L - N + 2P, where:
- L = the number of edges/links in a graph
- N = the number of nodes in a graph
- P = the number of disconnected parts of the graph (e.g. a called graph and a subroutine)
[After McCabe]
cyclomatic number: See cyclomatic complexity.

D

daily build: A development activity where a complete system is compiled and linked every day (usually overnight), so that a consistent system is available at any time including all latest changes.
data definition: An executable statement where a variable is assigned a value.
data driven testing: A scripting technique that stores test input and expected results in a table or spreadsheet, so that a single control script can execute all of the tests in the table. Data driven testing is often used to support the application of test execution tools such as capture/playback tools. [Fewster and Graham] See also keyword driven testing.
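The cyclomatic complexity formula L - N + 2P can be checked on a small worked example. The graph below is hypothetical: the control flow graph of a single if/else decision, one connected component (P = 1).

```python
# Control flow graph of one if/else decision, as an adjacency mapping.
cfg = {
    "entry":    ["decision"],
    "decision": ["then", "else"],
    "then":     ["join"],
    "else":     ["join"],
    "join":     ["exit"],
    "exit":     [],
}

N = len(cfg)                                   # nodes: 6
L = sum(len(succs) for succs in cfg.values())  # edges: 1+2+1+1+1+0 = 6
P = 1                                          # one connected component
cyclomatic_complexity = L - N + 2 * P
print(cyclomatic_complexity)  # 2: matches the two independent paths
```

A complexity of 2 agrees with intuition: a single decision yields exactly two independent paths through the program.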
data flow: An abstract representation of the sequence and possible changes of the state of data objects, where the state of an object is any of: creation, usage, or destruction. [Beizer]
data flow analysis: A form of static analysis based on the definition and usage of variables.
data flow coverage: The percentage of definition-use pairs that have been exercised by a test suite.
data flow testing: A white box test design technique in which test cases are designed to execute definition and use pairs of variables.
data integrity testing: See database integrity testing.
database integrity testing: Testing the methods and processes used to access and manage the data(base), to ensure access methods, processes and data rules function as expected and that during access to the database, data is not corrupted or unexpectedly deleted, updated or created.
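Definition-use pairs, the unit counted by data flow coverage, can be shown in a few lines. The `shipping_cost` function is hypothetical: the variable `fee` has one definition and two uses on different paths, so full data flow coverage here needs two test cases.

```python
def shipping_cost(weight, express):
    fee = weight * 2          # definition of `fee`
    if express:
        return fee + 10       # use 1: du-pair (definition, this use)
    return fee                # use 2: du-pair (definition, this use)

# One test case per definition-use pair gives 100% data flow coverage:
print(shipping_cost(3, express=True))   # 16: exercises du-pair 1
print(shipping_cost(3, express=False))  # 6:  exercises du-pair 2
```

Note that a single test case would give 100% coverage of the definition but only 50% of the du-pairs, which is exactly what distinguishes data flow coverage from statement coverage.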
dead code: See unreachable code.
debugger: See debugging tool.
debugging: The process of finding, analyzing and removing the causes of failures in software.
debugging tool: A tool used by programmers to reproduce failures, investigate the state of programs and find the corresponding defect. Debuggers enable programmers to execute programs step by step, to halt a program at any program statement and to set and examine program variables.
decision: A program point at which the control flow has two or more alternative routes. A node with two or more links to separate branches.
decision condition coverage: The percentage of all condition outcomes and decision outcomes that have been exercised by a test suite. 100% decision condition coverage implies both 100% condition coverage and 100% decision coverage.
decision condition testing: A white box test design technique in which test cases are designed to execute condition outcomes and decision outcomes.
decision coverage: The percentage of decision outcomes that have been exercised by a test suite. 100% decision coverage implies both 100% branch coverage and 100% statement coverage.
decision outcome: The result of a decision (which therefore determines the branches to be taken).
decision table: A table showing combinations of inputs and/or stimuli (causes) with their associated outputs and/or actions (effects), which can be used to design test cases.
decision table testing: A black box test design technique in which test cases are designed to execute the combinations of inputs and/or stimuli (causes) shown in a decision table. [Veenendaal] See also decision table.
decision testing: A white box test design technique in which test cases are designed to execute decision outcomes.
defect: A flaw in a component or system that can cause the component or system to fail to perform its required function, e.g. an incorrect statement or data definition. A defect, if encountered during execution, may cause a failure of the component or system.
defect based technique: See defect based test design technique.
defect based test design technique: A procedure to derive and/or select test cases targeted at one or more defect categories, with tests being developed from what is known about the specific defect category. See also defect taxonomy.
defect density: The number of defects identified in a component or system divided by the size of the component or system (expressed in standard measurement terms, e.g. lines-of-code, number of classes or function points).
Defect Detection Percentage (DDP): The number of defects found by a test phase, divided by the number found by that test phase and any other means afterwards.
defect management: The process of recognizing, investigating, taking action and disposing of defects. It involves recording defects, classifying them and identifying the impact. [After IEEE 1044]
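The decision table testing entry can be sketched as follows. The `discount` function and its business rules are hypothetical; each rule of the table combines the two conditions (causes) with the expected action (effect) and becomes one test case.

```python
# Function under test (hypothetical business rules):
def discount(is_member, order_total):
    if is_member and order_total >= 100:
        return 20
    if is_member or order_total >= 100:
        return 10
    return 0

# Decision table: one rule per entry, covering all cause combinations.
decision_table = [
    {"is_member": True,  "order_total": 150, "expected": 20},
    {"is_member": True,  "order_total": 50,  "expected": 10},
    {"is_member": False, "order_total": 150, "expected": 10},
    {"is_member": False, "order_total": 50,  "expected": 0},
]

# A single control loop executes every rule as a test case:
for rule in decision_table:
    actual = discount(rule["is_member"], rule["order_total"])
    assert actual == rule["expected"], rule
print("all", len(decision_table), "rules pass")  # all 4 rules pass
```

Because the table enumerates every combination of causes, this style of testing makes missing or contradictory rules in a specification easy to spot.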