QUALITY DICTIONARY
A
Acceptable Quality Level (AQL) The quality level that is the worst tolerable process average when a continuing series of lots is submitted for acceptance sampling.
Acceptance Sampling Inspection of a sample from a lot to decide whether to accept or not accept that lot. There are two types – attributes sampling and variables sampling.
Accredited Registrars are qualified organizations certified by a national body to perform audits to the QS9000 standard and to register the audited facility as meeting these requirements for a given commodity.
Accuracy The closeness of agreement between an observed value and an accepted reference value. Also, see Precision.
Activity A process, function, or task that occurs over time and has recognizable results. Activities combine to form business processes.
Activity Analysis: the analysis and measurement (in terms of time, cost, and throughput) of distinct units of work that make up a process.
Activity Model A graphic representation of a business process that exhibits the activities and their interdependencies that make up the business process to any desired level of detail. An activity model reveals the interactions between activities in terms of inputs and outputs while showing the controls placed on each activity and the types of resources assigned to each activity.
Activity, non-value added – Any activity that provides a negative return on the investment or allocation of resources to that activity. Within broad limits, the enterprise benefits by allocating fewer resources to non-value added activities.
Activity, value-added – Any activity that contributes directly to the performance of a mission, and could not be eliminated without impairing the mission.
Activity-based costing (ABC) A system for making business decisions based on cost information about fundamental business activities, such as tasks related to product design, development, quality, manufacturing, distribution, customer acquisition, service, and support. ABC is sometimes considered a form of business process re-engineering because it insists on surfacing a manageable number of cost drivers that can be used to trace variable business costs to customers, products, and processes.
Affinity Diagram A way to organize facts, opinions, ideas, and issues into natural groupings as an aid to diagnosing a complex problem. A large number of ideas are generated and then organized into groupings to reveal major themes.
AIAG Automotive Industry Action Group
Algorithm (1) A finite set of well-defined rules for the solution of a problem in a finite number of steps. (2) Any sequence of operations for performing a specific task.
Algorithm analysis A software task to ensure that the algorithms selected are correct, appropriate, and stable, and meet all accuracy, timing, and sizing requirements.
Alignment A scale that measures how close an employee’s personal needs are to the organization’s needs.
ALPHA RISK – The probability of rejecting the null hypothesis (i.e., accepting the alternate hypothesis) when, in reality, the null hypothesis is true.
ALTERNATE HYPOTHESIS – A tentative explanation that indicates that an event does not follow a chance distribution; a contrast to the null hypothesis.
Analysis of Means (ANOM) Developed by Ellis R. Ott in 1967 (and later enhanced by Edward Schilling), ANOM is a statistical procedure for troubleshooting industrial processes and analyzing the results of experimental designs with factors at fixed levels. It provides a graphical display of data; Ott developed it after observing that non-statisticians had difficulty understanding the analysis of variance.
Analysis of Variance (ANOVA) is a basic statistical technique for analyzing experimental data. It subdivides the total variation of a data set into meaningful parts associated with specific sources of variation to test a hypothesis on the parameters of the model or to estimate variance components.
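As an illustration of the subdivision ANOVA performs, the one-way (single-factor) case partitions the total sum of squares into a between-group part and a within-group part (standard notation: k groups, n_i observations in group i, group means \bar{x}_i, grand mean \bar{x}):

    SS_total = SS_between + SS_within, i.e.
    \sum_{i=1}^{k}\sum_{j=1}^{n_i}(x_{ij}-\bar{x})^2 = \sum_{i=1}^{k} n_i(\bar{x}_i-\bar{x})^2 + \sum_{i=1}^{k}\sum_{j=1}^{n_i}(x_{ij}-\bar{x}_i)^2

Each part, divided by its degrees of freedom, gives the mean squares that are compared in the hypothesis test.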
Analysis (1) To separate into elemental parts or basic principles to determine the nature of the whole (2) A course of reasoning showing that a certain result is a consequence of assumed premises. (3) (ANSI) The methodical investigation of a problem, and the separation of the problem into smaller related units for further detailed study.
ANALYTIC NETWORK PROCESS The Analytic Network Process (ANP), though based on the Analytic Hierarchy Process, is a system for the analysis, synthesis, and justification of complex decisions with the capability to model non-linear relations between the elements. ANP allows the decision maker(s) to leap beyond the traditional hierarchy to the interdependent environment of network modeling. The ANP is designed for problems characterized by the added complexity of interdependencies such as feedback and dependencies among problem elements. Using a network approach makes it possible to represent and analyze interactions, incorporate non-linear relations between the elements, and synthesize mutual effects by a single logical procedure.
Analytical Hierarchy Process (AHP) Developed by Thomas Saaty, AHP provides a proven, effective means to deal with complex decision-making and can assist with identifying and weighting selection criteria, analyzing the data collected for the criteria, and expediting the decision-making process.
Anomaly. Anything observed in the documentation or operation of the software that deviates from expectations based on previously verified software products or reference documents. See bug, defect, error, exception, fault.
Appearance Item A product that is visible once the vehicle is completed. Certain customers will identify appearance items on the engineering drawings. In these cases, special approval for appearance (color, grain, texture, etc.) is required before production part submissions.
Apportionment Synonymous with Reliability Apportionment: the assignment of reliability goals from system to subsystem in such a way that the whole system will have the required reliability.
Approved Drawing An engineering drawing signed by the engineer and released through the customer’s system.
Approved Material Materials governed either by industry standard specifications (e.g., SAE, ASTM, DIN, ISO) or by customer specifications.
APQP Advanced Product Quality Planning
AQP Advanced Quality Plan
Architecture The organizational structure of a system or component.
Arrow Diagram Another term for a PERT or CPM chart: a graphic description of the sequential steps that must be completed before a project can be finished.
The Arrow Diagram method establishes the most suitable daily plan and monitors its progress efficiently. The arrow diagram, as used in PERT or CPM, is a network of lines that connect all the elements related to plan execution. It is typically represented graphically by either a horizontal or vertical tree structure connecting the elements.
Assertion A logical expression specifying a program state that must exist or a set of conditions that program variables must satisfy at a particular point during program execution.
Assertion checking Checking of user-embedded statements that assert relationships between elements of a program. An assertion is a logical expression that specifies a condition or relation among program variables. Tools that test the validity of assertions as the program is executing or tools that perform formal verification of assertions have this feature.
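A minimal Python sketch of an embedded assertion (the function and variable names are illustrative only): the assert statements express relations among program variables that must hold at those points, and an assertion-checking tool or the interpreter flags any violation during execution.

    def apply_discount(price: float, discount: float) -> float:
        # Assertion: at this point the discount must be a fraction between 0 and 1.
        assert 0.0 <= discount <= 1.0, "discount out of range"
        result = price * (1.0 - discount)
        # Assertion: a valid price and discount can never produce a negative result.
        assert result >= 0.0
        return result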
Assessment An evaluation process that includes a document review, an on-site audit, and an analysis and report. Customers may also include a self-assessment, internal audit results, and other materials in the assessment.
Assignable Cause The cause(s) of variation in a process that have an identifiable source and can eventually be eliminated. [Same as Special Cause]
ASSIGNABLE VARIATIONS – Variations in data that can be attributed to specific causes.
Attribute – A characteristic that may take on only one of a set of discrete values, e.g., 0 or 1.
Attribute Data 1. Product, process, or component data that is qualitative, rather than quantitative. 2. Product, process, or component data that is measured strictly by either conforming or not. Such data is counted in discrete units such as dollars, hours, items, and yes/no options. The alternative to attributes data is variables data, which is data that is measured on a continuous and infinite scale such as temperature or distance. Charts that use attribute data include bar charts, pie charts, Pareto charts, and some control charts.
AUDIT – A periodic inspection to ensure that a process is conforming to its specifications.
Audit (Quality) An independent review conducted to compare some aspects of quality performance with a standard for that performance. (Juran, Quality Control Handbook)
Audit (1)An independent examination of a work product or set of work products to assess compliance with specifications, standards, contractual agreements, or other criteria. (2) (ANSI) To conduct an independent review and examination of system records and activities to test the adequacy and effectiveness of data security and data integrity procedures, to ensure compliance with established policy and operational procedures, and to recommend any necessary changes.
Audit trail (1) (ISO) Data in the form of a logical path linking a sequence of events, is used to trace the transactions that have affected the contents of a record. (2) A chronological record of system activities that is sufficient to enable the reconstruction, reviews, and examination of the sequence of environments and activities surrounding or leading to each event in the path of a transaction from its inception to the output of final results.
Availability A product or service’s ability to perform its intended function at a given time and under appropriate conditions. It can be expressed by the ratio operative time/total time where operative time is the time that it is functioning or ready to function.
Average chart (X-bar chart) A control chart in which the average of the subgroup, represented by X-bar, is used to determine the stability or lack thereof in the process. Average charts are usually paired with range charts or sample standard deviation charts for complete analysis.
Average Outgoing Quality (AOQ) The expected average quality level of outgoing product for a given value of incoming product quality.
Average Outgoing Quality Limit (AOQL) The maximum average outgoing quality over all possible levels of incoming quality for a given acceptance sampling plan and disposal specification
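For the common rectifying-inspection case (rejected lots are screened 100% and defectives replaced, and the sample is small relative to the lot), the two quantities above are usually approximated as

    AOQ \approx p \cdot P_a(p), \qquad AOQL = \max_p AOQ(p)

where p is the incoming fraction defective and P_a(p) is the probability that the sampling plan accepts a lot of that quality.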
B
Balanced Scorecard A framework that translates a company’s vision and strategy into a coherent set of performance measures. Developed by Robert Kaplan and David Norton (published in the Harvard Business Review in 1993), a balanced business scorecard helps businesses evaluate how well they meet their strategic objectives. It typically has four to six components, each with a series of sub-measures. Each component highlights one aspect of the business. The balanced scorecard includes measures of performance that are lagging (return on capital, profit), medium-term indicators (like customer satisfaction indices), and leading indicators (such as adoption rates for, or revenue from, new products).
Baldrige Award Malcolm Baldrige National Quality Award: An annual award given to a United States company that excels in quality management and quality achievement. [Same as MBNQA.]
Bar chart A chart that compares different groups of data to each other through the use of bars that represent each group. Bar charts can be simple, in which each group of data consists of a single type of data, or grouped or stacked, in which the groups of data are broken down into internal categories.
Baseline A specification or product that has been formally reviewed and agreed upon, that serves as the basis for further development, and that can be changed only through formal change control procedures.
Batch A definite quantity of some product or material produced under conditions that are considered uniform.
Batch processing Execution of programs serially with no interactive processing. Contrast with real-time processing.
Benchmark A standard against which measurements or comparisons can be made.
Benchmark Data The results of an investigation to determine how competitors and/or best-in-class companies achieve their level of performance.
Benchmarking: a structured approach for identifying the best practices from industry and government, and comparing and adapting them to the organization’s operations. Such an approach is aimed at identifying more efficient and effective processes for achieving intended results, and suggesting ambitious goals for program output, product/service quality, and process improvement…
Best practice – A way or method of accomplishing a business function or process that is considered to be superior to all other known methods.
BETA RISK The probability of accepting the null hypothesis when, in reality, the alternate hypothesis is true.
Bias A systematic error that contributes to the difference between a population mean of measurements or test results and an accepted reference value.
Bill of Material Total list of all components/materials required to manufacture the product.
Black Belt The leader of a team responsible for applying the Six Sigma process.
Black-box testing (1) Testing that ignores the internal mechanism or structure of a system or component and focuses on the outputs generated in response to selected inputs and execution conditions. (2) Testing is conducted to evaluate the compliance of a system or component with specified functional requirements and corresponding predicted results. Syn. functional testing, input/output driven testing. Contrast with white-box testing.
Block Diagram The block diagram is a simple pictorial representation of a system/subsystems linked to illustrate the relationships between components/subsystems
BOM Bill Of Material
Boundary value (1) (IEEE) A data value that corresponds to a minimum or maximum input, internal, or output value specified for a system or component. (2) A value that lies at, just inside, or just outside a specified range of valid input and output values.
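For example, under a hypothetical specification that an input must be an integer in the range 1–100, the boundary values of interest lie at, just inside, and just outside each end of the range:

    # Hypothetical valid range: 1..100 inclusive.
    boundary_candidates = [0, 1, 2, 99, 100, 101]  # just outside, at, and just inside each boundary

    def is_valid(quantity: int) -> bool:
        return 1 <= quantity <= 100

    for value in boundary_candidates:
        print(value, is_valid(value))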
Brainstorming A tool used to encourage creative thinking and new ideas. A group formulates and records as many ideas as possible concerning a certain subject, regardless of the content of the ideas. No discussion, evaluation, or criticism of ideas is allowed until the brainstorming session is complete.
Branch An instruction that causes program execution to jump to a new point in the program sequence, rather than execute the next instruction. Syn: jump.
Branch analysis (Myers) A test case identification technique that produces enough test cases such that each decision has a true and a false outcome at least once. Contrast with path analysis.
Branch coverage (NBS) A test coverage criteria which requires that for each decision point each possible branch be executed at least once. Syn: decision coverage. Contrast with condition coverage, multiple condition coverage, path coverage, and statement coverage.
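A sketch of the idea (the function and threshold are illustrative only): the single decision below needs at least one test case driving it true and one driving it false to satisfy branch (decision) coverage.

    def classify(reading: float) -> str:
        if reading > 10.0:        # decision point: needs both a true and a false outcome
            return "out of spec"
        return "in spec"

    # Two test cases are enough for branch coverage of this one decision.
    assert classify(12.5) == "out of spec"   # decision evaluates true
    assert classify(7.0) == "in spec"        # decision evaluates false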
Breakthrough thinking A management technique that emphasizes the development of new, radical approaches to traditional constraints, as opposed to incremental or minor changes in thought that build on the original approach.
Bug A fault in a program that causes the program to perform in an unintended or unanticipated manner. See anomaly, defect, error, exception, fault.
Business process – A collection of activities that work together to produce a defined set of products and services. All business processes in an enterprise exist to fulfill the mission of the enterprise. Business processes must be related in some way to mission objectives.
Business Process Improvement (BPI) – The betterment of an organization’s business practices through the analysis of activities to reduce or eliminate non-value added activities or costs, while at the same time maintaining or improving quality, productivity, timeliness, or other strategic or business purposes as evidenced by measures of performance. Also called functional process improvement.
Business Process Reengineering (BPR) A structured approach by all or part of an enterprise to improve the value of its products and services while reducing resource requirements. The transformation of a business process to achieve significant levels of improvement in one or more performance measures relating to fitness for purpose, quality, cycle time, and cost by using the techniques of streamlining and removing added activities and costs.
Business Process: a collection of related, structured activities — a chain of events — that produces a specific service or product for a particular customer or customers
C
Capability is the total range of inherent variation in a stable process. It is determined using data from control charts. The control charts shall indicate stability before capability calculations can be made. Histograms are to be used to examine the distribution pattern of individual values and verify a normal distribution. When analysis indicates a stable process and a normal distribution, the indices Cp and Cpk can be calculated. If analysis indicates a non-normal distribution, advanced statistical tools such as PPM analysis, will be required to determine capability. If control charts show the process to be non-stable, the index Ppk can be calculated.
CAR Corrective Action Request
Care mapping Medical procedure for a particular diagnosis in a diagrammatic form that includes key decision points used to coordinate care and instruct patient.
Cause That which produces an effect or brings about a change.
Cause & Effect diagram A tool used to analyze all factors (causes) that contribute to a given situation or occurrence (effect) by breaking down main causes into smaller and smaller sub-causes. It is also known as the Ishikawa or the fishbone diagram.
Cause-effect graphing (1) A test data selection technique. The input and output domains are partitioned into classes, and analysis is performed to determine which input classes cause which effect. A minimal set of inputs is chosen which will cover the entire effect set. (2) (Myers) A systematic method of generating test cases representing combinations of conditions. See testing, functional.
Centre Line The line on a statistical process control chart that represents the characteristic’s central tendency.
CFT Cross-Functional Team.
Change control The processes, authorities, and procedures to be used for all changes that are made to the computerized system and/or the system’s data. Change control is a vital subset of the Quality Assurance [QA] program within an establishment and should be clearly described in the establishment’s SOPs, See: configuration control.
Change tracker A software tool that documents all changes made to a program.
Characteristic A definable or measurable feature of a process, product, or variable.
Central Tendency Numerical average, e.g., mean, median, and mode; center line on a statistical process control chart.
CHART A form used to display information obtained through data collection when measuring defects and/or problems.
CHARTER A document that specifies the purpose of a team, its power, its reporting relationships, and its specific responsibilities.
Check sheet A customized form used to record data. Usually, it is used to record how often some activity occurs. A list of things to do.
CIM Computer Integrated Manufacturing
Client/server A term used in a broad sense to describe the relationship between the receiver and the provider of a service. In the world of microcomputers, the term client-server describes a networked system where front-end applications, as the client, make service requests upon another networked system. Client-server relationships are defined primarily by software. In a local area network [LAN], the workstation is the client and the file server is the server. However, client-server systems are inherently more complex than file server systems. Two disparate programs must work in tandem, and there are many more decisions to make about separating data and processing between the client workstations and the database server. The database server encapsulates database files and indexes, restricts access, enforces security, and provides applications with a consistent interface to data via a data dictionary.
Clinical practice guidelines A general term for statements of accepted medical procedure for a particular diagnosis.
CMI Certified Mechanical Inspector
Code audit An independent review of source code by a person, team, or tool to verify compliance with software design documentation and programming standards. Correctness and efficiency may also be evaluated. Contrast with code inspection, code review, and code walkthrough. See static analysis.
Code inspection (Myers/NBS) A manual [formal] testing [error detection] technique where the programmer reads source code, statement by statement, to a group who asks questions analyzing the program logic, analyzing the code for a checklist of historically common programming errors, and analyzing its compliance with coding standards. In contrast with, code audit, code review, and code walkthrough. This technique can also be applied to other software and configuration items. Syn: Fagan Inspection. See static analysis.
Code See: program, source code.
Code review (IEEE) A meeting at which software code is presented to project personnel, managers, users, customers, or other interested parties for comment or approval. Contrast with code audit, code inspection, and code walkthrough. See static analysis.
Code walkthrough (Myers/NBS) A manual testing [error detection] technique where the program [source code] logic [structure] is traced manually [mentally] by a group with a small set of test cases, while the state of program variables is manually monitored, to analyze the programmer’s logic and assumptions. Contrast with code audit, code inspection, and code review. See static analysis.
Coding standards Written procedures describing coding [programming] style conventions specifying rules governing the use of individual constructs provided by the programming language, and naming, formatting, and documentation requirements that prevent programming errors, control complexity, and promote understandability of the source code. Syn: development standards, programming standards.
Common Cause Variation is variation caused by the process. It is produced by the interaction of aspects of the process that affect every occurrence
Common Cause Variation that affects all the individual values of a process.
Common causes Inherent causes of variation in a process. They are typical of the process, not unexpected. That is not to say that they must be tolerated; on the contrary, once special causes of variation are largely removed, a focus on removing common causes of variation can pay big dividends.
Comparator (IEEE) A software tool that compares two computer programs, files, or sets of data to identify commonalities or differences. Typical objects of comparison are similar versions of source code, object code, database files, or test results.
Completeness (NIST) The property that all necessary parts of the entity are included. The completeness of a product is often used to express the fact that all requirements have been met by the product. See traceability analysis.
Complexity (IEEE) (1) The degree to which a system or component has a design or implementation that is difficult to understand and verify. (2) Pertaining to any of a set of structure-based metrics that measure the attribute in (1).
Computer-aided software engineering (CASE)An automated system for the support of software development including an integrated toolset, i.e., programs, which facilitate the accomplishment of software engineering methods and tasks such as project planning and estimation, system and software requirements analysis, design of data structure, program architecture and algorithm procedure, coding, testing, and maintenance.
Computer system audit (ISO) An examination of the procedures used in a computer system to evaluate their effectiveness and correctness and to recommend improvements. See software audit.
Computer system security(IEEE) The protection of computer hardware and software from accidental or malicious access, use, modification, destruction, or disclosure. Security also pertains to personnel, data, communications, and the physical protection of computer installations.
Confidence Level The probability that a random variable x lies within a defined interval.
Confidence Limit The two values that define the confidence interval.
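As a textbook example (one common case, assuming the population standard deviation \sigma is known), a two-sided confidence interval for a population mean is

    \bar{x} \pm z_{\alpha/2} \frac{\sigma}{\sqrt{n}}

where \bar{x} is the sample mean, n the sample size, and z_{\alpha/2} the standard normal quantile; the two resulting values are the lower and upper confidence limits.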
Configurable, off-the-shelf software (COTS) Application software, sometimes general purpose, written for a variety of industries or users in a manner that permits users to modify the program to meet their individual needs.
Configuration control (IEEE) An element of configuration management, consisting of the evaluation, coordination, approval or disapproval, and implementation of changes to configuration items after formal establishment of their configuration identification. See change control.
Configuration management (IEEE) A discipline applying technical and administrative direction and surveillance to identify and document the functional and physical characteristics of a configuration item, control changes to those characteristics, record and report change processing and implementation status, and verify compliance with specified requirements. See: configuration control, change control
Conformance Meeting requirements or specifications.
Confounding Allowing two or more variables to vary together so that it is impossible to separate their unique effects.
Consensus Acceptance of a team decision so that everyone on the team can live with the decision and support it.
Consensus Method A method of reaching unanimous agreement in which members voluntarily give their consent; an agreement to support a decision.
consistency checker A software tool used to test requirements in design specifications for both consistency and completeness.
Consistency (IEEE) The degree of uniformity, standardization, and freedom from contradiction among the documents or parts of a system or component.
Consumer’s Risk The probability of accepting a lot when, in fact, the lot should have been rejected (see BETA RISK).
Continuous Data Numerical information at the interval or ratio level; subdivision is conceptually meaningful; can assume any number within an interval, e.g., 14.652 amps.
Continuous improvement On-going improvement of any and all aspects of an organization including products, services, communications, environment, functions, individual processes, etc.
Continuous Process Improvement A policy that encourages, mandates, and/or empowers employees to find ways to improve the process and product performance measures on an ongoing basis.
Continuous Random Variable A random variable that can assume any value continuously in some specified interval.
Control Chart is a line chart with control limits. It is based on the work of Shewhart and Deming. By mathematically constructing control limits at 3 standard deviations above and below the average, one can determine what variation is due to normal ongoing causes (common causes) and what variation is produced by unique events (special causes). By eliminating the special causes first and then reducing common causes, quality can be improved.
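A minimal sketch of the calculation described above, assuming a simple list of individual measurements and using the sample standard deviation as the estimate of sigma (real control charts usually estimate sigma from subgroup ranges or moving ranges):

    from statistics import mean, stdev

    measurements = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3]  # illustrative data

    center_line = mean(measurements)
    sigma_estimate = stdev(measurements)      # simple estimate; see note above
    ucl = center_line + 3 * sigma_estimate    # upper control limit
    lcl = center_line - 3 * sigma_estimate    # lower control limit
    print(f"CL={center_line:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}")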
Control Charts Statistical charts used in process measurement to differentiate process variation caused by common causes from variation caused by special (assignable) causes.
Control flow analysis (IEEE) A software V&V task to ensure that the proposed control flow is free of problems, such as design or code elements that are unreachable or incorrect.
Control flow diagram. (IEEE) A diagram that depicts the set of all possible sequences in which operations may be performed during the execution of a system or program. Types include box diagrams, flowcharts, input-process-output charts, and state diagrams. Contrast with data flow diagram. See: call graph, structure chart.
Control limit A statistically determined line on a control chart used to analyze variation within a process. If variation exceeds the control limits, then the process is being affected by special causes and is said to be “out of control.” A control limit is not the same as a specification limit.
Control Plans Control Plans are written descriptions of the systems for controlling parts and processes. They are written by suppliers to address the important characteristics and engineering requirements of the product. Each part shall have a Control Plan, but in many cases, “family” Control Plans can cover several parts produced using a common process. Customer approval of Control Plans may be required before production part submission.
Control Plans Written descriptions of the systems for controlling parts and processes.
Control Point is the desired result of a process.
Control Specifications Specifications called for by the product being manufactured.
Corrective Action Documented and purposeful change implemented to eliminate forever a specific cause of an identified non-conformance.
Corrective Action Action(s) designed to identify and eliminate root causes of non-conformances and non-conformities.
Corrective Action Plan A Corrective Action Plan is a plan for correcting a process or part quality issue.
Corrective maintenance (IEEE) Maintenance performed to correct faults in hardware or software. Contrast with adaptive maintenance, preventive maintenance.
Correctness (IEEE) The degree to which software is free from faults in its specification, design, and coding. The degree to which software, documentation and other items meet specified requirements. The degree to which software, documentation and other items meet user needs and expectations, whether specified or not.
Cost of Poor Quality Internal and External Failure Cost plus Appraisal and Prevention Costs
Cost of poor quality The costs incurred by producing products or services of poor quality. These costs usually include the cost of inspection, reworking duplicate work, scrapping rejects, replacements and refunds, complaints, and loss of customers and reputation.
Cost of Quality The total labor, materials, and overhead costs attributed to 1) preventing nonconforming products or services, 2) appraising products or services to ensure conformance, or 3) correcting or scrapping nonconforming products or services.
Count chart (c chart) An attributes data control chart that evaluates process stability by charting the counts of occurrences of a given event in successive samples.
Count-per-unit chart (u chart) A control chart that evaluates process stability by charting the number of occurrences of a given event per unit sampled, in a series of samples.
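The control limits conventionally used for these two attribute charts, with \bar{c} the average count per sample, \bar{u} the average count per unit, and n the number of units in the sample, are

    \bar{c} \pm 3\sqrt{\bar{c}} \quad \text{(c chart)}, \qquad \bar{u} \pm 3\sqrt{\bar{u}/n} \quad \text{(u chart)}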
Coverage analysis (NIST) Determining and assessing measures associated with the invocation of program structural elements to determine the adequacy of a test run. Coverage analysis is useful when attempting to execute each statement, branch, path, or iterative structure in a program. Tools that capture this data and provide reports summarizing relevant information have this feature See: testing, branch; testing, path; testing, statement.
Cp Commonly used process capability index defined as [USL (upper spec limit) – LSL(lower spec limit)] / [6 x sigma], where sigma is the estimated process standard deviation.
Cp/Cpk Capability Ratio/Capability Index
Cpk Commonly used process capability index defined as the lesser of (USL – m) / (3 x sigma) or (m – LSL) / (3 x sigma), where m is the process mean and sigma is the estimated process standard deviation.
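A short sketch of the two indices as defined in the entries above; the spec limits and the estimates of the process mean and sigma are assumed inputs.

    def process_capability(usl: float, lsl: float, mean: float, sigma: float):
        """Return (Cp, Cpk) from spec limits and the estimated process mean/sigma."""
        cp = (usl - lsl) / (6 * sigma)
        cpk = min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))
        return cp, cpk

    # Illustrative values only.
    print(process_capability(usl=10.5, lsl=9.5, mean=10.1, sigma=0.12))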
CQT ASQ Certified Quality Technician
Crash (IEEE) The sudden and complete failure of a computer system or component.
CRE ASQ Certified Reliability Engineer
Critical Characteristics Those product requirements (dimensions, performance tests) or process parameters that can affect compliance with government regulations or safe vehicle/product function and which require specific supplier, assembly, shipping, or monitoring actions and inclusion on Control Plans. Critical characteristics are identified with the inverted delta symbol.
Critical control point (CA) A function or an area in a manufacturing process or procedure, the failure of which, or loss of control over, may have an adverse effect on the quality of the finished product and may result in an unacceptable health risk.
Critical design review(IEEE) A review is conducted to verify that the detailed design of one or more configuration items satisfies specified requirements; to establish the compatibility among the configuration items and other items of equipment, facilities, software, and personnel; to assess risk areas for each configuration item; and, as applicable, to assess the results of producibility analyses, review preliminary hardware product specifications, evaluate preliminary test planning, and evaluate the adequacy of preliminary operation and support documents. See preliminary design review, and system design review.
Criticality analysis. (IEEE) Analysis which identifies all software requirements that have safety implications, and assigns a criticality level to each safety-critical requirement based upon the estimated risk.
Criticality(IEEE) The degree of impact that a requirement, module, error, fault, failure, or other item has on the development or operation of a system. Syn: severity.
Cumulative sum chart Control chart that shows the cumulative sum of deviations from a set value in successive samples. Each plotted point indicates the algebraic sum of the last point and all deviations since.
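A sketch of the plotted statistic for the basic form described above (the target and sample values are illustrative): each point is the running algebraic sum of deviations of the samples from the set value.

    target = 10.0
    samples = [10.2, 9.9, 10.4, 10.1, 10.3]   # illustrative subgroup averages

    cusum, points = 0.0, []
    for x in samples:
        cusum += x - target        # algebraic sum of deviations from the set value
        points.append(cusum)
    print(points)                  # values to plot on the cumulative sum chart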
CUSTOMER & SUPPLIER REQUIREMENTS WORKSHEET An information gathering tool to use with any work activity. It breaks down a job into its parts: Customer Requirements and Supplier Requirements.
Customer Satisfaction Index (American) Introduced in 1994 by the University of Michigan and the American Society for Quality, the CSI measures customer satisfaction at the national level. The index declined continually from 1994 through 1997, suggesting that quality improvements were not keeping pace with consumer expectations.
Customer The receiver of an output of a process, either internal or external to the organization. Can be a person, department, company, etc.
CUTOFF POINT The point which partitions the acceptance region from the reject region.
Cycle time: The time that elapses from the beginning to the end of a process or sub-process.
Cyclic redundancy [check] code (CRC)A technique for error detection in data communications used to assure a program or data file has been accurately transferred. The CRC is the result of a calculation on the set of transmitted bits by the transmitter which is appended to the data. At the receiver, the calculation is repeated and the results are compared to the encoded value. The calculations are chosen to optimize error detection. In contrast with check summation, parity check.
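A small illustration using Python’s built-in CRC-32 (one common CRC; real protocols differ in polynomial and width): the transmitter appends the computed value, and the receiver repeats the calculation and compares.

    import zlib

    payload = b"example record"              # data to transmit (illustrative)
    sent_crc = zlib.crc32(payload)           # value appended by the transmitter

    received = payload                       # pretend this arrived over the link
    ok = zlib.crc32(received) == sent_crc    # receiver repeats the calculation
    print("transfer accurate:", ok)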
Cyclomatic complexity(1) (McCabe) The number of independent paths through a program. (2) (NBS) The cyclomatic complexity of a program is equivalent to the number of decision statements plus 1.
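A small illustration of definition (2): the hypothetical function below contains two decision statements, so its cyclomatic complexity is 2 + 1 = 3, matching the three independent paths through it.

    def grade(score: int) -> str:
        # Two decision statements (the 'if' and the 'elif'); complexity = 2 + 1 = 3.
        if score >= 90:
            return "A"
        elif score >= 60:
            return "B"
        return "C"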
D
DATA Factual information used as a basis for reasoning, discussion, or calculation; often refers to quantitative information.
Data analysis (IEEE) (1) Evaluation of the description and intended use of each data item in the software design to ensure the structure and intended use will not result in a hazard. Data structures are assessed for data dependencies that circumvent isolation, partitioning, data aliasing, and fault containment issues affecting safety, and the control or mitigation of hazards. (2) Evaluation of the data structure and usage in the code to ensure each is defined and used properly by the program. Usually performed in conjunction with logic analysis.
Data Collection Gathering facts on how a process works and/or how a process is working from a customer’s point of view. All data collection is driven by a knowledge of the process and guided by statistical principles.
Data corruption (ISO) A violation of data integrity. Syn: data contamination.
Data dictionary (IEEE) (1) A collection of the names of all data items used in a software system, together with relevant properties of those items; e.g., length of the data item, representation, etc. (2) A set of definitions of data flows, data elements, files, databases, and processes referred to in a leveled data flow diagram set.
Data flow analysis (IEEE) A software V&V task to ensure that the input and output data and their formats are properly defined and that the data flows are correct.
Data flow diagram (IEEE) A diagram that depicts data sources, data sinks, data storage, and processes performed on data as nodes, and the logical flow of data as links between the nodes. Syn: data flowchart, data flow graph.
Data integrity (IEEE) The degree to which a collection of data is complete, consistent, and accurate. Syn: data quality.
Data validation(1) (ISO) A process used to determine if data are inaccurate, incomplete, or unreasonable. The process may include format checks, completeness checks, check key tests, reasonableness checks and limit checks. (2) The checking of data for correctness or compliance with applicable standards, rules, and conventions.
DCP Dynamic Control Plan/Dimensional Control Plan.
Dead code Program code statements which can never execute during program operation. Such code can result from poor coding style or can be an artifact of previous versions or debugging efforts. Dead code can be confusing and is a potential source of erroneous software changes.
Debugging (Myers) Determining the exact nature and location of a program error, and fixing the error.
Decision coverage (Myers) A test coverage criteria requiring enough test cases such that each decision has a true and false result at least once, and that each statement is executed at least once. Syn: branch coverage. Contrast with condition coverage, multiple condition coverage, path coverage, and statement coverage.
Decision matrix A tool used to evaluate problems, solutions, or ideas. The possibilities are listed down the left-hand side of the matrix and relevant criteria are listed across the top. Each possibility is then rated on a numeric scale of importance or effectiveness (e.g. on a scale of 1 to 10) for each criterion, and each rating is recorded in the appropriate box. When all ratings are complete, the scores for each possibility are added to determine which has the highest overall rating and thus deserves the greatest attention.
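A minimal sketch of the arithmetic described above, with illustrative options, criteria, and 1–10 ratings:

    ratings = {
        "Option A": {"cost": 7, "quality": 9, "speed": 5},
        "Option B": {"cost": 8, "quality": 6, "speed": 8},
    }

    # Add the ratings across criteria; the highest total deserves the greatest attention.
    totals = {option: sum(scores.values()) for option, scores in ratings.items()}
    print(totals, "best:", max(totals, key=totals.get))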
Decision table (IEEE) A table used to show sets of conditions and the actions resulting from them.
Defect An error in the construction of a product or service that renders it unusable; an error that causes a product or service to not meet requirements.
DEFECT FREE A personal performance standard that says specifications should be met every time. An attitude that displays a personal commitment to doing the job right the first time, every time.
Defect Nonconformance to requirements. See anomaly, bug, error, exception, fault.
Defect analysis See failure analysis.
DEGREES OF FREEDOM The number of independent measurements available for estimating a population parameter
Deming cycle Alternate name for the Plan-Do-Check-Act cycle, a four-stage approach to problem-solving. It is also sometimes called the Shewhart cycle.
Deming Cycle Plan-Do-Study-Act; also called the Shewhart cycle.
Deming, W. Edwards Known as the father of quality control, Deming began his work in quality control in the United States during World War II to aid the war effort. After the war, he went to Japan to help in the rebuilding of the country. His methods of quality control became an integral part of Japanese industry. Deming is a celebrated author and is well known for his “14 Points” for effective management.
DENSITY FUNCTION The function which yields the probability that a particular random variable takes on any one of its possible values.
DEPENDENT VARIABLE A response variable; e.g., Y is the dependent or “response” variable in Y = f(X1, ..., XN).
Design for Manufacturability and Assembly A simultaneous engineering process designed to optimize the relationship between design function, manufacturability, and ease of assembly.
Design Information Checklist A mistake-proofing checklist designed to assure that all important items were considered in establishing design requirements.
Design phase (IEEE) The period in the software life cycle during which the designs for architecture, software components, interfaces, and data are created, documented and verified to satisfy requirements.
Design Reviews A proactive process to prevent problems and misunderstandings.
Design specification (NIST) A specification that documents how a system is to be built. It typically includes system or component structure, algorithms, control logic, data structures, data set [file] use information, input/output formats, interface descriptions, etc Contrast with design standards, and requirement. See software design description.
Design Validation Testing to ensure that the product conforms to defined user needs and/or requirements. Design validation follows successful design verification and is normally performed on the final product under defined, operating conditions. Multiple validations may be performed if there are different intended uses.
Design Verification Testing to ensure that all design outputs meet design input requirements. Design verification may include activities such as Design Review, Performing Alternate Calculations, Understanding Tests & Demonstrations, and Review of Design Stage Documents Before Release.
Development methodology (ANSI) A systematic approach to software creation that defines development phases and specifies the activities, products, verification procedures, and completion criteria for each phase. See incremental development, rapid prototyping, spiral model, and waterfall model.
DFA Design For Assembly
DFM Design For Manufacturing
DFMEA Design Failure Mode Effects Analysis: An analytical technique used to assure that potential design failure modes and associated causes have been considered and addressed. [See FMEA, PFMEA]
Diagnostic journey/Remedial journey A problem-solving approach in which a problem is investigated by looking first at symptoms, and gradually working back towards root causes. Once root causes have been established, experimentation and tracking are used in the remedial journey – the finding of a cure for the roots of the problem.
Diagnostic (IEEE) Pertaining to the detection and isolation of faults or failures; for example, a diagnostic message or a diagnostic manual.
Discounted Cash Flow A method of performing an economic analysis that takes the time value of money into account. Used to remove interest rates and inflation factors from a calculation so that the results of the analysis are comparable.
DISCRETE RANDOM VARIABLE A random variable that can assume values only from a definite number of discrete values.
DISTRIBUTIONS Tendency of large numbers of observations to group themselves around some central value with a certain amount of variation or “scatter” on either side
Documentation Material defining the process to be followed (e.g, quality manual, operator instructions, graphics, pictorials).
DOE (Design of experiments) DOE is the science of designing sets of experiments that will generate enough useful data to make sound decisions without costing too much or taking too long.
DPO Defects per opportunity
DPMO defects per million opportunities
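The usual calculation behind the acronym (defect count, number of units, and opportunities per unit are assumed inputs):

    def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
        """Defects per million opportunities."""
        return defects / (units * opportunities_per_unit) * 1_000_000

    print(dpmo(defects=15, units=1_000, opportunities_per_unit=5))  # 3000.0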
Durability The probability that an item will continue to function at customer expectation levels, over its useful life, without requiring overhaul or rebuild due to wear-out.
Dynamic analysis (NBS) Analysis performed by executing the program code. Contrast with static analysis. See: testing.
E
Effectiveness The state of having produced a decided or desired effect; the state of achieving customer satisfaction
Efficiency A measure of performance that compares the output with cost or resource utilization
Embedded software (IEEE) Software that is part of a larger system and performs some of the requirements of that system; e,g., software used in an aircraft or rapid transit system. Such software does not provide an interface for the user. See firmware,
Employee involvement Regular participation of employees in decision-making and suggestions. The driving forces behind increasing the involvement of employees are the conviction that more brains are better, that people in the process know it best, and that involved employees will be more motivated to do what is best for the organization.
Empowerment Usually refers to giving employees decision-making and problem-solving authority within their jobs.
End user (ANSI) (1) A person, device, program, or computer system that uses an information system for data processing in information exchange. (2) A person whose occupation requires the use of an information system but does not require any knowledge of computers or computer programming. See user.
Entity relationship diagram (IEEE) A diagram that depicts a set of real-world entities and the logical relationships among them. See the data structure diagram.
Entity The representation of a set of real or abstract things (people, objects, places, events, ideas, a combination of things, etc.) that are recognized as the same type because they share the same characteristics and can participate in the same relationships.
Environment all of the process conditions surrounding or affecting the manufacture and quality of a part or product.
Environment (ANSI) (1) Everything that supports a system or the performance of a function. (2) The conditions that affect the performance of a system or function.
Equivalence class partitioning (Myers) Partitioning the input domain of a program into a finite number of classes [sets], to identify a minimal set of well-selected test cases to represent these classes. There are two types of input equivalence classes, valid and invalid.
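A sketch under an assumed specification (an age field valid from 18 to 65): the input domain splits into one valid and two invalid classes, and one representative value from each class is sufficient.

    # Hypothetical spec: age must be between 18 and 65 inclusive.
    equivalence_classes = {
        "valid (18-65)":  30,   # representative of the valid class
        "invalid (< 18)": 10,   # representative of the 'too low' invalid class
        "invalid (> 65)": 80,   # representative of the 'too high' invalid class
    }

    def accepts(age: int) -> bool:
        return 18 <= age <= 65

    for name, value in equivalence_classes.items():
        print(name, value, accepts(value))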
Error (ISO) A discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition. See anomaly, bug, defect, exception, fault.
Error analysis See: debugging, failure analysis.
Error detection Techniques used to identify errors in data transfers. See: check summation, cyclic redundancy check [CRC], parity check, longitudinal redundancy.
Error guessing (NBS) A test data selection technique in which values thought likely to cause errors are chosen.
Error seeding (IEEE) The process of intentionally adding known faults to those already in a computer program to monitor the rate of detection and removal, and estimating the number of faults remaining in the program. Contrast with mutation analysis.
Event A happening; the arrival of a significant point in time, a change in the status of something, or the occurrence of something external that causes the business to react.
Event table A table that lists events and the corresponding specified effect[s] of or reaction[s] to each event.
Exception conditions/responses table A special type of event table.
Exception (IEEE) An event that causes suspension of normal program execution. Types include addressing exception, data exception, operation exception, overflow exception, protection exception, and underflow exception.
Execution trace (IEEE) A record of the sequence of instructions executed during the execution of a computer program. Often takes the form of a list of code labels encountered as the program executes. Syn: code trace, control flow trace. See retrospective trace, subroutine trace, symbolic trace, and variable trace.
EXECUTIVE OVERVIEW The course that teaches key executives their role in the Quality Process.
Expectations Customer perceptions about how a product or service will meet their needs and requirements; expectations for a product or service are shaped by many factors; including the specific use the customer intends to make of it, prior experience with a similar product or service, and representations and commitments made by marketing and advertising.
EXPERIMENT A test under defined conditions to determine an unknown effect; to illustrate or verify a known law; to test or establish a hypothesis.
EXPERIMENTAL ERROR Variation in observations made under identical test conditions; also called residual error. The amount of variation that cannot be attributed to the variables included in the experiment.
External customer A person or organization outside your organization who receives the output of a process. Of all external customers, the end-user should be the most important.
Extremal test data (NBS) Test data that is at the extreme or boundary of the domain of an input variable or which produces results at the boundary of an output domain.
F
Facilitator Person who helps a team with issues of teamwork, communication, and problem-solving. A facilitator should not contribute to the actual content of the team’s project, focusing instead as an observer on the team’s functioning as a group.
Factorial Design Factorial designs are generally employed in engineering and manufacturing experiments. It is appropriate when several factors are to be investigated at two or more levels and the interaction of factors may be important. Also, see Design of Experiments.
FACTORS Independent variables.
Fail-safe (IEEE) A system or component that automatically places itself in a safe operational mode in the event of a failure,
Failure analysis Determining the exact nature and location of a program error to fix the error, identify and fix other similar errors, and initiate corrective action to prevent future occurrences of this type of error. Contrast with debugging.
Failure Mode Effects Analysis A technique that systematically analyzes the types of failures which will be expected as a product is used, and what the effects of each “failure mode” will be.
Failure Modes and Effects Analysis (IEC) A method of reliability analysis intended to identify failures, at the basic component level, which have significant consequences affecting the system performance in the application considered.
Failure Modes and Effects Criticality Analysis (IEC) A logical extension of FMEA which analyzes the severity of the consequences of failure
Failure (IEEE) The inability of a system or component to perform its required functions within specified performance requirements. See bug, crash, exception, fault.
Fault An incorrect step, process, or data definition in a computer program that causes the program to perform in an unintended or unanticipated manner. See anomaly, bug, defect, error, exception.
Fault Tree Analysis (IEC) The identification and analysis of conditions and factors which cause or contribute to the occurrence of a defined undesirable event, usually one which significantly affects system performance, economy, safety, or other required characteristics.
FEA Finite Element Analysis
Feasibility study Analysis of the known or anticipated need for a product, system, or component to assess the degree to which the requirements, designs, or plans can be implemented.
Finite Element Analysis A technique for modeling a complex structure. When the mathematical model is subjected to known loads, the displacement of the structure may be determined.
FISHBONE DIAGRAM Also known as a Cause and Effect Analysis Diagram, used by a problem-solving team during brainstorming to logically list and display known and potential causes of a problem. Analysis of the listed causes is done to identify root causes.
Fixed Cost A cost that does not vary with the amount or degree of production. The costs that remain if an activity or process stops.
FIXED EFFECTS MODEL Experimental treatments are specifically selected by the researcher. Conclusions only apply to the factor levels considered in the analysis. Inferences are restricted to the experimental levels.
FIXING Temporary actions taken to make the output of a process conform to its specifications.
Flow Chart A pictorial representation showing all of the steps of a process.
Flowchart A graphical representation of a given process delineating each step. It is used to diagram how the process functions and where waste, error, and frustration enter the process.
The flowchart lists the order of activities. The circle symbol indicates the beginning or end of the process. The box indicates action items and the diamond indicates decision points. A beneficial technique is to map the ideal process and the actual process and identify the differences as targets for improvements.
Flowchart or flow diagram (1) (ISO) A graphical representation in which symbols are used to represent such things as operations, data, flow direction, and equipment, for the definition, analysis, or solution of a problem. (2) (IEEE) A control flow diagram in which suitably annotated geometrical figures are used to represent operations, data, or equipment, and arrows are used to indicate the sequential flow from one to another. Syn: flow diagram. See block diagram, box diagram, bubble chart, graph, input-process-output chart, and structure chart.
FLUCTUATIONS Variances in data caused by a large number of minute variations or differences
FMA Failure Mode Analysis.
FMEA Failure Mode and Effects Analysis
FMEA Failure Mode Effects Analysis: An analytical technique used to assure that potential failure modes and associated causes have been considered and addressed.
Force Field Analysis A tool, developed by social psychologist Kurt Lewin, used to analyze the opposing forces involved in causing or resisting any change. It is shown in balance sheet format with forces that will help (driving forces) listed on the left and forces that hinder (restraining forces) listed on the right.
Formal qualification review (IEEE) The test, inspection, or analytical process by which a group of configuration items comprising a system is verified to have met specific contractual performance requirements. Contrast with code review, design review, requirements review, and test readiness review.
FREQUENCY DISTRIBUTION The pattern or shape formed by the group of measurements in a distribution
Frequency distribution An organization of data, usually in a chart, which depicts how often different events occur. A histogram is one common type of frequency distribution, and a frequency polygon is another.
FTC First Time Capability.
Function A specific set of skills and resources that can be used to perform one or more activities that make up a process. Usually, several functions are associated with a single process.
Functional configuration audit (IEEE) An audit is conducted to verify that the development of a configuration item has been completed satisfactorily, that the item has achieved the performance and functional characteristics specified in the functional or allocated configuration identification, and that its operational and support documents are complete and satisfactory. See physical configuration audit.
Functional design (IEEE) (1) The process of defining the working relationships among the components of a system. See architectural design. (2) The result of the process in (1).
Functional Economic Analysis (FEA) A technique for analyzing and evaluating alternative information system investments and management practices. Within DoD, FEA is a business case. Also, a document that contains a fully justified proposed improvement project with all supporting data.
Functional Process Improvement A structured approach by all or part of an enterprise to improve the value of its products and services while reducing resource requirements. Also referred to as business process improvement (BPI), business process redesign, and business reengineering.
Functional requirement (IEEE) A requirement that specifies a function that a system or system component must be able to perform.
Functional Verification Functional Verification is testing to ensure the part conforms to all customer and supplier engineering performance and material requirements. Functional verification (to applicable customer engineering material and performance standards) may be required by some customers annually unless another frequency is established in a customer approval control plan. Results shall be available for customer review upon request.
G
Gage R&R Gage Repeatability & Reproducibility
Gantt chart A bar chart that shows planned work and finished work in time. Each task in a list has a bar corresponding to it. The length of the bar is used to indicate the expected or actual duration of the task.
Grade An indicator of category or rank related to features or characteristics that cover different sets of needs for products or services intended for the same functional use.
Graph (IEEE) A diagram or other representation consisting of a finite set of nodes and internode connections called edges or arcs. Contrast with blueprint. See: block diagram, box diagram, bubble chart, call graph, cause-effect graph, control flow diagram, data flow diagram, directed graph, flowchart, input-process-output chart, structure chart, and transaction flowgraph.
Graphic software specifications Documents such as charts, diagrams, and graphs that depict program structure, states of data, control, transaction flow, HIPO, and cause-effect relationships; and tables including truth, decision, event, state-transition, module interface, exception conditions/responses necessary to establish design integrity.
Green Belt An individual who supports the implementation and application of Six Sigma tools by way of participation on project teams
H
Hazard analysis A technique used to identify conceivable failures affecting system performance, human safety, or other required characteristics. See FMEA, FMECA, FTA, software hazard analysis, software safety requirements analysis, software safety design analysis, software safety code analysis, software safety test analysis, and software safety change analysis.
Hazard probability (DOD) The aggregate probability of occurrence of the individual events that create a specific hazard.
Hazard severity (DOD) An assessment of the consequence of the worst credible mishap that could be caused by a specific hazard.
Hazard (DOD) A condition that is a prerequisite to a mishap.
Histogram A specialized bar chart showing the distribution of measurement data. It will pictorially reveal the amount and type of variation within a process. It is a bar chart showing a distribution of variables. An example would be to line up by height a group of people in a course. Normally one would be the tallest and one would be the shortest and there would be a cluster of people around an average height. Hence the phrase “normal distribution”. This tool helps identify the cause of problems in a process by the shape of the distribution as well as the width of the distribution.
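A minimal sketch (hypothetical measurements, not from the glossary's sources) of the counting-into-bins step that underlies a histogram, printed here as a simple text chart:

```python
# Hypothetical height measurements in centimetres.
heights = [158, 162, 165, 167, 168, 170, 171, 171, 173, 175, 176, 179, 183, 188]

bin_width = 5
counts = {}
for h in heights:
    low = (h // bin_width) * bin_width      # lower edge of the bin this value falls in
    counts[low] = counts.get(low, 0) + 1

# One row per bin: values cluster around the average height, tails are sparse.
for low in sorted(counts):
    print(f"{low}-{low + bin_width - 1}: {'#' * counts[low]}")
```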
Homogeneity of Variance The variances of the groups being contrasted are equal (as defined by a statistical test of significant difference).
Hoshin kanri Japanese term for hoshin planning, a form of interactive strategic planning which aids the flow of information up and down the organizational layers in a systematic, productive way.
Hoshin planning A method of strategic planning for quality. It helps executives integrate quality improvement into the organization’s long-range plan. It is a method used to ensure that the mission, vision, goals, and annual objectives of an organization are communicated to and implemented by everyone, from the executive level to the ‘front line level.”
I
Inspection A manual testing technique in which program documents [specifications (requirements, design), source code, or user's manuals] are examined in a very formal and disciplined manner to discover errors, violations of standards, and other problems. Checklists are a typical vehicle used in accomplishing this technique. See static analysis, code audit, code inspection, code review, and code walkthrough.
Inspection Activities, such as measuring, examining, testing, gaging one or more characteristics of a product or service, and comparing these with specified requirements to determine conformity.
Instability Unnaturally large fluctuations in a pattern.
installation and checkout phase (IEEE) The period in the software life cycle during which a software product is integrated into its operational environment and tested in this environment to ensure that it performs as required.
Instruction set (1) (IEEE) The complete set of instructions recognized by a given computer or provided by a given programming language. (2) (ISO) The set of the instructions of a computer, of a programming language, or the programming languages in a programming system. See computer instruction set.
Instruction (1) (ANSI/IEEE) A program statement that causes a computer to perform a particular operation or set of operations. (2) (ISO) In a programming language, a meaningful expression that specifies one operation and identifies its operands, if any.
Instrumentation (NBS) The insertion of additional code into a program to collect information about program behavior during program execution. Useful for dynamic analysis techniques such as assertion checking, coverage analysis, and tuning.
Interface analysis (IEEE) Evaluation of (1) software requirements specifications with hardware, user, operator, and software interface requirements documentation, (2) software design description records with hardware, operator, and software interface requirements specifications, (3) source code with hardware, operator, and software interface design documentation, for correctness, consistency, completeness, accuracy, and readability. Entities to evaluate include data items and control items.
Interface requirement (IEEE) A requirement that specifies an external item with which a system or system component must interact, or sets forth constraints on formats, timing, or other factors caused by such an interaction.
Interface (1) (ISO) A shared boundary between two functional units, defined by functional characteristics, common physical interconnection characteristics, signal characteristics, and other characteristics, as appropriate. The concept involves the specification of the connection of two devices having different functions. (2) A point of communication between two or more processes, persons, or other physical entities. (3) A peripheral device that permits two or more devices to communicate.
Interim Approval Permits shipment of products for a specified period or quantity.
Internal customer Someone within your organization, further downstream in a process, who receives the output of your work.
Interrelations Digraph is a graphical representation of all the factors in a complicated problem, system, or situation. It is typically used in conjunction with one of the other quality tools, particularly the affinity diagram. Frequently the header cards from the affinity diagram are used as the starting point for the interrelations digraph.
Interval Numeric categories with equal units of measure but no absolute zero point, i.e., quality scale or index.
Invalid inputs (1) (NBS) Test data that lie outside the domain of the function the program represents. (2) These are not only inputs outside the valid range for data to be input, i.e., when the specified input range is 50 to 100, but also unexpected inputs, especially when these unexpected inputs may easily occur; e.g., the entry of alpha characters or special keyboard characters when only numeric data is valid, or the input of abnormal command sequences to a program.
Ishikawa Diagram A problem-solving tool that uses a graphic description of the various process elements to analyze potential sources of variation, or problems. [Same as Cause and Effect Diagram, or Fishbone Diagram]
Ishikawa, Kaoru One of Japan’s quality control pioneers. He developed the cause & effect diagram (Ishikawa diagram) in 1943 and published many books addressing quality control. In addition to his work at Kawasaki, Ishikawa was a long-standing member of the Union of Japanese Scientists and Engineers and an assistant professor at the University of Tokyo.
ISIR Initial Sample Inspection Report
ISO 9000 A family of ISO standards that apply to quality management and quality assurance. Specifically, quality systems.
J
JIT Just-In-Time: An inventory control system where components/products and services are delivered to the customer only when needed.
Job control language. (IEEE) A language used to identify a sequence of jobs, describe their requirements to an operating system, and control their execution.
Job. (IEEE) A user-defined unit of work that is to be accomplished by a computer. For example, the compilation, loading, and execution of a computer program. See job control language.
Juran, Joseph M. One of the quality gurus, and, like Deming, an early student of the work of Walter Shewhart at Western Electric. His work has specialized in linking management to quality engineering. Dr. Juran is the founder of the Juran Institute which has long been the vehicle of his work in quality management and is well-known for espousing “the quality trilogy” of quality planning, quality control, and quality improvement. Juran has authored many books and other works to spread awareness of quality management ideas and applications.
Just in time A policy calling for the delivery of material, products, or services at the time they are needed in an activity or process, to reduce inventory, wait time, and spoilage.
Just-in-time instruction Training given as needed for immediate application, without lag time and the usual loss of retention.
K
Kaizen Taken from the Japanese words kai and zen, where kai means change and zen means good. The popular meaning is continuous improvement of all areas of a company, not just quality.
Key Performance Indicators KPI refers to the short list of measurable parameters that will indicate how well the business is doing at attaining its goals. In a manufacturing quality scenario, this may be the amount of scrap or rework that gets metered. In a service quality scenario, such as an insurance company, this may be the open inventory of unprocessed claims. In brand management, market share in itself and comparison with competing brands are sure to be relevant. In logistics, on-time deliveries, empty return loads, or missing items are candidate indicators.
KJ method Another name for the affinity diagram, after its inventor, Kawakita Jiro.
Knowledge Management The leveraging of collective wisdom to increase responsiveness and innovation.
L
Life cycle methodology. The use of any one of several structured methods to plan, design, implement, test, and operate a system from its conception to the termination of its use. See waterfall model.
Line Charts Charts used to track performance without reference to process capability or control limits.
Logic analysis. (IEEE) (1) Evaluates the safety-critical equations, algorithms, and control logic of the software design. (2) Evaluates the sequence of operations represented by the coded program and detects programming errors that might create hazards.
Lower Control Limit A horizontal dotted line plotted on a control chart that represents the lower limit of a process's capability.
M
Maintainability The probability that a failed system can be made operable in a specified interval or downtime.
Maintainability. (IEEE) The ease with which a software system or component can be modified to correct faults, improve performance or other attributes, or adapt to a changing environment. Syn: modifiability
Maintenance. (QA) Activities such as adjusting, cleaning, modifying, and overhauling equipment to assure performance in accordance with requirements. Maintenance of a software system includes correcting software errors, adapting software to a new environment, or making enhancements to software. See: adaptive maintenance, corrective maintenance, preventive maintenance.
Management Systems Software tools for supporting the modeling, analysis, and enactment of business processes.
Master Black Belt A teacher and mentor of Black Belts. Provides support, reviews projects, and undertakes larger scale projects.
The Matrix Analysis method quantifies and arranges matrix diagram data so that the information is easy to visualize and comprehend. The relationships between the elements shown in a matrix diagram are quantified by obtaining numerical data for intersection cells. Of the seven new QC tools, this is the only numerical analysis method. The results of this technique, however, are presented in diagram form. … One major technique that this method also utilizes is known as principal components analysis.
The matrix Diagram method clarifies problematic spots through multidimensional thinking. … The matrix diagram method identifies corresponding elements involved in a problem situation or event. These elements are arranged in rows and columns on a chart that shows the presence or absence of relationships among collected pairs of elements. … Effective problem solving is facilitated at the intersection points, also referred to as the idea conception points. … Matrix diagrams are classified based on their pattern into five groups: (1) the L type matrix, (2) the T type matrix, (3) the Y type matrix, (4) the X type matrix, and (5) the C type matrix.
MBNQA Malcolm Baldrige National Quality Award: An annual award given to a United States company that excels in quality management and quality achievement. [Same as Baldrige Award.]
MBTI Myers Briggs Type Indicator
Measurement The assignment of numbers to actions or events.
Mean The average of a group of measurement values. The mean is determined by dividing the sum of the values by the number of values in the group.
mean time between failures. A measure of the reliability of a computer system, equal to the average operating time of equipment between failures, as calculated on a statistical basis from the known failure rates of various components of the system.
Mean time to failure (MTTF). A measure of reliability, giving the average time before the first failure.
Mean time to repair (MTTR). A measure of the reliability of a piece of repairable equipment, giving the average time between repairs.
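A minimal sketch (not from the glossary's sources) of how these reliability measures are commonly estimated from observed operating and repair times; all figures below are hypothetical, and availability is computed with the usual steady-state approximation MTBF / (MTBF + MTTR):

```python
# Hypothetical reliability figures for illustration only.
operating_hours = 4_500.0   # total observed operating time
repair_hours = 30.0         # total observed repair time
failures = 15               # number of failures observed

mtbf = operating_hours / failures          # mean time between failures
mttr = repair_hours / failures             # mean time to repair
availability = mtbf / (mtbf + mttr)        # steady-state availability estimate

print(f"MTBF = {mtbf:.1f} h, MTTR = {mttr:.1f} h, availability = {availability:.3%}")
```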
measurable. Capable of being measured.
Measure. (IEEE) A quantitative assessment of the degree to which a software product or process possesses a given attribute.
measurement. The process of determining the value of some quantity in terms of a standard unit.
Median The middle of a group of measurement values when arranged in numerical order. For example, in the group (32, 45, 78, 79, 101), 78 is the median. If the group contains an even number of values, the median is the average of the two middle values.
Metric, software quality. (IEEE) A quantitative measure of the degree to which software possesses a given attribute that affects its quality.
Mission statement A written declaration of the purpose of an organization or project team. Organizational mission or vision statements often include an organizational vision for the future, goals, and values. Mission describes the activities for achieving the Vision. The Mission is the cause and the Vision the effect. The Mission statement may also contain verb (design, train, maintain, etc.) + object (system, strategy, etc.) + target value (how much, #, %, etc. This is optional) + time limit (should be same time frame as Vision)
Mode The most frequently occurring value in a group of measurements
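A minimal sketch showing how the mean, median, and mode defined above can be computed with Python's standard statistics module; the data values are hypothetical:

```python
import statistics

values = [32, 45, 78, 78, 79, 101]  # hypothetical measurements

print(statistics.mean(values))    # sum of values divided by the number of values
print(statistics.median(values))  # middle value (average of the two middle values for an even count)
print(statistics.mode(values))    # most frequently occurring value -> 78
```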
MSA Measurement System Analysis.
MTBF Mean Time Between Failure
MTTF. mean time to failure.
MTTR. mean time to repair.
Multiple condition coverage. (Myers) A test coverage criterion requiring enough test cases that all possible combinations of condition outcomes in each decision, and all points of entry, are invoked at least once. Contrast with branch coverage, condition coverage, decision coverage, path coverage, and statement coverage.
Mutation analysis. (NBS) A method to determine test set thoroughness by measuring the extent to which a test set can discriminate the program from slight variants [mutants] of the program
N
NCR Non-Conformance Report
Noise In the context of quality management, noise is essentially variability. For example, if you are making ketchup, noise in the process comes from variations in the quality of incoming tomatoes, changes in ambient temperature and humidity, variations in machinery performance, variations in the quality of human factors, etc.
Nominal Unordered categories which indicate membership or nonmembership with no implication of quantity, i.e., assembly area number one, part numbers, etc.
Nominal group technique A technique used to encourage creative thinking and new ideas, but more controlled than brainstorming. Each member of a group writes down his or her ideas and then contributes one to the group pool. All contributed ideas are then discussed and prioritized.
Non Value Added Activity An activity performed in a process that does not add value to the output product or service, which may or may not have a valid business reason for being performed.
Non-Conformance Product or material which does not conform to the customer specifications and/or requirements. Sometimes used interchangeably with non-conformity.
Non-Conforming Product Product which does not conform to the customer specifications and/or requirements. Same as Non-Conformance.
Non-Conformity Nonfulfilment of a specified requirement. Sometimes used interchangeably with non-conformance.
Non-critical code analysis. (IEEE) (1) Examines software elements that are not designated safety-critical and ensures that these elements do not cause a hazard. (2) Examines portions of the code that are not considered safety-critical code to ensure they do not cause hazards. Generally, safety-critical code should be isolated from non-safety-critical code. This analysis is to show that this isolation is complete and that interfaces between safety-critical code and non-safety-critical code do not create hazards.
NORMAL DISTRIBUTION A continuous, symmetrical density function characterized by a bell-shaped curve, e.g., distribution of sampling averages.
np chart A control chart indicating the number of defective units in a given sample.
Null hypothesis Typically a hypothesis of no difference, which is why the word "null" is used. Despite the name, there are occasions when the parameter is not hypothesized to be 0; for example, the null hypothesis may be that the parameter is equal to some other specific value.
O
Object-oriented programming. A technology for writing programs that are made up of self-sufficient modules that contain all of the information needed to manipulate a given data structure. The modules are created in class hierarchies so that the code or methods of a class can be passed to other modules. New object modules can be easily created by inheriting the characteristics of existing classes. See object, object-oriented design.
Ongoing Process Capability Ongoing Process Capability is a long-term measure of statistical process control or process performance. It differs from preliminary process capability by utilizing data from a longer period to include all common causes of variation, in particular, those common causes that may result in process shifts affecting several sample intervals. Systematic or repetitive patterns of the special cause may also be included if the underlying reasons for these special causes are understood. The time required for ongoing capability evaluation depends on the time required for the sources of variation to vary throughout their full ranges, but will typically be three to six months.
Online. (IEEE) Pertaining to a system or mode of operation in which input data enter the computer directly from the point of origin or output data are transmitted directly to the point where they are used. For example, an airline reservation system. Contrast with batch. See conversational, interactive, real-time.
Operation and maintenance phase. (IEEE) The period in the software life cycle during which a software product is employed in its operational environment, monitored for satisfactory performance, and modified as necessary to correct problems or to respond to changing requirements.
Operation exception. (IEEE) An exception that occurs when a program encounters an invalid operation code.
Operational Definition A description in quantifiable terms of what to measure and the steps to follow to consistently measure it.
ORDINAL Ordered categories (ranking) with no information about the distance between each category, i.e., the rank ordering of several measurements of an output parameter
ORDINATE The vertical axis of a graph.
Organization Diagnostics The process of identifying organization problems with individuals, processes, procedures, technology, culture, etc.
Organize To arrange by systematic planning.
OSHA Occupational Safety and Health Administration
OTM One Time Measure
OTP Outline Test Plan
Outcome the degree to which output meets the needs and expectations of the customer
OUTPUTS Products or services provided to others; the result of a process
P
P Control Chart A control chart that determines the stability of a process by finding what percentage of total units in a sample is defective.
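A minimal sketch (illustration only, hypothetical sample counts) of the commonly used 3-sigma limits for a p chart, UCL/LCL = p-bar +/- 3*sqrt(p-bar*(1 - p-bar)/n), assuming a constant sample size n:

```python
import math

defectives = [4, 6, 3, 5, 7, 2, 5, 4]  # hypothetical defective counts per sample
n = 100                                 # constant sample size

p_bar = sum(defectives) / (len(defectives) * n)      # average fraction defective
sigma_p = math.sqrt(p_bar * (1 - p_bar) / n)         # standard error of the fraction

ucl = p_bar + 3 * sigma_p
lcl = max(0.0, p_bar - 3 * sigma_p)                  # a fraction defective cannot be negative
print(f"p-bar = {p_bar:.4f}, UCL = {ucl:.4f}, LCL = {lcl:.4f}")
```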
Pareto chart A bar chart that orders data from the most frequent to the least frequent, allowing the analyst to determine the most important factor in a given situation or process.
Pareto Diagram A chart which ranks, or places in order, common occurrences.
Pareto principle The idea that a few root problems are responsible for the large majority of consequences. The Pareto principle is derived from the work of Vilfredo Pareto, a turn-of-the-century Italian economist who studied the distributions of wealth in different countries. He concluded that a fairly consistent minority, about 20% of the people, controlled the large majority, about 80%, of a society's wealth. This same distribution has been observed in other areas and has been termed the Pareto principle. It is defined by J.M. Juran as the idea that 80% of all effects are produced by only 20% of the possible causes.
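A minimal sketch (illustration only, with hypothetical defect categories) of the ordering and cumulative-percentage calculation behind a Pareto chart:

```python
# Hypothetical defect counts by category.
defects = {"scratches": 48, "dents": 21, "misalignment": 13, "wrong color": 9, "other": 9}

total = sum(defects.values())
cumulative = 0
# Order categories from most frequent to least frequent and accumulate percentages.
for category, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    print(f"{category:<12} {count:>3}  {100 * cumulative / total:5.1f}% cumulative")
```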
Parts Per Million (PPM) PPM is a way of stating the performance of a process in terms of actual or projected defective material. PPM data can be used to indicate areas of variation requiring attention.
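As a quick illustration (hypothetical figures), PPM is simply the defective fraction scaled to one million parts:

```python
# Hypothetical inspection results.
defective = 7
inspected = 25_000

ppm = defective / inspected * 1_000_000  # defective parts per million
print(f"{ppm:.0f} PPM")                  # 280 PPM
```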
Performance Measure An indicator that can be used to evaluate the quality, cost, or cycle time characteristics of an activity or process usually against a target or standard value.
Performance requirement. (IEEE) A requirement that imposes conditions on a functional requirement; e.g., a requirement that specifies the speed, accuracy, or memory usage with which a given function must be performed.
PFMEA Process Failure Mode Effects Analysis: An analytical technique used to assure that potential process failure modes and associated causes have been considered and addressed.
Physical requirement. (IEEE) A requirement that specifies a physical characteristic that a system or system component must possess; e.g., material, shape, size, weight.
Pie chart A chart that compares groups of data to the whole data set by showing each group as a “slice” of the entire “pie.” Pie charts are particularly useful for investigating what percentage each group represents.
Plan Do Study Act (PDSA or PDCA) Originally Shewhart’s Plan Do Check Act or the application of the scientific method to engineering and management. Deming later changed Check to Study. A look before you leap approach to standardization or maintenance (Standardize Do Check Act), solving problems and improvement or reactive mode (Check Act Plan Do) and achieving opportunities and new developments or proactive mode (Plan Do Check Act)…
Plan-Do-Check-Act (PDCA) cycle A four-step improvement process originally conceived of by Walter A. Shewhart. The first step involves planning for the necessary improvement; the second step is the implementation of the plan; the third step is to check the results of the plan; the last step is to act upon the results of the plan. It is also known as the Shewhart cycle, the Deming cycle, and the PDCA cycle.
Platform. The hardware and software that must be present and functioning for an application program to run [perform] as intended. A platform includes, but is not limited to, the operating system or executive software, communication software, microprocessor, network, input/output hardware, any generic software libraries, database management, user interface software, and the like.
Poka-yoke Japanese approach to mistake proofing. Primarily activities for front-line employees empowered to make changes to their work processes to enhance accuracy, safety, and efficiency.
Policy deployment Another name for hoshin planning.
Population A group of similar items from which a sample is drawn. Often referred to as the universe.
PPAP Production Parts Approval Process
PPM Predictive Preventative Maintenance
Precision The closeness of agreement between randomly selected individual measurements or test results. Also, see Accuracy.
Predictive Maintenance Maintenance performed on equipment just before the predicted breakdown of that equipment.
Preliminary Bill of Material An initial Bill of Material completed before design and print release.
Preliminary Process Capability Studies Preliminary Process Capability Studies are short-term studies conducted to obtain early information on the performance of new or revised processes relative to internal or customer requirements. In many cases, preliminary studies should be conducted at several points in the evolution of new processes (e.g., at the equipment or tooling subcontractor's plant, after installation at the supplier's plant). These studies should be based on as many measurements as possible. When X-bar and R charts are used, at least twenty subgroups (typically of three to five pieces each) are required to obtain sufficient data for decision making. When this amount of data is not available, control charts should be started with whatever data is available.
Prevention The practice of eliminating unwanted variation a priori (before the fact), e.g., predicting a future condition from a control chart and then applying corrective action before the predicted event transpires.
Preventive Action Action(s) designed to prevent the occurrence of non-conformances or non-conformities.
Preventive Maintenance Maintenance performed on equipment with the intent of prolonging equipment life and/or preventing breakdown and malfunction.
Price of Non-Quality (PONQ) What it costs to do things wrong, resulting in losses such as time, money, and opportunity. An equation for estimating PONQ is the amount of time required to fix a defect x the number of defects x the hourly wage rate (fully burdened with overhead, overtime, benefits, etc.).
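A minimal illustration of that estimating equation; all figures below are hypothetical:

```python
# Hypothetical figures for the PONQ estimating equation.
hours_to_fix_one_defect = 1.5
number_of_defects = 200
fully_burdened_hourly_rate = 60.0  # wages plus overhead, overtime, benefits, etc.

ponq = hours_to_fix_one_defect * number_of_defects * fully_burdened_hourly_rate
print(f"Estimated price of non-quality: ${ponq:,.2f}")  # $18,000.00
```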
Probability The chance of something happening; the percent or number of occurrences over a large number of trials.
Probability of an Event The number of successful events divided by the total number of trials.
Problem A deviation from a specified standard.
Problem-Solving The process of solving problems; the isolation and control of those conditions which generate or facilitate the creation of undesirable symptoms.
Procedures Documented processes that are used when work affects more than one function or department of an organization.
Process (1) A particular method of doing something, generally involving several steps or operations. (2) A series of actions that lead to a desired result; converting inputs into outputs. (3) A collection of activities that together produce a usable product or service by applying resources from one or more functional areas. (4) A combination of people, equipment, materials, methods, and environment that produces output.
Process Average The central tendency of a given process characteristic across a given amount of time or at a specific point in time.
Process capability 1. A statistical measure of the inherent variation of a stable process for a given characteristic, commonly expressed as the process width of six standard deviations (6 sigma). 2. Competence of the process, based on tested performance, to achieve certain results.
Process capability index A measurement that indicates the ability of a process to produce specified results. Cp and Cpk are two common process capability indices.
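A minimal sketch of the commonly used index formulas Cp = (USL - LSL) / (6*sigma) and Cpk = min((USL - mean) / (3*sigma), (mean - LSL) / (3*sigma)); the specification limits and process statistics below are hypothetical:

```python
# Hypothetical process statistics and specification limits.
usl, lsl = 10.5, 9.5        # upper and lower specification limits
mean, sigma = 10.1, 0.12    # estimated process mean and standard deviation

cp = (usl - lsl) / (6 * sigma)                                      # potential capability
cpk = min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))   # capability allowing for off-center processes
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```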
Process Comparison A logical method of questioning that compares the process conditions when a nonconforming output was produced with the process conditions when a conforming output was produced.
Process Control Operations with a built-in finding and adjusting step to keep a product or service in conformance with the specifications.
Process Control Chart Any of several types of graphs upon which data are plotted against specific control limits.
Process Decision Program Chart A PDPC chart is a method that graphically displays the alternatives and contingencies that can be anticipated in advance, together with strategies for dealing with them.
Process Flow Diagram Depicts the flow of material through the process, including any rework or repair operations.
Process Model Also Activity Model A graphic representation of a business process that exhibits the activities and their interdependencies that make up the business process to any desired level of detail. An activity model reveals the interactions between activities in terms of inputs and outputs while showing the controls placed on each activity and the types of resources assigned to each activity.
Process Portal Software that focuses the user of the portal on the explicit knowledge required to solve a particular problem, or to deal with a particular situation or series of events. Changes implicit knowledge into explicit knowledge.
Process Spread The range of values that a given process characteristic displays; this particular term most often applies to the range but may also encompass the variance. The spread may be based on a set of data collected at a specific point in time or may reflect the variability across a given amount of time.
Process Variation The variables in a process that affect outcomes. Two types of process variation are special cause and common cause.
Producer's Risk The probability of rejecting a lot when, in fact, the lot should have been accepted (see ALPHA RISK).
Product Assurance Plan A part of the Product Quality Plan. It is a prevention-oriented management tool that addresses product design, process design, and when applicable software design.
Production database. The computer file that contains the establishment's current production data.
Production Part Approval Submissions The submissions are based on small quantities of parts taken from a significant production run made with production tooling, processes, and cycle times. Parts for production part approval are checked by the supplier to all engineering requirements.
Production Trial Run Product made using all production tools, processes, equipment, environment, facility, and cycle time.
Program design language. (IEEE) A specification language with special constructs and, sometimes, verification protocols, used to develop, analyze, and document a program design.
Program mutation. (IEEE) A computer program that has been purposely altered from the intended version to evaluate the ability of program test cases to detect the alteration. See testing, mutation.
Program. (1) (ISO) A sequence of instructions suitable for processing. Processing may include the use of an assembler, a compiler, an interpreter, or another translator to prepare the program for execution. The instructions may include statements and necessary declarations. (2) (ISO) To design, write, and test programs. (3) (ANSI) In programming languages, a set of one or more interrelated modules capable of being executed. (4) Loosely, a routine. (5) Loosely, to write a routine.
Programming language. (IEEE) A language used to express computer programs. See computer language, high-level language, and low-level language.
programming standards. See coding standards.
Project A problem, usually calling for planned action.
Project plan. (NIST) A management document describing the approach taken for a project. The plan typically describes work to be done, resources required, methods to be used, the configuration management and quality assurance procedures to be followed, the schedules to be met, the project organization, etc. Project in this context is a generic term. Some projects may also need integration plans, security plans, test plans, quality assurance plans, etc. See documentation plan, software development plan, test plan, and software engineering.
Proof of correctness. (NBS) The use of techniques of mathematical logic to infer that a relation between program variables assumed true at program entry implies that another relation between program variables holds at program exit.
protocol. A set of rules that determines the behavior of functional units in achieving communication.
Prototyping. Using software tools to accelerate the software development process by facilitating the identification of required functionality during analysis and design phases. A limitation of this technique is the identification of system or software problems and hazards. See: rapid prototyping
Q
QIS Quality Information System
QS-9000 A quality standard, based on ISO 9000, used by the American domestic automobile manufacturers to register their suppliers.
Qualification, installation. (FDA) Establishing confidence that process equipment and ancillary systems are compliant with appropriate codes and approved design intentions, and that manufacturer’s recommendations are suitably considered.
Qualification, operational. (FDA) Establishing confidence that process equipment and subsystems are capable of consistently operating within established limits and tolerances.
qualification, process performance. (FDA) Establishing confidence that the process is effective and reproducible.
Qualification, product performance. (FDA) Establishing confidence through appropriate testing that the finished product produced by a specified process meets all release requirements for functionality and safety.
Quality "Quality is conformance to specifications." (British Defence Industries Quality Assurance Panel) "Quality is conformance to requirements." (Philip Crosby) "Quality is fitness for purpose." (Dr. Juran) "Quality is synonymous with customer needs and expectations." (R. J. Mortiboys) "Quality is a predictable degree of uniformity and dependability, at low cost and suited to the market." (Dr. W. Edwards Deming) "Quality is meeting the (stated) requirements of the customer, now and in the future." (Mike Robinson) "Quality is the total composite product and service characteristics of marketing, engineering, manufacturing, and maintenance through which the product and service in use will meet the expectations of the customer." (Armand Feigenbaum) "Totality of characteristics of an entity that bear on its ability to satisfy stated and implied needs." (ISO 8402:1994) In short, the ability of a product or service to meet customer requirements, both stated and unstated.
Quality Assurance All the planned and systematic activities implemented within the quality systems to provide adequate confidence that the requirements for quality will be met.
quality assurance, software. (IEEE) (1) A planned and systematic pattern of all actions necessary to provide adequate confidence that an item or product conforms to established technical requirements. (2) A set of activities designed to evaluate the process by which products are developed or manufactured.
Quality assurance. (1) (ISO) The planned and systematic activities necessary to ensure that a component, module, or system conforms to established technical requirements. (2) All actions that are taken to ensure that a development organization delivers products that meet performance requirements and adhere to standards and procedures. (3) The policy, procedures, and systematic actions established in an enterprise to provide and maintain some degree of confidence in data integrity and accuracy throughout the life cycle of the data, which includes input, update, manipulation, and output. (4) (QA) The actions planned and performed to provide confidence that all systems and components that influence the quality of the product are working as expected, individually and collectively.
Quality Audit A systematic and independent examination to determine whether quality-related activities are implemented effectively and comply with the quality systems and/or quality standards.
Quality Characteristics The characteristics of the output of a process that are important to the customer. Identification of quality characteristics requires knowledge of the customer's needs and expectations.
Quality circles 1. Quality improvement teams or groups. 2. In Japan, groups of employees formed to study and share information regarding quality control issues and theory.
Quality Control The process by which actual product or service performance is measured and compared with a standard, and action is taken to eliminate any non-conformances
Quality Council A group of senior management within given operational units who plan, implement, facilitate, and monitor the QUALITY PROCESS.
Quality Function Deployment (QFD) A requirements identification analysis, flow down, and tracking technique. It focuses on quality and communication to translate customer needs into product and process design specifics. Also known as the “house of quality.”
Quality function deployment (QFD) A technique used to translate customer requirements into appropriate goals for each stage of product or service development and output. The two approaches to quality function deployment are known as the House of Quality and the Matrix of Matrices.
Quality improvement A systematic approach to the processes of work that looks to remove waste, loss, rework, frustration, etc. to make the processes of work more effective, efficient, and appropriate.
Quality improvement team A group of employees that take on a project to improve a given process or design a new process within an organization.
Quality loss function An algebraic function that illustrates the loss of quality that occurs when a characteristic deviates from its target value. It is often expressed in monetary terms. Dr. Genichi Taguchi coined this term; his work suggests that quality losses vary as the square of the deviation from the target.
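A minimal sketch of the quadratic loss form this describes, L(x) = k*(x - target)^2, using a hypothetical loss coefficient, target, and observed value:

```python
# Hypothetical Taguchi-style quadratic loss: L(x) = k * (x - target)**2
k = 50.0        # loss coefficient in dollars per squared unit of deviation (hypothetical)
target = 10.0   # target value of the characteristic
x = 10.3        # observed value

loss = k * (x - target) ** 2
print(f"Estimated quality loss: ${loss:.2f}")  # $4.50
```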
Quality Management The planned actions taken to ensure the effective implementation of an organization’s quality systems.
Quality Manual A document stating the quality policy and describing the quality systems of an organization.
Quality Planning Quality Planning is a structured process for defining the methods that will be used in the production of a specific product or family of products. Quality planning embodies the concepts of defect prevention and continuous improvement as contrasted with defect detection.
Quality Policy A statement and a genuine commitment from Top Management regarding their position relative to Quality Products and/or Services.
Quality Process A planned strategy that ensures all employees will be able to produce defect-free work.
Quality Records Quality Records are the documented evidence that the supplier’s processes were executed according to the quality system documentation and record results.
Quality System Organizational structure, procedures, processes, and resources required to implement quality management.
R
Random Selecting a sample so each item in the population has an equal chance of being selected; lack of predictability; without a pattern.
Random Sample One or more samples are randomly selected from the universe (population).
Random Sampling The process of selecting units for sample size, so that all units have an equal chance of being selected as the sample.
Random Variable A variable that can assume any value from a set of possible values.
Random Variations Variations in data that result from causes that cannot be pinpointed or controlled.
Range The difference between the highest and lowest values in a set of values or “subgroup.”
Range Control Chart Control chart in which the range of the subgroup is used to track the instantaneous variation within a process, i.e. the variation in the process at any one time when many input factors would not have time to vary enough to make a detectable difference. Range charts are usually paired with average charts for complete analysis.
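A minimal sketch (illustration only) of the usual R-chart limit calculation, UCL = D4*R-bar and LCL = D3*R-bar, using the commonly tabulated constants for a subgroup size of five (D3 = 0, D4 = 2.114); the subgroup ranges are hypothetical:

```python
# Hypothetical subgroup ranges (subgroup size n = 5).
ranges = [0.42, 0.38, 0.51, 0.45, 0.40, 0.47]
D3, D4 = 0.0, 2.114  # commonly tabulated R-chart constants for n = 5

r_bar = sum(ranges) / len(ranges)   # average range across subgroups
ucl_r = D4 * r_bar
lcl_r = D3 * r_bar
print(f"R-bar = {r_bar:.3f}, UCL = {ucl_r:.3f}, LCL = {lcl_r:.3f}")
```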
Ranks Values are assigned to items in a sample to determine their relative occurrence in a population.
Record. (ISO) A group of related data elements treated as a unit. [A data element (field) is a component of a record, and a record is a component of a file (database).]
Registered Suppliers Registered Suppliers are suppliers who have received third-party registration to a specific quality system standard for the commodity supplied.
Regrade Action taken on a non-conforming product that changes the classification or category of the product for use in alternative applications. Cannot be done without customer approval/direction. [Also see Repair/Rework]
Regression analysis A statistical technique used to determine the best mathematical expression to describe the relationship between a response and independent variables.
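A minimal sketch (illustration only, hypothetical data) of the simplest case, fitting a straight line y = a + b*x by ordinary least squares:

```python
# Hypothetical paired observations of an independent variable x and a response y.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 4.3, 6.2, 8.1, 9.9]

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

# Ordinary least-squares slope and intercept for y = a + b*x.
b = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sum((xi - x_bar) ** 2 for xi in x)
a = y_bar - b * x_bar
print(f"y = {a:.2f} + {b:.2f} * x")
```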
Regression analysis and testing. (IEEE) A software V&V task to determine the extent of V&V analysis and testing that must be repeated when changes are made to any previously examined software products. See testing and regression.
The relations Diagram method is a technique developed to clarify intertwined causal relationships in a complex situation to find an appropriate solution. It is typically represented graphically as squared ellipses (concepts) connected by directed lines (arrowheads show the direction). The directed lines represent causal relations between the concepts.
Release. (IEEE) The formal notification and distribution of an approved version. See version.
Reliability assessment. (ANSI/IEEE) The process of determining the achieved level of reliability for an existing system or system component.
Reliability The probability of a product or service successfully doing its job under given conditions.
Reliability The probability that an item will continue to function at customer expectation levels at a measurement point, under specified environmental and duty cycle conditions.
Reliability. (IEEE) The ability of a system or component to perform its required functions under stated conditions for a specified period. See software reliability.
Repair Action taken on the non-conforming product so that the product will fulfill the intended usage, although the product may not conform to the original requirements. [Also see Regrade/Rework]
Replication Observations were made under identical test conditions.
Representative Sample A sample which accurately reflects a specific condition or set of conditions within the universe.
Requirement. (IEEE) (1) A condition or capability needed by a user to solve a problem or achieve an objective. (2) A condition or capability that must be met or possessed by a system or system component to satisfy a contract, standard, specification, or other formally imposed documents. (3) A documented representation of a condition or capability as in (1) or (2). See design requirement, functional requirement, implementation requirement, interface requirement, performance requirement, and physical requirement.
Requirements analysis. (IEEE) (1) The process of studying user needs to arrive at a definition of system, hardware, or software requirements. (2) The process of studying and refining system, hardware, or software requirements. See prototyping, software engineering.
Requirements phase. (IEEE) The period in the software life cycle during which the requirements, such as functional and performance capabilities for a software product, are defined and documented.
Resources Those items necessary for a team to understand a problem and implement solutions; for example, time to work on solutions, access to manufacturing engineers, etc.
revalidation. Relative to software changes, revalidation means validating the change itself, assessing the nature of the change to determine potential ripple effects, and performing the necessary regression testing.
Review. (IEEE) A process or meeting during which a work product or set of work products, is presented to project personnel, managers, users, customers, or other interested parties for comment or approval. Types include code review, design review, formal qualification review, requirements review, and test readiness review. In contrast with audit, inspection. See static analysis.
Rework Action taken on the non-conforming product so that it will meet the specified requirements.
RFP Request For Proposal
RFQ Request For Quotation
Risk assessment. (DOD) A comprehensive evaluation of the risk and its associated impact.
Risk The possibility of loss, injury, disadvantage, or destruction. Apply this definition to the issues of program management and you have the starting point for successful risk management
Risk. (IEEE) A measure of the probability and severity of undesired effects. Often taken as the simple product of probability and consequence.
Robust design An approach to the planning of new products and services that harnesses Taguchi methods.
Robust The ability of a product or service to function appropriately regardless of external conditions and other uncontrollable factors.
Root Cause Analysis Using one or more various tools to determine the root cause of a specific failure.
Root Cause The lowest level cause of a failure, or variation in a product, component, or process.
RPN Risk Priority Number (ref: FMEA)
Run 1) SPC: A consecutive number of points consistently increasing or decreasing. 2) Production: The production of a specified number of sequential units.
Run chart Also known as a line chart, or line graph. A chart that plots data over time, allowing you to identify trends and anomalies
S
Safety (DOD) Freedom from those conditions that can cause death, injury, occupational illness, damage to or loss of equipment or property, or damage to the environment.
Sample One or more observations are drawn from a larger collection of observations or universe (population).
Sample standard deviation chart (s chart) Control chart in which the standard deviation of the subgroup is tracked to determine the variation within a process over time. Sample standard deviation charts are usually paired with average charts for complete analysis.
Scatter diagrams show the pattern of the relationship between two variables that are thought to be related. For example, is there a relationship between outside temperature and cases of the common cold? As temperatures drop, do colds increase? The more closely the points hug a diagonal line, the stronger the one-to-one relationship between the variables.
Scatterplot A tool that studies the possible relationship between two variables expressed on the x-axis and y-axis of a graph. The direction and density of the points plotted will indicate various relationships or a lack of any relationship between the variables.
Seven tools of quality Quality improvement tools include the histogram, Pareto chart, check sheet, control chart, cause-and-effect diagram, flowchart, and scatter diagram.
SFMEA System Failure Mode and Effects Analysis.
Shewhart cycle Another name for the Plan-Do-Check-Act cycle. It is also sometimes called the Deming cycle.
Shewhart, Walter A. The father of statistical process control or statistical quality control. He pioneered statistical quality control and improvement methods when he worked for Western Electric and Bell Telephone in the early decades of the 20th century.
Sigma is a statistical unit of measure that reflects process capability. The sigma scale of measure is perfectly correlated to such characteristics as defects per unit, parts per million defectives, and the probability of a failure/error.
Significant Characteristics Product and process characteristics designated by the customer, including those relating to governmental regulations and safety, and/or selected by the supplier through knowledge of the product and process.
Simulation analysis. (IEEE) A software V&V task to simulate critical tasks of the software or system environment to analyze logical or performance characteristics that would not be practical to analyze manually.
Simulation. (1) (NBS) Use of an executable model to represent the behavior of an object. During testing the computational hardware, the external environment, and even code segments may be simulated. (2) (IEEE) A model that behaves or operates like a given system when provided a set of controlled inputs. Contrast with emulation.
Simulator. (IEEE) A device, computer program, or system that behaves or operates like a given system when provided a set of controlled inputs. Contrast with emulator. A simulator provides inputs or responses that resemble anticipated process parameters. Its function is to present data to the system at known speeds and in a proper format.
Simultaneous Engineering A way of simultaneously designing products, and the processes for manufacturing those products, through the use of cross-functional teams to assure manufacturability and reduce cycle time.
Six Sigma Structured application of the tools and techniques of Total Quality Management on a Project Basis to achieve strategic business results
Six Sigma has a failure rate of 3.4 parts per million or 99.99966% good
Six Sigma Application of the define, measure, analyze, improve and control steps.
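A minimal sketch (illustration only) relating a sigma level to the 3.4 defects-per-million figure quoted above, assuming the conventional 1.5-sigma long-term shift used in Six Sigma tables:

```python
import math

def upper_tail_ppm(z: float) -> float:
    """Defects per million beyond z standard deviations of a normal distribution."""
    return 0.5 * math.erfc(z / math.sqrt(2)) * 1_000_000

sigma_level = 6.0
shift = 1.5  # conventional long-term mean shift assumed in Six Sigma tables
print(f"{upper_tail_ppm(sigma_level - shift):.1f} defects per million")  # about 3.4
```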
Sizing. (IEEE) The process of estimating the amount of computer storage or the number of source lines required for a software system or component. Contrast with timing.
Skill Ability to perform a task or function.
Software characteristic. An inherent, possibly accidental, trait, quality, or property of software; e.g., functionality, performance, attributes, design constraints, number of states, lines, or branches.
Software design description. (IEEE) A representation of software created to facilitate analysis, planning, implementation, and decision making. The software design description is used as a medium for communicating software design information and may be thought of as a blueprint or model of the system. See structured design, design description, and specification.
Software development plan. (NIST) The project plan for the development of a software product. In contrast with the software development process, the software life cycle.
software development process. (IEEE) The process by which user needs are translated into a software product. The process involves translating user needs into software requirements, transforming the requirements into design, implementing the design in code, testing the code, and sometimes installing and checking out the software for operational use.
Software documentation. (NIST) Technical data or information, including computer listings and printouts, in human-readable form, that describe or specify the design or details, explain the capabilities or provide operating instructions for using the software to obtain desired results from a software system. See: specification; specification, requirements: specification, design; software design description; test plan, test report, user’s guide.
Software element. (IEEE) A deliverable or in-process document produced or acquired during software development or maintenance. Specific examples include but are not limited to: (1) Project planning documents; i.e., software development plans, and software verification and validation plans. (2) Software requirements and design specifications. (3) Test documentation.(4) Customer-deliverable documentation.(5) Program source code. (6) Representation of software solutions implemented in firmware(7) Reports; i.e., review, audit, project status. (8) Data; i.e., defect detection, test. Contrast with software item. See configuration item.
Software engineering environment. (IEEE) The hardware, software, and firmware used to perform a software engineering effort. Typical elements include computer equipment, compilers, assemblers, operating systems, debuggers, simulators, emulators, test tools, documentation tools, and database management systems.
Software engineering. (IEEE) The application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software; i.e., the application of engineering to software. See project plan, requirements analysis, architectural design, structured design, system safety, testing, and configuration management.
Software hazard analysis. (C, CDRH) The identification of safety-critical software, the classification and estimation of potential hazards, and identification of program path analysis to identify hazardous combinations of internal and environmental program conditions. See risk assessment, software safety change analysis, software safety code analysis, software safety design analysis, software safety requirements analysis, software safety test analysis, and system safety.
Software item. (IEEE) Source code, object code, job control code, control data, or a collection of these items. Contrast with software element.
Software life cycle. (NIST) Period beginning when a software product is conceived and ending when the product is no longer available for use. The software life cycle is typically broken into phases denoting activities such as requirements, design, programming, testing, installation, and operation and maintenance. In contrast with the software development process. See waterfall model.
Software reliability. (IEEE) (1) The probability that software will not cause the failure of a system for a specified time under specified conditions. The probability is a function of the inputs to and use of the system, as well as of the existence of faults in the software. The inputs to the system determine whether existing faults, if any, are encountered. (2) The ability of a program to perform its required functions accurately and reproducibly under stated conditions for a specified period.
Software review. (IEEE) An evaluation of software elements to ascertain discrepancies from planned results and to recommend improvement. This evaluation follows a formal process. Syn: software audit. See code audit, code inspection, code review, code walkthrough, design review, specification analysis, and static analysis.
Software safety code analysis. (IEEE) Verification that the safety-critical portion of the design is correctly implemented in the code. See logic analysis, data analysis, interface analysis, constraint analysis, programming style analysis, non-critical code analysis, timing and sizing analysis, software hazard analysis, and system safety.
Software safety design analysis. (IEEE) Verification that the safety-critical portion of the software design correctly implements the safety-critical requirements and introduces no new hazards. See logic analysis, data analysis, interface analysis, constraint analysis, functional analysis, software element analysis, timing and sizing analysis, and reliability analysis. software hazard analysis, system safety.
Software safety requirements analysis. (IEEE) An analysis evaluating software and interface requirements to identify errors and deficiencies that could contribute to a hazard. See criticality analysis, specification analysis, timing and sizing analysis, different software systems analyses, software hazard analysis, and system safety.
Software safety test analysis. (IEEE) An analysis demonstrating that safety requirements have been correctly implemented and that the software functions safely within its specified environment. Tests may include; unit level tests, interface tests, software configuration item testing, system level testing, stress testing, and regression testing. See software hazard analysis and system safety.
Software. (ANSI) Programs, procedures, rules, and any associated documentation pertaining to the operation of a system. Contrast with hardware. See application software, operating system, system software, utility software.
Source code. (1) (IEEE) Computer instructions and data definitions expressed in a form suitable for input to an assembler, compiler, or other translator. (2) The human-readable version of the list of instructions [program] that causes a computer to perform a task. Contrast with object code. See source program, programming language.
SPC Statistical Process Control The application of statistical methods to analyze data, study, and monitor process capability and performance. Use of control charts to monitor process performance.
SPC Statistical Process Control: The use of statistical techniques such as control charts to analyze a process, enabling appropriate actions to achieve a stable process…
Special causes (Assignable Cause) Causes of variation in a process that are not inherent in the process itself but originate from circumstances that are out of the ordinary. Special causes are indicated by points that fall outside the limits of a control chart.
Specification analysis. (IEEE) Evaluation of each safety-critical software requirement with respect to a list of qualities such as completeness, correctness, consistency, testability, robustness, integrity, reliability, usability, flexibility, maintainability, portability, interoperability, accuracy, auditability, performance, internal instrumentation, security, and training.
Specification limits An engineering or design requirement that must be met to produce a satisfactory product.
Specification tree. (IEEE) A diagram that depicts all of the specifications for a given system and shows their relationships to one another.
Specification, product. (IEEE) A document that describes the as-built version of the software.
Specification, requirements. (NIST) A specification that documents the requirements of a system or system component. It typically includes functional requirements, performance requirements, interface requirements, design requirements [attributes and constraints], development [coding] standards, etc. Contrast with requirement.
Specification, test case. See test case.
Specification, design. (NIST) A specification that documents how a system is to be built. It typically includes system or component structure, algorithms, control logic, data structures, data set [file] use information, input/output formats, interface descriptions, etc. Contrast with design standards and requirement. See software design description.
Specification, functional. (NIST) A specification that documents the functional requirements for a system or system component. It describes what the system or component is to do rather than how it is to be built. Often part of a requirements specification. Contrast with requirement.
Specification. (IEEE) A document that specifies, in a complete, precise, verifiable manner, the requirements, design, behavior, or other characteristics of a system or component, and often, the procedures for determining whether these provisions have been satisfied. Contrast with requirement. See: specification, formal; specification, requirements; specification, functional; specification, performance; specification, interface; specification, design; coding standards; design standards.
Specifications are engineering requirements for judging the acceptability of a part characteristic. For the production part approval process, every feature of the product as identified by engineering specifications must be measured. Actual measurement and test results are required. Specifications should not be confused with control limits which represent “the voice of the process”.
Spider Diagram A visual reporting tool for the performance of several indicators. Also known as a “radar chart,” this tool makes visible the gaps between the current and desired performance.
Spiral model. (IEEE) A model of the software development process in which the constituent activities, typically requirements analysis, preliminary and detailed design, coding, integration, and testing, are performed iteratively until the software is complete. Syn: evolutionary model. Contrast with incremental development; rapid prototyping; waterfall model.
SQA Supplier Quality Assistance
SQC Statistical Quality Control: The application of statistical techniques to measure variation in materials, parts, components, and products. The process of maintaining acceptable levels of product quality by using statistical techniques.
Stable Process A process from which all special causes of variation have been eliminated and only common causes remain.
Standard Deviation A statistical index of variability that describes the spread of the data about the mean; it is the square root of the variance.
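For illustration, a minimal Python sketch of the computation using the standard library's statistics module; the data values are hypothetical:

    import statistics

    data = [9.8, 10.2, 10.1, 9.9, 10.0, 10.3]   # illustrative measurements
    mean = statistics.mean(data)
    s = statistics.stdev(data)                  # sample standard deviation (n - 1 divisor)
    var = statistics.variance(data)             # sample variance; s is its square root
    print(f"mean={mean:.3f}  stdev={s:.3f}  variance={var:.3f}")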
Standard operating procedures. (SOP) Written procedures [prescribing and describing the steps to be taken in normal and defined conditions] which are necessary to assure control of products and processes.
Static analysis. (1) (NBS) Analysis of a program that is performed without executing the program. (2) (IEEE) The process of evaluating a system or component based on its form, structure, content, and documentation. Contrast with dynamic analysis. See code audit, code inspection, code review, code walk-through, design review, and symbolic execution.
Static analyzer. (ANSI/IEEE) A software tool that aids in the evaluation of a computer program without executing the program. Examples include syntax checkers, compilers, cross-reference generators, standards enforcers, and flow-charters.
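As an illustration of what a simple static analyzer can do, the following Python sketch uses the ast module to flag functions whose body exceeds an assumed statement limit, without ever executing the program; the sample source and threshold are hypothetical:

    import ast

    # Hypothetical source to analyze; a real tool would read source files from disk.
    source = (
        "def ok(x):\n"
        "    return x + 1\n"
        "\n"
        "def big(x):\n"
        "    a = x + 1\n"
        "    b = a * 2\n"
        "    c = b - 3\n"
        "    d = c / 4\n"
        "    return d\n"
    )

    MAX_STATEMENTS = 3  # arbitrary, assumed standard to enforce

    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef) and len(node.body) > MAX_STATEMENTS:
            print(f"{node.name}: {len(node.body)} statements exceeds limit of {MAX_STATEMENTS}")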
Statistical Control The condition of a process from which all special causes of variation have been eliminated and only common causes remain. Statistical control is evidenced on a control chart by the absence of points beyond the control limits and by the absence of any nonrandom patterns or trends.
Statistical process control (SPC) Analysis and control of a process through the use of statistical techniques, particularly control charts.
Statistical quality control (SQC) Analysis and control of quality through the use of statistical techniques; the focus is on the product rather than the process.
Stratification A process of grouping data according to a common characteristic.
Structural variation Variation caused by recurring, system-wide changes such as seasonal changes or long-term trends.
Structured design. (IEEE) Any disciplined approach to software design that adheres to specified rules based on principles such as modularity, top-down design, and stepwise refinement of data, system structure, and processing steps. See data structure-centered design, input-processing-output, modular decomposition, object-oriented design, rapid prototyping, stepwise refinement, structured programming, transaction analysis, transform analysis, graphical software specification/design documents, modular software, and software engineering.
Structured programming. (IEEE) Any software development technique that includes structured design and results in the development of structured programs. See structured design.
Stub. (NBS) Special code segments that, when invoked by a code segment under test, will simulate the behavior of designed and specified modules not yet constructed.
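For illustration, a minimal Python sketch of a stub: the pricing module has not been built yet, so a stand-in with a canned response lets the code under test be exercised; all names and values are hypothetical:

    def pricing_service_stub(part_number):
        # Stand-in for a pricing module not yet constructed; returns a canned value.
        return 10.0

    def order_total(part_number, quantity, get_price):
        # Code segment under test; the price lookup is injected so the stub can stand in.
        return get_price(part_number) * quantity

    # Exercising the code under test with the stub in place of the real module.
    assert order_total("P-100", 3, pricing_service_stub) == 30.0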
Subgroup A logical grouping of objects or events which displays only random event-to-event variations; e.g., the objects or events are grouped to create homogeneous groups free of assignable or special causes. By virtue of the minimum within-group variability, any change in the central tendency or variance of the universe will be reflected in the subgroup-to-subgroup variability.
Submission Level Refers to the level of evidence required for production part submissions.
Subroutine trace. (IEEE) A record of all or selected subroutines or function calls performed during the execution of a computer program and, optionally, the values of parameters passed to and returned by each subroutine or function. Syn: call trace. See execution trace, retrospective trace, symbolic trace, and variable trace.
Subroutine. (IEEE) A routine that returns control to the program or subprogram that called it. Note: This term is defined differently in various programming languages. See module.
Subsystem A major part of a system that itself has the characteristics of a system, usually consisting of several components.
Supplier Anyone whose output (materials, information, service, etc.) becomes an input to another person or group in a process of work. A supplier can be external or internal to the organization.
Suppliers The people who provide inputs to jobs, whether from inside or outside the company. In quality improvement, the customer and supplier relationship becomes an interactive relationship that calls for agreeing to and communicating specifications.
Surveillance Audit A post-registration quality audit to ensure that the quality system is still effectively implemented and that continuous improvement is evident.
Symbolic execution. (IEEE) A static analysis technique in which program execution is simulated using symbols, such as variable names, rather than actual values for input data, and program outputs are expressed as logical or mathematical expressions involving these symbols.
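A toy illustration of the idea in Python using the sympy library (an assumption; any computer algebra package would serve): the input is a symbol rather than a value, so the "executed" output is an expression and a branch condition becomes a constraint on that symbol:

    import sympy

    x = sympy.Symbol("x")      # symbolic input instead of an actual value
    y = 2 * x + 3              # "executing" the assignment yields an expression
    path_condition = y > 10    # the branch "if y > 10" becomes a symbolic constraint

    print("output expression:", y)             # 2*x + 3
    print("path condition:", path_condition)   # 2*x + 3 > 10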
Symbolic trace. (IEEE) A record of the source statements and branch outcomes that are encountered when a computer program is executed using symbolic, rather than actual, values for input data. See execution trace, retrospective trace, subroutine trace, variable trace.
SYMPTOM That which serves as evidence of something not seen.
Syntax. The structural or grammatical rules that define how symbols in a language are to be combined to form words, phrases, expressions, and other allowable constructs.
SYSTEM That which is connected according to a scheme.
System A combination of several components or pieces of equipment integrated to perform a specific function.
System administrator. The person who is charged with the overall administration and operation of a computer system. The system administrator is normally an employee or a member of the establishment. Syn: system manager.
System analysis. (ISO) A systematic investigation of a real or planned system to determine the functions of the system and how they relate to each other and any other system. See requirements phase.
System design review. (IEEE) A review conducted to evaluate how the requirements for a system have been allocated to configuration items, the system engineering process that produced the allocation, the engineering planning for the next phase of the effort, manufacturing considerations, and the planning for production engineering. See design review.
System design. (ISO) A process of defining the hardware and software architecture, components, modules, interfaces, and data for a system to satisfy specified requirements. See design phase, architectural design, and functional design.
System documentation. (ISO) The collection of documents that describe the requirements, capabilities, limitations, design, operation, and maintenance of an information processing system. See specification, test documentation, and user’s guide.
System integration. (ISO) The progressive linking and testing of system components into a complete system. See incremental integration.
System life cycle. The course of developmental changes through which a system passes from its conception to the termination of its use; e.g., the phases and activities associated with the analysis, acquisition, design, development, test, integration, operation, maintenance, and modification of a system. See software life cycle.
System safety. (DOD) The application of engineering and management principles, criteria, and techniques to optimize all aspects of safety within the constraints of operational effectiveness, time, and cost throughout all phases of the system life cycle. See risk assessment, software safety change analysis, software safety code analysis, software safety design analysis, software safety requirements analysis, software safety test analysis, and software engineering.
System software. (1) (ISO) Application-independent software that supports the running of application software. (2) (IEEE) Software designed to facilitate the operation and maintenance of a computer system and its associated programs; e.g., operating systems, assemblers, utilities. Contrast with application software. See: support software.
System. (1) (ANSI) People, machines, and methods organized to accomplish a set of specific functions. (2) (DOD) A composite, at any level of complexity, of personnel, procedures, materials, tools, equipment, facilities, and software. The elements of this composite entity are used together in the intended operational or support environment to perform a given task or achieve a specific purpose, support, or mission requirement.
Systematic Diagram A method that searches for the most appropriate and effective means of accomplishing given objectives. Systematic diagrams can be divided into two types: the constituent component analysis diagram breaks down the main subject into its basic elements and depicts their relationships to the objectives and the means of obtaining those objectives; the plan development diagram systematically shows the means and procedures necessary to successfully implement a given plan. The diagram is typically represented graphically as either a horizontal or vertical tree structure connecting the elements.
T
Taguchi, Genichi Developed a set of practices, known in the U.S. as Taguchi Methods, for improving quality while reducing costs. Taguchi Methods focus on the design of efficient experiments and on increasing signal-to-noise ratios. Dr. Taguchi also articulated the development of the quality loss function. He is executive director of the American Supplier Institute and director of the Japan Industrial Technology Institute.
Tampering Changing a process without differentiating between common cause and special cause variation; adjusting a stable process in response to common cause variation typically increases, rather than reduces, variation.
Test case. (IEEE) Documentation specifying inputs, predicted results, and a set of execution conditions for a test item. Syn: test case specification.
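For illustration, a minimal sketch of a test case expressed with Python's unittest module: the input, execution condition, and predicted result are stated explicitly; the function under test is hypothetical:

    import unittest

    def convert_c_to_f(celsius):
        # Hypothetical item under test.
        return celsius * 9 / 5 + 32

    class TestConvertCToF(unittest.TestCase):
        def test_boiling_point(self):
            # Input: 100 C; execution condition: nominal; predicted result: 212 F.
            self.assertEqual(convert_c_to_f(100), 212)

    if __name__ == "__main__":
        unittest.main()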
Test design. (IEEE) Documentation specifying the details of the test approach for a software feature or combination of software features and identifying the associated tests. See: testing, functional; cause-effect graphing; boundary value analysis; equivalence class partitioning; error guessing; testing, structural; branch analysis; path analysis; statement coverage; condition coverage; decision coverage; multiple-condition coverage.
Test documentation. (IEEE) Documentation describing plans for, or results of, the testing of a system or component. Types include test case specification, test incident report, test log, test plan, test procedure, and test report.
Test driver. (IEEE) A software module used to invoke a module under test and, often, provide test inputs, control and monitor execution, and report test results. Syn: test harness.
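For illustration, a minimal Python sketch of a test driver: a small module that invokes the module under test, supplies inputs, and reports results; the module and data are hypothetical:

    def classify(reading):
        # Module under test (hypothetical): flags readings outside 9.5 - 10.5.
        return "out of spec" if reading < 9.5 or reading > 10.5 else "in spec"

    def driver():
        # Test driver: supplies inputs, controls execution, and reports results.
        cases = [(10.0, "in spec"), (9.4, "out of spec"), (10.6, "out of spec")]
        for value, expected in cases:
            actual = classify(value)
            status = "PASS" if actual == expected else "FAIL"
            print(f"{status}: classify({value}) -> {actual} (expected {expected})")

    if __name__ == "__main__":
        driver()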
Test log. (IEEE) A chronological record of all relevant details about the execution of a test.
TEST OF SIGNIFICANCE A procedure to determine whether a quantity subjected to random variation differs from a postulated value by an amount greater than that due to random variation alone.
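For illustration, a small worked example of a test of significance, assuming a one-sample t-test of the sample mean against a postulated value; the sample values are hypothetical and 2.571 is the two-sided 5% critical value for 5 degrees of freedom:

    import math
    import statistics

    sample = [10.2, 9.9, 10.4, 10.1, 10.3, 10.2]   # illustrative measurements
    postulated_mean = 10.0

    n = len(sample)
    x_bar = statistics.mean(sample)
    s = statistics.stdev(sample)
    t = (x_bar - postulated_mean) / (s / math.sqrt(n))

    t_critical = 2.571   # two-sided, alpha = 0.05, n - 1 = 5 degrees of freedom
    print(f"t = {t:.2f}; differs significantly: {abs(t) > t_critical}")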
Test phase. (IEEE) The period in the software life cycle in which the components of a software product are evaluated and integrated, and the software product is evaluated to determine whether or not requirements have been satisfied.
Test plan. (IEEE) Documentation specifying the scope, approach, resources, and schedule of intended testing activities. It identifies test items, the features to be tested, the testing tasks, responsibilities, required resources, and any risks requiring contingency planning. See test design and validation protocol.
Test procedure. (NIST) A formal document developed from a test plan that presents detailed instructions for the setup, operation, and evaluation of the results for each defined test. See test case.
Test report. (IEEE) A document describing the conduct and results of the testing carried out for a system or system component.
Test. (IEEE) An activity in which a system or component is executed under specified conditions, the results are observed or recorded and an evaluation is made of some aspect of the system or component.
Testability. (IEEE) (1) The degree to which a system or component facilitates the establishment of test criteria and the performance of tests to determine whether those criteria have been met. (2) The degree to which a requirement is stated in terms that permit the establishment of test criteria and performance of tests to determine whether those criteria have been met.
Testing, acceptance. (IEEE) Testing conducted to determine whether or not a system satisfies its acceptance criteria and to enable the customer to determine whether or not to accept the system. Contrast with testing, development; testing, operational. See testing, qualification.
Testing, alpha [α]. (Pressman) Acceptance testing performed by the customer in a controlled environment at the developer’s site. The software is used by the customer in a setting approximating the target environment, with the developer observing and recording errors and usage problems.
Testing, beta [β]. (1) (Pressman) Acceptance testing performed by the customer in a live application of the software, at one or more end-user sites, in an environment not controlled by the developer. (2) For medical device software such use may require an Investigational Device Exemption [IDE] or Institutional Review Board [IRB] approval.
Testing, compatibility. The process of determining the ability of two or more systems to exchange information. In a situation where the developed software replaces an already working program, an investigation should be conducted to assess possible compatibility problems between the new software and other programs or systems. See: different software system analysis; testing, integration; testing, interface.
Testing, exhaustive. (NBS) Executing the program with all possible combinations of values for program variables. Feasible only for small, simple programs.
Testing, design-based functional. (NBS) The application of test data derived through functional analysis extended to include design functions as well as requirement functions. See testing, functional.
Testing, development. (IEEE) Testing conducted during the development of a system or component, usually in the development environment by the developer. Contrast with testing, acceptance; testing, operational.
Testing, formal. (IEEE) Testing conducted in accordance with test plans and procedures that have been reviewed and approved by a customer, user, or designated level of management. Antonym: informal testing.
Testing, functional. (IEEE) (1) Testing that ignores the internal mechanism or structure of a system or component and focuses on the outputs generated in response to selected inputs and execution conditions. (2) Testing conducted to evaluate the compliance of a system or component with specified functional requirements and corresponding predicted results. Syn: black-box testing, input/output driven testing. Contrast with testing, structural.
Testing, interface. (IEEE) Testing conducted to evaluate whether systems or components pass data and control correctly to one another. Contrast with testing, unit; testing, system. See testing, integration.
Testing, operational. (IEEE) Testing conducted to evaluate a system or component in its operational environment. Contrast with testing, development; testing, acceptance. See testing, system.
Testing, parallel. (ISO) Testing a new or an alternate data processing system with the same source data that is used in another system. The other system is considered the standard of comparison. Syn: parallel run.
Testing, path. (NBS) Testing to satisfy coverage criteria that each logical path through the program be tested. Often paths through the program are grouped into a finite set of classes; one path from each class is then tested. Syn: path coverage. Contrast with testing, branch; testing, statement; branch coverage; condition coverage; decision coverage.
Testing, qualification. (IEEE) Formal testing, usually conducted by the developer for the consumer, to demonstrate that the software meets its specified requirements. See testing, acceptance; testing, system.
Testing, regression. (NIST) Rerunning test cases that a program has previously executed correctly to detect errors spawned by changes or corrections made during software development and maintenance.
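For illustration, a minimal Python sketch of regression testing: cases the program previously passed are stored and rerun after a change to detect errors the change may have introduced; the function and cases are hypothetical:

    def discount(price, code):
        # Function under maintenance (hypothetical); a change here may break old behavior.
        if code == "SAVE10":
            return round(price * 0.90, 2)
        return price

    # Regression suite: cases that passed before the latest change.
    regression_cases = [
        ((100.0, "SAVE10"), 90.0),
        ((100.0, ""), 100.0),
        ((19.99, "SAVE10"), 17.99),
    ]

    failures = [(args, expected, discount(*args))
                for args, expected in regression_cases
                if discount(*args) != expected]
    print("regression suite clean" if not failures else f"regressions found: {failures}")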
Testing, system. (IEEE) The process of testing an integrated hardware and software system to verify that the system meets its specified requirements. Such testing may be conducted in both the development environment and the target environment.
Testing, unit. (1) (NIST) Testing of a module for typographic, syntactic, and logical errors, for correct implementation of its design, and for satisfaction of its requirements. (2) (IEEE) Testing conducted to verify the implementation of the design for one software element, e.g., a unit or module, or a collection of software elements. Syn: component testing.
Testing, usability. Tests designed to evaluate whether the system is designed in a manner such that the information is displayed in an understandable fashion, enabling the operator to correctly interact with the system.
Testing, volume. Testing designed to challenge a system’s ability to manage the maximum amount of data over a period of time. This type of testing also evaluates a system’s ability to handle overload situations in an orderly fashion.
Testing, worst case. Testing that encompasses upper and lower limits and circumstances that pose the greatest chance of finding errors. Syn: most appropriate challenge conditions. See testing, boundary value; testing, invalid case; testing, special case; testing, stress; testing, volume.
Testing, integration. (IEEE) An orderly progression of testing in which software elements, hardware elements, or both are combined and tested, to evaluate their interactions, until the entire system has been integrated.
Testing, performance. (IEEE) Functional testing conducted to evaluate the compliance of a system or component with specified performance requirements.
Testing, special case. A testing technique using input values that seem likely to cause program errors; e.g., “0”, “1”, NULL, an empty string. See error guessing.
Testing, statement. (NIST) Testing to satisfy the criterion that each statement in a program is executed at least once during program testing. Syn: statement coverage. Contrast with testing, branch; testing, path; branch coverage; condition coverage; decision coverage; multiple condition coverage; path coverage.
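A tiny Python illustration of the criterion with a hypothetical function: a single input exercises only one branch, so a second input is needed before every statement has executed at least once:

    def ship_cost(weight_kg):
        if weight_kg <= 2.0:
            return 5.00               # statement A
        return 5.00 + weight_kg       # statement B

    # Inputs 1.0 and 3.0 together execute statements A and B (full statement
    # coverage); either input alone leaves one statement unexecuted.
    assert ship_cost(1.0) == 5.00
    assert ship_cost(3.0) == 8.00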
Testing, valid case. A testing technique using valid [normal or expected] input values or conditions. See equivalence class partitioning.
Testing. (IEEE) (1) The process of operating a system or component under specified conditions, observing or recording the results, and making an evaluation of some aspect of the system or component. (2) The process of analyzing a software item to detect the differences between existing and required conditions, i.e., bugs, and to evaluate the features of the software items. See dynamic analysis, static analysis, and software engineering.
TGR Things Gone Right.
TGW Things Gone Wrong.
TOPS Team Oriented Problem Solving
Total Quality Management (TQM) TQM comprises management and control activities based on the leadership of top management and on the involvement of all employees and all departments, from planning and development to sales and service. These management and control activities focus on quality assurance, by which those qualities which satisfy the customer are built into products and services during the above processes and then offered to consumers.
Total Quality Management Managing for quality in all aspects of an organization focusing on employee participation and customer satisfaction. Often used as a catch-all phrase for implementing various quality control and improvement tools.
Total Quality Management/Total Quality Leadership (TQM/TQL) Both a philosophy and a set of guiding principles that represent the foundation of the continuously improving organization. TQM/TQL is the application of quantitative methods and human resources to improve the material and services supplied to an organization, all the processes within an organization, and the degree to which the needs of the customer are met, now and in the future. TQM/TQL integrates fundamental management techniques, existing improvement efforts, and technical tools under a disciplined approach focused on continuous improvement.
TQM Total Quality Management: A management approach of an organization centered on quality.
Trace. (IEEE) (1) A record of the execution of a computer program, showing the sequence of instructions executed, the names and values of variables, or both. Types include execution trace, retrospective trace, subroutine trace, symbolic trace, and variable trace. (2) To produce a record as in (1). (3) To establish a relationship between two or more products of the development process; e.g., to establish the relationship between a given requirement and the design element that implements that requirement.
Traceability analysis. (IEEE) The tracing of (1) Software Requirements Specifications requirements to system requirements in concept documentation, (2) software design descriptions to software requirements specifications and software requirements specifications to software design descriptions, (3) source code to corresponding design specifications and design specifications to source code. Analyze identified relationships for correctness, consistency, completeness, and accuracy. See traceability, traceability matrix.
Traceability matrix. (IEEE) A matrix that records the relationship between two or more products; e.g., a matrix that records the relationship between the requirements and the design of a given software component. See traceability, traceability analysis.
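For illustration, a minimal Python sketch of a traceability matrix as a data structure: a mapping from requirement identifiers to the design elements and test cases that realize them; all identifiers are hypothetical:

    # Hypothetical requirements-to-design/test traceability matrix.
    traceability = {
        "REQ-001": {"design": ["DES-010"], "tests": ["TC-101", "TC-102"]},
        "REQ-002": {"design": ["DES-011"], "tests": []},
    }

    # A simple completeness check: flag requirements with no covering test case.
    untested = [req for req, links in traceability.items() if not links["tests"]]
    print("requirements lacking tests:", untested)   # ['REQ-002']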
Traceability The ability to trace a product back through the process, and identify all sub-processes, components, and equipment that were involved in its manufacture.
Traceability. (IEEE) (1) The degree to which a relationship can be established between two or more products of the development process, especially products having a predecessor-successor or master-subordinate relationship to one another; e.g., the degree to which the requirements and design of a given software component match. See consistency. (2) The degree to which each element in a software development product establishes its reason for existing; e.g., the degree to which each element in a bubble chart references the requirement that it satisfies. See traceability analysis and traceability matrix.
Transition period The time when an organization is moving away from an old way of thinking to a new way.
Tree diagram A chart used to break any task, goal, or category into increasingly detailed levels of information. Family trees are a classic example of a tree diagram.
TRIZ Theory of Inventive Problem Solving
Trojan horse. A method of attacking a computer system, typically by providing a useful program that contains code intended to compromise a computer system by secretly providing for unauthorized access, the unauthorized collection of privileged system or user data, the unauthorized reading or altering of files, the performance of unintended and unexpected functions, or the malicious destruction of software and hardware. See: bomb, virus, worm.
Type I error Rejecting something acceptable. Also known as an alpha error.
Type II error Accepting something that should have been rejected. Also known as a beta error.
U
u chart A control chart showing the count of defects per unit in a series of random samples.
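For illustration, a small Python sketch of u chart limits, assuming the conventional three-sigma formula u-bar ± 3·sqrt(u-bar/n), where n is the subgroup size; the inspection data are hypothetical:

    import math

    # (defect count, units inspected) for each sample -- illustrative data.
    samples = [(4, 50), (7, 60), (3, 45), (6, 55)]

    total_defects = sum(c for c, _ in samples)
    total_units = sum(n for _, n in samples)
    u_bar = total_defects / total_units           # average defects per unit

    for count, n in samples:
        sigma = math.sqrt(u_bar / n)              # limits vary with subgroup size n
        ucl = u_bar + 3 * sigma
        lcl = max(0.0, u_bar - 3 * sigma)
        print(f"n={n}: u={count / n:.3f}  LCL={lcl:.3f}  UCL={ucl:.3f}")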
UPPER CONTROL LIMIT A horizontal line on a control chart (usually dotted) that represents the upper limit of process capability.
Usability. (IEEE) The ease with which a user can operate, prepare inputs for, and interpret a system or component.
User. (ANSI) Any person, organization, or functional unit that uses the services of an information processing system. See end user.
User’s guide. (ISO) Documentation that describes how to use a functional unit, and that may include a description of the rights and responsibilities of the user, the owner, and the supplier of the unit. Syn: user manual, operator manual
V
VA/VE Value Analysis/Value Engineering.
Validate. To prove to be valid.
Validation Establishing proof that a design, product, or process will perform to specifications.
Validation protocol. (FDA) A written plan stating how validation will be conducted, including test parameters, product characteristics, production equipment, and decision points on what constitutes acceptable test results. See test plan.
Validation, process. (FDA) Establishing documented evidence that provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specifications and quality characteristics.
Validation, software. (NBS) Determination of the correctness of the final program or software produced from a development project with respect to the user’s needs and requirements. Validation is usually accomplished by verifying each stage of the software development life cycle. See verification, software.
Validation. (1) (FDA) Establishing documented evidence that provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specifications and quality attributes. Contrast with data validation.
Value Added Activity An activity in a process that adds value to an output product or service, that is, the activity merits the cost of the resources it consumes in production.
Value Added Any action, activity, or process that adds direct value to the output of the action, activity, or process.
Value added The result of work done to inputs that transforms them into something of greater usefulness as an end product.
Variable A characteristic that may take on different values.
Variable Cost A cost element that varies directly with the amount of product or service produced by an activity. Variable costs go to zero if the activity stops.
Variables data Data that is measured on a continuous and infinite scale, such as temperature, distance, and pressure, rather than in discrete units or yes/no options. Variables data is used to create histograms, some control charts, and sometimes run charts.
Variance A measure of deviation from the mean in a sample or population; the average of the squared deviations from the mean.
VARIATION Any quantifiable difference between individual measurements; such differences can be classified as being due to common causes (random) or special causes (assignable).
Variation Change in the output or result of a process. Variation can be caused by common causes, special causes, tampering, or structural variation.
Vendor. A person or an organization that provides software and/or hardware and/or firmware and/or documentation to the user for a fee or in exchange for services. Such a firm could be a medical device manufacturer.
Verification Establishing proof that a design, product, or process is within specifications.
Verification, software. (NBS) In general, the demonstration of consistency, completeness, and correctness of the software at each stage and between each stage of the development life cycle. See validation, software.
Verify. (ANSI) (1) To determine whether a transcription of data or other operation has been accomplished accurately. (2) To check the results of data entry; e.g., keypunching. (3) (Webster) To prove to be true by demonstration.
Version number. A unique identifier used to identify software items and the related software documentation which are subject to configuration control.
Virus. A program which secretly alters other programs to include a copy of itself and which executes when the host program is executed. The execution of a virus program compromises a computer system by performing unwanted or unintended functions which may be destructive. See: bomb, trojan horse, worm.
Vision Often incorporated into an organizational mission (or vision) statement to clarify what the organization hopes to be doing at some point in the future. The vision should act as a guide in choosing courses of action for the organization.
Vision Statement Vision and Mission have a cause and effect relationship. Vision should reflect what the organization sees for itself 5-10 years down the road. The short time frame helps assure that the organization revitalizes itself every decade or so. The Vision statement should contain direction (improve, decrease, etc.) + indicator (quality, customer satisfaction, etc.) + target value (how much, #, %, etc.) + time limit (by when).
Voice of the Customer Customer feedback, both positive and negative, including likes, dislikes, problems, and suggestions.
Voice of the Process Statistical data fed back to the people in the process so they can make decisions about the process’s stability and/or capability, as a tool for continuous improvement.
WXYZ
Waiver Written authorization to use or release a quantity of material, components, or stores already manufactured but not conforming to the specified requirements.
Walkthrough. See code walkthrough.
X-bar & R CHARTS A pair of control charts that represents process capability over time; displays the variability in the process average (X-bar chart) and range (R chart) across time.
XmR Charts Control charts for individual values (X) and the moving range (mR). The moving range is typically computed over two consecutive observations but can use a larger span.
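For illustration, a minimal Python sketch of XmR (individuals and moving range) chart limits, using the conventional constants for a moving range of two observations (2.66 and 3.267); the data are hypothetical:

    data = [10.1, 9.8, 10.4, 10.0, 10.3, 9.9]    # individual measurements

    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    x_bar = sum(data) / len(data)
    mr_bar = sum(moving_ranges) / len(moving_ranges)

    # Conventional XmR constants for a moving range of two observations.
    ucl_x = x_bar + 2.66 * mr_bar
    lcl_x = x_bar - 2.66 * mr_bar
    ucl_mr = 3.267 * mr_bar

    print(f"X chart:  LCL={lcl_x:.2f}  center={x_bar:.2f}  UCL={ucl_x:.2f}")
    print(f"mR chart: center={mr_bar:.2f}  UCL={ucl_mr:.2f}")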
Z
Zero defects Philip Crosby’s recommended performance standard, which leaves no doubt regarding the goal of total quality. Crosby’s theory holds that people can continually move closer to this goal by committing themselves to their work and the improvement process.