DICTIONARY ON QUALITY, PRODUCTIVITY & BUSINESS EXCELLENCE
Academic Quality Improvement Project (AQIP): A forum for higher education institutions to review one another’s action projects.
Acceptance quality limit (AQL): In a continuing series of lots, a quality level that, for the purpose of sampling inspection, is the limit of a satisfactory process average.
Acceptance number: The maximum number of defects or defectives allowable in a sampling lot for the lot to be acceptable.
Acceptance sampling: Inspection of a sample from a lot to decide whether to accept that lot. There are two types: attributes sampling and variables sampling. In attributes sampling, the presence or absence of a characteristic is noted in each of the units inspected. In variables sampling, the numerical magnitude of a characteristic is measured and recorded for each inspected unit; this involves reference to a continuous scale of some kind.
Acceptance sampling plan: A specific plan that indicates the sampling sizes and associated acceptance or nonacceptance criteria to be used. In attributes sampling, for example, there are single, double, multiple, sequential, chain and skip-lot sampling plans. In variables sampling, there are single, double and sequential sampling plans. For detailed descriptions of these plans, see the standard ANSI/ISO/ASQ A3534-2-1993: Statistics—Vocabulary and Symbols—Statistical Quality Control.
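For illustration, a minimal sketch of the single attributes sampling plan described above (the specific numbers are hypothetical): a lot is accepted when the number of defectives found in a random sample of n units does not exceed the acceptance number c, so the acceptance probability at incoming fraction defective p follows the binomial distribution.

```python
from math import comb

def acceptance_probability(n, c, p):
    """Probability of accepting a lot under a single sampling plan:
    inspect n units, accept if the number of defectives is <= c.
    Defectives in the sample are modeled as Binomial(n, p)."""
    return sum(comb(n, d) * p**d * (1 - p)**(n - d) for d in range(c + 1))

# Hypothetical plan: n = 50, c = 2. A lot at 1% defective is accepted
# far more often than one at 10% defective.
good = acceptance_probability(50, 2, 0.01)
bad = acceptance_probability(50, 2, 0.10)
```

Plotting this probability against p gives the plan's operating characteristic (OC) curve, which is how plans such as those in ANSI/ISO/ASQ standards are compared.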
Accreditation: Certification by a recognized body of the facilities, capability, objectivity, competence and integrity of an agency, service or operational group or individual to provide the specific service or operation needed. The term has multiple meanings depending on the sector. Laboratory accreditation assesses the capability of a laboratory to conduct testing, generally using standard test methods. Accreditation for healthcare organizations involves an authoritative body surveying and verifying compliance with recognized criteria, similar to certification in other sectors.
Accreditation body: An organization with authority to accredit other organizations to perform services such as quality system certification.
Accuracy: The closeness of agreement between an observed value and an accepted reference value.
ACLASS Accreditation Services: An ANSI-ASQ National Accreditation Board company that provides accreditation services for: testing and calibration laboratories in accordance with ISO/IEC 17025; reference material producers in accordance with ISO Guide 34; and inspection bodies in accordance with ISO/IEC 17020.
Activity based costing: An accounting system that assigns costs to a product based on the amount of resources used to design, order or make it.
Activity network diagram: An arrow diagram used in planning.
Advanced Product Quality Planning (APQP): High level automotive process for product realization, from design through production part approval.
Adverse event: Healthcare term for any event that is not consistent with the desired, normal or usual operation of the organization; also known as a sentinel event.
Affinity diagram: A management tool for organizing information (usually gathered during a brainstorming activity).
Alignment: Actions to ensure that a process or activity supports the organization’s strategy, goals and objectives.
American Association for Laboratory Accreditation (A2LA): An organization that formally recognizes another organization’s competency to perform specific tests, types of tests or calibrations.
American Customer Satisfaction Index (ACSI): Released for the first time in October 1994, an economic indicator and cross industry measure of the satisfaction of U.S. household customers with the quality of the goods and services available to them. This includes goods and services produced in the United States and imports from foreign firms that have substantial market shares or dollar sales. ASQ is a founding sponsor of the ACSI, along with the University of Michigan Business School and the CFI Group.
American National Standards Institute (ANSI): A private, nonprofit organization that administers and coordinates the U.S. voluntary standardization and conformity assessment system. It is the U.S. member body in the International Organization for Standardization, known as ISO.
American National Standards Institute-American Society for Quality (ANSI-ASQ): Organization that accredits certification bodies for ISO 9001 quality management systems, ISO 14001 environmental management systems and other industry specific requirements.
American Society for Nondestructive Testing (ASNT): A technical society for nondestructive testing (NDT) professionals.
American Society for Quality (ASQ): A professional, not-for-profit association that develops, promotes and applies quality related information and technology for the private sector, government and academia. ASQ serves more than 108,000 individuals and 1,100 corporate members in the United States and 108 other countries.
American Society for Quality Control (ASQC): Name of ASQ from 1946 through the middle of 1997, when the name was changed to ASQ.
American Society for Testing and Materials (ASTM) International: Not-for-profit organization that provides a forum for the development and publication of voluntary consensus standards for materials, products, systems and services. Known simply as the American Society for Testing and Materials (ASTM) before 2001.
American Society for Training and Development (ASTD): A membership organization providing materials, education and support related to workplace learning and performance.
American standard code for information interchange (ASCII): Basic computer characters accepted by all American machines and many foreign ones.
Analysis of means (ANOM): A statistical procedure for troubleshooting industrial processes and analyzing the results of experimental designs with factors at fixed levels. It provides a graphical display of data. Ellis R. Ott developed the procedure in 1967 because he observed that nonstatisticians had difficulty understanding analysis of variance. Analysis of means is easier for quality practitioners to use because it is an extension of the control chart. In 1973, Edward G. Schilling further extended the concept, enabling analysis of means to be used with non-normal distributions and attributes data in which the normal approximation to the binomial distribution does not apply. This is referred to as analysis of means for treatment effects.
Analysis of variance (ANOVA): A basic statistical technique for determining the proportion of influence a factor or set of factors has on total variation. It subdivides the total variation of a data set into meaningful component parts associated with specific sources of variation to test a hypothesis on the parameters of the model or to estimate variance components. There are three models: fixed, random and mixed.
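The subdivision of variation described above can be sketched for the one-way fixed-effects model (the data below are hypothetical): total variation is split into a between-group component and a within-group component, and their mean-square ratio is the F statistic.

```python
def one_way_anova_f(groups):
    """F statistic for a one-way (fixed-effects) ANOVA: the ratio of the
    between-group mean square to the within-group mean square."""
    k = len(groups)                      # number of factor levels
    n = sum(len(g) for g in groups)      # total observations
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: group means vs. the grand mean.
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: individual values vs. their group mean.
    ssw = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Three machines producing the same part; a large F suggests the factor
# (the machine) explains a meaningful share of the total variation.
f = one_way_anova_f([[5.1, 4.9, 5.0], [5.6, 5.7, 5.5], [5.0, 5.2, 5.1]])
```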
Andon board: A production area visual control device, such as a lighted overhead display. It communicates the status of the production system and alerts team members to emerging problems (from andon, a Japanese word meaning “light”).
ANSI ASC X12: Transaction standards for electronic communication and shipping notification.
Appraisal cost: The cost incurred to determine the degree of conformance to customers’ quality requirements, such as the cost of inspection and testing.
Arrow diagram: A planning tool to diagram a sequence of events or activities (nodes) and their interconnectivity. It is used for scheduling and especially for determining the critical path through nodes.
AS9100: An international quality management standard for the aerospace industry published by the Society of Automotive Engineers and other organizations worldwide. It is known as EN9100 in Europe and JIS Q 9100 in Japan. The standard is controlled by the International Aerospace Quality Group (see listing).
Asia Pacific Laboratory Accreditation Cooperation (APLAC): A cooperative of laboratory accreditation bodies.
Assessment: A systematic evaluation process of collecting and analyzing data to determine the current, historical or projected compliance of an organization to a standard.
Assignable cause: A name for the source of variation in a process that is not due to chance and therefore can be identified and eliminated. Also called “special cause.”
Association for Quality and Participation (AQP): An independent organization until 2004, when it became an affiliate organization of ASQ. It continues today as ASQ’s Team and Workplace Excellence Forum.
Attribute data: Go/no-go information. The control charts based on attribute data include percent chart, number of affected units chart, count chart, count per unit chart, quality score chart and demerit chart.
Attributes, method of: Method of measuring quality that consists of noting the presence (or absence) of some characteristic (attribute) in each of the units under consideration and counting how many units do (or do not) possess it. Example: go/no-go gauging of a dimension.
Audit: The on-site verification activity, such as inspection or examination, of a process or quality system, to ensure compliance to requirements. An audit can apply to an entire organization or might be specific to a function, process or production step.
Automotive Industry Action Group (AIAG): A global automotive trade association with about 1,600 member companies that focuses on common business processes, implementation guidelines, education and training.
Autonomation: A form of automation in which machinery automatically inspects each item after producing it and ceases production and notifies humans if a defect is detected. Toyota expanded the meaning of jidohka to include the responsibility of all workers to function similarly—to check every item produced and, if a defect is detected, make no more until the cause of the defect has been identified and corrected. Also see “jidohka.”
Availability: The ability of a product to be in a state to perform its designated function under stated conditions at a given time.
Average chart: A control chart in which the subgroup average, X-bar, is used to evaluate the stability of the process level.
Average outgoing quality (AOQ): The expected average quality level of an outgoing product for a given value of incoming product quality.
Average outgoing quality limit (AOQL): The maximum average outgoing quality over all possible levels of incoming quality for a given acceptance sampling plan and disposal specification.
Average run length (ARL): On a control chart, the average number of subgroups expected to be inspected before an out-of-control signal appears after a shift in magnitude takes place.
Average sample number (ASN): The average number of sample units inspected per lot when reaching decisions to accept or reject.
Average total inspection (ATI): The average number of units inspected per lot, including all units in rejected lots (applicable when the procedure calls for 100% inspection of rejected lots). (www.asq.org)
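The AOQ, AOQL and ATI definitions above can be tied together in a minimal sketch for a single sampling plan (lot size N, sample n, acceptance number c; the specific numbers are hypothetical), assuming rejected lots are 100% inspected and defectives are replaced:

```python
from math import comb

def single_plan_metrics(N, n, c, p):
    """AOQ and ATI for a single sampling plan at incoming fraction
    defective p, assuming rejected lots are 100% inspected and
    defectives found are replaced with good units."""
    # Probability of acceptance (binomial model for the sample).
    pa = sum(comb(n, d) * p**d * (1 - p)**(n - d) for d in range(c + 1))
    aoq = p * pa * (N - n) / N           # average outgoing quality
    ati = n + (1 - pa) * (N - n)         # average total inspection
    return aoq, ati

# AOQL is the worst case of AOQ over all possible incoming quality levels.
aoql = max(single_plan_metrics(1000, 50, 2, p / 1000)[0] for p in range(1, 200))
```

At p = 0 every lot is accepted, so ATI equals the sample size n; as p rises, more lots are rejected and ATI approaches the lot size N.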
Balanced Scorecard: A framework which translates a company’s vision and strategy into a coherent set of performance measures. Developed by Robert Kaplan and David Norton (published in the Harvard Business Review in 1993), a balanced business scorecard helps businesses evaluate how well they meet their strategic objectives. It typically has four to six components, each with a series of sub-measures. Each component highlights one aspect of the business. The balanced scorecard includes measures of performance that are lagging (return on capital, profit), medium-term indicators (like customer satisfaction indices) and leading indicators (such as adoption rates for, or revenue from, new products).
Baldrige Award: Malcolm Baldrige National Quality Award (MBNQA): An annual award given to United States organizations that excel in quality management and quality achievement.
Bar chart: A chart that compares different groups of data to each other through the use of bars that represent each group. Bar charts can be simple, in which each group of data consists of a single type of data, or grouped or stacked, in which the groups of data are broken down into internal categories.
Baseline: A specification or product that has been formally reviewed and agreed upon, that serves as the basis for further development, and that can be changed only through formal change control procedures.
Batch: A definite quantity of some product or material produced under conditions that are considered uniform.
Batch processing: Execution of programs serially with no interactive processing. Contrast with real time processing.
Benchmark: A standard against which measurements or comparisons can be made.
Benchmark Data: The results of an investigation to determine how competitors and/or best in class companies achieve their level of performance.
Benchmarking: A structured approach for identifying the best practices from industry and government, and comparing and adapting them to the organization’s operations. Such an approach is aimed at identifying more efficient and effective processes for achieving intended results, and suggesting ambitious goals for program output, product/service quality, and process improvement.
Best practice: A way or method of accomplishing a business function or process that is considered to be superior to all other known methods.
Beta risk: The probability of accepting the null hypothesis when, in reality, the alternate hypothesis is true.
Bias: A systematic error, which contributes to the difference between a population mean of measurements or test results and an accepted reference value.
Bill of material: Total list of all components/materials required to manufacture the product.
Black Belt: The leader of the team responsible for applying the Six Sigma process.
Black-box testing: (1) Testing that ignores the internal mechanism or structure of a system or component and focuses on the outputs generated in response to selected inputs and execution conditions. (2) Testing conducted to evaluate the compliance of a system or component with specified functional requirements and corresponding predicted results. Syn: functional testing, input/output driven testing. Contrast with white-box testing.
Block Diagram: A simple pictorial representation of a system in which subsystems are linked to illustrate the relationships between components/subsystems.
BOM: Bill Of Material
Boundary value: (1) (IEEE) A data value that corresponds to a minimum or maximum input, internal, or output value specified for a system or component. (2) A value which lies at, or just inside or just outside a specified range of valid input and output values.
Brainstorming: A tool used to encourage creative thinking and new ideas. A group formulates and records as many ideas as possible concerning a certain subject, regardless of the content of the ideas. No discussion, evaluation, or criticism of ideas is allowed until the brainstorming session is complete.
Branch: An instruction which causes program execution to jump to a new point in the program sequence, rather than execute the next instruction. Syn: jump.
Branch analysis: (Myers) A test case identification technique which produces enough test cases such that each decision has a true and a false outcome at least once. Contrast with path analysis.
Branch coverage: (NBS) A test coverage criteria which requires that for each decision point each possible branch be executed at least once. Syn: decision coverage. Contrast with condition coverage, multiple condition coverage, path coverage, statement coverage.
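The branch coverage criterion above can be shown with a minimal sketch (the function and values are hypothetical): a function with one decision point needs at least one test case that drives the condition true and one that drives it false.

```python
def classify(x, limit):
    """One decision point: branch coverage requires at least one test
    where the condition is true and one where it is false."""
    if x > limit:
        return "out of control"
    return "in control"

# Two test cases are enough to execute both branches of the single
# decision, satisfying branch (decision) coverage for this function.
results = [classify(5, 3), classify(2, 3)]
```

Note that branch coverage is weaker than path coverage: with several decisions, covering each branch once does not cover every combination of branches.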
Breakthrough thinking: A management technique which emphasizes the development of new, radical approaches to traditional constraints, as opposed to incremental or minor changes in thought that build on the original approach.
Bug: A fault in a program which causes the program to perform in an unintended or unanticipated manner. See: anomaly, defect, error, exception, fault.
Business process: A collection of activities that work together to produce a defined set of products and services. All business processes in an enterprise exist to fulfill the mission of the enterprise. Business processes must be related in some way to mission objectives.
Business Process Improvement (BPI): The betterment of an organization’s business practices through the analysis of activities to reduce or eliminate non-value added activities or costs, while at the same time maintaining or improving quality, productivity, timeliness, or other strategic or business purposes as evidenced by measures of performance. Also called functional process improvement.
Business Process Reengineering (BPR): A structured approach by all or part of an enterprise to improve the value of its products and services while reducing resource requirements. The transformation of a business process to achieve significant levels of improvement in one or more performance measures relating to fitness for purpose, quality, cycle time, and cost by using the techniques of streamlining and removing non-value added activities and costs.
Business Process: A collection of related, structured activities — a chain of events — that produces a specific service or product for a particular customer or customers.
Capability: The total range of inherent variation in a stable process, determined using data from control charts. The control charts shall indicate stability before capability calculations can be made. Histograms are used to examine the distribution pattern of individual values and verify a normal distribution. When analysis indicates a stable process and a normal distribution, the indices Cp and Cpk can be calculated. If analysis indicates a non-normal distribution, advanced statistical tools, such as PPM analysis, will be required to determine capability. If control charts show the process to be non-stable, the index Ppk can be calculated.
CAR: Corrective Action Request
Care mapping: A diagrammatic representation of the medical procedure for a particular diagnosis, including key decision points, used to coordinate care and instruct the patient.
Cause: That which produces an effect or brings about a change.
Cause Effect diagram: A tool used to analyze all factors (causes) that contribute to a given situation or occurrence (effect) by breaking down main causes into smaller and smaller sub-causes. It is also known as the Ishikawa or the fishbone diagram.
Cause effect graphing: (1) A test data selection technique. The input and output domains are partitioned into classes and analysis is performed to determine which input classes cause which effect. A minimal set of inputs is chosen which will cover the entire effect set. (2) (Myers) A systematic method of generating test cases representing combinations of conditions. See: testing, functional.
Center line: The line on a statistical process control chart which represents the characteristic’s central tendency.
CFT: Cross Functional Team.
Change control: The processes, authorities for, and procedures to be used for all changes that are made to the computerized system and/or the system’s data. Change control is a vital subset of the Quality Assurance [QA] program within an establishment and should be clearly described in the establishment’s SOPs. See: configuration control.
Change tracker: A software tool which documents all changes made to a program.
Central Tendency: Numerical average, e.g., mean, median, and mode; center line on a statistical process control chart.
Characteristic: A definable or measurable feature of a process, product, or variable.
Chart: A form used to display information obtained through data collection when measuring defects and/or problems.
Charter: A document that specifies the purpose of a team, its power, its reporting relationships, and its specific responsibilities.
Check sheet: A customized form used to record data. Usually, it is used to record how often some activity occurs. Not to be confused with a checklist, which is a list of things to do.
CIM: Computer Integrated Manufacturing
Client/server: A term used in a broad sense to describe the relationship between the receiver and the provider of a service. In the world of microcomputers, the term client-server describes a networked system where front-end applications, as the client, make service requests upon another networked system. Client-server relationships are defined primarily by software. In a local area network [LAN], the workstation is the client and the file server is the server. However, client-server systems are inherently more complex than file server systems. Two disparate programs must work in tandem, and there are many more decisions to make about separating data and processing between the client workstations and the database server. The database server encapsulates database files and indexes, restricts access, enforces security, and provides applications with a consistent interface to data via a data dictionary.
Clinical practice guidelines: A general term for statements of accepted medical procedure for a particular diagnosis.
CMI: Certified Mechanical Inspector
Code audit: An independent review of source code by a person, team, or tool to verify compliance with software design documentation and programming standards. Correctness and efficiency may also be evaluated. Contrast with code inspection, code review, code walkthrough. See: static analysis.
Code Inspection: (Myers/NBS) A manual [formal] testing [error detection] technique where the programmer reads source code, statement by statement, to a group who ask questions analyzing the program logic, analyzing the code with respect to a checklist of historically common programming errors, and analyzing its compliance with coding standards. Contrast with, code audit, code review, code walkthrough. This technique can also be applied to other software and configuration items. Syn: Fagan Inspection. See: static analysis.
Code: See: program, source code.
Code Review: (IEEE) A meeting at which software code is presented to project personnel, managers, users, customers, or other interested parties for comment or approval. Contrast with code audit, code inspection, code walkthrough. See: static analysis.
Code Walkthrough: (Myers/NBS) A manual testing [error detection] technique where program [source code] logic [structure] is traced manually [mentally] by a group with a small set of test cases, while the state of program variables is manually monitored, to analyze the programmer’s logic and assumptions. Contrast with code audit, code inspection, code review. See: static analysis.
Coding Standards: Written procedures describing coding [programming] style conventions specifying rules governing the use of individual constructs provided by the programming language, and naming, formatting, and documentation requirements which prevent programming errors, control complexity and promote understandability of the source code. Syn: development standards, programming standards.
Common cause variation: Variation caused by the process itself. It is produced by the interaction of aspects of the process that affect every occurrence, and it affects all the individual values of a process.
Common causes: Inherent causes of variation in a process. They are typical of the process, not unexpected. That is not to say that they must be tolerated; on the contrary, once special causes of variation are largely removed, a focus on removing common causes of variation can pay big dividends.
Comparator: (IEEE) A software tool that compares two computer programs, files, or sets of data to identify commonalities or differences. Typical objects of comparison are similar versions of source code, object code, database files, or test results.
Completeness: (NIST) The property that all necessary parts of the entity are included. Completeness of a product is often used to express the fact that all requirements have been met by the product. See: traceability analysis.
Complexity: (IEEE) (1) The degree to which a system or component has a design or implementation that is difficult to understand and verify. (2) Pertaining to any of a set of structure based metrics that measure the attribute in (1).
Computer Aided Software Engineering (CASE): An automated system for the support of software development, including an integrated tool set, i.e., programs, which facilitate the accomplishment of software engineering methods and tasks such as project planning and estimation, system and software requirements analysis, design of data structure, program architecture and algorithm procedure, coding, testing and maintenance.
Computer System Audit: (ISO) An examination of the procedures used in a computer system to evaluate their effectiveness and correctness and to recommend improvements. See: software audit.
Computer System Security: (IEEE) The protection of computer hardware and software from accidental or malicious access, use, modification, destruction, or disclosure. Security also pertains to personnel, data, communications, and the physical protection of computer installations.
Confidence Level: The probability that a random variable x lies within a defined interval.
Confidence Limit: The two values that define the confidence interval.
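The two definitions above can be illustrated with a minimal sketch (the data and the 95% level are hypothetical), assuming a large enough sample that the normal critical value z = 1.96 is a reasonable approximation:

```python
from math import sqrt

def mean_confidence_limits(data, z=1.96):
    """Approximate two-sided confidence limits for the population mean.
    Uses the normal critical value z (1.96 for ~95% confidence) and the
    sample standard deviation; a sketch appropriate for large samples."""
    n = len(data)
    mean = sum(data) / n
    sd = sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    half = z * sd / sqrt(n)              # half-width of the interval
    return mean - half, mean + half      # the two confidence limits

lower, upper = mean_confidence_limits([9.8, 10.1, 10.0, 9.9, 10.2, 10.0])
```

For small samples, standard practice replaces z with the Student's t critical value.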
Configurable, off-the-shelf software (COTS): Application software, sometimes general purpose, written for a variety of industries or users in a manner that permits users to modify the program to meet their individual needs.
Configuration control: (IEEE) An element of configuration management, consisting of the evaluation, coordination, approval or disapproval, and implementation of changes to configuration items after formal establishment of their configuration identification. See: change control.
Configuration Management: (IEEE) A discipline applying technical and administrative direction and surveillance to identify and document the functional and physical characteristics of a configuration item, control changes to those characteristics, record and report change processing and implementation status, and verify compliance with specified requirements. See: configuration control, change control.
Conformance: Meeting requirements or specifications.
Confounding: Allowing two or more variables to vary together so that it is impossible to separate their unique effects.
Consensus: Acceptance of a team decision so that everyone on the team can live with the decision and support it.
Consensus Method: A method of reaching unanimous agreement by voluntarily giving consent; an agreement to support a decision.
Consistency checker: A software tool used to test requirements in design specifications for both consistency and completeness.
Consistency: (IEEE) The degree of uniformity, standardization, and freedom from contradiction among the documents or parts of a system or component.
Consumer’s risk: Probability of accepting a lot when, in fact, the lot should have been rejected (see beta risk).
Continuous Data: Numerical information at the interval or ratio level; subdivision is conceptually meaningful; can assume any number within an interval, e.g., 14.652 amps.
Continuous improvement: On-going improvement of any and all aspects of an organization including products, services, communications, environment, functions, individual processes, etc.
Continuous Process Improvement: A policy that encourages, mandates, and/or empowers employees to find ways to improve process and product performance measures on an ongoing basis.
Continuous Random Variable: A random variable which can assume any value continuously in some specified interval.
Control Chart: A line chart with control limits, based on the work of Shewhart and Deming. By mathematically constructing control limits at 3 standard deviations above and below the average, one can determine what variation is due to normal ongoing causes (common causes) and what variation is produced by unique events (special causes). By eliminating the special causes first and then reducing common causes, quality can be improved.
Control Charts: Statistical charts used in process measurement. Used to differentiate process variation caused by common cause versus special cause or assignable cause.
Control flow analysis: (IEEE) A software V&V task to ensure that the proposed control flow is free of problems, such as design or code elements that are unreachable or incorrect.
Control flow diagram: (IEEE) A diagram that depicts the set of all possible sequences in which operations may be performed during the execution of a system or program. Types include box diagram, flowchart, input-process-output chart, state diagram. Contrast with data flow diagram. See: call graph, structure chart.
Control limit: A statistically determined line on a control chart used to analyze variation within a process. If variation exceeds the control limits, then the process is being affected by special causes and is said to be “out of control.” A control limit is not the same as a specification limit.
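As a simplified sketch of the 3-standard-deviation construction described under "Control Chart" (the subgroup averages below are hypothetical), the limits for an average (X-bar) chart can be estimated directly from the spread of the subgroup means; note that standard practice instead estimates sigma from subgroup ranges via tabulated factors such as A2.

```python
from math import sqrt

def three_sigma_limits(subgroup_means):
    """3-sigma control limits for an average (X-bar) chart, estimated
    from the sample standard deviation of the subgroup means. This is a
    simplification; standard charts use range-based factors (A2)."""
    n = len(subgroup_means)
    center = sum(subgroup_means) / n                      # center line
    sd = sqrt(sum((m - center) ** 2 for m in subgroup_means) / (n - 1))
    return center - 3 * sd, center, center + 3 * sd       # LCL, CL, UCL

lcl, cl, ucl = three_sigma_limits([10.1, 9.9, 10.0, 10.2, 9.8, 10.0])
```

A point plotting outside [LCL, UCL] signals a likely special cause; points inside reflect common cause variation.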
Control Plans: Control Plans are written descriptions of the systems for controlling parts and processes. They are written by suppliers to address the important characteristics and engineering requirements of the product. Each part shall have a Control Plan, but in many cases, “family” Control Plans can cover a number of parts produced using a common process. Customer approval of Control Plans may be required prior to production part submission.
Control Point: The desired result of a process.
Control Specifications: Specifications called for by the product being manufactured.
Corrective Action: Documented and purposeful action(s) designed to identify and permanently eliminate the root causes of an identified non-conformance.
Corrective Action Plan: A plan for correcting a process or part quality issue.
Corrective Maintenance: (IEEE) Maintenance performed to correct faults in hardware or software. Contrast with adaptive maintenance, preventative maintenance.
Correctness: (IEEE) The degree to which software is free from faults in its specification, design and coding. The degree to which software, documentation and other items meet specified requirements. The degree to which software, documentation and other items meet user needs and expectations, whether specified or not.
Cost of poor quality: The costs incurred by producing products or services of poor quality. These costs usually include the cost of inspection, rework, duplicate work, scrapping rejects, replacements and refunds, complaints, and loss of customers and reputation.
Cost of Quality: The total labor, materials, and overhead costs attributed to: 1) preventing nonconforming products or services, 2) appraising products or services to ensure conformance, or 3) correcting or scrapping nonconforming products or services.
Count Chart: (c chart) An attributes data control chart that evaluates process stability by charting the counts of occurrences of a given event in successive samples.
Count-per-unit Chart: (u chart) A control chart that evaluates process stability by charting the number of occurrences of a given event per unit sampled, in a series of samples.
Coverage Analysis: (NIST) Determining and assessing measures associated with the invocation of program structural elements to determine the adequacy of a test run. Coverage analysis is useful when attempting to execute each statement, branch, path, or iterative structure in a program. Tools that capture this data and provide reports summarizing relevant information have this feature. See: testing, branch; testing, path; testing, statement.
Cp: Commonly used process capability index defined as [USL (upper spec limit) - LSL (lower spec limit)] / (6 x sigma), where sigma is the estimated process standard deviation.
Cp/Cpk: Capability Ratio/Capability Index
Cpk: Commonly used process capability index defined as the lesser of (USL – m) / (3 x sigma) or (m – LSL) / (3 x sigma), where m is the process mean and sigma is the estimated process standard deviation.
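The Cp and Cpk formulas above can be computed directly from sample data. A minimal sketch — note that it estimates sigma with the sample standard deviation, whereas production use would normally estimate it from a control chart (e.g., R-bar/d2); the data and spec limits are illustrative:

```python
from statistics import mean, stdev

def cp_cpk(samples, lsl, usl):
    """Cp = (USL - LSL) / 6*sigma; Cpk = lesser of the one-sided ratios."""
    mu = mean(samples)
    sigma = stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))
    return cp, cpk

# Illustrative measurements against spec limits 10 +/- 0.3
data = [9.9, 10.1, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9, 10.0, 10.0]
cp, cpk = cp_cpk(data, lsl=9.7, usl=10.3)
```

When the process is centered between the spec limits, Cpk equals Cp; any off-center shift pulls Cpk below Cp.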
CQRE: PIQC’s Certified Quality Engineering Professional
CHRP: PIQC’s Certified Human Resources Professional
CPM: Critical Path Method
CQA: ASQ Certified Quality Auditor
CQE: ASQ Certified Quality Engineer
CQM: ASQ Certified Quality Manager
CQP: PIQC’s Certified Quality Professional
CQSP: PIQC’s Certified Software Quality Professional
CQT: ASQ Certified Quality Technician
Crash: (IEEE) The sudden and complete failure of a computer system or component.
CRE: ASQ Certified Reliability Engineer
Critical characteristics: Those product requirements (dimensions, performance tests) or process parameters that can affect compliance with government regulations or safe vehicle/product function, and which require specific supplier, assembly, shipping, or monitoring actions and inclusion on control plans. Critical characteristics are identified with the inverted delta symbol.
Critical control point: (CA) A function or an area in a manufacturing process or procedure, the failure of which, or loss of control over which, may have an adverse effect on the quality of the finished product and may result in an unacceptable health risk.
Critical design review: (IEEE) A review conducted to verify that the detailed design of one or more configuration items satisfies specified requirements; to establish the compatibility among the configuration items and other items of equipment, facilities, software, and personnel; to assess risk areas for each configuration item; and, as applicable, to assess the results of producibility analyses, review preliminary hardware product specifications, evaluate preliminary test planning, and evaluate the adequacy of preliminary operation and support documents. See: preliminary design review, system design review.
Criticality analysis: (IEEE) Analysis which identifies all software requirements that have safety implications, and assigns a criticality level to each safety-critical requirement based upon the estimated risk.
Criticality (IEEE) The degree of impact that a requirement, module, error, fault, failure, or other item has on the development or operation of a system. Syn: severity.
Cumulative sum chart: Control chart that shows the cumulative sum of deviations from a set value in successive samples. Each plotted point indicates the algebraic sum of the last point and all deviations since.
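The plotted CUSUM path is simply a running total of deviations from the target, which makes small sustained shifts visible as a change in slope. A minimal sketch with illustrative sample values:

```python
def cusum(values, target):
    """Cumulative sum of deviations from a target: each plotted point is
    the previous point plus the latest deviation."""
    total, path = 0.0, []
    for v in values:
        total += v - target
        path.append(total)
    return path

# A small upward drift after the fifth sample shows up as a rising slope
samples = [10.0, 9.8, 10.1, 10.0, 9.9, 10.4, 10.5, 10.3, 10.6, 10.4]
path = cusum(samples, target=10.0)
```

A stable process wanders near zero; the steadily climbing tail of `path` is the CUSUM signature of a mean shift that individual points on a Shewhart chart might not flag.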
CUSTOMER & SUPPLIER REQUIREMENTS WORKSHEET An information gathering tool to use with any work activity. It breaks down a job into its component parts: Customer Requirements and Supplier Requirements.
Customer Satisfaction Index (American): Introduced in 1994 by the University of Michigan and the American Society for Quality, the CSI measures customer satisfaction at the national level. The index declined continually from 1994 through 1997, suggesting that quality improvements were not keeping pace with consumer expectations.
Customer The receiver of an output of a process, either internal or external to the organization. Can be a person, department, company, etc.
CUTOFF POINT The point which partitions the acceptance region from the reject region.
Cycle time: The time that elapses from the beginning to the end of a process or sub-process.
Cyclic redundancy [check] code (CRC): A technique for error detection in data communications used to assure a program or data file has been accurately transferred. The CRC is the result of a calculation on the set of transmitted bits by the transmitter, which is appended to the data. At the receiver the calculation is repeated and the results compared to the encoded value. The calculations are chosen to optimize error detection. Contrast with check summation, parity check.
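The transmit/receive handshake described above can be sketched with the standard library's CRC-32 (the payload bytes are illustrative):

```python
import zlib

def frame(payload: bytes) -> bytes:
    """Transmitter side: append a CRC-32 of the payload to the data."""
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def check(framed: bytes) -> bool:
    """Receiver side: repeat the calculation and compare with the appended value."""
    payload, received = framed[:-4], framed[-4:]
    return zlib.crc32(payload).to_bytes(4, "big") == received

msg = frame(b"quality data")
corrupted = bytes([msg[0] ^ 0x01]) + msg[1:]   # flip one bit in transit
```

The single flipped bit changes the recomputed CRC, so the receiver detects the corruption; CRC polynomials are chosen precisely so that all single-bit (and most burst) errors are caught.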
Cyclomatic complexity: (1) (McCabe) The number of independent paths through a program. (2) (NBS) The cyclomatic complexity of a program is equivalent to the number of decision statements plus one.
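McCabe's measure can also be computed from the control-flow graph as V(G) = E − N + 2P (edges minus nodes plus twice the number of connected components); for a single connected program this agrees with the decisions-plus-one count. A minimal sketch with an illustrative graph:

```python
def cyclomatic(edges, nodes, components=1):
    """McCabe: V(G) = E - N + 2P for a control-flow graph."""
    return edges - nodes + 2 * components

def decisions_plus_one(decision_count):
    """Equivalent count for a single-entry, single-exit program."""
    return decision_count + 1

# Illustrative program with one `if` and one `while` (2 decisions);
# its control-flow graph here has 7 nodes and 8 edges.
v_graph = cyclomatic(edges=8, nodes=7)
v_count = decisions_plus_one(2)
```

Both routes give the same answer for this graph, which is the point of the two definitions quoted in the entry.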
DATA: Factual information used as a basis for reasoning, discussion, or calculation; often refers to quantitative information.
Data Analysis: (IEEE)(1) Evaluation of the description and intended use of each data item in the software design to ensure the structure and intended use will not result in a hazard. Data structures are assessed for data dependencies that circumvent isolation, partitioning, data aliasing, and fault containment issues affecting safety, and the control or mitigation of hazards. (2) Evaluation of the data structure and usage in the code to ensure each is defined and used properly by the program. Usually performed in conjunction with logic analysis.
Data Collection: Gathering facts on how a process works and/or how a process is working from a customer’s point of view. All data collection is driven by a knowledge of the process and guided by statistical principles.
Data Corruption: (ISO) A violation of data integrity. Syn: data contamination.
Effectiveness The state of having produced a decided or desired effect; the state of achieving customer satisfaction
Efficiency A measure of performance that compares output with cost or resource utilization
Embedded Software (IEEE) Software that is part of a larger system and performs some of the requirements of that system; e.g., software used in an aircraft or rapid transit system. Such software does not provide an interface with the user. See: firmware.
Employee involvement Regular participation of employees in decision-making and suggestions. The driving forces behind increasing the involvement of employees are the conviction that more brains are better, that people in the process know it best, and that involved employees will be more motivated to do what is best for the organization.
Empowerment Usually refers to giving employees decision-making and problem-solving authority within their jobs.
End user (ANSI) (1) A person, device, program, or computer system that uses an information system for the purpose of data processing in information exchange. (2) A person whose occupation requires the use of an information system but does not require any knowledge of computers or computer programming. See: user.
Entity relationship diagram (IEEE) A diagram that depicts a set of real-world entities and the logical relationships among them. See: data structure diagram.
Entity The representation of a set of real or abstract things (people, objects, places, events, ideas, combination of things, etc.) that are recognized as the same type because they share the same characteristics and can participate in the same relationships.
Environment All of the process conditions surrounding or affecting the manufacture and quality of a part or product.
Environment (ANSI) (1) Everything that supports a system or the performance of a function. (2) The conditions that affect the performance of a system or function.
Equivalence class partitioning (Myers) Partitioning the input domain of a program into a finite number of classes [sets], to identify a minimal set of well selected test cases to represent these classes. There are two types of input equivalence classes, valid and invalid.
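Partitioning lets one representative test case stand in for each class, valid and invalid. A minimal sketch — the age field, its 18-to-65 range, and the class names are all hypothetical, chosen only to illustrate the technique:

```python
# Input domain: an integer age field valid from 18 to 65 inclusive.
# Valid class: [18, 65]; invalid classes: below range, above range, wrong type.
def accepts_age(value):
    return isinstance(value, int) and 18 <= value <= 65

# One well-selected representative per equivalence class
cases = {
    "valid":       (40, True),
    "below_range": (17, False),
    "above_range": (66, False),
    "wrong_type":  ("40", False),
}
```

If the program treats every member of a class identically, these four cases exercise the same behavior that exhaustive input testing would, at a fraction of the cost.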
Error (ISO) A discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition. See: anomaly, bug, defect, exception, fault.
Error analysis See: debugging, failure analysis.
Error detection Techniques used to identify errors in data transfers. See: check summation, cyclic redundancy check [CRC], parity check, longitudinal redundancy.
Error guessing (NBS) A test data selection technique in which values are picked because they are likely to cause errors.
Error seeding (IEEE) The process of intentionally adding known faults to those already in a computer program for the purpose of monitoring the rate of detection and removal, and estimating the number of faults remaining in the program. Contrast with mutation analysis.
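The estimation step works like a capture-recapture count: if testing finds a known fraction of the seeded faults, the same detection ratio is assumed to apply to the indigenous ones. A minimal sketch of that estimator (the counts are illustrative, and the equal-detectability assumption is the method's known weakness):

```python
def estimate_remaining(seeded, seeded_found, indigenous_found):
    """Estimate indigenous faults still in the program, assuming seeded and
    indigenous faults are equally detectable."""
    if seeded_found == 0:
        raise ValueError("no seeded faults found; estimate undefined")
    detection_ratio = seeded_found / seeded       # fraction of seeds caught
    total_indigenous = indigenous_found / detection_ratio
    return total_indigenous - indigenous_found

# 25 faults seeded; testing uncovers 20 of them plus 40 indigenous faults,
# so the 80% catch rate implies roughly 50 indigenous faults in total.
remaining = estimate_remaining(seeded=25, seeded_found=20, indigenous_found=40)
```

Mutation analysis, contrasted in the entry above, instead generates many program variants and measures how many the test suite kills.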
Event A happening, the arrival of a significant point in time, a change in status of something or the occurrence of something external that causes the business to react.
Event table A table which lists events and the corresponding specified effect[s] of or reaction[s] to each event.
Exception conditions/responses table A special type of event table.
Exception (IEEE) An event that causes suspension of normal program execution. Types include addressing exception, data exception, operation exception, overflow exception, protection exception, underflow exception.
Execution trace (IEEE) A record of the sequence of instructions executed during the execution of a computer program. Often takes the form of a list of code labels encountered as the program executes. Syn: code trace, control flow trace. See: retrospective trace, subroutine trace, symbolic trace, variable trace.
EXECUTIVE OVERVIEW The course that teaches key executives their role in the Quality Process.
Expectations Customer perceptions about how a product or service will meet their needs and requirements. Expectations for a product or service are shaped by many factors, including the specific use the customer intends to make of it, prior experience with a similar product or service, and representations and commitments made by marketing and advertising.
EXPERIMENT A test under defined conditions to determine an unknown effect; to illustrate or verify a known law; to test or establish a hypothesis.
EXPERIMENTAL ERROR Variation in observations made under identical test conditions. Also called residual error. The amount of variation which cannot be attributed to the variables included in the experiment.
External customer A person or organization outside your organization who receives the output of a process. Of all external customers, the end-user should be the most important.
Extremal test data (NBS) Test data that is at the extreme or boundary of the domain of an input variable or which produces results at the boundary of an output domain.
Facilitator Person who helps a team with issues of teamwork, communication, and problem-solving. A facilitator should not contribute to the actual content of the team’s project, acting instead as an observer of the team’s functioning as a group.
Factorial Design Factorial designs are generally employed in engineering and manufacturing experiments. They are appropriate when several factors are to be investigated at two or more levels and interaction of factors may be important. Also see Design of Experiments.
FACTORS Independent variables.
Fail-safe (IEEE) A system or component that automatically places itself in a safe operational mode in the event of a failure.
Failure analysis Determining the exact nature and location of a program error in order to fix the error, to identify and fix other similar errors, and to initiate corrective action to prevent future occurrences of this type of error. Contrast with debugging.
Failure Mode Effects Analysis A technique that systematically analyzes the types of failures which will be expected as a product is used, and what the effects of each “failure mode” will be.
Failure Modes and Effects Analysis (IEC) A method of reliability analysis intended to identify failures, at the basic component level, which have significant consequences affecting the system performance in the application considered.
Failure Modes and Effects Criticality Analysis (IEC) A logical extension of FMEA which analyzes the severity of the consequences of failure
Failure (IEEE) The inability of a system or component to perform its required functions within specified performance requirements. See: bug, crash, exception, fault.
Fault An incorrect step, process, or data definition in a computer program which causes the program to perform in an unintended or unanticipated manner. See: anomaly, bug, defect, error, exception.
Fault Tree Analysis (IEC) The identification and analysis of conditions and factors which cause or contribute to the occurrence of a defined undesirable event, usually one which significantly affects system performance, economy, safety or other required characteristics.
FEA Finite Element Analysis
Feasibility study Analysis of the known or anticipated need for a product, system, or component to assess the degree to which the requirements, designs, or plans can be implemented.
Finite Element Analysis A technique for modeling a complex structure. When the mathematical model is subjected to known loads, the displacement of the structure may be determined.
FISHBONE DIAGRAM Also known as a Cause and Effect Analysis Diagram, used by a problem solving team during brainstorming to logically list and display known and potential causes to a problem. Analysis of the listed causes is done to identify root causes.
Fixed Cost A cost that does not vary with the amount or degree of production. The costs that remain if an activity or process stops.
FIXED EFFECTS MODEL Experimental treatments are specifically selected by the researcher. Conclusions only apply to the factor levels considered in the analysis. Inferences are restricted to the experimental levels.
FIXING Temporary actions taken to make the output of a process conform to its specifications.
Flowchart A pictorial representation showing all of the steps of a process, used to diagram how the process actually functions and where waste, error, and frustration enter it. A flowchart lists the order of activities: the circle symbol indicates the beginning or end of the process, the box indicates action items, and the diamond indicates decision points. A beneficial technique is to map both the ideal process and the actual process and identify the differences as targets for improvements.
Flowchart or flow diagram (1) (ISO) A graphical representation in which symbols are used to represent such things as operations, data, flow direction, and equipment, for the definition, analysis, or solution of a problem. (2) (IEEE) A control flow diagram in which suitably annotated geometrical figures are used to represent operations, data, or equipment, and arrows are used to indicate the sequential flow from one to another. Syn: flow diagram. See: block diagram, box diagram, bubble chart, graph, input-process-output chart, structure chart.
FLUCTUATIONS Variances in data which are caused by a large number of minute variations or differences.
FMA Failure Mode Analysis.
FMEA Failure Mode and Effects Analysis: An analytical technique used to assure that potential failure modes and associated causes have been considered and addressed.
Force Field Analysis A tool, developed by social psychologist Kurt Lewin, which is used to analyze the opposing forces involved in causing/resisting any change. It is shown in balance sheet format with forces that will help (driving forces) listed on the left and forces that hinder (restraining forces) listed on the right.
Formal qualification review (IEEE) The test, inspection, or analytical process by which a group of configuration items comprising a system is verified to have met specific contractual performance requirements. Contrast with code review, design review, requirements review, test readiness review.
FREQUENCY DISTRIBUTION The pattern or shape formed by the group of measurements in a distribution
Frequency distribution An organization of data, usually in a chart, which depicts how often different events occur. A histogram is one common type of frequency distribution, and a frequency polygon is another.
FTC First Time Capability.
Function A specific set of skills and resources that can be used to perform one or more activities that make up a process. Usually several functions are associated with a single process.
Functional configuration audit (IEEE) An audit conducted to verify that the development of a configuration item has been completed satisfactorily, that the item has achieved the performance and functional characteristics specified in the functional or allocated configuration identification, and that its operational and support documents are complete and satisfactory. See: physical configuration audit.
Functional design (IEEE) (1) The process of defining the working relationships among the components of a system. See: architectural design. (2) The result of the process in (1).
Functional Economic Analysis (FEA) A technique for analyzing and evaluating alternative information system investments and management practices. Within DoD, FEA is a business case. Also, a document that contains a fully justified proposed improvement project with all supporting data.
Functional Process Improvement A structured approach by all or part of an enterprise to improve the value of its products and services while reducing resource requirements. Also referred to as business process improvement (BPI), business process redesign, and business reengineering.
Functional requirement (IEEE) A requirement that specifies a function that a system or system component must be able to perform.
Functional verification Testing to ensure the part conforms to all customer and supplier engineering performance and material requirements. Functional verification (to applicable customer engineering material and performance standards) may be required by some customers annually unless another frequency is established in a customer-approved control plan. Results shall be available for customer review upon request.
Gage R&R Gage Repeatability & Reproducibility
Gantt chart A bar chart that shows planned work and finished work in relation to time. Each task in a list has a bar corresponding to it. The length of the bar is used to indicate the expected or actual duration of the task.
Grade An indicator of category or rank related to features or characteristics that cover different sets of needs for products or services intended for the same functional use.
Graph (IEEE) A diagram or other representation consisting of a finite set of nodes and internode connections called edges or arcs. Contrast with blueprint. See: block diagram, box diagram, bubble chart, call graph, cause-effect graph, control flow diagram, data flow diagram, directed graph, flowchart, input-process-output chart, structure chart, transaction flowgraph.
Graphic software specifications Documents such as charts, diagrams, and graphs which depict program structure, states of data, control, transaction flow, HIPO, and cause-effect relationships; and tables including truth, decision, event, state-transition, module interface, and exception conditions/responses necessary to establish design integrity.
Green Belt An individual who supports the implementation and application of Six Sigma tools by way of participation on project teams.
Hazard analysis A technique used to identify conceivable failures affecting system performance, human safety or other required characteristics. See: FMEA, FMECA, FTA, software hazard analysis, software safety requirements analysis, software safety design analysis, software safety code analysis, software safety test analysis, software safety change analysis.
Hazard probability (DOD) The aggregate probability of occurrence of the individual events that create a specific hazard.
Hazard severity (DOD) An assessment of the consequence of the worst credible mishap that could be caused by a specific hazard.
Hazard (DOD) A condition that is prerequisite to a mishap.
Histogram A specialized bar chart showing the distribution of measurement data. It will pictorially reveal the amount and type of variation within a process. It is a bar chart showing a distribution of variables. An example would be to line up by height a group of people in a course. Normally one would be the tallest and one would be the shortest and there would be a cluster of people around an average height. Hence the phrase “normal distribution”. This tool helps identify the cause of problems in a process by the shape of the distribution as well as the width of the distribution.
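The binning step behind a histogram can be sketched in a few lines; the height data below are illustrative, chosen to show the central cluster the entry describes:

```python
from collections import Counter

def histogram(values, bin_width):
    """Count observations per fixed-width bin; each key is a bin's left edge."""
    return Counter((v // bin_width) * bin_width for v in values)

# Heights (cm) of a small group: most cluster around the average
heights = [160, 162, 171, 174, 175, 176, 178, 181, 183, 192]
bins = histogram(heights, bin_width=10)
```

Plotting the bin counts as bars reproduces the familiar bell-like shape; a skewed, bimodal, or truncated shape is the diagnostic signal the entry refers to.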
Homogeneity of Variance The variances of the groups being contrasted are equal (as determined by a statistical test of significant difference).
Hoshin kanri Japanese term for hoshin planning, a form of interactive strategic planning which aids the flow of information up and down the organizational layers in a systematic, productive way.
Hoshin planning A method of strategic planning for quality. It helps executives integrate quality improvement into the organization’s long-range plan. It is a method used to ensure that the mission, vision, goals, and annual objectives of an organization are communicated to and implemented by everyone, from the executive level to the ‘front line’ level.
Inspection A manual testing technique in which program documents [specifications (requirements, design), source code, or user’s manuals] are examined in a very formal and disciplined manner to discover errors, violations of standards and other problems. Checklists are a typical vehicle used in accomplishing this technique. See: static analysis, code audit, code inspection, code review, code walkthrough.
Inspection Activities, such as measuring, examining, testing, gaging one or more characteristics of a product or service, and comparing these with specified requirements to determine conformity.
Instability Unnaturally large fluctuations in a pattern.
Installation and checkout phase (IEEE) The period of time in the software life cycle during which a software product is integrated into its operational environment and tested in this environment to ensure that it performs as required.
Instruction set (1) (IEEE) The complete set of instructions recognized by a given computer or provided by a given programming language. (2) (ISO) The set of the instructions of a computer, of a programming language, or of the programming languages in a programming system. See: computer instruction set.
Instruction (1) (ANSI/IEEE) A program statement that causes a computer to perform a particular operation or set of operations. (2) (ISO) In a programming language, a meaningful expression that specifies one operation and identifies its operands, if any.
Instrumentation (NBS) The insertion of additional code into a program in order to collect information about program behavior during program execution. Useful for dynamic analysis techniques such as assertion checking, coverage analysis, tuning.
Interface analysis (IEEE) Evaluation of: (1) software requirements specifications with hardware, user, operator, and software interface requirements documentation, (2) software design description records with hardware, operator, and software interface requirements specifications, (3) source code with hardware, operator, and software interface design documentation, for correctness, consistency, completeness, accuracy, and readability. Entities to evaluate include data items and control items.
Interface requirement (IEEE) A requirement that specifies an external item with which a system or system component must interact, or sets forth constraints on formats, timing, or other factors caused by such an interaction.
Interface (1) (ISO) A shared boundary between two functional units, defined by functional characteristics, common physical interconnection characteristics, signal characteristics, and other characteristics, as appropriate. The concept involves the specification of the connection of two devices having different functions. (2) A point of communication between two or more processes, persons, or other physical entities. (3) A peripheral device which permits two or more devices to communicate.
Interim Approval Permits shipment of products for a specified time period or quantity.
Internal customer Someone within your organization, further downstream in a process, who receives the output of your work.
Interrelations digraph A graphical representation of all the factors in a complicated problem, system, or situation. It is typically used in conjunction with one of the other quality tools, particularly the affinity diagram. Frequently the header cards from the affinity diagram are used as the starting point for the interrelations digraph.
Interval Numeric categories with equal units of measure but no absolute zero point, e.g., a quality scale or index.
Invalid inputs (1) (NBS) Test data that lie outside the domain of the function the program represents. (2) These are not only inputs outside the valid range for data to be input, i.e., when the specified input range is 50 to 100, but also unexpected inputs, especially when these unexpected inputs may easily occur; e.g., the entry of alpha characters or special keyboard characters when only numeric data is valid, or the input of abnormal command sequences to a program.
Ishikawa Diagram A problem-solving tool that uses a graphic description of the various process elements to analyze potential sources of variation, or problems.
Ishikawa, Kaoru One of Japan’s quality control pioneers. He developed the cause & effect diagram (Ishikawa diagram) in 1943 and published many books addressing quality control. In addition to his work at Kawasaki, Ishikawa was a long-standing member of the Union of Japanese Scientists and Engineers and an assistant professor at the University of Tokyo.
ISIR Initial Sample Inspection Report
ISO 9000 A family of ISO standards that apply to quality management and quality assurance.
Jidohka: Stopping a line automatically when a defective part is detected. Any necessary improvements can then be made by directing attention to the stopped equipment and the worker who stopped the operation. The jidohka system puts faith in the worker as a thinker and allows all workers the right to stop the line on which they are working. Also see “autonomation.”
JIS Q 9100: An international quality management standard for the aerospace industry. Also see AS9100.
Job Instruction: Quality system documentation that describes work conducted in one function in a company, such as setup, inspection, rework or operator.
Joint Commission: A U.S. healthcare accreditation body; formerly known as Joint Commission for the Accreditation of Healthcare Organizations.
Judgment Inspection: A form of inspection to determine nonconforming product. Also see “informative inspection.”
Juran Trilogy: Three managerial processes identified by Joseph M. Juran for use in managing for quality: quality planning, quality control and quality improvement.
Just-in-time (JIT) Manufacturing: An optimal material requirement planning system for a manufacturing process in which there is little or no manufacturing material inventory on hand at the manufacturing site and little or no incoming inspection.
Just-in-time (JIT) Training: The provision of training only when it is needed to all but eliminate the loss of knowledge and skill caused by a lag between training and use.
Kanban: A Japanese term for one of the primary tools of a just-in-time system. It maintains an orderly and efficient flow of materials throughout the entire manufacturing process. It is usually a printed card that contains specific information such as part name, description and quantity.
Key performance indicator (KPI): A statistical measure of how well an organization is doing in a particular area. A KPI could measure a company’s financial performance or how it is holding up against customer requirements.
Key process: A major system level process that supports the mission and satisfies major consumer requirements.
Key product characteristic: A product characteristic that can affect safety or compliance with regulations, fit, function, performance or subsequent processing of product.
Key process characteristic: A process parameter that can affect safety or compliance with regulations, fit, function, performance or subsequent processing of product.
Key results area: Customer requirements that are critical for the organization’s success.
Kitting: A process in which assemblers are supplied with kits—a box of parts, fittings and tools—for each task they perform. This eliminates time consuming trips from one parts bin, tool crib or supply center to another to get necessary materials.
Kruskal-Wallis test: A nonparametric test to compare three or more samples. It tests the null hypothesis that all populations have identical distribution functions against the alternative hypothesis that at least one of the samples differs only with respect to location (median), if at all. It is the analogue to the F-test used in analysis of variance. While analysis of variance tests depend on the assumption that all populations under comparison are normally distributed, the Kruskal-Wallis test places no such restriction on the comparison. It is a logical extension of the Wilcoxon Mann-Whitney test (see listing).
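The H statistic behind the test ranks all observations together and compares each group's rank sum with what equal medians would predict. A minimal sketch without tie correction (the rank lookup assumes all values are distinct; the two groups are illustrative):

```python
def kruskal_wallis_h(*groups):
    """H = 12/(N(N+1)) * sum(R_i^2 / n_i) - 3(N+1), where R_i is the rank
    sum of group i and N is the pooled sample size. No tie correction."""
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}   # assumes no tied values
    n_total = len(pooled)
    rank_term = sum(sum(rank[v] for v in g) ** 2 / len(g) for g in groups)
    return 12 / (n_total * (n_total + 1)) * rank_term - 3 * (n_total + 1)

# Two well-separated groups of three observations each
h = kruskal_wallis_h([1.1, 2.4, 3.0], [4.2, 5.7, 6.3])
```

The resulting H is compared against a chi-square distribution with (number of groups − 1) degrees of freedom; because only ranks enter the calculation, no normality assumption is needed.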
Laboratory: A test facility that can include chemical, metallurgical, dimensional, physical, electrical and reliability testing or test validation.
Laboratory scope: A record containing the specific tests, evaluations and calibrations a laboratory has the ability and competency to perform, the list of equipment it uses, and a list of the methods and standards to which it adheres for each of these.
Last off part comparison: A comparison of the last part off a production run with a part off the next production run to verify that the quality level is equivalent.
Layout inspection: The complete measurement of all dimensions shown on a design record.
Lead time: The total time a customer must wait to receive a product after placing an order.
Leadership: An essential part of a quality improvement effort. Organization leaders must establish a vision, communicate that vision to those in the organization and provide the tools and knowledge necessary to accomplish the vision.
Lean: Producing the maximum sellable products or services at the lowest operational cost while optimizing inventory levels.
Lean enterprise: A manufacturing company organized to eliminate all unproductive effort and unnecessary investment, both on the shop floor and in office functions.
Lean manufacturing/production: An initiative focused on eliminating all waste in manufacturing processes. Principles of lean manufacturing include zero waiting time, zero inventory, scheduling (internal customer pull instead of push system), batch to flow (cut batch sizes), line balancing and cutting actual process times. The production systems are characterized by optimum automation, just-in-time supplier delivery disciplines, quick changeover times, high levels of quality and continuous improvement.
Lean migration: The journey from traditional manufacturing methods to one in which all forms of waste are systematically eliminated.
Level loading: A technique for balancing production throughput over time.
Life cycle stages: Design, manufacturing, assembly, installation, operation and shutdown periods of product development.
Line balancing: A process in which work elements are evenly distributed and staffing is balanced to meet takt time (see listing).
Listening post: An individual who, by virtue of his or her potential for having contact with customers, is designated to collect, document and transmit pertinent feedback to a central collection authority in the organization.
Load-load: A method of conducting single-piece flow in which the operator proceeds from machine to machine, taking the part from one machine and loading it into the next. The lines allow different parts of a production process to be completed by one operator, eliminating the need to move around large batches of work-in-progress inventory.
Lost customer analysis: Analysis conducted to determine why a customer or a class of customers was lost.
Lot: A defined quantity of product accumulated under conditions considered uniform for sampling purposes.
Lot, batch: A definite quantity of some product manufactured under conditions of production that are considered uniform.
Lot quality: The value of percentage defective or of defects per hundred units in a lot.
Lot size (also referred to as N): The number of units in a lot.
Lot tolerance percentage defective (LTPD): Expressed in percentage defective, the poorest quality in an individual lot that should be accepted. Note: LTPD is used as a basis for some inspection systems and is commonly associated with a small consumer's risk.
Lower control limit (LCL): Control limit for points below the central line in a control chart.
Maintainability: The probability that a given maintenance action for an item under given usage conditions can be performed within a stated time interval when the maintenance is performed under stated conditions using stated procedures and resources. Maintainability has two categories: serviceability (the ease of conducting scheduled inspections and servicing) and repairability (the ease of restoring service after a failure).
Malcolm Baldrige National Quality Award (MBNQA): An award established by the U.S. Congress in 1987 to raise awareness of quality management and recognize U.S. companies that have implemented successful quality management systems. Awards can be given annually in six categories: manufacturing, service, small business, education, healthcare and nonprofit. The award is named after the late Secretary of Commerce Malcolm Baldrige, a proponent of quality management. The U.S. Commerce Department’s National Institute of Standards and Technology manages the award, and ASQ administers it.
Management review: A periodic management meeting to review the status and effectiveness of the organization’s quality management system.
Manager: An individual charged with managing resources and processes.
Manufacturing resource planning (MRP II): Material requirements planning (see listing), plus capacity planning, a finance interface to translate operational planning into financial terms, and a simulation tool to assess alternative production plans.
Mapping symbols or icons: An easy, effective way to communicate the flow of materials and information through a plant. The symbol type doesn’t matter, as long as the use is consistent from map to map. Mapping the flow helps identify constraints and potential improvement opportunities.
Master Black Belt (MBB): Six Sigma or quality expert responsible for strategic implementations in an organization. An MBB is qualified to teach other Six Sigma facilitators the methods, tools and applications in all functions and levels of the company and is a resource for using statistical process control in processes.
Material handling: Methods, equipment and systems for conveying materials to various machines and processing areas and for transferring finished parts to assembly, packaging and shipping areas.
Material requirements planning (MRP): A computerized system typically used to determine the quantity and timing requirements for production and delivery of items to both customers and suppliers. Using MRP to schedule production at various processes will result in push production because any predetermined schedule is an estimate only of what the next process will actually need.
Matrix: A planning tool for displaying the relationships among various data sets.
Mean: A measure of central tendency; the arithmetic average of all measurements in a data set.
Mean time between failures (MTBF): The average time interval between failures for repairable product for a defined unit of measure; for example, operating hours, cycles and miles.
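The MTBF definition above reduces to a simple ratio. A minimal sketch with hypothetical figures:

```python
# Hedged example: MTBF for a repairable product, using hypothetical data.
# MTBF = total operating time / number of failures over that period.

operating_hours = 12_000   # total observed operating time (the unit of measure)
failure_count = 4          # failures recorded in that period

mtbf = operating_hours / failure_count
print(mtbf)   # 3000.0 hours between failures on average
```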
Measure: The criteria, metric or means to which a comparison is made with output.
Measurement: The act or process of quantitatively comparing results with requirements.
Measurement system: All operations, procedures, devices and other equipment or personnel used to assign a value to the characteristic being measured.
Measurement uncertainty: The result of random effects and imperfect correction of systemic effects in obtaining a measurement value that results in variation from the actual true value; also known as measurement error.
Median: The middle number or center value of a set of data in which all the data are arranged in sequence.
Metric: A standard for measurement.
Metrology: The science of weights and measures or of measurement; a system of weights and measures.
MIL-Q-9858A: A military standard that describes quality program requirements.
MIL-STD-45662A: A military standard that describes the requirements for creating and maintaining a calibration system for measurement and test equipment.
MIL-STD-105E: A military standard that describes the sampling procedures and tables for inspection by attributes.
Mission: An organization’s purpose.
Mistake proofing: Use of production or design features to prevent the manufacture of a nonconforming product or its passage downstream; also known as “error proofing.”
Mode: The value occurring most frequently in a data set.
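The three measures of central tendency defined above (mean, median and mode) can be computed with Python's standard statistics module; the data set here is hypothetical.

```python
# Sketch of the mean, median and mode entries on a small data set.
import statistics

data = [2, 3, 3, 5, 7, 7, 7, 9]

mean = statistics.mean(data)      # arithmetic average of all measurements
median = statistics.median(data)  # middle value of the ordered data
mode = statistics.mode(data)      # most frequently occurring value

print(mean, median, mode)   # 5.375 6.0 7
```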
Monument: Any design, scheduling or production technology with scale requirements that call for designs, orders and products to be brought to the machine to wait in line for processing. The opposite of a right sized (see listing) machine.
Muda: Japanese for waste; any activity that consumes resources but creates no value for the customer.
Multivariate control chart: A control chart for evaluating the stability of a process in terms of the levels of two or more variables or characteristics.
Mutual recognition agreement (MRA): A formal agreement providing reciprocal recognition of the validity of other organizations’ deliverables, typically found in voluntary standards and conformity assessment groups.
Myers-Briggs type indicator (MBTI): A method and instrument for identifying an individual’s personality type based on Carl Jung’s theory of personality preferences.
N: The number of units in a population.
n: The number of units in a sample.
Nagara system: Smooth production flow, ideally one piece at a time, characterized by synchronization (balancing) of production processes and maximum use of available time; includes overlapping of operations where practical. A nagara production system is one in which seemingly unrelated tasks can be produced simultaneously by the same operator.
National Institute of Standards and Technology (NIST): An agency of the U.S. Department of Commerce that develops and promotes measurements, standards and technology, and manages the Malcolm Baldrige National Quality Award.
Natural team: A team of individuals drawn from a single work group; similar to a process improvement team except that it is not cross functional in composition and it is usually permanent.
Next operation as customer: The concept of internal customers in which every operation is both a receiver and a provider.
Nominal group technique: A technique, similar to brainstorming, to generate ideas on a particular subject. Team members are asked to silently write down as many ideas as possible. Each member is then asked to share one idea, which is recorded. After all the ideas are recorded, they are discussed and prioritized by the group.
Nonconforming record (NCR): A permanent written record for accounting for and preserving knowledge of a nonconforming condition, for the purpose of documenting facts or events.
Nonconformity: The nonfulfillment of a specified requirement. Also see “blemish,” “defect” and “imperfection.”
Nondestructive testing and evaluation (NDT, NDE): Testing and evaluation methods that do not damage or destroy the product being tested.
Nonlinear parameter estimation: A method whereby the arduous and labor-intensive task of multiparameter model calibration can be carried out automatically under the control of a computer.
Nonparametric tests: All tests involving ranked data (data that can be put in order). Nonparametric tests are often used in place of their parametric counterparts when certain assumptions about the underlying population are questionable. For example, when comparing two independent samples, the Wilcoxon Mann-Whitney test (see listing) does not assume that the difference between the samples is normally distributed, whereas its parametric counterpart, the two-sample t-test, does. Nonparametric tests can be, and often are, more powerful in detecting population differences when certain assumptions are not satisfied.
Nonvalue added: A term that describes a process step or function that is not required for the direct achievement of process output. This step or function is identified and examined for potential elimination. Also see “value added.”
Norm (behavioral): Expectations of how a person or persons will behave in a given situation based on established protocols, rules of conduct or accepted social practices.
Normal distribution (statistical): The charting of a data set in which most of the data points are concentrated around the average (mean), thus forming a bell shaped curve.
Number of affected units chart: A control chart for evaluating the stability of a process in terms of the total number of units in a sample in which an event of a given classification occurs.
Objective: A specific statement of a desired short-term condition or achievement; includes measurable end results to be accomplished by specific teams or individuals within time limits.
One-piece flow: The opposite of batch and queue; instead of building many products and then holding them in line for the next step in the process, products go through each step in the process one at a time, without interruption. Meant to improve quality and lower costs.
One-touch exchange of dies: The reduction of die setup to a single step. Also see “single-minute exchange of dies,” “internal setup” and “external setup.”
Operating characteristic curve (OC curve): A graph to determine the probability of accepting lots as a function of the lots’ or processes’ quality level when using various sampling plans. There are three types: type A curves, which give the probability of acceptance for an individual lot coming from finite production (will not continue in the future); type B curves, which give the probability of acceptance for lots coming from a continuous process; and type C curves, which (for a continuous sampling plan) give the long-run percentage of product accepted during the sampling phase.
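A single point on a type B OC curve can be computed from the binomial model. This is a hedged sketch: the plan (n = 50, c = 1) and quality levels are hypothetical, and a full OC curve would evaluate the acceptance probability across a range of fraction-defective values.

```python
# One point on a type B OC curve for a single-sampling attributes plan:
# Pa = P(at most c defectives in a sample of n), binomial model assumed.
from math import comb

def prob_acceptance(n, c, p):
    """Probability of accepting a lot when the process fraction defective is p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# Hypothetical plan: sample size n = 50, acceptance number c = 1.
pa_good = prob_acceptance(50, 1, 0.01)   # near-AQL quality: usually accepted
pa_poor = prob_acceptance(50, 1, 0.10)   # poor quality: usually rejected
print(round(pa_good, 3), round(pa_poor, 3))
```

Plotting Pa against p for the same plan traces out the full OC curve.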
Operating expenses: The money required for a system to convert inventory into throughput.
Operations: Work or steps to transform raw materials to finished product.
Original equipment manufacturer (OEM): A company that uses product components from one or more other companies to build a product that it sells under its own company name and brand. Sometimes mistakenly used to refer to the company that supplies the components.
Overall equipment effectiveness (OEE): The product of a machine’s operational availability, performance efficiency and first-pass yield.
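The OEE definition above is a straight product of three ratios. A minimal sketch with hypothetical figures:

```python
# Hedged sketch of OEE: the product of operational availability,
# performance efficiency and first-pass yield (all hypothetical here).

availability = 0.90        # uptime as a fraction of planned production time
performance = 0.95         # actual output rate vs. ideal rate
first_pass_yield = 0.98    # fraction of units good the first time through

oee = availability * performance * first_pass_yield
print(round(oee, 4))   # 0.8379
```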
Out-of-control process: A process in which the statistical measure being evaluated is not in a state of statistical control. In other words, the variations among the observed sampling results cannot be attributed to a constant system of chance causes. Also see “in-control process.”
Out of spec: A term that indicates a unit does not meet a given requirement or specification.
Outputs: Products, materials, services or information provided to customers (internal or external), from a process.
U chart: Count-per-unit chart.
Unit: An object for which a measurement or observation can be made; commonly used in the sense of a “unit of product,” the entity of product inspected to determine whether it is defective or nondefective.
Upper control limit (UCL): Control limit for points above the central line in a control chart.
Uptime: See “equipment availability.”
Validation: The act of confirming a product or service meets the requirements for which it was intended.
Validity: The ability of a feedback instrument to measure what it was intended to measure; also, the degree to which inferences derived from measurements are meaningful.
Value added: A term used to describe activities that transform input into a customer (internal or external) usable output.
Value analysis: Analyzing the value stream to identify value added and nonvalue added activities.
Value engineering: Analyzing the components and process that create a product, with an emphasis on minimizing costs while maintaining standards required by the customer.
Value stream: All activities, both value added and nonvalue added, required to bring a product from raw material state into the hands of the customer, bring a customer requirement from order to delivery and bring a design from concept to launch. Also see “information flow” and “hoshin planning.”
Value stream loops: Segments of a value stream with boundaries broken into loops to divide future state implementation into manageable pieces.
Value stream manager: Person responsible for creating a future state map and leading door-to-door implementation of the future state for a particular product family. Makes change happen across departmental and functional boundaries.
Value stream mapping: A pencil and paper tool used in two stages. First, follow a product’s production path from beginning to end and draw a visual representation of every process in the material and information flows. Second, draw a future state map of how value should flow. The most important map is the future state map.
Values: The fundamental beliefs that drive organizational behavior and decision making.
Variable data: Measurement information. Control charts based on variable data include average (X-bar) chart, range (R) chart, and sample standard deviation (s) chart (see individual listings).
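The X-bar and R chart limits named above follow standard formulas. A hedged sketch, using the conventional factors for subgroups of size n = 5 (A2 = 0.577, D3 = 0, D4 = 2.114) and hypothetical summary statistics:

```python
# Control limits for the average (X-bar) and range (R) charts,
# using standard factors for subgroup size n = 5 and hypothetical data.

grand_mean = 10.0   # average of the subgroup averages (X-double-bar)
mean_range = 0.4    # average of the subgroup ranges (R-bar)
A2, D3, D4 = 0.577, 0.0, 2.114

ucl_xbar = grand_mean + A2 * mean_range   # upper control limit, X-bar chart
lcl_xbar = grand_mean - A2 * mean_range   # lower control limit, X-bar chart
ucl_r = D4 * mean_range                   # upper control limit, R chart
lcl_r = D3 * mean_range                   # lower control limit, R chart
print(ucl_xbar, lcl_xbar, ucl_r, lcl_r)
```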
Variation: A change in data, characteristic or function caused by one of four factors: special causes, common causes, tampering or structural variation (see individual entries).
Verification: The act of determining whether products and services conform to specific requirements.
Virtual team: Remotely situated individuals affiliated with a common organization, purpose or project, who conduct their joint effort via electronic communication.
Vision: An overarching statement of the way an organization wants to be; an ideal state of being at a future point.
Visual controls: Any devices that help operators quickly and accurately gauge production status at a glance. Progress indicators and problem indicators help assemblers see when production is ahead, behind or on schedule. They allow everyone to instantly see the group’s performance and increase the sense of ownership in the area. Also see “andon board,” “kanban,” “production board,” “painted floor” and “shadow board.”
Vital few, useful many: A term Joseph M. Juran used to describe the Pareto principle, which he first defined in 1950. (The principle was used much earlier in economics and inventory control methods.) The principle suggests most effects come from relatively few causes; that is, 80% of the effects come from 20% of the possible causes. The 20% of the possible causes are referred to as the “vital few;” the remaining causes are referred to as the “useful many.” When Juran first defined this principle, he referred to the remaining causes as the “trivial many,” but realizing that no problems are trivial in quality assurance, he changed it to “useful many.” Also see “eighty-twenty (80-20).”
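The Pareto principle described above is easy to demonstrate numerically. This sketch uses hypothetical defect counts; the point is that the top-ranked causes account for most of the total.

```python
# Illustrative 80-20 sketch: rank hypothetical defect causes by count
# and compute the share of all defects the "vital few" explain.

causes = {"scratches": 120, "misalignment": 60, "cracks": 10,
          "discoloration": 6, "other": 4}

total = sum(causes.values())
ranked = sorted(causes.items(), key=lambda kv: kv[1], reverse=True)

# Share of all defects explained by the two most frequent causes.
vital_few_share = sum(count for _, count in ranked[:2]) / total
print(vital_few_share)   # 0.9
```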
Voice of the customer: The expressed requirements and expectations of customers relative to products or services, as documented and disseminated to the providing organization’s members.
Voluntary standard: A standard that imposes no inherent obligation regarding its use (www.asq.org).
Waste: Any activity that consumes resources and produces no added value to the product or service a customer receives. Also known as muda.
Weighted voting: A way to prioritize a list of issues, ideas or attributes by assigning points to each item based on its relative importance.
Wilcoxon Mann-Whitney test: Used to test the null hypothesis that two populations have identical distribution functions against the alternative hypothesis that the two distribution functions differ only with respect to location (median), if at all. It does not require the assumption that the differences between the two samples are normally distributed. In many applications, it is used in place of the two sample t-test when the normality assumption is questionable. This test can also be applied when the observations in a sample of data are ranks, that is, ordinal data rather than direct measurements.
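The U statistic behind the test above can be computed directly from its pair-counting definition. A hedged sketch with tiny hypothetical samples; a real analysis would also compare U against its null distribution (or a normal approximation) to obtain a p-value.

```python
# Mann-Whitney U from its pair-counting definition (ties count one half).

def mann_whitney_u(x, y):
    """U for sample x: count of pairs (xi, yj) with xi > yj, ties as 0.5."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

# Hypothetical ordinal-scale samples.
x = [1.1, 2.3, 3.5]
y = [0.9, 1.8, 4.0]
u_x = mann_whitney_u(x, y)
u_y = mann_whitney_u(y, x)
print(u_x, u_y)   # 5.0 4.0; note U_x + U_y = len(x) * len(y)
```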
Work in process: Items between machines or equipment waiting to be processed.
Work team: See “natural team.”
Working sequence: One of three elements of standard work; refers to the sequence of operations in a single process that leads a floor worker to most efficiently produce quality goods.
World-class quality: A term used to indicate a standard of excellence: best of the best (www.asq.org).
X-bar chart: Average chart.
Zero defects: A performance standard and method Philip B. Crosby developed; states that if people commit themselves to watching details and avoiding errors, they can move closer to the goal of zero defects.