Statistics for Quality Control
About this ebook

Statistics for Quality Control is a basic-level study of the statistics used in a production operations setting. It is primarily intended for students entering quality control or other industrial and operations careers. It opens with introductory topics such as a brief background, measurements, and graphing techniques; moves on to the beginning statistics needed to understand practices commonly found in industry; and ends with coverage of quality assurance issues along with some pertinent managerial practices associated with the topic.
Language: English
Release date: May 15, 2015
ISBN: 9780831193225

    Book preview

    Statistics for Quality Control - Dan Jackson

    ONE

    Numbers, Measurements, and Scales

    There is no such thing as an exact measure. In theory, measurements are only estimates. For practical purposes, however, these estimates are accurate enough to serve their purpose. Suppose a manufacturer of crankshafts for car engines wants to measure the diameter of one of the journals of a crankshaft. If the surface of the journal is examined under a powerful enough microscope, it will appear uneven. Even if the surface were honed and polished with the most sophisticated technology available, it would still show imperfections at some level of magnification.

    In this case, how do they take the measurement? Do they average the imperfect surface? Or do they settle on the outermost portions of the surface? In that case, if a mechanical measuring instrument is used, how much pressure do they exert to take the measure? Once the measuring instrument makes contact with the surface of the journal, it has already indented that portion of the surface, regardless of how hard the metal is. If they increase the pressure, they will be measuring below the outermost dimension.

    For practical applications, this microscopic depression of material means nothing to the function of the journal, but it does illustrate the fact that the measure is an estimate. The real question is how precise the measuring instrument must be for the dimensions of a practical application.

    Suppose a micrometer is the instrument used. The micrometer is truly an example of Plato's assertion that necessity is the mother of invention. The invention was developed in the mid-19th century (adapted from the earlier telescope version used in astronomy) to improve the accuracy of length measurements for purposes of precision machining (the necessity). Had machining already been precise enough, the micrometer would not have been invented. The micrometer can easily measure to the thousandth of an inch (0.001 in); those with an attached vernier scale can get even closer. Although this measure is plenty close for the crankshaft journal and many other applications, it is still an estimate of the actual distance. Even dial indicators, gauge blocks (a.k.a. jo-blocks), infrared, and laser measurements provide only estimates of the variability that exists along the surface of that journal.

    FIGURE 1.1    Poor accuracy

    Precision in the measuring instrument becomes critical when trying to reveal the variance in any measurable entity. Imagine trying to measure the variability of the journal surface on the crankshaft with a handheld one-foot ruler. Take a diameter measurement every tenth of an inch along the length of the journal and plot the numbers on a graph. The result is a perfectly straight line, as indicated in Figure 1.1, because the ruler cannot resolve the variation. Do the same with a dial indicator (easily accurate to 0.0001 in) and the graphed line will start to show variance in the measurement by moving up and down along the length, as shown in Figure 1.2. The more sophisticated instrument reveals variation the ruler cannot.
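
    The effect is easy to see in a quick simulation. The following is a minimal sketch, not from the book, with hypothetical values: it fakes a journal whose diameter varies slightly along its length, then "measures" it by rounding each true value to an instrument's smallest division, once at a ruler's 1/16-inch resolution and once at a dial indicator's 0.0001-inch resolution.

        # A minimal sketch, not from the book: how instrument resolution can
        # hide variation. The diameters and noise level are hypothetical.
        import random

        random.seed(1)

        # Journal diameter varying slightly around 2.0000 inches along its length.
        true_diameters = [2.0000 + random.gauss(0, 0.0004) for _ in range(50)]

        def measure(values, resolution):
            """Round each true value to the nearest instrument division."""
            return [round(v / resolution) * resolution for v in values]

        ruler_readings = measure(true_diameters, 1 / 16)   # ruler: 1/16-inch divisions
        dial_readings = measure(true_diameters, 0.0001)    # dial indicator: 0.0001 inch

        print(set(ruler_readings))                     # one repeated value: the flat line
        print(min(dial_readings), max(dial_readings))  # the spread the dial reveals

    Every ruler reading rounds to the same value, which is exactly the flat line of Figure 1.1; the dial indicator readings spread out, as in Figure 1.2.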

    Accuracy is another consideration. Although the terms accuracy and precision are often used interchangeably, there is a difference. Accuracy means the measure uses a scale appropriate for, and true to, the characteristic being measured, whereas precision describes how finely that scale resolves the characteristic. The example just given was a distance, or length, measure. The scale (an inch ruler) was representative of distance but was not precise enough to reveal thousandths of an inch. Therefore, that scale was neither precise nor accurate, because its readings were not true representations to the thousandth of an inch. Other measures such as weight, volume, volts, temperature, parts per million, translucency, or dielectric strength require measurement instruments suited to those measures; one wouldn't measure temperature with a tape measure. In addition, each of these instruments must be capable of revealing a measure close enough to expose variation. Some measures become quite complicated and often require a relative scale to describe them. These scales are considered subjective and are covered in more detail later in this chapter.

    The accuracy of any of these measuring instruments depends on their ability to truly measure what the scale and precision purport to measure. Statistics is such a measurement system and, in many ways, a scale that measures and estimates the variance in a set of data. There are both precision and accuracy considerations. Don’t let the term estimate fool you. By estimating and providing a probability of accuracy and precision, as most statistics do, statistics can come closer to the true theoretical measure than the measuring instrument itself.

    FIGURE 1.2    Adequate accuracy
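
    The following is a minimal sketch, not from the book, with hypothetical values, of the claim above that statistics can come closer to the true measure than the instrument itself: each simulated reading is the true value plus noise, rounded to a 0.001-inch micrometer division, yet the mean of many readings lands well inside one division of the truth.

        # A minimal sketch, not from the book; all values are hypothetical.
        import random

        random.seed(2)

        TRUE_VALUE = 2.0005   # hypothetical true diameter, inches
        RESOLUTION = 0.001    # hypothetical micrometer division, inches

        def one_reading():
            """One measurement: truth plus noise, rounded to the instrument."""
            raw = TRUE_VALUE + random.gauss(0, 0.0008)
            return round(raw / RESOLUTION) * RESOLUTION

        readings = [one_reading() for _ in range(1000)]
        mean = sum(readings) / len(readings)

        # Any single reading is off by up to half a division or more; the mean
        # of many readings is typically far closer (standard error ~ s/sqrt(n)).
        print(mean, abs(mean - TRUE_VALUE))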

    The Numbering System

    Contrary to Leopold Kronecker's statement God made the integers; all else is the creation of man, mathematics was in full form prior to humans, the earth, and the universe. There have never been any inventions in mathematics, only discoveries and attempts to explain a natural science. This was no trivial undertaking during man's time on earth. Followers of the Greek mathematician Pythagoras are even said to have murdered Hippasus because he discovered the existence of irrational numbers, and mathematicians still struggle with inconsistencies in our cumulative explanation of this science. Given the mathematics used in statistics and the statistics used in industry, this theoretical realm of mathematics rarely presents problems in practical use. Still, to have a clear understanding of scales and measures, there must first be a clear understanding of numbers.

    Numbers are different from numerals. Numbers are an almost instinctive concept to humans. Undoubtedly, because humans have five fingers on each hand and five toes on each foot, counting was initially relative to these extremities. Perhaps in trade and commerce, humans even conceptualized counting (what we now call the natural numbers) in units of two hands and two feet (20) of sheep or cattle. Then the idea of representing this concept with a symbol emerged.

    The earliest evidence of this was probably the act of notching bones in Africa some 37,000 years ago. These tally sticks may represent a calendar or an accumulation of game or crops. Written numbers, or numerals, began much later and developed alongside writing. The Sumerians (base 60), then the Egyptians (base 10), Indians (eastern, unless otherwise noted), and Chinese developed symbols for numbers starting around 7000 years ago. Because of proximity, these systems may have influenced one another. The Mayans, however, being detached from the Indo-European landmass, developed their own system using base 20. The Indians were probably the first to develop a symbol for zero, which was somewhat of a new concept at that time. Many systems left out the zero, creating confusion between 492 and 4092, for example. Today, the integers include all positive and negative whole numbers as well as zero.

    Since then, humans have complicated numbers with various classifications. The following definitions will help. It is a partial list that includes the kinds of numbers relevant to the statistics used in industry. Understand that there is much disagreement among mathematicians about how best to define some of these terms.

    Counting numbers or natural numbers.    These are 1, 2, 3, 4, 5 ... to infinity. Zero is not included because zero is not a counting number; when people first began to count, the very concept of zero was nonexistent.

    Integers.    The natural numbers, their negatives, and zero (no fractions), running from negative infinity to positive infinity.

    Whole numbers.    The set of natural numbers and zero.

    Rational numbers and irrational numbers.    A rational number is any number that can be written as a ratio of two integers. Its decimal form either terminates, as in 1/2 = 0.5, or repeats, as in 2/3 = 0.66666.... In practical use, a repeating decimal is rounded at a given place value, for example, 0.667 or 0.66667. An irrational number, such as √2 or π, cannot be written as a ratio of integers; its decimal form never terminates and never repeats.

    Cardinal numbers.    The names of numbers in any given language. In English, these are one, two, three, and so on.

    Ordinal numbers.    These are the names given to numbers representing a sequential order, such as first, second, third, fourth, and so on.

    Nominal numbers.    Nominal numbers have no mathematical value. They are simply numbers given to represent a certain item, entity, group, classification, or category. Examples include your social security number, zip code, or telephone number. In statistics, nominal numbers are called categorical data. For example, all items that are blue may be designated as one whereas all items that are red may be designated as two, or, as in industry, zero means off and one means on. In cases where there is no order, serial numbers are considered nominal; however, if the serial number represents a position in a sequence, it becomes an ordinal number. Nominal numbers are used extensively in statistics when labeling attribute-type data, which will be explained in Chapter Three.

    Intervals.    Numbers that are meaningful when compared to other numbers but not when compared to zero, because zero may simply be another arbitrary point on the scale. Take common temperature scales, for example. The Fahrenheit scale set zero at the freezing temperature of a brine (salt-water) mixture. Therefore, 20 degrees Fahrenheit is not twice the temperature of 10 degrees Fahrenheit. Even at zero degrees Fahrenheit, the amount of heat present is substantial. Fresh water boils at 212 degrees Fahrenheit, but it is not 212 times hotter than water at 1 degree Fahrenheit. It is true, however, that a 10-degree Fahrenheit interval represents the same amount of heat whether it spans –20 to –10 degrees or 90 to 100 degrees. This is the advantage of the interval scale: equal steps anywhere on the scale have the same value.

    Ratios.    A ratio represents the relationship of one number to another. Unlike the interval, the ratio is a number on a scale where zero is meaningful. The number of defects in a manufacturing line is one example: zero defects per number of produced items means the number of defects cannot go lower. Length compared to width, weight to volume, strength to weight, and mixtures such as 1 part rice to 3 parts water are other examples. (The sketch below contrasts interval and ratio scales.)
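
    The following is a minimal sketch, not from the book, of the difference just described, using the Fahrenheit example from above and hypothetical defect counts. Converting Fahrenheit to Rankine (an absolute scale whose zero really means no heat; Rankine = Fahrenheit + 459.67) shows what a physically meaningful temperature ratio looks like.

        # A minimal sketch, not from the book: interval versus ratio scales.
        f_cold, f_warm = 10.0, 20.0

        print(f_warm - f_cold)   # 10.0: differences on an interval scale are meaningful
        print(f_warm / f_cold)   # 2.0, but 20 F is NOT "twice as hot" as 10 F

        def to_rankine(fahrenheit):
            """Convert to Rankine, an absolute scale where zero means no heat."""
            return fahrenheit + 459.67

        print(to_rankine(f_warm) / to_rankine(f_cold))  # ~1.02: the physical ratio

        # Defect counts are a ratio scale: zero truly means zero defects,
        # so "twice as many" is meaningful (counts are hypothetical).
        defects_line_a, defects_line_b = 4, 2
        print(defects_line_a / defects_line_b)          # 2.0: twice the defects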

    Units of Measure

    Most units of measure are manmade. For example, length is determined through comparison to another distance. This other distance could be anything — the cubit was based on the distance from the elbow to the tip of the middle finger. As scales developed through history, the measures became more standard. Still, homemade measures are often used today. When hanging two or more pictures on a wall, how many people fix a position on the first picture and compare it to their own eye level or chin level? This level then becomes the reference height for hanging the next picture.

    Over time, measures became standardized so that information could be transferred farther than a short distance down the wall or hallway. Standardized scales are essential in statistics. Analyses might be achieved without them, but interpretation by the rest of the world is difficult. For example, weight data on steel rods could be collected using a beam scale. Place a steel rod on one side and 1/4″ grade 8 flat washers on the other; keep adding washers until the scale is level, then count the washers. Every statistical analysis needed can be computed using these data, and the results will be sound. However, by converting the washers into pounds, ounces, grams, or kilograms, the information becomes more easily transferred and understood by professionals around the world. It becomes standardized without changing the outcome of the statistical analysis.
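
    Here is a minimal sketch, not from the book, of that last claim. The washer counts and the grams-per-washer figure are hypothetical; because the conversion is a simple rescaling, standardized statistics such as z-scores come out identical in either unit.

        # A minimal sketch, not from the book: a linear unit conversion does
        # not change standardized statistics. All figures are hypothetical.
        import statistics

        washer_counts = [41, 43, 42, 44, 40, 43, 42]   # rods weighed in washers
        GRAMS_PER_WASHER = 4.6                         # hypothetical washer mass

        grams = [w * GRAMS_PER_WASHER for w in washer_counts]

        def z_scores(xs):
            """Standardize: subtract the mean, divide by the standard deviation."""
            m, s = statistics.mean(xs), statistics.stdev(xs)
            return [round((x - m) / s, 6) for x in xs]

        # Identical lists: only the labeling of the scale changed.
        print(z_scores(washer_counts))
        print(z_scores(grams))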

    Let’s look at just a few of the most common standard measures. The list is partial because there are literally thousands of standardized measures.

    Length

    There are two common standards for measuring length: the British standard scale and the metric scale. The foot was developed just as one would guess. However, feet vary between individuals, so several different foot measures evolved. The foot was standardized when the Normans came to England; the 12-inch foot was made official by King Henry I less than 100 years later. The inch, too, represented a human part: the width of the thumb. It progressed as the foot did and was standardized as 12 per foot. The yard was the length from the nose to the outstretched arm and was later standardized to equal three feet.

    Today, the standard scale can be fractional or decimal. In fractional form, an inch is split in half, then into fourths, eighths, sixteenths, and further. Figure 1.3 shows a typical partition of a standard scale using fractions. The largest partition is the half, which is then split into quarters. The fractions are counted as 1/4, 1/2, 3/4, and a full inch. The quarters can be split into eighths and then sixteenths. The arrow shown in Figure 1.3 indicates 3 and 5/8 inches.

    FIGURE 1.3    Inch scale
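
    As a minimal sketch, not from the book, the reading indicated in Figure 1.3 can be converted to its decimal equivalent with exact fraction arithmetic:

        # A minimal sketch, not from the book: converting a fractional-scale
        # reading (3 and 5/8 inches, the arrow in Figure 1.3) to decimal form.
        from fractions import Fraction

        reading = 3 + Fraction(5, 8)
        print(float(reading))          # 3.625 inches

        # Each halving of the division yields a finer graduation.
        for denominator in (2, 4, 8, 16, 32):
            print(denominator, float(Fraction(1, denominator)))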

    Decimal scales are similar but are written using decimal place-value format rather than fractions.

    The mile was first introduced by the Romans; it was defined as 1000 paces of a Roman legion. In most of England, however, this distance was thought of as 8 furlongs, because the furlong was truly an English measure (representing a plowed furrow of 660 feet in agriculture), making a mile equal to 5280 feet.

    The metric scale is available only in decimal form. It was developed by the French Academy of Sciences, as commissioned by the National Assembly of France in 1790. One meter was defined as one ten-millionth of the distance from the North Pole (geographic) to the equator, measured along the meridian through Paris. Most of the world (including England) uses the metric system to some extent; the United States is one of the last remaining countries that have not officially changed to it.

    Both standard and metric scales are capable of significant accuracy and precision, depending on the instrument used to take the measure; the choice of scale itself does not limit the measurement. Measures can range from miles or kilometers, through nominal sizes of an inch or a centimeter, down to highly accurate readings in the millionths on either scale. Just as one can measure the distance from New York to Beijing, one can determine the width of an atom. For example, a nanometer (nm) is 10⁻⁹, or 0.000000001, of a meter.

    Similar to length are area and volume. Common area measures include the square inch, square foot, square mile, acre (0.0015625 square miles), square meter, square kilometer, and hectare (10,000 square meters). Volume and capacity include cubic versions of most of the length measures above. In many volume measurements, however, volume must be defined as liquid or dry. Pints, quarts, gallons, and liters almost always refer to liquid volume, whereas ounces and cups (proportions of quarts, gallons, etc.) frequently measure dry volume such as flour or rice.

    Weight

    As with length, commonly used weight measures include the British standard and metric systems. Uncommon weight measures are many and varied throughout the world. In China, for example, the jin equaled 604.79 g prior to the 20th century; now, in conformance with international standards, this ancient measure equals 500 g. Measures of all kinds, including weight, developed independently around the world. Through pressures and circumstances beyond the scope of this text, many have disappeared or were changed and absorbed into the modern international standard (usually metric).

    Even the British standard system has made several measures obsolete, including the very definition of the pound. Today, the pound is standardized as an avoirdupois pound as opposed to the troy, merchant, apothecary, tower, or London pound. Because of these many different systems throughout history, there exists an ambiguity among measures. For example, the imperial ton, also called the long ton, is 2240 pounds; the short ton (used in the United States) is 2000 pounds; and the metric ton (or tonne) is 1000 kilograms or 2200 pounds (approximate).

    Tons are divided into pounds, pounds into ounces, and ounces into grains. In the metric system, the gram is the primary weight unit. One kilogram (1000 grams) equals 2.204622621849 pounds.

    Electrical

    Common electrical measures include voltage, amperes, ohms, and wattage. Volts measure electrical potential, the pressure available to push current through a circuit. Think of a water tower as an analogy: the pressure in the tank (from the weight of the water) can be thought of as the volts. Most electrical applications require a certain amount of voltage to operate. Computer manufacturers, for example, are interested in the variance in the voltage available at the power supply and transformer; tests would be performed on each transformer to determine whether it meets the minimal voltage for the computer to operate.

    Potential is not the only consideration. Even if voltage is available, the rate of delivery may be too low. That rate of flow is the current, measured in amperes, and it is limited by resistance. Again, think of the water tower: regardless of how much water is available, how much water can flow through a given hole in the bottom of the tank? Given a substantial amount of water, a small hole creates a fast spray, but the volume of water delivered is minimal. A large hole creates a slow gush delivering a large volume. The volume of water moving through the hole (dependent on the pressure and the size of the hole) is the amperage; the size of the hole is the resistance. So, an electric motor in a washing machine needs the correct voltage to activate the system, enough amperage to keep the motor running, and the proper amount of resistance to control the current and keep the motor from burning up.

    These measures are read from meters. A typical multimeter can detect voltage, amperage (current), and ohms (resistance). One of the most common measures in electricity is continuity, a yes-or-no measure: either electricity flows through the circuit or it does not. If antique automobile restorers are tinkering with the tail lights on an old car, they may want to see which wire from the switch goes to which light. Putting the multimeter in the ohms mode and touching a probe to each end of a wire can reveal either total resistance (no connection) or virtually no resistance (the same wire).

    Wattage is a calculated value: volts times amperes. It represents the rate at which electrical energy is delivered and can be described using other power units such as joules per second, newton-meters per second, and even horsepower. Several other measures, dealing mostly with electronics, include the coulomb, farad, weber, tesla, henry, and siemens; their definitions go beyond the scope of this text.
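
    As a minimal sketch, not from the book, the two relationships just used, Ohm's law (amperes = volts / ohms) and wattage (watts = volts × amperes), can be applied to the washing-machine motor with hypothetical numbers:

        # A minimal sketch, not from the book; the motor values are hypothetical.
        VOLTS = 120.0         # supply voltage
        OHMS = 12.0           # winding resistance

        amps = VOLTS / OHMS   # Ohm's law: current = voltage / resistance
        watts = VOLTS * amps  # wattage: volts times amperes

        print(amps)           # 10.0 A
        print(watts)          # 1200.0 W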

    Light

    Light can be measured both directly and indirectly, depending on how a material behaves when exposed to light. Intensity, or brightness, is the most basic direct measure. Luminance is measured in candela per square meter, a unit sometimes referred to as a nit; the candela historically originated from the light emitted by one candle. The wavelength of the light (or, equivalently, its frequency) determines the color. If the desired characteristic of a manufactured item is green, for example, measuring the wavelength of light reflected off the object can verify it; for green, the wavelength is approximately 500 to 565 nanometers.
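
    A minimal sketch, not from the book, of how such a wavelength reading might be turned into a pass/fail color check on the green band just given (the readings are hypothetical):

        # A minimal sketch, not from the book: accept a part as green when its
        # reflected wavelength falls in the approximate 500-565 nm band.
        GREEN_BAND_NM = (500.0, 565.0)

        def is_green(wavelength_nm):
            low, high = GREEN_BAND_NM
            return low <= wavelength_nm <= high

        for reading in (495.2, 531.7, 560.0, 570.4):   # hypothetical readings
            print(reading, "pass" if is_green(reading) else "fail")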

    Different materials display various properties when exposed to light, including translucency, opacity, reflectivity, and refractivity. These properties can deliver an indirect measure of light and are important considerations for materials affected by light. A transparent material, such as glass, some plastics, or water, allows light to pass through. At the other end of this scale is opacity (not allowing light to pass through). In between the two is translucency.

    Measuring devices for light are typically sophisticated instruments and can be quite expensive. Consider, however, measuring translucency in plastic film. One inexpensive method is to make a homemade instrument with a flashlight, a film-holding fixture, and a photovoltaic cell (Figure 1.4a). Position the flashlight a constant distance from the photovoltaic cell. Shroud the flashlight where it adjoins the holding fixture so light cannot escape. Place a plastic specimen into the holder and read the voltage that the photovoltaic cell generates (Figure 1.4b). This process generates ratio data proportional to the intensity of the light moving through the plastic. Although it is not a direct measure, it may be used for meaningful statistical analyses regarding that material. The scale is created by assuming 0.0 volts for a known opaque film and measuring the voltage with no film installed (completely transparent).
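
    A minimal sketch, not from the book, of that homemade scale; the voltages are hypothetical, with 0.0 V anchoring a known opaque film and the no-film reading anchoring full transparency:

        # A minimal sketch, not from the book: mapping photovoltaic-cell voltage
        # onto a 0.0-1.0 translucency scale. All voltages are hypothetical.
        V_OPAQUE = 0.0     # reading with a known opaque film in the fixture
        V_NO_FILM = 2.40   # reading with no film installed (fully transparent)

        def translucency(volts):
            """Fraction of light transmitted, on a ratio scale from 0 to 1."""
            return (volts - V_OPAQUE) / (V_NO_FILM - V_OPAQUE)

        for v in (0.31, 1.18, 2.05):   # hypothetical specimen readings
            print(v, round(translucency(v), 3))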

    Reflection and refraction are also open to clever ways of measuring. Reflection is the amount of light that is bounced off a surface. Refraction measures how a material changes the speed of light and, as a result, the light appears to bend through that material.

    Chemical

    Chemical compounds and elements are usually measured in parts per million (ppm). The International System of Units (SI) uses the mole as the base unit for amount of substance; it was historically defined as the number of particles in 0.012 kilograms of carbon-12. One ppm corresponds to one micromole per mole. These units must, of course, be qualified by volume, mass, fluid, or number of atoms.
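
    A minimal sketch, not from the book, of ppm arithmetic on both a mass and a molar basis (the sample figures are hypothetical):

        # A minimal sketch, not from the book: parts per million is simply
        # "parts per 10**6 of the same quantity." Figures are hypothetical.
        def ppm_by_mass(contaminant_g, sample_g):
            return contaminant_g / sample_g * 1_000_000

        print(ppm_by_mass(0.0005, 1000.0))   # 0.5 mg in 1 kg -> 0.5 ppm

        def ppm_by_moles(micromoles_species, moles_sample):
            # 1 micromole is 1e-6 mol, so the factors of 10**6 cancel:
            # ppm = micromoles of the species per mole of sample.
            return micromoles_species / moles_sample

        print(ppm_by_moles(2.5, 1.0))        # 2.5 umol per mol -> 2.5 ppm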

    FIGURE 1.4a and 1.4b: Homemade measure of translucency instrument components (Photos provided by Marietta Byerline)

    FIGURE 1.5    XLS ultra trace–1310 mass spectrometer (Photo provided by Thermo Fisher Scientific)

    Generally, ppm instrumentation is expensive. Measurements are accomplished with purity meters, with lasers passed through a substance and the transmitted light measured on the other end, and with spectrometers. Spectrometers are probably the most
