Tests and Measurements
Continuing our series on measurement devices, today we are going to discuss gage blocks. Gage blocks are rectangular blocks, usually made of hardened steel or zirconia ceramic, with surfaces flattened and polished to a tolerance of only 2 to 8 millionths of an inch.
The purpose of gage blocks is to measure things to an astonishing accuracy. Gage blocks are used to determine the accuracy of fixed gages (and to check whether those gages have experienced excess wear or any other alteration), to calibrate various adjustable gauges, to set up machine tools, and to verify the accuracy of finished parts.
There are three classes of gage blocks:
- Class AA – also known as the laboratory or master set. They are accurate to within +/- 0.000002 in and +/- 0.00005 mm. These gages are used in climate controlled settings to ensure the accuracy of lower class gage blocks.
- Class A – used for inspection purposes, they are accurate to within +/- 0.000004 in and +0.00015/-0.00005 mm.
- Class B – commonly known as a “working set,” these are used most often in shop settings for machine tool setups, layout work, and measurement. They are accurate to +/- 0.000008 in and +0.00025/-0.00015 mm.
Gage block sets range from as few as three or four blocks up to 115 blocks. The typical English gage block set consists of 83 blocks and two wear blocks. The wear blocks can be either 0.050 in or 0.100 in. The most common metric sets consist of 88 pieces with two 2 mm wear blocks.
Wear blocks are stacked on each end of the gage block stack when measuring and are designed to absorb the wear and erosion that occurs over the lifetime of the set, prolonging its usefulness. NEVER place a non-wear block on a work surface that you are measuring. Work surfaces can contain minute amounts of abrasives that will degrade the accuracy of your blocks over time. Always use wear blocks on each end of the stack, with the same face of the wear blocks touching the surfaces every time. Most wear blocks are marked so you do not put the wrong face on the item you are measuring.
Gage blocks are designed to be used in climate-controlled environments. Block dimensions are standardized at a temperature of 68 degrees F (20 degrees C). For every increase of 1 degree F (0.5 C), a typical 4 in stack of gage blocks will expand approximately 0.000025 in. With human body temperature being about 98.6 degrees F, it is important not only that a climate-controlled facility be used when measuring, but that the aerospace technician limit contact with the blocks, either by holding them by the fingertips as little as possible or by using insulated tweezers. The part being measured should also be at the same temperature as the blocks in order to obtain the best accuracy. Some manufacturers suggest you go the extra step and use insulated gloves along with the insulated tweezers. If the part being measured and the blocks are not at the same temperature, some books suggest you immerse both items in kerosene until they are equalized. That, of course, may not be practical due to fire hazards or the size of the part being measured.
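The expansion figure above can be turned into a quick estimate. This is a minimal sketch using the rate implied by the text (0.000025 in per degree F for a 4 in stack, or about 6.25 millionths of an inch per inch per degree F); `stack_expansion` is a hypothetical helper, not a published formula:

```python
# Rough thermal-expansion estimate for a steel gage block stack.
# Rate implied above: 0.000025 in per deg F for a 4 in stack.
EXPANSION_PER_IN_PER_DEG_F = 0.000025 / 4.0  # ~6.25e-6 in/in/deg F

def stack_expansion(stack_height_in: float, temp_f: float, ref_temp_f: float = 68.0) -> float:
    """Approximate growth (inches) of a stack measured away from the 68 F reference."""
    return stack_height_in * EXPANSION_PER_IN_PER_DEG_F * (temp_f - ref_temp_f)

# A 4 in stack warmed just 10 F above the reference grows about 0.00025 in,
# over a hundred times a Class AA block's +/-0.000002 in tolerance.
print(f"{stack_expansion(4.0, 78.0):.6f} in")
```

Even this crude estimate makes the point: a stack warmed by handling quickly drifts far outside the blocks' own tolerance.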
To save time and reduce the chance of error when using gage blocks, you should use as few blocks as possible. The authors of the book Technology of Machine Tools advocate a procedure for calculating the exact blocks you will need to make a measurement.
Step One: Write the dimension required on a piece of paper.
Step Two: Deduct the size of two wear blocks.
Step Three: Use a block that will eliminate the right-hand digit.
Step Four: Use a block that will eliminate the right-hand digit and at the same time bring the digit to the left of it to a zero or a five.
Step Five: Continue to eliminate the digits from the right to the left until the dimension required is attained.
Now to see this in action, here is an example, selecting blocks for a dimension of 2.4857 in from a standard 83-piece set:

| Subtracting (left) | Adding (right) |
|---|---|
| 2.4857 | |
| - 0.1000 (two 0.050 wear blocks) = 2.3857 | 0.1000 |
| - 0.1007 = 2.2850 | 0.2007 |
| - 0.1350 = 2.1500 | 0.3357 |
| - 0.1500 = 2.0000 | 0.4857 |
| - 2.0000 = 0.0000 | 2.4857 |
As you see in the left-hand column, you subtract the blocks from the desired measurement, while in the right-hand column you add the blocks’ measurements together. If you have done your math correctly, you should reach “0” in the left-hand column and the desired measurement in the right-hand column.
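The five steps above can be sketched in code. This is a minimal illustration, not a published algorithm: the set composition below is the commonly listed 83-piece inch set (nine ten-thousandths blocks, forty-nine thousandths blocks, nineteen fifty-thousandths blocks, four whole-inch blocks), and `select_blocks` is a hypothetical helper name. Check your own set's chart before relying on it.

```python
from decimal import Decimal

# Block sizes of a typical 83-piece inch set (an assumption; verify against your set).
TEN_THOUSANDTHS = [Decimal("0.1001") + Decimal("0.0001") * i for i in range(9)]   # 0.1001-0.1009
THOUSANDTHS     = [Decimal("0.101")  + Decimal("0.001")  * i for i in range(49)]  # 0.101-0.149
FIFTIES         = [Decimal("0.050")  + Decimal("0.050")  * i for i in range(19)]  # 0.050-0.950
WHOLES          = [Decimal(i) for i in (1, 2, 3, 4)]                              # 1.000-4.000
WEAR_BLOCK      = Decimal("0.050")

def select_blocks(dimension: str) -> list:
    """Steps One-Five: deduct the wear blocks, then eliminate digits right to left."""
    remaining = Decimal(dimension) - 2 * WEAR_BLOCK          # Step Two
    chosen = [WEAR_BLOCK, WEAR_BLOCK]
    # Each pass picks a block that zeroes the rightmost nonzero digit (Steps Three-Five).
    for series, unit in ((TEN_THOUSANDTHS, Decimal("0.001")),
                         (THOUSANDTHS,     Decimal("0.050")),
                         (FIFTIES,         Decimal("1.000"))):
        if remaining % unit == 0:
            continue                                         # digit already zero
        for block in series:
            if block <= remaining and (remaining - block) % unit == 0:
                chosen.append(block)
                remaining -= block
                break
    for block in reversed(WHOLES):                           # finish with whole-inch blocks
        while block <= remaining:
            chosen.append(block)
            remaining -= block
    assert remaining == 0, "dimension not buildable from this set"
    return chosen

print(select_blocks("2.4857"))
```

Running the sketch on the example dimension of 2.4857 in returns the two wear blocks plus the 0.1007, 0.135, 0.150, and 2.000 in blocks: a six-block stack that sums exactly to the dimension.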
Gage block surfaces are flattened so accurately that they can actually “stick” together and withstand a pull of up to 200 lbs! It is not known exactly why, though some have suggested it is either a molecular bond or the slight film of oil left over from cleaning. To stack or “wring” gage blocks together, first clean the blocks with a clean, soft, lint-free cloth. Wipe the contact surface of each block on the palm of your hand or wrist. This has two functions: it wipes any remaining particulates from the block onto your hand, using the oil from your skin to “grab” them, and it uses that same skin oil to “lube” the blocks. Place the end of one block onto the end of the other and, applying pressure, slide the blocks together. They should stick. If they don’t, the blocks were not properly cleaned.
To take care of your gage blocks and ensure that they have a long life, you should:
- Keep the case closed at all times except when you are getting a block or placing back a cleaned block.
- Do not play dominoes with them.
- Do not unnecessarily finger the surfaces of the block to avoid rusting and tarnishing due to your skin oils and moisture.
- Do not drop the blocks or scratch the surfaces of the blocks.
- Do not use them in your juggling act at the comedy club.
- Immediately after use, each block should be cleaned, oiled, and placed back into the box. (Don’t forget to close the box!)
- Never leave your gage blocks wrung together! Leaving them this way will encourage rusting from the oils and moisture from your skin.
Sources: The Quality Technician’s Handbook (Griffith, 2003); Technology of Machine Tools (Krar & Check, 1997)
The most common and overlooked measuring tool is the steel rule or ruler. Steel rules can be marked in inches or metric, with graduations as fine as 1/64 of an inch or one millimeter. Steel rules come in five types: spring tempered, flexible, narrow, hook, and short-length rules.
Spring tempered steel rules are the most commonly used in aerospace shops. These rules are usually 6 inches in length and made for quick reading. Usually they are broken into four scales, two on each side. For English measurements, one side of the rule has graduations in eighths and sixteenths, and the back is graduated in thirty-seconds and sixty-fourths. Some spring tempered rules have English measurements on one side and metric on the other.
Flexible steel rules are more commonly found in construction and in the home. They are also commonly called tape measures.
Narrow steel rules are used to reach tight and hard to access places to measure.
Hook steel rules have a “hook” on the end that you can butt up flat against a corner or protrusion while measuring to ensure the rule doesn’t move.
Short-length rules are small rules that come in sets of four and range in size from 1/4 inch to 1 inch in length. These rules are placed in a holder and used to measure small parts or openings.
There are two major rules to follow when using steel rules to ensure accuracy:
1. Because the end of a rule wears down over time and use, it is best to start your measurement from the 1 inch or 1 centimeter mark. Once you have your reading, subtract that 1 inch (or 1 centimeter) to get the final measurement.
2. Make sure the graduated markings are as close as possible to the area of the part you are measuring. It is best if the graduated markings are actually touching the area you are measuring, making it easier to obtain the correct reading.
Steel rules can also be used to determine the flatness of a material. Lay the steel rule on its edge on the part, then hold the rule and part up to a light. If light shows through between the part and the rule, the part is not flat. If no light shows, the part is flat.
Don’t underestimate the usefulness of the steel rule. It will be the most often used measuring tool in your tool box.
Sources: The Quality Technician’s Handbook (Griffith, 2003); Technology of Machine Tools (Krar & Check, 1997)
Caliper tools are considered transfer tools because the measurement cannot be read directly. Caliper tools “make contact with the part (on the dimension being measured), are then locked in the measured position, and are measured with another tool…” such as a rule or micrometer. (Griffith, 2003) Because a caliper is a transfer tool, it doesn’t matter whether the part is dimensioned in English or metric units: the caliper captures the dimension, which you can then compare against either an English or a metric rule or micrometer.
Calipers can also be used as a go/no-go tool. You preset and lock the caliper against a rule or master part, then do a “fit check” to see whether your part meets the required measurement.
Though caliper tools are good for routine measurements, they should not be used when accuracy finer than 0.015 in (0.39 mm) is required. (Krar & Check, 1997)
There are two basic types of calipers, outside and inside calipers.
Outside calipers measure the outside surface of a round or flat part. The most common type of outside caliper is the spring joint caliper. To use an outside caliper you would:
1. Adjust the nut until both legs lightly touch the outside of the part being measured, then lock them in place.
2. Compare the caliper to a rule to determine the measurement, or compare it against a master part.
Inside calipers are used to measure the inside diameter of a circular object such as a pipe, or to measure the inside width of a slot or keyway. To use an inside caliper you would:
1. Hold one leg against the bottom surface of the area you are measuring with your finger.
2. Turn the adjusting nut until the second leg touches the opposite side, then lock the legs in place.
3. Compare the caliper against a rule or master part, or, for more accuracy, measure across the caliper legs with a micrometer.
Calipers are easy to use and very simple tools to add precision to your work. Don’t underestimate a caliper’s usefulness when it comes to your work.
If there are any other common measuring tools or some other aerospace technician subjects you would like to see covered, please feel free to email your suggestion and we will work to cover it here.
The first micrometer was invented by William Gascoigne in the 17th century to assist in measuring angular distances between stars through a telescope. Jean Laurent Palmer of France developed the tool further in 1848, making it a handheld measuring instrument; it was originally named after him, the palmer, as it is still known in France today. In Spanish it is still known as the tornillo de Palmer, the “Palmer screw.”
By the late 1800s the micrometer had been mass marketed into many machine shops throughout the United States. Though its form has changed many times, and different versions measure different features (outside diameters, inside diameters, and depth), the particular micrometer we will discuss is the one that measures outside diameters in inches: the micrometer caliper.
The micrometer or “mike” can measure the diameter of an object to within 0.001 in. Some micrometers have a vernier added that allows measurements to be read to 0.0001 in. No matter the size of the frame (and some frames can get pretty large), the spindle’s travel, and therefore the measuring range, is only 1 inch.
The micrometer consists of the anvil (the fixed measuring face), the frame, the spindle (the face that moves while measuring), the spindle lock (which locks the reading in place in case you are using the tool to compare parts), the barrel or sleeve (graduated in 0.025 in divisions, with every fourth line numbered to mark 0.100 in), the thimble (whose beveled edge is divided into 25 lines of 0.001 in each, so one full revolution equals 0.025 in), and the ratchet, used to apply consistent measuring pressure.
“The scale of a micrometer ranges from 0 to 1 inch and is usually graduated in thousandths of an inch (think of 1 in. as 1000/1000). The sleeve of the mike (as it slowly pulls back) shows the numbers 0 1 2 3 4 5 6 7 8 9 0. Each of these numbers represents 0.100 (or 100 thousandths) of an inch…Now look at the thimble. Each line on the thimble represents 0.001, and there are numbers every five lines (or 0.005). One complete turn (revolution) of the thimble is equal to 0.025 in. So each revolution of the thimble is equal to one division on the barrel.” (Griffith, 2003)
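The quoted reading rules condense to a small calculation. This is a minimal sketch; `mic_reading` is a hypothetical helper name, and its three inputs are simply what you see on the tool:

```python
# Reading a 0.001 in micrometer, following the passage above:
# each numbered sleeve graduation is 0.100 in, each smaller sleeve line is
# 0.025 in, and each thimble line is 0.001 in.
def mic_reading(sleeve_number: int, extra_sleeve_lines: int, thimble_line: int) -> float:
    """sleeve_number: last numbered graduation visible (0-9);
    extra_sleeve_lines: unnumbered 0.025 in lines visible past it (0-3);
    thimble_line: thimble graduation aligned with the index line (0-24)."""
    return sleeve_number * 0.100 + extra_sleeve_lines * 0.025 + thimble_line * 0.001

# Sleeve shows the "2" plus three more lines, thimble reads 13:
# 0.200 + 0.075 + 0.013 = 0.288 in
print(round(mic_reading(2, 3, 13), 3))
```

Work a few readings by hand first, then check yourself against the arithmetic.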
The Stefanelli website has a great simulator of an inch micrometer that you can move with the mouse, and it will show you the readings while you practice. I would also suggest you find some old machine tool textbooks and look for exercises that test you on reading measurements. As I have said before, know your tool or measuring device before using it in your day-to-day work. Refer to the previous post for more general suggestions on measuring.
We are going to start a series talking about the use and type of measuring tools commonly used by the aerospace technician. In this part, we will cover the basics that are needed to keep in mind when selecting and using measuring tools.
There is an old phrase that says, “When the only tool you have in your tool box is a hammer, every problem looks like a nail.” The same goes for measuring tools. If all you have is a ruler, you will tend to measure everything with it. But surfaces, threads, thicknesses, and more all need to be measured, and that requires a variety of measuring tools, from rulers and dial calipers to gages.
Though accuracy is essential in the aerospace field, you can actually have too much of it. When you need to know a part’s dimension to within 0.01, you do not need a tool that measures to within 0.000001. That much detail is wasteful and unnecessary for the task. “The rule of thumb is to select a measuring tool that is ten times more accurate than the total tolerance to be measured, or the tool can discriminate to one-tenth of the total part tolerance.” (Griffith, 2003) This is called the 10% or 10-to-1 rule. So if the tolerance listed on the process calls for measuring to within 0.01, a tool that discriminates to 0.001, ten times finer, satisfies the 10-to-1 rule.
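The 10-to-1 rule reduces to a one-line check. A minimal sketch (`tool_ok` is a hypothetical name; `Decimal` is used so the exact ten-to-one boundary case compares cleanly):

```python
from decimal import Decimal

def tool_ok(total_tolerance: str, tool_discrimination: str) -> bool:
    """10-to-1 rule: the tool must discriminate to one-tenth of the total tolerance."""
    return Decimal(tool_discrimination) <= Decimal(total_tolerance) / 10

print(tool_ok("0.010", "0.001"))  # exactly ten to one: True
print(tool_ok("0.010", "0.005"))  # only two to one, too coarse: False
```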
When I took a chemistry lab in college, the instructor once commented that I was being too accurate in my measurements. I took it as a compliment at the time, not understanding that his “tolerance” did not require as much accuracy as I was using when measuring my chemical compounds. Though my outcomes were more in line with the results the lab book predicted, I was always the last to leave class, and I missed the point: he wanted us to grasp the concept of the experiment, not the technique.
Know your measuring tool! What does it measure? Which end is fixed (the reference) and which surface moves (the measured surface)? Do you know how to read the measurement displayed? What are the divisions on it, and what do they represent? You must be proficient with your tool in order to use it and read the correct measurement.
Strive for accuracy and precision. “Accuracy is the difference between the average of several measurements made on a part and the true value of that part. Precision means getting consistent results repeatedly.” (Griffith, 2003) If you are using the wrong tool or reading it wrong, yet getting consistent results, you have precision but not accuracy: you get the same result each time, but the measurement process is inaccurate. If you are using the right tool and reading it correctly, you will get both accuracy and precision.
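Griffith's two definitions can be put side by side with numbers. A minimal sketch with illustrative readings (the values are made up for the demonstration):

```python
# Accuracy vs. precision, per the definitions above.
true_value = 1.000
readings = [1.010, 1.011, 1.009, 1.010, 1.010]   # consistent, but offset from truth

mean = sum(readings) / len(readings)
accuracy_error = mean - true_value                # ~0.010 off: poor accuracy
spread = max(readings) - min(readings)            # only 0.002 apart: good precision

print(f"accuracy error: {accuracy_error:.3f}, spread: {spread:.3f}")
```

These readings are precise (tightly clustered) but not accurate (clustered around the wrong value), exactly the wrong-tool scenario described above.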
Pay attention to the pressure used when measuring. Too much or too little pressure gives a less precise reading (especially if the pressure changes each time you measure) and therefore an inaccurate one. Don’t use a micrometer like a C-clamp to hold a part in place; that is too much pressure and will affect the reading. The same goes for too little pressure: if contact with the part is too light, the reading is skewed. Always use consistent pressure.
Take care of your measuring tools. First, inspect the tool. Is there a calibration sticker stating when it was calibrated and when the calibration expires? Is the tool in good condition? Do all of its parts work as expected? Is the tool free of debris and dirt that could affect its use or the measurement? Has the tool been dropped? Have you tested the tool on a part whose measurement is already known? Has the tool been stored properly to prevent damage? Measuring tools should not be piled on each other and should usually be kept in individual cases. Is the tool showing any signs of wear, especially on the measuring and reference surfaces?
Learning about and becoming proficient with your measuring tools will ensure that your work is more consistent and error free. And don’t forget the old adage: “Measure twice and cut once!”
Sources: The Quality Technician’s Handbook (Griffith, 2003); Technology of Machine Tools (Krar & Check, 1997)
When I watch the International Space Station fly overhead, I am always amazed at what a piece of craftsmanship it truly is. Is it because of its large size, or how high up it is? No, not really. It’s the fact that each and every piece was built in more than a dozen countries by thousands of people and brought together for the first time 250 miles up in low Earth orbit. The chances of everything fitting together the first time (and it did fit together the first time!) would have been astronomically small if not for one thing: the standardization of measurement.
With things such as the ISS, the various countries involved use the United States Customary System of measurement. In fact, as stated on the ISS tour at KSC, the ISS is the last international space project involving the USA that will use this system of measurement. Afterwards, all international space projects are to be done in the metric or SI system.
Currently the USA uses the United States Customary System as its standard of measurement. The rest of the world uses the International System of Units (SI, commonly known as “metric”). In 1959, an international agreement fixed the relationships between the two systems so that measurements could be converted exactly back and forth. The table below shows conversions for common measurements between the American and SI standards.
| Unit | Relation | SI equivalent |
|------|----------|---------------|
| 1 inch (in) | | 2.54 cm |
| 1 foot (ft) | 12 in | 0.3048 m |
| 1 yard (yd) | 3 ft | 0.9144 m |
| 1 mile (mi) | 1760 yd | 1.609344 km |

All relationships shown are exact.
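The exact relationships in the table are easy to encode and cross-check against each other. A minimal sketch; the constant and function names are illustrative:

```python
# Exact inch/SI relationships fixed by the 1959 agreement.
INCH_TO_CM = 2.54
FOOT_TO_M  = 0.3048      # 12 in
YARD_TO_M  = 0.9144      # 3 ft
MILE_TO_KM = 1.609344    # 1760 yd

def miles_to_km(miles: float) -> float:
    return miles * MILE_TO_KM

# The definitions are mutually consistent:
assert abs(12 * INCH_TO_CM / 100 - FOOT_TO_M) < 1e-12
assert abs(3 * FOOT_TO_M - YARD_TO_M) < 1e-12
assert abs(1760 * YARD_TO_M / 1000 - MILE_TO_KM) < 1e-12

print(miles_to_km(250))  # the ISS's roughly 250 mile altitude, in km
```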
The standardization of measurement is a basic building block of a successful civilization. You cannot have trade, buildings, or complicated machinery without agreed-upon standards for length, volume, and the like. The next time you go to fuel your car, look at the gas pump. Somewhere on the pump will be a stamp from an official state office certifying that the pump’s meter calculates the flow of gasoline in accordance with measurement standards. This ensures fair trade in your purchase of gasoline.
Many ancient societies had standards of measurement. Some small villages that dealt in trade would post their “standards of measurement” on a board in the village square, while larger governments and cities would actually set up standards of measurement by decree and have officials to enforce the standards.
The earliest known examples of measurement standards come from the 4th and 3rd millennia BC, from the civilizations of the Indus Valley (covering parts of modern-day Pakistan, India, Iran, and Afghanistan), Egypt, and Mesopotamia (covering parts of modern-day Iraq, Syria, Turkey, and Iran). Probably the most common story of a government setting standards of measurement is that of King Henry I of England, who ruled from 1100 to 1135. The standard for the “foot” was supposedly taken by measuring the King’s own foot. The practice predates his rule, but new rulers frequently wanted to leave their “mark” on the culture in some way, and this was one way of doing it; hence the “foot,” and quite possibly the term “ruler” for the stick showing a foot.
So what can happen if standards are ignored or the wrong standard is applied? Much can happen such as unfair trade practices, building collapses, machinery that cannot have interchangeable parts, and one famous example of a multi-million dollar spacecraft being lost.
The Mars Climate Orbiter was launched on December 11, 1998 as part of a two-spacecraft team (the other being the Polar Lander, which was also lost) and was declared lost September 23, 1999. It was discovered that the loss of the spacecraft (total program cost of $327.6 million) was due to the wrong measurement standard being used. Lockheed Martin was responsible for the thrusters and had used United States Customary Units, calculating thrust in pound-force. The main computer expected the figures in newtons, per SI standards, so the spacecraft underestimated its thruster effects by a factor of about 4.45 (1 pound-force is equal to about 4.45 newtons). The error was never caught during ground testing, and the spacecraft dipped too low into Mars’s atmosphere during orbit insertion and burned up.
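The factor involved is just the definition of the pound-force in newtons. A minimal illustration of the mismatch (the impulse number here is made up for the demonstration; only the conversion constant is real):

```python
# 1 lbf = 0.45359237 kg * 9.80665 m/s^2 = 4.4482216152605 N, exactly.
LBF_TO_N = 4.4482216152605

impulse_lbf_s = 100.0                       # a burn computed in pound-force seconds
read_as_newton_s = impulse_lbf_s            # software consumed the raw number as N*s
actual_newton_s = impulse_lbf_s * LBF_TO_N  # what the thrusters really delivered

# Every burn's effect was understated by the conversion factor, ~4.45x.
print(actual_newton_s / read_as_newton_s)
```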
The importance of having a standard of measurement cannot be stressed enough. Because of these standards, a meter or foot means the same throughout the world ensuring fair and accurate trade, collaboration on international projects, and someday a human return to the Moon and on to Mars.
For more information try: