A common tool used by the aerospace technician is the torque wrench. Its purpose is to apply precisely the required amount of torque when tightening a nut on a bolt, without under- or over-torquing it. It is a misconception that “the tighter the better” applies to all nut/bolt arrangements. That is not true. Over-tightening a nut can crush the gasket underneath, warp the bolt threads, or make the nut difficult to remove. Under-tightening can result in the nut coming off at an inopportune time. Using a torque wrench ensures that the nut is tightened to the required torque, avoiding damage or breakage.
There are four major types of torque wrenches an aerospace technician can use: flexible beam, rigid frame with a dial indicator, preset or “clicker,” and digital.
A flexible beam wrench is used by grasping the center of the handle and turning the nut until the needle points to the desired torque.
Rigid Frame torque wrenches use a dial indicator to read the torque being applied.
Preset or “clicker” torque wrenches can be set to the desired torque and will “click” when the proper torque is reached. The unit shown above can be set for either English standard or metric units.
Digital torque wrenches will show the torque being applied via a digital readout and some models will actually beep when you have achieved the torque desired.
When using a torque wrench, keep these things in mind:
- Select the proper wrench for the measurement being called for. If inch/lbs are called for, select an inch/lb wrench. If inch/grams are called for, select a wrench measured in grams.
- Check the calibration date on the wrench. If it is outdated, return to logistics and get a wrench that is within the calibration date.
- Always test the torque wrench on a torque measuring device to ensure that it is still calibrated and reading accurately.
- Never jerk the wrench; pull on it slowly, keeping a close eye on the torque being applied.
- Always set the wrench back to its lowest torque setting after use, so the spring does not remain compressed and develop “memory.”
- Always handle torque wrenches with care. Do not drop them or bang them around. Return them to their foam case once finished.
Torque is determined by the formula:
Torque = force × distance
For example, if an engineering drawing requires 120 inch-lbs of torque for a specific nut and you have a 10 inch torque wrench, you would work out the following:
120 inch-lb = force × 10 inches
120 inch-lb ÷ 10 inches = force × 10 inches ÷ 10 inches
or 120 inch-lb ÷ 10 inches = force
Therefore you need 12 lbs of force to achieve 120 inch-lbs using a 10 inch torque wrench.
This formula is very useful in determining the torque required.
Some technicians have been known to alter the length of their torque wrench by adding an extender. If that is done, the force required to achieve 120 inch-lbs of torque changes. Let’s look at the same problem again and assume the aerospace technician has added a 15 inch extender to his wrench.
120 inch-lb = force × (10 inches (wrench length) + 15 inches (extension length))
120 inch-lb = force × 25 inches
120 inch-lb ÷ 25 inches = force × 25 inches ÷ 25 inches
120 inch-lb ÷ 25 inches = force, or 4.8 lbs
Note the difference in force required! If you had not taken into account the added length of the wrench, you would have over-torqued the nut, possibly resulting in damage to the aerospace hardware.
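Both calculations above can be checked with a short script. This is just a sketch of the arithmetic from the worked examples; the function name is my own, not a standard tool:

```python
# Force needed at the handle to reach a target torque, per the formula above:
# torque = force x distance, so force = torque / (wrench length + any extension).
def force_required(torque_inlb, wrench_in, extension_in=0):
    return torque_inlb / (wrench_in + extension_in)

print(force_required(120, 10))      # 12.0 lb with the bare 10 inch wrench
print(force_required(120, 10, 15))  # 4.8 lb with a 15 inch extender added
```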
With good care and timely calibrations, your torque wrench should last and be an invaluable tool in your work.
Continuing our series on measurement devices, today we are going to discuss Gage Blocks. Gage blocks are rectangular blocks usually made of hardened steel or zirconia ceramics. The surfaces are flattened and polished to a tolerance of only 2 to 8 millionths of an inch.
The purpose of gage blocks is to measure things to an astonishing accuracy. Gage blocks are used to determine the accuracy of fixed gages (checking whether those gages have experienced excess wear or any other form of alteration), to calibrate various adjustable gages, to set up machine tools, and to measure the accuracy of finished parts.
There are three classes of gage blocks:
- Class AA – also known as the laboratory or master set. They are accurate to within +/- 0.000002 in and +/- 0.00005 mm. These blocks are used in climate controlled settings to verify the accuracy of lower class gage blocks.
- Class A – used for inspection purposes, they are accurate to within +/- 0.000004 in and +0.00015/-0.00005 mm.
- Class B – commonly known as a “working set,” these are used most often in shop settings for machine tool setups, layout work, and measurement. They are accurate to +/- 0.000008 in and +0.00025/-0.00015 mm.
Gage block sets can come with only three or four blocks to sets that number up to 115 blocks. The typical English gage block set consists of 83 blocks and two wear blocks. The wear blocks can be either 0.050 in or 0.100 in. The most common metric sets consist of 88 pieces with two 2 mm wear blocks.
Wear blocks are stacked on each end of the gage block stack when measuring and are designed to take all the wear and erosion that occurs over the lifetime of the set, thereby prolonging its usefulness. NEVER place a non-wear block on a work surface you are measuring. Work surfaces can contain minute amounts of abrasives that will degrade the accuracy of your blocks over time. Always use wear blocks on each end of the stack, with the same face of the wear blocks touching the surfaces every time. Most wear blocks are marked so you do not put the wrong face against the item you are measuring.
Gage blocks are designed to be used in climate controlled environments. Most blocks’ dimensions are set at a temperature of 68 degrees F (20 degrees C). For every increase of 1 degree F (0.5 C), a typical 4 in stack of gage blocks will expand approximately 0.000025 in. With the human body temperature being about 98.6 degrees, it is important not only that a climate controlled facility be used when measuring, but also that the aerospace technician limit contact with the blocks, either by holding them by the fingertips as little as possible or by using insulated tweezers. The part being measured should also be the same temperature as the blocks in order to obtain the best accuracy. Some manufacturers suggest you go the extra step and use insulated gloves along with the insulated tweezers. If the part being measured and the blocks are not the same temperature, some books suggest you immerse both items in kerosene until they are equalized. That of course may not be practical, due to fire hazards or the size of the part being measured.
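To get a feel for the numbers, here is a rough expansion estimate based on the figure quoted above. The per-inch rate is my own extrapolation from the 4 in example, assuming the growth scales linearly with stack height:

```python
# The text gives ~0.000025 in of growth per 1 degree F for a 4 in stack,
# i.e. roughly 0.00000625 in per inch of stack per degree F (assumed linear).
RATE_PER_IN_PER_DEGF = 0.000025 / 4

def expansion_in(stack_height_in, temp_rise_degF):
    return stack_height_in * temp_rise_degF * RATE_PER_IN_PER_DEGF

# A 4 in stack warmed 10 degrees F above the 68 F reference:
print(round(expansion_in(4, 10), 6))  # 0.00025 in
```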
To save time and reduce the chance of error when using gage blocks you should use as few blocks as possible. There is an actual procedure advocated by the authors of the book, Technology of Machine Tools, to use when calculating the exact blocks you will need to make a measurement.
Step One: Write the dimension required on a piece of paper.
Step Two: Deduct the size of two wear blocks.
Step Three: use a block that will eliminate the right-hand digit.
Step Four: Use a block that will eliminate the right-hand digit and at the same time bring the digit to the left of it to a zero or a five.
Step Five: Continue to eliminate the digits from the right to the left until the dimension required is attained.
Now to see this in action, here is an example in the table below:
As you see in the left hand column, you are subtracting the blocks from the desired measurement while in the right hand column you are adding the block’s measurements together. You should achieve a “0” in the left hand column and the desired measurement in the right hand column if you have done your math correctly.
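As a worked illustration of the procedure, here is a hypothetical stack. The 2.5843 in target and the specific block sizes are my own example, drawn from the sizes found in a typical 83-piece inch set:

```python
from decimal import Decimal  # exact decimal math, no float rounding

target = Decimal("2.5843")             # Step One: dimension required
wear_blocks = [Decimal("0.050")] * 2   # Step Two: deduct two wear blocks
blocks = [
    Decimal("0.1003"),  # Step Three: eliminates the right-hand 3
    Decimal("0.134"),   # Step Four: eliminates the 4, leaving 2.2500
    Decimal("0.250"),   # brings the remainder to an even 2.000
    Decimal("2.000"),   # final block
]

remaining = target      # "left hand column": subtract each block
stack = Decimal("0")    # "right hand column": add each block
for b in wear_blocks + blocks:
    remaining -= b
    stack += b

print(remaining)  # 0.0000
print(stack)      # 2.5843
```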
Gage block surfaces are flattened so accurately that they can actually “stick” together and withstand a pull of up to 200 lbs! It is not known exactly why, though some have suggested it is either a molecular bond or the slight film of oil left over from cleaning. To stack or “wring” gage blocks together, you must first clean the blocks with a clean, soft, lint-free cloth. Wipe the contact surface of the block on the palm of your hand or wrist. This has two functions: one, to wipe any remaining particulates from the block onto your hand, the oil from your skin “grabbing” the particulates; and two, to use that same skin oil to lightly “lube” the blocks. Place the end of one block onto the end of the other block and, while applying pressure, slide the blocks together. They should stick together. If they don’t, the blocks were not properly cleaned.
To take care of your gage blocks and ensure that they have a long life, you should:
- Keep the case closed at all times except when you are getting a block or placing back a cleaned block.
- Do not play dominoes with them.
- Do not unnecessarily finger the surfaces of the blocks; skin oils and moisture cause rusting and tarnishing.
- Do not drop the blocks or scratch the surfaces of the blocks.
- Do not use them in your juggling act at the comedy club.
- Immediately after use, each block should be cleaned, oiled, and placed back into the box. (Don’t forget to close the box!)
- Never leave your gage blocks wrung together! Leaving them this way will encourage rusting from the oils and moisture from your skin.
The Quality Technician’s Handbook (Griffith, 2003)
Technology of Machine Tools (Krar & Check, 1997)
The most common and most overlooked measuring tool is the steel rule, or ruler. Steel rules can be graduated in inches or metric units, with sub-markings as fine as 1/64 of an inch or one millimeter. Steel rules come in five types: spring tempered, flexible, narrow, hook, and short-length rules.
Spring tempered steel rules are the most commonly used in aerospace shops. These rules are usually 6 inches in length and made for quick reading. Usually they are broken into four scales, two on each side. For English measurements, one side of the rule will have graduations in eighths and sixteenths, and the back is graduated in thirty-seconds and sixty-fourths. Some spring tempered rules will have English measurements on one side and metric on the other.
Flexible steel rules are more commonly found in construction and in the home. They are also commonly called tape measures.
Narrow steel rules are used to reach tight and hard to access places to measure.
Hook steel rules have a “hook” on the end that you can butt up flat against a corner or protrusion while measuring to ensure the rule doesn’t move.
Short-length rules are small rules that come in sets of four and range in size from 1/4 inch to 1 inch in length. These rules are placed in a holder and used to measure small parts or openings.
There are two major rules to follow when using steel rules to ensure accuracy:
1. Because rules become worn at the ends over time and use, it is best to start your measurement from the 1 inch or 1 centimeter mark. Once you have your measurement, subtract the 1 unit you started at to get the final measurement.
2. Make sure the graduated markings are as close as possible to the area of the part you are measuring. It is best if the graduated markings are actually touching the area you’re measuring, making it easier to obtain the correct measurement.
Steel rules can also be used to determine the flatness of a material. Lay the steel rule on its edge on a part, then hold the rule and part up to a light. If light shows through between the part and the rule, the part is not flat. If no light shows, the part is flat.
Don’t underestimate the usefulness of the steel rule. It will be the most often used measuring tool in your tool box.
The Quality Technician’s Handbook (Griffith, 2003)
Technology of Machine Tools (Krar & Check, 1997)
Caliper tools are considered transfer tools because the measurement cannot be read directly. Caliper tools “make contact with the part (on the dimension being measured), are then locked in the measured position, and are measured with another tool…” such as a rule or micrometer. (Griffith, 2003) Because calipers are transfer tools, it does not matter whether the part being measured is dimensioned in English or metric units: the caliper is set against the part, and the setting can then be compared to either an English or a metric rule or micrometer.
Calipers can be also used as a go/no go tool. You would preset and lock a caliper against a rule or master part and do a “fit check” on your part to see if a part meets the measurement required.
Though caliper tools are good for routine measurements, they should not be used when accuracy finer than 0.015 in (0.39 mm) is required. (Krar and Check, 1997)
There are two basic types of calipers, outside and inside calipers.
Outside calipers measure the outside surface of a round or flat part. The most common type of outside caliper is the spring joint caliper. To use an outside caliper you would:
1. Adjust the caliper until both legs lightly touch the outside surface of the part, then lock the legs in place.
2. Compare the caliper to a rule to determine the measurement, or compare it against a master part.
Inside calipers are used to measure the inside diameter of a circular object such as a pipe, or to measure the inside width of a slot or keyhole. To use an inside caliper you would:
1. Hold one leg against the bottom surface of the area you’re measuring with your finger.
2. Turn the adjusting nut until the second leg touches the opposite side and lock legs in place.
3. Compare the caliper against a rule or Master part, or for more accuracy, use a micrometer on the caliper.
Calipers are easy to use and very simple tools to add precision to your work. Don’t underestimate a caliper’s usefulness when it comes to your work.
If there are any other common measuring tools or some other aerospace technician subjects you would like to see covered, please feel free to email your suggestion and we will work to cover it here.
The first micrometer was invented by William Gascoigne in the 17th century to assist in measuring angular distances between stars through a telescope. Jean Laurent Palmer of France developed the tool further in 1848, making it a practical handheld measuring device, and it was therefore originally named after him: the palmer, as it is still known in France today. In Spain it is still known as the palmer screw, or tornillo de Palmer.
By the late 1800s the micrometer had been mass marketed into many machine shops throughout the United States. Though its form has changed many times, and the features it can measure differ (outside and inside diameters, and depth), the particular micrometer we will discuss is the inch-reading outside micrometer, or micrometer caliper.
The micrometer or “mike” can measure the diameter of an object to within 0.001 inch. Some micrometers have a vernier added that allows measurements to be read to 0.0001 inch. No matter the size of the frame of a micrometer, and some can get pretty large, the actual measuring range is only 1 inch total.
The micrometer consists of the anvil (the fixed measuring face), the frame, the spindle (the part that moves while measuring), the spindle lock (which locks the reading in place after measurement, in case you are using it to compare parts), the barrel, also called the sleeve, which carries graduations of 0.025 in (numbered every 0.100 in), the thimble, which is divided into 25 lines of 0.001 in each (the spindle screw has 40 threads per inch, so one revolution advances 0.025 in), and the ratchet, used to apply a consistent measuring pressure.
“The scale of a micrometer ranges from 0 to 1 inch and is usually graduated in thousandths of an inch (think of 1 in. as 1000/1000). The sleeve of the mike (as it slowly pulls back) shows the numbers 0 1 2 3 4 5 6 7 8 9 0. Each of these numbers represents 0.100 (or 100 thousandths) of an inch…Now look at the thimble. Each line on the thimble represents 0.001, and there are numbers every five lines (or 0.005). One complete turn (revolution) of the thimble is equal to 0.025 in. So each revolution of the thimble is equal to one division on the barrel.” (Griffith, 2003)
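The reading rule in the quote above can be sketched as a small helper. The function and argument names are my own, for illustration only:

```python
# Reading = barrel value + thimble value: each small barrel line is 0.025 in,
# and each thimble line is 0.001 in (one revolution = 25 lines = 0.025 in).
def mike_reading(barrel_lines, thimble_lines):
    """barrel_lines: count of 0.025 in graduations visible on the barrel (sleeve);
    thimble_lines: the 0.001 in thimble graduation aligned with the index line."""
    return barrel_lines * 0.025 + thimble_lines * 0.001

# Example: 9 barrel lines (0.225 in) plus 13 thimble lines (0.013 in):
print(round(mike_reading(9, 13), 3))  # 0.238
```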
The Stefanelli website has a great simulator of an inch micrometer that you can move with the mouse, and it will tell you the readings while you practice. I would also suggest you find some old machine tool textbooks and look for exercises that will test you in reading the measurements. As I have said before, know your tool or measuring device before putting it to actual day to day use. Refer to the previous post for more general suggestions on measuring.
We are going to start a series talking about the use and type of measuring tools commonly used by the aerospace technician. In this part, we will cover the basics that are needed to keep in mind when selecting and using measuring tools.
There is an old phrase that says, “When the only tool you have in your tool box is a hammer, then every problem looks like a nail.” The same goes with measuring tools. If all you have is a ruler, then you have the tendency to measure everything with it. As we all know, various things such as surfaces, threads, thickness, etc. need to be measured which require a variety of measuring tools, from rulers, dial calipers, to gages.
Though accuracy is needed in the aerospace field, you can actually have too much accuracy. When you need to know the measurement of a part to within 0.01, you do not need a tool that measures to within 0.000001. That much detail is wasteful and not needed for the task. “The rule of thumb is to select a measuring tool that is ten times more accurate than the total tolerance to be measured, or the tool can discriminate to one-tenth of the total part tolerance.” (Griffith, 2003) This is called the 10% or 10-1 rule. So if you wish to measure something to within 0.01, as called for by the tolerance listed on the process, you only need a tool that discriminates to one-tenth of that, or 0.001, in order to follow the 10-1 rule.
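The 10-1 rule above amounts to a one-line calculation; this tiny sketch (names mine) just makes it explicit:

```python
# 10-1 rule: the measuring tool should discriminate to one-tenth
# of the total part tolerance being measured.
def required_discrimination(total_tolerance):
    return total_tolerance / 10

print(required_discrimination(0.01))  # 0.001
```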
When I took a chemistry lab in college, the instructor once commented that I was being too accurate in my measurements. I took that as a compliment at the time, not understanding that his “tolerance” did not require as much accuracy as I was using in measuring my chemical compounds for the experiment. Though my outcomes were more in line with what the lab book said we would get, I was always the last to leave the class, and I missed the point: he wanted us to grasp the concept of the experiment, not the technique.
Know your measuring tool! What does it measure? What is the fixed end (reference) and what is the measured surface (movable)? Do you know how to read the measurement displayed? What are the divisions on it, and what do they represent? You must be proficient with your tool in order to use it and read the correct measurement.
Strive for accuracy and precision. “Accuracy is the difference between the average of several measurements made on a part and the true value of that part. Precision means getting consistent results repeatedly.” (Griffith, 2003) If you are using the wrong tool during a measurement, or reading it wrong, but getting consistent results, you have precision but not accuracy: you are getting the same results each time, but the measurement process is inaccurate. If you are using the right tool and reading it correctly, then you will get both accuracy and precision.
Pay attention to the pressure used when measuring. If you use too much pressure or too little, you can get a less precise reading (especially if the pressure changes each time you measure) and therefore an inaccurate reading. Don’t use a micrometer like a C-clamp to hold a part in place; that is too much pressure and will affect the reading. The same goes for using too little pressure: if contact with the part is too light, the reading is skewed. Always use consistent pressure.
Take care of your measuring tools. First of all, inspect the tool. Is there a calibration sticker on it stating when it was calibrated and when the calibration expires? Is the tool in good condition? Do all of its parts work as expected? Is the tool clean of debris and dirt that could affect its use or the measurement? Has the tool been dropped? Have you tested the tool on a part whose measurement has already been determined? Has the tool been stored properly to prevent damage? Measuring tools should not be piled on each other and usually should be kept in individual cases. Is the tool showing any sign of wear, especially on the measuring and fixed surfaces?
Learning about and becoming proficient with your measuring tools will ensure that your work is more consistent and error free. And don’t forget the old adage when doing your work: “Measure twice and cut once!”
Sources: The Quality Technician’s Handbook (Griffith, 2003)
Technology of Machine Tools (Krar & Check, 1997)
What makes up a hydraulic system? Well, the simplest hydraulic system has only three parts: the pump, the cylinder, and the fluid. Pumps can be of any shape or size. Some pumps are hand or foot driven, while more complicated ones can be a turbo or another form of gear-type pump. Cylinders can also be of any size, as long as they are fluid-tight to prevent leakage.
The fluid can be just about anything, though some fluids work better than others. In theory, you could use the nice cold ice tea you’re drinking on a hot summer day as a hydraulic fluid, since all fluids are nearly incompressible, but I don’t think I would try that. I prefer to drink my ice tea after work is done instead of using it for work. Specially made hydraulic oil is usually the preferred fluid because it heats up more slowly than water (hydraulic fluids do heat up from friction as they move inside a cylinder). Remember, water boils at only 212 degrees F, while hydraulic oils require much higher temperatures before they boil and fail in a hydraulic system. I have no idea what the boiling temperature of ice tea is, though.
Looking at the picture below, you can see all three parts. The “pump” is applying 5 lbs. of force over a 1/2 square inch “cylinder” area (10 psi), and the “fluid” does its work lifting a hundred lb. weight. No matter how complicated a hydraulic system can be, it will always have these three main parts.
Other parts can be added to ensure better safety and control. Parts such as:
- Check valves-To ensure a one way flow in a cylinder.
- Reservoirs – tanks that hold large amounts of hydraulic fluid that can’t all be stored in the cylinder.
- Control valves – allows the operator to direct the flow and supply of hydraulic fluid traveling in the hydraulic system. With a control valve, an operator can actually open the lines to allow the fluid to flow freely and not build up pressure therefore allowing the pumps to stay on but in a neutral state. When work is needed, the control valve is engaged to a closed position causing the pressure to build inside the cylinder and allowing work to be done.
- Relief valves – probably the most important safety feature in a hydraulic system. Pressure can build quite quickly in a hydraulic system, beyond the structural limits of a cylinder, and can cause an explosive failure. Relief valves are designed to open and relieve the pressure when it exceeds the point they are set for, allowing the hydraulic fluid to exit the cylinder and return to the reservoir, lessening the pressure and preventing failure. Hydraulic leaks and relief valve malfunctions are probably the most common (and possibly most dangerous!) failures an aerospace technician will encounter.
Now that you have had a brief overview of hydraulic systems, go to the refrigerator or break room and get yourself some ice tea. I know you have wanted some for a few paragraphs now.
Hehn, A. H. (1993). Fluid Power Handbook Volume 1: System Design, Maintenance, and Troubleshooting. Houston: Gulf Publishing Company.
As an aerospace technician, you will likely find yourself working with or working on various hydraulic systems. “Hydraulics is the science dealing with work performed by liquids in motion…The science of hydraulics is divided into two distinct categories; hydrodynamics and hydrostatics. Hydrodynamics deals with power transmitted by liquids in motion, such as water turning a turbine. Hydrostatics deals with power transmitted by confined liquids under pressure.” (Hehn, 1993) When we refer to “hydraulics” in this post, we are referring to “Hydrostatics” only.
The first known study of hydrostatics was done in the mid-1600s by the French mathematician Blaise Pascal, who developed a law, or principle, stating that when pressure is applied to or lowered on a confined liquid at any point, that change of pressure is transmitted equally throughout the entire fluid. This is an important principle in hydraulics, and it explains the large amount of work a hydraulic system can do with very little liquid and minimal pressure.
Because liquids are practically incompressible, and any force or pressure applied is transmitted equally in all directions, you can apply a moderate amount of pressure over a small area and that same pressure will act over a larger area without loss, doing much more work.
For example, look at the picture above. If you apply 10 lbs. of force to the smaller opening, which is only 1 square inch, you are applying 10 psi of pressure. On the other side, a hundred pound weight sitting over a 10 square inch area receives that same 10 psi, and the 100 lb. weight will actually be lifted! Each square inch receives 10 lbs. of force, but since the larger opening is 10 sq. inches, ten times the force can be exerted.
Pascal’s law can be expressed as F = P × A, where P is pressure (psi), F is force (pounds), and A is area (square inches). You can also use a triangle (similar to the ones discussed in previous posts to calculate electrical values) called the Relationship between Force, Pressure, and Area Triangle, shown below. Using this triangle, you can calculate any value as long as you know the other two.
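The triangle relationship amounts to three tiny helpers; this is a sketch of the arithmetic only, and the function names are mine:

```python
# Pascal's law: F = P * A. Knowing any two values gives the third.
def force_lb(pressure_psi, area_sqin):
    return pressure_psi * area_sqin

def pressure_psi(force_lb, area_sqin):
    return force_lb / area_sqin

def area_sqin(force_lb, pressure_psi):
    return force_lb / pressure_psi

# The worked example above: 10 psi over a 10 sq in piston supports 100 lb.
print(force_lb(10, 10))       # 100
print(pressure_psi(100, 10))  # 10.0
print(area_sqin(100, 10))     # 10.0
```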
Next time we will discuss the main parts of a hydraulic system and what role each one plays.
Hehn, A. H. (1993). Fluid Power Handbook Volume 1: System Design, Maintenance, and Troubleshooting. Houston: Gulf Publishing Company.