You may find the information made available here useful when interpreting your reports or certificates. Please note the information provided is for educational purposes only and does not constitute advice.
Absolute pressure measurements are referenced to zero pressure (a perfect vacuum).
Absolute zero is the lowest possible temperature at which matter can exist, 0 K or -273.15°C.
Absolute Pressure Transducer
A transducer that has an internal reference chamber sealed at or close to zero pressure (full vacuum). When exposed to the atmosphere, it produces a reading of approximately 14.7 psi.
The rate of change of velocity, often expressed in “g’s”, in “mm/s²” in the metric system, or “in/s²” in the English system. Acceleration is not constant but varies through the vibration cycle, reaching its maximum as velocity reaches its minimum. This typically occurs when a particular mass has decelerated to a stop and is about to begin accelerating again.
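These relationships can be sketched numerically. The example below is a minimal illustration, assuming a pure sinusoidal vibration (the function name and values are hypothetical):

```python
import math

def sine_vibration(D_mm, f_hz, t_s):
    """Displacement (mm), velocity (mm/s) and acceleration (mm/s^2)
    of a pure sinusoidal vibration of peak displacement D at time t."""
    w = 2 * math.pi * f_hz                  # angular frequency, rad/s
    x = D_mm * math.sin(w * t_s)            # displacement
    v = D_mm * w * math.cos(w * t_s)        # velocity
    a = -D_mm * w ** 2 * math.sin(w * t_s)  # acceleration
    return x, v, a

# At the displacement peak (a quarter period into the cycle) the
# velocity passes through zero while the acceleration is at its
# maximum magnitude -- the behaviour described above.
x, v, a = sine_vibration(D_mm=0.1, f_hz=50, t_s=1 / (4 * 50))
```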
A transducer whose electrical output responds directly to acceleration. Accelerometers typically cover a much wider frequency range, allowing them to pick up signals not present with other types of transducers. Due to this frequency range, accelerometers are ideal for most types of rotating equipment, making them the most commonly used transducer for vibration measurements.
Formal recognition by an accreditation body that a calibration or testing laboratory is able to competently perform the calibrations or tests listed in the accreditation scope document. Accreditation includes evaluation of both the quality management system and the competence to perform the measurements listed in the scope.
An organisation that conducts laboratory accreditation evaluations in conformance to ISO Guide 58.
Document issued by an accreditation body to a laboratory that has met the conditions and criteria for accreditation. The certificate, with the documented measurement parameters and their best uncertainties, serves as proof of accredited status for the time period listed. An accreditation certificate without the documented parameters is incomplete.
Set of requirements used by an accrediting body that a laboratory must meet in order to be accredited.
The accuracy of a digital tester is defined as the difference between the reading and the true value for a quantity measured in reference conditions.
A term used for power when it is necessary to distinguish among Apparent Power, Complex Power and its components, and Active and Reactive Power.
A condition where components within a drivetrain are parallel or perpendicular, according to design requirements. The Tester can diagnose misalignment conditions where these components are no longer aligned according to design requirements, causing excessive bearing wear and power consumption in the machine.
Alternating Current (AC)
An electric current that reverses direction in a circuit at regular intervals.
The unit expressing the rate of flow of an electric current. One ampere is the current produced by a potential difference of one volt across a resistance of one ohm; equivalently, an electric current flowing at the rate of one coulomb per second.
The use of one ampere for one hour.
Apparent Power (volt-amps)
The product of the applied voltage and current in an AC circuit. Apparent power, or volt-amps, is not the true power of the circuit because the power factor is not considered in the calculation.
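As a worked example of this calculation (the 230 V, 10 A, and 0.8 power-factor values are hypothetical):

```python
import math

# Hypothetical single-phase circuit: 230 V RMS, 10 A RMS, power factor 0.8.
voltage = 230.0  # V
current = 10.0   # A
pf = 0.8         # power factor (cosine of the phase angle)

apparent = voltage * current                 # volt-amps (VA)
active = apparent * pf                       # true power, watts (W)
reactive = apparent * math.sqrt(1 - pf**2)   # reactive power (VAR)

# apparent = 2300 VA, but the true power is only 1840 W because
# the power factor is less than 1.
```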
As Found/As Left
Also known as Found-Left. It means calibration data is collected without any adjustments or repairs being performed.
Attachment pads (bronze or stainless steel) can be placed at appropriate measuring locations on machines using an industrial adhesive. The triaxial accelerometer is attached to these pads for measurement collection. The pad may include an alignment notch to ensure the consistent orientation of the accelerometer to the three vibration axes (Radial, Tangential, and Axial). The pad ensures a good transfer of vibration data to the transducer by providing a strong and consistent mounting location.
One of the three vibration axes (Radial, Tangential and Axial), the axial plane is parallel to the centreline of a shaft or turning axis of a rotating part.
Adjusting the distribution of mass in a rotating element, to reduce vibratory forces generated by rotation.
The data carrying capacity of a transmission path, measured in bits or bytes per second.
The bar is a metric (but not SI) unit of pressure, defined by the IUPAC as exactly equal to 100,000 Pa. It is about equal to the atmospheric pressure on Earth at sea level, and since 1982 the IUPAC has recommended that the standard for atmospheric pressure should be harmonized to 100,000 Pa = 1 bar ≈ 750.0616827 Torr. The same definition is used in the compressor and the pneumatic tool industries (ISO 2787).
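A minimal sketch of these conversions, using the definitions given above (the function names are illustrative):

```python
# Unit conversions following the definitions of the bar and the torr.
PA_PER_BAR = 100_000.0           # 1 bar = 100,000 Pa exactly
PA_PER_TORR = 101_325.0 / 760.0  # 1 torr = 1/760 of a standard atmosphere

def bar_to_pa(bar):
    return bar * PA_PER_BAR

def bar_to_torr(bar):
    return bar * PA_PER_BAR / PA_PER_TORR

# 1 bar is approximately 750.0617 torr, as quoted above.
```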
The volume of a gas is inversely proportional to its pressure at constant temperature: V ∝ 1/P (that is, PV = constant).
A black body is an idealised physical body that absorbs all incident electromagnetic radiation, regardless of frequency or angle of incidence.
Presence of calcium and magnesium ions in water, from dissolved carbonates and bicarbonates; treated in boiler water by introducing sodium phosphate. The primary components of hardness are calcium (Ca²⁺) and magnesium (Mg²⁺) ions.
It is the process of verifying the capability and performance of an item of measuring and test equipment by comparison to traceable measurement standards.
Calibration is performed with the item being calibrated in its normal operating configuration – as the normal operator would use it. The calibration process uses traceable external stimuli, measurement standards, or artefacts as needed to verify the performance. Calibration provides assurance that the instrument is capable of making measurements to its performance specification when it is correctly used.
The result of a calibration is a determination of the performance quality of the instrument with respect to the desired specifications. This may be in the form of a pass/fail decision, determining or assigning one or more values, or the determination of one or more corrections.
The calibration process consists of comparing an IMTE unit with specified tolerances, but of unverified accuracy, to a measurement system or device of specified capability and known uncertainty in order to detect, report, or minimise by adjustment any deviations from the tolerance limits or any other variation in the accuracy of the instrument being compared. Calibration is performed according to a specified documented calibration procedure, under a set of specified and controlled measurement conditions, and with a specified and controlled measurement system.
A calibration certificate is generally a document that states that a specific item was calibrated by an organisation. The certificate identifies the item calibrated, the organisation presenting the certificate, and the effective date. A calibration certificate should provide other information to allow the user to judge the adequacy and quality of the calibration.
A calibration procedure is a controlled document that provides a validated method for evaluating and verifying the essential performance characteristics, specifications, or tolerances for a model of measuring or testing equipment.
A calibration procedure documents one method of verifying the actual performance of the item being calibrated against its performance specifications.
A calibration report is a document that provides details of the calibration of an item. In addition to the basic items of a calibration certificate, a calibration report includes details of the methods and standards used, the parameters checked, and the actual measurement results and uncertainty.
A calibration seal is a device, placard, or label that, when removed or tampered with, and by virtue of its design and material, clearly indicates tampering. The purpose of a calibration seal is to ensure the integrity of the calibration. A calibration seal is usually imprinted with a legend similar to “Calibration Void if Broken or Removed” or “Calibration Seal – Do Not Break or Remove.” A calibration seal provides a means of deterring the user from tampering with any adjustment point that can affect the calibration of an instrument and detecting an attempt to access controls that can affect the calibration of an instrument.
A calibration standard is an IMTE item, artefact, standard reference material, or measurement transfer standard that is designated as being used only to perform calibrations of other IMTE items. As calibration standards are used to calibrate other IMTE items, they are more closely controlled and characterised than the workload items they are used for. Calibration standards generally have lower uncertainty and better resolution than general-purpose items.
Designation as a calibration standard is based on the use of the specific instrument, not on any other consideration. For example, in a group of identical instruments, one might be designated as a calibration standard while the others are all general-purpose IMTE items. Calibration standards are often called measurement standards.
The ratio of an impressed charge on a conductor to the corresponding change in potential. The ratio of the charge on either conductor of a capacitor to the potential difference between the conductors. The property of being able to collect a charge of electricity.
An electrical device having Capacitance.
The negative electrode, which emits electrons or gives off negative ions and toward which positive ions move or collect in a voltaic cell or other such device; the negative pole of a battery.
International Commission on Rules for the Approval of Electrical Equipment. A regional, European safety agency in which the United States participates only as an observer.
The Celsius temperature scale is a common but non-SI temperature scale which is defined by assigning the temperatures of 0°C and 100°C to the freezing and boiling points of water, respectively.
Essentially states that for a fixed volume of gas, if the temperature is raised, the pressure will increase: P = constant × T.
Combined standard uncertainty
The standard uncertainty of the result of a measurement, when that result is obtained from the values of a number of other quantities. It is equal to the positive square root of a sum of terms. The terms are the variances or covariances of these other quantities, weighted according to how the measurement result varies with changes in those quantities. (GUM, 2.3.4)
Common Mode Pressure
The underlying common pressure (or static pressure) within a system from which a differential measurement is being made.
For a laboratory, the demonstrated ability to perform the tests or calibrations within the accreditation scope and to meet other criteria established by the accreditation body. For a person, the demonstrated ability to apply knowledge and skills.
The voltage a current source develops when attempting to drive a mA signal through a resistive load.
Condition monitoring (CM)
The measurement, recording and analysis of machinery parameters (such as acceleration) to determine equipment health. Current condition is compared to when the machine was new. Also known as machinery health monitoring.
The capability of a conductor to carry electricity, usually expressed as a percentage of the conductivity of a same-sized conductor of soft copper.
The conductivity (or specific conductance) of an electrolyte solution is a measure of its ability to conduct electricity. The SI unit of conductivity is Siemens per meter (S/m). Conductivity measurements are used routinely in many industrial and environmental applications as a fast, inexpensive and reliable way of measuring the ionic content in a solution.
Any material that allows electrons to flow through it.
A range of values that is expected to contain the true value of the parameter being evaluated with a specified level of confidence. The confidence interval is calculated from sample statistics. Confidence intervals can be calculated for points, lines, slopes, standard deviations, and so on.
Corrective action is something done to correct a non-conformance when it arises, including actions taken to prevent re-occurrence of the non-conformance.
The coulomb is a derived SI unit of electrical charge. A coulomb is the amount of charge moved by an electric current of one ampere in one second.
A numerical factor used as a multiplier of the combined standard uncertainty in order to obtain an expanded uncertainty (GUM, 2.3.6). The coverage factor is identified by the symbol k. It is usually given the value 2, which corresponds approximately to a probability of 95 percent for degrees of freedom > 10.
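A minimal sketch of how an expanded uncertainty is obtained from uncorrelated standard uncertainties (the example values are hypothetical):

```python
import math

def expanded_uncertainty(standard_uncertainties, k=2):
    """Root-sum-square combination of uncorrelated standard
    uncertainties, multiplied by the coverage factor k (per the GUM)."""
    u_c = math.sqrt(sum(u**2 for u in standard_uncertainties))
    return k * u_c

# Three uncorrelated contributions of 0.3, 0.4 and 1.2 units combine
# to u_c = 1.3; with k = 2 the expanded uncertainty U is 2.6 units.
U = expanded_uncertainty([0.3, 0.4, 1.2])
```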
The shortest distance between two conductors as measured along the device that separates them. Creepage Distance is normally a design parameter of insulators or insulating bushings.
The ratio of the maximum (peak) value to the effective (RMS) value of a waveform. It represents the range of input over which a tester maintains linear operation, expressed as a multiple of the full-scale value of the range being used.
A unit used to express the magnitude of change in level of an electric signal or sound intensity. A voltage ratio of 1 to 10 is equal to -20 dB; 10 to 1, to 20 dB; 100 to 1, to 40 dB; and 1000 to 1, to 60 dB. A power ratio of 10 to 1 is not 20 dB but 10 dB, since power (P) is proportional to the square of voltage (V).
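The voltage and power ratios above can be reproduced with the standard decibel formulas:

```python
import math

def voltage_db(ratio):
    """Decibel value of a voltage ratio (20 log10)."""
    return 20 * math.log10(ratio)

def power_db(ratio):
    """Decibel value of a power ratio (10 log10)."""
    return 10 * math.log10(ratio)

# voltage_db(10) is 20 dB and voltage_db(0.1) is -20 dB,
# while power_db(10) is only 10 dB.
```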
Decibels compared to one milliwatt. The higher the dBm, the higher the device’s transmit or receive power.
Nonfulfillment of conditions and/or criteria for accreditation is sometimes referred to as a non-conformance.
A term used by a few calibration laboratories to refer to bias, error or systematic error. The exact meaning can usually be determined from examination of the calibration certificate.
Any electrical insulating medium between two conductors.
A number that describes the dielectric strength of a material relative to a vacuum, which has a dielectric constant of one.
A test used to verify an insulation system: a voltage of a specific magnitude is applied for a specific period of time.
The ability of insulating materials and spacings to withstand specified overvoltages for a specified time (one minute unless otherwise stated) without flashover or puncture.
Differential Pressure (D/P)
Other names used to mean the same thing are d/p cell, d/p transmitter and DP transmitter (where D is delta or differential). This is the most common type of transmitter used in most process industries. It can be used to measure level, flow, pressure, differential pressure, and density or specific gravity. With some modifications, it can measure such things as temperature and oxygen purity. The d/p transmitter can be pneumatic, electromechanical, or solid state. It can also be a smart transmitter. A typical large process plant can have hundreds or thousands of d/p transmitters in service.
A two-terminal semiconductor (rectifying) device that exhibits a nonlinear current-voltage characteristic. The function of a diode is to allow current in one direction and to block current in the opposite direction. The terminals of a diode are called the anode and cathode.
Direct current (DC)
Current that flows in one direction only.
Dissolved Oxygen (DO)
Oxygen saturation or dissolved oxygen (DO) is a relative measure of the amount of oxygen that is dissolved or carried in a given medium. It can be measured with a dissolved oxygen probe such as an oxygen sensor in liquid media, usually water. The standard unit is milligrams per litre (mg/l) or parts per million (ppm).
When measuring machinery vibration, displacement represents the actual distance the vibration causes the part in question to move. It is measured in thousandths of an inch (mils) in the English system and in millimetres (mm) in the metric system.
Dry Well Calibrator
A temperature calibrator that uses a precision oven to source precise temperature. This style of calibrator is often used for the verification of temperature sensors.
Electronic Valve Positioners
Devices that can control the flow in a process with input from a mA or digital control signal.
The emissivity of a material (usually written ε or e) is the relative ability of its surface to emit energy by radiation. It is the ratio of the energy radiated by a particular material to the energy radiated by a black body at the same temperature. A true black body would have ε = 1, while any real object would have ε < 1. Emissivity is a dimensionless quantity.
Acceptance of the competence of other national metrology institutes (NMI), accreditation bodies, and/or accredited organisations in other countries as being essentially equal to the NMI, accreditation body, and/or accredited organisations within the host country. It can be also known as a formal, documented determination that a specific instrument or type of instrument is suitable for use in place of the one originally listed, for a particular application.
Error (of measurement)
In metrology, error (or measurement error) is an estimate of the difference between the measured value and the probable true value of the object of the measurement. The error can never be known exactly; it is always an estimate. Error may be systematic and/or random. Systematic error (also known as bias) may be corrected.
A constant current applied to an RTD probe to determine its actual resistance for temperature measurement. Typical values are 2 mA or less, to minimise self-heating of the probe.
The event, or inoperable state, in which any item or part of an item does not, or would not, perform as specified.
The mechanical or physical process that results in failure.
The capacitance of a capacitor across which a potential difference of one volt appears when it is charged by a quantity of electricity equal to one coulomb.
Ferrous is a term usually used to indicate the presence of iron. Ferrous metals include steel and pig iron (with a carbon content of a few percent) and alloys of iron with other metals (such as stainless steel). Manipulation of atom-to-atom relationships between iron, carbon, and various alloying elements establishes the specific properties of ferrous metals.
The vibration of a machine caused by some mechanical excitation. If the excitation is periodic and continuous, the response motion eventually becomes steady-state.
The number of events that occur within a fixed time period; frequency can also be calculated as the reciprocal of time (i.e. one divided by the time interval). Frequency is typically expressed in hertz (Hz), but can also be expressed as cycles per minute (cpm) or revolutions per minute (rpm) by multiplying Hz by 60. It can also be represented as multiples of turning speed, or “orders,” where frequency in rpm is divided by the turning speed of the machine. In AC systems, the rate at which the current changes direction, expressed in hertz (cycles per second); a measure of the number of complete cycles of a waveform per unit of time.
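The unit conversions described above can be sketched as follows (the function names are illustrative):

```python
def hz_to_cpm(f_hz):
    """Convert hertz to cycles per minute."""
    return f_hz * 60

def orders(f_hz, shaft_rpm):
    """Express a frequency as a multiple ('order') of turning speed."""
    return hz_to_cpm(f_hz) / shaft_rpm

# 25 Hz is 1500 cpm; a 50 Hz component on a 1500 rpm machine is a
# 2x turning-speed (second-order) component.
```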
Since vibration exists within the time domain, a vibration signal is represented as a time waveform when viewed on an oscilloscope. If plotted, the time waveform would represent a plot of amplitude vs. time. If the waveform were transformed into the frequency domain, the result would be a spectrum representing a plot of amplitude vs. frequency.
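A minimal sketch of this transformation, assuming a synthetic two-tone signal and using NumPy’s FFT (the amplitudes and frequencies are hypothetical):

```python
import numpy as np

fs = 1024                 # sample rate, Hz
t = np.arange(fs) / fs    # one second of samples
# Synthetic time waveform: 50 Hz at amplitude 1.0 plus 120 Hz at 0.5.
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# Single-sided amplitude spectrum and its frequency axis.
spectrum = np.abs(np.fft.rfft(x)) * 2 / len(x)
freqs = np.fft.rfftfreq(len(x), d=1 / fs)

# The spectrum peaks at 50 Hz (~1.0) and 120 Hz (~0.5), recovering the
# amplitude-vs-frequency view of the same signal.
```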
The pressure relative to atmospheric pressure. Gauge pressure = absolute pressure minus atmospheric pressure.
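A minimal sketch of this relationship, assuming a standard atmosphere of about 14.7 psi (the function names are illustrative):

```python
ATM_PSI = 14.6959  # assumed standard atmospheric pressure, psi

def gauge_to_absolute(p_gauge_psi, atm_psi=ATM_PSI):
    """psig -> psia."""
    return p_gauge_psi + atm_psi

def absolute_to_gauge(p_abs_psi, atm_psi=ATM_PSI):
    """psia -> psig."""
    return p_abs_psi - atm_psi

# A vented port (0 psig) corresponds to roughly 14.7 psia.
```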
Gauge Pressure Transducer
A transducer that measures pressure relative to atmospheric pressure.
A gram is a metric unit of mass equal to one thousandth of a kilogram.
An electrical term meaning to connect to the earth. A conducting connection, whether intentional or accidental, by which an electric circuit or equipment is connected to the earth or to some conducting body that serves in place of the earth.
Differences in potential (voltage) between two signal grounds.
An acronym commonly used to identify the ISO Guide to the Expression of Uncertainty in Measurement.
A sinusoidal component of the voltage that is a multiple of the fundamental wave frequency. Harmonics are primarily the result of today’s modern electronic equipment. Today’s electronics are designed to draw current in “pulses” rather than in a smooth, sinusoidal manner as older, non-electronic equipment did. These pulses can cause distorted current wave shapes, which in turn cause distortion of the voltage. Current and voltage harmonics can cause problems such as excessive heating of wiring, connections, motors, and transformers, and can cause inadvertent tripping of circuit breakers.
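One common way to quantify harmonic content is total harmonic distortion (THD): the root-sum-square of the harmonic amplitudes relative to the fundamental. A minimal sketch with hypothetical values:

```python
import math

def thd(harmonic_rms, fundamental_rms):
    """Total harmonic distortion: root-sum-square of the harmonic
    amplitudes divided by the fundamental amplitude."""
    return math.sqrt(sum(h**2 for h in harmonic_rms)) / fundamental_rms

# Hypothetical example: 3rd and 5th harmonics at 30 A and 40 A RMS on a
# 100 A RMS fundamental give a THD of 0.5, i.e. 50 percent.
distortion = thd([30, 40], 100)
```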
In the output signal of a device, distortion caused by the presence of frequencies not present in the input signal.
The metre-kilogram-second unit of inductance, equal to the inductance of a circuit in which an electromotive force of one volt is produced by a current in the circuit that varies at the rate of one ampere per second.
A unit of frequency equal to one cycle per second. In alternating current, the number of changes of the negative and positive poles per second.
Humidity is the amount of water vapour in the air. There are three main measurements of humidity: absolute, relative and specific. Absolute humidity is the water content of air. Relative humidity, expressed as a percentage, measures the current absolute humidity relative to the maximum for that temperature. Specific humidity is a ratio of the water vapour content of the mixture to the total air content on a mass basis.
Ideal Gas Law
Combining Boyle’s Law and Charles’ Law gives the Ideal Gas Law: PV = nRT, where n is the amount of gas (in moles) and R is the universal gas constant.
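A worked example, assuming SI units throughout (the function name is illustrative):

```python
R = 8.314462618  # universal gas constant, J/(mol*K)

def pressure_pa(n_mol, volume_m3, temp_k):
    """Ideal gas law solved for pressure: P = nRT / V."""
    return n_mol * R * temp_k / volume_m3

# One mole of an ideal gas in 22.414 litres at 273.15 K exerts
# approximately one standard atmosphere (about 101,325 Pa).
p = pressure_pa(1.0, 0.022414, 273.15)
```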
International Electrotechnical Commission.
A condition on rotating equipment where the centre of mass does not lie on the centre of rotation. Imbalance can severely reduce bearing life as well as cause undue machine vibration.
The acronym IMTE refers to inspection, measuring, and test equipment. This term includes all items that fall under a calibration or measurement management program. IMTE items are typically used in applications where the measurement results are used to determine conformance to technical or quality requirements before, during, or after a process. Some organisations do not include instruments used solely to check for the presence or absence of a condition (such as voltage, pressure, and so on) where a tolerance is not specified and the indication is not critical to safety.
Note: Organisations may refer to IMTE items as MTE (measuring and testing equipment), TMDE (test, measuring, and diagnostic equipment), GPETE (general purpose electronic test equipment), PME (precision measuring equipment), PMET (precision measuring equipment and tooling), or SPETE (special purpose electronic test equipment).
Inches of mercury (inHg or Hg)
Inches of mercury (inHg) are an alternative unit of measurement for pressure, defined as the pressure exerted by a circular column of mercury 1 inch in height at 32 °F (0 °C). 1 inHg = 3386.389 Pa at 0 °C.
The property of a circuit in which a change in current induces an electromotive force. The magnetic component of impedance.
The initial surge of current experienced before the load resistance or impedance increases to its normal operating value.
A non-conductive material used on a conductor to separate conducting materials in a circuit. The non-conductive material used in the manufacture of insulated cables.
Organisation, performance, and evaluation of tests or calibrations on the same or similar items or materials by two or more laboratories in accordance with predetermined conditions.
A systematic and documented process for obtaining audit evidence and evaluating it objectively to verify that a laboratory’s operations comply with the requirements of its quality system. An internal audit is done by or on behalf of the laboratory itself, so it is a first-party audit.
International Organisation for Standardization (ISO)
An international nongovernmental organisation founded in 1947, with headquarters in Geneva, Switzerland. The mission of ISO is “to promote the development of standardisation and related activities in the world with a view to facilitating the international exchange of goods and services, and to developing cooperation in the spheres of intellectual, scientific, technological and economic activity.”
The scope of ISO’s work covers all fields of business, industry and commerce except electrical and electronic engineering. The members of ISO are the designated national standards bodies of each country.
I/P (I to P)
A current to pressure transmitter. A common instrument in modern industrial plants. A typical large paper mill or refinery could have 5,000 I/Ps in use.
IP numbers are an indication of the level of protection designed into a piece of equipment to protect the internal components from solid or liquid entry.
IP is short for Ingress Protection and the numbers following the letters define the conditions that can be tolerated. The IP67 rating indicates that the equipment is dust tight and can withstand immersion in water to a depth of 1 metre for a period of at least 30 minutes.
International Practical Temperature Scale of 1968. A temperature standard adopted in 1968 that uses intrinsic standards to define the measurement of temperature.
International System of Units (SI)
A defined and coherent system of units adopted and used by international treaties. (The acronym SI is from the French Système International.) SI is the international system of measurement for all physical quantities: mass, length, amount of substance, time, electric current, thermodynamic temperature, and luminous intensity. SI units are defined and maintained by the International Bureau of Weights and Measures (BIPM) in Paris, France. The SI system is popularly known as the metric system.
ISO is a Greek word root meaning equal. The International Organisation for Standardization chose the word as the short form of its name so that it would be constant in all languages. In this context, ISO is not an acronym. (If an acronym based on the full name were used, it would be different in each language.) The name also symbolises the mission of the organisation – to equalise standards worldwide.
A reduction in motion severity, usually by a resilient support. A shock mount or isolator attenuates shock. A vibration mount or isolator attenuates steady-state vibration.
International Temperature Scale of 1990. A temperature standard adopted in 1990 that uses intrinsic standards to define the measurement of temperature. This standard modifies the intrinsic standards of IPTS-68 with additional intrinsic references.
The Kelvin temperature scale is an absolute temperature scale based on the definition that the volume of a gas at constant (low) pressure is directly proportional to temperature and that 100 degrees separates the freezing and boiling points of water.
Kelvin temperatures are written with a capital letter ‘K’ and without the degree symbol, such as 1 K, 1120 K. Note that 0 K is ‘absolute zero’ and there are no negative Kelvin temperatures.
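Conversion between the Celsius and Kelvin scales is a fixed offset of 273.15:

```python
def c_to_k(celsius):
    """Celsius to kelvin."""
    return celsius + 273.15

def k_to_c(kelvin):
    """Kelvin to Celsius."""
    return kelvin - 273.15

# c_to_k(-273.15) gives 0 K (absolute zero);
# c_to_k(0) gives 273.15 K (the freezing point of water).
```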
The kilogram (SI unit symbol: kg) is the base unit of mass in the International System of Units (SI) and is defined as being equal to the mass of the International Prototype of the Kilogram (IPK).
Apparent Power expressed in Thousand Volt-Amps.
kVAR is a measure of the additional reactive current flow that occurs when the voltage and current are not perfectly synchronised, or not in phase.
Actual (active) power expressed in kilowatts (kW).
Kilowatt-hour (kWh): the use of one thousand watts for one hour.
Lead Resistance Compensation
A compensation method used with 3- and 4-wire RTDs and resistance measurements. This method negates the error associated with lead resistance when making an RTD measurement.
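A minimal sketch of why compensation matters, using hypothetical lead and sensor resistances for a Pt100 probe:

```python
# Pt100 example: in a 2-wire measurement the lead resistance adds
# directly to the sensor resistance; 3- and 4-wire schemes remove it.
R_RTD = 100.0   # true Pt100 resistance at 0 degC, ohms
R_LEAD = 0.5    # assumed resistance of each lead, ohms
ALPHA = 0.385   # approximate Pt100 sensitivity, ohms per degC

two_wire_reading = R_RTD + 2 * R_LEAD            # 101.0 ohms
error_degc = (two_wire_reading - R_RTD) / ALPHA  # about 2.6 degC of error
four_wire_reading = R_RTD                        # lead resistance compensated out
```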
Light Emitting Diode. It is a p–n junction diode that emits light when activated. When a suitable voltage is applied to the leads, electrons are able to recombine with electron holes within the device, releasing energy in the form of photons.
Level of confidence
Defines an interval about the measurement result that encompasses a large fraction p of the probability distribution characterised by that result and its combined standard uncertainty, and p is the coverage probability or level of confidence of the interval. Effectively, the coverage level expressed as a percent.
All phases in the life of the system from initial requirements until retirement including design, specification, programming, testing, installation, operation, and maintenance.
The closeness of a calibration curve to a specified straight line. Linearity is expressed as the maximum deviation of any calibration point from a specified straight line.
A device that produces an electrically isolated mirror image of the input side 4-20 mA current.
The maximum pressure in the pressure vessel or pipe for differential pressure measurement.
It is the SI unit of illuminance. One lux is equal to one lumen per square metre (1 lx = 1 lm/m² = 1 cd·sr/m²).
The planned, formal, periodic, and scheduled examination of the status and adequacy of the quality management system in relation to its quality policy and objectives by the organisation’s top management.
Mass is a property of a physical body which determines the body’s resistance to being accelerated by a force and the strength of its mutual gravitational attraction with other bodies. The SI unit of mass is the kilogram (kg).
A measurement of reliability for repairable items: The mean number of life units during which all parts of the item perform within their specified limits, during a particular measurement interval under stated conditions.
A basic measure of reliability for non-repairable items: The total number of life units of an item divided by the total number of failures within that population, during a particular measurement interval under stated conditions.
A set of operations performed for the purpose of determining the value of a quantity.
A measurement system is the set of equipment, conditions, people, methods, and other quantifiable factors that combine to determine the success of a measurement process. The measurement system includes at least the test and measuring instruments and devices, associated materials and accessories, the personnel, the procedures used, and the physical environment.
A testing device that applies a DC voltage and measures the resistance (in millions of ohms) offered by a conductor’s or equipment’s insulation.
Metrology is the science and practice of measurement.
A unit of electric current equal to one millionth of an ampere.
A unit of electric current equal to one thousandth of an ampere.
Millibar (mbar or mb)
A millibar is 1/1,000th of a bar, a unit for the measurement of pressure. It is not an SI unit of measure; however, it is one of the units used in meteorology when describing atmospheric pressure. The SI unit is the pascal (Pa), with 1 millibar = 100 pascals (one hectopascal).
One millionth (10⁻⁶) of a farad.
Microlitre (µL or µl)
A microlitre is a unit of volume equal to 1/1,000,000th of a litre.
Micrometre (µm or um)
The micrometre, also commonly known as a micron, is an SI derived unit of length equal to 1×10⁻⁶ metre: one millionth of a metre (or one thousandth of a millimetre, 0.001 mm, or about 0.000039 inch). The symbol µm is sometimes rendered as um when the symbol µ cannot be used.
Millilitre (mL, ml, or mℓ)
A millilitre is a unit of volume equal to 1/1,000th of a litre.
Millisecond (ms or mSec)
A millisecond is a unit of time equal to one thousandth (10⁻³ or 1/1,000) of a second.
Operations that are independent of an established calibration laboratory facility. Mobile operations may include work from an office space, home, vehicle, or the use of a virtual office.
The nanometre is a unit of length in the metric system, equal to one billionth of a metre. The nanometre is commonly used to specify the wavelength of electromagnetic radiation near the visible part of the spectrum: visible light ranges from around 400 to 800 nm.
The National Association of Testing Authorities Australia (NATA), is an accrediting authority recognised by the Government of Australia via a memorandum of understanding, as an organisation that can provide independent assurance of technical competence. NATA provides assessment, accreditation and training services to laboratories and technical facilities throughout Australia and internationally.
Natural (physical) constant
A natural constant is a fundamental value that is accepted by the scientific community as valid. Natural constants are used in the basic theoretical descriptions of the universe. Examples of natural physical constants important in metrology are the speed of light in a vacuum (c), the triple point of water (273.16 K), the quantum charge ratio (h/e), the gravitational constant (G), the ratio of a circle’s circumference to its diameter (π), and the base of natural logarithms (e).
Formerly known as the National Conference of Standards Laboratories (NCSL). NCSL was formed in 1961 to “promote cooperative efforts for solving the common problems faced by measurement laboratories”. NCSL has member organisations from academic, scientific, industrial, commercial, and government facilities around the world. NCSL is a non-profit organisation whose membership is open to any organisation with an interest in the science of measurement and its application in research, development, education, or commerce. NCSL promotes technical and managerial excellence in the fields of metrology, measurement standards, instrument calibration, and test and measurement.
A nephelometer is an instrument for measuring concentration of suspended particulates in a liquid or gas colloid. A nephelometer measures suspended particulates by employing a light beam (source beam) and a light detector set to one side (often 90°) of the source beam. Particle density is then a function of the light reflected into the detector from the particles.
Nephelometric Turbidity Units (NTU)
The units of turbidity from a calibrated nephelometer are called Nephelometric Turbidity Units (NTU).
A non-ferrous metal is any metal, including alloys, that does not contain iron in appreciable amounts. Non-ferrous metals may be used to achieve desirable material properties such as low weight, higher conductivity, non-magnetic properties or resistance to corrosion.
Normal Acceleration (gn)
Normal Acceleration is a unit in the category of acceleration. This unit is commonly used in the INT unit system. Normal Acceleration (gn) has a dimension of LT⁻², where L is length and T is time. It can be converted to the corresponding standard SI unit, m/s², by multiplying its value by a factor of 9.80665.
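The conversion described above is a single multiplication by the standard-gravity factor; a minimal sketch (the function names are illustrative, not from any standard library):

```python
# Conversion between normal acceleration (gn) and m/s^2.
# 9.80665 m/s^2 per gn is the defined standard-gravity factor.
STANDARD_GRAVITY = 9.80665

def gn_to_ms2(gn: float) -> float:
    """Convert an acceleration expressed in gn to m/s^2."""
    return gn * STANDARD_GRAVITY

def ms2_to_gn(ms2: float) -> float:
    """Convert an acceleration expressed in m/s^2 to gn."""
    return ms2 / STANDARD_GRAVITY

print(gn_to_ms2(2.0))  # 19.6133
```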
A word to describe the Unit Under Test (UUT), Instrument or Artefact.
Offset is the difference between a nominal value (for an artefact) or a target value (for a process) and the actual measured value. For example, if the thermocouple alloy leads of a reference junction probe are formed into a measurement junction and placed in an ice point cell, and the reference junction itself is also in the ice point, then the theoretical thermoelectric EMF measured at the copper wires should be zero. Any value other than zero is an offset created by inhomogeneity of the thermocouple wires combined with other uncertainties.
A unit of electrical resistance defined as the resistance of a circuit with a voltage of one volt and a current flow of one Ampère.
U = I·R; I = U/R; R = U/I, where U is the voltage impressed on a circuit, I is the current flowing in the circuit, and R is the circuit resistance. Ohm’s Law is used for calculating voltage drop, fault current, and other characteristics of an electrical circuit.
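The three rearrangements of Ohm's Law can be sketched as simple functions (a minimal illustration only):

```python
# Ohm's Law: U = I * R, with U in volts, I in amperes, R in ohms.

def voltage(current_a: float, resistance_ohm: float) -> float:
    return current_a * resistance_ohm      # U = I * R

def current(voltage_v: float, resistance_ohm: float) -> float:
    return voltage_v / resistance_ohm      # I = U / R

def resistance(voltage_v: float, current_a: float) -> float:
    return voltage_v / current_a           # R = U / I

# A 12 V source driving a 4 ohm load carries 3 A:
print(current(12.0, 4.0))  # 3.0
```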
In rotating machines, orders are multiples or harmonics of the running speed (or associated reference component).
A common primary sensing element (PSE) for measuring flow. It must be used in conjunction with a d/p cell. It creates a venturi effect, and a resulting differential pressure (ΔP) is developed across the plate whose square root is proportional to flow.
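The square-root relationship can be sketched as follows; the flow coefficient k is a hypothetical lumped constant (geometry, fluid density, discharge coefficient), not a value from the text:

```python
import math

def flow_from_dp(dp: float, k: float) -> float:
    """Idealised orifice-plate flow: proportional to the square root of the
    differential pressure dp (units consistent with how k was determined)."""
    return k * math.sqrt(dp)

# Quadrupling the differential pressure only doubles the flow:
print(flow_from_dp(4.0, 1.0) / flow_from_dp(1.0, 1.0))  # 2.0
```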
The Pascal is the SI derived unit of pressure, internal pressure, stress, Young’s modulus and tensile strength, defined as one newton per square metre.
Peak to Peak
The amplitude of the ac wave form from its positive peak to its negative peak.
A performance test (or performance verification) is the activity of verifying the performance of an item of measuring and test equipment to provide assurance that the instrument is capable of making correct measurements when it is properly used. A performance test is done with the item in its normal operating configuration. A performance test is the same as a calibration.
pH is a measure of the acidity or basicity of an aqueous solution. Solutions with a pH less than 7 are said to be acidic and solutions with a pH greater than 7 are basic or alkaline. Pure water has a pH very close to 7.
The angular displacement between a current and voltage waveform, measured in degrees or radians.
Phase rotation defines the rotation in a poly-phase system and is generally stated as “1-2-3”, counter-clockwise rotation. Utilities in the United States use “A-B-C” to define their respective phase names in place of “1-2-3”. However, some refer to their rotation as A-B-C, A-C-B, or C-B-A counter-clockwise, where “A” can replace 1, 2, or 3. Europe adopted R-S-T to define the phase names.
P/I (P to I)
A pressure to current transducer.
Rotation in the plane of forward motion, about the left-right axis.
The electrical term used to denote the voltage relationship to a reference potential (+). With regard to transformers, polarity is the indication of the direction of the current flow through the high-voltage terminals with respect to the direction through the low-voltage terminals.
A policy defines and sets out the basic objectives, goals, vision, or general management position on a specific topic. A policy describes what management intends to have done regarding a given portion of business activity. Policy statements relevant to the quality management system are generally stated in the quality manual. Policies can also be in the organisation’s policy/procedure manual.
The ratio of energy consumed (watts) versus the product of input voltage (volts) times input current (amps). In other words, power factor is the percentage of energy used compared to the energy flowing through the wires. Adding capacitors to the system changes the inductive effect of the ballast coils, converting a Normal Power Factor (NPF) to a High Power Factor (HPF) system.
Refers to a pneumatic instrument that performs a function to its input and provides the result on its output.
Precision is a property of a measuring system or instrument. Precision is a measure of the repeatability of a measuring system – how much agreement there is within a group of repeated measurements of the same quantity under the same conditions. Precision is not the same as accuracy.
Precision Current Shunt
A conductor joining two points in a circuit to form a parallel circuit, through which a precision voltage can be measured or derived.
Pressure (p or P)
Pressure is the ratio of force to the area over which that force is distributed. It is force per unit area applied in a direction perpendicular to the surface of an object. Gauge pressure is the pressure relative to the local atmospheric or ambient pressure. Pressure is measured in any unit of force divided by any unit of area. The SI unit of pressure is the newton per square metre, which is called the Pascal (Pa) after the seventeenth-century philosopher and scientist Blaise Pascal.
Preventive action is something done to prevent the possible future occurrence of a non-conformance, even though such an event has not yet happened. Preventive action helps improve the system.
A procedure describes a specific process for implementing all or a portion of a policy. There may be more than one procedure for a given policy. A procedure has more detail than a policy but less detail than a work instruction. The level of detail needed should correlate with the level of education and training of the people with the usual qualifications to do the work and the amount of judgment normally allowed to them by management. Some policies may be implemented by fairly detailed procedures, while others may only have a few general guidelines.
The person responsible for the business process.
Determination of laboratory testing performance by means of inter-laboratory comparisons.
Pounds per square inch (same as psig).
Pounds per square inch absolute.
Pounds per square inch differential.
Pounds per square inch gauge (same as psi).
The quality manual is the document that describes the quality management policy of an organisation with respect to a specified conformance standard. The quality manual briefly defines the general policies as they apply to the specified conformance standard and affirms the commitment of the organisation’s top management to the policy. In addition to its regular use by the organisation, auditors use the quality manual when they audit the quality management system. The quality manual is generally provided to customers on request. Therefore, it does not usually contain any detailed policies and never contains any procedures, work instructions, or proprietary information.
One of the three vibration axes (Radial, Tangential and Axial), the radial plane represents the direction from the transducer to the centre of the shaft on rotating equipment. For typical horizontal machines, Radial equals the vertical axis. For typical vertical machines, Radial refers to the horizontal axis to which the accelerometer is attached.
Random error is the result of a single measurement of a value, minus the mean of a large number of measurements of the same value. Random error causes scatter in the results of a sequence of readings and, therefore, is a measure of dispersion. Random error is usually evaluated by Type A methods, but Type B methods are also used in some situations.
Note: Contrary to popular belief, the GUM specifically does not replace random error with either Type A or Type B methods of evaluation.
Nominal operating limits, specified by the lowest calibration point to the highest calibration point.
The average value of the instantaneous product of volts and amps over a fixed period of time in an AC circuit.
A specific range of values of an influence quantity within which the transducer complies with the requirements concerning intrinsic errors.
A specified single value of an influence quantity at which the transducer complies with the requirements concerning intrinsic errors.
Conditions of use for a transducer prescribed for performance testing, or to ensure valid comparison of results of measurement.
The temperature at which a thermocouple temperature measurement is referenced.
Repair is the process of returning an unserviceable or nonconforming item to serviceable condition. The instrument is opened, or has covers removed, or is removed from its case and may be disassembled to some degree. Repair includes adjustment or alignment of the item as well as component-level repair. (Some minor adjustment such as zero and span may be included as part of the calibration.) The need for repair may be indicated by the results of a calibration. For calibratable items, repair is always followed by calibration of the item. Passing the calibration test indicates success of the repair.
Minor repair is the process of quickly and economically returning an unserviceable item to serviceable condition by doing simple work using parts that are in stock in the calibration lab. Examples include replacement of batteries, fuses, or lamps; or minor cleaning of switch contacts; or repairing a broken wire; or replacing one or two in-stock components. The need for repair may be indicated by the results of a calibration. For calibratable items, minor repair is always followed by calibration of the item.
Passing the calibration test indicates success of the repair. Minor repairs are defined as repairs that take no longer than a short time as defined by laboratory management, and where no parts have to be ordered from external suppliers, and where substantial disassembly of the instrument is not required.
The maximum deviation from the mean of corresponding data points taken under identical conditions. The maximum difference in output for identically-repeated stimuli when there is no change in other test conditions.
Testing that reproduces a specified desired history.
One or more numerical results of a calibration process, with the associated measurement uncertainty, as recorded on a calibration report or certificate. The specific type and format vary according to the type of measurement being made. In general, most reported values will be in one of these formats:
- Measurement result and uncertainty. The reported value is usually the mean of a number of repeat measurements. The uncertainty is usually expanded uncertainty as defined in the GUM.
- Deviation from the nominal (or reference) value and uncertainty. The reported value is the difference between the nominal value and the mean of a number of repeat measurements. The uncertainty of the deviation is usually expanded uncertainty as defined in the GUM.
- Estimated systematic error and uncertainty. The value may be reported this way when it is known that the instrument is part of a measuring system and the systematic error will be used to calculate a correction that will apply to the measurement system results.
The algebraic sum, in a multi-phase system, of all the line currents.
The opposition to current flow, expressed in ohms.
The smallest input change that produces a detectable change in an instrument’s output.
Revolutions per minute (rpm, RPM, rev/min, r/min, or r·min−1)
Revolutions per minute are a measure of the frequency of a rotation. It denotes the number of turns completed in one minute around a fixed axis. It is used as a measure of rotational speed of a mechanical component. It is not a unit under the International System of Units (SI).
Root Cause Analysis
Determining what actually caused a failure via systematic investigation starting at the source.
The effective value of alternating current or voltage. The RMS value equates an AC current or voltage to a DC current or voltage that provides the same power transfer.
Resistance Temperature Detector, a temperature measurement sensor that has predictable changes in resistance with a change in temperature. The most common RTD is the platinum PT100-385.
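For temperatures at or above 0 °C, the resistance of a PT100-385 element follows the Callendar–Van Dusen equation; a minimal sketch using the IEC 60751 coefficients for the 0.00385 curve:

```python
# Callendar–Van Dusen equation for a PT100-385 RTD, valid from 0 °C to 850 °C.
# Coefficients are the IEC 60751 values for the 0.00385 curve.
R0 = 100.0        # resistance at 0 °C, ohms
A = 3.9083e-3
B = -5.775e-7

def pt100_resistance(t_celsius: float) -> float:
    """Resistance (ohms) of a PT100-385 element at t_celsius (>= 0 °C)."""
    return R0 * (1 + A * t_celsius + B * t_celsius ** 2)

print(round(pt100_resistance(0.0), 2))    # 100.0
print(round(pt100_resistance(100.0), 2))  # 138.51
```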
The speed, usually expressed in revolutions per minute (rpm), at which a rotating machine runs. It may also be expressed in Hz by dividing rpm by 60.
Scope of Accreditation
For an accredited calibration or testing laboratory, the scope is a documented list of calibration or testing fields, parameters, specific measurements, or calibrations and their best measurement uncertainty. The scope document is an attachment to the certificate of accreditation and the certificate is incomplete without it. Only the calibration or testing areas that the laboratory is accredited for are listed in the scope document, and only the listed areas may be offered as accredited calibrations or tests. The accreditation body usually defines the format and other details.
Thermoelectric effect in which the voltage potential increases with temperature (thermocouples) in a junction of dissimilar metals.
Self-calibration or a self-check is a process performed by a user for the purpose of making an IMTE instrument or system ready for use. The process may be required at intervals such as every power-on sequence; or once per shift, day, or week of continuous operation; or if the ambient temperature changes by a specified amount. Once initiated, the process may be performed totally by the instrument or may require user intervention and/or use of external calibrated artefacts. The usual purpose is accuracy enhancement by characterisation of errors inherent in the measurement system before the item to be measured is connected.
Self-calibration is not equivalent to periodic calibration (performance verification) because it is not performed using a calibration procedure and does not meet the metrological requirements for calibration. Also, if an instrument requires self-calibration before use, then that will also be accomplished at the start of a calibration procedure.
The ratio between electrical signal (output) and mechanical quantity (input).
A load that occurs when an ungrounded conductor comes into contact with another conductor or grounded object. An abnormal connection of relatively low impedance, whether made intentionally or by accident, between two points of different potential.
A circuit to modulate a signal so as to make it intelligible to, or compatible with, another device, including such manipulation as pulse shaping, pulse clipping, compensating, digitizing, and linearizing.
Sensor with no moving parts.
In metrology, a specification is a documented statement of the expected performance capabilities of a large group of substantially identical measuring instruments, given in terms of the relevant parameters and including the accuracy or uncertainty. Customers use specifications to determine the suitability of a product for their own applications. A product that performs outside the specification limits when tested (calibrated) is rejected for later adjustment, repair, or scrapping.
A standard is a document that describes the processes and methods that must be performed in order to achieve a specific technical or management objective, or the methods for evaluation of any of these. An example is ISO/IEC 17025 – a standard that describes the requirements for the quality management system of a calibration organisation and the requirements for calibration and management of the measurement standards used by the organisation.
A standard is a system, instrument, artefact, device, or material that is used as a defined basis for making quantitative measurements. The value and uncertainty of the standard define a limit to the measurements that can be made: a laboratory can never have better precision or accuracy than its standards. Measurement standards are generally used in calibration laboratories. Items with similar uses in a production shop are generally regarded as working-level instruments by the calibration program.
- Primary standard
Accepted as having the highest metrological qualities and whose value is accepted without reference to other standards of the same quantity. Examples: triple point of water cell and caesium beam frequency standard.
- Transfer standard
A device used to transfer the value of a measurement quantity (including the associated uncertainty) from a higher level to a lower level standard.
- Secondary standard
The highest accuracy level standards in a particular laboratory generally used only to calibrate working standards. Also called a reference standard.
- Working standard
A standard that is used for routine calibration of IMTE. The highest level standards, found in national and international metrology laboratories, are the realisations or representations of SI units.
Standard operating procedure (SOP)
A term used by some organisations to identify policies, procedures, or work instructions.
Standard reference material
A standard reference material (SRM) is defined as a material or artefact that has had one or more of its property values certified by a technically valid procedure, and is accompanied by, or traceable to, a certificate or other documentation. Standard reference materials are manufactured according to strict specifications and certified for one or more quantities of interest. SRMs represent one of the primary vehicles for disseminating measurement technology to industry.
The uncertainty of the result of a measurement, expressed as a standard deviation. (GUM, 2.3.1)
The zero-velocity pressure at any arbitrary point within a system.
A measuring element for converting force, pressure, tension, etc., into an electrical signal.
A systematic error is the mean of a large number of measurements of the same value minus the (probable) true value of the measured parameter. Systematic error causes the average of the readings to be offset from the true value. Systematic error is a measure of magnitude and may be corrected. Systematic error is also called bias when it applies to a measuring instrument. Systematic error may be evaluated by Type A or Type B methods, according to the type of data available.
The person responsible for the availability, and maintenance of a computerised system and for the security of the data residing on that system.
One of the three vibration axes (Radial, Tangential and Axial), the tangential plane is positioned 90 degrees to the Radial plane, running tangent to the drive shaft. For typical horizontal machines, tangential equals the horizontal axis. For typical vertical machines tangential equals the second horizontal axis perpendicular to the mounting of the accelerometer.
Temperature is a numerical measure of hot and cold. It is measured by detection of heat radiation, particle velocity, kinetic energy, or, most commonly, by the bulk behaviour of a thermometric material. Within a body that exchanges little energy or matter with its surroundings, temperature tends to become spatially uniform as time passes.
When a path permeable only to heat is open between two bodies, energy transfers spontaneously as heat from a hotter body to a colder one. The transfer rate depends on the nature of the path. If they are connected by a path permeable only to heat, and no heat flows between them, then the two bodies are equally hot. If changes are slow and spatially smooth enough to allow consistent comparisons of their hotness with other bodies that are respectively in their own states of internal thermodynamic equilibrium, they obey the Zeroth law of thermodynamics and then they have well defined and equal temperatures. Then thermodynamics provides a fundamental physical definition of temperature, on an absolute scale, relying on the second law of thermodynamics.
The coldest theoretical temperature is called absolute zero. It can be approached but not reached in any actual physical system. It is denoted by 0 K on the Kelvin scale, −273.15 °C on the Celsius scale. In matter at absolute zero, the motions of microscopic constituents are minimal.
Test Accuracy Ratio
In a calibration procedure, the test accuracy ratio (TAR) is the ratio of the accuracy tolerance of the unit under calibration to the accuracy tolerance of the calibration standard used.
THD (%THD, Total Harmonic Distortion) — the contribution of all harmonic frequency currents or voltages to the fundamental current or voltage, expressed as a percentage of the fundamental.
A junction of dissimilar metals that generates a small voltage correlated to the temperature of the junction.
A generic legal term for any individual who does not have a direct connection with a legal transaction but who might be affected by it.
A tolerance is a design feature that defines limits within which a quality characteristic is supposed to be on individual parts; it represents the maximum allowable deviation from a specified value. Tolerances are applied during design and manufacturing. A tolerance is a property of the item being measured.
Torque is the tendency of a force to cause or change rotational motion of a body. Torque is calculated by multiplying force by distance, so the SI unit of torque is the newton metre (N·m); even though this is dimensionally the same as the joule, torque is not work or energy, so it should be quoted in newton metres. It is also known as the moment of force.
Traceability is a property of the result of a measurement, providing the ability to relate the measurement result to stated references, through an unbroken chain of comparisons each having stated uncertainties. Traceability is a demonstrated or implied property of the result of a measurement to be consistent with an accepted standard within specified limits of uncertainty.
The stated references are normally the base or supplemental SI units as maintained by a national metrology institute; fundamental or physical natural constants that are reproducible and have defined values; ratio type comparisons; certified standard reference materials; or industry or other accepted consensus reference standards. Traceability provides the ability to demonstrate the accuracy of a measurement result in terms of the stated reference.
Measurement assurance methods applied to a calibration system include demonstration of traceability. A calibration system operating under a program controls system only implies traceability. Evidence of traceability includes the calibration report (with values and uncertainty) of calibration standards, but the report alone is not sufficient. The laboratory must also apply and use the data. A calibration laboratory, a measurement system, a calibrated IMTE, a calibration report, or any other thing cannot itself be said to be traceable to a national standard. Only the result of a specific measurement can be said to be traceable, provided all of the conditions just listed are met.
A transfer measurement is a type of method that enables making a measurement to a higher level of resolution than normally possible with the available equipment. Common transfer methods are differential measurements and ratio measurements.
A transfer standard is a measurement standard used as an intermediate device when comparing two other standards. Typical applications of transfer standards are to transfer a measurement parameter from one organisation to another, from a primary standard to a secondary standard, or from a secondary standard to a working standard in order to create or maintain measurement traceability. Examples of typical transfer standards are DC volt sources (standard cells or zener sources), and single value standard resistors, capacitors, or inductors.
Triple Point of Water
This temperature reference point is the intrinsic standard at which water coexists as liquid, ice, and vapour. This reference point defines 0.01 °C.
Most alternating currents and voltages are expressed in effective values, which are also referred to as RMS (Root-Mean-Square) values. The effective value is the square root of the average of the square of alternating current or voltage values. Many clamp meters with rectifier-type circuits have scales that are calibrated in RMS values for AC measurements. However, they actually measure the average value of the input voltage or current, assuming the voltage or current to be a sine wave. The conversion factor for a sine wave, obtained by dividing the effective value by the average value, is approximately 1.11. These instruments are in error if the input voltage or current has some shape other than a sine wave.
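The sine-wave conversion factor (form factor) and the resulting error for a non-sinusoidal input can be checked numerically; a sketch using the ideal waveform values:

```python
import math

# Form factor of a sine wave: RMS value divided by the rectified average.
# Average-responding meters scale their reading by this factor (~1.11),
# which is only correct for pure sine-wave inputs.
form_factor = math.pi / (2 * math.sqrt(2))
print(round(form_factor, 4))  # 1.1107

# For an ideal square wave, RMS and rectified average are equal, so an
# average-responding meter over-reads a square wave by about 11 %.
error_percent = (form_factor - 1.0) * 100
print(round(error_percent, 1))  # 11.1
```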
Turbidity is the cloudiness or haziness of a fluid caused by large numbers of individual particles that are generally invisible to the naked eye, similar to smoke in air. The measurement of turbidity is a key test of water quality
Type A Evaluation (of uncertainty)
Type A evaluation of measurement uncertainty is the statistical analysis of actual measurement results to produce uncertainty values. Both random and systematic error may be evaluated by Type A methods. (GUM, 3.3.3 through 3.3.5) Uncertainty can only be evaluated by Type A methods if the laboratory actually collects the data.
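A minimal Type A evaluation: the standard uncertainty of the mean of n repeat readings is the sample standard deviation divided by the square root of n (the readings below are hypothetical):

```python
import statistics

readings = [10.02, 10.05, 10.01, 10.03, 10.04]  # hypothetical repeat measurements

n = len(readings)
mean = statistics.mean(readings)
s = statistics.stdev(readings)     # sample standard deviation of the readings
u_type_a = s / n ** 0.5            # standard uncertainty of the mean

print(round(mean, 3))      # 10.03
print(round(u_type_a, 4))  # 0.0071
```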
Type B Evaluation (of uncertainty)
Type B evaluation of measurement uncertainty includes any method except statistical analysis of actual measurement results. Both random and systematic error may be evaluated by Type B methods. (GUM, 3.3.3 through 3.3.5) Data for evaluation by Type B methods may come from any source believed to be valid.
Unequal mass distribution on a rotor. The mass centreline does not coincide with the rotation or geometric centreline. Also known as imbalance.
Uncertainty is a property of a measurement result that defines the range of probable values of the measurand. Total uncertainty may consist of components that are evaluated by the statistical probability distribution of experimental data or from assumed probability distributions based on other data. Uncertainty is an estimate of dispersion; effects that contribute to the dispersion may be random or systematic. (GUM, 2.2.3)
Uncertainty is an estimate of the range of values that the true value of the measurement is within, with a specified level of confidence. After an item that has a specified tolerance has been calibrated using an instrument with a known accuracy, the result is a value with a calculated uncertainty.
The systematic description of known uncertainties relevant to specific measurements or types of measurements, categorised by type of measurement, range of measurement, and/or other applicable measurement criteria.
Uncertainty of Measurement
Uncertainty of Measurement is part of the expression of a measurement result; it defines the range of values within which the true value (or, if appropriate, the accepted true value) is estimated to lie.
The unit under test – the instrument being calibrated. These are standard generic labels for the IMTE item that is being calibrated, which are used in the text of the calibration procedure for convenience. It may also be called Device Under Test (DUT), Unit Under Calibration (UUC), or Equipment Under Test (EUT).
Electrical capacity or electrical load, expressed as volts × amperes (V·A). The Volt Ampère rating designates the output which a transformer can deliver at rated voltage and frequency without exceeding a specified temperature rise.
Substantiation by examination and provision of objective evidence that processes, methods, and/or procedures are fit for their intended use.
Volt Ampère Reactive.
Confirmation by examination and provision of objective evidence that specified requirements have been fulfilled.
A small, movable, graduated scale running parallel to the fixed graduated scale and used for measuring a fractional part of one of the divisions of the fixed scale.
Velocity is the rate of change in position, measured in distance per unit of time. When measuring vibration signals, velocity also represents the rate of change in displacement and is expressed in inches (in) or millimetres (mm) per second.
Mechanical motion around an equilibrium reference point.
An acronym commonly used to identify the International Vocabulary of Basic and General Terms in Metrology. (The acronym comes from the French title.)
Voltage, Volt (V)
A unit of electromotive force. The electrical potential needed to produce one Ampère of current with a resistance of one ohm.
The loss of voltage in a circuit when current flows.
With AC measurements, effective power (measured in Watts) equals the product of voltage, current, and power factor (the cosine of the phase angle between the current and the voltage). Watts=E*I *COS(φ). A Watt is a unit of power that considers both volts and amps and is equal to the power in a circuit in which a current of one Ampère flows across a potential difference of one volt.
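The W = E × I × cos(φ) relation can be sketched as a small function (illustrative values only):

```python
import math

def real_power(volts: float, amps: float, phase_deg: float) -> float:
    """Real power in watts: RMS volts x RMS amps x cos(phase angle)."""
    return volts * amps * math.cos(math.radians(phase_deg))

# 230 V at 5 A into a purely resistive load (phase angle 0 degrees):
print(round(real_power(230.0, 5.0, 0.0), 1))   # 1150.0
# The same volts and amps at a 60-degree phase angle deliver half the power:
print(round(real_power(230.0, 5.0, 60.0), 1))  # 575.0
```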
A unit of work equal to the power of one watt operating for one hour.
A differential pressure transducer or transmitter that uses a metal diaphragm at the wet port where fluids can be applied and no diaphragm at the dry port. The dry port exposes the sensor material to the medium, so only clean dry gas can be applied to this port.
The diaphragm and pressure port material that comes in direct contact with the medium (gas, liquid).
In a quality management system, a work instruction defines the detailed steps necessary to carry out a procedure. Work instructions are used only where they are needed to ensure the quality of the product or service. The level of education and training of the people with the usual qualifications to do the work must be considered when writing a work instruction. In a metrology laboratory, a calibration procedure is a type of work instruction.
This information is subject to change without notice. If any errors are identified, please contact: firstname.lastname@example.org