MMM Lab Viva Questions :-
1. What is metrology?
Metrology is the science of measurement, covering all theoretical and practical aspects of measurement. It includes making extremely precise measurements of the relative positions and orientations of different optical and mechanical components, and it is concerned with the establishment, reproduction, conservation and transfer of units of measurement and their standards.
2. What are the objectives of metrology?
- To provide accuracy at minimum cost.
- To thoroughly evaluate newly developed products and to ensure that components are within the specified dimensions.
- To determine the process capabilities.
- To assess the measuring instrument capabilities and ensure that they are adequate for their specific measurements.
- To reduce the cost of inspection, rejection and rework.
- To standardize measuring methods.
- To maintain the accuracy of measurements through periodical calibration of the instruments.
- To prepare designs for gauges and special inspection fixtures.
3. What is calibration?
Calibration is the comparison of a measurement device of unknown accuracy against an equal or better known standard under specified conditions. Every measuring system must be provable; the procedure adopted to prove the ability of a measuring system to measure reliably is called ‘calibration’.
4. Give the importance of calibration.
∗ Assurance of accurate measurements
∗ Ability to trace measurements to international standards
∗ International acceptance of test/calibration reports
∗ Consumer protection (legal metrology)
∗ Correct diagnosis of illness (medical reports)
∗ Meeting the requirements of ISO 9000 & 17025
5. What is a load cell?
A Load cell is a transducer that is used to convert a force into an electrical signal. This conversion is indirect and happens in two stages. Through a mechanical arrangement, the force being sensed deforms a strain gauge. The strain gauge measures the deformation (strain) as an electrical signal, because the strain changes the effective electrical resistance of the wire.
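The two-stage conversion can be sketched numerically. A minimal Python sketch, assuming illustrative values not taken from the text (a gauge factor of about 2, typical for metal-foil gauges, and a 350 Ω unstrained gauge):

```python
GF = 2.0    # gauge factor, typical for metal-foil strain gauges (assumed)
R0 = 350.0  # unstrained gauge resistance in ohms (assumed)

def resistance_under_strain(strain):
    """Gauge resistance for a given strain, using dR/R = GF * strain."""
    return R0 * (1.0 + GF * strain)

# 1000 microstrain (0.001) shifts the 350-ohm gauge by GF * R0 * strain = 0.7 ohm
print(round(resistance_under_strain(1e-3), 4))  # 350.7
```

In a real load cell this small resistance change is read out with a Wheatstone bridge rather than measured directly.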
6. List the various linear measuring instruments.
- Vernier Calipers
- Height Gauge
- Micrometer etc.
7. Define an error.
Error may be defined as the difference between the measured or indicated value and the true or actual value. No measurement can be made entirely free of error, i.e. 100% accurate measurements cannot be made at all times. Errors are classified in different ways: systematic errors, random errors and illegitimate errors.
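As a small illustration (the reading and true value below are made-up numbers), error follows directly from its definition:

```python
def absolute_error(measured, true_value):
    # Error = indicated (measured) value minus true value
    return measured - true_value

def relative_error(measured, true_value):
    # Error expressed as a fraction of the true value
    return (measured - true_value) / true_value

# A bar whose true length is 25.0 mm reads 25.2 mm on an instrument:
print(round(absolute_error(25.2, 25.0), 3))  # 0.2 mm
print(round(relative_error(25.2, 25.0), 3))  # 0.008, i.e. 0.8 %
```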
8. Define Standard with an example.
“Something that is set up & established by an authority as a rule of the measure of the quantity, weight, extent, value or quality” Ex: A meter is a standard established by an international organization for the measure of length.
9. Define measurements. Mention different methods of measurements.
Measurement is the process or act of quantitatively comparing an unknown magnitude with a predefined standard. For example, to measure the length of a bar we make use of a scale or steel rule (i.e. a standard). Since this comparison cannot be perfect, measurements inherently include error. There are two methods of measurement: 1) direct comparison with a primary or secondary standard, and 2) indirect comparison through the use of a calibrated system.
10. What is L.V.D.T? What is its application?
The linear variable differential transformer (LVDT) (also called simply a differential transformer) is a type of electromechanical transformer used to convert linear displacement into an electrical signal. Although the LVDT is a displacement sensor, many other physical quantities can be sensed by converting displacement to the desired quantity via a suitable mechanical arrangement.
11. Explain the principle of working of a L.V.D.T
The LVDT converts a position or linear displacement from a mechanical reference (zero, or null position) into a proportional electrical signal containing phase (for direction) and amplitude (for distance) information.
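A toy numerical model of this behaviour (the sensitivity value is assumed for illustration, not a real device specification):

```python
SENSITIVITY_V_PER_MM = 0.05  # assumed LVDT sensitivity, volts per mm

def lvdt_output(displacement_mm):
    """Return (amplitude in volts, phase in degrees) for a displacement
    from the null position: amplitude encodes distance, phase direction."""
    amplitude = SENSITIVITY_V_PER_MM * abs(displacement_mm)
    phase_deg = 0.0 if displacement_mm >= 0 else 180.0
    return amplitude, phase_deg

print(lvdt_output(2.0))   # (0.1, 0.0)   -> 2 mm on one side of null
print(lvdt_output(-2.0))  # (0.1, 180.0) -> 2 mm on the other side
```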
12. What is Precision?
Precision of an instrument indicates its ability to reproduce a certain reading with a given accuracy. It is the degree of agreement between repeated results.
13. Define sensitivity.
Sensitivity is the ratio of the magnitude of the output quantity (response) to the magnitude of the input quantity. Ex: a 1 mV recorder might have a 10 cm scale; its sensitivity would then be 10 cm/mV, assuming the response is linear across the whole scale.
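The recorder example works out as follows (same numbers as in the text):

```python
def sensitivity(output_change, input_change):
    # Static sensitivity = magnitude of output / magnitude of input
    return output_change / input_change

# A 1 mV input sweeps the full 10 cm scale:
print(sensitivity(10.0, 1.0))  # 10.0 cm/mV
```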
14. Define Linearity.
A measuring system is said to be a linear if the output is linearly proportional to the input.
15. Define Repeatability.
Repeatability is defined as the ability of a measuring system to reproduce output readings when the same input is applied to it consecutively, under the same conditions and in the same direction.
16. Define Hysteresis.
An instrument is said to exhibit hysteresis when there is a difference in readings depending on whether the value of the measured quantity is approached from higher value or from a lower value. Hysteresis is a phenomenon which depicts different output effects when loading and unloading.
17. Define Resolution or Discrimination.
Resolution is defined as the smallest increment of the input signal that a measuring system is capable of displaying; equivalently, it is the smallest change in the underlying physical quantity that produces a response in the measurement.
18. Define Accuracy.
Accuracy of an instrument indicates the deviation of the reading from a known input.
19. Define least count.
It is the smallest difference between two indications that can be detected on the instrument scale.
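For a vernier caliper, the least count follows from the scale divisions. A sketch with commonly quoted values (1 mm main-scale division, 50 vernier divisions; these are assumptions, not from the text):

```python
msd = 1.0        # smallest main-scale division, mm (assumed)
n_vernier = 50   # number of divisions on the vernier scale (assumed)

least_count = msd / n_vernier  # smallest difference readable on the scale
print(least_count)  # 0.02 mm
```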
20. Define Readability & Threshold.
Readability indicates the closeness with which the scale of the instrument may be read.
Ex: an instrument with a 30 cm scale has higher readability than one with a 15 cm scale. Threshold: if the instrument input is increased very gradually from zero, there will be some minimum value of input below which no output change can be detected; this minimum value defines the threshold of the instrument.
21. Define system response.
System response: The response of a system may be defined as its ability to transmit and present all the relevant information contained in the input signal and to exclude all others. If the output is faithful to the input, i.e. the output signal has the same phase relationship as the input signal, the system is said to have good system response. A lag or delay in the output signal, which may be due to the natural inertia of the system, is known as ‘measurement lag’. “Rise time” is defined as the time taken for the system to change from 5% to 95% of its final value; it is a measure of the speed of response of a measuring system, and a short rise time is desirable.
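For a simple first-order measuring system with time constant tau (an assumed model, chosen for illustration), the 5%-to-95% rise time can be computed from the step response y(t) = 1 − exp(−t/tau):

```python
import math

def rise_time(tau):
    """5%-to-95% rise time of a first-order system whose step
    response is y(t) = 1 - exp(-t / tau)."""
    t_05 = -tau * math.log(1 - 0.05)  # time to reach 5% of final value
    t_95 = -tau * math.log(1 - 0.95)  # time to reach 95% of final value
    return t_95 - t_05

# Analytically this equals tau * ln(0.95 / 0.05) = tau * ln(19):
print(round(rise_time(1.0), 3))  # 2.944
```

So for this model the rise time is about three time constants, which is why a small time constant gives a fast response.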
22. Define Discrepancy.
The difference between two indicated values or results determined from a supposedly fixed true value.
23. True value ( vt ) or Actual value ( va )
It is the actual magnitude of the input signal to a measuring system, which may be approximated but never truly determined.
24. Indicated value ( vi ) or Measured value ( vm )
The magnitude of the input signal indicated by a measuring instrument is known as the indicated value.
25. Define measure.
It means, to determine the dimension, quantity or capacity of something.
26. Define result.
It is obtained by making all known corrections to the indicated value.
27. Give the relationship among the different types of pressures and their definitions.
Pressure is the force per unit area.
Atmospheric pressure: It is the pressure exerted by the earth’s atmosphere on an object and is usually measured by a barometer. At sea level its value is close to 1.013 × 10⁵ N/m² absolute, and it decreases with altitude.
Gauge pressure: It is the system pressure measured with a pressure gauge; it represents the difference between the absolute pressure and the local atmospheric pressure.
Vacuum: It is an absolute pressure less than the atmospheric pressure, i.e. a negative gauge pressure.
Static and dynamic pressures: If a fluid is in equilibrium, the pressure at a point is identical in all directions and independent of orientation; this is referred to as static pressure. Under dynamic conditions there exists a pressure gradient within the system, and to restore equilibrium the fluid flows from regions of higher pressure to regions of lower pressure.
Absolute pressure: It is the pressure measured with reference to zero pressure, i.e. a perfect vacuum. It is the sum of the gauge pressure and the atmospheric pressure. Hence, Absolute pressure = Gauge pressure + Atmospheric pressure.
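The relationship between the pressures can be checked numerically (the gauge-pressure values below are made up for illustration):

```python
ATM = 1.013e5  # standard atmospheric pressure in N/m^2, from the text

def absolute_pressure(gauge):
    # Absolute pressure = gauge pressure + atmospheric pressure;
    # a vacuum shows up as a negative gauge pressure.
    return gauge + ATM

print(absolute_pressure(2.0e5))   # 301300.0 N/m^2 absolute
print(absolute_pressure(-0.3e5))  # 71300.0 N/m^2 (partial vacuum)
```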
28. How do you define yard?
The yard is defined as the distance between the two central transverse lines on the gold plugs of the bar when the temperature of the bar is at 62 ºF (Imperial Standard Yard).
29. What is thermocouple? Where are they used?
If two dissimilar metals are joined, an emf exists across the junction which is a function of several factors, including temperature. When junctions of this type are used to measure temperature, they are called thermocouples.
30. What are slip gauges?
Slip gauges are very accurately ground blocks of hardened steel used to measure a gap with close accuracy; they are used mainly in tool-making and inspection.
31. What is Tolerance?
It is the difference between the upper limit and the lower limit of a dimension. It is impossible to make anything to an exact size, therefore it is essential to allow a definite tolerance. It is also the maximum permissible variation on every specified dimension.
32. What are Limits?
The maximum and minimum permissible sizes within which the actual size of a Component lies are called Limits.
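Limits and tolerance can be shown for a hypothetical dimension, say 25.00 mm with +0.02/−0.01 mm deviations (values assumed for illustration):

```python
basic_size = 25.00               # mm (assumed example dimension)
upper_limit = basic_size + 0.02  # maximum permissible size
lower_limit = basic_size - 0.01  # minimum permissible size

# Tolerance = upper limit - lower limit
tolerance = upper_limit - lower_limit
print(round(tolerance, 2))  # 0.03 mm
```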
33. Define fits.
The relationship existing between two parts, shaft and hole, which are to be assembled, with respect to the difference in their sizes is called fit.
34. What is Range?
Range represents the highest possible value that can be measured by an instrument; alternatively, range is the difference between the largest and the smallest results of measurement.
35. What is loading effect?
Loading effect: The presence of a measuring instrument in a medium to be measured will always lead to extraction of some energy from the medium, thus making perfect measurements theoretically impossible. This effect is known as the ‘loading effect’, and it must be kept as small as possible for better measurements. For example, in electrical measuring systems the detector stage receives energy from the signal source, while the intermediate modifying devices and output indicators receive energy from an auxiliary source. The loading effects are due to the impedances of the various elements connected in a system.
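A concrete electrical example of the loading effect (component values assumed): a voltmeter’s finite input resistance forms a divider with the source resistance, so the indicated voltage is always slightly below the true open-circuit value.

```python
def indicated_voltage(v_true, r_source, r_meter):
    # Voltage-divider model of meter loading:
    # the meter reads v_true * Rm / (Rm + Rs), not v_true itself.
    return v_true * r_meter / (r_meter + r_source)

# True 10 V source, 1 kOhm source resistance, 1 MOhm meter input:
print(round(indicated_voltage(10.0, 1e3, 1e6), 3))  # 9.99
```

A higher meter input resistance makes the reading approach the true value, which is why small loading effect calls for high input impedance.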
36. What is comparator?
Comparator is a precision instrument used for comparing dimensions of a part under test with the working standards. It is an indirect type of instrument and used for linear measurement. If the dimension is less or greater than the standard, then the difference will be shown on the dial. It gives only the difference between actual and standard dimension of the work piece.
37. Name the different types of comparator?
Mechanical Comparator, Pneumatic Comparator, Optical Comparator, Electrical Comparator, Electronic Comparator and Combined Comparator (ex: mechanical –optical comparator).
38. What are advantages and disadvantages of mechanical comparator?
Advantages of Mechanical Comparator:
- They do not require any external source of energy.
- These are cheaper and portable.
- These are of robust construction and compact design.
- The simple linear scales are easy to read.
- These are unaffected by variations due to external sources of energy such as air, electricity etc.
Disadvantages of Mechanical Comparator:
- Range is limited as the pointer moves over a fixed scale.
- Pointer scale system used can cause parallax error.
- There are number of moving parts which create problems due to friction, and ultimately the accuracy is less.
- The instrument may become sensitive to vibration due to high inertia.
39. What is a sine bar?
A sine bar is a highly precise and accurate angle-measuring instrument. It is used for measuring the angle of a given job or for setting an angle. Sine bars are hardened, precision-ground tools for accurate angle setting. A sine bar can be used in conjunction with a set of slip gauges and a dial gauge for the measurement of angles and tapers from a horizontal surface.
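Setting an angle with a sine bar reduces to sin(θ) = h/L, where h is the slip-gauge stack height under one roller and L the distance between the roller centres. A quick check with assumed values:

```python
import math

def sine_bar_angle(h_mm, centre_distance_mm):
    # sin(theta) = slip-gauge height / roller centre distance
    return math.degrees(math.asin(h_mm / centre_distance_mm))

# A 100 mm sine bar raised on a 50 mm slip-gauge stack:
print(round(sine_bar_angle(50.0, 100.0), 1))  # 30.0 degrees
```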
40. What is a sine center?
These are used in situations where it is difficult to mount the component on a sine bar, and are basically used for conical work pieces. The sine centre is an extension of the sine bar in which two ends are provided on which centres can be clamped. These are useful for testing conical work centred at each end, up to 60°.