Q1. Distinguish between measurement and testing.

A1. Measurement is the collection of data: the values of variables such as resistance, voltage and current found at certain points in a circuit or system. These values are obtained using the relevant measurement instruments and techniques.

Testing is the interpretation of the values gained during the measurement phase described above, to determine whether or not the circuit or system under test is operating correctly or at maximum efficiency.

Q2. Explain the meaning of resolution and indicate what determines it for an analogue and for a digital measuring device.

A2. Resolution is the magnitude of signal equivalent to one count of the least significant digit of the display; in other words, the smallest change in signal that it is possible to measure. For a digital device the resolution is determined by the number of digits after the decimal point on the range in use. [...]
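The definition of resolution in A2 can be illustrated numerically. This is a minimal sketch with assumed example values (a 3½-digit display showing up to 1999 counts), not the specification of any real instrument:

```python
# Sketch: resolution of a digital voltmeter display (example values assumed).
# One count of the least significant digit corresponds to the selected
# range divided by the number of counts the display can show.

def resolution(full_scale, counts):
    """Smallest signal change equivalent to one count of the display."""
    return full_scale / counts

# A 3.5-digit meter displays up to 1999 counts (~2000):
print(resolution(2.0, 2000))   # 2 V range  -> 0.001 V (1 mV)
print(resolution(20.0, 2000))  # 20 V range -> 0.01 V (10 mV)
```

Note that the resolution changes with the range selected, which is why the definition in A2 specifies "on a particular range".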
1V). The accuracy figure is required because no instrument is ever perfectly accurate.

Typical accuracy for a digital device is +/- 0.1V + 2 LSD (least significant digits).
Typical accuracy for an analogue device is +/- 2.5% FSD (full scale deflection).
Accuracy is usually quoted at a specific temperature or over a range of temperatures.

The quality of a device is often taken to mean its sensitivity: the more sensitive the device, the less current is required to produce full scale deflection. A perfect voltmeter would require no current to operate it; a good voltmeter is one that requires very little operating current. Sensitivity is usually quoted in ohms per volt and is calculated as follows:

Sensitivity = Total Resistance of Device / Voltage Required to Give FSD

Q4. Explain the process by which a technician should know the accuracy of the device that they are intending to use.

A4. To discover how accurate a device should be, the technician should consult the instrument specification sheet produced by the manufacturer, which states the accuracy of the device under ideal conditions. The technician should then check the calibration history of the instrument to see when it was last calibrated and whether its calibration is still in date. Some test procedures may also mandate a set-up procedure which includes calibration of the test equipment.

Q5. Explain the relevance of the input impedance of a device and quote typical values for digital and analogue devices.

A5. Input impedance is the resistance measured across the input terminals of a device. A basic requirement of any test set is that it must have a minimum effect on the circuit or system under test. For example, a voltmeter must have a high input impedance so that very little current flows through the meter, with the majority of the current flowing within the circuit or system under test.
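The sensitivity formula above can be sketched directly. The meter values here are assumed for illustration; note that dividing by the full-scale voltage recovers the familiar 20,000 ohms-per-volt figure for a typical analogue meter:

```python
# Sketch of the sensitivity formula in the text (example values assumed).
# Sensitivity (ohms per volt) = total resistance of meter / voltage for FSD.

def sensitivity_ohms_per_volt(total_resistance, fsd_voltage):
    """Sensitivity figure as quoted on analogue meter specifications."""
    return total_resistance / fsd_voltage

# e.g. an analogue meter presenting 200 kOhm total resistance on its 10 V range:
print(sensitivity_ohms_per_volt(200_000, 10))  # -> 20000.0 ohms per volt
```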
Conversely, ammeters must have a low input impedance because they are connected in series with the circuit or system under test, and we do not want a large voltage drop across them.

Typical input impedance for a digital device is 10,000,000 ohms (10 MOhm), fixed regardless of the range selected.
Typical input impedance for an analogue device is 20,000 ohms per volt, so it depends on the range selected.
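The loading effect described in A5 can be shown numerically. In this sketch all circuit values are assumed: the meter's input impedance appears in parallel with the resistor being measured, so the lower-impedance analogue meter pulls the reading well below the true value:

```python
# Sketch: voltmeter loading error (all circuit values here are assumed).
# Measuring the voltage across R2 in a divider: the meter's input
# impedance Rm appears in parallel with R2 and lowers the reading.

def parallel(ra, rb):
    """Equivalent resistance of two resistors in parallel."""
    return ra * rb / (ra + rb)

def reading(vs, r1, r2, r_meter):
    """Voltage indicated across R2 with the meter connected."""
    r2_loaded = parallel(r2, r_meter)
    return vs * r2_loaded / (r1 + r2_loaded)

VS, R1, R2 = 10.0, 100e3, 100e3          # true voltage across R2 is 5 V
print(reading(VS, R1, R2, 10e6))          # digital, 10 MOhm fixed: ~4.98 V
print(reading(VS, R1, R2, 20e3 * 10))     # analogue, 20 kOhm/V on the
                                          # 10 V range = 200 kOhm: 4.0 V
```

The digital meter's high, fixed input impedance disturbs the circuit far less, which is the practical point of the typical values quoted above.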