To support Systron Donner Inertial's industry-leading position in cost-effective Inertial Sensing Systems and Products, this comprehensive Technical Terms Glossary is provided. Select a term to find a concise definition and the parameters to consider as you seek new frontiers in inertial sensing design or answers to your current requirements.
- Full Range
- Full Range Output
- Scale Factor
- Bias (also called Drift)
- Scale Factor Temperature Sensitivity
- Bias Temperature Stability
- Bias Stability (Drift)
- Output Noise
- Measurement Noise
- Bandwidth
- Linearity
- Resolution
- Threshold
- Hysteresis
- Linear Acceleration (g) Sensitivity (also called G Sensitive Bias)
- Start-up Time
- Vibration - Operating
- Vibration - Survival
- Shock - Survival
- Thermal Modeling
- Factory Setting (also called Initial Offset)
- Factory Calibration
Full Range: The specified maximum input rate (°/sec) over which full performance will be provided.
Full Range Output: The nominal voltage output for the specified full range input. The actual voltage output is a complex combination of several parameters, including scale factor, bias, linearity and temperature characteristics.
Scale Factor: The coefficient that translates input angular rate to DC voltage in terms of linear scaling. A specified Scale Factor of 15 mV/°/sec indicates that 1.50 VDC of output corresponds to a rotational rate of 100°/sec.
Bias (Drift): The device output when the input rate is zero, as measured at a reference temperature (typically +22°C), excluding outputs due to hysteresis, acceleration or noise. It is measured in VDC and converted to °/sec by dividing by the Scale Factor in VDC/°/sec.
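As a minimal illustration of the two definitions above, the following Python sketch converts an output voltage to angular rate using the 15 mV/°/sec scale factor from the example and an invented bias value; neither number is a specification for any particular device.

```python
# Sketch: converting a rate-gyro voltage reading to angular rate.
# The 15 mV/°/sec scale factor and the bias value are illustrative
# numbers only, not a specific device specification.

SCALE_FACTOR = 0.015   # VDC per °/sec (15 mV/°/sec)
BIAS_VDC = 0.0045      # zero-rate output in VDC, measured at +22 °C

def rate_from_voltage(v_out: float) -> float:
    """Return angular rate in °/sec given the instrument output in VDC."""
    return (v_out - BIAS_VDC) / SCALE_FACTOR

def bias_in_deg_per_sec() -> float:
    """Express the bias in °/sec by dividing by the scale factor."""
    return BIAS_VDC / SCALE_FACTOR

print(rate_from_voltage(1.5045))  # 1.5045 V minus bias -> 100.0 °/sec
print(bias_in_deg_per_sec())      # 0.3 °/sec equivalent bias
```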
Scale Factor Temperature Sensitivity: The bounds within which the Scale Factor will lie as the temperature varies across the operating temperature range, usually referenced to a room-temperature calibration value (typically +22°C). It can also be defined as a temperature coefficient, usually the best-fit straight line of Scale Factor change over the operating temperature range.
Bias Temperature Stability: The bounds within which the Rate Bias may vary as the temperature varies across the operating temperature range; included in the "Over Operating Environments" specification.
Bias Stability (Drift): The bounds within which the Rate Bias may vary over a specified period of time: typically 100 seconds at fixed conditions, including constant temperature (short term), or over 1 year (long term), excluding outputs due to self-generated noise. The gyro has little if any real bias drift, though thermal stabilization at turn-on and ambient temperature changes may appear in the rate output as "drift" (see Measurement Noise).
Output Noise: Self-generated electrical noise is defined by its power spectral density (PSD), given in volts squared per Hz over a given frequency range. This defines the distribution of noise power in the output of the instrument on a per-Hz basis. For example, computing the area under the output PSD curve within any 1 Hz window and taking the square root of that area yields a value in volts. Dividing this voltage by the Scale Factor converts it to an equivalent rate reading. This is the output noise of the instrument, measured in °/sec/√Hz.
To convert this measurement into an RMS equivalent over a given bandwidth, calculate the square root of the bandwidth and multiply it by the noise specification. (For example, if the bandwidth is 50 Hz and the specification is 0.02 °/sec/√Hz, the maximum RMS noise of the device will be √50 (7.07) × 0.02, or 0.1414 °/sec RMS.)
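The conversion described above can be sketched in a few lines of Python; the 0.02 °/sec/√Hz density and 50 Hz bandwidth are simply the example numbers from the text.

```python
import math

def rms_noise(noise_density: float, bandwidth_hz: float) -> float:
    """RMS rate noise (°/sec) = noise density (°/sec/√Hz) x √bandwidth."""
    return noise_density * math.sqrt(bandwidth_hz)

# Example from the text: 0.02 °/sec/√Hz over a 50 Hz bandwidth.
print(rms_noise(0.02, 50.0))  # ≈ 0.1414 °/sec RMS
```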
Measurement Noise: The total noise, appearing as an uncertainty of value, contained in a voltage measurement due to any cause, such as instrument noise, ambient structural vibration, slip-ring noise, power supply drifts, etc. Such noise limits the speed with which accurate readings of the output may be made, since the noise causes a statistical spread in the data taken, depending on the bandwidth of the measurement equipment and the averaging interval.
Bandwidth: The input signal frequency range from DC (zero frequency) up to the frequency at which a -90 degree phase shift (between mechanical input and rate output) is observed. This phase shift is determined by filters within the device.
Linearity: Defines the upper and lower limits within which the output signal may deviate from the best-fit straight line (BFSL) drawn through the data, expressed as a percent (%) of the angular rate full range.
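As a hypothetical illustration of this definition, the sketch below fits a best-fit straight line to rate-table data with ordinary least squares and reports the worst deviation as a percent of full range. All data values are invented for illustration; a real test would use measured calibration data.

```python
# Sketch: linearity error as percent of full range relative to the
# best-fit straight line (BFSL). Sample data are invented, roughly
# following a 15 mV/°/sec device over a ±100 °/sec full range.

def linearity_pct_fr(rates, volts, full_range):
    n = len(rates)
    # Ordinary least-squares fit of the line v = a*rate + b
    sx = sum(rates); sy = sum(volts)
    sxx = sum(r * r for r in rates)
    sxy = sum(r * v for r, v in zip(rates, volts))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    # Worst deviation from the BFSL, converted to rate via the slope
    max_dev = max(abs(v - (a * r + b)) for r, v in zip(rates, volts))
    return 100.0 * (max_dev / a) / full_range

rates = [-100, -50, 0, 50, 100]            # °/sec
volts = [-1.5, -0.748, 0.001, 0.752, 1.5]  # VDC, ~15 mV/°/sec nominal
print(linearity_pct_fr(rates, volts, 100.0))  # ≈ 0.067 (% of full range)
```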
Resolution: The minimum change in input rate that can produce at least 50% of the expected ideal change in the output signal. Resolution is defined at any input rate within the full range and is measured in °/sec (see Threshold for comparison).
Threshold: The minimum input signal, starting from zero rate, required to produce at least 50% of the expected ideal output as defined by the value of the Scale Factor. Threshold is defined only at zero rate input and is measured in °/sec (see Resolution for comparison).
Hysteresis: The maximum difference, at a fixed temperature, between two bias readings taken during a full-range excursion of input rates: the first measured at zero input rate just after returning from negative full range, and the second measured at zero input rate just after returning from positive full range. The Quartz Rate Sensor (QRS14) has no significant hysteresis, so this parameter is not specified for it.
Linear Acceleration (g) Sensitivity: The maximum change in rate output, in °/sec/g, due to linear acceleration applied in any direction.
Start-up Time: The time required for the instrument to produce a usable rate output after power is applied.
Vibration - Operating: The specified limit of random vibration, in g-RMS, usually across a flat 20 Hz - 2,000 Hz input spectrum, that the device will withstand while operating within its performance specifications.
Vibration - Survival: The specified limit of random vibration, in g-RMS, usually within a flat 20 Hz - 2,000 Hz spectrum, that the device will survive for a limited period of time without damage.
Shock - Survival: The limit of shock that the device will withstand without damage. Typically, the QRS14 can withstand shocks with a peak value of 200 g lasting 2 ms without damage.
Thermal Modeling: The technique of using a mathematical model of Scale Factor and/or Rate Bias vs. temperature to correct the output data and minimize temperature-induced errors. Such a model may use power-series equations or multiple-point data with interpolation between temperatures, for example.
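A minimal sketch of the power-series approach described above, in Python: a polynomial model of bias vs. temperature is evaluated and its prediction subtracted from the rate reading. The coefficients are invented for illustration; real values come from factory temperature characterization of the device.

```python
# Sketch of thermal-model compensation using a power-series (polynomial)
# fit of bias vs. temperature. Coefficients below are invented for
# illustration, not from any real device's characterization data.

BIAS_COEFFS = [0.30, -2.0e-3, 5.0e-6]  # °/sec: c0 + c1*dT + c2*dT^2
T_REF = 22.0  # °C reference temperature for the model

def predicted_bias(temp_c: float) -> float:
    """Evaluate the bias model at the given temperature (Horner form)."""
    dt = temp_c - T_REF
    result = 0.0
    for c in reversed(BIAS_COEFFS):
        result = result * dt + c
    return result

def compensate(rate_meas: float, temp_c: float) -> float:
    """Subtract the modeled temperature-induced bias from a rate reading."""
    return rate_meas - predicted_bias(temp_c)

print(predicted_bias(22.0))    # 0.30 °/sec at the reference temperature
print(compensate(10.3, 22.0))  # ≈ 10.0 °/sec after compensation
```

A multiple-point lookup table with interpolation between characterized temperatures is the other common choice mentioned in the definition; the polynomial form simply trades table storage for a few multiply-adds.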
Factory Setting (Initial Offset): The maximum value, or tolerance of value, of a factory-adjusted parameter, usually at a specified temperature (typically +22°C).
Factory Calibration: The accuracy of the measurement of a parameter when calibrated at the factory, usually at a specified reference temperature (typically +22°C).