
CLIN. CHEM. 21/10, 1396-1405 (1975)

1396 CLINICAL CHEMISTRY, Vol. 21, No. 10, 1975

Trend Detection in Control Data: Optimization and Interpretation of Trigg's Technique for Trend Analysis

George S. Cembrowski, James O. Westgard,1 Arthur A. Eggert, and E. Clifford Toren, Jr.

A method for trend detection, Trigg's technique [Oper. Res. Q. 15, 271 (1964)], has been investigated for use in monitoring trends in control data produced by multi-test continuous-flow analyzers. Simulated trend data were used to optimize the method. Actual control data were analyzed retrospectively to determine the frequency of trends and the accuracy of several parameters obtained from Trigg's method. The prospective use of the technique has successfully uncovered important trends. Criteria for the interpretation of the Trigg's trend data are suggested and an algorithm for the computer implementation of Trigg's calculations is included.

Additional Keyphrases: statistics • Levey-Jennings plot • cusum values • continuous-flow analysis • quality control • computers

Levey-Jennings control charts (graphs of successive control values) have been used extensively to monitor analytical variation in the clinical laboratory (1). A relatively large amount of information can be presented for simple visual interpretation. Unfortunately, much effort must be expended to graph and analyze the control data produced by multichannel analyzers, which assay several different controls many times per day. In addition, Levey-Jennings control charts do not permit simple detection of nonrandom trends. The cumulative sum (cusum) (2) technique has been used for trend detection, but assessment of "out-of-control" situations is still somewhat intuitive (3). Application of cusum to multichannel systems requires even greater effort than the Levey-Jennings charts. Although this technique has been implemented on computers, its application is not widespread (4).

Departments of Medicine, Pathology, and the Clinical Laboratories, University of Wisconsin Center for Health Sciences, 1300 University Ave., Madison, Wis. 53706.
1 Author to whom correspondence should be addressed.
Presented in part at the 26th National Meeting of the AACC, Las Vegas, Nevada, August 18-23, 1974.
Received April 7, 1975; accepted May 17, 1975.

There is a simple method for trend analysis, known as Trigg's technique (5), which is applicable to real-time analysis of trends. Trigg's method is a modification of a method introduced by Brown (6), and was initially used in a computer-based forecasting and stock-control system (5). Hope et al. (7) used Trigg's method for monitoring critical physiological variables of intensive-care patients. A parameter called "Trigg's tracking signal," a measure of control within a process, is calculated from the values of successive control measurements. For specified confidence levels, upper and lower control limits can be established for the tracking signal, which, if exceeded, define an "out-of-control" situation. Interpretation of data from multichannel instruments is simplified, because one set of limits will apply to all tests. Current control averages or standard deviations can be accurately estimated from intermediate calculations. The changing of a single parameter can adjust the time period over which trends can be analyzed and detected.

Here, we describe a study of the applicability of Trigg's method of trend analysis to the control of analytical measurements in the clinical chemistry laboratory. We used computer-simulated control data to investigate the sensitivity of Trigg's method to changes in the analytical process and to optimize the technique for application to multichannel instruments. Once optimized, the technique was applied to control data produced by the Technicon SMA 12/60 and the Technicon SMAC systems.

Method and Materials

Trigg's Analysis

A brief introduction to the technique is presented here. A more complete description as well as an implementation guide are included in the Appendix. The technique assumes that a prediction (forecast) of the next control observation can be made with an exponentially weighted average (exponential smoothing). This prediction or "smoothed average" is given by:

Smoothed average = a(latest measurement) + (1 - a)(previous smoothed average) (1)

The smoothing constant, a, which can have values between 0 and 1, determines the number of control observations included in the exponential smoothing. Exponential smoothing is analogous to the calculation of a moving average (6). Selected values of a and corresponding numbers of observations for equivalent moving averages are shown in Table A-I.

The difference between the actual value and the predicted value is called the "forecast error," which is also averaged by exponential smoothing to give a "smoothed forecast error":

Smoothed forecast error = a(latest error) + (1 - a)(previous smoothed forecast error) (2)

When a process is under control, the smoothed forecast error will approximate zero. A gradual increase or decrease in the control measurements will cause the forecasts to lag the measured values and result in the smoothed forecast error differing significantly from zero. The Trigg's tracking signal is given by:

Tracking signal = (smoothed forecast error)/MAD (3)

If a process is under control, the tracking signal will fluctuate about zero. A significant change in the process will cause the signal to approach one of its limits, ±1, a positive tracking signal indicating an increase in process measurements, a negative one indicating a decrease. The magnitude of the tracking signal can be related to the probability that a process is out of control. Tables A-II and A-III show, for various values of a, absolute limits of the tracking signal together with the probabilities that these limits are not normally exceeded. For example, from Table A-II and a = 0.1 at the 98% confidence level, the control limits for the tracking signal are ±0.49. There would be only two chances in 100 that a tracking signal with a magnitude exceeding 0.49 would occur simply because of the random variability in the analytical process.

Trigg's method can provide estimates of averages and standard deviations. The exponentially smoothed average is an accurate estimate of the current average. The MAD is proportional to the standard deviation and may be used to estimate the current standard deviation. The term "smoothed standard deviation" is used here to represent the standard deviation estimate as calculated from the MAD.
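Equations 1-3 define a simple recursion that can be carried from one control value to the next. The following Python class is a minimal sketch of that recursion (the paper's own programs were FORTRAN; the class and variable names here are ours, not the paper's):

```python
# Sketch of Trigg's tracking-signal recursion (eqs. 1-3 and the MAD of
# eq. 4). Names are illustrative; the initial MAD must be positive.

class TriggMonitor:
    def __init__(self, alpha, initial_average, initial_mad):
        self.alpha = alpha               # smoothing constant a (0 < a < 1)
        self.average = initial_average   # exponentially smoothed average
        self.smoothed_error = 0.0        # smoothed forecast error
        self.mad = initial_mad           # mean absolute deviation

    def update(self, value):
        """Fold one new control value in; return the tracking signal."""
        a = self.alpha
        error = value - self.average                     # forecast error
        self.smoothed_error = a * error + (1 - a) * self.smoothed_error
        self.mad = a * abs(error) + (1 - a) * self.mad
        self.average = a * value + (1 - a) * self.average
        return self.smoothed_error / self.mad            # tracking signal
```

Feeding in-control values keeps the returned signal near zero; a sustained drift drives it toward ±1, at which point the smoothed average and the MAD-based standard deviation estimate can be inspected.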

SMA 12/60 and SMAC

Assays run on the SMA 12/60 (Technicon Instruments Corp., Tarrytown, N. Y. 10591) were for calcium, phosphorus, glucose, BUN, uric acid, cholesterol, total protein, albumin, total bilirubin, AST, LD, and ALP. Instrument protocol required standardization every 10 cups, and analysis of four different lyophilized control pools, one three times per plate of 40 (Hyland I; Hyland, Costa Mesa, Calif. 92626), one once per plate (Versatol Automated High; General Diagnostics, Morris Plains, N. J. 07950), and two others once per day.

These same tests were studied on the SMAC (Technicon). The instrument was standardized every 24 cups, a control being analyzed every 8 cups. Three controls, Normal (Dade, Miami, Fla. 33152), Scale I, or Scale II (Technicon), were analyzed at equal frequency.

Computations and simulations. FORTRAN programs (listing available on request) were written to analyze control data and print the tracking signal, the smoothed average, and the standard deviation (as calculated from MAD). A variation of one of the programs was used to analyze simulated control data, where several series of 200 randomly distributed numbers (average = 100, standard deviation = 1) were generated with a random normal-number subroutine. Impulses (single out-of-control values), step changes (abrupt shifts), and slope changes (trends) were introduced into these data to study the sensitivity of the Trigg's parameters. Impulse changes from 1 to 20 units were obtained by addition to the 101st control value. The step changes were introduced by adding constants from 1 to 10 to the last
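The simulated series described above can be sketched as follows; the function name, seed, and default change size are illustrative choices of ours, not taken from the paper's FORTRAN programs:

```python
import random

# Sketch of the simulated control data: a series of 200 normally
# distributed values (mean 100, SD 1) with an impulse, step, or slope
# change introduced, as described in the text.

def simulate(kind="step", size=3.0, n=200, seed=1):
    rng = random.Random(seed)
    data = [rng.gauss(100.0, 1.0) for _ in range(n)]
    if kind == "impulse":                       # single out-of-control value,
        data[100] += size                       # added to the 101st control
    elif kind == "step":                        # abrupt shift, second half
        data = data[:100] + [x + size for x in data[100:]]
    elif kind == "slope":                       # gradual trend over the series
        data = [x + size * i / n for i, x in enumerate(data)]
    return data
```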

MAD is the mean absolute deviation,3 a measure of the amount of noise (random variation) in a process, and is defined as the exponentially weighted average of the absolute forecast errors:

MAD = a|latest error| + (1 - a)(previous MAD) (4)

For a process whose control measurements are normally distributed with a standard deviation, σ, the MAD and σ are simply related:

MAD = (2/π)^(1/2) σ (5)

2 In forecasting and prediction, a is the accepted symbol for the smoothing constant. It denotes only the smoothing constant in this paper and should not be confused with significance level.
3 Nonstandard abbreviations used: BUN, blood (actually, serum) urea nitrogen; AST, aspartate aminotransferase (glutamic-oxaloacetic transaminase; L-aspartate:2-oxoglutarate aminotransferase; EC 2.6.1.1); LD, lactate dehydrogenase (L-lactate:NAD oxidoreductase; EC 1.1.1.27); ALP, alkaline phosphatase (orthophosphoric acid monoester phosphohydrolase; EC 3.1.3.1); QC, quality control; and MAD, mean absolute deviation.

[Figure 2. Response curves for simulated a, step and b, slope changes for different smoothing constants (a = 0.005 to 0.100). The number of controls necessary to detect the change is graphed against the size of the change (magnitude of step change; slope, increase/100 controls).]

[Figure 3. Response curve for a simulated slope change for different confidence limits; axes: slope (increase/100 controls) vs. number of controls necessary to detect the change.]

the sensitivity to slope changes for one simulation when 95, 98, and 99% control limits are used; the differences are small.

The smoothed average and smoothed standard deviation for the simulated step change of Figure 1b are illustrated in Figure 4a. The average gradually climbs from 100 to approximately 103. There is an accompanying increase in standard deviation. Similarly, Figure 4b shows the smoothed average and smoothed standard deviation for the slope data of Figure 1c.


Retrospective Analysis of Instrument Control Data

Accuracy of the smoothed average and standard deviation. To evaluate the accuracy of exponential smoothing, we compared the exponentially smoothed average and standard deviation of one month's SMAC data for the Normal control to the arithmetic average and standard deviation (Table 1). The average difference in means is 0.17%; the largest difference is 0.49% (albumin). The average difference in standard deviations is 10.7%, the largest 27.4% (phosphorus).
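A comparison of this kind is easy to reproduce on synthetic data. The sketch below (illustrative names and simulated values, not the paper's SMAC data) smooths a series with a = 0.00667 and compares the result against the arithmetic statistics; the standard-deviation estimate uses the MAD-to-σ conversion of eq. 5:

```python
import math
import random

# Sketch: exponentially smoothed average and MAD-based SD estimate
# versus arithmetic statistics on a simulated one-month control series.
# Function name and data are ours, not the paper's.

def smoothed_stats(values, alpha):
    avg = values[0]
    mad = 0.0
    for x in values[1:]:
        err = x - avg                               # forecast error
        mad = alpha * abs(err) + (1 - alpha) * mad  # smoothed |error|
        avg = alpha * x + (1 - alpha) * avg         # smoothed average
    # convert MAD of forecast errors to a standard-deviation estimate
    sd = math.sqrt(math.pi / 2) * math.sqrt((2 - alpha) / 2) * mad
    return avg, sd

rng = random.Random(7)
data = [rng.gauss(108.7, 4.1) for _ in range(320)]   # e.g., glucose-like
exp_avg, exp_sd = smoothed_stats(data, 0.00667)      # ~300-point smoothing
ari_avg = sum(data) / len(data)
```

As in Table 1, the smoothed and arithmetic means agree closely, while the MAD-derived standard deviation differs from the arithmetic one by roughly ten percent.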

[Figure 4. The exponentially smoothed average and standard deviation of the data in Figure 1b and Figure 1c. a, step change; b, slope change.]

Trend analysis. A two-months' collection of SMA 12/60 control data was analyzed retrospectively to determine the frequency of trends. Eight hundred Hyland I control results from 48 successive days were used to calculate Trigg's tracking signal. The number of times the signal exceeded the 99% control limits was tallied. The same treatment was applied to 191 Versatol Automated High control results that were obtained in the same period.

Table 1. Comparison of Average and Standard Deviation Obtained by (a) Exponential Smoothing, and (b) Arithmetic Averaging of One Month's SMAC Normal Control Data(a)

                          Exponential smoothing    Arithmetic averaging
Test                      Av        SD             Av        SD           n
Glucose, mg/dl            108.69    3.702          108.67    4.128        317
BUN, mg/dl                20.08     0.381          20.10     0.396        320
Calcium, mg/dl            8.50      0.160          8.48      0.203        315
Phosphorus, mg/dl         4.01      0.045          4.00      0.062        320
Uric acid, mg/dl          6.51      0.121          6.51      0.144        315
Cholesterol, mg/dl        141.19    5.060          140.68    4.762        313
Total protein, g/dl       6.54      0.102          6.53      0.101        315
Albumin, g/dl             4.08      0.101          4.06      0.097        319
Total bilirubin, mg/dl    2.40      0.053          2.40      0.058        317
ALP, U/dl                 8.92      0.473          8.92      0.388        295
AST, U/liter              53.40     3.399          53.48     3.461        309
LD, U/liter               145.63    5.065          145.22    5.387        312

(a) A smoothing constant of a = 0.00667 was used (corresponds to 300 data points in the smoothed average).

Table 3. Frequency of the Different Types of Within-day Trends (P < 0.01, a = 0.10) for Selected Tests of the SMA 12/60(a)


To detect within-day trends, the initial parameters were reset daily: the smoothed error to zero, the smoothed average to the monthly QC average, and the MAD to the MAD corresponding to the monthly QC standard deviation. The detected within-day trends for Hyland I are summarized in Table 2, which shows the upward, downward, and the total trends for smoothing constants of 0.10 and 0.05. There is little difference between these two smoothing constants in trend detection. The Versatol control serum, run less frequently, showed similar but substantially fewer trends.

The trends observed with the Hyland control were categorized into three groups: those caused by a relatively large deviation from the mean by an isolated control (an impulse rather than a trend, but detected as such), those caused by control values primarily on one side of the mean but not exceeding the ±2 standard deviation limits for reference sample quality control, and those trends caused by or involving controls that were consistently above or below the mean, with at least one value exceeding the ±2 standard deviation control limits. The tests of Table 2 that produced the most trends have their trends classified in Table 3. The number of trends for which the estimates of the standard deviations (from MAD) exceeded the corresponding QC standard deviations are shown in parentheses.

Longer-term trends may be monitored by initializing Trigg's parameters less often and using smaller values of a. The number of weekly and monthly trends observed in the 48 days of Hyland control values are shown in Tables 4 and 5. Smoothing constants of 0.02 to 0.05 were appropriate for monitoring weekly periods and 0.005 to 0.010 for monthly periods, given our average number of controls per week (100). A control chart containing the first 16 days' accumulation of glucose control results for the Hyland serum is shown in Figure 5. Small systematic differ-
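The daily re-initialization described above amounts to three assignments. A minimal sketch, assuming eq. 5 is used to convert the monthly QC standard deviation into a starting MAD (the function name is ours):

```python
import math

# Sketch of the daily parameter reset: smoothed error to zero, smoothed
# average to the monthly QC average, and MAD to the MAD corresponding
# to the monthly QC standard deviation (eq. 5: MAD = (2/pi)^(1/2) * SD).

def daily_reset(monthly_average, monthly_sd):
    """Return (average, smoothed_error, mad) for the start of a new day."""
    smoothed_error = 0.0
    mad = math.sqrt(2 / math.pi) * monthly_sd
    return monthly_average, smoothed_error, mad
```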

Table 2. Frequency of Within-day Trends (P < 0.01) in SMA 12/60 Control Data for Two Months, as Detected by Trigg's Analysis

                  a = 0.10                     a = 0.05
Test              Upward  Downward  Total      Upward  Downward  Total
Calcium           5       1         6          5       1         6
PO4               13      5         18         16      3         19
Glucose           10      18        28         7       15        22
BUN               6       15        21         11      16        27
Uric acid(a)      12      27        39         12      22        34
Cholesterol       6       2         8          9       1         10
Total protein     3       4         7          7       3         10
Albumin           7       2         9          8       2         10
Total bilirubin   6       4         10         10      2         12
ALP               22      11        33         19      7         26
LD                12      3         15         14      3         17
AST               9       13        22         7       8         15

(a) The set point for uric acid was changed on the 19th day so that the measured value of the uric acid in the control dropped approximately 0.2 mg/dl. The number of trends for uric acid is not representative.

            Number of trends detected by Trigg's analysis and caused by:
                              Control values on one side of the mean
            Large random      But within           But with at least
Test        deviations        control limits       one value > ±2 SD
ALP         5 (5)             12 (4)               16 (12)
Glucose     2 (2)             16 (6)               10 (6)
AST         6 (6)             9 (5)                7 (6)
BUN         1 (1)             13 (6)               7 (7)
PO4         2 (2)             13 (9)               3 (3)
LD          7 (7)             2 (0)                6 (6)

(a) Numbers in parentheses are number of trends left after elimination of trends with small smoothed standard deviation.


trends involving only within-limit measurements. Often the within-day precision is so good that the MAD is reduced to the point where a control or series of controls with a small deviation from the smoothed average will cause Trigg's tracking signal to exceed the control limits.

In our analysis of within-day trends, we found that the smoothed standard deviation (proportional to the MAD) can aid in determining whether a trend is important. When the value is small (less than the monthly quality-control standard deviation) the detected trend is often not important. When such trends are eliminated (see Table 3) the number of trends for which control values are entirely within the ±2 standard deviation control limits is decreased considerably. The detection of the important trends and those caused by a large random deviation are not greatly affected.

The analysis of trends over longer periods requires comparable care. Small systematic day-to-day differences in control values, exemplified by Figures 5, 6, and 7, result in trends readily identifiable by Trigg's technique. The long-term state of control, however, cannot be easily related to the tracking signal or even a combination of the tracking signal and the standard deviation. We have used, instead, the smoothed average and standard deviation to estimate the direction and magnitude of long-term drift.

It becomes apparent that the difficulty with the Trigg's technique is not one of detecting trends, but rather one of discrimination of important trends. Analyzers such as the Technicon SMA 12/60 and SMAC systems seem to show frequent small step changes, likely owing to day-to-day differences in calibration. Because of their good within-day precision, small step changes can be detected as daily trends, as well as longer-term trends. Discrimination is aided by use of the smoothed average and standard deviation, and inclusion of these estimates should be considered essential for adequate interpretation of the trend data. A summary printout of (a) tracking signal, (b) smoothed average, and (c) smoothed standard deviation efficiently provides the necessary information.

One can quickly scan the tracking signals, because only one control limit needs to be kept in mind, regardless of what test channel is being observed. When a trend is detected, then the mean and standard deviation can be inspected to determine whether the trend is significant. We have applied this printout in an offline mode for a short time. Important problems with glucose (within-day decrease caused by reagent problems) and cholesterol (carryover owing to poor reagent) were detected.

Further application will involve accessing control data directly from the laboratory computer and then processing and analyzing the results on-line. The trend-detection technique can be used efficiently in conjunction with the reference sample system and only requires additional calculations on presently available data. The calculations are not trivial, and a computer program is essential (see Appendix for implementation guide).

Appendix

A method for detecting nonrandom process trends was described by Brown (6) in 1960. Process measurements were used to calculate a tracking signal, a number that could easily be related in terms of cumulative probability to the amount of control within a process. To overcome certain limitations in the tracking signal, Trigg (5) modified its derivation and calculated tables of the tracking signal and associated confidence limits. Batty (8) found that some of Trigg's assumptions were invalid and recalculated new tracking signal limits that agreed well with simulation results.

This work was supported by Grant No. GM 10978 from the NIH, USPHS, by a General Research Support Sub-Grant from the University of Wisconsin Medical School, and by the Clinical Labo- ratories, University Hospitals, Madison, Wis. We thank Michael K. Mansfield (Department of Neurophysiology) for the computer plotting programs and R. Neil Carey for his helpful comments on the manuscript.

References

1. Levey, S., and Jennings, E. R., The use of control charts in the clinical laboratory. Am. J. Clin. Pathol. 20, 1059 (1950).
2. Whitby, L. G., Mitchell, F. L., and Moss, D. W., Quality control in routine clinical chemistry. Adv. Clin. Chem. 10, 102 (1967).
3. Lewis, C. D., Statistical monitoring techniques. Med. Biol. Eng. 9, 315 (1971).
4. Harrison, P. J., and Davies, O. L., The use of cumulative sum (Cusum) techniques for the control of routine forecasts of product demand. Oper. Res. 12, 325 (1964).
5. Trigg, D. W., Monitoring a forecasting system. Oper. Res. Q. 15, 271 (1964).
6. Brown, R. G., Smoothing, Forecasting and Prediction of Discrete Time Series, Prentice-Hall, Englewood Cliffs, N. J., 1962.
7. Hope, C. E., Lewis, C. D., Perry, I. R., and Gamble, A., Computed trend analysis in automated patient monitoring systems. Br. J. Anaesth. 45, 440 (1973).
8. Batty, M., Monitoring an exponential smoothing forecasting system. Oper. Res. Q. 20, 319 (1969).

Method

An averaging method, exponential smoothing, is used to calculate Trigg's tracking signal. The exponential average, E_t, of a series of process measurements, x_0, x_1, x_2, ..., x_t is:

E_t = a·x_t + (1 - a)·E_(t-1)
    = a·x_t + (1 - a)[a·x_(t-1) + (1 - a)·E_(t-2)]
    = a·x_t + a(1 - a)·x_(t-1) + (1 - a)^2 [a·x_(t-2) + (1 - a)·E_(t-3)]
    = a·x_t + a(1 - a)·x_(t-1) + a(1 - a)^2·x_(t-2) + ... + (1 - a)^t·x_0

that is,

E_t = Σ (i = 0 to t-1) a(1 - a)^i·x_(t-i) + (1 - a)^t·x_0   (A1)

The smoothing constant, a, has a range of 0 to 1. The weight given to previous observations decreases geometrically with age. If the smoothing constant is a = 0.2, then the current observation has weight 0.2; the preceding observations, in order of increasing age, have weights of 0.16, 0.128, 0.1024, etc.

The attributes of exponential smoothing have been described by Brown (6):
(1) The expected value of the exponentially smoothed observations is equal to the expected value of the observations; i.e., the current estimate can be called an average of the previous observations.
(2) The variance of the exponential average is simply related to the variance of the input observations. For serially independent observations with variance σ_x^2, the variance of the estimate is [a/(2 - a)]σ_x^2.
(3) Exponential smoothing is accurate.
(4) Computations are simple; only one number, the previous average, must be stored to calculate a new average.
(5) Exponential smoothing is flexible. With a small smoothing constant the estimate behaves like the average of much past data. If the constant is large, the estimate responds rapidly to changes in pattern. No reprogramming is necessary to adjust the rate of response; only the smoothing constant need be changed.

Brown (6) has compared the moving average (an average of the most recent observations) and the exponentially smoothed average, and has shown that they are conceptually similar. The smoothing constant, a, and the number of observations, n, in an equivalent moving average are simply related:

a = 2/(n + 1)   (A2)

This relationship is shown in Table A-I.

In Trigg's method the exponentially smoothed average, E_t, is used as a predictor (forecast) for the next observation, x_(t+1). The difference of the forecast and the next observation is defined as the forecast error,

e_(t+1) = x_(t+1) - E_t   (A3)

For most distributions of observations, the distribution of forecast errors can be shown to be approximately normal (6). Because the mean of the observations should equal the mean of the exponential averages, the mean of the forecast errors will be zero. The variance of the forecast errors, σ_e^2, is equal to the variance of the estimate plus the variance of the observations (6):

σ_e^2 = σ_x^2 + [a/(2 - a)]σ_x^2
      = [2/(2 - a)]σ_x^2   (A4)

Table A-I. Smoothing Constants and Number of Observations in an Equivalent Moving Average

Number of observations    Smoothing
in a moving average       constant (a)
399                       0.005
199                       0.010
99                        0.020
39                        0.050
19                        0.100
9                         0.200
5                         0.333
4                         0.400
3                         0.500
2                         0.667
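Because a = 2/(n + 1), Table A-I can be regenerated from the moving-average lengths alone; a small sketch (the function name is ours):

```python
# Regenerate Table A-I from the relation a = 2/(n + 1).

def smoothing_constant(n):
    """Smoothing constant equivalent to an n-observation moving average."""
    return 2.0 / (n + 1)

for n in (399, 199, 99, 39, 19, 9, 5, 4, 3, 2):
    print(n, round(smoothing_constant(n), 4))
```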

If a process is constant, the forecast and averaged measurements should be approximately the same, with the resulting forecast errors fluctuating about zero. A change in the process will cause the forecast to lag behind the changed process measurements and result in a forecast error that is consistently negative or positive. Trigg's method uses the exponentially smoothed forecast error, SE, to measure control within a process:

SE_t = a·e_t + (1 - a)·SE_(t-1)   (A5)

A steadily increasing or decreasing smoothed forecast error is indicative of a changing or changed process. The variance of the smoothed error, σ_SE^2, was shown by Batty (8) to be:

σ_SE^2 = 2a·σ_x^2/(2 - a)^3   (A6)

(A2) Trigg’s tracking signal is the normalized smoothed

error:

Tracking signal = (A7)

The mean absolute deviation (MAD) is the exponentially smoothed absolute forecast error:

MAD_t = a·|e_t| + (1 - a)·MAD_(t-1)   (A8)

Like the standard deviation, the MAD is a measure of the scatter of errors around their mean. It is often used instead of the standard deviation because it is easier to calculate. For a normal distribution with variance σ^2, the MAD is:

MAD = (2/π)^(1/2)·σ   (A9)

The MAD of a series of forecast errors is equal to:

MAD_e = (2/π)^(1/2)·σ_e
      = (2/π)^(1/2)·[2/(2 - a)]^(1/2)·σ_x   (A10)

(1) e_t = New control value - E_(t-1)
(2) SE_t = a·e_t + (1 - a)·SE_(t-1)
(3) MAD_t = a·|e_t| + (1 - a)·MAD_(t-1)
(4) E_t = a·(New control value) + (1 - a)·E_(t-1)
(5) Tracking signal = SE_t/MAD_t
(6) Standard deviation estimate = (π/2)^(1/2)·[(2 - a)/2]^(1/2)·MAD_t

For one to gain familiarization with the technique, we suggest that the tracking signal, the estimate of the control value, and the estimate of the standard deviation be printed, along with an asterisk if the tracking signal exceeds predetermined limits. After printing, the variables SE_(t-1), MAD_(t-1), and E_(t-1) must be updated:

(7) SE_(t-1) = SE_t
(8) MAD_(t-1) = MAD_t
(9) E_(t-1) = E_t

With each new control value, operations 1-9 are repeated, with Trigg's parameters being printed and analyzed.
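The nine operations map directly onto a short per-value loop. The sketch below is a Python rendering (the paper's implementation was FORTRAN; the function and variable names, and the ±0.49 example limit, which Table A-II gives for a = 0.1 at the 98% confidence level, are our choices). It produces the suggested summary lines, flagging values whose tracking signal exceeds the limit with an asterisk:

```python
import math

# Sketch of operations 1-9 for a stream of control values: each line
# reports the tracking signal, smoothed average, and MAD-based
# standard-deviation estimate, with an asterisk when the signal
# exceeds the chosen limit.

def trigg_report(values, alpha, E, MAD, SE=0.0, limit=0.49):
    lines = []
    for x in values:
        e = x - E                                 # (1) forecast error
        SE = alpha * e + (1 - alpha) * SE         # (2) smoothed forecast error
        MAD = alpha * abs(e) + (1 - alpha) * MAD  # (3) mean absolute deviation
        E = alpha * x + (1 - alpha) * E           # (4) smoothed average
        ts = SE / MAD                             # (5) tracking signal
        sd = math.sqrt(math.pi / 2) * math.sqrt((2 - alpha) / 2) * MAD  # (6)
        flag = " *" if abs(ts) > limit else ""
        lines.append(f"{ts:+.3f}  {E:8.3f}  {sd:6.3f}{flag}")
        # (7)-(9): SE, MAD, and E simply carry over to the next iteration
    return lines
```

Carrying the three state variables forward in the loop plays the role of update operations 7-9 in the printed guide.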

CUNICAL CHEMISTRY, Vol. 21, No. 10. 1975 1405