**RSICC CODE PACKAGE PSR-272**

**1. NAME AND TITLE**

ZOTT99: Zero-in On The Truth; Evaluation of Correlated Data Using Partitioned Least Squares.

**2. CONTRIBUTORS**

International Atomic Energy Agency, Nuclear Data Section, Vienna, Austria, through the OECD Nuclear Energy Agency Data Bank, Issy-les-Moulineaux, France.

**3. CODING LANGUAGE AND COMPUTER**

Fortran 77; Pentium, DEC Alpha, and Sun (P00272ALLCP02).

**4. NATURE OF PROBLEM SOLVED**

Given an existing combined set y(i) of differential and integral measurements with completely general covariances cyy(i,j) and a sensitivity matrix relating the expectation values of the various measurements, ZOTT obtains a new evaluation yp(i) with covariances cyyp(i,j). The results yp(i) are minimum-variance linear unbiased estimators of the true values, E[y(i)]. ZOTT99 differs from ZOTT95 in that it offers the analyst the option of applying a simple, universal, and objective method, called the Method of Least Distortion (MLD), to treat discrepant data, that is, data sets for which the overall chi-squared per degree of freedom substantially exceeds unity. Discrepant data can be characterized as data subject to additional sources of uncertainty not recognized by the evaluator of the nominal data uncertainties.
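The evaluation described above can be illustrated with a short sketch. This is not the ZOTT99 source (which is Fortran 77); it is a hedged NumPy illustration of the underlying minimum-variance adjustment, assuming the common formulation in which the sensitivity matrix S relates integral to differential expectation values, so that the constraint matrix A = [S, -I] satisfies A E[y] = 0:

```python
import numpy as np

# Illustrative sketch (not ZOTT99 itself): a minimum-variance update of a
# combined set of differential and integral data under a linear constraint.
# Measurements y = (d, g): differential values d and integral values g, with
# expectation values assumed to satisfy E[g] = S E[d] for sensitivity matrix S.
# Writing A = [S, -I], the true values obey A E[y] = 0, and the constrained
# minimum-variance linear unbiased estimate is
#   yp   = y   - Cyy A^T (A Cyy A^T)^-1 A y
#   Cyyp = Cyy - Cyy A^T (A Cyy A^T)^-1 A Cyy
def evaluate(y, cyy, s):
    n_int = s.shape[0]                      # number of integral (redundant) data
    a = np.hstack([s, -np.eye(n_int)])      # constraint matrix A = [S, -I]
    w = cyy @ a.T                           # Cyy A^T
    m = np.linalg.inv(a @ w)                # (A Cyy A^T)^-1: only n_int x n_int,
                                            # hence the reduced inversion cost
    yp = y - w @ (m @ (a @ y))              # adjusted values
    cyyp = cyy - w @ m @ w.T                # adjusted (reduced) covariances
    return yp, cyyp

# Two differential data (1.0, 2.0) plus one integral datum (3.5) measuring
# their sum; all uncertainties uncorrelated here for simplicity.
y = np.array([1.0, 2.0, 3.5])
cyy = np.diag([0.1, 0.1, 0.05])
s = np.array([[1.0, 1.0]])
yp, cyyp = evaluate(y, cyy, s)
# yp = [1.2, 2.2, 3.4]: the adjusted values satisfy the constraint exactly
# (1.2 + 2.2 = 3.4), and every variance on the diagonal of cyyp has decreased.
```

Note that the only matrix inverted has the dimension of the integral (redundant) data, consistent with the reduced matrix-inversion requirements of partitioned least squares mentioned in Section 5.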

**5. METHOD OF SOLUTION**

The method of solution is partitioned least squares, a specialized form of minimum-variance linear estimation featuring reduced matrix-inversion requirements relative to methods based on solving the conventional normal equations. Partitioned least squares permits general correlations among all data uncertainties (including cross-type correlations between differential and integral data). If the problem to be solved is precisely linear, and if correct input is supplied by the user, then the minimum-variance solution obtained with ZOTT is unique and exact. At the user's request (specified by setting inc=1), the code performs a minimally invasive modification of the input covariance matrix to enforce consistency (unit chi-squared). Only the diagonal elements are changed, and an iterative procedure is followed in which only one diagonal element is changed at a time, namely, the one which produces the maximum benefit in lowering chi-squared. This process is repeated until chi-squared reaches unity. Regardless of the input value of inc, plots of the data uncertainties and the final residuals are produced to permit the analyst to judge the desirability of invoking this option and the realism of the changes thereby introduced. At the end of the run, these plots reside on ZOTTVU.PS for viewing with GhostView or a similar PostScript viewer.
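The greedy iteration described above can be sketched as follows. This is an illustration of the idea, not the ZOTT99 implementation: the step size, the stopping rule, and the chi-squared formula (the constraint residual weighted by its own covariance) are assumptions made for the sketch.

```python
import numpy as np

# Illustrative sketch only (not the ZOTT99 source): a greedy iteration in the
# spirit of the Method of Least Distortion described above.  At each step,
# trial-inflate each diagonal element of the input covariance matrix by a
# fixed increment, keep the single change that lowers chi-squared the most,
# and repeat until chi-squared per degree of freedom falls to unity.
def chi2(y, cyy, a):
    r = a @ y                                    # residuals of constraint A y = 0
    return float(r @ np.linalg.solve(a @ cyy @ a.T, r))

def least_distortion(y, cyy, a, step=0.01, max_iter=1000):
    cyy = cyy.copy()
    dof = a.shape[0]                             # one dof per redundant datum
    for _ in range(max_iter):
        if chi2(y, cyy, a) <= dof:               # consistency reached
            break
        best_i, best_chi2 = None, np.inf
        for i in range(len(y)):                  # trial-inflate each variance
            trial = cyy.copy()
            trial[i, i] += step
            c = chi2(y, trial, a)
            if c < best_chi2:
                best_i, best_chi2 = i, c
        cyy[best_i, best_i] += step              # keep the most helpful change
    return cyy

# A discrepant example: the integral datum 4.5 disagrees badly with the sum
# of the differential data (3.0) given the small stated uncertainties.
y = np.array([1.0, 2.0, 4.5])
a = np.array([[1.0, 1.0, -1.0]])
cyy = np.diag([0.01, 0.01, 0.01])
cyy_adj = least_distortion(y, cyy, a)
# Only diagonal elements were changed; chi-squared per dof is now <= 1.
```

As in the description above, the procedure is minimally invasive: off-diagonal elements (the stated correlations) are never touched, and only one variance is enlarged per iteration.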

**6. RESTRICTIONS OR LIMITATIONS**

Subroutine CLOCK is machine-dependent and must be either adapted to the user's machine or removed. The logarithmic option is a rigorous (that is, minimum-variance) procedure if, and only if, the relative uncertainties are small. With respect to storage limitations, this particular version is dimensioned for a total of 220 (differential plus integral) measurements (variable NBIG), up to 66 of which may be integral, or redundant, measurements (variable NLIT). However, the code can easily be re-dimensioned for larger problems by changing the values of NLIT and NBIG in the program and making appropriate changes in all arrays having dimensions (66), (66,2), (66,66), (66,220), or (220,220). For very large problems one can optionally use the same memory locations for both of the large arrays cyy(i,j) and cyyp(i,j). If one chooses to do this, one must also omit the eigenvalue-eigenvector analysis of the input covariance matrix (see SUBROUTINE EIGER). The EIGER analysis is very useful for detecting errors (especially negative eigenvalues) in the covariances, so the "large problem" option should not be activated unless it is essential. The changes necessary to activate the "large problem" option are bracketed by comment cards containing the string "clarge".

**7. TYPICAL RUNNING TIME**

Test problem 1, which treats an uncertainty-increment type problem with 26 integral data and requires 43 uncertainty iterations, runs in about 30 seconds of CPU time on a Pentium-II PC.

**8. COMPUTER HARDWARE REQUIREMENTS**

No special requirements. Any modern 32-bit or 64-bit VMS, Windows, or UNIX platform should suffice. ZOTT99 was tested at the NEA Data Bank on a DEC Alpha running OpenVMS and on a Pentium-II PC. It was tested at RSICC on a Pentium II and on a Sun workstation using f77.

**9. COMPUTER SOFTWARE REQUIREMENTS**

ZOTT99 ran at the NEA Data Bank on a DEC AlphaServer 2100 running OpenVMS Version 7.1 using the DEC Fortran (Fortran 77) compiler and on an Intel Pentium-II running Windows NT 4.0. RSICC also tested ZOTT99 on a Sun under Solaris 2.6. An executable created under Windows 95 with the Lahey-F77L-EM32 Version 5.2 compiler is included in the package for PC users.

**10. REFERENCES**

**a. Included in documentation:**

D. W. Muir, "Treatment of Discrepant Data in the ZOTT99 Generalized Least Squares Program," Covariance Workshop, Brookhaven, New York (22-23 April 1999).

D. W. Muir, "Evaluation of Correlated Data Using Partitioned Least Squares: A Minimum-Variance Derivation," LA-UR-2365 (Rev.), also published in *Nuclear Science and Engineering* 101, 88-93 (January 1989).

**b. Background information:**

E. J. Axton, "The Thermal Capture Cross Section of ^{55}Mn," *Ann. Nucl. Energy* 12, 315 (1985).

**11. CONTENTS OF CODE PACKAGE**

Included are the referenced documents (in 10.a) and one DS/HD diskette containing the source file and sample input and output, transmitted in a compressed self-extracting DOS file.

**12. DATE OF ABSTRACT**

January 1989, April 1993, June 2000.

**KEYWORDS:** COVARIANCE DATA PROCESSING; EXPERIMENTAL DATA ANALYSIS; SENSITIVITY ANALYSIS; UNCERTAINTY ANALYSIS; MICROCOMPUTER; WORKSTATION