Recently, Henn & Meindl [Acta Cryst. (2010), A66, 676–684] examined the significance of Bragg diffraction data through the descriptor W = ⟨I^{1/2}⟩/⟨σ(I)⟩. In the Poisson limit for the intensity errors W equals unity, but any kind of data processing (background subtraction, integration, scaling, absorption correction, Lorentz and polarization corrections etc.) introduces additional error, on top of remaining systematic errors, and thus the significance of processed Bragg diffraction data is expected to fall below the Poisson limit (W_Bragg < 1). Curiously, Henn & Meindl observed values of W_Bragg larger than one for several data sets. In the present study this is shown to be an artefact caused by neglect of the data scale factor that must also be applied to the standard uncertainties, and corrected values of W_Bragg for Bragg data on an absolute scale are presented, all of which are smaller than unity. Furthermore, the error-estimation models employed by two commonly used data-processing programs {SADABS (Bruker AXS Inc., Madison, Wisconsin, USA) and SORTAV [Blessing (1997). J. Appl. Cryst. 30, 421–426]} are examined. It is shown that the empirical error model in SADABS very significantly lowers the significance of the Bragg data and also results in the very strange distribution of errors observed by Henn & Meindl. On the other hand, error estimation based on the variance of a population of abundant intensity data, as used in SORTAV, provides reasonable error estimates, which are only slightly less significant than the raw data. Given that modern area detectors make the measurement of highly redundant data relatively straightforward, it is concluded that the latter is the best approach for data processing.
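
As a concrete illustration of the descriptor, the following minimal Python sketch computes W = ⟨I^{1/2}⟩/⟨σ(I)⟩ from arrays of integrated intensities and their standard uncertainties, applying a data scale factor consistently to both quantities (the step whose neglect produced the spurious W_Bragg > 1 values). The function and variable names are illustrative assumptions, not taken from the paper or from any data-processing program.

    import numpy as np

    def significance_W(I, sigma, scale=1.0):
        """Compute the significance descriptor W = <I^{1/2}> / <sigma(I)>.

        In the Poisson limit for raw counts (sigma = sqrt(I)) W equals
        unity; processed Bragg data on an absolute scale are expected to
        give W < 1.

        I     -- array of integrated intensities (illustrative name)
        sigma -- array of their standard uncertainties
        scale -- data scale factor placing the intensities on an absolute
                 scale; it must be applied to the sigmas as well, since
                 scaling I but not sigma(I) inflates W artificially
        """
        I = np.asarray(I, dtype=float) * scale
        sigma = np.asarray(sigma, dtype=float) * scale
        pos = I > 0                      # sqrt requires non-negative I
        return np.sqrt(I[pos]).mean() / sigma[pos].mean()

    # Poisson-limit sanity check: raw counts with sigma = sqrt(counts)
    # reproduce W = 1 exactly.
    counts = np.random.poisson(lam=500.0, size=10000).astype(float)
    print(significance_W(counts, np.sqrt(counts)))   # -> 1.0

Note that W is not invariant under rescaling: for intensities and sigmas both multiplied by a factor k, W changes by 1/sqrt(k), which is why the descriptor is meaningful only for data on an absolute scale, as the study emphasizes.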
