Crystallographic
methods and equipment are evolving rapidly. Today, hardware, software,
and computational resources impose almost no practical limits on
crystallographic applications. Rather than simply collecting ever more
data, it is becoming increasingly important to use all of that data to
generate knowledge and meaning. Traditionally, only the information
that feeds structure models or charge density models is used.
However, every diffraction experiment produces far more data, which all
too often is discarded without deeper examination. In particular,
every diffraction experiment requires many decisions: about data
acquisition strategies and measurement parameters, data processing
parameters, and model refinement parameters. These decisions determine,
and sometimes also limit, the data quality and therefore the model
quality. Our goal is to focus on the extremely valuable information
hidden in the residuals. Decoding this information paves the road to
higher data quality, higher precision and accuracy of the data, reduced
measurement cost, and steady improvement through continuous evaluation
of data quality. In this way, new scientific
insights become possible, and some old ones may need to be revised.
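As an illustration of the kind of information residuals carry, the following sketch (a generic example, not the software demonstrated in the course) computes normalized residuals (I_obs − I_calc)/σ for a set of reflections and their simple summary statistics. For well-behaved data these residuals should scatter around zero with roughly unit spread, so a systematic offset or an inflated spread hints at unmodelled errors. All intensity values below are hypothetical.

```python
import math

def normalized_residuals(i_obs, i_calc, sigma):
    """Normalized residual (I_obs - I_calc) / sigma for each reflection."""
    return [(o - c) / s for o, c, s in zip(i_obs, i_calc, sigma)]

def residual_stats(residuals):
    """Mean and standard deviation; near (0, 1) for well-behaved data."""
    n = len(residuals)
    mean = sum(residuals) / n
    var = sum((r - mean) ** 2 for r in residuals) / n
    return mean, math.sqrt(var)

# Hypothetical observed/calculated intensities and uncertainties
i_obs = [102.0, 48.5, 251.0, 9.8, 77.2]
i_calc = [100.0, 50.0, 248.0, 10.5, 75.0]
sigma = [2.0, 1.5, 3.0, 0.7, 1.8]

res = normalized_residuals(i_obs, i_calc, sigma)
mean, spread = residual_stats(res)
```

A mean far from 0 would point, for example, to a scaling problem, while a spread far from 1 suggests over- or underestimated standard uncertainties.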
The foundation of data quality assessment is having good, reliable data quality descriptors and knowing how to apply them, and also how not to. This one-day crash course will provide basic information on these topics and a demonstration of our software for the identification of systematic errors in diffraction data.
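One of the most widely used data quality descriptors of this kind is the merging R value R_int (often called R_merge), which compares symmetry-equivalent intensity measurements with their group mean: R_int = Σ_hkl Σ_i |I_i(hkl) − ⟨I(hkl)⟩| / Σ_hkl Σ_i I_i(hkl). A minimal sketch, assuming the redundant measurements have already been grouped by unique reflection:

```python
def r_merge(groups):
    """R_int / R_merge over groups of symmetry-equivalent intensities.

    groups: iterable of lists, each holding the redundant intensity
    measurements of one unique reflection.
    """
    num = 0.0  # sum of absolute deviations from each group mean
    den = 0.0  # sum of all measured intensities
    for intensities in groups:
        mean = sum(intensities) / len(intensities)
        num += sum(abs(i - mean) for i in intensities)
        den += sum(intensities)
    return num / den

# Hypothetical redundant measurements of three unique reflections
groups = [[100.0, 104.0, 98.0], [50.0, 52.0], [10.0, 9.0, 11.0]]
print(f"R_int = {r_merge(groups):.4f}")
```

A descriptor like this is easy to compute but easy to misapply: it depends on redundancy and can be made to look better simply by discarding weak reflections, which is exactly the kind of pitfall the course addresses.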