Thermal vibrations destroy the perfect crystalline periodicity generally assumed by dynamical diffraction theories. This can lead to some difficulty in deriving the temperature dependence of X-ray reflectivity from otherwise perfect crystals. This difficulty is overcome here in numerical simulations based on the extended Darwin theory, which does not require periodicity. Using Si and Ge as model materials, it is shown how to map the lattice vibrations derived from measured phonon dispersion curves onto a suitable Darwin model. Good agreement is observed with the usual Debye-Waller behavior predicted by standard theories, except at high temperatures for high-order reflections. These deviations are discussed in terms of a possible breakdown of the ergodic hypothesis for X-ray diffraction.
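As a rough illustration of the Debye-Waller behavior referred to above (a textbook Debye-model estimate, not the phonon-dispersion mapping or the extended Darwin simulation used in the paper), the following Python sketch evaluates the isotropic B factor and the resulting exp(-2M) attenuation of a Bragg reflection. The Si Debye temperature of 543 K and the choice of the (333) reflection are illustrative assumptions only.

    import numpy as np
    from scipy.integrate import quad

    H_PLANCK = 6.62607015e-34     # Planck constant, J s
    KB       = 1.380649e-23       # Boltzmann constant, J / K
    AMU      = 1.66053906660e-27  # atomic mass unit, kg

    def debye_B(T, theta_D, mass_amu):
        """Isotropic Debye-Waller factor B(T) in m^2 from the Debye model:
        B = (6 h^2 / (m kB Theta_D)) * [phi(x)/x + 1/4], with x = Theta_D / T."""
        m = mass_amu * AMU
        x = theta_D / T
        integral = quad(lambda t: t / np.expm1(t), 0.0, x)[0]
        return 6 * H_PLANCK**2 / (m * KB * theta_D) * (integral / x**2 + 0.25)

    def intensity_attenuation(T, theta_D, mass_amu, d_spacing):
        """exp(-2M) attenuation of a Bragg reflection of spacing d (m),
        using M = B * (sin(theta)/lambda)^2 and sin(theta)/lambda = 1/(2d)."""
        B = debye_B(T, theta_D, mass_amu)
        M = B / (4.0 * d_spacing**2)
        return np.exp(-2.0 * M)

    # Illustrative values: Si, a = 5.431 Angstrom, Theta_D ~ 543 K, (333) reflection
    a = 5.431e-10
    d_333 = a / np.sqrt(27.0)
    for T in (100.0, 300.0, 600.0, 900.0):
        print(T, intensity_attenuation(T, 543.0, 28.0855, d_333))

The rapid fall of exp(-2M) with temperature for a high-order reflection such as (333) corresponds to the regime in which the abstract reports deviations from the standard Debye-Waller prediction.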