A mathematical model is proposed to denoise two-dimensional X-ray patterns. The method relies on a generalized diffusion equation whose diffusion coefficient depends on the image gradients. The numerical solution of the diffusion equation provides an efficient reduction of pattern noise, as measured by the computed peak signal-to-noise ratio (PSNR). The use of experimental data with different inherent levels of noise allows us to demonstrate the success of the method even in the experimentally relevant case where patterns are corrupted by Poissonian noise. The corresponding MATLAB code for the numerical method is made available.
Ladisa, Massimo; Lamura, Antonio
Molecular Diversity Preservation International