Digital X-ray acquisition allows sophisticated processing of acquired images before display to the reader, making possible operations such as the removal, in software, of the systematic blurring effect of scatter. A method for analysing scatter removal is presented. The scatter model incorporated within the Standard Attenuation Rate (SAR) is used; SAR is a method for calculating a normalised image of tissue radiodensity. The model builds on the fundamental physical relations underlying Monte Carlo techniques but, through optimal information sampling and interpolation, is able to execute in a clinically realistic time. The scatter kernel arising around each primary ray is calculated, and these kernels are superimposed to give the scatter image. An iterative refinement procedure calculates the radiodensity and scatter at each ray/pixel, each cyclically feeding back into the other, to yield the scatter field. Image sharpness and contrast-to-noise ratio (CNR) analysis is presented for two tissue-equivalent phantoms. The algorithm is found to match the image sharpness obtained without the grid to that with the grid present, as confirmed by residual analysis using autocorrelation plots, which show that the difference is almost white noise within a 95% confidence interval. The increased fluence in the absence of the grid is shown to allow the dose to be reduced by 37-49%, whilst delivering equivalent contrast and CNR. © 2012 Springer-Verlag Berlin Heidelberg.
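The kernel-superposition and cyclic-feedback steps described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes a single shared scatter kernel (the SAR model varies the kernel per primary ray) and a simple fixed-point iteration in which the primary estimate and the scatter estimate feed back into each other. All function and variable names here are hypothetical.

```python
import numpy as np

def scatter_image(primary, kernel):
    """Superimpose the scatter kernel arising around each primary ray.

    `primary` is an (H, W) primary-fluence image; `kernel` is a small
    2-D spread kernel. A single shared kernel is a simplifying
    assumption -- the paper's model computes one per ray.
    """
    H, W = primary.shape
    kh, kw = kernel.shape
    padded = np.zeros((H + kh - 1, W + kw - 1))
    # Each primary ray deposits a kernel-shaped scatter contribution.
    for i in range(H):
        for j in range(W):
            padded[i:i + kh, j:j + kw] += primary[i, j] * kernel
    # Crop back to the original frame (kernel assumed centred).
    return padded[kh // 2:kh // 2 + H, kw // 2:kw // 2 + W]

def iterative_refinement(measured, kernel, n_iter=10):
    """Cyclic feedback: estimate primary, recompute scatter, repeat.

    Starts from the measured (primary + scatter) image, subtracts the
    current scatter estimate, and iterates until the two estimates are
    mutually consistent.
    """
    primary = measured.copy()
    scatter = np.zeros_like(measured)
    for _ in range(n_iter):
        scatter = scatter_image(primary, kernel)
        primary = np.clip(measured - scatter, 0.0, None)
    return primary, scatter
```

Because the kernel's total weight is well below unity in practice, each pass shrinks the residual, so the iteration converges quickly to a self-consistent primary/scatter pair.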

Original publication

Conference paper

Publication Date:

Volume: 7361 LNCS
Pages: 260-267