Optical theorem
In physics, the optical theorem is a general law of wave scattering theory which relates the forward scattering amplitude to the total cross section of the scatterer. It is usually written in the form

\[ \sigma_\mathrm{tot} = \frac{4\pi}{k}\,\operatorname{Im} f(0), \]

where f(0) is the scattering amplitude at an angle of zero, that is, the amplitude of the wave scattered to the center of a distant screen, and k is the wave number of the incident wave. Because the optical theorem is derived using only conservation of energy (or, in quantum mechanics, conservation of probability), it is widely applicable and, in quantum mechanics, covers both elastic and inelastic scattering.

The above form holds for an incident plane wave; a more general form, due to Werner Heisenberg, involving arbitrary outgoing directions can be written

\[ \operatorname{Im} f(\hat{\mathbf{k}}', \hat{\mathbf{k}}) = \frac{k}{4\pi} \int f^{*}(\hat{\mathbf{k}}'', \hat{\mathbf{k}}')\, f(\hat{\mathbf{k}}'', \hat{\mathbf{k}})\, d\Omega'', \]

of which the standard optical theorem is the forward special case. The optical theorem implies that an object which scatters any light at all (or electrons, neutrons, etc.) must have a nonzero forward scattering amplitude f(0). However, the physically observed field in the forward direction is the sum of the (nonzero) scattered field and the incident field, and the two may add to zero.

History

The optical theorem was originally derived independently by Wolfgang von Sellmeier and Lord Rayleigh in 1871. Lord Rayleigh expressed the forward scattering amplitude in terms of the index of refraction as

\[ n = 1 + \frac{2\pi N f(0)}{k^2} \]

(where N is the number density of scatterers), which he used in a study of the color and polarization of the sky. The equation was later extended to quantum scattering theory by several individuals, and came to be known as the Bohr–Peierls–Placzek relation after an unpublished 1939 paper. It was first referred to in print as the "optical theorem" in 1955 by Hans Bethe and Frederic de Hoffmann, after having been known for some time as a "well known theorem of optics".

Derivation

The theorem can be derived rather directly from a treatment of a scalar wave.
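Before working through the derivation, the statement of the theorem can be checked numerically. In an elastic partial-wave expansion, both the forward amplitude f(0) and the total cross section have closed forms in the phase shifts δ_l, and the theorem holds exactly between them. A minimal sketch in Python; the phase-shift values below are arbitrary illustrative numbers, not taken from any real potential:

```python
import numpy as np

def forward_amplitude(k, deltas):
    """f(0) = (1/k) * sum_l (2l+1) e^{i delta_l} sin(delta_l)."""
    return sum((2 * l + 1) * np.exp(1j * d) * np.sin(d)
               for l, d in enumerate(deltas)) / k

def total_cross_section(k, deltas):
    """sigma_tot = (4 pi / k^2) * sum_l (2l+1) sin^2(delta_l)."""
    return 4 * np.pi / k**2 * sum((2 * l + 1) * np.sin(d)**2
                                  for l, d in enumerate(deltas))

# Illustrative phase shifts (arbitrary values)
k = 1.2
deltas = [0.7, 0.4, 0.1]

sigma = total_cross_section(k, deltas)
sigma_from_theorem = 4 * np.pi / k * forward_amplitude(k, deltas).imag

print(sigma, sigma_from_theorem)  # the two agree to machine precision
```

The agreement is exact here because Im e^{iδ} sin δ = sin²δ term by term; the optical theorem is essentially this identity summed over partial waves.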
If a plane wave is incident along the positive z axis on an object, then the wave amplitude a great distance away from the scatterer is approximately given by

\[ \psi(\mathbf{r}) \approx e^{ikz} + f(\theta)\,\frac{e^{ikr}}{r}. \]

All higher terms, when squared, vanish more quickly than 1/r^2, and so are negligible a great distance away. For large values of z and for small angles, a Taylor expansion gives us

\[ r = \sqrt{x^2 + y^2 + z^2} \approx z + \frac{x^2 + y^2}{2z}. \]

We would now like to use the fact that the intensity is proportional to the square of the amplitude. Approximating 1/r as 1/z, we have

\[ |\psi|^2 \approx \left| e^{ikz} + \frac{f(\theta)}{z}\, e^{ikz}\, e^{ik(x^2+y^2)/2z} \right|^2 = 1 + \frac{2}{z}\operatorname{Re}\!\left[ f(\theta)\, e^{ik(x^2+y^2)/2z} \right] + \frac{|f(\theta)|^2}{z^2}. \]

If we drop the 1/z^2 term and use the fact that e^{ikz} e^{-ikz} = 1, we have

\[ |\psi|^2 \approx 1 + \frac{2}{z}\operatorname{Re}\!\left[ f(\theta)\, e^{ik(x^2+y^2)/2z} \right]. \]

Now suppose we integrate over a screen far away in the xy plane, which is small enough for the small-angle approximations to be appropriate, but large enough that we can integrate the intensity from −∞ to ∞ with negligible error. In optics, this is equivalent to including many fringes of the diffraction pattern. To further simplify matters, let's approximate f(θ) ≈ f(0). We obtain

\[ \int |\psi|^2 \, da \approx A + \frac{2}{z}\operatorname{Re}\!\left[ f(0) \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} e^{ik(x^2+y^2)/2z}\, dx\, dy \right], \]

where A is the area of the surface integrated over. Although these are improper integrals, by suitable approximations the exponentials can be treated as Gaussians, and

\[ \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} e^{ik(x^2+y^2)/2z}\, dx\, dy = \frac{2\pi i z}{k}, \]

so that

\[ \int |\psi|^2 \, da \approx A - \frac{4\pi}{k}\operatorname{Im} f(0). \]

This is the probability of reaching the screen if none were scattered, lessened by an amount (4π/k) Im f(0), which is therefore the effective scattering cross section of the scatterer.
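The one nontrivial analytic step in the derivation is the Fresnel (Gaussian) screen integral, whose value is 2πiz/k. It can be checked numerically by inserting a small damping factor exp(−εx²), which is exactly the "suitable approximation" that turns the oscillatory integrand into a convergent Gaussian, and taking ε small. A sketch with illustrative values of k, z, and ε (all chosen for convenience, not tied to any physical system):

```python
import numpy as np

# Illustrative parameters: wave number k, screen distance z, small damping eps
k, z, eps = 2 * np.pi, 1000.0, 1e-5

# 1D damped Fresnel integral: integral of exp((i k / 2z - eps) x^2) dx.
# The 2D screen integral factorizes into the square of this 1D integral.
x = np.linspace(-2000.0, 2000.0, 400001)
dx = x[1] - x[0]
integrand = np.exp((1j * k / (2 * z) - eps) * x**2)

# Simple Riemann sum; the damping makes the endpoint contributions negligible
i1d = integrand.sum() * dx

numeric_2d = i1d**2               # damped version of the screen integral
exact_2d = 2j * np.pi * z / k     # analytic result in the limit eps -> 0

print(numeric_2d, exact_2d)       # agree to within about 1% for this eps
```

Shrinking ε (while widening the integration range accordingly) drives the numerical value toward the analytic limit, illustrating why treating the exponentials "as Gaussians" is harmless here.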