
Ultra-Sensitive Measurements Of Changes In Images Using Slow Light

Date:
February 1, 2007
Source:
University of Rochester
Summary:
Assistant Professor John Howell and his Quantum Optics team at the University of Rochester have discovered a way to manipulate a light field while retaining all of the information it carries. A considerable advance in imaging technology, the new method detects subtle changes in an image over time. Using photons and atomic vapor in what is known as imaging with slow light, the new technique precisely slows the image while retaining all of its properties.

Previous research by Howell and others demonstrated that pulses of light could be slowed in a controllable way by propagation through an atomic vapor. Slow light is useful because telecommunications relies on buffering: delaying information until it can be properly used or routed. Delaying a pulse that contains an image, as opposed to an arbitrary pulse that represents digital information, means that image processing does not rely on analog-to-digital conversions that lose resolution.

Currently, says Howell, "We're slowing images down to 300 times lower than the speed of light, and we're working on systems that slow images down to 10 million times lower than the speed of light."

The success of imaging depends on whether amplitude and phase information, the two primary defining characteristics of a light signal, are preserved when light propagates through the slowing medium. An experiment devised by Howell's team examines this issue and also considers images made with a very low light level, where each pulse contains less than one photon on average.

Experimental Setup

The basic setup of the experiment is a single light source (a laser), a single camera (feeding to a computer) for observation, and between them, a path for the laser beam. After leaving its source, the beam is divided in half by a 50/50 beam splitter cube. One part of the beam is reflected sideways and takes a path around mirrors and through air. The other half is transmitted through the cube and takes a shorter, straight path in which it acquires an image and is slowed. Before the camera, the two beams are recombined by another 50/50 beam splitter cube. This creates an interferometer, in which the free-space beam is used as a reference: if the delayed light can interfere with it, producing dark and light fringes, it means that the phase of the delayed light has been preserved.
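
To make the fringe test concrete, here is a minimal numerical sketch (not the team's actual analysis code) of what the camera would record. It assumes a small tilt between the recombined beams and models a loss of phase preservation as random pulse-to-pulse phase jitter; all numbers are illustrative.

import numpy as np

# A minimal sketch: the camera integrates many pulses, so fringes survive only
# if the slowed beam arrives with a stable, preserved phase. The tilt angle,
# pulse count, and jitter model are illustrative assumptions.

x = np.linspace(-1e-3, 1e-3, 500)          # transverse position on the camera (m)
wavelength = 852e-9                        # cesium wavelength used in the experiment (m)
tilt = 1e-3                                # assumed small angle between the two beams (rad)
k_fringe = 2 * np.pi * tilt / wavelength   # spatial frequency of the expected fringes

rng = np.random.default_rng(0)
n_pulses = 2000

def integrated_pattern(phase_jitter):
    """Sum the interference pattern over many pulses with the given phase jitter."""
    total = np.zeros_like(x)
    for _ in range(n_pulses):
        phi = rng.normal(0.0, phase_jitter)     # pulse-to-pulse phase error
        reference = np.exp(1j * k_fringe * x)   # free-space reference beam
        delayed = np.exp(1j * phi)              # slowed beam after the cesium cell
        total += np.abs(reference + delayed) ** 2
    return total

for label, jitter in [("phase preserved", 0.0), ("phase scrambled", np.pi)]:
    pattern = integrated_pattern(jitter)
    visibility = (pattern.max() - pattern.min()) / (pattern.max() + pattern.min())
    print(f"{label}: fringe visibility = {visibility:.2f}")

With zero jitter the visibility comes out near 1 (clear dark and light fringes); with large jitter the fringes average away, which is the signature the experiment looks for.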

Technical Details

The phase velocity of light, the speed at which the crests of a single-frequency wave advance, is determined by the frequency-dependent refractive index of the medium through which the light is traveling. The refractive index is the ratio of the light's speed in free space to its speed in the medium. The group velocity, the speed at which a pulse travels, is determined by how a collection of waves with slightly different phase velocities add together. The key point is that the waves add up so that the group velocity depends on the derivative of the refractive index with respect to frequency: the steeper that derivative, the slower the pulse travels. The term involving this derivative can be vastly larger (by up to eight orders of magnitude) than the index itself, and over a narrow frequency range it can be made to vary dramatically while the index itself changes very little. The dependence of velocity on frequency is called dispersion.
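
In rough numerical terms (a toy calculation based on the standard textbook relation between refractive index and group index, not on data from the experiment), a narrow, steep feature in the refractive index translates into a very large group index and hence a large slow-down factor:

import numpy as np

# Toy illustration of the textbook relations: the group index is
# n_g = n + omega * dn/domega, so a steep, narrow feature in n(omega) slows a
# pulse enormously even though n itself stays very close to 1. All numbers
# below are illustrative assumptions, not measured values.

c = 3.0e8                                    # speed of light in vacuum (m/s)
omega0 = 2 * np.pi * c / 852e-9              # optical frequency near the cesium line (rad/s)
omega = np.linspace(omega0 - 1e9, omega0 + 1e9, 20001)

width = 1e8                                  # width of the assumed dispersive feature (rad/s)
amplitude = 1.4e-5                           # chosen so the slow-down comes out near 300
delta = omega - omega0
n = 1 + amplitude * delta * width / (delta**2 + width**2)

n_group = n + omega * np.gradient(n, omega)  # group index n_g
i = np.argmin(np.abs(delta))                 # evaluate at the center of the feature

print(f"refractive index n     : {n[i]:.6f}")        # stays essentially 1
print(f"group index n_g        : {n_group[i]:.0f}")  # in the hundreds
print(f"slow-down factor c/v_g : {n_group[i]:.0f}")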

Cesium vapor is the dispersive medium in this experiment. It is held in a glass cell at a density of about 10^12 to 10^13 atoms (at least a thousand billion) per cubic centimeter. How many atoms are in the vapor depends on how much the cell is heated: more heat drives more atoms off the droplet of cesium sitting on the side of the cell and into the gas. The amount of delay of the light pulse depends on the density of the vapor and is therefore dictated by the temperature of the cell, which is controlled electronically. The dispersion in cesium occurs for light at a wavelength of 852 nanometers. This is in the near infrared, just beyond the longest wavelengths of visible light (the visible region of the electromagnetic spectrum ends at roughly 700 nanometers). This wavelength of dispersion lies between wavelengths at which light would be absorbed by cesium.

The laser emits continuously, but the beam is passed through a device that chops it into 2-nanosecond-long pulses every 7 nanoseconds. After the first beam splitter, the transmitted beam goes through what is called an amplitude mask. This is basically a stencil that blocks parts of the light and lets the rest pass through its openings. In other words, it imparts an image to the light. The image used in this experiment was a pattern of vertical and horizontal bars and the character "4". When the pulse carrying this image is delayed by the cesium, it emerges with the image intact.
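
As a rough illustration of what the amplitude mask does (a sketch with an invented pattern and dimensions, not the actual bars-and-4 mask), the stencil simply multiplies the beam's transverse profile, so only the open regions transmit light:

import numpy as np

# Illustrative sketch of an amplitude mask: a binary stencil multiplies the
# beam's transverse profile, so the transmitted light carries the pattern.
# The grid size, beam width, and pattern are made up for illustration.

n = 200
y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
beam = np.exp(-(x**2 + y**2) / 0.5**2)   # smooth Gaussian beam profile

mask = np.zeros((n, n))
mask[:, 40:60] = 1                       # a vertical bar
mask[90:110, :] = 1                      # a horizontal bar

transmitted = beam * mask                # light passes only where the stencil is open
print(f"fraction of power transmitted: {transmitted.sum() / beam.sum():.2f}")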

The delay in the cell is tuned to match the time it takes the reference beam to traverse its longer path, so that the two beams arrive at the second beam splitter at the same time. That path is five feet longer than the short path; because light travels about a foot in a nanosecond (10^-9 seconds), the delay in the short path must be five nanoseconds. With the alignment and delay properly set, the picture seen by the camera (a CCD, or charge-coupled device, the same image-detection technology found in commercial digital cameras, though this one runs continuously) is a set of circular interference fringes with the bars-and-4 pattern clearly superimposed. The clarity of the interference and of the pattern's "shadow" implies that the slow-light medium preserves everything about the light field while delaying it.
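
The five-nanosecond figure follows directly from the path difference; a short check using the numbers quoted above and the rule of thumb that light travels about a foot per nanosecond:

# Delay matching for the five-foot path difference (numbers from the article).
c = 3.0e8                        # speed of light (m/s)
path_difference_m = 5 * 0.3048   # five feet expressed in meters
delay_s = path_difference_m / c
print(f"required delay: {delay_s * 1e9:.1f} ns")   # about 5 ns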

Preserving an Image Using Single Photons and No Reference Beam

This research also treated the case of weak light fields. In contrast to the above setup, where the light field is macroscopic (you would be able to see the beam make a spot on a surface, if you were able to see in the infrared) and a detector would read massive numbers of photons, the weak light field consists of pulses that contain, on average, less than one photon per pulse.

The pulse generator for this part of the experiment made 4-nanosecond-long pulses every 330 nanoseconds. The light was not observed with a camera; instead, photons were collected by the tip of an optical fiber that was scanned across a region facing the beam path, its position precisely coordinated with time. If a photon entered the fiber, a single-photon detector attached to it registered a hit that could be placed at the proper point in a two-dimensional image. The test image used in this setup was an amplitude mask with the pattern "UR".
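
The following sketch shows, in outline, how a scanned fiber tip and a single-photon detector can build up a two-dimensional image one detection at a time; the pattern, dwell time, and count rate are invented for illustration and are not the experiment's values.

import numpy as np

# Sketch of scanned-fiber, single-photon imaging: each detection is binned at
# the fiber's current (x, y) position. Photon arrivals at this low light level
# are modeled as Poisson counts; all numbers are illustrative.

rng = np.random.default_rng(1)
nx, ny = 64, 64
pattern = np.zeros((ny, nx))
pattern[10:54, 20:26] = 1.0                 # stand-in for part of the "UR" mask

image = np.zeros((ny, nx))
dwell_pulses = 1000                         # pulses fired while the fiber sits at one spot
mean_photons_per_pulse = 0.3                # well below one photon per pulse on average

for iy in range(ny):                        # raster-scan the fiber tip over the region
    for ix in range(nx):
        rate = mean_photons_per_pulse * pattern[iy, ix]
        image[iy, ix] = rng.poisson(rate * dwell_pulses)

print("peak counts in the bright region:", image.max())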

For the weak-field experiment, the reference beamline was blocked, because the different parts of an interferometer of that size would inevitably drift with respect to each other during the scanning of the image, changing the physical setup, ruining the image, and preventing a proper evaluation of the slow-light system. With no long reference path to match, the delay could be set to times other than 5 nanoseconds. The results indicate that the image is preserved even when individual photons are slowed.

Conclusions

To manipulate a light field and not lose any of the information it carries is a considerable advance in imaging technology.

Existing tools for image processing require converting the light into electronic signals, specifically into localized pixels, each described by a string of a certain number of bits. In scaling an intensity range onto a range of bit values, subtle changes in the image may be lost. Electronic buffering of an image for later routing or transmission would require image detection (such as by a CCD camera), conversion, data storage, and re-emission. Optical buffering could be used to analyze extremely fine details of an image because no information would be lost to analog-to-digital conversion.
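
A small example of the quantization loss being described (the 8-bit depth is an assumed, typical value, not a figure from the article): two intensities that differ by less than one gray level become indistinguishable once digitized.

# Sketch of the quantization argument: two intensities that differ by less
# than one gray level map to the same code after analog-to-digital conversion,
# so the subtle change is unrecoverable from the digitized image.

def digitize(value, bits=8):
    """Map a normalized intensity in [0, 1] to an integer code."""
    levels = 2 ** bits
    return round(value * (levels - 1))

before = 0.5000   # normalized intensity of one pixel
after = 0.5020    # the same pixel after a subtle change (about half a gray level)
print(digitize(before), digitize(after))   # both map to the same code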

A method similar to the macroscopic light-field experiment, which employs interferometry, could benefit applications such as military imaging or astronomical observation, where superimposing pictures that differ only slightly makes subtle changes visible.


Story Source:

Materials provided by University of Rochester. Note: Content may be edited for style and length.


