Hyperspectral Camera Captures Wealth of Data in an Instant
Standard snapshots from space don't quite show Earth in all its glory. There's so much more to see.
To reveal details impossible to observe with the naked eye, Rice University engineers are building a portable spectrometer that can be mounted on a small satellite, flown on an airplane or a drone or someday even held in the hand.
Bioengineer Tomasz Tkaczyk and his colleagues at Rice's Brown School of Engineering and Wiess School of Natural Sciences have published the first results from a NASA-funded project to develop a small, sophisticated spectrometer with unusual versatility. Their paper appears in Optics Express.
A spectrometer is an instrument that gathers light from an object or a scene, separates the colors and quantifies them to determine the chemical contents or other characteristics of what it sees.
The Rice device, called the Tunable Light-Guide Image Processing Snapshot Spectrometer (TuLIPSS), will let researchers instantly capture data across the visible and near-infrared spectrum, unlike current systems that scan a scene line by line for later reassembly.
Each pixel in the hyperspectral images produced by TuLIPSS carries both spatial and spectral information. The "pixels" in this case are thousands of optical fibers, flexible light guides that deliver the image components to a detector. Because they can reposition the fibers, researchers can customize the balance of image and spectral data sent to the detector.
The device, for example, can be tuned to measure the chemistry of a tree to see if it's healthy or diseased. It can do the same for a cell, a single leaf, a neighborhood or farm, or a planet. In continuous-capture mode, akin to a camera's motor drive, it can show how the spectral "fingerprints" in a stationary scene change over time, or grab the spectral signature of a lightning bolt in real time.
Tkaczyk said TuLIPSS is unique because it works like any camera, capturing all the hyperspectral data - what researchers refer to as a data cube - in an instant. That means an airplane or orbiting satellite can snap an image of the ground quickly enough to avoid motion blur that would distort the data. Onboard processing will filter the data and send only what's required back to Earth, saving time and energy.
"This would be an interesting tool in the case of an event like Hurricane Harvey," Tkaczyk said. "When there's a flood and potential contamination, a device able to fly over a reservoir could tell if that water is safe for people to drink. It would be more effective than sending someone to a site that may be hard to reach."
In a normal camera, a lens focuses incoming light onto a sensor chip, which converts the light into an image. In TuLIPSS, the lens focuses that light onto a middleman: the bundle of optical fibers.
In the current prototype, these fibers collect more than 30,000 spatial samples, each with 61 spectral channels in the 450-to-750 nanometer range - nearly 2 million data points in all - split by prisms into their component bands and passed on to a detector. The detector then feeds these data points to software that recombines them into the desired images or spectra.
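The structure described above is what researchers call a data cube: a full spectrum at every spatial sample. A minimal sketch of such a cube, using the dimensions reported here (the 180 x 170 spatial layout is a hypothetical arrangement chosen only to illustrate the stated sample count):

```python
import numpy as np

# Illustrative only: a hyperspectral data cube with TuLIPSS-like dimensions.
# The spatial layout (rows x cols) is hypothetical; the prototype reports
# >30,000 spatial samples and 61 spectral channels spanning 450-750 nm.
rows, cols, channels = 180, 170, 61            # 180 * 170 = 30,600 spatial samples
wavelengths = np.linspace(450, 750, channels)  # nm, one value per channel

cube = np.zeros((rows, cols, channels))        # one full spectrum per spatial sample

# Each spatial pixel holds a complete spectrum...
spectrum_at_pixel = cube[0, 0, :]              # 61 values, one per wavelength
# ...and each channel is a full image at one wavelength band:
image_near_550nm = cube[:, :, np.argmin(np.abs(wavelengths - 550))]

print(cube.size)  # -> 1866600 data points captured in a single snapshot
```

Because the whole cube is captured in one exposure rather than line by line, there is no scan to reassemble afterward.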
The fiber array is tightly packed at the input and rearranged into individually addressable rows at the output, with gaps between them to avoid overlap. Spacing the rows allows researchers to tune spatial and spectral sampling for specific applications, Tkaczyk said.
First author Ye Wang, who earned her doctorate this year at Rice, and her colleagues painstakingly built the prototype, assembling and positioning the fiber bundles by hand. They used scenes in and around Rice to test it, reconstructing images of buildings to fine-tune TuLIPSS and taking spectral images of campus trees to "detect" their species. They also successfully analyzed the health of various plants with spectral data alone.
Continuous-capture images of moving traffic in Houston showed the system's ability to see which spectra are shifting over time (such as moving vehicles and changing traffic lights) and which are stable (everything else). The experiment was a useful proof-of-concept to show how well the spectrometer could filter motion blur in dynamic situations.
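One simple way to separate changing from stable spectra in a continuous-capture sequence is to measure each pixel's variability over time. This is a sketch under assumed data, not the authors' published method (the threshold and the simulated "traffic light" pixel are hypothetical):

```python
import numpy as np

# Sketch (not the authors' algorithm): flag spatial pixels whose spectra
# change across a sequence of snapshot data cubes.
rng = np.random.default_rng(0)
frames, rows, cols, channels = 10, 8, 8, 61
seq = rng.normal(1.0, 0.01, (frames, rows, cols, channels))  # mostly static scene

# Simulate one pixel (e.g., a changing traffic light) whose spectrum shifts
# midway through the sequence.
seq[5:, 2, 3, :] += 0.5

# Per-pixel variability: standard deviation over time, averaged across channels.
variability = seq.std(axis=0).mean(axis=-1)  # shape (rows, cols)
moving = variability > 0.05                  # threshold chosen for this toy data

print(np.argwhere(moving))  # -> [[2 3]]
```

Static pixels show only sensor noise, so their temporal variability stays near zero, while the shifted pixel stands out clearly.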
Co-author David Alexander, a professor of physics and astronomy and director of the Rice Space Institute, said the researchers have begun discussions with the city of Houston and Rice's Kinder Institute for Urban Research about testing TuLIPSS in aerial studies of the city.
"Since we need to test TuLIPSS anyway, we want to do something useful," he said, suggesting a hyperspectral map of the city could reveal how the urban landscape is changing, distinguish buildings from parks or map sources of pollen. "In principle, regular flights over the city will allow us to map out the changing conditions and identify areas that need attention."
Tkaczyk suggested future versions of TuLIPSS will be useful for agricultural and atmospheric analysis, monitoring algae blooms and tracking other environmental conditions where quick data acquisition is valuable.
"The real challenge has been to decide what to focus on first," Alexander said. "Ultimately, we want to be successful enough that the next phase of development pushes us closer to flying TuLIPSS in space."