Most of the electrical engineers I’ve met have a peculiar love for games and puzzles. Some tear through Sudoku books. Others are chess masters who plan so many moves ahead, I’m toast when I reach for my first pawn. Eric Miller isn’t one of these people.
Miller, a professor of electrical and computer engineering at the School of Engineering, isn’t a big fan of games. He seems to be saving his energy for the serious stuff: inverse problems.
In the scientific world, Miller says, questions can break down into two broad categories. There are “forward problems,” which you probably recognize from grade school: If two factories are on a riverbank, and each one releases a certain amount of pollutant into the water, how much pollution is in the river? (Easy: It’s a matter of A plus B equals C; just plug in the numbers and solve the equation.)
Inverse problems, though, flip that construct: You have a river flooded with a certain level of pollutants, but you don’t know which factory is putting what into the water. How do you begin to solve the problem?
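The distinction is easy to make concrete with a toy version of the river example. The numbers below are purely illustrative, not drawn from any real site:

```python
# Forward problem: known inputs, compute the output.
def forward(factory_a, factory_b):
    """Total pollutant in the river is the sum of the two sources."""
    return factory_a + factory_b

total = forward(3.0, 5.0)  # plug in the numbers and solve: 3 + 5 = 8 units

# Inverse problem: known output, recover the inputs.
# With only the total, infinitely many (a, b) pairs fit the data:
# 1 + 7, 4 + 4, 6 + 2, and so on. Extra information -- say, a second
# reading taken just downstream of factory A -- is needed before a
# unique answer can be pinned down.
candidates = [(a, total - a) for a in range(0, 9)]
```

That ambiguity, where many different underlying causes explain the same observation equally well, is what makes inverse problems hard.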
Questions like these are what get Miller up in the morning. “It’s just sort of a perverse fascination,” he says, laughing. “It’s the only type of puzzle that I like.”
Miller’s specialty is a field called image processing—the art of taking raw data from a sensor and working backwards, mathematically, to turn it into a detailed picture. He’s done research to improve medical imaging devices, creating software that can highlight cancerous tumors in breast tissue, for example. He’s worked with the federal Department of Homeland Security on systems that can automatically find “objects of interest” (read: explosives) in airport baggage.
But the work Miller is most excited about is also his most challenging—creating images of chemical spills that have leached deep into the ground, using only the little bit of evidence that’s available. The resulting images—with much more detail than previously possible—could give cleanup crews the ability to target specific areas of a contaminated site, making cleanup cheaper, more efficient and more effective.
Creating an Accurate Picture
More than 250,000 locations around the United States—former industrial sites, landfills and military installations—have dangerous levels of chemical pollutants in the soil, according to the U.S. Environmental Protection Agency. In some cases, those contaminants stay close to the surface, but in many, they seep slowly downward, eventually reaching the water table. Because nearly half of the U.S. population gets its drinking water from wells and aquifers, those sites pose a big problem.
Cleanup, if it’s going to happen, needs to be focused on the specific areas where the chemicals are actually located. But you can’t just go and dig up the pollutants, Miller says. In many cases, chemical plumes stretch 50 or 60 feet under the surface.
Unfortunately, it’s extremely difficult to get an accurate picture of the shape and size of these chemical plumes. Unlike a CT scanner or an MRI machine, which encircles the human body and collects images from all sides, underground sensors yield only sparse data.
“You can dig a few dozen monitoring wells downstream at large sites, take readings at different depths in them, and get a sort of two-dimensional slice of the site,” says Miller. “You can also do something called electrical resistance tomography, running low currents through the soil and reading where those currents hit resistance.”
But such techniques can only provide a loose idea of what’s underground. At most, they generate a few hundred points of data—not the millions of data points that you’d get from an MRI. So Miller is faced with the challenge of creating a detailed picture of a pollutant plume from very limited information.
“It’s sort of a needle-in-a-haystack problem, but you don’t get to see the needle or the haystack,” he says. “You just get indirect information about them.”
The Expected and the Measured
To get around this limitation, Miller and his collaborator, Linda Abriola, dean of the School of Engineering and a professor of civil and environmental engineering, have developed a novel approach. Instead of trying to build an image of the pollutant plume directly, Abriola first creates a model that predicts the spill’s geometry—its potential shape and structure—by analyzing site-specific data, like the type of soil below ground and the rate at which chemicals might seep through it.
It’s an inexact method, and can only give a ballpark sense of the size of each plume. But that’s where Miller’s expertise comes in. He’s developing computer algorithms that combine limited information from wells and other sensors to fine-tune these geometric models, creating a detailed map of the plume’s shape and size. It’s a multi-step process that involves complex mathematics.
“First you start with the physics—understanding the physical relationship between the data that you get, the sensors that you’re using, and the quantity of material that you’re interested in imaging,” says Miller. He then uses the geometric models to predict what data might be expected at the site, and compares the two. “Is that data similar to the data that the sensors actually collected? If it is, you’re done.”
If it’s not, Miller looks at the difference between what was expected and what was measured and tweaks the parameters of the model for a better fit. Using computer programs he’s written, Miller repeats the process over and over again until the final geometric model takes shape.
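The predict-compare-tweak loop Miller describes can be sketched in a few lines. This is only a toy illustration under invented assumptions—a hypothetical one-parameter plume model fit by a crude parameter sweep—not the actual software or mathematics his group uses:

```python
import math

# Hypothetical forward model: predicted concentration at each well depth,
# falling off with depth according to a single unknown parameter (width).
def predict(width, depths):
    return [math.exp(-(d / width) ** 2) for d in depths]

depths = [5.0, 15.0, 30.0, 50.0]      # a handful of sparse well readings
measured = predict(20.0, depths)      # pretend ground truth: width = 20

# Misfit between what the model expects and what the sensors measured.
def misfit(width):
    return sum((p - m) ** 2 for p, m in zip(predict(width, depths), measured))

# Repeatedly tweak the parameter while doing so improves the fit
# (a crude stand-in for the real optimization machinery).
width = 5.0
step = 1.0
while misfit(width + step) < misfit(width):
    width += step
# width has now climbed to the best-fitting value, 20.0
```

The real problem involves many parameters, noisy measurements, and far more sophisticated physics, but the skeleton is the same: predict, compare, adjust, repeat.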
Although this method looks promising, more research needs to be done, Miller says. At the moment, he’s working with computer simulations, rather than real-world data, to slowly refine his technique so it can eventually be used in the field.
It may be a while until his image processing methods are used for environmental cleanup, but Miller’s colleagues—like Kurt Pennell, chair of the Department of Civil and Environmental Engineering—are eagerly awaiting the day when it’s ready for prime time.
“There are real-world sites that could really benefit from this method,” Pennell says, citing a heavily polluted area in Groveland, Mass., where chemicals from an industrial site have seeped down into the water table. For residents of Groveland, a town of roughly 7,000 people, a successful cleanup effort could mean regaining a source of clean water within town boundaries. When the underground pollutants were first discovered in the 1970s, the town was forced to close both of its municipal wells—then its sole sources of drinking water.
“Attempts to treat it over the last 10 years haven’t been very effective,” says Pennell. “Eric’s technique could better characterize a site like this, so when you apply technology to clean it up, you’d have a better chance of making it work.”
David Levin is a freelance science writer based in Boston. He can be reached through his website at www.therealdavidlevin.com.