
Cancer researcher Bert Vogelstein, an HHMI investigator at Johns Hopkins University, can now go through millions of genes to find the rare, tumor-specific ones that provide clues about the origins of disease—and about targets for therapy or cancer tests.
“Before, it was just too expensive to study a lot of patients, but now we can,” Vogelstein says.
The technology isn't without its challenges, however. For example, the new systems spew out data with the force of a fire hose.
HHMI investigator Greg Hannon, whose team uses the Solexa instrument at Cold Spring Harbor Laboratory to study gene regulation, says it produces a terabyte, or one trillion bytes, of raw data. That's more data than can be transferred easily via the Internet. So his team started filling hard drives with the data and “transferring” the information on foot to cars for delivery to a data center. He calls this a “sneaker protocol.” That's sneaker as in shoes.
“Many investigators are not prepared to deal with this large amount of data,” says Thomas Tuschl, an HHMI investigator at Rockefeller University who studies the role of RNA in gene silencing. “They do two runs and then they'll spend a year trying to build up the software to interpret it.”
Over the years, Tuschl has built relationships with experts in bioinformatics, who have built software to handle his deluge of data. He expects the new platforms will double the pace of discovery in his lab, while lowering costs fivefold.
Hannon, too, isn't put off. “This is the natural evolution of every technology. It goes from the effort required to learn to drive a car to thinking about where you're going to go.”
As for Mandel, her experience convinced her to purchase a Solexa system. She says Solexa will not only give her more data more quickly, it will also enable her to piece together more easily the big picture of the proteins involved in neurological function.