A cross-disciplinary collaboration between faculty at the UH Cullen College of Engineering and Duke University has opened up new avenues for cancer research and other clinical investigations.
This collaboration is centered on the FARSIGHT software suite, developed by a group led by Badrinath Roysam, chair of the college's Department of Electrical and Computer Engineering.
FARSIGHT is designed to rapidly analyze images of human tissue collected from laser-scanning microscopes by quantifying specific molecules of interest in cells and tissue. Researchers have used it for everything from analyzing brain tissue after injury to studying the effectiveness of experimental medications.
Before FARSIGHT can provide this information, though, the software must learn how to analyze each distinct set of images. It does this by quantifying sample images and then asking the user to confirm whether the mathematical properties of these samples correspond to a specific cell or tissue type.
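The confirm-to-train loop described above can be sketched in a few lines. This is a minimal illustration, not FARSIGHT's actual API: the feature extraction and the "user" below are hypothetical stand-ins, with a toy intensity feature and a simulated confirmation rule.

```python
# Minimal sketch of an interactive training loop, assuming a toy
# feature (mean pixel intensity) and a simulated user response.
def quantify(cell):
    """Toy feature: mean pixel intensity of a cell image (list of pixels)."""
    return sum(cell) / len(cell)

def user_confirms(feature, threshold=100):
    """Simulated user: confirms the proposed label when intensity is high."""
    return feature > threshold

training_set = []
for cell in [[90, 95, 100], [150, 160, 170], [200, 210, 220]]:
    feature = quantify(cell)
    label = user_confirms(feature)         # user confirms or rejects the proposal
    training_set.append((feature, label))  # every answer becomes a training example
```

Each confirmed or rejected sample becomes one labeled training example; the question the rest of the article addresses is which samples are worth asking about.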
Previously, these examples were randomly selected by researchers, a method that was far from ideal. “There was no way for the biologist to pick the mathematically most informative examples for FARSIGHT to learn from,” said Roysam. As a result, researchers would often have to guide the software through hundreds of sample images before it was able to perform independent analysis. This time-consuming process ran directly counter to FARSIGHT’s goal of accelerating clinical research.
This is where Larry Carin, a professor of electrical engineering at Duke University, comes in. With funding from the Office of Naval Research, Carin has developed “active learning” software to help mine-sweeping robots learn how to distinguish random flotsam and jetsam from unexploded mines.
Raghav Padmanabhan, a graduate student advised by Roysam, worked closely with Carin’s staff to incorporate the active learning approach into FARSIGHT. With it, FARSIGHT is able to pick out the most valuable images from which to learn, allowing it to dramatically reduce the number of samples it has to review before it can analyze images independently. While it’s too early to put a specific, definitive number on how much more quickly the software now learns, Roysam said in some cases it has shaved days off the training period while delivering much more reliable cell identifications.
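A common way to realize this idea is uncertainty sampling: instead of drawing training examples at random, the learner asks about the unlabeled sample its current model is least sure about. The sketch below is a generic illustration of that principle, not FARSIGHT's implementation; the toy one-feature model and its decision boundary are assumptions for the example.

```python
# Uncertainty-based active learning: pick the unlabeled sample whose
# predicted probability is closest to 0.5 (the model's decision boundary).
import math

def uncertainty(p):
    """Distance of a predicted probability from maximum uncertainty (0.5)."""
    return abs(p - 0.5)

def pick_most_informative(pool, predict):
    """Choose the sample the current model is least confident about."""
    return min(pool, key=lambda x: uncertainty(predict(x)))

# Toy current model: probability that a cell with feature x is the target type,
# with a decision boundary at x = 5.0 (an assumption for illustration).
def predict(x):
    return 1 / (1 + math.exp(-(x - 5.0)))

pool = [1.0, 3.0, 4.9, 8.0, 12.0]
sample = pick_most_informative(pool, predict)
# The sample nearest the boundary (4.9) is selected for the user to label.
```

Labeling the boundary cases first is what lets an active learner reach reliable classifications with far fewer user-confirmed examples than random selection.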
This advance in the FARSIGHT system is already being exploited by William Lee, an associate professor of medicine, hematology and oncology at the University of Pennsylvania. Lee is using the improved FARSIGHT system to study a particularly lethal form of kidney cancer, clear-cell renal cell carcinoma. In particular, he is focusing on the effects of experimental drugs on the endothelial cells that form the blood vessels feeding tumors, an approach that could open a significant new avenue in cancer research.
“With the computer program having learned to pick out endothelial cells reliably out of multi-stained slides, we have now automated this process, and it seems to be highly accurate,” said Lee. “We can begin to study the endothelial cells of human cancer – something that is not being done because it’s so difficult and time-consuming.”
Lee, of course, won’t be the only one who benefits from this improvement to the FARSIGHT toolkit. This new functionality is now a permanent part of the system and available to all who use it.