NIBIB-funded researchers at the Massachusetts General Hospital have developed a smartphone-based device that can reliably carry out molecular diagnoses in under an hour for approximately two dollars per patient. The device could enable point-of-care cancer diagnostics in low- to middle-income or remote areas, which often have high rates of mortality from cancer due to missed opportunities for treatment.
“In these areas, patient samples often have to be shipped to facilities that are capable of carrying out conventional pathology services,” says Richard Conroy, Ph.D., program director for Molecular Imaging at NIBIB. “As a result, it can take several days before a diagnosis is returned to the patient. In many cases, patients aren’t able to return for follow-up care either because they have to travel long distances to reach a clinic or can’t afford to take multiple days off work. A low-cost technology that can diagnose cancer at the point-of-care would enable patients to begin treatment on the same day that they are tested, greatly increasing the number of patients who receive treatment.”
The development of the new device was led by Hakho Lee, Ph.D., an associate professor in Radiology at Harvard Medical School / Massachusetts General Hospital, and Ralph Weissleder, M.D., Ph.D., director of the Center for Systems Biology at Massachusetts General Hospital. They describe their device in the May 5, 2015 issue of the journal PNAS.1
According to Lee, molecular diagnostics have been difficult to perform at the point-of-care due to a lack of infrastructure and trained personnel.
“To carry out molecular diagnostics currently, you need a good microscope, you need antibodies or ligands that can recognize a molecular target, and you need a specialized person who can interpret the data. Right now, those three things are hard to obtain in point-of-care settings,” says Lee.
The new device, called the D3 (digital diffraction diagnosis system), consists of a smartphone and a snap-on imaging module containing a battery-powered LED light and a lens. After a sample (blood, aspirate, or another biological fluid) is collected from a patient, it is mixed with microbeads coated with specific antibodies. The antibodies bind to molecules expressed on the surface of cancer cells, and different antibodies are used depending on the type of cancer being screened for. The mixture is then placed on a microscope slide and inserted into the imaging module, which allows the researchers to photograph the cell-bead mixture.
With its wide field of view, the system can capture more than 100,000 cells in a single image, roughly 100 times more than a traditional microscope, which has a much narrower field of view, typically captures. Initially, the researchers expected to distinguish cancerous from non-cancerous cells simply by checking whether beads were bound to them. However, the beads and cells diffracted the light, greatly distorting the images. This led the researchers to develop an algorithm that reconstructs images of bead-bound cells from the diffraction patterns the camera captures.
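Reconstructing an in-focus image from a recorded diffraction pattern is a standard problem in lens-free digital holography. The sketch below uses the angular-spectrum method, one common approach to this kind of reconstruction; the wavelength, pixel size, and propagation distance are illustrative values, and the researchers' actual algorithm may differ.

```python
import numpy as np

def angular_spectrum_reconstruct(hologram, wavelength, pixel_size, z):
    """Back-propagate a recorded diffraction pattern to the object plane.

    hologram:   2-D array of recorded intensity (the diffraction pattern)
    wavelength: illumination wavelength in meters (illustrative)
    pixel_size: sensor pixel pitch in meters (illustrative)
    z:          sample-to-sensor distance in meters (illustrative)
    """
    field = np.sqrt(hologram.astype(float))        # amplitude estimate
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)          # spatial frequencies
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    # Angular-spectrum transfer function for back-propagation by distance z
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(-1j * z * kz)
    H[arg < 0] = 0                                 # discard evanescent waves
    spectrum = np.fft.fft2(field)
    reconstructed = np.fft.ifft2(spectrum * H)
    return np.abs(reconstructed) ** 2              # in-focus intensity image

# Example: a featureless (uniform) hologram reconstructs to a uniform image
flat = angular_spectrum_reconstruct(np.ones((64, 64)), 405e-9, 1.12e-6, 500e-6)
```

The heavy cost referred to in the article comes from the two large FFTs per image, which parallelize well on server hardware.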
Because their reconstruction process requires heavy computations, the researchers realized early on that they would be limited by the smartphone’s processing capabilities. To get around this, they created an application for the smartphone that automatically uploads the diffraction images—as soon as they’ve been snapped—to a secure cloud, after which they are transmitted to a server at the Massachusetts General Hospital. That server is capable of conducting many computations in parallel and can reconstruct the images in less than one-tenth of a second. Using these reconstructed images, the server then counts the total number of cells with beads attached to them as well as the total number of beads attached to a given cell. Based on these numbers, the sample is classified as high-risk, low-risk, or benign.
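The final counting-and-classification step described above can be illustrated with a short sketch. The use of mean bead load per cell and the threshold values here are assumptions made for illustration; the paper's actual scoring rule is not specified in this article.

```python
def classify_sample(beads_per_cell, low_cut=1.0, high_cut=2.0):
    """Classify a sample from per-cell bead counts.

    beads_per_cell: one bead count per detected cell.
    low_cut / high_cut: illustrative thresholds on the mean bead load,
    not values from the paper.
    """
    if not beads_per_cell:
        return "benign"
    mean_load = sum(beads_per_cell) / len(beads_per_cell)
    if mean_load >= high_cut:
        return "high-risk"
    if mean_load >= low_cut:
        return "low-risk"
    return "benign"
```

In practice the server would first segment cells and beads in the reconstructed image to produce the per-cell counts; this sketch covers only the final decision step.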
The researchers recently tested the device using cervical specimens from twenty-five patients with previously abnormal Pap smear results. The cell samples were mixed with beads targeting three known cell markers of cervical cancer. The researchers reported a positive correlation between the number of beads per cell and the risk of cancer as confirmed by a pathologist's conventional analysis, and they were able to classify the patients as high-risk or low-risk/benign with 100% sensitivity and 92% specificity.
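The reported performance figures follow from standard confusion-matrix arithmetic. In the sketch below, the patient breakdown (12 true high-risk cases with no misses, 13 low-risk/benign cases with one false positive) is a hypothetical split chosen to be consistent with the reported 100% sensitivity and 92% specificity, not the paper's actual counts.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical split of 25 patients consistent with the reported figures
sens, spec = sensitivity_specificity(tp=12, fn=0, tn=12, fp=1)
```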
“The speed at which this technology can diagnose disease is extremely impressive,” said Conroy. “The researchers have taken a process that sometimes takes several days using conventional pathology methods and have condensed it to under an hour. In addition, by taking advantage of cloud-computing and smartphone technology, they’re making the technology available to those who need it the most and for a very low cost.”
In addition to detecting cell surface proteins to identify cancer cells, the system can also be adapted to detect DNA. The researchers reported in their paper that they could accurately detect human papillomavirus (HPV) DNA in cervical cancer specimens. This ability to detect DNA opens the door for rapid diagnosis of infectious diseases in addition to cancer.
In the future, Lee plans to improve the spatial resolution of the images from 2.2 microns to 1.2 microns through the use of additional computing methods. In doing so, Lee says, the system will be able to analyze multiple markers at once, a capability called multiplexing, by attaching different antibodies to different-sized beads and mixing them together in a single sample.
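Multiplexing by bead size amounts to a simple lookup: each measured bead diameter is mapped to the marker associated with the size bin it falls into. The size bins and marker names below are hypothetical, chosen only to illustrate the idea.

```python
def assign_marker(bead_diameter_um, size_bins):
    """Map a measured bead diameter (in microns) to its marker.

    size_bins: {(lo, hi): marker_name} with half-open ranges [lo, hi).
    Bin boundaries and marker names are illustrative assumptions.
    """
    for (lo, hi), marker in size_bins.items():
        if lo <= bead_diameter_um < hi:
            return marker
    return None  # diameter falls outside every known bin

# Hypothetical two-marker panel distinguished purely by bead size
bins = {(2.0, 4.0): "marker A", (5.0, 7.0): "marker B"}
```

Making the size bins well separated relative to the 1.2-micron resolution target is what would keep the assignment unambiguous.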
Lee and his team hope to bring the system to Botswana in the near future to test whether it can be easily adopted by local healthcare workers to screen for lymphoma.
This research was funded by National Institutes of Health grants EB010011, EB00462605A1, HL113156, T32CA79443, and K12CA087723-11A1.
1. Im H, Castro CM, Shao H, Liong M, Song J, Pathania D, Fexon L, Min C, Avila-Wallace M, Zurkiya O, Rho J, Magaoay B, Tambouret RH, Pivovarov M, Weissleder R, Lee H. Digital diffraction analysis enables low-cost molecular diagnostics on a smartphone. Proc Natl Acad Sci U S A. 2015 May 5;112(18):5613-8.