New Field Of Research Could Help Police In Crime Scene Forensics
- Date:
- September 1, 2008
- Source:
- The Translational Genomics Research Institute
- Summary:
- A team of investigators has found a way to identify possible suspects at crime scenes using only a small amount of DNA, even if it is mixed with hundreds of other genetic fingerprints.
A team of investigators led by scientists at the Translational Genomics Research Institute (TGen) has found a way to identify possible suspects at crime scenes using only a small amount of DNA, even if it is mixed with hundreds of other genetic fingerprints.
Using genotyping microarrays, the scientists were able to identify an individual's DNA from within a mix of DNA samples, even if that individual represented less than 0.1 percent of the total mix, or less than one part per thousand. They were able to do this even when the mix included more than 200 individual DNA samples.
The results recently appeared in PLoS Genetics, a peer-reviewed open-access journal published by the Public Library of Science.
The discovery could help police investigators better identify possible suspects, even when dozens of people over time have been at a crime scene. It also could help reassess previous crime scene evidence, and it could have other uses in various genetic studies and in statistical analysis.
"This is a potentially revolutionary advance in the field of forensics,'' said the paper's senior author, Dr. David W. Craig, associate director of TGen's Neurogenomics Division, which otherwise is charged with finding ways to treat diseases and conditions of the brain and nervous system. "By employing the powers of genomic technology, it is now possible to know with near certainty that a particular individual was at a particular location, even with only trace amounts of DNA and even if dozens or even hundreds of others were there, too.''
The researchers analyzed complex mixes of genomic DNA using high-density single nucleotide polymorphism (SNP) genotyping microarrays. This approach enabled them to accurately identify individuals in mixes of at least 200 people, even when an individual's DNA accounted for less than one-thousandth of the total. In theory, they showed, individuals could be identified in mixes of more than 1,000 people.
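In broad terms, an analysis of this kind asks, at each SNP, whether the mixture's allele frequency sits slightly closer to a given individual's genotype than a reference population's frequency would predict; aggregated over many thousands of markers, even a fraction-of-a-percent contribution becomes statistically detectable. The Python sketch below is a simplified simulation of that idea, not the authors' actual pipeline; the marker count, mixture size, and use of the true population frequencies as the reference are all simplifying assumptions made for illustration.

```python
# Illustrative sketch only: a simplified simulation of the kind of
# allele-frequency comparison described in the paper. All numbers are
# hypothetical; a real analysis works from microarray intensity data
# and an independently estimated reference population.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_snps = 50_000     # SNP markers on the array (hypothetical count)
n_people = 200      # individuals contributing equally to the mixture
pop_freq = rng.uniform(0.05, 0.95, n_snps)   # reference allele frequencies

# Genotypes coded as per-person allele frequencies: 0.0, 0.5, or 1.0
genotypes = rng.binomial(2, pop_freq, size=(n_people, n_snps)) / 2.0

# The mixture's allele frequency at each SNP is the average over contributors.
mixture_freq = genotypes.mean(axis=0)

def presence_statistic(person_geno, mixture, population):
    """Per-SNP distance measure: positive on average when the person is in
    the mixture, near zero when they are not. A one-sample t-test across
    SNPs asks whether the mean is greater than zero."""
    d = np.abs(person_geno - population) - np.abs(person_geno - mixture)
    return stats.ttest_1samp(d, 0.0, alternative="greater")

# Someone contributing 1/200 (0.5 percent) of the mixture DNA
t_in, p_in = presence_statistic(genotypes[0], mixture_freq, pop_freq)

# Someone from the same population who is absent from the mixture
absent = rng.binomial(2, pop_freq) / 2.0
t_out, p_out = presence_statistic(absent, mixture_freq, pop_freq)

print(f"contributor:     t = {t_in:8.1f}, p = {p_in:.2e}")
print(f"non-contributor: t = {t_out:8.1f}, p = {p_out:.2e}")
```

Run on these simulated inputs, the contributor produces a large, highly significant test statistic while the non-contributor does not, which is the separation the approach relies on.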
Currently, it is difficult for police forensic investigators to detect an individual whose genomic DNA makes up less than 10 percent of a mix, or whose DNA is part of a large mix of genetic material. Until now, a long-held assumption within the field of forensic science was that it was not possible to identify individuals using pooled data.
According to Commander Brent Vermeer, director of the Phoenix Police Department crime lab, much DNA evidence is rendered useless by contamination, and turning TGen's theoretical research into a cost-effective police practice "would be an amazing asset.''
A new Arizona law, Senate Bill 1412, passed by the Legislature in June, requires police agencies to keep DNA evidence in cases of homicide or felony sexual assault for as long as convicts are in prison or on supervised release, or for at least 55 years in unsolved cases. Some agencies, such as Phoenix, keep it indefinitely.
"As technology advances, we need to be prepared to keep evidence that, down the road, could prove again to be useful,'' said Vermeer, who heads a bureau of nearly 130 analysts and crime scene investigators.
Craig said the findings presented in the paper should foster more scientific investigation that could lead to cost-effective ways of using the TGen technology to fight crime.
"It opens up ideas never considered before,'' Craig said.
Dr. Stanley F. Nelson, director of the UCLA site of the National Institutes of Health's Neuroscience Microarray Consortium, said forensics investigators are "often stymied'' because they currently search for fewer than 20 DNA markers. The TGen researchers looked at hundreds of thousands of markers to make their identifications, he said.
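As a rough, back-of-envelope illustration of Nelson's point (not a calculation from the paper): if each marker supplies an equally weak, independent piece of evidence, the aggregate detection statistic grows roughly with the square root of the number of markers, so moving from about 20 markers to hundreds of thousands strengthens the signal by a factor on the order of 100.

```python
# Back-of-envelope illustration (hypothetical, not from the paper):
# with equally weak, independent evidence per marker, the aggregate
# statistic scales with the square root of the marker count.
import math

per_marker_signal = 1.0   # arbitrary units
for n_markers in (20, 500, 50_000, 500_000):
    aggregate = per_marker_signal * math.sqrt(n_markers)
    print(f"{n_markers:>7} markers -> relative signal ~ {aggregate:7.1f}")
```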
"It opens up a whole new can of worms of what's possible to do forensically,'' said Nelson, professor of Human Genetics and Psychiatry at UCLA's David Geffen School of Medicine. Nelson contributed to the TGen paper.
Nelson said that, using current police methods, DNA processing costs less than $50, while a similar process for genomic research costs several hundred dollars. However, with advances in technology, those costs should come down, he said.
The TGen study resulted from what Nelson described as "an intellectual curiosity'' by Craig while investigating diseases. Nils Homer, a former TGen intern who is now pursuing a doctorate in computer science at UCLA, brought Nelson and Craig together. Homer is the paper's first author.
"We demonstrate an approach for rapidly and sensitively determining whether a trace amount … of genomic DNA from an individual is present within a complex DNA mixture,'' the paper said.
Story Source:
Materials provided by The Translational Genomics Research Institute. Note: Content may be edited for style and length.