A team intent on unraveling the genetic secrets of archaic hominins has come up with a new strategy for amplifying single rather than double strands of DNA, making it possible to sequence ancient genomes to far greater depth than was previously possible.
And because the strategy is specialized for dealing with old and/or somewhat degraded DNA, its developers say it could prove useful not only for sequencing ancient genetic material, but also for performing more sensitive forensic studies.
"[W]e generated protocols from scratch that take into account the special preservation conditions of ancient DNA, such as fragmentation and strand breaks," Max Planck Institute for Evolutionary Anthropology researcher Matthias Meyer said during a telephone briefing with reporters last week.
As described in a study appearing online last week in Science, Meyer and his colleagues applied the approach to sequencing the genome of a female from an archaic group known as the Denisovans to more than 30-fold coverage, on average. Genomic DNA for the study came from a fragment of finger bone found in a Siberian cave in 2010 that is believed to be tens of thousands of years old.
Prior to this, the Denisova genome had been sequenced to an average of around 1.9-fold coverage, using DNA from the same finger bone shard and a DNA repair-based method developed in 2010 (IS 5/25/2010).
With its newest sequencing method, though, the team "succeeded in developing a more efficient way of extracting information from the few DNA fragments that are preserved in the bone," Meyer told reporters.
And by dramatically bumping up the depth of coverage for the archaic hominin genome, the team was able to do much more detailed studies on archaic admixture with modern humans and to delve into genetic features that have arisen in the modern human lineage since the split from the Denisovans and a related archaic group, the Neandertals.
"We've sort of taken the next technical step, if you like," senior author Svante Pääbo, director of the evolutionary genetics department at the Max Planck Institute for Evolutionary Anthropology, said during last week's press briefing, "in that we have determined this genome sequence from this little finger bone to a quality that's equal to what you would determine a genome sequence in me or you today."
Having gotten over this latest hurdle in sequencing ancient DNA, Pääbo said during last week's briefing that there is now "no difference in what we can learn genetically about a person that lived 50,000 years ago and from a person today, provided that we have well-enough preserved bones."
For that, Pääbo credits the work of Meyer, who has spent much time since the original Denisova genome was sequenced coming up with more refined methods for garnering sequence data from ancient DNA.
Prior to that, the DNA amplification and sequencing protocols used for ancient DNA were much the same as those used to deal with modern samples, with a few tweaks here and there.
"Until recently, we only adapted protocols that were initially developed for modern DNA — and these are done on double-stranded DNA," Meyer told In Sequence.
With that in mind, he and his team decided to try to find ways to completely overhaul the process so that the short pieces of DNA that are found in very old and degraded samples would become a benefit rather than a hindrance.
"It differs in the way that we convert the DNA that we extract from the fossil into sequencing libraries," Meyer explained.
Rather than attaching sequencing adaptors to double-stranded DNA during library preparation, the new method involves converting the double-stranded DNA to individual strands prior to amplification and adaptor addition, so that each of the strands gets amplified separately.
"[F]or ancient DNA, the use of single-stranded DNA may be advantageous as it will double its representation in the library," the studies authors explained.
"Furthermore," they elaborated, "in a single-stranded DNA library, double-stranded molecules that carry modifications on one strand that prevent their incorporation into double-stranded DNA libraries could still be represented by the unmodified strand."
By using streptavidin beads to keep pieces of DNA immobilized over much of the preparation, the researchers found that they could also avoid much of the DNA loss usually associated with purification steps.
The team started by stripping phosphates from the DNA — a step that prevents DNA strands from self-ligating when a single-strand ligase enzyme known as CircLigase is added to the mix later.
From there, the researchers denature the DNA with heat and slap on the first sequencing adaptor. Because this adaptor is biotinylated, each single-stranded stretch of DNA can then be nabbed with streptavidin beads.
"The first adaptor includes a biotin that makes it possible to do all subsequent steps on solid support by immobilizing the ligated molecules to streptavidin beads," Meyer said.
"The benefit of using streptavidin beads is that they have a very high coupling efficiency to biotin," he added, "and once this link is established, you can exchange buffers and enzymes as often as you want without having any substantial losses."
Once it's been restrained in this manner, the DNA on each bead gets amplified with the help of a primer hybridized to the adaptor sequence. Finally, a second sequencing adaptor is added to the amplified genetic material before it is freed from the beads.
"Once we've done the single-strand ligation step, we then copy the molecule and make it double-stranded again. Then we add the second adaptor using very similar methodology as used previously," Meyer said, noting that the method's "most creative steps" are the first ones.
After showing that they could create and sequence libraries prepared in this way using a variety of samples, including more expendable ancient DNA from cave bear samples, the researchers used this method to amplify and sequence genomic DNA from the Denisovan finger bone to an average of around 31-fold coverage on the Illumina GAIIx.
Meyer noted that Illumina's short read technology is well suited to dealing with ancient DNA, which already tends to be found in short pieces — on the order of 40 or 50 bases in many cases.
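To put that depth figure in rough perspective, average coverage is simply the total number of mapped bases divided by the size of the genome. The short calculation below uses illustrative stand-in values — a roughly 3 Gb genome and ~45-base reads — rather than figures taken from the paper.

```python
# Average depth of coverage = total mapped bases / genome size.
# Illustrative numbers only; not taken from the study.

genome_size = 3.0e9        # ~3 Gb, roughly the size of a hominin genome
mean_read_length = 45      # ancient DNA fragments are often ~40-50 bases long
target_coverage = 31       # average fold coverage reported for the Denisovan genome

reads_needed = target_coverage * genome_size / mean_read_length
print(f"~{reads_needed:.2e} mapped reads needed")   # on the order of 2 x 10^9 reads
```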
Even so, he explained that it should be fairly straightforward to tweak the protocol to incorporate other adaptors and make the resulting library compatible with other sequencing platforms.
With this upgraded Denisovan genome in hand, the investigators have now done far more detailed analyses of the archaic hominin itself — for instance, identifying variants in the genome that, in present-day human populations, are linked to dark skin and brown eyes.
Moreover, they have teased apart new information about interbreeding between modern humans and both Denisovans and Neandertals, and established a preliminary list of sites that are conserved in archaic hominins and other primates but have changed in the modern human genome since our lineage diverged.
The single-stranded method offered a closer look at what happens to DNA over very long periods of time, too, the researchers reported, revealing that ancient DNA seems to be particularly prone to guanine residue loss.
The new library preparation method is more expensive than commercially available kits, in part due to the cost of the single-strand ligase enzyme used, Meyer explained. But that extra cost is mitigated by the fact that ancient DNA studies often hinge on far fewer samples overall.
"Unlike with modern DNA, where you often work on thousands of samples, with ancient DNA, you are often working with tens of samples," he said. "So the cost for individual library preparation is not quite as important as for modern DNA."
Moreover, Meyer said the cost of sequencing itself still outweighs the library preparation costs. For the new Denisovan genome, the team did 10 runs on the Illumina GAIIx to the tune of roughly $250,000. They have now switched over to the company's more cost-effective HiSeq 2000 platform.
Given their success with the Denisovan genome, the researchers are currently going back and looking at their sample collections to see where else they might be able to apply the single-stranded amplification and sequencing method.
For instance, they are using the single-stranded DNA amplification technique to generate a higher coverage version of the Neandertal genome, which was initially sequenced to an average depth of around 1.3-fold.
There is also interest in trying to use the single-stranded approach in a modern forensics setting, since DNA from these samples has many of the same features as ancient DNA — not necessarily due to age, but more often owing to environmental exposures.
"In principle, the set of problems that we see for ancient DNA will largely overlap with problems that people have for forensic DNA," Meyer said. "So any improvement that's made for ancient DNA should also be applicable to forensic DNA."
So far the researchers have not attempted such forensic studies on modern-day samples. Instead, they are in the process of digging into the single-stranded method to figure out how to decrease sequence artifacts. They're also trying to learn more about just how the single-stranded method improves coverage, beyond just doubling the number of DNA molecules that are amplified.
"We are trying to understand now what is actually driving this improvement," Meyer said. "And maybe there are even ways to improve it further."
"We had other samples that worked very well with the single-stranded method and we had some samples that worked less well, although still better than the previous techniques," he added.
For instance, the team suspects that by independently amplifying each DNA strand, it is getting access to DNA molecules with single-stranded DNA nicks — or other peculiarities specific to the ancient samples — that would have been lost from double-stranded DNA library preparations.
Because single-stranded DNA amplification effectively doubles the representation of DNA available, the authors of the new study say the method could theoretically be used as an alternative to doing random whole-genome amplification with a phi29 polymerase enzyme for single-cell sequencing.
That is something that still needs to be tested, though, since it is not yet clear how much of a human genome can be captured in the single-stranded libraries.
"It's a very different approach," Meyer said, "since current whole-genome amplification methods usually rely on phi29 polymerase and random hexamers, whereas our method relies on making a library first and then amplifying this library."
He cautioned that some of the same features that make the single-stranded amplification well suited to ancient DNA sequencing applications — particularly its suitability for short pieces of DNA — might end up being a disadvantage when looking at fairly well-preserved modern samples.
"The single strand ligase is most effective for short molecules. So one would need to break down the DNA into relatively short pieces before converting it to a library," he explained. "And that is an obvious disadvantage compared to the current random hexamer protocols that are out there."
Still, Meyer said, "we'd like to test this and see how far we get with it."