The National Science Foundation is partnering with Google and IBM to give the academic community access to a large-scale computing cluster, with the aim of helping researchers develop better methods for processing many terabytes of data in parallel.
 
In a “dear colleague” letter sent to the research community this week to announce the initiative, the NSF said the program is designed to address the challenges of “data-intensive” computing, in which “the sheer volume of data is the dominant performance parameter.”
 
