NEW YORK (GenomeWeb) – In collaboration with Intel, the University of California, San Francisco's Center for Digital Health Innovation is developing and validating a deep-learning analytics platform that, if successful, could be used to develop clinical algorithms and apps.
In so doing, the partners aim to enable clinicians to make better treatment decisions, predict patient outcomes more precisely, and respond more rapidly when needed in critical care settings.
The collaboration is one of multiple initiatives that UCSF has embarked upon with large computer industry players. The projects share an objective: to play an important role in helping shape the future of computing within healthcare and precision medicine.
The next-generation platform being developed by Intel and UCSF will “efficiently manage the huge volume and variety of data collected for clinical care” and incorporate data from genomics, proteomics, wearables, and Internet-connected (IoT) sensors that collect health and diagnostic information, Michael Blum, director of CDHI at UCSF, said in an interview.
“The challenge is to create a powerful, scalable platform that can manage these very large, rapidly proliferating data sets, as well as provide the compute capabilities to develop the next generation of analytics and [artificial intelligence] that we need to transform healthcare,” he added.
The technologies can be applied to sophisticated, large-scale healthcare challenges including “predicting health risks, preventing hospital readmissions, analyzing complex medical images, and more,” Blum said. Deep-learning environments hold the promise of rapidly analyzing and predicting individual patient trajectories utilizing vast amounts of multidimensional data, he added.
The plan is for algorithms and apps from the analytics platform to become available within about a year.
Artificial intelligence and machine learning are gaining popularity in several industries, including, for example, the emerging area of autonomous vehicles. However, “our ability to use them in healthcare is a relatively new phenomenon,” said Blum, who is also an associate vice chancellor for informatics and professor of medicine at UCSF.
Healthcare datasets are complex and diverse, contain large amounts of unstructured data, and are managed in multiple, incompatible systems, Blum said. He noted that these challenges have thwarted the kind of enhanced decision-making and care that will be possible through the new initiative with Intel.
Achieving compatibility among electronic health records systems is only one part of the healthcare data challenge. Next-generation computing systems must also manage and analyze genomics and proteomics data, as well as data streaming from wearables and other sensors, Blum said.
"For healthcare, this is a big change," Blum said. "We’ve spent the past decade or so trying to get away from paper with electronic health records, but information management needs are growing so rapidly now that they are outstripping our present computing infrastructures."
Intel has frequently been at the center of healthcare computing, supplying the chipsets that power servers and other machines. For this initiative, the computing giant is providing not just hardware resources, but also data science resources and knowledge to build, deploy, and scale these new environments, Blum said.
UCSF and Intel will work together to deploy the high-performance computing environment on industry-standard Intel Xeon processor-based platforms that will support the data management and algorithm development lifecycle, a process that involves data curation and annotation, algorithm training, and testing against labeled datasets with pre-specified outcomes.
“This collaboration between Intel and UCSF will accelerate the development of deep-learning algorithms that have great potential to benefit patients,” Kay Eron, general manager of health and life sciences in Intel’s Data Center Group, said in a statement. Intel expects that the collaboration with UCSF will inform Intel's development and testing of new platform architectures for the healthcare industry.
"As we’ve evolved organizationally, precision medicine has become a high-level strategic imperative for [UCSF]," Blum said. More broadly, "issues around compute capabilities and high-performance computing have become more and more important at that intersection of large clinical datasets, next generation analytics, and new data types," he added.
The development teams will integrate data into highly scalable “information commons,” sometimes called data lakes, which he said can be thought of as datasets containing different kinds of data, such as electronic health records, omics, and wearable sensor data.
"And data are processed so that they are understandable and computable, and you can start to integrate machine learning and deep learning, as well as big data analytics, so that you get more from the data sources than has been done traditionally" he added.
As part of the collaboration, UCSF is providing clinical and research expertise imparted by faculty and staff who "live and work in these spaces where technology needs to be embedded and make an impact," Blum said. Among the key points is making sure that clinical data are available and annotated in a usable way that's accessible to clinicians and investigators. It's important to have people involved who are familiar with clinical and research workflows, he said, and who understand how the data impact them.
"You can put great technology in place, but if it doesn't get to the patient or to the research project, you're not going anywhere," Blum added.
The University of California system is also providing 16 million de-identified patient records "as a starting point for the initiative," Blum said, with that number expected to grow as the project continues.
Within a year or so, UCSF believes that clinicians and investigators will be able to access the first set of new algorithms, tools, and apps developed on this new Intel platform. To get them to the marketplace, the team is collaborating with large industrial players such as GE Healthcare and Cisco, which already sell technology solutions to healthcare delivery organizations around the world.
"We’re not going to become a software provider," Blum said. A primary goal for UCSF is to validate and get benefits from computing architectures that enables it to help advance precision medicine and develop new knowledge and new knowledge networks.
"We envision developing this new knowledge, and these new algorithms, validating them in the clinical environment, and then licensing them to a commercial partner that integrates them into applications that they already have in place or that are entirely new applications," Blum said, and added that they will likely be delivered via the cloud to national and international clientele who operate in clinical care or research environments.
"The beauty of this is that community hospitals, individual physicians, and small practices are not going to have the wherewithal to develop the understanding that can come out of these very large datasets," he said. "But if we develop algorithms and apps that can be cloud deployed, these healthcare providers can access them and benefit from them without having to do any of the development."
For UCSF, the Intel collaboration is one chapter of a larger story. Blum and his group at CDHI have several collaborations underway, all aimed at advancing healthcare through advanced computing.
UCSF is collaborating with Cisco on a connected health interoperability platform that the networking firm is developing to address the challenge of interoperability among electronic health record systems, apps, wearables, and the Internet of Things. Together, they are developing a cloud-based platform that would be accessible to many different types of healthcare providers, from large hospitals to physicians' offices.
With General Electric, UCSF has a joint project that aims to deploy machine learning and deep learning to speed the turnaround and delivery of patient imaging data. The GE collaboration is expected to benefit from the new platform that UCSF is developing with Intel, by employing its compute power to develop algorithms.
Within this project, deep-learning algorithms would quickly note findings on X-rays, CT scans, and MRIs, for example, and then correlate them with data from the EHR, monitors, and sensors, and help clinicians by quickly providing them with critical information relevant to an individual patient, Blum said. GE has a product and service distribution mechanism, the Health Cloud, through which the new algorithms and applications are likely to be deployed, he added.
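The alerting flow Blum describes, an imaging finding correlated with chart and bedside data before a clinician is notified, might look something like the sketch below. The model stand-in, thresholds, vital-sign fields, and the pneumothorax example are all hypothetical illustrations, not GE's or UCSF's actual logic.

```python
# Hypothetical sketch of correlating an imaging finding with EHR and monitor
# data to decide whether a clinician needs an urgent alert. All thresholds,
# fields, and clinical values are invented for illustration.

def image_finding(scan):
    """Stand-in for a deep-learning imaging model that scores a suspected finding."""
    return {"finding": "suspected pneumothorax", "score": scan["model_score"]}

def correlate(finding, ehr, monitor):
    """Combine the imaging finding with chart history and bedside vitals."""
    # Illustrative rule: escalate only when the model is confident AND
    # the patient's oxygen saturation is low.
    urgent = finding["score"] > 0.9 and monitor["spo2"] < 92
    return {
        "finding": finding["finding"],
        "history": ehr["recent_procedures"],
        "spo2": monitor["spo2"],
        "urgent": urgent,
    }

alert = correlate(
    image_finding({"model_score": 0.95}),
    {"recent_procedures": ["central line placement"]},   # EHR context
    {"spo2": 88},                                        # bedside monitor
)
print(alert["urgent"])  # → True
```

The value of the correlation step is that the clinician receives one contextualized alert, finding plus relevant history plus current vitals, rather than three disconnected data streams.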
UCSF believes that the first algorithms from these multiple collaborations will be available to clinicians and investigators in around a year. Because the algorithms are intended for clinical use, they will require rigorous validation, Blum said, and he noted that the development teams will be required to test them for clinical relevance and accuracy.
Certain algorithms may take a straightforward path to market, such as those that don’t perform diagnoses or make clinical recommendations, but, instead, alert clinicians when there’s a finding on which they may make a clinical decision, Blum said. More complex algorithms that provide a diagnostic decision or recommend therapies will receive higher levels of scrutiny and require more rigorous and extensive validation, and "we expect to work closely with [the US Food and Drug Administration] as these are developed," Blum said.