Biology is a rapidly evolving science. Every new discovery uncovers new layers of complexity that must be unraveled in order to understand the underlying biological mechanisms of diseases and for successful drug development.
Driven both by community need and by the increased likelihood of positive returns on the large investment required, drug discovery research has often focused on identifying and understanding the most common diseases with relatively straightforward causes and those that affect large numbers of individuals. Today, companies continue to push the boundaries of innovation in order to alleviate the debilitating effects of complex diseases—those that affect smaller patient populations or show high variability from patient to patient. This requires looking deeper into available data.
The big data revolution
Key to understanding complex and variable diseases is the need to examine data from large numbers of afflicted patients. More than 90% of the world’s data has been created in the past two years, and the pace is accelerating. High-throughput technologies create ever-expanding quantities of data for researchers to mine. But in addressing one problem, another has developed—how can researchers find the specific information they need among the mass of data?
Beyond the simple issue of scale, data diversity also plays a key role. Twenty years ago, before the draft human genome sequence was finished, researchers could get a publication accepted into a journal by determining the sequence of a single gene. But with our growth in knowledge, successful research now depends more on understanding the biological complexity that comes from vast networks of interactions between genes, proteins and small molecules, not only from the sequence itself. In this environment, how can researchers determine what information is most important to understanding a particular disease?
Finding the right data
With approximately one million scientific articles published annually, scientists face a daunting task in finding the papers relevant to their work. They are drowning in a data deluge, and even highly specific queries return hundreds of possible answers. Twenty years ago, researchers could feel fairly confident of keeping up with the most important discoveries by reading a handful of journals. Today, important and high-quality research is published across an ever-expanding collection of journals (recent estimates from Google Scholar suggest that as many as 42% of highly cited papers appear in journals that are not traditionally highly cited), so researchers must cast a wide net to ensure they don’t miss key discoveries. How can they be confident that they have identified the most current and relevant research without missing a critical piece of the puzzle?
Although researchers often begin learning about a new disease with general-purpose search tools like PubMed or Google Scholar, more specialized tools that can connect information from multiple sources are needed to filter the massive lists of possible results down to a manageable, relevant set. For instance, Elsevier offers research tools such as Reaxys for chemistry and Pathway Studio for biology. These solutions draw on journals and articles from Elsevier as well as from other publishers. Each also provides focused search tools, so researchers can leverage multiple data sources to build a comprehensive and detailed picture of a disease from the relevant data.
A "Big" project
DARPA’s "Big Mechanism" project has tasked teams from leading universities and data providers with improving the discoverability of scientific data. Elsevier is contributing to one part of this project: developing "deep-reading" algorithms in conjunction with Carnegie Mellon University to uncover nearly all relevant data in a scientific publication. Understanding the role of KRAS in cancer activation was chosen as a test case because of its complexity: KRAS goes by at least five synonyms in the literature and interacts with more than 150 other proteins, many with dozens to hundreds of synonyms of their own. Once developed, these "deep-reading" tools can be extended to a wide range of other genes, proteins and diseases.
Developing effective discovery tools requires significant scientific expertise to ensure data is categorized correctly so that computers can "read" and extract the relevant data requested. As the KRAS example shows, unless data is categorized correctly, a researcher could need to enter more than 500 search terms. In short, discovery tools need extensive and refined taxonomies to be of value. A combination of deep biological domain knowledge and sophisticated software development skills is needed to develop computer-based "deep-reading" tools that match human accuracy while retaining the computer’s speed advantage in sifting through massive data collections.
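The taxonomy problem described above can be illustrated with a toy example of dictionary-based query expansion. The synonym lists and function names below are illustrative assumptions, not an actual gene vocabulary or the API of any Elsevier tool; a production taxonomy would carry hundreds of curated synonyms per protein:

```python
# Sketch of taxonomy-driven query expansion: a curated synonym dictionary
# lets one search term stand in for every name a gene goes by in the
# literature. The entries here are illustrative, not authoritative.
SYNONYMS = {
    "KRAS": ["KRAS", "K-RAS", "KRAS2", "c-Ki-ras",
             "Kirsten rat sarcoma viral oncogene homolog"],
    "TP53": ["TP53", "p53", "tumor protein p53"],
}

def expand_query(term: str) -> set[str]:
    """Return all known synonyms for a gene symbol (falls back to the term itself)."""
    return set(SYNONYMS.get(term.upper(), [term]))

def matches(document: str, term: str) -> bool:
    """True if any synonym of `term` appears in the document text."""
    text = document.lower()
    return any(s.lower() in text for s in expand_query(term))

# A search for "KRAS" now also finds papers that only say "K-RAS":
print(matches("Mutations in K-RAS drive tumor growth.", "KRAS"))  # True
```

Without the dictionary, the researcher would have to enter each synonym by hand, which is exactly how the count climbs past 500 terms for KRAS and its interaction partners.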
The way we work
Understanding the way scientists do their work is essential to developing tools that match their unmet data management needs. In addition to searching a diverse collection of external data sources, researchers often have their own proprietary research data collections that must be integrated with other sources to provide the most complete picture. These tools must help the researcher identify the most relevant data for their particular task.
Since humans are very good at visually recognizing patterns in information, information should be presented in a way that lets users visualize that information. Tools that allow different views of the data can help users connect the dots and draw their own conclusions. It’s the difference between trying to read a long list of subway stations in a foreign language, and viewing a graphical map of the subway.
The research challenge
Searching the diverse collections of data to discover actionable insights into the biology of a disease is a huge challenge. The growth of data is outpacing our ability to analyze it, so new, more sophisticated tools and approaches are needed to help researchers connect the dots, no matter where that information is located. With the right discovery support, organizations can facilitate researchers’ interpretation of experimental data, leading to greater insight into the mechanisms of disease and accelerating biological research. This, in turn, will help them invent, validate and commercialize new, clinically effective treatments faster and more efficiently.
Wednesday, July 29, 2015
Simulations lead to design of near-frictionless material
Argonne National Laboratory scientists used Mira to identify and improve a new mechanism for eliminating friction, which fed into the development of a hybrid material that exhibited superlubricity at the macroscale for the first time. Argonne Leadership Computing Facility (ALCF) researchers helped enable the groundbreaking simulations by overcoming a performance bottleneck that doubled the speed of the team's code.
While reviewing the simulation results of a promising new lubricant material, Argonne researcher Sanket Deshmukh stumbled upon a phenomenon that had never been observed before.
"I remember Sanket calling me and saying 'you have got to come over here and see this. I want to show you something really cool,'" said Subramanian Sankaranarayanan, Argonne computational nanoscientist, who led the simulation work at the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science User Facility.
They were amazed by what the computer simulations revealed. When the lubricant materials—graphene and diamond-like carbon (DLC)—slid against each other, the graphene began rolling up to form hollow cylindrical "scrolls" that helped to practically eliminate friction. These so-called nanoscrolls represented a completely new mechanism for superlubricity, a state in which friction essentially disappears.
"The nanoscrolls combat friction in very much the same way that ball bearings do by creating separation between surfaces," said Deshmukh, who finished his postdoctoral appointment at Argonne in January.
Superlubricity is a highly desirable property. Considering that nearly one-third of the fuel in an automobile's tank is spent overcoming friction, a material that achieves superlubricity would greatly benefit industry and consumers alike. Such materials could also extend the lifetime of countless mechanical components that wear down under constant friction.
Experimental origins
Prior to the computational work, Argonne scientists Ali Erdemir, Anirudha Sumant, and Diana Berman were studying the hybrid material in laboratory experiments at Argonne's Tribology Laboratory and the Center for Nanoscale Materials, a DOE Office of Science User Facility. The experimental setup consisted of small patches of graphene (a two-dimensional single-sheet form of pure carbon) sliding against a DLC-coated steel ball.
The graphene-DLC combination was registering a very low friction coefficient (a ratio that measures the force of friction between two surfaces), but the friction levels were fluctuating up and down for no apparent reason. The experimentalists were also puzzled to find that humid environments were causing the friction coefficient to shoot up to levels that were nearly 100 times greater than measured in dry environments.
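The friction coefficient mentioned in parentheses is simply the ratio of the friction force to the normal force pressing the surfaces together. As a minimal sketch (the numbers below are illustrative placeholders, not the Argonne measurements; superlubricity is commonly quoted as a coefficient below roughly 0.01):

```python
def friction_coefficient(friction_force_n: float, normal_force_n: float) -> float:
    """mu = F_friction / F_normal, both forces in newtons (dimensionless result)."""
    return friction_force_n / normal_force_n

# Illustrative numbers only, chosen to mirror the ~100x dry-vs-humid gap
# reported in the experiments:
mu_dry = friction_coefficient(0.002, 0.5)    # superlubric regime
mu_humid = friction_coefficient(0.2, 0.5)    # ~100x higher than mu_dry
print(mu_dry, mu_humid)
```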
To shed light on these mysterious behaviors, they turned to Sankaranarayanan and Deshmukh for computational help. Using Mira, the ALCF's 10-petaflops IBM Blue Gene/Q supercomputer, the researchers replicated the experimental conditions with large-scale molecular dynamics simulations aimed at understanding the underlying mechanisms of superlubricity at an atomistic level.
This led to their discovery of the graphene nanoscrolls, which helped to fill in the blanks. The material's fluctuating friction levels were explained by the fact that the nanoscrolls themselves were not stable. The researchers observed a repeating pattern in which the hollow nanoscrolls would form, and then cave in and collapse under the pressure of the load.
"The friction was dipping to very low values at the moment the scroll formation took place and then it would jump back up to higher values when the graphene patches were in an unscrolled state," Deshmukh said.
The computational scientists had an idea to overcome this issue. They tried incorporating nanodiamond particles into their simulations to see if the hard material could help stabilize the nanoscrolls and make them more permanent.
Sure enough, the simulations proved successful. The graphene patches spontaneously rolled around the nanodiamonds, which held the scrolls in place and resulted in sustained superlubricity. The simulation results fed into a new set of experiments with nanodiamonds that confirmed the same.
"The beauty of this particular discovery is that we were able to see sustained superlubricity at the macroscale for the first time, proving this mechanism can be used at engineering scales for real-world applications," Sankaranarayanan said. "This collaborative effort is a perfect example of how computation can help in the design and discovery of new materials."
Not slippery when wet
Unfortunately, the addition of nanodiamonds did not address the material's aversion to water. The simulations showed that water suppresses the formation of scrolls by increasing the adhesion of graphene to the surface.
While this greatly limits the hybrid material's potential applications, its ability to maintain superlubricity in dry environments is a significant breakthrough in itself.
The research team is in the process of seeking a patent for the hybrid material, which could potentially be used for applications in dry environments, such as computer hard drives, wind turbine gears, and mechanical rotating seals for microelectromechanical and nanoelectromechanical systems.
Adding to the material's appeal is a relatively simple and cost-effective deposition method called drop casting. This technique involves spraying solutions of the materials onto moving mechanical parts. When the solution evaporates, it leaves the graphene and nanodiamonds on one side of a moving part and diamond-like carbon on the other.
However, the knowledge gained from their study is perhaps even more valuable, said Deshmukh. He expects the nanoscroll mechanism to spur future efforts to develop materials capable of superlubricity for a wide range of mechanical applications.
For their part, the Argonne team will continue its computational studies to look for ways to overcome the barrier presented by water.
"We are exploring different surface functionalizations to see if we can incorporate something hydrophobic that would keep water out," Sankaranarayanan said. "As long as you can repel water, the graphene nanoscrolls could potentially work in humid environments as well."
Simulating millions of atoms
The team's groundbreaking nanoscroll discovery would not have been possible without a supercomputer like Mira. Replicating the experimental setup required simulating up to 1.2 million atoms for dry environments and up to 10 million atoms for humid environments.
The researchers used the LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator) code to carry out the computationally demanding reactive molecular dynamics simulations.
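A ReaxFF molecular-dynamics run in LAMMPS is driven by a plain-text input script. The fragment below is a hedged sketch of the general shape of such a script; the file names, force-field parameters, and run settings are placeholders for illustration, not the Argonne team's actual inputs:

```
# Hypothetical LAMMPS input sketch for a ReaxFF carbon system (placeholders)
units          real
atom_style     charge
read_data      graphene_dlc.data        # placeholder geometry file

pair_style     reaxff NULL              # ReaxFF via the add-on package
pair_coeff     * * ffield.reax.cho C    # placeholder force-field file
fix            qeq all qeq/reaxff 1 0.0 10.0 1e-6 reaxff   # charge equilibration

velocity       all create 300.0 12345
fix            nvt all nvt temp 300.0 300.0 100.0
timestep       0.25                     # femtoseconds; ReaxFF needs small timesteps
run            100000
```

Older LAMMPS releases name the pair style `reax/c` rather than `reaxff`; either way, the reactive force field is what makes these simulations so computationally demanding compared with classical, non-reactive potentials.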
With the help of ALCF catalysts, a team of computational scientists who work directly with ALCF users, they were able to overcome a performance bottleneck with the code's ReaxFF module, an add-on package that was needed to model the chemical reactions occurring in the system.
The ALCF catalysts, in collaboration with researchers from IBM, Lawrence Berkeley National Laboratory and Sandia National Laboratories, optimized LAMMPS and its implementation of ReaxFF by adding OpenMP threading, replacing MPI point-to-point communication with MPI collectives in key algorithms, and leveraging MPI I/O. Altogether, these enhancements allowed the code to perform twice as fast as before.
"With the code optimizations in place, we were able to model the phenomena in real experimental systems more accurately," Deshmukh said. "The simulations on Mira showed us some amazing things that could not be seen in laboratory tests."
And with the recent announcement of Aurora, the ALCF's next-generation supercomputer, Sankaranarayanan is excited about where this line of research could go in the future.
"Given the advent of computing resources like Aurora and the wide gamut of the available two-dimensional materials and nanoparticle types, we envision the creation of a lubricant genome at some point in the future," he said. "Having a materials database like this would allow us to pick and choose lubricant materials for specific operational conditions."
Finalists Announced for the 2015 R&D 100 Awards
Rockaway, NJ, July 22, 2015 – R&D Magazine today announced the Finalists for the 53rd annual R&D 100 Awards, which honor the 100 most innovative technologies and services of the past year. This year’s Winners will be presented with their honors at the annual black-tie awards dinner on November 13, 2015 at Caesars Palace, Las Vegas, Nevada.
The Finalists were selected by an independent panel of more than 70 judges. This year’s Finalists represent many of industry’s leading organizations and national laboratories, as well as many newcomers to the R&D 100 Awards, often referred to as the “Oscars of Invention.”
For the first time in its history, the winners of the R&D 100 Awards will be honored for exemplary accomplishments from across five categories: Analytical Test, IT/Electrical, Mechanical Devices/Materials, Process/Prototyping, and Software/Services. The 2015 Awards will also honor excellence in four new special recognition categories – Market Disruptor (Services), Market Disruptor (Products), Corporate Social Responsibility, and Green Tech.
“This was a particularly strong year for research and development, led by many outstanding technologies that broadened the scope of innovation,” said R&D Magazine Editor Lindsay Hock. “We are honored to recognize these products and the project teams behind the design, development, testing, and production of these remarkable innovations and their impact in the field. We look forward to celebrating the winners in November.”
A detailed list of the 2015 Finalists can be found here.
In addition to the awards gala, this year’s event has been expanded to include the two-day R&D 100 Technology Conference featuring an impressive line-up of 28 educational sessions presented by high-profile speakers. The conference will also feature two keynote addresses from noted innovator Dean Kamen, and Thom Mason, PhD, Director of Oak Ridge National Laboratory. Panel discussions devoted to the future of R&D and the annual R&D Global Funding Forecast round out the program. The educational sessions have been divided into four tracks focusing on key areas of R&D: R&D Strategies & Efficiencies, Emerging Technologies & Materials, Innovations in Robotics & Automation, and Instrumentation & Monitoring.
For more information on the R&D 100 Awards Finalists, contact Lindsay Hock at 973-920-7036 or lindsay.hock@advantagemedia.com.