AlphaFold 2 vs. Openfold Protein Folding Software

Learn the key considerations for evaluating and selecting the right application for your Cloud environment.


 

Good software means faster work for drug research and development, particularly concerning proteins. Proteins serve as the basis for many treatments, and learning more about their structures can accelerate the development of new treatments and medications. 

With more software now infusing an artificial intelligence element, researchers expect to significantly streamline their work and revolutionize the drug industry. When it comes to protein folding software, two names have become industry frontrunners: AlphaFold and Openfold. 

Learn the differences between the two programs, including how RCH is helping our customers understand the strategic benefits AlphaFold and Openfold can offer based on their environment, priorities and objectives.

 

About AlphaFold2

Developed by DeepMind, AlphaFold2 uses AI to predict a protein’s 3D structure from its amino acid sequence. Its structure database, a collaboration with EMBL’s European Bioinformatics Institute, is hosted on Google Cloud Storage and is free to access and use.

AlphaFold2 won the CASP14 competition in November 2020, achieving more accurate results than any other entry. It scored above 90 for more than two-thirds of the proteins in CASP’s global distance test, which measures how closely the computationally predicted structure matches the lab-determined structure.
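To make the global distance test concrete, here is a simplified Python sketch of the GDT_TS score (not CASP’s exact implementation, and the residue distances below are hypothetical): it averages, over four standard distance cutoffs, the fraction of residues whose predicted positions fall within each cutoff of the experimental structure.

```python
def gdt_ts(residue_distances, cutoffs=(1.0, 2.0, 4.0, 8.0)):
    """Simplified GDT_TS: average, over the standard cutoffs (in angstroms),
    of the fraction of residues whose predicted position lies within that
    cutoff of the experimentally determined position."""
    n = len(residue_distances)
    fractions = [
        sum(1 for d in residue_distances if d <= c) / n
        for c in cutoffs
    ]
    return 100.0 * sum(fractions) / len(cutoffs)

# Hypothetical distances (angstroms) between predicted and experimental
# positions for an 8-residue toy protein:
distances = [0.5, 0.9, 1.5, 2.2, 3.8, 4.5, 7.0, 9.1]
print(round(gdt_ts(distances), 1))  # prints 53.1
```

A score above 90, as AlphaFold2 achieved for most targets, means the prediction is nearly indistinguishable from the experimental structure at these cutoffs.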

To date, more than 200 million proteins are known, each with a unique 3D shape. AlphaFold2 aims to simplify the once time-consuming and expensive process of modeling them. Its speed and accuracy are accelerating research and development in nearly every area of biology, helping scientists tackle diseases, discover new medicines and cures, and understand more about life itself.

 

Exploring Openfold Protein Folding Software

Another player in the protein folding software space, Openfold is a PyTorch reproduction of DeepMind’s AlphaFold2. Founded by three Seattle biotech companies (Cyrus Biotechnology, Outpace Bio, and Arzeda), the project aims to support open-source development in the protein folding software space. It is part of the nonprofit Open Molecular Software Foundation, and its data is registered on AWS through the AWS Open Data Sponsorship Program.

Despite being a newcomer to the scene, Openfold is quickly turning heads with its open-source model and greater “completeness” compared to AlphaFold2. In fact, it has been billed as a faster and more powerful version of its predecessor.

Like AlphaFold2, Openfold is designed to streamline the process of discovering how proteins fold, but potentially faster and more comprehensively than its predecessor. The model has undergone more than 100,000 hours of training on NVIDIA A100 Tensor Core GPUs, reaching more than 90% of its final accuracy within the first 3,000 hours.

 

AlphaFold vs. Openfold: Our Perspective

Despite Openfold being a reproduction of AlphaFold, there are several key differences between the two.  

AlphaFold2 and Openfold boast similar accuracy, though Openfold may have a slight advantage. Openfold’s inference is also about twice as fast as AlphaFold2’s when modeling short proteins; for long protein strands, the speed advantage is minimal.

Openfold’s optimized memory usage allows it to handle much longer protein sequences—up to 4,600 residues on a single 40GB A100.

One of the clearest differences between AlphaFold2 and Openfold is that Openfold is trainable, a capability AlphaFold2 lacks. This makes it valuable for our customers doing niche or specialized research.

 

Key Use Cases from Our Customers


 

Both AlphaFold and Openfold have offered game-changing functionality for our customers’ drug research and development. That’s why many of the organizations we’ve supported have even considered a hybrid approach rather than making an either/or decision.

Both protein folding applications can be deployed across a variety of use cases, including:

New Drug Discovery

The speed and accuracy with which protein folding software can model protein strands make it a powerful tool in new drug development, particularly for diseases that have largely been neglected. These illnesses often disproportionately affect individuals in developing countries. Examples include parasitic diseases, such as Chagas disease or leishmaniasis. 

Combating Antibiotic Resistance

As the usage of antibiotics continues to rise, so does the risk of individuals developing antibiotic resistance. Previous data from the CDC shows that nearly one in three prescriptions for antibiotics is unnecessary. It’s estimated that antibiotic resistance costs the U.S. economy nearly $55 billion every year in healthcare and productivity losses.

What’s more, when people become resistant to antibiotics, it leaves the door wide open for the creation of “superbugs.” Since these bugs cannot be killed with typical antibiotics, illnesses can become more severe.

Professionals from the University of Colorado, Boulder, are putting AlphaFold to the test in learning more about proteins involved in antibiotic resistance. The protein folding software is helping researchers identify protein structures that they could confirm via crystallography.

Vaccine Development


 

Learning more about protein structures is proving useful in developing new vaccines, such as a multi-agency collaboration on a new malaria vaccine. The WHO endorsed the first malaria vaccine in 2021. However, researchers at the University of Oxford and the National Institute of Allergy and Infectious Diseases are working together to create a more effective version that better prevents transmission.

Using AlphaFold and crystallography, the two agencies identified the first complete structure of the protein Pfs48/45. This breakthrough could pave the way for future vaccine developments.

Learning More About Genetic Variations

Genetics has long fascinated scientists and may hold the key to learning more about general health, predisposition to diseases, and other traits. A professor at ETH Zurich is using AlphaFold to learn more about how a person’s health may change over time or what traits they will exhibit based on specific mutations in their DNA.

AlphaFold has proven useful in reviewing proteins in different species over time, though the accuracy diminishes the further back in time the proteins are reviewed. Seeing how proteins evolve over time can help researchers predict how a person’s traits might change in the future.

 

How RCH Solutions Can Help

Selecting protein folding software for your research facility is easier with a trusted partner like RCH Solutions. Not only can we inform the selection process, but we also provide support in implementing new solutions. We’ll work with you to uncover your greatest needs and priorities and align the selection process with your end goals, with budget in mind.

Contact us to learn how RCH Solutions can help.

 

Sources:

https://www.nature.com/articles/d41586-022-00997-5

https://www.deepmind.com/research/highlighted-research/alphafold

https://alphafold.ebi.ac.uk/

https://www.geekwire.com/2022/biotech-startups-join-aws-and-other-partners-in-open-source-project-to-help-design-new-proteins/

https://wandb.ai/telidavies/ml-news/reports/OpenFold-A-PyTorch-Reproduction-Of-DeepMind-s-AlphaFold–VmlldzoyMjE3MjI5

https://www.drugdiscoverytrends.com/7-ways-deepmind-alphafold-used-life-sciences/

https://www.cdc.gov/media/releases/2016/p0503-unnecessary-prescriptions.html

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6929930/

https://inf.news/en/science/a3704ca6546593f8387d58e051d0387e.html

Considerations for Adding CryoEM to Your Research Facility

Cryo-EM brings a wealth of potential to drug research. But first, you’ll need to build an infrastructure to support large-scale data movement.

The 2017 Nobel Prize in Chemistry marked a new era for scientific research. Three scientists earned the honor for their development of cryo-electron microscopy, an instrument that delivers high-resolution imagery of molecular structures. With a better view of nucleic acids, proteins, and other biomolecules, new doors have opened for scientists to discover and develop new medications.

However, implementing cryo-electron microscopy isn’t without its challenges. Most notably, the instrument captures very large datasets that require unique considerations in terms of where they’re stored and how they’re used. The complexity and distinct challenges cryo-EM presents require the support of a highly experienced Bio-IT partner, like RCH, which is actively supporting large and emerging organizations with their cryo-EM implementation and management. But let’s start with the basics.

How Our Customers Are Using Cryo-Electron Microscopy

Cryo-electron microscopy (cryo-EM) is revolutionizing biology and chemistry. Our customers are using it to analyze the structures of proteins and other biomolecules with greater accuracy and speed than other methods allow.

In the past, scientists used X-ray diffraction to get high-resolution images of molecules. But to produce these images, the molecules first need to be crystallized, which poses two problems: many proteins won’t crystallize at all, and in those that do, crystallization can change the structure of the molecule, so the imagery won’t be accurate.

Cryo-EM provides a better alternative because it doesn’t require crystallization. What’s more, scientists can gain a clearer view of how molecules move and interact with each other—something that’s extremely hard to do using crystallization.

Cryo-EM can also be used to study larger proteins, complexes of molecules, and membrane-bound receptors. Achieving the same results with nuclear magnetic resonance (NMR) is challenging, as NMR is typically limited to smaller proteins.

Because cryo-EM can give such detailed, accurate images of biomolecules, its use is being explored in drug discovery and development. However, given its $5 million price tag and complex data outputs, labs considering cryo-EM must first have the proper infrastructure in place, including expert support, to avoid it becoming a sunk cost.

The Challenges We’re Seeing With the Implementation of Cryo-EM

Introducing cryo-EM to your laboratory can bring excitement to your team and a wealth of potential to your organization. However, it’s not a decision to make lightly, nor is it one you should make without consulting strategic vendors actively working in the cryo-EM space, like RCH.

The biggest challenge labs face is the sheer amount of data they need to be prepared to manage. The instruments capture very large datasets that require ample storage, access controls, bandwidth, and the ability to organize and use the data.

The instruments themselves bear a high price tag, and adding the appropriate infrastructure increases that cost. The tools also require ongoing maintenance.

There’s also the consideration of upskilling your team to operate and troubleshoot the cryo-EM equipment. Given the newness of the technology, most in-house teams simply don’t have all of the required skills to manage the multiple variables, nor are they likely to have much (or any) experience working on cryo-EM projects.

Biologists are no strangers to undergoing training, so consider this learning curve just a part of professional development. However, combined with learning how to operate the equipment AND make sense of the data you collect, it’s clear that the learning curve is quite steep. It may take more training and testing than the average implementation project to feel confident in using the equipment. 

For these reasons and more, engaging a partner like RCH that can support your firm from the inception of its cryo-EM implementation ensures critical missteps are avoided, which ultimately creates more sustainable and future-proof workflows, discoveries and outcomes. With the challenges properly addressed from the start, the promise cryo-EM holds is worth the extra time and effort it takes to implement.

How to Create a Foundational Infrastructure for Cryo-EM Technology

As you consider your options for introducing cryo-EM technology, one of your priorities should be to create an ecosystem in which cryo-EM can thrive, using a cloud-first, compute-forward approach. Setting the stage for success, and bringing the compute to the data from inception, can help you reap the most rewards from your investment.

Here are some of the top considerations for your infrastructure:

Network Bandwidth

One early study of cryo-EM found that each microscope outputs about 500 GB of data per day. Higher bandwidth can help streamline data processing by increasing transfer speeds so that data can be reviewed and used more quickly.
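To put that figure in context, a back-of-the-envelope calculation shows the sustained bandwidth a single microscope implies. The 2x headroom factor below is an assumption to cover bursts and protocol overhead, not a figure from the study:

```python
def required_mbps(gb_per_day, headroom=2.0):
    """Sustained network bandwidth (megabits per second) needed to move a
    daily data volume within 24 hours, padded by a headroom factor for
    bursts and transfer overhead."""
    bits_per_day = gb_per_day * 1e9 * 8      # decimal GB -> bits
    seconds_per_day = 24 * 60 * 60
    return bits_per_day / seconds_per_day / 1e6 * headroom

# One cryo-EM microscope at ~500 GB/day:
print(round(required_mbps(500), 1))  # prints 92.6 (Mbps, with 2x headroom)
```

Multiply by the number of microscopes, and add capacity for researchers pulling the same data back down for processing, and a facility’s bandwidth requirements grow quickly.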

Proximity to and Capacity of Your Data Center

Cryo-EM databases are becoming more numerous and growing in size and scope. The largest data set in the Electron Microscopy Public Image Archive (EMPIAR) is 12.4TB, while the median data set is about 2TB. Researchers expect these massive data sets to become the norm for cryo-EM, which means you need to ensure your data center is prepared to handle a growing load of data. This applies to both cloud-first organizations and those with hybrid data storage models.

Integration with High-Performance Computing

Integrating high-performance computing (HPC) with your cryo-EM environment ensures you can take advantage of the scope and depth of the data created. Scientists will be churning through massive piles of data and turning them into 3D models, which will take exceptional computing power.

Having the Right Tools in Place

To use cryo-EM effectively, you’ll need to complement your instruments with other tools and software. For example, CryoSPARC is the most widely used software purpose-built for cryo-EM; its workflows are configured and optimized specifically for research and drug discovery.

Availability and Level of Expertise

Because cryo-EM is still relatively new, organizations must decide how to gain the expertise they need to use it to its full potential. This could take several different forms, including hiring consultants, investing in internal knowledge development, and tapping into online resources. 

How RCH Solutions Can Help You Prepare for Cryo-EM

Implementing cryo-EM is an extensive and costly process, but laboratories can mitigate these and other challenges with the right guidance. It starts with knowing your options and taking all costs and possibilities into account. 

Cryo-EM is the new frontier in drug discovery, and RCH Solutions is here to help you remain on its cutting edge. We provide tactical and strategic support in developing a cryo-EM infrastructure that will help you generate a return on investment.

Contact us today to learn more.

Sources:

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7096719/

https://www.nature.com/articles/d41586-020-00341-9

https://www.gatan.com/techniques/cryo-em

https://www.chemistryworld.com/news/explainer-what-is-cryo-electron-microscopy/3008091.article

https://www.nanoimagingservices.com/blog/three-common-challenges-to-adopting-cryo-em-in-your-drug-discovery-program-and-how-to-overcome-them

https://cryosparc.com/

https://www.microway.com/hpc-tech-tips/cryoem-takes-center-stage-how-compute-storage-networking-needs-growing/

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6067001/

 

Culture & Values: Why They Matter Now More Than Ever

Thoughts from our CEO, Michael Riener

Now more than ever, creating a culture that encompasses all of the elements necessary to ensure employee, customer and organizational success and exceptional outcomes is critical. Why? Our industry is more challenged than ever with talent acquisition, development and retention, not to mention sky-high demands for services and support and the impact this has on our team members.

As industry and organizational leaders, we need to meet these challenges with strong calls to action and initiatives that create change.

For this reason and more, hiring the right people who share these core values, and building a culture around a team that embraces the RCH Solutions DNA, is paramount. It’s our opportunity to get things right from inception, for the organization and our team members, and to lead with our non-negotiable core values.

The Current Challenges
The Biopharma, Biotech and Life Sciences industries are experiencing record burnout. Most recently, the pharmaceuticals, biotechnology and life sciences category fell almost 3 points (2.9) between the first and second quarters of 2022, per corporate reputation monitor RepTrak, as outlined in the article linked below. But what does this actually mean? This near-real-time measure and its decline underscore the challenges facing stakeholders and organizations, both internally and externally, which, if not met with action and initiative, could have devastating impacts.

The Supporting Solutions
But how can we realistically form meaningful actions to improve our position and outcomes? Lean back into or overhaul your company values. Too often companies are reactive and wait to fix and troubleshoot when a problem exists. Why not proactively set our team members up for success from the beginning? By taking care of our teams and nurturing their personal and professional growth and experience, we can in turn, empower them to deliver their best for our organization and customers.

What We’re Doing at RCH
At RCH, our core values are in place to do just that, and are far more than just words — they unwaveringly represent the threads that weave together the fabric of our culture. And we truly mean it, expect it and live and breathe it. As an organization we: 

  1. Embrace Excellence
  2. Expect Accountability
  3. Adventure Together
  4. Succeed as a Team
  5. Respect Boundaries & Balance

But what do these really mean and what do they look like? To extend upon our core values, and to illustrate how these are measured and look in action, see the infographic below.

RCH Values Infographic

These core values are also consistently reinforced by our customers in recognition of specific team members, and we use these foundational principles for just about everything. From serving as a guide when interviewing new team members to a barometer when evaluating our performance as individuals and teams, and even when deciding which customers to work with. Our values embody the behaviors upon which we measure our success and create a framework for our growth as people and professionals. 

I am very confident that our teams feel these values in action daily at RCH. And their commitment and buy-in to these beliefs are evident and flow through every interaction we have and the outcomes we deliver collectively, both internally and externally. And that matters to us and it should to you too. 

‘Pharma burnout’: Industry reputation slumps in first half of 2022, study finds – Endpoints News (endpts.com)

 

Quality vs. Quantity: A Simple Scale for Success

In Life Sciences, and medical fields in particular, there is a premium on expertise and the role of the specialist. Among scientists, researchers, and doctors, even a single high performer with advanced knowledge in their field often contributes more value than a few average generalists with only peripheral knowledge. Despite this industry norm of prizing specialization and top talent, many life science organizations don’t apply the same measure when sourcing vendors or partners, particularly in the IT space.

And that’s a misstep. Here’s why.

Why “A” Talent Matters

I’ve seen far too many organizations that had, or still have, the generalist approach described above, and also many that focus on acquiring and retaining top talent. The difference? The former experienced slow adoption that stalled outcomes and often had major impacts on their short- and long-term objectives. The latter propelled their outcomes out of the gates, avoiding crippling mistakes along the way. For this reason and more, I’m a big believer in attracting and retaining only “A” talent. The best talent and top performers (quality) will always outshine and out-deliver a group of average ones. Most often, those individuals are inherently motivated and engaged, and when put in an environment where their skills are both nurtured and challenged, they thrive.

Why Expertise Prevails

While low-cost IT service providers with deep rosters may be able to throw a greater number of people at problems than their smaller, boutique counterparts can, often the outcome is simply more people and more problems. Instead, life science teams should mirror their R&D talent acquisition processes and focus on value and what it will take to achieve the best outcomes. Most often, it’s not about the quantity of support, advice, or execution resources, but about quality.

Why Our Customers Choose RCH

Our customers are like-minded and also employ top talent, which is why they value RCH: we consistently serve them with the best. While some organizations feel that throwing bodies (quantity) at a problem is one answer, often one for optics, RCH does not. We never have. Sometimes you can get by with a generalist; however, in our industry, we have found that our customers require and deserve specialists. The outcomes are more successful. The results are what they seek: seamless transformation.

In most cases, we are engaged with a customer who has employed the services of a very large professional services or system integration firm. Increasingly, those customers are turning to RCH to deliver on projects typically reserved for those large, expensive, process-laden companies.  The reason is simple. There is much to be said for a focused, agile and proven company.  

Why Many Firms Don’t Restrategize

So why do organizations continue to complain about such companies yet still rely on them? The answer has become clear: risk aversion. But the outcomes of that reliance are typically increased costs, missed deadlines, major strategic adjustments later on, or all of the above. Why not choose an alternative strategy from inception? I’m not suggesting turning over all business to a smaller organization. But how about a few projects? How about those that require proven focus, expertise, and a track record of delivery? I wrote a piece last year on the risk of mistaking “static for safe,” and stifling innovation in the process. The message still holds true.

We all know that scientific research is well on its way to becoming, if not already, a multi-disciplinary, highly technical process that requires diverse and cross functional teams to work together in new ways. Engaging a quality Scientific Computing partner that matches that expertise with only “A” talent, with the specialized skills, service model and experience to meet research needs can be a difference-maker in the success of a firm’s research initiatives. 

My take? Quality trumps quantity—always in all ways. Choose a scientific computing partner whose services reflect the specialized IT needs of your scientific initiatives and can deliver robust, consistent results. Get in touch with me below to learn more. 

 

Five Ways to Improve Your Research Outcomes

If You’re Not Doing These Five Things to Improve Research Outcomes, Start Now

Effective research and development programs remain one of the most significant investments for any biopharma. In fact, Seed Scientific estimates the current global scientific research market is worth $76 billion, including $33.44 billion in the United States alone. Despite the incredible advancements in technology now aiding scientific discovery, many organizations still find it difficult to bridge the gap between business and IT and to fully leverage innovation to drive the most value from their R&D efforts.

If you’re in charge of your organization’s R&D IT efforts, following tried and true best practices may help. Start with these five strategies to help the business you’re supporting drive better research outcomes.

 

Tip #1: Practice Active Listening

Instead of jumping to a response when presented with a business challenge, start by listening to research teams and other stakeholders and learning more about their experiences and needs. The process of active listening, which involves asking questions to create a more comprehensive understanding of the issue at hand, can lead to new avenues of inspiration and help internal IT organizations better understand the challenges and opportunities before them. 

 

Tip #2: Plan Backwards 

Proper planning is a must for most scientific computing initiatives. But one particularly interesting method for accomplishing an important goal, such as moving workloads to the Cloud or optimizing your applications and workflows for global collaboration, is to start with a premortem. During this brainstorming session, team members and other stakeholders can predict possible challenges and other roadblocks and ideate viable solutions before any work begins. Research by Harvard Business Review shows this process can improve the identification of the underlying causes of issues by 30% and ultimately help drive better project and research outcomes.

 

Tip #3: Consider the Process, Not Just the Solution

Research scientists know all too well that discovering a viable solution is merely the beginning of a long journey to market. It serves R&D IT teams well to consider the same when developing and implementing platform solutions for business needs. Whether a system within a compute environment needs to be maintained, upgraded, or retired, R&D IT teams must prepare to adjust their approach based on the business’ goals, which may shift as a project progresses. Therefore, maintaining a flexible and agile approach throughout the project process is critical.  

 

Tip #4: Build a Specialized R&D IT Team 

Any member of an IT team working in support of the unique scientific computing needs of the business (as opposed to more common enterprise IT efforts) should possess both knowledge and experience in the specific tools, applications, and opportunities within scientific research and discovery. Moreover, they should have the skills to quickly identify and shift to the most important priorities as they evolve and adapt to new methods and initiatives that support improved research outcomes. If you don’t have these resources on staff, consider working with a specialized scientific computing partner to bridge this gap. 

 

Tip #5: Prepare for the Unexpected 

In research compute, it’s not enough to have a fall-back plan—you need a back-out plan as well. Being able to pivot quickly and respond appropriately to an unforeseen challenge or opportunity is mission-critical. Better yet, following best practices that mitigate risk and enable contingency planning from the start (like implementing an infrastructure-as-code protocol), can help the business you’re supporting avoid crippling delays in their research process. 
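As a toy illustration of why infrastructure-as-code supports back-out planning (a minimal Python sketch; real deployments would use a tool like Terraform or AWS CloudFormation, and the function names here are purely illustrative): when the environment is described as versioned data, reverting to the last known-good state becomes a deliberate, repeatable operation rather than a scramble.

```python
# Toy sketch: environment state as versioned, declarative data.
history = []  # applied configurations, oldest first

def apply_config(config):
    """Record and 'deploy' a new environment configuration."""
    history.append(dict(config))
    return history[-1]

def back_out():
    """Revert to the previous known-good configuration."""
    if len(history) < 2:
        raise RuntimeError("no earlier configuration to revert to")
    history.pop()          # discard the faulty deployment
    return history[-1]     # previous config becomes current again

apply_config({"compute_nodes": 4, "storage_tb": 50})
apply_config({"compute_nodes": 16, "storage_tb": 50})  # risky scale-up
current = back_out()                                   # revert cleanly
```

Because every change is recorded declaratively, the back-out path is known before the risky change is ever applied.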

While this isn’t an exhaustive list, these five strategies provide an immediate blueprint to improve collaboration between R&D IT teams and the business, and to support better research outcomes through smarter scientific computing. If you’re looking for more support, RCH Solutions’ specialized Sci-T Managed Services could be the answer. Learn more about this specialized service here.

 

4 Scientific Computing Best Practices to Take Charge of your R&D IT Efforts in 2022

Attention R&D IT decision makers: 

If you’re expecting different results in 2022 despite relying on the same IT vendors and rigid support model that didn’t quite get you to your goal last year, it may be time to hit pause on your plan.

At RCH, we’ve spent the past 30+ years paying close attention to what works, and what doesn’t, while providing specialty scientific computing and research IT support exclusively in the Life Sciences. We’ve put together this list of must-do best practices that you, and especially your external IT partner, should move to the center of your strategy to help you take charge of your R&D IT roadmap.

And if your partners aren’t giving you this advice to get your project back on track? Well, it may be time to find a new one.

1. Ground Your Plan in Reality
In the high-stakes and often-demanding environment of R&D IT, the tendency to move toward solutioning before fully exposing and diagnosing the issue or opportunity is very common. However, this approach is not only ineffective, it’s also expensive. Only when your strategy and plan are created to account for where you are today, not where you’d like to be, can you be confident they will take you where you want to go. Otherwise, you’ll be taking two steps forward and one (or more) steps back the entire time.

2. Start with Good Data
Research IT professionals are often asked to support a wide range of data-related projects. But the reality is, scientists can’t use data to drive good insights if they can’t find or make sense of the data in the first place. Implementing FAIR data practices should be the centerpiece of any scientific computing strategy. Only once you see the full scope of your data needs can you deliver on high-value projects, such as analytics or visualization.

3. Make “Fit-For-Purpose” Your Mantra
Research is never a one-size-fits-all process. Though variables may be consistent based on the parameters of your organization and what has worked well in the past, viewing each challenge as unique affords you the opportunity to leverage best-of-breed design patterns and technologies. Therefore, resist the urge to force a solution into a framework if it’s not the optimal fit, even one that has worked well in other instances, and opt for a more strategic and tailored approach.

4. Be Clear On the Real Source of Risk
Risk exists in virtually all industries, but in a highly regulated environment, the concept of mitigating risk is ubiquitous, and for good reason.  When the integrity of data or processes drives outcomes that can actually influence life or death, accuracy is not only everything, it’s the only thing. And so the tendency is to go with what you know. But ask yourself this: does your effort to minimize risk stifle innovation? In a business built on boundary-breaking innovation, mistaking static for “safe” can be costly.  Identifying which projects, processes and/or workloads would be better managed by other, more specialized service providers may actually reduce risk by improving project outcomes.   

Reaching Your R&D IT Goals in 2022:  A Final Thought

Never substitute experience. 

Often, the strategy that leads to many effective scientific and technical computing initiatives within an R&D IT framework differs from a traditional enterprise IT model. And that’s ok because, just as often, the goals do as well. That’s why it is so important to leverage the expertise of R&D IT professionals highly specialized and experienced in this niche space.

Experience takes time to develop. It’s not simply knowing which solutions work or don’t, but understanding the types of solutions or solution paths that are optimal for a particular goal because, well, they’ve been there and done that. It’s having the ability to project potential outcomes in order to influence priorities and workflows. And ultimately, it’s knowing how to find the best design patterns.

It’s this level of specialization — focused expertise combined with real, hands-on experience — that can make all the difference in your ability to realize your outcome. 

And if you’re still on the fence about that, just take a look at some of these case studies to see how it’s working for others.