Optimize Your Life Science Startup for Success From the Start: Develop a Solid Compute Model

Building an effective computing environment early on helps ensure positive research outcomes later. 

Now more than ever, Life Science research and development is driven by technological innovation.

That doesn’t mean human ingenuity has become any less important. It simply means that research now depends on accurate, well-structured data more than ever before. The way Life Science researchers capture, store, and communicate that data is crucial to their success.

This is one of the reasons why Life Science professionals are leaving major pharmaceutical firms and starting their own new ventures. Startups have the unique opportunity to optimize their infrastructure from the very start and avoid being hamstrung by technology and governance limitations the way many large enterprises often are.

Optimized tech and data management models offer a significant competitive advantage in the Life Science and biopharma industries. For example, implementing AI-based or other predictive software and lean workflows makes it easier for scientists to synthesize data and track toward positive results or, equally important, quickly see the need to pivot their strategy to pursue a more viable solution. The net effect is a reduction in the time and cost of discovery, which not only gives R&D teams a competitive upper hand but also improves outcomes for patients.

Time is Money in R&D

Life Science research and development is a time-consuming, resource-intensive process that does not always yield the results scientists or stakeholders would like. But startup executives who optimize discovery processes using state-of-the-art technology early on can mitigate two significant risks:

  • Fixing Broken Environments. Building and deploying optimized computing infrastructure is far easier and less expensive than repairing a less-than-ideal computing environment once you hit an obstacle. 
  • Losing Research Data Value. Suboptimal infrastructure makes it difficult to fully leverage data to achieve research goals. This means spending more time manually handling data and less time performing critical analysis. In a worst-case scenario, a bad infrastructure can lead to even good data being lost or mismanaged, rendering it useless. 

Startups that get the experience and expertise they need early on can address these risks and deploy a solid compute model that will generate long-lasting value.  

5 Areas of Your Computing Model to Optimize for Maximum Success

There are five foundational areas startup researchers should focus on when considering and developing their compute model:

1. Technology

Research teams need to consider how different technologies interact with one another and what kinds of integrations they support. They should identify the skillset each technology demands of its users and, if necessary, seek objective guidance from a third-party consultant when choosing between technology vendors.

2. Operating Systems

Dependable operating systems are foundational to research computing, especially in the Life Sciences. Not only must operating systems support every tool in the researchers’ tech stack, but individual researchers must also be well-acquainted with the way those systems work. Researchers need resource management solutions that share information between stakeholders easily and securely.

3. Applications and Software

Most Life Science organizations use a variety of on-prem, Cloud-enabled, open-source, and even home-grown applications procured on short-term contracts. This offers flexibility, but organizations cannot easily coordinate between software and applications with different implementation and support requirements. Since these tools come from different sources and have varying levels of post-sale documentation and support, scientists often have to take up the heavy burden of harmonizing their tech stack on their own.
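
To make that harmonization burden concrete, here is a minimal sketch, assuming a Python environment, of the kind of check a team might script to keep a heterogeneous stack consistent across machines. The tool names and version pins in the manifest are hypothetical placeholders, not recommendations:

```python
import shutil
import subprocess

# Hypothetical manifest: each command-line tool a team depends on,
# mapped to the version the group has agreed to standardize on.
TOOL_MANIFEST = {
    "samtools": "1.17",
    "bwa": "0.7.17",
    "python3": "3.11",
}


def check_tools(manifest):
    """Report whether each tool is on PATH and what version it reports."""
    for tool, expected in manifest.items():
        path = shutil.which(tool)
        if path is None:
            print(f"MISSING  {tool} (expected {expected})")
            continue
        # Many (not all) scientific CLIs print a version via --version;
        # tools that don't will simply show up as 'unknown' here.
        try:
            result = subprocess.run(
                [tool, "--version"], capture_output=True, text=True, timeout=10
            )
            output = (result.stdout or result.stderr).strip()
            version_line = output.splitlines()[0] if output else "unknown"
        except (subprocess.TimeoutExpired, OSError):
            version_line = "unknown"
        print(f"FOUND    {tool} at {path}: {version_line} (expected {expected})")


if __name__ == "__main__":
    check_tools(TOOL_MANIFEST)
```

Even a simple check like this moves stack consistency out of tribal knowledge and into something any team member, or a scheduled job, can run on demand.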

4. Workflows

Researchers have access to more scientific instruments than ever before. Manufacturers routinely provide assistance and support in implementing these systems, but that is not always enough. Startups need expert guidance in establishing workflows that utilize technological and human resources optimally. 

But building and optimizing scientific workflows is not a one-size-fits-all endeavor; teams with multiple research goals may need separate workflows, each optimized differently to accommodate its specific goal.
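
As a rough sketch of that idea, assuming plain Python and using hypothetical step names in place of real instrument and analysis tasks, the same shared steps can be composed into differently optimized pipelines for different research goals:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Step:
    """A named unit of work; 'run' takes and returns a context dict."""
    name: str
    run: Callable[[Dict], Dict]


def run_workflow(label: str, steps: List[Step], context: Dict) -> Dict:
    """Execute steps in order, threading the context through each one."""
    print(f"== workflow: {label} ==")
    for step in steps:
        print(f"  running: {step.name}")
        context = step.run(context)
    return context


# Shared steps (hypothetical placeholders for real tasks).
ingest = Step("ingest raw instrument data", lambda ctx: {**ctx, "raw": "loaded"})
qc = Step("quality control", lambda ctx: {**ctx, "qc": "passed"})

# Goal-specific pipelines: a quick screen skips deep QC for turnaround,
# while a characterization run trades speed for rigor.
fast_screen = [ingest, Step("coarse screen", lambda ctx: {**ctx, "hits": 12})]
deep_characterization = [
    ingest,
    qc,
    Step("full analysis", lambda ctx: {**ctx, "report": "done"}),
]

if __name__ == "__main__":
    run_workflow("fast screen", fast_screen, {})
    run_workflow("deep characterization", deep_characterization, {})
```

Real pipelines would use a workflow manager rather than hand-rolled code, but the structural point holds: shared steps, goal-specific composition.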

5. Best Practices

Optimizing a set of research processes to deliver predictable results is not possible without a stable compute environment in place. For Life Science research organizations to develop a robust set of best practices, they must first implement the scientific computing model that makes those practices possible. That takes expert guidance and implementation from professionals who specialize in the IT considerations unique to a research and development environment, a resource that lean startups often simply don’t have on the team.

Maximize the Impact of Research Initiatives 

Emerging Life Science and biotech research companies have to empower their scientific teams to make the most of the tools now available. But architecting, building, and implementing a robust and effective compute model requires experience and expertise in the very specific area of IT unique to research and discovery. If the team lacks such a resource, scientists will often jump into the role of solving IT problems, pulling them away from the core value of their expertise.

The right bio-IT partner can be instrumental in helping organizations design, develop, and implement their computing environment, enabling scientists to remain focused on science and helping to position the organization for long-term success.   

RCH Solutions is a global provider of computational science expertise, helping Life Sciences and Healthcare firms of all sizes clear the path to discovery for nearly 30 years. If you’re interested in learning how RCH can support your goals, get in touch with us here.

Remember When?

Remember when research and scientific computing in Life Sciences reflected a simpler time?  

Teams in Early Discovery through Development had almost complete liberty to manage their computing environment. Moreover, research I.T. was often separate from general I.T. And business groups were supported by dedicated technical professionals specialized not only in a particular area of science, but also in the compute practices required to properly support it. Combined, these teams had the experience, expertise, and flexibility to implement best practices and tools needed to advance research.

But times have changed.  

Today, the environment we work in is defined in large part by the effects of more than 15 years of consolidation. In 2018 alone, global M&A activity within the pharmaceutical industry reached $265 billion, an increase of more than 25% over 2017. There are many sound business reasons driving consolidation: the growing cost and complexity of effective R&D work, a shifting regulatory landscape, increasing competition within certain domains and from more sources, and the need to balance innovation with the economics of drug development. But the outcome has had unintended consequences for the critical and formerly coveted qualities of speed, efficiency, transparency and, ironically, collaboration, despite the fact that many scientific computing specialists have now been brought back into general I.T.

New and disruptive technologies have also played a role in redefining the look of scientific computing as we know it.  

Look at the advent of the Cloud. Arguably the single most significant I.T. innovation for the better part of the last decade, the Cloud and its capabilities are forcing change and shifting expectations around everything from infrastructure (hardware), to platforms (Windows, Linux), to software (application development and deployment). Most significant, perhaps, is its effect on the “business” of I.T. itself, with the Cloud affecting costs in many ways, including through economies of scale, use of OpEx in place of CapEx, more streamlined deployment of applications, and a better operating model for each business.
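
To illustrate the OpEx-in-place-of-CapEx point with a toy model (every figure below is a hypothetical assumption, not a benchmark):

```python
# Toy comparison of on-prem CapEx vs. Cloud OpEx over one hardware
# refresh cycle. All figures are hypothetical, for illustration only.

YEARS = 4  # assumed on-prem hardware refresh cycle

# On-prem: a large upfront purchase plus recurring running costs.
capex_purchase = 400_000      # servers, storage, networking (hypothetical)
onprem_annual_costs = 60_000  # power, cooling, support contracts (hypothetical)

# Cloud: no upfront purchase, just a monthly bill that can scale
# up or down with actual research demand.
cloud_monthly = 11_000        # hypothetical steady-state usage

onprem_total = capex_purchase + onprem_annual_costs * YEARS
cloud_total = cloud_monthly * 12 * YEARS

print(f"On-prem total over {YEARS} years: ${onprem_total:,}")
print(f"Cloud total over {YEARS} years:   ${cloud_total:,}")
# The takeaway is not which number is smaller; it's that the Cloud
# converts a lumpy capital expense into an operating expense that
# tracks usage, which changes how I.T. budgets are planned.
```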

Nonetheless, as technology and scientific innovation collide in new ways, we’re seeing more firms facing barriers to scientific innovation. 

A Model Made for the Masses

Under an enterprise I.T. model that promotes standardization over specialization, many scientific computing professionals are choosing other paths, limited by policies and other practices that seldom fit the unique needs of science. 

And with resources thinning while demand for expertise in new and emerging technologies (like the Cloud, AI and ML) grows, many companies turn to outsourced support from ‘cost-effective’ vendors who fill seats and route support tickets, but bring little if any specialized research computing experience. 

More good people leave. 

Service—and science—continues to suffer. 

And business groups are left to choose between poor support or no support at all. 

A Better Way

I’ve often used the following metaphor, as I can think of no industry where it is more relatable than ours.

If you need a routine physical, chances are you’d be confident that a generalist like your family physician is more than qualified to perform the exam and provide an accurate assessment of your general health.  

But what if you also had a heart condition? Would you not seek a physician who specializes in cardiac care? With experience in a broad range of health topics, your family doctor undoubtedly plays an important role in helping you maintain your overall well-being. However, seeking (or failing to seek) the care of an expert with specialized experience in a more specific area of medicine, when warranted, could mean the difference between life and death.

While perhaps an oversimplification for effect, the point is this: the same principle applies to modern scientific computing environments in the Life Sciences.

Like it or not, many companies have evolved away from a model that embeds dedicated research computing professionals within the business unit at a time when that unique skill-set and focused expertise is needed most.

Businesses that attempt to meet that need through the support of I.T. generalists, rather than turning to dedicated specialists in the Life Sciences, are at a clear disadvantage. So while we’re thinking about times past through the lens of where they have brought us today, those who fail to leverage the expertise that is available may very well find themselves asking the same question, “remember when?”, but for a very different reason.

RCH Solutions is a global provider of computational science expertise, helping Life Sciences and Healthcare firms of all sizes clear the path to discovery for nearly 30 years. If you’re interested in learning how RCH can support your goals, get in touch with us here.

10,000 Hours and The Reality of Hard Work

By now we’ve all heard of the now-legendary self-help playbook, Outliers: The Story of Success, by Malcolm Gladwell, in which he shares his take on how to achieve success in any particular endeavor: practice correctly for roughly 10,000 hours and you’re on your way to high performance, no matter the field.

While the merit of the rule can and will continue to be debated by evangelists and naysayers alike (after all, there are exceptions to every rule), Gladwell’s research attempts to debunk the myth that achievement is based on luck or chance. And though the roles of family, culture, and friendship are considered by the author, the value of time, focus, and effort almost always emerges as the most essential element in the formula for success. Let’s look at a few of my favorite examples.

Overnight Sensation: Or Not?

Often assumed to be the music industry’s biggest overnight sensation, and the biggest rock band in history, The Beatles arrived on the rock ‘n’ roll scene (by way of the Ed Sullivan Show) as part of the British Invasion in the mid-1960s. Though they seemed to captivate American youth almost instantly, in reality it took the band several years of playing together and multiple name changes (and even more haircuts) before they would form the mold for the pop culture icon they would become.

Bill Gates is another example. He started his first venture in computer science in 1970, at just 15, and his climb to the top was not accelerated but rather long and consistent. And today, as only the second richest person in the world, he may not be finished yet.

The subjects of both examples (arguably two of the greatest influencers of our modern culture, albeit in different ways) clearly put in a lot of hard work well before they became well-known and successful; as Gladwell would claim, that work would amount to at least 10,000 hours of honing their craft.

As Big As the Beatles?

At RCH Solutions, we believe there is no substitute for experience and have spent more than 27 years honing our craft—scientific computing specifically within the Life Sciences. During that time, we’ve changed our business model to reflect the unique and evolving demands of our customers, while maintaining a culture crafted for learning and achieving. 

Our customer base has been built through years of focused work in a very specific area. And while we find that many of our relationships have grown organically, driven by good results, we sometimes joke about how often we hear comments like, “We didn’t know RCH did that.” (Cue marketing.) The reality, though, is that the model we follow has created a culture that is very much like a supportive family and a good group of friends. We encourage exploration and joy in our work for both employees and customers, and will prioritize quality over quantity any day of the week.

So, while an appearance on the Ed Sullivan Show and ensuing fame may not be an option for us, we will happily continue to be relentless in our practice and pursuit of innovation, challenging ourselves to deliver a ground-breaking computing experience for our clients every day, so that they can deliver life-saving science to humanity.

RCH Solutions is a global provider of computational science expertise, helping Life Sciences and Healthcare firms of all sizes clear the path to discovery for nearly 30 years. If you’re interested in learning how RCH can support your goals, get in touch with us here.

Best Practices for R&D IT

If you’re in charge of your organization’s R&D IT efforts and expecting different results from the same initiatives, it’s time to reassess your Bio-IT roadmap.

Start here with this list of best practices curated from almost 30 years of experience exclusively focused on scientific computing within the Life Sciences:

  • Don’t try to make good decisions with bad data. As Bio-IT professionals, we’re often asked to support a wide range of data-related projects. After all, data is currency in the Life Sciences: the more you have, the more you can do. But even petabytes of data don’t do you any good if you can’t retrieve or make sense of them. While delivering big on a high-value project (analytics is the Holy Grail) may be on your team’s radar, good intentions alone won’t get you results; your plan and your processes are critical. Which leads me to my next point.
  • Start from where you are, rather than where you want to be. Would you ever consider building a house on a foundation you weren’t sure was solid? Probably not. Yet in the high-stakes and often-demanding environment of R&D IT, the tendency to move toward solutioning before fully exposing and diagnosing the extent of the issue or opportunity is all too common. This approach is not only backward, it’s also costly. Only when you know where you’re starting from can you accurately identify the right strategy or tools to get you where you want to go.
  • Listen (a lot). This one may seem like a no-brainer, but human nature primes us to ready a response when we should actually still be listening. In the field of scientific computing, listening should take place as part of a constructive engagement between I.T. and the business, with the goal of understanding each side’s particular challenges and needs.
  • Clarify risk. In a highly regulated environment, the concept of mitigating risk is ubiquitous, and for good reason. When the integrity of data or processes drives outcomes that can actually mean life or death, accuracy is not only everything, it’s the only thing. But different projects, business units, and even companies have varying appetites for risk. Before assuming what will and won’t work because of the perceived risk involved, be sure to clearly understand where and what may be influencing a particular way of thinking.
  • Let the shape of the solution fit the shape of the problem. Square peg, round hole: you know the outcome. Though variables may be consistent based on the parameters of your organization and what’s worked well in the past, viewing each challenge as unique affords you the opportunity (or luxury) to leverage best-of-breed design patterns and technologies to answer your needs. Resist the urge to jam a prefab solution into a framework not equipped to support it and opt for a more strategic (and tailored) approach.
  • Leverage small projects to build momentum for big initiatives. Ever try to get your kids to clean their room? Psychology 101 has taught us that humans are programmed to respond more favorably when our big goals are broken into a series of smaller, more attainable steps. The same applies when effecting change within a company. If you want buy-in to tackle a big initiative (like converting to the Cloud, for example), start first on smaller elements of the effort. Agree on what success looks like, then build upon that platform.
  • Plan for the life of the solution. Once a solution is delivered—no matter what type it may be—it’s only the beginning. Solutions have to be maintained, oftentimes upgraded, and eventually retired.  Future-proof your systems when architecting them by thinking about your operations today AND years from today.
  • Build agility into your team. Often, the efficiency and flexibility of the team implementing your compute solutions are just as critical as the solutions themselves. Knowing how to structure your team and workflows (like when and why to introduce a DevOps or a DevSecOps model, for example) can help you bridge the gaps that often impede innovation.
  • Always have a Plan B. No IT or Bio-IT operation should move forward without a fall-back (or rather, a “back-out”) plan; see the sketch after this list. Being able to pivot quickly and respond appropriately in the event of an unforeseen challenge (or opportunity, if you’re an optimist) can mean the difference between the ability to try again, or not.
  • Trust the process. This is a phrase infinitely more famous for what it means to Philadelphia 76ers fans, but it is unequivocally applicable in scientific computing. Often, the strategy that leads to effective scientific and technical computing initiatives within an R&D IT framework is different from a traditional enterprise IT model. And that’s OK, because just as often the goals are different as well. Leverage the expertise of R&D IT professionals highly specialized and experienced in this niche space, and trust the process.
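
On the Plan B point above, here is a minimal sketch of the back-out principle, in plain Python with hypothetical change names; a real plan would wrap actual infrastructure or deployment tooling rather than no-op functions:

```python
from typing import Callable, List, Tuple

# Each change is paired with the action that undoes it.
Change = Tuple[str, Callable[[], None], Callable[[], None]]


def apply_with_backout(changes: List[Change]) -> bool:
    """Apply changes in order; on failure, back out what was applied."""
    applied: List[Change] = []
    for name, apply, back_out in changes:
        try:
            print(f"applying: {name}")
            apply()
            applied.append((name, apply, back_out))
        except Exception as exc:
            print(f"failed: {name} ({exc}); backing out")
            # Undo everything applied so far, most recent first.
            for done_name, _, undo in reversed(applied):
                print(f"backing out: {done_name}")
                undo()
            return False
    return True


if __name__ == "__main__":
    # Hypothetical changes; real ones would call deployment tooling.
    apply_with_backout([
        ("update scheduler config", lambda: None, lambda: None),
        ("migrate reference data", lambda: None, lambda: None),
    ])
```

The design choice worth copying is that the undo step is defined at the same time as the change itself, so the back-out plan exists before it is needed.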

Finally, I’ll leave you with this: there is no substitute for experience. The value in solutioning is a mix of skill, dedication, resources, time, and experience.

Skill can be developed (often quickly) when needed.

Dedication is generally a factor of personality, but can be fostered.

Resources are often fixed as a factor of the company.

And with only 24 hours in each day, time is beyond our control; to be successful, you must use it wisely.

Experience, on the other hand, takes time to develop. It’s a product of resources and dedication applied to build skills. It’s not simply knowing which solutions work or don’t, but rather understanding the types of solutions or solution paths that are optimal for a particular goal, because you’ve tried them before. It’s having the ability to project potential outcomes in order to influence priorities and workflows. And ultimately, it’s knowing how to find the best design patterns.

After nearly three decades of support for Bio-IT initiatives in the Life Sciences and Healthcare, I can say with certainty that experience is the single constant in a world otherwise fueled by innovation and change.

Top 5 Takeaways from Bio-IT 2019

Didn’t make it to the event?  Read on for key insights from this year’s expo.

Last week, hundreds of the industry’s top minds converged on the Seaport World Trade Center in Boston for the annual Bio-IT World conference and expo. As usual, it was an information-packed event and an invaluable opportunity to connect with colleagues, friends, partners, clients, and thought-leaders across the industry.  

While there was no shortage of ideas and information exchanged, there were a few themes that emerged as the most pressing.

Unsurprisingly, the use of data and analytics, as well as the applicability of the Cloud, continue to prompt as many questions as they do answers for those exploring how to evolve a compute infrastructure to support—and accelerate—scientific innovation. In fact, the industry as a whole may not be able to realize the full value of these powerful capabilities until it collectively habituates to the buzz that surrounds them. Until then (and likely after as well), the need for an experienced partner to help navigate these trends will be more important than ever.

Beyond that, here are my other top takeaways from the week:

5. Scientific Computing is still evolving.
Platforms such as the Cloud are moving beyond adoption toward transformation. Emerging areas such as AI and ML are beginning to bear fruit for certain applications. It is exciting to see the promise of realized utility and applicability of these platforms.

4. Data is exploding and the blast is not yet well contained.
John Conway of AstraZeneca made a number of key observations, including the need to treat data differently. He commented during his presentation that data needs to be treated and managed like currency. Only then will it be seen as truly valuable, and only then will the appropriate strategies for storing, sharing, and managing it be applied.

3. Analytics are only one piece of the equation (and equations need to be balanced).
While the tools are plenty, the process is paramount and still taking shape. Recently, I was asked by the global CIO for R&D of a top 10 pharmaceutical company about our experience with analytics. My response was a question to the effect of, “Where is the data located, and how will it be curated and then managed?” In other words, analytical tools can be powerful factors in your complete data strategy, but they are only one of several factors. Mapping out an effective process and workflow for mining and aggregating the data is equally important.

2. Don’t believe the Cloud hype … unless you’ve done these 3 things first.
Lance Smith of Celgene noted several steps any Life Sciences organization must take before it is in a position to enjoy the long-term benefits of the Cloud.

One: Assess which applications are best-suited for on-prem vs. in the Cloud
(because not all things are created equal)

Two: Establish a full-time, dedicated, Cloud team
(because effective Cloud implementation and evolution can’t be done part-time)

Three: Get multiple levels of management on board
(because turning a big ship can require a big effort)

I’ll go a step further and add a fourth: Choose your Cloud partner wisely. If the Cloud is a critical element of your innovation roadmap, take the time to make sure your partner brings the right levels of experience in the right areas: the industry, the business, and the applications critical to advancing your goals.

1. Companies are seeking new ways to solve new (and nuisance) challenges.
As the demand on the business to do more with less grows, the value of an IT partner who offers speed, specialization, and scientific computing experience is becoming more clear. Anecdotally, our booth had more traffic and interest than ever before. While some may snicker that our salmon-colored sweaters played a role in that (thanks, marketing), I take it as a sign that the shift away from large, one-size-fits-all vendors is underway. After all, as Klaus Schwab once said, “In the new world, it’s not the big fish that eats the small fish, it’s the fast fish that eats the slow fish.”

Looking ahead, one thing is clear: It’s going to be a fun ride as nimble organizations like RCH take the reins to help Life Sciences and Healthcare companies steer through some of these uncharted lands.

Until next year, Bio-IT World …