Digitizing Pharma Part 1: Designing Better Hypotheses
What Pharma Can Learn from the Engineers who Conquered Flight
Whenever I discuss my vision for the next generation of pharmaceutical discovery and commercialization, I like to draw analogies to the aerospace and automotive industries from the 1950s through the 2010s. It may be hard to fathom that a simple pill or injection likely costs over $2B to design, develop, and commercialize before it reaches us, but just like designing airplanes or building mass-scale automotive manufacturing lines, drugs are exceptionally expensive to engineer.
I want to draw inspiration from mechanical engineering history across these two verticals:
Computer-Aided Design in Aerospace (Designing Better Hypotheses) - how computational simulations sped up ‘Design-Make-Test-Analyze’ cycles significantly.
Manufacturing and Robotics for Automotives (High-Throughput, Automated Task Execution) - how mass-scale automation allowed for ultra-quick iteration, experimentation, and fabrication in the physical world.
I will write about high-throughput automated task execution in Part 2. I’ll make this a three-part mini-series:
Part 1 - Aviation and Computational Fluid Dynamics
Part 2 - Automotive and Robotic Automation
Part 3 - Thoughts on Digitizing Pharma End to End
Designing Better Hypotheses
Why is this a great analogy? Because, like Pharma, Aerospace is highly complex and has exceptionally high stakes with lives on the line; development cycles run years to decades from concept to deployment; physical prototyping is expensive, with tests costing millions to run; and regulatory intensity is comparable (FAA ~ FDA). Yet aerospace underwent a complete transformation. This can serve as a great roadmap (to a degree) for what will happen to Pharma in the next few decades.
Let’s Begin in 1950 - Post World War Era and Consumer Demand as a Catalyst
In the early 1950s, after World War II, the United States and its allies were left with tremendous industrial capacity that had previously been used to build ships, airplanes, weapons, and automobiles. In a ‘Golden Age’ of American capitalism, with massive factories, materials, and skilled labor left over, these resources were redirected toward an already growing economy - applied to consumer-facing goods like cars, housing, and air travel.
With the gears of the third industrial revolution/early stage digital revolution moving at full capacity, critical innovations were being made in the aerospace and automotive industries, especially during the ‘golden age of air travel’. As consumer demand grew, engineers from these industries had to figure out ways to speed up the rate at which new innovations were hitting the market to meet increased customer needs.
Aside from the major dip in airline travel during COVID-19, rising consumer demand drove innovation in the aviation industry from 1950 onward [6].
An equivalent in Pharma is Eroom’s Law: as development efficiency declines and disease rates skyrocket, demand grows for better, more effective medicines.
The 1960s and 1970s - Introducing Simulations for Better Designs
To fulfill that demand, and to take advantage of the private and public funds being poured into the aerospace industry, engineers turned toward simulations to create more effective engineering design cycles for new aircraft. Taking a page out of early aerospace engineers’ playbook, today’s ‘digital twins’, ‘virtual cells’, and ‘biochemical models of ligand-protein binding’ are workflows similar to those engineers used in the 1960s when designing airplanes for a growing consumer base of air travel. Engineers began computationally designing components, introducing physics-based parameters for panel and potential flow models, thermal and heat transfer analysis, and orbital and trajectory dynamics.
By the 1970s, simulations were accelerating, with the first ‘digital twin’ famously credited to the Apollo 13 mission, when an oxygen tank exploded mid-flight. The crisis forced ground control to rely on physics-based models to bring the crew home safely. It became clear that these digital models would transform the way aircraft were designed, tested, and deployed. The equations and physical models were validated through expansive testing, and the business value they provided materialized. As iteration timelines shortened, and simulations allowed for testing hypotheses that would have been exorbitantly expensive to build and test in previous design eras, new value was unlocked, especially as the technology matured.
In Pharma, computational models were pioneered by early companies in the 1990s, building on the foundational physics developed by early-20th-century physicists like Erwin Schrödinger, which underlies the majority of pharmaceutical chemistry and molecular simulation. Pictured below are some of his core equations that laid the foundation for how we understand these systems:
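Chief among these is the time-independent Schrödinger equation; in a standard textbook form, for a system of charged particles it reads:

```latex
\hat{H}\,\Psi = E\,\Psi,
\qquad
\hat{H} = -\sum_{i}\frac{\hbar^{2}}{2m_{i}}\nabla_{i}^{2}
          + \sum_{i<j}\frac{q_{i}q_{j}}{4\pi\varepsilon_{0}\,r_{ij}}
```

Here \(\Psi\) is the many-particle wavefunction, \(E\) is the system’s energy, and the two terms of the Hamiltonian \(\hat{H}\) capture kinetic energy and pairwise Coulomb interactions - the physics underlying molecular simulation.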
I’ll expand on other methods, equations, and systems in another article. The equations above are practically impossible to solve exactly - the cost grows combinatorially (roughly factorially) with system size, and a protein can have 6-30k atoms - so modern chemists utilize approximations like Hartree-Fock (HF), post-HF methods, and Density Functional Theory (DFT) as more practical ways of solving those equations for different applications, like understanding interactions.
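To see why approximations are unavoidable, here is a toy Python sketch comparing a factorially scaling exact treatment against a polynomially scaling method like DFT. The cost functions are illustrative stand-ins for growth rates, not real operation counts:

```python
import math

def exact_cost(n):
    # Factorial growth: a toy stand-in for exact (full CI-style) cost
    return math.factorial(n)

def dft_cost(n):
    # DFT-style polynomial growth, roughly O(n^3)
    return n ** 3

# Even at just 20 particles, the exact approach is astronomically
# more expensive than the polynomial-scaling approximation.
ratio = exact_cost(20) / dft_cost(20)
```

At only 20 particles the exact route is already over a hundred trillion times more expensive; at protein scale (thousands of atoms) it is hopeless, which is exactly why HF, post-HF, and DFT exist.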
Early computational chemistry teams brought these models of ligand-protein binding into larger enterprise projects, using quantum principles to understand how designed compounds interact with core biological components like proteins. Early models struggled with experimental translation, and that same issue persists today, though newer AI- and physics-informed models like AlphaFold, Chai, Boltz2, and others are getting closer to generalizable simulations of challenging structural biology.
What Were the Benefits of Simulations?
Before simulations matured, aerospace engineers had to build physical prototypes - an extensive effort in itself - before sending them out to be tested in wind tunnels, and timelines from initial designs and prototypes to viable aircraft could stretch to decades. Companies like Lockheed and Boeing would need to build full-scale wind tunnel models, develop initial physical prototypes of the aircraft, test them, find problems, tear them apart, and repeat until a flight-ready model was prepared.
With the advancement of computational fluid dynamics (CFD), just one simulation type of many, engineers were able to model aircraft aerodynamics across different sonic regimes at high fidelity, which meant that airflows could be modeled on a computer without needing to build as many expensive prototypes. The design-make-test cycle became more of an engineering process, in which computational modeling iterations became the new standard for aircraft designs before any physical models were built.
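For flavor, here is a minimal sketch of the numerical machinery CFD is built on: a first-order upwind finite-difference solver for 1D linear advection on a periodic domain. Real aerodynamics codes solve the Navier-Stokes equations in 3D; this toy only shows the discretize-and-step structure, and every name and parameter is illustrative:

```python
def advect(u, c, dx, dt, steps):
    """Advance profile u under constant wind speed c (first-order upwind)."""
    cfl = c * dt / dx  # Courant number; must satisfy 0 < cfl <= 1
    assert 0 < cfl <= 1, "unstable: reduce dt or increase dx"
    u = list(u)
    for _ in range(steps):
        # u[i - 1] wraps to u[-1] at i == 0, giving periodic boundaries
        u = [u[i] - cfl * (u[i] - u[i - 1]) for i in range(len(u))]
    return u

# A square pulse of "air" drifting to the right across the domain
n = 100
u0 = [1.0 if 10 <= i < 20 else 0.0 for i in range(n)]
u = advect(u0, c=1.0, dx=1.0, dt=0.5, steps=40)
```

The scheme conserves the total of the profile exactly while the pulse translates (with some numerical smearing) - the kind of behavior engineers then validate against wind tunnel data.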
In the modern Pharma era, we are in the earlier stages of our own computational revolution, so the net economic and engineering benefits are still materializing. Because biology is highly dimensional (many variables interacting at once), AI has yielded breakthroughs toward more translatable models, and as adoption increases, we will begin to see and record the net benefits for the chemists and biologists designing these systems.
Changes in Aerospace Infrastructure With the Rise of Simulations
During aviation’s golden age in the 1950s and ’60s, the number of operational wind tunnels peaked; newly designed airplanes would eventually be tested and vetted in them before becoming commercially viable. NASA expanded its supersonic and specialty tunnels, but as simulations became more prevalent, challenging economic and infrastructure issues arose. Wind tunnels eventually could not keep up with the pace of innovation that computational fluid dynamics enabled during the rise of digital infrastructure. In the 1980s, as 3D Euler and Navier-Stokes solvers and 3D RANS models (equations for solving steady-state aerodynamic performance like lift, drag, and pressure distributions) were being integrated and improved at scale, wind tunnels were undergoing a strong decommissioning period in parallel.
Wind tunnels faced increased pressure due to declining and erratic usage, higher electricity costs, staffing expenses, program changes and terminations, and aging technology. As computationally driven designs advanced, older wind tunnels lacked the capabilities to test them with their outdated systems. The RAND Corporation, cited by NASA, reported that CFD contributed to a reduction of ~50% in the number of required wind tunnel testing hours for transport aircraft programs [1]. Because of the substantial overhead required to maintain these large experimental facilities, and their decreased utilization with the rise of computational simulations, the number of operational wind tunnels declined sharply, especially during a large decommissioning period between 1990 and 2010. According to a 2010 Lockheed Martin report, experimental facilities declined from 120 wind tunnels in commission in 1985 to only 61 in 2009 [2].
My belief is that the end product for both the aeronautics and pharmaceutical industries, whether an aircraft or a drug, should be highly validated experimentally. That can mean a wind tunnel in the case of an airplane, or biochemical assays, cell assays, animal testing, and clinical trials for drugs. Computational simulations are not one-shot silver bullets for getting drugs to patients; they are methods we can use to develop smarter, more capable prototypes and hypotheses, cutting down the experimental burden we take on to validate them. Today, wind tunnels are still used to test new airplanes, at reduced capacity and overhead, allowing smarter engineering to take place across teams before necessitating a full-scale physical build-out that previously carried high failure rates during the validation stage.
In Pharma, there is actually a duality occurring simultaneously: some companies are making large bets on aggressively expanding wet-lab capabilities, while major investments are also going into better simulations. We are seeing miniaturization of assay components, greater investment in high-throughput facilities and robotic automation, and mass-scale laboratory innovation. I’ll outline the differences and nuance between these two industries, and predicted trends, in Parts 2 and 3 of this series.
What Happened Economically During the Aerospace Digital Revolution?
Simply put, consumer demand skyrocketed in the 1950s and persisted into the modern day with greater globalization, interconnected economies, and ease of transport - giving us the democratized access we enjoy today. Pressure to build better designs with reduced drag, greater speeds, and more capabilities drove engineers to adopt CFD at scale, and the sheer efficiency it brought shifted the infrastructure built around the industry. So what economic trends were observed during this time?
Wind Tunnel Closures vs. Ultra Large Supercomputer Infrastructure Investments
As physical testing resources declined, better mathematics and physics were developed that modeled these aerospace systems at greater fidelity. Trust grew, and investment in the computation required to host these models grew with it.
During the early 1980s, NASA Ames scientists worked closely with computing companies like Silicon Graphics Inc (SGI) to improve workstations and make them commercially viable for wider distribution [3]. As the needs of federal agencies, U.S. universities, and companies with the same computational aims grew, the federal government poured more money into programs like the NSF Supercomputer Centers and the Partnerships for Advanced Computational Infrastructure to provide researchers nationwide with the resources they needed [4]. Adoption increased as early computation began to translate into more effective designs, as well as reduced cost and time to get a plane in the sky.
Design Cycle Improvements
While CFD could go from a designed aerospace component’s geometry to predicted forces and interactions within hours, designing and fabricating a wind tunnel test could take months of setup and cost almost $20,000 per hour of run time [5]. In contrast to previous 1:1 design cycles (design one prototype, test one prototype), engineers could iterate through 30+ candidate designs computationally before narrowing down to the 2-3 most promising models for experimental testing.
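That funnel can be sketched in a few lines of Python. Everything here is hypothetical - the geometry parameters and the surrogate drag formula are made up purely to illustrate the screen-then-test workflow, not a real aerodynamic model:

```python
import random

random.seed(0)  # reproducible candidate set

def surrogate_drag(chord, thickness):
    """Hypothetical cheap stand-in for a full CFD evaluation."""
    return 0.02 + 0.5 * thickness ** 2 / chord

# Generate 30 candidate wing geometries (chord in m, thickness ratio)
candidates = [(random.uniform(1.0, 3.0), random.uniform(0.05, 0.20))
              for _ in range(30)]

# Screen all 30 computationally; only the best 3 go to the wind tunnel
ranked = sorted(candidates, key=lambda c: surrogate_drag(*c))
finalists = ranked[:3]
```

The expensive physical test is reserved for the finalists; computation absorbs the other 27 iterations.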
The large reductions in design time and cost were also paired with new revenue opportunities, as designs that previously couldn’t be tested in standard wind tunnels could be evaluated computationally. This gave engineers higher confidence to justify the investment in preparing prototypes for physical experimentation. For perspective (per a Siemens article): improving drag by a single count could save an airline $100,000 worth of fuel per aircraft per year, meaning that quickly optimized designs could bring major business impact [6].
Workforce Needs and Skillsets Changed
Skilled labor, as part of the global aviation infrastructure, also experienced major shifts relative to pre-simulation aviation design. During the 1950s-70s, engineers relied heavily on hand-drawn blueprints and physical prototypes to visualize designs, which was quite inefficient. Computer-aided design (CAD) was eventually introduced for better design efficiency, but there was initial resistance in user adoption due to reluctance to deviate from established workflows - so these tools were primarily adopted at scale by large industry players whose project needs necessitated more sophisticated software.
The 1980s and ’90s saw tremendous improvements in numerical techniques, compute power, digital memory, and storage. Supercomputers became accessible remotely, and desktop computers became more capable, putting distributed software in the hands of the general public and early-career engineers. Eventually, hybrid skillsets emerged: design and wind tunnel engineers required proficiency in CFD, aerodynamics, mechanical engineering, and data analysis. Experimental strategies and computational methods were intertwined to increase the efficacy of industry efforts toward better designs, and this was reflected in schooling and training as well.
We Can Design Better Hypotheses. Now What?
Pharma differs from Aviation in many ways. Wind tunnels may be the equivalent of a ‘clinical trial’ for a plane, but there is a reason why chemical and biological computational models have lagged behind their cousins in the mechanical and electrical engineering worlds. We are still learning how best to model proteins, cells, humans, and biology in general, given how complex and multivariate those systems are. It becomes even more complicated once we consider how our designed molecules interact with these biological components. This article is a rough approximation of what could happen to design cycles and the engineering process in Pharma, turning a once guess-and-check process into a field with quick iterations, computations, and high-throughput experimental turnover.
There are major considerations and implications for the types of bets big Pharma makes. Will the industry bet on more refined experimentation with higher turnover, collecting more data to test and validate hypotheses at scale? Will it bet on major HPC systems and computational algorithms that help design better molecules at scale? The answer is both. While computational fluid dynamics and the Navier-Stokes equations can model comparatively simpler systems like airplanes, the landscape of physics, math, and AI that goes into Pharma is far more complex (in my opinion, as a non-aerospace engineer).
The difference in computational methods between Aerospace and Pharma is that physics can be utilized to some degree for ligand-protein systems (we know the quantum principles that govern atomic interactions), but molecular complexity and compute limits restrict us to rough approximations. Biology is highly dimensional (i.e., many variables drive end outcomes), so AI is a great solution, but it requires major datasets that scale as biological complexity scales. This means a dual investment in major experimental infrastructure for data collection and in algorithmic build-out is required to drive the field forward. This could change if we reach ‘silver bullet’ models that generalize across entire biological systems like proteins or cells, but that has yet to be computationally realized. Industry trends will change over time as the net economic, scientific, and business implications come to fruition.
We are in the late-early to mid stage of Pharma’s digital revolution, and as these simulations translate into better drugs for patients, they will fundamentally shift our approach to engineering life as we move to conquer more diseases.



