Cambridge University Science Magazine
Drugs have been a fundamental component of medicine for most of human history — traditional knowledge of plant extracts, passed down for many centuries, is the origin of around a quarter of all modern medicinal drugs. Opium, for instance, has been used for pain relief for 5,000 years and willow bark was used to reduce fevers for at least 3,000 years before the chemical structure of its active component was discovered, synthesised, and marketed as aspirin. For those of us fortunate enough to have access to them, the drugs available today protect and prolong our lives in innumerable ways. With anaesthetics to facilitate surgical interventions, antibiotics to protect us from deadly infections, and hormones that enable us to manage conditions such as diabetes, it can be easy to take modern medicine for granted. When disaster strikes, it is the absence of drugs for diseases like cancer, multiple sclerosis, dementia, and (at the time of writing) COVID-19, of which we are most acutely aware. So the question becomes: what do we do when we need new drugs?

At present, the drug development pipeline is lengthy, expensive, and inefficient. Drug discovery often begins with screening large libraries of small-molecule compounds against biological targets. These targets are usually proteins which can potentially be manipulated for therapeutic benefit. Hits from these screens then undergo further validation in cultured cells and animal models of human disease. A long process of preclinical testing and improvement follows, in which a number of pharmacological properties of the drug candidate are determined. If a candidate is successful up to this point, several rounds of human clinical trials can begin to further gauge safety, dose, and efficacy of the drug. If the drug passes all of these trials and gains approval by regulatory bodies, only then can the structure of the final product be patented, and the drug manufactured, marketed, and distributed. This system requires massive financial investment, because so much work goes into drug candidates that eventually fail and produce no financial return. Despite improvements like the use of artificial intelligence for target identification, bringing a new drug to market takes an average of 10 years and £1 billion.

Financing is a natural stumbling block to drug development, but fundraising becomes nearly impossible if the drug in question must be provided at low cost, or to a relatively small number of rare disease sufferers. The financial motives of pharmaceutical companies inevitably cause progress on less profitable drugs to stagnate. Research on partially developed drugs that are not profitable enough to take further is locked away. For drugs that do make it to market, researching alternative applications to potentially extend their benefits is almost impossible until their patents expire. Even then, accessing the original research can be difficult and expensive.

In contrast, when pharmaceutical companies do agree to share data, significant discoveries can be made. For example, AstraZeneca has provided anti-cancer antibody-drug conjugates (ADCs) to researchers wishing to repurpose them as a treatment for African sleeping sickness, an often-fatal parasitic disease caused by African trypanosomes. Existing drugs for this disease are not always fully effective and often have severe side effects, which means that new drugs are desperately needed. Recent experiments by Dr Paula MacGregor (University of Cambridge) and her collaborators have shown that changing the antibody in an ADC to one that targets trypanosomes can produce an anti-trypanosomal therapy that is effective in mice. Since ADCs have been trialled extensively for their safety already, bringing them to market may prove cheaper and faster than would be possible for a completely novel drug.

The success of projects like this is contingent on the co-operation of large pharmaceutical companies which often lack the incentives to provide it — but what if sharing data and materials that might advance medicine were the norm? Proponents of what is known as ‘open-source medicine’ believe it will revolutionise drug discovery. One key principle of this movement is that the results of unsuccessful research should be made openly available, to prevent unnecessary duplication of scientific efforts. Substantial progress has been made on this issue in response to initiatives like AllTrials, a campaign that calls for all past and present clinical trials to be registered and their results reported. It is much harder to promote a similar culture in the preclinical stages of drug discovery. The efficiency of individual researchers would suffer if they had to publish every negative result, and for most, the principles of open data must be balanced against the possibility of being ‘scooped’, something that can seriously hinder their ability to acquire funding for future work. However, progress can be made when grant-makers acknowledge the value of open data and when scientific journals change to open-access models available to all researchers.

It is not only data that can be made openly available. There are growing trends towards open-source hardware (e.g. 3D-printable microscopes), open-source consumables (e.g. locally manufactured enzymes made using freely available methods), and open-source software (for anything from modelling drug effects to data mining from scientific literature). As well as making science fairer and more accountable, these changes will greatly improve the ability of the world’s researchers to work on improving global health, with the greatest impact being on those in underfunded institutions or research areas.

Expanding and diversifying the medical research community is not simply a social justice project. It is important to remember that while large investments by big companies are responsible for the brute-force, high-throughput methods by which many drugs have been discovered, the technologies underlying some of our most revolutionary therapeutics were developed in small-scale academic laboratories. For example, the best-selling drug of 2019, Adalimumab, is a humanised antibody used to treat arthritis, Crohn’s disease, psoriasis, and other inflammatory conditions. The technology behind Adalimumab owes its existence to the work of researchers at the Laboratory of Molecular Biology (Cambridge), the Scripps Research Institute (San Diego), and the German Cancer Research Centre. The ideas, perseverance, and collaborative spirit of these researchers have led to the creation of an entirely new class of drug, and large-scale production of these medicines by pharmaceutical companies has allowed them to benefit many people.

As well as universities and research institutes, private companies, national governments, and international bodies all have a role to play in developing new drugs. When it comes to matters of public health, we need all hands on deck — a fact rarely clearer than during a pandemic. The response to COVID-19 has largely been a collaborative effort: in January 2020, less than two weeks after the virus was first reported to the World Health Organization, a draft of the genetic sequence of SARS-CoV-2 was published by Chinese researchers. This enabled scientists around the world to begin developing tests, treatments, and vaccines — some of which, six months later, are already in clinical trials. To support this work, pharmaceutical companies searched their archives for drugs that could be repurposed, and agreed to combine resources to expedite the drug development process. With luck, these efforts will prove to be a catalyst for the movement towards a more open and collaborative approach to drug development and medical research as a whole. The sooner we all learn to work together, the better. The benefits to global health could be enormous.

Alice McDowell is a 3rd year PhD student in Biochemistry. Artwork by Eva Pillai.