
Should we give people diseases to develop cures?

In the 1770s an English doctor called Edward Jenner noticed that milkmaids didn’t seem to catch smallpox, the terrifying disease that killed around a third of those who caught it. He thought that their frequent exposure to cowpox, a similar but less severe virus, might be what protected them. To test his hypothesis, he gave his gardener’s eight-year-old son cowpox and then deliberately infected him with smallpox to see if he had become immune. He had, and Jenner successfully repeated the experiment. “Vaccination”, from the Latin word for cow, soon became commonplace.

It was of course highly irresponsible to expose a child to a deadly disease with no sure knowledge that he would survive. Even so, with hindsight, we can see that the benefits were immense: the vaccine was safe and highly effective. Demonstrating that fact and publicising it encouraged untold numbers of others to follow suit.

This is an example – albeit an unusual one – of a “challenge trial”. That is a form of research where, rather than relying on data from natural infections, we deliberately expose someone to a disease in order to test the effectiveness of a vaccine or treatment. Things have changed a lot since Jenner’s time, of course, when it was not uncommon for doctors to deliberately infect people with pathogens to try to learn which diseases they caused. Even so, there’s the lingering sense that there’s something unethical about making someone ill on purpose. That’s not surprising – even in relatively recent history, deeply sinister medical experiments have been carried out that bear a superficial resemblance to this kind of work. During the second world war, for example, imperial Japan set up a network of secret facilities to experiment on prisoners: while some were injected with plague and tetanus toxin, others had their limbs amputated – both as a form of torture and a way to train army surgeons for the battlefield. The grotesque crimes committed by the Nazis under the guise of scientific research are well known.

But this poisonous history shouldn’t blind us to the extraordinary power of challenge trials conducted under strict conditions, based on informed consent and designed to be as safe as possible. They could become increasingly important weapons in the armoury of medical research, in an era when vaccine technology is advancing and the threat of diseases jumping from animals into human beings is increasing.

Much has been done to mitigate the risks: challenge trials designed to advance malaria research have proved to be very safe, because the disease is now well understood and can be treated easily under close supervision. For tuberculosis, trials have used the mild BCG vaccine as the challenge, instead of the actual bacteria. For respiratory syncytial virus (RSV), researchers have recruited adults who are at a low risk of severe illness. These experiments have already whittled down a massive range of vaccine candidates and helped refine their ingredients. With their help, the world will soon have two effective vaccines against malaria, which kills hundreds of thousands of people every year, as well as the first vaccines against RSV, which kills tens of thousands of infants each year.

But not all diseases are like these ones. We don’t always know the dangers volunteers might face; we don’t always have treatments ready. What then? How does someone consent to risks that remain hard to quantify? How should they be compensated for taking those risks?

We could, of course, just avoid these questions entirely, and rely on other types of research. But that doesn’t always work: sometimes, animal testing is tricky and uninformative, because the disease doesn’t develop in the same way as it would in humans. For human trials, such as those looking at the effectiveness of a vaccine against Zika, it can take tens of thousands of people and several years to run a single study, because only a fraction of the participants in the placebo group will ever develop the disease, making it hard to see how much difference the drug or vaccine would make.

In contrast, challenge trials can be deeply informative within weeks, with far fewer volunteers. And the stakes can be staggeringly high. It’s easy for us to grasp the risks that volunteers might face after being injected with a pathogen, but harder to keep in mind how many people suffer from diseases every day, and how many lives would be saved if a treatment or vaccine were developed and rolled out sooner. Take the Covid‑19 pandemic: by the end of last year, the death toll was estimated to have reached about 17.8 million, while vaccines were estimated to have saved 20 million lives. In the years to come, they will hopefully save millions more. The burden of suffering relieved by vaccines is immense – and the faster they arrive, the better.

Researchers around the world were able to rapidly develop Covid vaccines through a combination of luck and initiative: similar vaccines were already in development; the disease was so widespread that it was easy to recruit people into studies; and research was massively well-funded and given high priority because it was a global emergency. If that hadn’t been the case, we would have been in dire straits – much like doctors hundreds of years ago, we’d have been faced with a looming threat we didn’t understand and could not beat.

In order to make sure we are as protected as possible from current and future threats, we should try to eliminate the stigma that still haunts challenge trials, making them a more familiar part of our toolkit. What if we thought of volunteering to be infected not as a rather peculiar and reckless thing to do, but saw volunteers more like first responders who rush to help during a disaster? What if we recognised the sacrifices they made on our behalf by holding them in especially high regard, like firefighters or paramedics, rewarding them not just with money, but with recognition, long-term support and respect?

Perhaps the greatest reward of all would be to make sure their efforts were worthwhile: by designing trials to be open and transparent, applying them when and where they might make a real difference, and developing the tools to learn as much from them as possible. In short, by helping them to save thousands, if not millions, of lives.

Saloni Dattani is a researcher at King’s College London and a founding editor of Works in Progress.

Further reading

War Against Smallpox: Edward Jenner and the Global Spread of Vaccination by Michael Bennett (Cambridge, £29.99)

Vaxxers: A Pioneering Moment in Scientific History by Sarah Gilbert and Catherine Green (Hodder & Stoughton, £20)

The Mosquito: A Human History of Our Deadliest Predator by Timothy Winegard (Text, £12.99)

