
Would humans eventually evolve to fight off fatal diseases if we didn't treat/vaccinate for them?




I am definitely pro-vaccination, and the question is a bit morbid because people would die, but assuming we didn't treat or vaccinate people for fatal diseases such as Ebola or the Black Death, would the survivors pass along a trait that would eventually lead to near immunity to that disease?

Are there any examples of diseases that humans "contract" but are immune to? Or any examples of immunity arising through human evolution?

What about in animals? Are there diseases that have been naturally eradicated?


Sure.

When Europeans settled the Americas, more than 90% of the Native American inhabitants were wiped out by diseases like smallpox, to which the Europeans had clearly developed resistance. The Native Americans who survived presumably carried some resistance of their own, and the Europeans had gone through a similar selection event earlier in their history.

The bubonic plague behind the Black Death wiped out large percentages of populations, but it selected for survivors with resistance:

Black Death Left a Mark on Human Genome

Caveats:

  • Evolving to fight off a disease often involves mass death, as with >90% of the Native American population dying.
  • There is always the chance that a future pandemic will arrive that human biology simply fails to evolve resistance against, and the population gets wiped out.
  • The Earth is changing rather quickly. The past is no guarantee of the future.
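
To make the selection effect concrete, here is a toy sketch (my illustration, not part of the original answer) of how fast a hypothetical protective allele could spread if a pathogen repeatedly killed a large share of non-carriers before they reproduced. It assumes a single dominant resistance allele, random mating, and Hardy-Weinberg proportions each generation; every number in it is made up for illustration.

```python
# Toy single-locus selection model. Allele R is assumed dominant and
# protective; rr individuals die from each epidemic with probability
# `mortality` before reproducing. All parameters are hypothetical.

def allele_freq_after_epidemics(p0: float, mortality: float, generations: int) -> float:
    """Return the frequency of allele R after `generations` epidemics."""
    p = p0
    for _ in range(generations):
        q = 1.0 - p
        # Mean fitness: RR and Rr survive (fitness 1); a fraction
        # `mortality` of rr homozygotes dies, so w = 1 - mortality * q^2.
        w = 1.0 - mortality * q * q
        # Standard selection update: since R sits only in genotypes with
        # fitness 1, its frequency is simply rescaled by the mean fitness.
        p = p / w
    return p

if __name__ == "__main__":
    # Start at 1% allele frequency; each epidemic kills half of non-carriers.
    for gens in (1, 5, 10):
        f = allele_freq_after_epidemics(p0=0.01, mortality=0.5, generations=gens)
        print(f"after {gens:2d} epidemics: R frequency ~ {f:.3f}")
```

With these made-up numbers, the allele climbs from 1% to over 20% of the gene pool within about five epidemics, which is the basic dynamic the Black Death genome study linked above describes: brutal mortality, but rapid enrichment of protective variants among the survivors.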

Before Vaccines, Doctors ‘Borrowed’ Antibodies from Recovered Patients to Save Lives

In 1934, a doctor at a private boys' school in Pennsylvania tried a unique method to stave off a potentially deadly measles outbreak. Dr. J. Roswell Gallagher extracted blood serum from a student who had recently recovered from a serious measles infection and began injecting the serum into 62 other boys who were at high risk of catching the disease.

Only three students ended up contracting measles and all were mild cases.

The method, while relatively novel, was not new to science. In fact, the very first Nobel Prize in Physiology or Medicine was awarded in 1901 to Emil von Behring for his life-saving work developing a cure for diphtheria, a bacterial infection that was particularly fatal in children. His groundbreaking treatment, known as diphtheria antitoxin, worked by injecting sick patients with antibodies taken from animals that had recovered from the disease.


Study suggests new way to help the immune system fight off sleeping sickness parasite

[Image: Trypanosomes are able to flourish in the bloodstream by continually changing their protein coat; the inhibition of bromodomains causes the protein coat to stick, shown in green. Credit: Laboratory of Lymphocyte Biology at The Rockefeller University]

Some infectious diseases are particularly difficult to treat because of their ability to evade the immune system. One such illness, African sleeping sickness, is caused by the parasite Trypanosoma brucei, transmitted by the tsetse fly, and is fatal if left untreated. The trypanosome parasite is transmitted to mammals through fly bites and eventually invades major organs such as the brain, disrupting the sleep cycle, among other symptoms.

Trypanosomes exist in different forms. When inhabiting a fly, they are covered with proteins called procyclins. But upon entering the bloodstream of a mammal, they acquire a dense layer of glycoproteins that continually change, allowing the parasite to dodge an attack from the host's immune system.

Now, new research from postdoctoral scientists Danae Schulz and Erik Debler, working in Nina Papavasiliou's and Günter Blobel's labs at Rockefeller University, reveals a method to manipulate trypanosomes in the mammalian bloodstream to acquire fly stage characteristics, a state that makes it easier for the host immune system to eliminate the invader. The findings suggest that inhibiting specific proteins that interact with chromatin--the mass of DNA and proteins that packages a cell's genetic information--can "trick" the parasite into differentiating to a different stage of its lifecycle. The study was published on December 8 in PLOS Biology.

"By blocking these chromatin-interacting proteins, we have found a way to make the parasite visible to the immune system," says Nina Papavasiliou, head of the Laboratory of Lymphocyte Biology. "The bloodstream form of the parasite is constantly switching protein coats, so the immune system can't recognize and eliminate it. This new method makes the parasite think it's in the fly, where it doesn't need to worry about the immune system attacking it."

Epigenetic regulation

Regulatory proteins interact with chromatin to either unwind it or package it more tightly, affecting which genes are expressed. Some of these regulatory proteins contain a region called the bromodomain, which recognizes a specific signal on chromatin and induces changes in gene expression.

Recent findings in mice have indicated that bromodomains are involved in cell differentiation, which led Papavasiliou and colleagues to hypothesize that such epigenetic mechanisms may drive the trypanosome to change from one form to another.

"The changes in gene expression that accompany the transition between the different parasite forms had been well established," said Schulz, the lead author of the study. "But we didn't understand if there was some type of regulation happening at DNA, at the level of chromatin. Whether chromatin-altering mechanisms might be important for differentiation hadn't really been studied before."

To investigate this, the researchers inhibited bromodomain proteins in cells by introducing genetic mutations in their DNA or by exposing the cells to a small-molecule drug called I-BET151, which is known to block bromodomains in mammals. When these perturbations were made, the investigators observed changes in gene expression levels that resembled those seen in cells differentiating from the bloodstream form to the fly form. They also saw that the parasites developed a procyclin coat normally found on the fly form.

Based on these findings, Papavasiliou and colleagues suggest that proteins with bromodomains maintain the bloodstream form of trypanosomes, and inhibiting them causes the parasite to progress in its development toward the fly form. They believe bromodomains could serve as a potential therapeutic target to treat African sleeping sickness.

Harnessing the natural immune system

To explore whether I-BET151 could be used to combat the disease, the researchers used drug-treated trypanosomes to infect mice. The mice infected with drug-treated trypanosomes survived significantly longer than those infected with untreated trypanosomes, indicating that the virulence of the parasite--its ability to invade the host--was diminished in the presence of I-BET151.

"When bromodomains are inhibited, the variant protein coat is replaced with an unvarying coat on the surface of the trypanosome cell," says Schulz. "This means that the parasite surface is no longer a moving target, giving the immune system enough time to eliminate it."

I-BET151 is not effective enough to be used in the clinic, but a crystal structure determined by Debler and published as part of this study provides direct clues for how an optimized drug could be designed to bind parasite bromodomains in a highly specific manner, limiting side effects.

"Current treatments for this disease are limited and they have substantial side effects, including very high mortality rates," says Papavasiliou. "This study, and recent work by others, demonstrates that targeting chromatin-interacting proteins offers a promising new avenue to develop therapeutics."

This could apply not only to African sleeping sickness, she adds, but to a number of related parasitic diseases like Chagas or malaria, with disease burdens that are far more substantial than those caused by Trypanosoma brucei.



Epidemiologist rates 10 zombie virus movies and TV shows for realism

Tara Smith: In some cases, people do infect themselves.

Hi, I'm Tara Smith. I'm a professor of epidemiology at Kent State University in Ohio. I study infectious diseases and have long been interested in zombie infections. So, today, we'll be talking about zombies in movies and rating them for realism.

"The Walking Dead" (2010-)

Edwin: It invades the brain like meningitis. The adrenal glands hemorrhage. The brain goes into shutdown. Two hours, one minute.

Smith: So, this one is mixed. So, it does have an incubation period. It usually takes several days between the time that one is exposed to a virus and one develops symptoms.

Lori: It restarts the brain?

Edwin: No, just the brain stem.

Smith: Viruses can get into the brain. He mentions meningitis, which is actually about kind of the outside covering of the brain, and then you would say encephalitis when it actually goes into the deeper parts of the brain. So that can happen. There are viruses that cause those types of illnesses. But, you know, once you're dead, you're dead. And even if this reanimated the brain stem, that still wouldn't quite make sense, because you wouldn't be able to walk. You could breathe, your heart would beat, but you wouldn't be actively going out, you know, shambling around. And the idea that anyone who died would also become a zombie, even if they died by some other mechanism -- that's a little less realistic. I mean, yellow fever is one where people can get very ill, be close to death, and then sometimes they come back and feel fine and then are dead the next day.

So it has what we call a biphasic illness. It has different phases. And, of course, back in the day, before people had stethoscopes and things like that, if you had somebody who, again, was very ill with some infection, you listen for a heartbeat or try to feel for a pulse. It might be very weak. It might not be detectable to the average person. And so there are lots of stories from outbreaks of the bubonic plague, outbreaks of cholera, and even also yellow fever and other types of infections where people have been taken out of their homes, put in the dead cart, going off to be hauled to the graveyard, and they wake up in the cart. You know, the human body decomposes rather quickly. So it starts out really from the inside that you have all these bacteria in your gut, in your mouth, on your skin, that, once you die, you become a big food source for them. So they're the ones that do a lot of the initial decomposing. And there's nothing about this particular virus that would stop that. They have these shamblers that have taken years and years that are still walking around after the outbreak, and they would all just be bones by that time. Something would have eaten them, whether it would be the bacteria that they carry or other rats or scavengers or something like that. Yeah, again, it gets a bump for the incubation period, but downgraded for the reanimation, so maybe a five.

"Resident Evil" (2002)

Smith: I mean, as a lab scientist, [laughs] it makes me cringe. But you can see that the virus is liquid on the floor, and it would not be aerosolized like that. So, there would be maybe a little bit of a splash in the immediate vicinity, but there's nothing that is making that liquid become airborne.

But he's in the blue suit with the helmets and everything, so obviously this is a lab that studies things that are dangerous. You wouldn't have ventilation go into the rest of the building. What you would have would be filters in the ventilation system with very small pores, so they can trap even viruses and keep them from getting out of the building. And that ventilation usually goes out to fresh air, so to outside rather than inside the building. So even if a few viral particles made it out, they would be diluted once they made it outside to the fresh air and not a danger to anyone. That also would completely not happen, even in my laboratory -- no food, no drink in the laboratory, nothing at all.

There are some animals that have been trained to detect certain pathogens. So, not necessarily how they would be in the room air here, but sometimes dogs, or sometimes pouched rats, can be trained to diagnose people who are infected with, say, tuberculosis. They're using more of the chemicals that humans give off when they're infected, rather than directly smelling, like, the tuberculosis bacterium. Yeah, this is about a one.

"World War Z" (2013)

Doctor: He's gonna inject himself.

Segen: We have no idea which one's gonna work.

Smith: So, they had noticed previously in the movie that people who are sick are avoided by zombies. So Pitt is trying to inject himself with something to basically make the zombies ignore him and believe that he is sick so they don't want to bite him. Of course, we're not trying to hide from predators like this when we're ill, but there are examples where we can use microbes to kind of fight off each other. Back before antibiotics were common for treatment of infections, they actually used injection with malaria, a parasite that causes high fevers, as an attempt to treat syphilis, a bacterial infection, and that actually won a Nobel Prize in 1927. Today, we don't use it anymore because we do have antibiotics.

So, what we do use sometimes are viruses called bacteriophage that can actually kill bacteria that may be resistant to the drugs we have, but not quite in the way it's described in "World War Z." In some cases, people do infect themselves. Probably the most famous of those was Dr. Barry Marshall. He was an Australian scientist who won the Nobel Prize, again, for discovering that a bacterium called Helicobacter pylori causes gastritis and gastric ulcers. People didn't necessarily believe him, so he ended up swallowing a mixture of Helicobacter pylori himself and then treated himself with antibiotics.

Maybe about a three. [laughs] I think the book was much better as far as realism, but the movie took the concept and I think twisted it a little bit.

"28 Days Later" (2002)

Scientist: The chimps are infected!

Activist: Infected with what?

Scientist: Rage.

Activist: [screams] Oh, my God!

Smith: The incubation period is basically instantaneous. When blood goes all over, that is how some blood-borne diseases can be transmitted. For example, Ebola is a disease that is transmitted by exposure to blood and other body fluids. In this case, potentially even if people aren't bitten, if that blood goes into their eyes or other mucous membranes, they could be infected that way as well. So that part could happen. But, again, the incubation period would take days for this. In animals, at least, rabies does cause a lot of aggression. In humans, a little bit less. It does cause depression. It does cause active biting, because rabies is transmitted through saliva as well. So, not too surprising that a virus called rage could cause those similar kinds of aggressive, biting behaviors.

Smith: Yeah, so some viruses do cause eye infections, so basically a form of conjunctivitis. In this case, it's usually infecting a part of the eye that you don't typically see infected with viruses. Again, we see that with Ebola. We've seen that a little bit with the current coronavirus also, that some people do seem to have eye infections from that as well. So, there are carrier states for various types of organisms. You don't see it as often with viruses, but you do have things like HIV, where people can have chronic infection, be infected for a long time, and transmit the virus to others. We actually do see that occasionally with blood-borne pathogens -- not only HIV, but hepatitis B virus and hepatitis C virus. And they do try to incorporate some of those aspects of the biology of the virus. So I'd say this one's maybe a six or a seven.

"Zombieland" (2009)

Columbus: It's been two months since patient zero took a bite of a contaminated burger at a Gas N' Gulp. Just two months, and I might be the last non-cannibal freak in the country.

Smith: This is another one that is not a viral infection. When we're talking about mad cow disease, that's actually caused by what we call a prion, an infectious protein. One of the things about prions is that they often are ingested. That's what caused the big scare in the UK many years ago: some cows infected with this may have gotten into the food chain, people may have eaten it, and they could potentially develop disease that way. But that two-month time frame would not be realistic at all. This is a condition that takes many years to develop, 'cause what happens is that you ingest the prion and it ends up in your brain eventually. And what it does is this: all of the proteins in your body have to be folded a certain way, OK? So they have, you know, a long amino acid sequence, but that's just the first part. Those amino acids have to not only come together one after another in the right way, but they also have to have what we call a tertiary structure, a three-dimensional structure that has to be a certain way in order for the protein to function correctly. And what this prion does is that not only is it folded incorrectly, but as it gets into your brain, it basically acts as a template for other proteins in your brain and causes them to misfold. And as those proteins in your brain increasingly misfold, it causes little sections of your brain to die. So it causes what's called a spongiform encephalopathy. That's the official name for mad cow disease, bovine spongiform encephalopathy, and it basically leads your brain to become Swiss cheese over time.

It can take decades, so the timeframe is pretty off here. Coming back from the dead is, as well.

It can definitely cause behavioral changes, you know, aggression or things like that. Although to the level of chasing people and trying to eat them, I don't think that's ever been established with any kind of prion disease. I would give this one also maybe a five.

"Quarantine" (2008)

Firefighter: We can't help you. We can't help you unless you calm down. An ambulance is on its way right now, all right?

Smith: Yeah, so this is kind of similar to, again, that rabies aspect of it. Rabies is probably the only other virus that we know of that is basically 100% fatal. There have been only a couple of people who have survived it. You can get what's called postexposure prophylaxis. So if you have not gotten the rabies vaccine before and you've been exposed -- you've had a bite from a rabid dog or a wild animal, or been exposed to a bat -- you can get the rabies vaccine shortly after your exposure. That can basically stop the virus from replicating in your body, so that will protect you from infection. But rabies has a very long incubation period, you know, weeks to sometimes months or more. So once you have symptoms of rabies, basically it is completely 100% fatal.

Police: You need to remain in the building!

Smith: I mean, you would have to get public health involved really quickly, 'cause they're the ones really with the authority to shut anything down. So, presumably somebody would call 911, call police, call an ambulance, and it would go from the hospital first responders to the public-health department, who would say, "OK, we have some kind of an outbreak here." We would quarantine everyone. So that would be a pretty rapid response. Yeah, I would give this, again, maybe, like, a three.

"The Girl with All the Gifts" (2016)

There are some fungal infections that can affect the brain, but they don't seem to cause the type of behavioral changes like this.

Dr. Caroline: The next stage in the fungus' life cycle, the mature sexual stage. This is a sporangium, a seed pod. If they were to open.

Sgt. Eddie: What?

Dr. Caroline: The end of the world.

Smith: You know, this is modeled on insect pathogens, where the fungus infects the insect, and it does change its behavior. They basically force the insects to go as high as they can up a tree, and eventually, the fungus causes the insect to just basically be a host for kind of that next phase of evolution. They talk about it in there. But basically you have this fruiting body that will replicate in the insect and then kind of pop out somewhere of their body, from their head or elsewhere. And then that is what sporulates. And so, when it sporulates, it releases all of the new fungi, and then they kind of rain down over the colony of insects and can infect all of those. So that's how it ends up amplifying. We have had fungal infections that have gone around the world or around a country. So, I live in an area in northeast Ohio that used to be covered by chestnuts, but then they all got a fungal disease and they basically all died. This can happen. Those can be transported very easily through the air. So that's kind of the premise of this, that if something was adapted to humans that was like this cordyceps that infects insects, that this could be the result. I believe in The Last of Us they have a lot of different kind of phenotypes or different types of zombies that kind of evolve over the course of infection. Yeah, I would say, I would give this a six.

"Z Nation" (2014-2018)

Prisoner: No!

Guard: They don't look like volunteers to me.

Scientist: Millions of people are dying for no reason. These men are gonna die to help us find a vaccine.

Smith: So, I love this show. I think "Z Nation" is completely underrated as far as zombie shows go. So, some things that are at least somewhat accurate here is that we do have a long history, one, of doing experiments on prisoners. For many years, if you needed something done, you went to prisons and you went to orphanages, because they were places where you could get consent rather easily. Whether this is being done legally or not, it would be actually very difficult to do these experiments on prisoners right now because there's so many issues of consent. So that is actually kind of accurate, although hopefully not anymore. So, you have Murphy, who was injected with this serum. Of course, he gets bitten and doesn't turn. So, throughout the show, he's kind of not only resistant to the virus but able to control other zombies, which, again, may not be very realistic. But you do have people with perhaps kind of partial immunity to some of these infections.

Murphy: Eight times I was bitten, and did not turn! Eight times I was infected by their bloody saliva!

Smith: Sometimes, vaccine protection is complete. Sometimes it's incomplete. That's actually what we are expecting for the coronavirus vaccine -- we don't expect it will be 100% effective. That's kind of what we see with the influenza vaccine: it protects people from infection, but not all of the time. And even people who are vaccinated and still get influenza tend to have milder symptoms, so kind of what Murphy is experiencing here. Yeah, probably about a two. It's pretty unrealistic, especially some of the scenarios they get into later on.

"The Crazies" (2010)

David: It was a big plane, Russ.

Russ: Why's that?

David: 'Cause we're right on top of it.

Smith: If you're transporting a very deadly bioweapon, it would likely be in very tight storage, probably multiple layers, and you'd also have dry ice or something to keep it cold. So exactly how it was released during this plane crash is not very clear, I think. In that water supply, the virus would be diluted, and even further so as it got spread out to all the people of the town. So you'd have to have a really small infectious dose for this -- like a single viral particle -- for people to get infected. And then, most likely the virus would be killed off in the water, because rabies is a virus that needs a host to amplify. It's not typically a waterborne virus. So it's unclear exactly how it would survive in that water in order to infect those people. If you don't have the knowledge that a government plane carrying a bioweapon went down there and you're just going on the symptoms, it would take a lot to tie those together. Usually when we see outbreaks of waterborne disease, we respond to them because you have upticks in people reporting to their doctors with gastrointestinal symptoms, with diarrhea and vomiting. One of the biggest-ever outbreaks of waterborne disease in the United States in modern times was in Milwaukee in 1993. They had an outbreak of a parasite called cryptosporidium. And one of the early indicators for that was that all of the Imodium was gone from stores. People weren't necessarily reporting to their doctor that they had diarrhea, but they were going out for anti-diarrheal medicine. I'll put this down at about a two.

Andy: Oh, no.

Smith: While he's in that incubation period, we know that he is going to turn eventually. And so there's this idea of bringing back pathogens from history as you're digging into the ground, as the permafrost thaws: can some of these pathogens that have been locked in ice cores for millennia or longer come back? Yes, some of them definitely can. Anthrax has caused outbreaks in some areas of Siberia. One thing is that anthrax is a really hardy bacterium. It forms spores, so it's really stable in the environment. But again, we don't know how long those would last -- whether it's decades or centuries or longer, we're not sure.

But I do want to offer a flip side to this: when scientists were trying to study the 1918 flu virus, the one that caused the last big global pandemic, they found some samples that were frozen in permafrost, one in Alaska and one in northern Europe. In Alaska in the 1950s, scientists went up there, dug up some of the people who had died and had been preserved in permafrost for many years, took samples of their lungs, and tried to grow them in the laboratory -- completely unsuccessful. There was no live virus there. They did the same thing with some of the samples from northern Europe just about 15 years ago or so. They were able to get viral RNA from those, and that's actually one of the reasons we know so much about the sequence of the 1918 flu virus, but they could not grow it at all. Most viruses are more fragile than, say, bacteria, so a viral outbreak would probably be less likely from something that has long been in the ground.

Cleverman: They're poisoning this land, you know? This country's changing. Sick, we all get sick.

Smith: Interrupting nature, that's a big problem. In West Africa, we had the Ebola outbreak a couple years ago, the really big one, and part of that may have been from more frequent introductions of the virus into the human population across parts of Africa that are being deforested. Now the bats are all in close contact, and they're in closer contact with other animals in the forest because there's only this little part of it left. So there's more potential for transmission of that virus to other animals. People eat those other animals. People come in contact with the virus through butchering them or other types of things. And so it can jump into humans. So we see that cycle on and on. So maybe a four.

I like "28 Days Later" just 'cause it really scares me. I can suspend disbelief for all of the science that's maybe exaggerated or kind of mistold in there, but the story just really freaks me out.


IN CONTEXT: WHAT IS A VIRUS?

Viruses are small infectious agents that can only multiply within living cells. They are, therefore, defined as “infectious obligate intracellular parasites.” Viruses contain genetic information, but they need to use the biosynthetic machinery of living cells to produce new virus particles. Their origins are obscure, but they may have evolved from microorganisms that lost many of their cellular components and functions or from genes that escaped from an ancestral cell and gained the ability to infect other cells. All viruses consist of a nucleic acid core, which serves as genetic material, surrounded by a protective protein coat. Viruses' genetic material is either deoxyribonucleic acid (DNA) or ribonucleic acid (RNA), which can be either single- or double-stranded. In some virus families the nucleic acid-protein complex is surrounded by a lipid membrane.

Despite their small size and relative simplicity, viruses are diverse in size, shape, structure, and their ability to infect host species, which range from bacteria, fungi, and algae to plants, animals, and humans. Some are species-specific; others can infect a wide variety of hosts. Viruses are classified according to their structure, their nucleic acid, and the presence or absence of a lipid membrane.

When viruses replicate, the whole process, from adsorption to the host cell membrane to the release of new viruses, may take as little as 30 minutes for some bacteriophages or as long as several days for some animal viruses. Some, such as the herpes simplex virus, are able to enter host cells and establish a latent infection, that is, a condition in which the virus remains dormant. Under appropriate conditions, a latent virus can be reactivated and begin to produce progeny that attack other cells. When this phenomenon occurs in bacteriophage, it is called lysogeny.

By the end of the twentieth century, the growing threat of antibiotic-resistant bacteria had revived interest in phage therapy. An estimated 90,000 Americans died in 2000 of hospital-acquired infections, many caused by antibiotic-resistant bacteria.


EDWARD JENNER

Edward Jenner was born on May 17, 1749, in Berkeley, Gloucestershire, the son of the Rev. Stephen Jenner, vicar of Berkeley. Edward was orphaned at age 5 and went to live with his older brother. During his early school years, Edward developed a strong interest in science and nature that continued throughout his life. At age 13 he was apprenticed to a country surgeon and apothecary in Sodbury, near Bristol (16). The record shows that it was there that Jenner heard a dairymaid say, “I shall never have smallpox for I have had cowpox. I shall never have an ugly pockmarked face.” In fact, it was a common belief that dairymaids were in some way protected from smallpox.

In 1764, Jenner began his apprenticeship with George Harwicke. During these years, he acquired a sound knowledge of surgical and medical practice (10). Upon completion of this apprenticeship at the age of 21, Jenner went to London and became a student of John Hunter, who was on the staff of St. George's Hospital in London. Hunter was not only one of the most famous surgeons in England, but he was also a well-respected biologist, anatomist, and experimental scientist. The firm friendship that grew between Hunter and Jenner lasted until Hunter's death in 1793. Although Jenner already had a great interest in natural science, the experience during the 2 years with Hunter only increased his activities and curiosity. Jenner was so interested in natural science that he helped classify many species that Captain Cook brought back from his first voyage. In 1772, however, Jenner declined Cook's invitation to take part in the second voyage (4).

Jenner occupied himself with many matters. He studied geology and carried out experiments on human blood (17). In 1784, after public demonstrations of hot air and hydrogen balloons by Joseph M. Montgolfier in France during the preceding year, Jenner built and twice launched his own hydrogen balloon. It flew 12 miles. Following Hunter's suggestions, Jenner conducted a particular study of the cuckoo. The final version of Jenner's paper was published in 1788 and included the original observation that it is the cuckoo hatchling that evicts the eggs and chicks of the foster parents from the nest (17, 18). For this remarkable work, Jenner was elected a fellow of the Royal Society. However, many naturalists in England dismissed his work as pure nonsense. For more than a century, antivaccinationists used the supposed defects of the cuckoo study to cast doubt on Jenner's other work. Jenner was finally vindicated in 1921 when photography confirmed his observation (19). At any rate, it is apparent that Jenner had a lifelong interest in natural sciences. His last work, published posthumously, was on the migration of birds.

In addition to his training and experience in biology, Jenner made great progress in clinical surgery while studying with John Hunter in London. Jenner devised an improved method for preparing a medicine known as tartar emetic (potassium antimony tartrate). In 1773, at the end of 2 years with John Hunter, Jenner returned to Berkeley to practice medicine. There he enjoyed substantial success, for he was capable, skillful, and popular. In addition to the practice of medicine, he joined two local medical groups for the promotion of medical knowledge and continued to write occasional medical papers (4, 18). He also played the violin in a musical club and wrote light verse and poetry. As a natural scientist, he continued to make many observations on birds and the hibernation of hedgehogs and collected many specimens for John Hunter in London.

While Jenner's interest in the protective effects of cowpox began during his apprenticeship with George Harwicke, it was 1796 before he made the first step in the long process whereby smallpox, the scourge of mankind, would be totally eradicated. For many years, he had heard the tales that dairymaids were protected from smallpox naturally after having suffered from cowpox. Pondering this, Jenner concluded that cowpox not only protected against smallpox but also could be transmitted from one person to another as a deliberate mechanism of protection. In May 1796, Edward Jenner found a young dairymaid, Sarah Nelms, who had fresh cowpox lesions on her hands and arms (Figure 3). On May 14, 1796, using matter from Nelms' lesions, he inoculated an 8-year-old boy, James Phipps. Subsequently, the boy developed mild fever and discomfort in the axillae. Nine days after the procedure he felt cold and had lost his appetite, but on the next day he was much better. In July 1796, Jenner inoculated the boy again, this time with matter from a fresh smallpox lesion. No disease developed, and Jenner concluded that protection was complete (10).

The hand of Sarah Nelms. Photo courtesy of the National Library of Medicine.

In 1797, Jenner sent a short communication to the Royal Society describing his experiment and observations. However, the paper was rejected. Then in 1798, having added a few more cases to his initial experiment, Jenner privately published a small booklet entitled An Inquiry into the Causes and Effects of the Variolae Vaccinae, a Disease Discovered in Some of the Western Counties of England, Particularly Gloucestershire, and Known by the Name of Cow Pox (18, 10). The Latin word for cow is vacca, and cowpox is vaccinia; Jenner decided to call this new procedure vaccination. The 1798 publication had three parts. In the first part Jenner presented his view regarding the origin of cowpox as a disease of horses transmitted to cows, a theory that was discredited during Jenner's lifetime. He then presented the hypothesis that infection with cowpox protects against subsequent infection with smallpox. The second part contained the critical observations relevant to testing the hypothesis. The third part was a lengthy discussion, in part polemical, of the findings and a variety of issues related to smallpox. The publication of the Inquiry was met with a mixed reaction in the medical community.

Jenner went to London in search of volunteers for vaccination. However, after 3 months he had found none. In London, vaccination became popular through the activities of others, particularly the surgeon Henry Cline, to whom Jenner had given some of the inoculant (4). Later in 1799, Drs. George Pearson and William Woodville began to support vaccination among their patients. Jenner conducted a nationwide survey in search of proof of resistance to smallpox or to variolation among persons who had cowpox. The results of this survey confirmed his theory. Despite errors, many controversies, and chicanery, the use of vaccination spread rapidly in England, and by the year 1800, it had also reached most European countries (10).

Although sometimes embarrassed by a lack of supply, Jenner sent vaccine to his medical acquaintances and to anyone else who requested it. After introducing cowpox inoculation in their own districts, many recipients passed the vaccine on to others. Dr. John Haygarth (of Bath, Somerset) received the vaccine from Edward Jenner in 1800 and sent some of the material to Benjamin Waterhouse, professor of physic (medicine) at Harvard University. Waterhouse introduced vaccination in New England and then persuaded Thomas Jefferson to try it in Virginia. Waterhouse received great support from Jefferson, who appointed him vaccine agent in the National Vaccine Institute, an organization set up to implement a national vaccination program in the United States (20).

Although he received worldwide recognition and many honors, Jenner made no attempt to enrich himself through his discovery. He actually devoted so much time to the cause of vaccination that his private practice and his personal affairs suffered severely. The extraordinary value of vaccination was publicly acknowledged in England when, in 1802, the British Parliament granted Edward Jenner the sum of £10,000. Five years later the Parliament awarded him £20,000 more. However, he not only received honors but also found himself subjected to attacks and ridicule. Despite all this, he continued his activities on behalf of the vaccination program. Gradually, vaccination replaced variolation, which became prohibited in England in 1840.

Jenner married in 1788 and fathered four children. The family lived in the Chantry House, which became the Jenner Museum in 1985. Jenner built a one-room hut in the garden, which he called the “Temple of Vaccinia” (Figure 4), where he vaccinated the poor for free (10, 17). After a decade of being honored and reviled in more or less equal measure, he gradually withdrew from public life and returned to the practice of country medicine in Berkeley. In 1810, his oldest son, Edward, died of tuberculosis. His sister Mary died the same year, and his sister Anne 2 years later. In 1815, his wife, Catherine, died of tuberculosis (17). Sorrows crowded in on him, and he withdrew even further from public life. In 1820, Jenner had a stroke, from which he recovered. On January 23, 1823, he visited his last patient, a dying friend. The next morning Jenner failed to appear for breakfast; later that day he was found in his study. He had had a massive stroke. Edward Jenner died during the early morning hours of Sunday, January 26, 1823. He was laid to rest with his parents, his wife, and his son near the altar of the Berkeley church.

The Temple of Vaccinia. Photo courtesy of the Jenner Museum, Berkeley, Gloucestershire, England.


Convergence of evidence

To be sure, not all claims are subject to laboratory experiments and statistical tests. There are many historical and inferential sciences that require nuanced analyses of data and a convergence of evidence from multiple lines of inquiry that point to an unmistakable conclusion. Just as detectives employ the convergence of evidence technique to deduce who most likely committed a crime, scientists employ the method to deduce the likeliest explanation for a particular phenomenon.
— Michael Shermer, The Believing Brain

When Holocaust revisionists and creationists, for example, attack mainstream accounts of reality, they look at all the evidence used to defend these theories and then focus on a few specific pieces that seem particularly weak and problematic to them. Sometimes, those pieces are indeed problematic, but what they fail to see is that the edifice of our best theories doesn't rest on a few scattered pillars but on many of them, such that even if we conceded that one pillar was weak, there are many more to support the structure.

Philosophy ought to imitate the successful sciences in its methods, so far as to proceed only from tangible premisses which can be subjected to careful scrutiny, and to trust rather to the multitude and variety of its arguments than to the conclusiveness of any one. Its reasoning should not form a chain which is no stronger than its weakest link, but a cable whose fibers may be ever so slender, provided they are sufficiently numerous and intimately connected.
— Charles S. Peirce


Refusing Vaccination Puts Others At Risk


A significant proportion of Americans believe it is perfectly all right to put other people at risk of the costs and misery of preventable infectious diseases. These people are your friends, neighbors, and fellow citizens who refuse to have themselves or their children vaccinated against contagious diseases.

There would be no argument against allowing people to refuse vaccination if they and their families would suffer alone the consequences of their foolhardiness. It would be their right to forego misery-reducing and life-preserving treatments. But that is not the case in the real world.

The University of Pittsburgh's Project Tycho database, launched last week, quantifies the prevalence of infectious disease in the United States since 1888. Drawing on Project Tycho data, a November 28 New England Journal of Medicine article concluded that vaccinations from 1924 to the present prevented 103 million cases of polio, measles, rubella, mumps, hepatitis A, diphtheria, and pertussis. While the NEJM article did not calculate the number of deaths avoided as a result of vaccination, one of the study's authors estimates that number is between three and four million.

People who don't wish to take responsibility for their contagious microbes will often try to justify their position by noting the fact that the mortality rates of many infectious diseases had declined significantly before vaccines came along. And it is certainly true that a lot of that decline in infectious disease mortality occurred as a result of improved sanitation and water chlorination. A 2004 study by the Harvard University economist David Cutler and the National Bureau of Economic Research economist Grant Miller estimated that the provision of clean water "was responsible for nearly half of the total mortality reduction in major cities, three-quarters of the infant mortality reduction, and nearly two-thirds of the child mortality reduction." Improved nutrition also reduced mortality rates, enabling infants, children, and adults to fight off diseases that would have more likely killed their malnourished ancestors.

But vaccines have played a substantial role in reducing death rates too. An article in the Journal of the American Medical Association compared the annual average number of cases and resulting deaths of various diseases before the advent of vaccines to those occurring in 2006. Before an effective diphtheria vaccine was developed, for example, there were about 21,000 cases of the disease each year, 1,800 of them leading to death. No cases or deaths from the disease were recorded in 2006. Measles averaged 530,000 cases and 440 deaths per year before the vaccine. In 2006, there were 55 cases and no deaths. Whooping cough saw around 200,000 cases and 4,000 deaths annually. In 2006, there were nearly 16,000 cases and 27 deaths. Polio once averaged around 16,000 cases and 1,900 deaths. No cases were recorded in 2006. The number of rubella cases dropped from 48,000 to 17, and the number of deaths dropped from 17 to zero.

With the latter disease, the more important measure is the number of babies born to rubella-infected mothers who suffered from disease-induced birth defects, such as deafness, cloudy corneas, damaged hearts, and stunted intellects. Some 2,160 infants were afflicted with congenital rubella syndrome as late as 1965. In 2006, there was one.

The risk that infectious diseases will kill innocent bystanders is not the only issue. Sheer misery counts too. The fevers, the sweats, the incessant coughs, the runny noses, the itchy rashes, and the lost days at work must all be taken into account. And, of course, many people end up in the hospital as a result of infectious disease.

Before a chicken pox vaccine became available, upwards of four million kids got the disease every year, of whom 11,000 were hospitalized and 105 died. In 2004, the estimated number of cases had dropped to 600,000, resulting in 1,276 hospitalizations and 19 deaths. Before the measles vaccine was introduced in 1962, some 48,000 people were hospitalized and 450 died of that infection each year. So far this year there have been 175 cases and three hospitalizations. A 1985 study by a Centers for Disease Control and Prevention epidemiologist in the journal Pediatrics estimated that the first 20 years of measles vaccination in the U.S. had prevented 52 million cases, 5,200 deaths, and 17,400 cases of mental retardation.

In rich countries, few children die of rotavirus diarrheal disease, but it kills some 500,000 kids living in poor countries annually. Prior to 2006, when vaccines against rotavirus became available, about one in five kids under the age of five in the United States came down with it each year, and 57,000 of them were hospitalized. Subsequent to widespread vaccination, hospitalization rates have dropped by 90 percent. Interestingly, rotavirus hospitalizations among older children and young adults who are not immunized have also fallen by around 10,000 annually. Why? Because they are no longer exposed to the disease in infants who would otherwise have infected them.

Vaccines do not produce immunity in some people, so a percentage of those who took the responsibility to be vaccinated remain vulnerable. This brings us to the important issue of herd immunity. Herd immunity works when most people are immunized against an illness, greatly reducing the chances that an infected person can pass his microbes along to other susceptible people, such as infants who cannot yet be vaccinated, immunocompromised individuals, or folks who have refused the protection of vaccination.

People who refuse vaccination for themselves and their children are free-riding off herd immunity. Anti-vaccination folks are taking advantage of the fact that most people around them have chosen the minimal risk of vaccination, thus acting as a firewall protecting them from disease. But if enough refuse, the firewall comes down and other people get hurt.
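
The "firewall" logic can be made concrete with the textbook herd immunity threshold. The sketch below is my illustration, not from the article: it assumes the standard approximation that transmission dies out once more than 1 - 1/R0 of the population is immune, where R0 is the basic reproduction number, and the R0 values used are commonly quoted rough figures rather than precise measurements.

```python
# Back-of-the-envelope herd immunity arithmetic (illustrative only).
# Real coverage targets also depend on vaccine effectiveness, mixing
# patterns, and waning immunity.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune so that each
    infected person, on average, infects fewer than one other person."""
    return 1.0 - 1.0 / r0

if __name__ == "__main__":
    # Commonly cited rough R0 figures, one point value per disease
    for disease, r0 in [("measles", 15.0), ("pertussis", 12.0), ("polio", 6.0)]:
        print(f"{disease:9s} R0 ~ {r0:4.1f} -> ~{herd_immunity_threshold(r0):.0%} immune needed")
```

For a highly contagious disease like measles, the threshold lands above 90 percent, which is why even a modest cluster of refusals can bring the firewall down locally.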

Oliver Wendell Holmes articulated a good libertarian principle when he said, "The right to swing my fist ends where the other man's nose begins." Holmes' observation is particularly salient in the case of whooping cough shots.

Infants cannot be vaccinated against whooping cough, so their protection against this dangerous disease depends upon the fact that most of the rest of us are immunized against it. Unfortunately, whooping cough incidence rates have been increasing along with the number of people refusing immunization for their kids. The annual number of pertussis cases fell to a low of 1,010 in 1976. Last year, the number of reported cases rose to 48,277, the highest number since 1955. Eighteen infants died of the disease in 2012, and half of the infants who got it were hospitalized.

In 2005, an intentionally unvaccinated 17-year-old girl brought measles back with her from a visit to Romania and ended up infecting 34 people. Most of them were also intentionally unvaccinated, but a medical technician who had been vaccinated caught the disease as well and was hospitalized. Despite the medical technician's bad luck, the good news is that the measles vaccine is thought to protect 99.8 percent of those who get the shot. Similarly, in 2008 an intentionally unvaccinated seven-year-old boy sparked an outbreak of measles in San Diego. The boy, who caught the disease in Switzerland, ended up spreading his illness to 11 other children, all of whom were also unvaccinated, putting one infant in the hospital. Forty-eight other kids who were too young to be vaccinated were quarantined.

To borrow Holmes' metaphor, people who refuse vaccination are asserting that they have a right to "swing" their microbes at other people. There is no principled libertarian case for their free-riding refusal to take responsibility for their own microbes.


Microbes help to battle infection: Gut microbes help develop immune cells, study finds

The human relationship with microbial life is complicated. At almost any supermarket, you can pick up both antibacterial soap and probiotic yogurt during the same shopping trip. Although there are types of bacteria that can make us sick, Caltech professor of biology and biological engineering Sarkis Mazmanian and his team are most interested in the thousands of other bacteria -- many already living inside our bodies -- that actually keep us healthy. His past work in mice has shown that restoring populations of beneficial bacteria can help alleviate the symptoms of inflammatory bowel disease, multiple sclerosis, and even autism. Now, he and his team have found that these good bugs might also prepare the immune cells in our blood to fight infections from harmful bacteria.

In the recent study, published on March 12 in the journal Cell Host & Microbe, the researchers found that beneficial gut bacteria were necessary for the development of innate immune cells -- specialized types of white blood cells that serve as the body's first line of defense against invading pathogens.

In addition to circulating in the blood, reserve stores of immune cells are also kept in the spleen and in the bone marrow. When the researchers looked at the immune cell populations in these areas in so-called germ-free mice, born without gut bacteria, and in healthy mice with a normal population of microbes in the gut, they found that germ-free mice had fewer immune cells -- specifically macrophages, monocytes, and neutrophils -- than healthy mice.

Germ-free mice also had fewer granulocyte and monocyte progenitor cells, stemlike cells that can eventually differentiate into a few types of mature immune cells. And the innate immune cells that were in the spleen were defective -- never fully reaching the proportions found in healthy mice with a diverse population of gut microbes.

"It's interesting to see that these microbes are having an immune effect beyond where they live in the gut," says Arya Khosravi, a graduate student in Mazmanian's lab, and first author on the recent study. "They're affecting places like your blood, spleen, and bone marrow -- places where there shouldn't be any bacteria."

Khosravi and his colleagues next wanted to see if the reduction in immune cells in the blood would make the germ-free mice less able to fight off an infection by the harmful bacterium Listeria monocytogenes -- a well-studied human pathogen often used to study immune responses in mice. While the healthy mice were able to bounce back after being injected with Listeria, the infection was fatal to germ-free mice. When gut microbes that would normally be present were introduced into germ-free mice, the immune cell population increased and the mice were able to survive the Listeria infection.

The researchers also gave injections of Listeria to healthy mice after those mice were dosed with broad-spectrum antibiotics that killed off both harmful and beneficial bacteria. Interestingly, these mice also had trouble fighting the Listeria infection. "We didn't look at clinical data in this study, but we hypothesize that this might also happen in the clinic," says Mazmanian. "For example, when patients are put on antibiotics for something like hip surgery, are you damaging their gut microbe population and making them more susceptible to an infection that had nothing to do with their hip surgery?"

More importantly, the research also suggests that a healthy population of gut microbes can actually provide a preventative alternative to antibiotics, Khosravi says. "Today there are more and more antibiotic resistant superbugs out there, and we're running out of ways to treat them. Limiting our susceptibility to infection could be a good protective strategy."


Is Addiction Really a Disease?

For many decades, it's been widely accepted that alcoholism (or addiction) is a disease. The "disease concept" is taught in addiction training programs and to patients in treatment programs. It is unquestioned by public figures and the media. But is it true? And if it is not true, is there a better and more helpful way to define addiction?

Let's start with a short history. In the bad old days, before the disease concept became widely popular (about 40 years ago), our society was even more prejudiced against people with addictions than it is now. "Addicts" were seen as different and worse than "normal" folks. They were thought to be lacking in ordinary discipline and morality, as self-centered and uncaring. They were seen as people who were out for their own pleasure without regard for anyone else. They were viewed as having deficiencies in character.

Then came the idea that addiction is a disease: a medical illness like tuberculosis, diabetes or Alzheimer's disease. That meant that people with addictions weren't bad, they were sick. In an instant, this changed everything. Public perceptions were less judgmental. People were less critical of themselves. Of course, it wasn't welcome to hear that you had a disease, but it was better than being seen as immoral and self-centered. So, the disease concept was embraced by virtually everyone. With all its benefits, it's no wonder this idea continues to attract powerful, emotional support.

Widespread enthusiasm for the disease model, however, has led to willingness to overlook the facts. Addiction has very little in common with diseases. It is a group of behaviors, not an illness on its own. It cannot be explained by any disease process. Perhaps worst of all, calling addiction a "disease" interferes with exploring or accepting new understandings of the nature of addiction.

This becomes clear if you compare addiction with true diseases. In addiction, there is no infectious agent (as in tuberculosis), no pathological biological process (as in diabetes), and no biologically degenerative condition (as in Alzheimer's disease). The only "disease-like" aspect of addiction is that if people do not deal with it, their lives tend to get worse.

That's true of lots of things in life that are not diseases; it doesn't tell us anything about the nature of the problem. (It's worthwhile to remember here that the current version of the disease concept, the "chronic brain disease" neurobiological idea, applies to rats but has been repeatedly shown to be inapplicable to humans. Please see earlier posts on Psychology Today, or my book, Breaking Addiction, for a full discussion of the fallacy of this neurobiological disease model for addiction.)

As you likely know, addictive acts occur when precipitated by emotionally significant events. They can be prevented by understanding what makes these events so emotionally important, and they can be replaced by other emotionally meaningful actions or even other psychological symptoms that are not addictions. Addictive behavior is a readily understandable symptom, not a disease.

But if we are to scrap the disease concept and replace it with something valid, our new explanation must retain all the beneficial aspects of the old disease idea. It must not allow moralizing or any other negative attributions to people suffering with addictions. In fact, we'd hope an alternative explanation would have more value than the disease label, by giving people with addictions something the disease concept lacks: an understanding that is useful for treating the problem.

Knowing how addiction works psychologically meets these requirements. Recognizing addiction to be just a common psychological symptom means it is very much in the mainstream of the human condition. In fact, as I've described elsewhere, addiction is essentially the same as other compulsive behaviors like shopping, exercising, or even cleaning your house. Of course, addiction usually causes much more serious problems. But inside, it is basically the same as these other common behaviors.

When addiction is properly understood to be a compulsive behavior like many others, it becomes impossible to justify moralizing about people who feel driven to perform addictive acts. And because compulsive behaviors are so common, any idea that "addicts" are in some way sicker, lazier, more self-centered, or in any other way different from the rest of humanity becomes indefensible.

Seeing that addiction is just a compulsive symptom also meets our wish for a new explanation: Unlike the "disease" idea, it actually helps people to get well. As I've described here and in my books, when people can see exactly what is happening in their minds that leads to that urge to perform an addictive act, they can regularly learn to become its master, instead of the urge mastering them.

Despite all its past helpfulness, then, we are better off today without the disease idea of addiction. For too long, it has served as a kind of "black box" description that explains nothing, offers no help in treatment, and interferes with recognizing newer ways to understand and treat the problem.

And there is one more advantage. If we can eliminate the empty "disease" label, then people who suffer with an addiction can finally stop thinking of themselves as "diseased."