The headlines might seem lifted straight out of a dystopian novel at the moment, but there’s some good news for at least two groups of people: those who suffer from food allergies and those who are at risk of developing them.
As far as they’re concerned, we’re living in the middle of a scientific and medical revolution. In recent years, our knowledge and understanding of food allergies have radically shifted, and we’ve made some major breakthroughs in being able to prevent and reverse these conditions that afflict millions of people around the world.
In the not-too-distant future, food allergies could even become a thing of the past. It’s a ray of hope in a time full of darkness, especially if you or any of your loved ones suffer from food allergies yourselves.
But even if you’re lucky enough to be allergy-free, it’s a fascinating and inspiring development in modern medicine, and you’re about to learn some of the stories and science behind it.
One important thing to mention: if you suffer from allergies, please don’t make any changes to your diet without consulting your doctor.
Allergist Gideon Lack developed a revolutionary hypothesis about food allergies.
Back in the late 1990s and early 2000s, a British researcher named Gideon Lack was feeling a mixture of puzzlement and alarm.
As a pediatric allergist at King’s College London, he’d seen the rate of peanut allergies in the UK double in just 10 years. Meanwhile, more British parents than ever were following the standard medical advice of the time. It boiled down to a simple message: don’t feed your baby peanuts. That way, you’ll avoid the risk of them developing a peanut allergy.
Sounds logical – but apparently it wasn’t working, and no one knew why.
Then something happened. During a trip to Tel Aviv, Lack stumbled upon a pair of facts that led him to a eureka moment. The result would be a hypothesis that turned the conventional wisdom on food allergies on its head.
Lack had come to Tel Aviv to give a talk about peanut allergies to a group of Israeli clinicians. At one point during the talk, he asked his audience members for a show of hands: who here has treated at least one case of peanut allergy in the past year?
Whenever he asked this question in the UK, nearly all the hands would go up. But there, in Tel Aviv, only a couple did. Apparently, British children were suffering from peanut allergies at a much higher rate than Israeli children – ten times higher (1.85 versus only 0.17 percent), he later found out.
That was the first fact he stumbled upon. What explained it? Well, that brings us to the second fact.
One day, shortly after his talk, Lack was having lunch with some Israeli friends. One of them was a mother, who was feeding her baby some food. It was a very common baby snack in Israel, his friends informed him. Just out of curiosity, he asked if he could try it for himself.
The snack tasted like peanut butter.
It turned out that Israeli babies were eating food containing peanuts at a much higher rate than British babies – seven times higher (69 versus 10 percent) by the age of nine months old, he later determined.
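For the numerically curious, those multiples are simple prevalence ratios – here’s a quick back-of-the-envelope check in Python, using nothing but the percentages cited above.

```python
# Sanity check of the two multiples Lack worked out,
# using the percentages cited above.
uk_allergy, israel_allergy = 1.85, 0.17  # % of children with peanut allergies
israel_eaters, uk_eaters = 69, 10        # % of babies eating peanut foods by nine months

print(f"Allergy gap:  {uk_allergy / israel_allergy:.1f}x higher in the UK")  # ~10.9x
print(f"Exposure gap: {israel_eaters / uk_eaters:.1f}x higher in Israel")    # 6.9x
```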
Could the two facts be connected? Could early exposure to peanuts actually make children less likely to develop peanut allergies, rather than more? Was avoidance, in other words, a misguided idea? And could the same be true of other food allergies as well?
Lack suspected the answer to all of these questions was yes. But it’s one thing to come up with a hypothesis. It’s another to test it, let alone confirm it.
Further research was necessary.
Food allergies are a global problem, afflicting both children and adults at alarming and growing rates.
We’ll get back to Gideon Lack’s hypothesis a little bit later, but first, let’s zoom out and look at the bigger picture. You see, that rise in peanut allergies Lack was observing in the UK wasn’t just an anomaly. It was part of a much larger trend.
Around the same time, on the other side of the Atlantic, things were getting even worse in the US. In 1997, less than 0.5 percent of American children had peanut allergies. By 2018, that percentage had more than quadrupled to 2.2 percent.
But the problem of food allergies goes way beyond peanuts, and it’s hardly confined to children in Western countries like the US and UK.
Peanuts are just one of the eight most common food allergens – food substances that cause allergic reactions in people who suffer from food allergies. The others include eggs, fish, shellfish, wheat, soy, milk, and tree nuts, like almonds and pistachios. In recent decades, all eight allergens have been claiming more and more victims.
That’s been especially true in the US, where the percentage of children with one or more food allergies went up 8.5 percent from 1997 to 2011. But similar increases have been going on elsewhere. For instance, China saw a 7.7 percent rise in the percentage of infants with food allergies between 1999 and 2009.
And it’s not just infants and children. In both the US and UK, more than 10 percent of adults have one or more food allergies, and almost half of them developed at least one of those allergies in adulthood. A similar story is playing out across the world. Globally, researchers estimate that up to 8 percent of children and 11 percent of adults suffer from food allergies.
Now, the percentages vary from country to country, and so does the quantity and quality of the data that’s available to us. But while the details are complicated and sometimes a little murky, the overall picture is clear: food allergies have become a major problem just about everywhere. Estimated rates ranging from 4 to 15 percent of children and adults can be found in nations as diverse as Ghana, Tanzania, Japan, Taiwan, Colombia, Canada, Australia, Poland, and Bulgaria.
If ever there was a time for some fresh thinking on food allergies, that time would be now.
No single theory fully explains why people suffer from food allergies, and many factors are at play.
Before we go back to Gideon Lack’s hypothesis, let’s round out our bigger picture with some important caveats.
Food allergies have been on the rise in modern times, but they’re hardly a new phenomenon. Neither is the desire to understand them. All the way back in the fifth century BC, the ancient Greek physician Hippocrates observed people suffering from cheese allergies. He theorized that their “constitutions” were simply “hostile” to the dairy product.
Food allergy science has come a long way since then. But although it’s made some major breakthroughs, it still has some equally major mysteries left to solve.
The details of how and why food allergies happen are complicated, and they’re still not fully grasped by science. But the general facts are fairly straightforward and well-established.
Essentially, allergic reactions to a certain type of food happen when the body’s immune system mistakes that food’s proteins for dangerous foreign substances.
At that point, the system’s alarm bells start ringing, and it goes into defense mode – triggering various physical responses designed to repel the attack it believes it’s under. The resulting inflammation, muscle contractions, and enzyme production lead to one or more allergic symptoms, such as itchy skin, hives, shortness of breath, vomiting, or dangerously low blood pressure.
So what makes the immune system go haywire? And why has this been happening to more and more people in recent decades? Scientists have put forward several theories. Some of them point to genetics, others to changes in our environments, diets, and lifestyles, along with their effects on our gut bacteria.
To one degree or another, all of these theories have some credence and help to explain the matter. But none of them are fully satisfactory by themselves.
For example, one genetic theory centers on an antibody in our blood called Immunoglobulin E, or IgE, which plays a big role in triggering allergic reactions to food. Now, people with food allergies usually have high levels of IgE. Those levels, in turn, seem to be determined by certain genes. But not everyone with high amounts of IgE ends up developing a food allergy, so this theory alone can’t explain the phenomenon.
In all likelihood, food allergies are the result of a complex array of interconnected factors – genes interacting with environments interacting with diets, and so on. The only simple explanation is that there is no simple explanation.
So keep that in mind as we focus on Gideon Lack’s hypothesis. It’s just one piece – albeit a very important piece – of a much bigger puzzle.
The prevalence of peanut allergies can’t be explained by genetics or other medical conditions.
Alright, so now let’s return to Gideon Lack’s hypothesis. Does early exposure to peanuts really affect whether children subsequently develop peanut allergies?
After his trip to Israel, Lack had a hunch that the answer was yes. But at this point, it was only that – a hunch. To be confident enough to advance it as a scientific hypothesis, he first had to rule out some other possible explanations.
Maybe fewer Israeli babies suffered from peanut allergies than British babies because they simply had different genes. Or maybe they just had lower rates of other medical conditions that were connected to peanut allergies, such as asthma.
It turned out that neither explanation held up to scrutiny.
To get to the bottom of the matter, Lack and his colleagues collected data on 8,826 Jewish children in both Israel and the UK. Why Jewish children? Because the two groups shared a similar genetic background, allowing the researchers to control for genetics.
The children of both countries also had similar rates of asthma, so Lack and his colleagues could control for that variable as well. If the two groups of children still had different rates of peanut allergies despite sharing these similarities, that would seem to rule out genetics and asthma as an explanation.
Sure enough, that’s exactly what they discovered. In fact, the results of their research allowed Lack and his colleagues to go even further.
By collecting data and crunching the numbers, they were able to rule out many other possible explanations as well, such as differences in the children’s social class and how much they suffered from other allergic conditions that could be related to peanut allergies. Those included milk, egg, sesame, and tree nut allergies, as well as hay fever and eczema – a condition characterized by red, itchy skin.
Now, it might seem unsurprising that multiple food allergies could go hand in hand. But what could a skin condition like eczema possibly have to do with an allergy to eating peanuts? The connection isn’t immediately obvious.
Nonetheless, it definitely exists. Research has shown that children with severe eczema are at a significantly greater risk of developing peanut allergies. Why? Well, the answer to that question plays a big role in what Lack and his colleagues did next.
The dual-allergen exposure theory suggests that our skin might be one of the main conduits of developing food allergies.
To understand the connection between eczema and peanut allergies, it helps to remember what the point of having skin is in the first place.
From a biological standpoint, our skin is essentially a barrier between our bodies’ internal organs and the outside world. It prevents foreign substances like microbes from getting inside us and wreaking havoc.
Or at least that’s what it’s supposed to do. Unfortunately, conditions like eczema can weaken the skin and make it more permeable, allowing invaders to push on through.
What does this have to do with food allergies? Well, the answer to that question brings us to something called the dual-allergen exposure theory.
Imagine you’re a baby growing up in a household where your parents follow the conventional wisdom on food allergies and avoid feeding you anything with peanuts in it. The thinking seems reasonable enough: if peanuts can cause allergic reactions, steer clear of them, and you’ll sidestep the risk of developing an allergy altogether.
But here’s the problem: peanut proteins have another way of getting into your body – your skin. If anyone in your family eats anything with peanuts in it, some peanut residue will get into your household dust. It will also linger on their skin and in their saliva for up to three hours after eating – plenty of time for them to transfer it to your skin by touching you or kissing you.
Once that happens, there’s a slight chance some peanut proteins will sneak their way into your body through your skin – a risk that becomes much greater if you’re suffering from a condition like eczema.
Now, let’s say some peanut residue does, in fact, make its way inside your body. How’s your immune system going to react? Well, if you’ve never eaten peanuts before, it won’t be familiar with their proteins. This will increase the chance it’ll see them as harmful foreign invaders to defend against, rather than beneficial nutrients to leave alone.
And that, in turn, will increase the chance that the immune system will trigger an allergic reaction to ward off the perceived threat.
At that point, you have the makings of a peanut allergy. And according to the dual-allergen exposure theory, the same thing can happen with any other food allergen.
Gideon Lack hypothesized that eating food containing allergens could help babies avoid food allergies – and vice versa.
So does the dual-allergen exposure theory have any evidence in its favor?
Yes, it does. One study found that the skin of infants with peanut allergies was exposed to peanut residue ten times more often than their non-allergic counterparts, due to differences in their environments.
Another study showed that just a little skin exposure to peanut oil could increase children’s risk of developing peanut allergies by the age of five.
But don’t just blame the skin or the peanut-contaminated hands, mouths, or dust that might come into contact with it.
The problem isn’t skin exposure per se; it’s skin exposure combined with the tendency to avoid feeding infants food containing potential allergens – also known as allergenic food. And this brings us back to Gideon Lack’s hypothesis.
Lack’s underlying contention was twofold. On the one hand, eating allergenic food could help babies’ immune systems learn to perceive its allergens as friends instead of foes. On the other hand, avoiding that food could do the opposite: it could teach the immune system to perceive those allergens as enemies when it eventually encountered them through skin exposure or accidental consumption.
If all that’s correct, the implications would be enormous. For decades, medical authorities around the world had been advising parents to avoid feeding their babies allergenic food.
Not only that, they’d also been advising mothers to avoid eating that food themselves during pregnancy and breastfeeding. That way, they’d prevent the allergens from being transferred to their babies through the umbilical cord or breastmilk.
Over the decades, this advice seeped into the public consciousness, and many parents took it to heart, dutifully following the avoidance strategy it advocated. Unfortunately, this did little to prevent babies from developing food allergies. In fact, as we’ve seen, the rate of food allergies skyrocketed instead.
If Lack was correct, those two facts could be related. Avoidance could be aggravating the problem of food allergies, instead of solving it. And in that case, the conventional advice would be woefully misguided.
In fact, the opposite advice might be in order: parents should actively try to include allergenic foods in their babies’ diets, rather than avoiding them – and mothers shouldn’t avoid eating them during pregnancy or breastfeeding either.
But only if Lack was correct – a very big if.
The LEAP study confirmed Lack’s hypothesis that early exposure makes children less likely to develop peanut allergies.
We now arrive at the pivotal question: was Lack’s hypothesis correct? Does eating peanuts at an early age prevent babies from developing peanut allergies?
Starting in 2006, Lack and his colleagues set out to answer that question with the LEAP study – a catchy acronym for a much less catchy phrase: Learning Early About Peanut Allergy.
Published in 2015, the study took nearly 10 years to complete, and it required funding from three organizations, the participation of hundreds of infants and parents, and the consumption of who knows how many peanuts. But in the end, all the hard work paid off. It turned out that Lack was right.
To conduct the LEAP study, Lack and his colleagues recruited 640 infants (along with their parents, of course). Some of the infants already had peanut allergies. Most of them didn’t. All of them had severe eczema, an egg allergy, or both, which put them at high risk of developing peanut allergies.
Lack and his colleagues then took all of these infants and split them into two groups. Over the first two years of their lives, half of them would completely avoid eating any food with peanuts in it. The other half would eat food containing peanuts regularly, in a way that was carefully monitored and administered.
Five years later, the researchers checked back in with their little subjects, who were now young children. How many of them had developed peanut allergies? Were there fewer allergies among the peanut-eating group than among the group that had avoided peanuts?
In short, did the intervention of feeding babies peanuts help to prevent them from developing peanut allergies?
The answer turned out to be an overwhelming yes. Among the babies who didn’t have peanut allergies before the study began, the peanut-eating group developed peanut allergies at an astonishing 86 percent lower rate than the avoidance group.
And even among the babies who were already allergic to peanuts before the study began, eating peanuts produced an astounding 70 percent reduction in peanut allergies. In other words, eating peanuts actually helped these babies lose their allergies!
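Both figures are relative reductions – that is, how much lower the allergy rate was in the peanut-eating group compared with the avoidance group. Here’s a minimal sketch of the arithmetic; the group-level prevalences below are approximate figures from the published trial results, which this summary doesn’t quote directly.

```python
def relative_reduction(p_avoid: float, p_eat: float) -> float:
    """Fractional drop in allergy prevalence, relative to the avoidance group."""
    return (p_avoid - p_eat) / p_avoid

# Babies without pre-existing allergies: roughly 13.7% of the avoidance
# group vs. roughly 1.9% of the eating group developed peanut allergies.
print(f"{relative_reduction(13.7, 1.9):.0%} fewer allergies")  # 86% fewer

# Babies with pre-existing sensitivity: roughly 35.3% vs. roughly 10.6%.
print(f"{relative_reduction(35.3, 10.6):.0%} reduction")       # 70% reduction
```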
Apparently, the LEAP study’s name had been appropriately chosen, because Lack and his colleagues had just made one giant leap for food allergy science.
Introducing babies to allergenic food seems to be a generally good idea, but there are some important caveats.
In the wake of the remarkable success of the LEAP study, food allergists from around the world were inspired to pursue many avenues of follow-up research. What came next was a veritable alphabet soup of acronyms.
First, there was the LEAP-ON study. Then there were the EAT, PETIT, BEAT, STAR, STEP, and HEAP studies.
Now, if this were the Hollywood movie version of the story, the next part would be about how all this follow-up research confirmed the LEAP study’s initial findings about peanut allergies and then extended them to other food allergies as well.
And that is what actually happened – sort of. But the reality of food allergy science is a bit messier than a feel-good film’s triumphal tale of unambiguous success.
We won’t go into the details or even the acronyms of all those follow-up studies, but here’s the bottom line. As far as peanut allergies go, the findings of the LEAP study were further bolstered by the LEAP-ON study. The EAT study then showed that it was safe for babies to eat sesame, milk, fish, eggs, and wheat during the first six months of their lives.
So the EAT study provided further evidence against the conventional wisdom that parents should avoid feeding their babies allergenic food. But a separate analysis of the study failed to establish an actual benefit to feeding babies such food – as opposed to a mere absence of harm. That failure might have been due to the amounts of food the babies were fed, which leaves a still-open question: How much allergenic food is best?
For the rest of the alphabet soup of studies, the results were also a little mixed. All of them focused on egg allergies, and most of them showed results analogous to the LEAP study’s: feeding babies eggs appeared to decrease their risk of developing egg allergies. But the effect was slight in the STAR study, and it was contradicted by the HEAP study. And in both the HEAP study and the otherwise successful PETIT study, multiple children had to be hospitalized due to allergic reactions.
And that brings us to one last important caveat: if you’re a parent of a child at risk of food allergies, don’t just feed them allergenic food. Talk to a doctor first.
Oral immunotherapy can reverse already-existing food allergies.
All of the mixed evidence and caveats notwithstanding, a new consensus is now forming within the field of food allergy science and the larger medical community: when it comes to babies’ diets, simply avoiding allergenic food is a misguided idea, and early introduction is a better approach, generally speaking.
That’s great news if you’re a parent of an infant who hasn’t yet developed any food allergies. But what if you or a loved one already suffers from them? Well, there’s great news for you too. An exciting new approach to food-allergy treatment has entered the medical scene. It’s called oral immunotherapy, or OIT for short.
Remember, a food allergy is basically the result of your immune system getting miseducated about which substances entering the body are friendly, nutritious proteins to let by and which ones are hostile foreign invaders to attack. If you think about it, that points to an obvious way of reversing food allergies: re-educate the immune system.
But how do you do that? Very gradually, is the short answer. An OIT treatment begins by asking the patient to eat a tiny amount of the food he’s allergic to, usually in powdered and carefully sterilized form. The amount then gets slowly increased over a long period of time.
The goal is to desensitize the patient to the allergen. By getting more and more familiar with it, the patient’s immune system becomes less and less prone to perceiving it as a threat. This, in turn, enables the patient to tolerate larger and larger amounts of the food in question.
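To get a feel for what “very gradually” means in practice, here’s a toy model of a dose-escalation schedule. Every number in it is invented for illustration – real schedules are tailored by clinicians, not code.

```python
# Toy model of OIT dose escalation: start with a tiny dose and
# increase it step by step until a target tolerance is reached.
# All numbers are invented for illustration, not a medical protocol.

dose_mg = 0.5      # hypothetical starting dose of allergen protein
target_mg = 300.0  # hypothetical target: very roughly one peanut's worth

steps = 0
while dose_mg < target_mg:
    dose_mg = min(dose_mg * 1.5, target_mg)  # hypothetical 50% increase per step
    steps += 1

print(f"Reaching the target takes {steps} escalation steps")  # 16 steps
```

Even in this simplified model, it takes over a dozen escalation steps – and in real life, each step is separated by days or weeks of careful monitoring.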
This process can go on for as long as the patient desires. Some patients just want to reach the point of being able to tolerate trace amounts of the allergen. That way, they don’t have to worry as much about their food being contaminated by it.
For instance, a person with a nut allergy would be able to eat food made in a facility that also handles nuts. Other patients want to go further and be able to tolerate larger quantities of the food – perhaps even an entire serving of it.
Why would anyone stop short of that? Well, as we’re about to see, OIT isn’t exactly a walk in the park.
OIT is becoming less demanding, dangerous, and time-consuming.
For people with food allergies, there’s good news and bad news about OIT.
Let’s start with some good news. It works. Numerous studies and clinical trials have shown it to be highly effective.
For example, in a landmark 2019 study that treated people with peanut allergies, 84 percent of participants who received OIT were able to safely consume peanut proteins by the end of the treatment period.
But to reach this point, they had to put in a lot of time and undergo some rather painful, potentially harmful experiences. That’s the bad news. Right now, OIT can be difficult to complete. Fortunately, there’s also some more good news waiting in the wings.
The amount of time that OIT takes depends on the goal of the patient. If she just wants to be safe from accidental exposure to an allergen, it takes about six months.
If she wants to be completely desensitized to the allergen and be able to eat full servings of the food in question, it takes about two years. In any case, each treatment session can take a couple of hours or more to complete, and a patient has to attend one of them every two weeks or so. In other words, it requires a huge time commitment.
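To put that commitment into concrete numbers, here’s a rough tally based on the session length and frequency just described – ballpark assumptions, not a fixed protocol.

```python
# Rough tally of the OIT time commitment described above, assuming
# one session every two weeks at about two hours per session.
hours_per_session = 2
sessions_per_month = 2  # one every ~two weeks

for goal, months in [("safety from accidental exposure", 6),
                     ("full desensitization", 24)]:
    sessions = months * sessions_per_month
    print(f"{goal}: ~{sessions} sessions, ~{sessions * hours_per_session}+ hours")
```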
Not only that, but each treatment session essentially consists of the clinician purposefully trying to bring the patient’s immune system to the brink of triggering an allergic reaction to the food in question. Sometimes, that brink is exceeded, and the patient has to suffer the consequences. In the early days of OIT, that could often mean experiencing severe allergic reactions.
But as the practice of OIT becomes more fine-tuned, those reactions are becoming milder and less frequent. The overall treatment time is also getting shorter, thanks to a drug called omalizumab, which can speed up the whole process. Several other experimental drugs could help make it even quicker and safer: mepolizumab, reslizumab, benralizumab, and a bunch of other names that end in “zumab.”
Meanwhile, an array of alternative treatments are also in the works, ranging from food allergy vaccines to gene therapy, which aims to directly rewire the immune system itself. The future may be dark in other respects, but it’s getting brighter and brighter for those who suffer from food allergies.
Final Summary
Thanks to some recent advances in science and medicine, food allergies could soon be a thing of the past. Introducing babies to allergenic foods at an early age can help prevent them from ever developing food allergies.
Oral immunotherapy can help other people overcome the food allergies they already have. In both cases, careful exposure, rather than avoidance, appears to be the key to success.