Food and Behaviour Research



Is sugar the world's most popular drug?

by Gary Taubes

[Photo: Sharon McCutcheon, Unsplash]

It eases pain, seems to be addictive and shows every sign of causing long-term health problems. Is it time to quit sugar for good?


5 January 2017 - The Guardian

This article gives a thoughtful and comprehensive account of the numerous scientifically valid arguments for classifying sugar as an addictive substance.

It also explains the huge difficulties facing those who are trying to achieve this - most of which boil down to three things:

1) the massive influence that 'the food industry' exerts on this debate

2) the impossibility of 'proving' that sugar is addictive IF the level of evidence demanded involves 'randomised controlled clinical trials in humans'

However, it was equally impossible to carry out such trials to 'prove' that smoking causes lung cancer (and a host of other serious health problems) 

3) the fact that denial is ALSO a key characteristic of addiction



Imagine a drug that can intoxicate us, can infuse us with energy and can be taken by mouth. It doesn’t have to be injected, smoked, or snorted for us to experience its sublime and soothing effects. Imagine that it mixes well with virtually every food and particularly liquids, and that when given to infants it provokes a feeling of pleasure so profound and intense that its pursuit becomes a driving force throughout their lives.

Could the taste of sugar on the tongue be a kind of intoxication? What about the possibility that sugar itself is an intoxicant, a drug?

Overconsumption of this drug may have long-term side-effects, but there are none in the short term – no staggering or dizziness, no slurring of speech, no passing out or drifting away, no heart palpitations or respiratory distress. When it is given to children, its effects may be only more extreme variations on the apparently natural emotional rollercoaster of childhood, from the initial intoxication to the tantrums and whining of what may or may not be withdrawal a few hours later.

More than anything, it makes children happy, at least for the period during which they’re consuming it. It calms their distress, eases their pain, focuses their attention and leaves them excited and full of joy until the dose wears off. The only downside is that children will come to expect another dose, perhaps to demand it, on a regular basis.

How long would it be before parents took to using our imaginary drug to calm their children when necessary, to alleviate discomfort, to prevent outbursts of unhappiness or to distract attention? And once the drug became identified with pleasure, how long before it was used to celebrate birthdays, a football game, good grades at school?

How long before no gathering of family and friends was complete without it, before major holidays and celebrations were defined in part by the use of this drug to assure pleasure?

How long would it be before the underprivileged of the world would happily spend what little money they had on this drug rather than on nutritious meals for their families?

There is something about the experience of consuming sugar and sweets, particularly during childhood, that readily invokes the comparison to a drug. I have children, still relatively young, and I believe raising them would be a far easier job if sugar and sweets were not an option, if managing their sugar consumption did not seem to be a constant theme in our parental responsibilities.

Even those who vigorously defend the place of sugar and sweets in modern diets – “an innocent moment of pleasure, a balm amid the stress of life”, as the journalist Tim Richardson has written – acknowledge that this does not include allowing children “to eat as many sweets as they want, at any time”, and that “most parents will want to ration their children’s sweets”.

But why is this rationing necessary? Children crave many things – Pokémon cards, Star Wars paraphernalia, Dora the Explorer backpacks – and many foods taste good to them. What is it about sweets that makes them so uniquely in need of rationing?

This is of more than academic interest, because the response of entire populations to sugar has been effectively identical to that of children: once people are exposed, they consume as much sugar as they can easily procure. The primary barrier to more consumption – up to the point where populations become obese and diabetic – has tended to be availability and price. As the price of a pound of sugar has dropped over the centuries, the amount of sugar consumed has steadily, inexorably climbed.

In 1934, while sales of sweets continued to increase during the Great Depression, the New York Times commented: “The Depression [has] proved that people wanted candy, and that as long as they had any money at all, they would buy it.” During the brief periods when sugar production surpassed our ability to consume it, the sugar industry and purveyors of sugar-rich products have worked diligently to increase demand and, at least until recently, have succeeded.

The critical question, as the journalist and historian Charles C Mann has elegantly put it, “is whether [sugar] is actually an addictive substance, or if people just act like it is”.

This question is not easy to answer. Certainly, people and populations have acted as though sugar is addictive, but science provides no definitive evidence. Until recently, nutritionists studying sugar did so from the natural perspective of viewing it as a nutrient – a carbohydrate – and nothing more. They occasionally argued about whether or not it might play a role in diabetes or heart disease, but not about whether it triggered a response in the brain or body that made us want to consume it in excess. That was not their area of interest.

The few neurologists and psychologists interested in probing the sweet-tooth phenomenon, or why we might need to ration our sugar consumption so as not to eat too much of it, did so typically from the perspective of how these sugars compared with other drugs of abuse, in which the mechanism of addiction is now relatively well understood.

Lately, this comparison has received more attention as the public-health community has looked to ration our sugar consumption as a population, and has thus considered the possibility that one way to regulate these sugars – as with cigarettes – is to establish that they are, indeed, addictive. These sugars are very probably unique in that they are both a nutrient and a psychoactive substance with some addictive characteristics.

Historians have often considered the sugar-as-a-drug metaphor to be an apt one. “That sugars, particularly highly refined sucrose, produce peculiar physiological effects is well known,” wrote Sidney Mintz, whose 1985 book Sweetness and Power is one of two seminal English-language histories of sugar. But these effects are neither as visible nor as long-lasting as those of alcohol or caffeinated drinks, “the first use of which can trigger rapid changes in respiration, heartbeat, skin colour and so on”.

Mintz has argued that a primary reason sugar has escaped social disapproval is that, whatever conspicuous behavioural changes may occur when infants consume sugar, it did not cause the kind of “flushing, staggering, dizziness, euphoria, changes in the pitch of the voice, slurring of speech, visibly intensified physical activity or any of the other cues associated with the ingestion” of other drugs.

Sugar appears to cause pleasure with a price that is difficult to discern immediately and paid in full only years or decades later. With no visible, directly noticeable consequences, as Mintz says, questions of “long-term nutritive or medical consequences went unasked and unanswered”.

Most of us today will never know if we suffer even subtle withdrawal symptoms from sugar, because we’ll never go long enough without it to find out.

Sugar historians consider the drug comparison to be fitting in part because sugar is one of a handful of “drug foods”, to use Mintz’s term, that came out of the tropics, and on which European empires were built from the 16th century onward – the others being tea, coffee, chocolate, rum and tobacco.

Its history is intimately linked to that of these other drugs. Rum is distilled, of course, from sugar cane. In the 17th century, once sugar was added as a sweetener to tea, coffee and chocolate, and prices allowed it, the consumption of these substances in Europe exploded. Sugar was used to sweeten spirits and wine in Europe as early as the 14th century; even cannabis preparations in India and opium-based wines and syrups contained sugar.

As for tobacco, sugar was, and still is, a critical ingredient in the American blended-tobacco cigarette, the first of which was Camel. It’s this “marriage of tobacco and sugar”, as a sugar-industry report described it in 1950, that makes for the “mild” experience of smoking cigarettes as compared with cigars and, perhaps more important, makes it possible for most of us to inhale cigarette smoke and draw it deep into our lungs.

Unlike alcohol, the only commonly available psychoactive substance in the Old World before their arrival, sugar, nicotine and caffeine had at least some stimulating properties, and so offered a very different experience, one that was more conducive to the labour of everyday life. These were the “18th-century equivalent of uppers”, writes the Scottish historian Niall Ferguson. “The empire, it might be said, was built on a huge sugar, caffeine and nicotine rush – a rush nearly everyone could experience.”

Sugar, more than anything, seems to have made life worth living (as it still does) for so many, particularly those whose lives lacked the kind of pleasures that relative wealth and daily hours of leisure might otherwise provide.

Sugar was “an ideal substance”, says Mintz. “It served to make a busy life seem less so; it eased, or seemed to ease, the changes back and forth from work to rest; it provided swifter sensations of fullness or satisfaction than complex carbohydrates did; it combined with many other foods … No wonder the rich and powerful liked it so much, and no wonder the poor learned to love it.”

What Oscar Wilde wrote about a cigarette in 1891 might also be said about sugar: It is “the perfect pleasure. It is exquisite, and it leaves one unsatisfied. What more can one want?”

Children certainly respond to sugar instantaneously. Give babies a choice of sugar water or plain, wrote the British physician Frederick Slare 300 years ago, and “they will greedily suck down the one, and make Faces at the other: Nor will they be pleas’d with Cows Milk, unless that be bless’d with a little Sugar, to bring it up to the Sweetness of Breast-Milk”.

One proposition commonly invoked to explain why the English would become the world’s greatest sugar consumers and remain so through the early 20th century – alongside the fact that the English had the world’s most productive network of sugar-producing colonies – is that they lacked any succulent native fruit, and so had little previous opportunity to accustom themselves to sweet things, as Mediterranean populations did. The sweet taste was more of a novelty to the English, and their first exposure to sugar occasioned a population-wide astonishment.

This is speculation, however, as is the notion that the taste of sugar will soothe distress and stop infants crying, or that consuming sugar will allow adults to work through pain and exhaustion and to assuage hunger pangs.

If sugar, though, is only a distraction to the infant and not actively a pain reliever or a psychoactive inducer of pleasure that overcomes any pain, we have to explain why, in clinical trials, it is more effective in soothing the distress of infants than the mother’s breast and breast milk itself.

Research literature on the question of whether sugar is addictive and thus a nutritional variant on a drug of abuse is surprisingly sparse. Until the 1970s, and for the most part since then, mainstream authorities have not considered this question to be particularly relevant to human health. The very limited research allows us to describe what happens when rats and monkeys consume sugar, but we’re not them and they’re not us. The critical experiments are rarely if ever done on humans, and certainly not children, for the obvious ethical reasons: we can’t compare how they respond to sugar, cocaine and heroin, for instance, to determine which is more addictive.

Sugar does induce the same responses in the region of the brain known as the “reward centre” as nicotine, cocaine, heroin and alcohol. Addiction researchers have come to believe that behaviours required for the survival of a species – specifically, eating and sex – are experienced as pleasurable in this part of the brain, and so we do them again and again.

Sugar stimulates the release of the same neurotransmitters – dopamine in particular – through which the potent effects of these other drugs are mediated. Because the drugs work this way, humans have learned how to refine their essence into concentrated forms that heighten the rush. Coca leaves, for instance, are mildly stimulating when chewed, but powerfully addictive when refined into cocaine; even more so taken directly into the lungs when smoked as crack cocaine. Sugar, too, has been refined from its original form to heighten its rush and concentrate its effects.

The more we use these substances, the less dopamine we produce naturally in the brain. The result is that we need more of the drug to get the same pleasurable response, while natural pleasures, such as sex and eating, please us less and less.

“There is little doubt that sugar can allay the physical craving for alcohol,” the neurologist James Leonard Corning observed over a century ago. The 12-step bible of Alcoholics Anonymous recommends the consumption of sweets and chocolate in lieu of alcohol when the cravings for drink arise. Indeed, the per capita consumption of sweets in the US doubled with the beginning of prohibition in 1919, as Americans apparently turned en masse from alcohol to sweets.

Sugar and sweets inexorably came to saturate our diets as the annual global production of sugar increased exponentially.

By the early 20th century, sugar had assimilated itself into all aspects of our eating experience, and was being consumed during breakfast, lunch, dinner and snacks. Nutritional authorities were already suggesting what appeared to be obvious: that this increased consumption was a product of at least a kind of addiction – “the development of the sugar appetite, which, like any other appetite – for instance, the liquor appetite – grows by gratification”.

A century later still, sugar has become an ingredient in prepared and packaged foods so ubiquitous it can only be avoided by concerted and determined effort. There is sugar not just in the obvious sweet foods – cookies, ice creams, chocolates, fizzy drinks, sodas, sports and energy drinks, sweetened iced tea, jams, jellies and breakfast cereals – but also in peanut butter, salad dressing, ketchup, barbecue sauces, canned soups, processed meats, bacon, hot dogs, crisps, roasted peanuts, pasta sauces, tinned tomatoes and breads.

From the 1980s onwards, manufacturers of products advertised as uniquely healthy because they were low in fat, or specifically in saturated fat, took to replacing those fat calories with sugar to make them equally, if not more, palatable – often disguising the sugar under one or more of the 50 names by which the combination of sugar and high-fructose corn syrup might be found.

Fat was removed from candy bars so that they became “health-food bars”, in spite of added sugar. Fat was removed from yoghurts and sugars added, and these became “heart-healthy snacks”. It was as though the food industry had decided en masse that, if a product wasn’t sweetened at least a little, our modern palates would reject it and we would purchase instead a competitor’s version that was.

For those of us who don’t reward our existence with a drink (and for many of us who do), it’s a chocolate bar, a dessert, an ice-cream cone or a Coke (or Pepsi) that makes our day. For those of us who are parents, sugar and sweets have become the tools we wield to reward our children’s accomplishments, to demonstrate our love and our pride in them, to motivate them, to entice them.

The common tendency is, again, to think of this transformation as driven by the mere fact that sugars and sweets taste good. The alternative way to think about this is that sugar took over our diets because the first taste, whether for an infant today or for an adult centuries ago, is a kind of intoxication; it’s the kindling of a lifelong craving, not identical but analogous to the effect of other drugs of abuse.

Because it is a nutrient, and because the conspicuous ills connected to its consumption are benign compared with those of nicotine, caffeine and alcohol – at least in the short term and in small doses – sugar remained nearly invulnerable to moral, ethical or religious attacks. It also remained invulnerable to attacks on grounds of damage to health.

Nutritionists have found it in themselves to blame our chronic ills on virtually any element of the diet or environment – on fats and cholesterol, on protein and meat, on gluten and glycoproteins, growth hormones and oestrogens and antibiotics, on the absence of fibre, vitamins and minerals, and surely on the presence of salt, on processed foods in general, on over-consumption and sedentary behaviour – before they’ll concede that it’s even possible that sugar has played a unique role in any way other than merely getting us all to eat too damn much.

And so, when a few informed authorities over the years did indeed risk their credibility by suggesting sugar was to blame, their words had little effect on the beliefs of their colleagues or on the eating habits of a population that had come to rely on sugar and sweets as the rewards for the sufferings of daily life.

So how do we establish a safe level of sugar consumption? In 1986, the US Food and Drug Administration (FDA) concluded that most experts considered sugar safe. And when the relevant research communities settled on caloric imbalance as the cause of obesity and saturated fat as the dietary cause of heart disease, the clinical trials necessary to begin to answer this question were never pursued.

The traditional response to the how-little-is-too-much question is that we should eat sugar in moderation – not eat too much of it. But we only know we’re consuming too much when we’re getting fatter or manifesting other symptoms of insulin resistance and metabolic syndrome.

Insulin resistance is the fundamental defect present in type 2 diabetes, and perhaps obesity too. Those who are obese and diabetic also tend to be hypertensive; they have a higher risk of heart disease, cancer and strokes, and possibly dementia and even Alzheimer’s as well.

If sugar and high-fructose corn syrup are the cause of obesity, diabetes and insulin resistance, then they’re also the most likely dietary trigger of these other diseases. Put simply: without these sugars in our diets, the cluster of related illnesses would be far less common than it is today.

Metabolic syndrome ties together a host of disorders that the medical community typically thought of as unrelated, or at least having separate and distinct causes – including obesity, high blood pressure, high blood sugar and inflammation – as products of insulin resistance and high circulating insulin levels. Regulatory systems throughout the body begin to misbehave, with slow, chronic, pathological consequences everywhere.

Once we have observed the symptoms of consuming too much sugar, the assumption is that we can dial it back a little and be fine – drink one or two sugary beverages a day instead of three; or, if we’re parenting, allow our children ice cream on weekends only, say, rather than as a daily treat. But if it takes years or decades, or even generations, for us to get to the point where we display symptoms of metabolic syndrome, it’s quite possible that even these apparently moderate amounts of sugar will turn out to be too much for us to be able to reverse the situation and return us to health. And if the symptom that manifests first is something other than getting fatter – cancer, for instance – we’re truly out of luck.

The authorities who argue for moderation in our eating habits tend to be individuals who are relatively lean and healthy; they define moderation as what works for them. This assumes that the same approach and amount will have the same beneficial effect on all of us. If it doesn’t, of course, if we fail to remain lean and healthy or our children fail to do so, the assumption is that we’ve failed – we ate too much sugar, or our children did.

If it takes 20 years of consuming sugar for the consequences to appear, how can we know whether we’ve consumed too much before it’s too late? Isn’t it more reasonable to decide early in life (or early in parenting) that not too much is as little as possible?

Any discussion of how little sugar is too much also has to account for the possibility that sugar is a drug and perhaps addictive. Trying to consume sugar in moderation, however it’s defined, in a world in which substantial sugar consumption is the norm and virtually unavoidable, is likely to be no more successful for some of us than trying to smoke cigarettes in moderation – just a few a day, rather than a whole pack.

Even if we can avoid any meaningful chronic effects by cutting down, we may not be capable of managing our habits, or managing our habits might become the dominant theme in our lives. Some of us certainly find it easier to consume no sugar than to consume a little – no dessert at all, rather than a spoonful or two before pushing the plate to the side.

If sugar consumption is a slippery slope, then advocating moderation is not a meaningful concept.

In my own mind, I keep returning to a few observations – unscientific as they may be – that make me question the validity of any definition of moderation in the context of sugar consumption.

The roots of the modern discussion on sugar and disease can be traced to the early 1670s. Thomas Willis, medical adviser to the duke of York and King Charles II, noted an increase in the prevalence of diabetes in the affluent patients of his practice. “The pissing evil”, he called it, and became the first European physician to note the sweet taste of diabetic urine – “wonderfully sweet like sugar or hon[e]y”.

Willis’s identification of diabetes and the sweetness of the urine happens to coincide with both the first flow of sugar into England from its Caribbean colonies, and the first use of sugar to sweeten tea.

Other observations that resonate with me when I wrestle with the concept of moderation include one of Frederick Slare’s comments in 1715, in his article “Vindication of Sugars Against the Charges of Dr Willis”.

At a time when sugar was just beginning to be more widely consumed in England, Slare noted that women who cared about their figures but were “inclining to be too fat” might want to avoid sugar, because it “may dispose them to be fatter than they desire to be”.

When Slare made his observation, the English were consuming, on average, perhaps 5lb of sugar a year. The US FDA research suggests we now consume 42lb a year.
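To put those two figures in rough perspective: treating the rise from 5lb to 42lb as a single smooth trend over roughly 300 years (an assumption for illustration only; the figures come from different populations and eras) gives an eightfold-plus increase, but a surprisingly modest implied annual growth rate:

```python
# Back-of-the-envelope check on the consumption figures cited in the text.
# Assumptions (not in the original): a ~300-year gap between Slare's era
# and the modern FDA figure, modelled as one smooth compound trend.
early_lb = 5.0    # per-capita annual consumption, England, circa 1715
modern_lb = 42.0  # per-capita annual consumption, US, per the FDA figure

ratio = modern_lb / early_lb     # overall increase: 8.4x
years = 300
cagr = ratio ** (1 / years) - 1  # implied compound annual growth rate

print(f"increase: {ratio:.1f}x")              # 8.4x
print(f"growth: {cagr * 100:.2f}% per year")  # ~0.71% per year
```

A change of well under 1% a year would have been imperceptible within any single lifetime, which may help explain why consumption could climb so far without provoking alarm.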

We have to acknowledge that the evidence against sugar is not definitive, compelling though I personally find it to be.

Let’s say we randomly assigned individuals in our population to eat a modern diet with or without sugar in it. Since virtually all processed foods have sugar added or, like most breads, are made with sugar, the population that is asked to avoid sugar would simultaneously be avoiding virtually all processed foods as well.

They would dramatically reduce their consumption of what journalist Michael Pollan, author of books on food, agriculture and drugs, has memorably called “food-like substances”, and if they were healthier, there would now be a host of possible reasons why. Maybe they ate fewer refined grains of any type, less gluten, fewer trans fats, preservatives or artificial flavourings? We would have no practical way to know for sure.

We could try to reformulate all these foods so that they are made without sugar, but then they won’t taste the same – unless, of course, we replace the sugar with artificial sweeteners. Our population randomised to consume as little sugar as possible is likely to lose weight, but we won’t know if it happened because they ate less sugar, or fewer calories of all sorts.

Indeed, virtually all dietary advice suffers from this same complication: whether you’re trying to avoid gluten, trans fats, saturated fats or refined carbohydrates of all types, or just trying to cut calories – eat less and eat healthily – an end result of this advice is that you’re often avoiding processed foods containing sugar and a host of other ingredients.

Artificial sweeteners as a replacement for sugar muddy these waters even more. Much of the anxiety about these sweeteners was generated in the 60s and 70s by the research, partly funded by the sugar industry, that led to the banning of the artificial sweetener cyclamate as a possible carcinogen, and the suggestion that saccharin could cause cancer (at least in rats, at extraordinarily high doses). Though this particular anxiety has faded with time, it has been replaced by the suggestion that maybe these artificial sweeteners can cause metabolic syndrome, and thus obesity and diabetes.

This suggestion comes primarily from epidemiological studies that show an association between the use of artificial sweeteners and obesity and diabetes. But it is likely that people who are predisposed to gain weight and become diabetic are also the people who use artificial sweeteners instead of sugar.

As Philip Handler, then head of the US National Academies of Sciences, suggested in 1975, what we want to know is whether using artificial sweeteners over a lifetime – or even a few years or decades – is better or worse for us than however much sugar we would have consumed instead. It’s hard for me to imagine that sugar would have been the healthier choice. If the goal is to get off sugar, then replacing it with artificial sweeteners is one way to do it.

The research community can certainly do a much better job than it has in the past of testing all these questions. But we may have a very long wait before the public-health authorities fund such studies and give us the definitive answers we seek. What do we do until then?

Ultimately, the question of how much is too much becomes a personal decision, just as we all decide as adults what level of alcohol, caffeine or cigarettes we’ll ingest.

Enough evidence exists for us to consider sugar very likely to be a toxic substance, and to make an informed decision about how best to balance the likely risks with the benefits. To know what those benefits are, though, it helps to see how life feels without sugar.

Former cigarette smokers (of which I am one) will tell you that it was impossible for them to grasp intellectually or emotionally what life would be like without cigarettes until they quit; that through weeks or months or even years, it was a constant struggle. Then, one day, they reached a point at which they couldn’t imagine smoking a cigarette and couldn’t imagine why they had ever smoked, let alone found it desirable.

A similar experience is likely to be true of sugar – but until we try to live without it, until we try to sustain that effort for more than days, or just a few weeks, we’ll never know.

This is an edited extract from The Case Against Sugar, published by Portobello Books (£14.99).