Eating disorders are complex biopsychosocial illnesses shaped by a continuous interplay of nature and nurture. While scientists are interested in uncovering the neurobiological predispositions that lead people to develop eating disorders, psychology focuses on the personality traits of the sufferer, his or her past experiences, traumas, co-morbid conditions, and interpersonal relationships to determine what may provoke an individual to intentionally starve himself or herself. In this article, I will assess, through an anthropological lens, the sociological forces and cultural influences that trigger individuals to develop eating disorders.
The prevalence of eating disorders is rising, and they are becoming an increasing problem worldwide. If we can understand the negative aspects of our Western culture that contribute to the development of eating disorders within the upper-middle-class white female population, perhaps we can change the cultural norms and values that ultimately drive people to starve themselves to death. Eating disorders are a major public health issue with the highest mortality rate of any mental illness. A better understanding of sociocultural influences may also be useful in learning how to successfully treat those who suffer from them.
Self-inflicted starvation is not a new phenomenon; people have been starving themselves for centuries. It was not until the 1800s that it first appeared in the medical literature, and not until around the 1970s that it became part of our social consciousness.
In the 1870s, Dr. William Gull and Dr. Charles Lasègue were the first to document patients who displayed what matches our modern-day definition of anorexia nervosa: preoccupation with one's body size, fear of gaining weight, self-starvation, and significant weight loss not explained by any other medical condition. Prior to this, there is no documented reference to people starving themselves in an effort to lose weight. Historians are unsure whether this is because the condition did not exist or because its signs were attributed to some other type of illness, like hysteria or nervousness (Stryer, 2009, pp. 2-5).
Based on this, it is reasonable to conclude that the eating disorders of the Biblical era and beyond were probably motivated by factors other than an obsession with thinness, which carries its own unique sociocultural baggage. The Old Testament mentions fasting as a common practice for cleansing oneself of previous sins, a form of self-castigation. People also fasted to purify themselves, to prove their devotion to God, and/or to place themselves in a trance-like state that enabled them to receive prophetic visions from God. Moses, for example, fasted for 40 days before going to Mount Sinai, where he received the Ten Commandments from God. Whether or not you believe that the Bible is at all historically accurate, it is undeniably part of the culture of that time period.
From the first to the fourth centuries, men belonging to Gnostic cults were known for fasting in an attempt to deny themselves the physical and material comforts of life. The first known death due to self-inflicted starvation occurred in late fourth-century Rome, when a 20-year-old woman died after fasting under the direction of her priest, St. Jerome. There were also non-religious reasons for fasting; Greco-Roman physicians recommended fasting and purging, using herbs that acted as laxatives and diuretics, as a remedy for various physical ailments (Stryer, 2009, pp. 2-6).
Interestingly, the incidence of self-starvation decreased during the early Middle Ages, when society was in economic decay and famine was widespread. After the plague, it took a while for Europe to piece itself back together, but by the tenth and eleventh centuries people began to prosper again and accounts of self-starvation resurfaced.
This time, it was linked directly to the Catholic Church, which promoted fasting as a path to spiritual perfection and holiness. Young women who were able to survive on so little food were venerated as saints; historians refer to them as "Holy Anorexics." The Protestant Reformation of the 1500s challenged the idea of ascetic fasting and asserted that one need not starve oneself to show devotion to God, but should instead show faith in Jesus as the Messiah. Those who continued to fast were charged with witchcraft.
By the seventeenth century, the bewitchment theory had died down, and those who continued to starve themselves were referred to as "Miraculous Maidens." Unlike the saints, their fasting had nothing to do with religion; they starved themselves for reasons that remain unclear. With the societal intellectualization of the Scientific Revolution, people began to question how these women could survive on so little food (Bemporad, 1996).
Dr. Richard Morton (1637-1698) was the first physician to provide a medical description of what we classify today as anorexia nervosa in his Treatise of Consumptions (1694). He spoke of anorexia as "a wasting disease of the muscular parts of the body" due to one of two types of "consumption": original or symptomatical. Original consumption is directly due to a medical illness, while symptomatical, or nervous, consumption has no known physical cause. The cases he discussed in his Treatise describe the same symptoms we see in modern-day anorexics: loss of menses, pale skin, low body temperature, fainting spells, and so on.
Dr. Robert Whytt (1714-1766) was the first to record the biological changes that occur with severe fasting, e.g., a body temperature below 95°F (hypothermia) and a resting heart rate under 60 beats per minute (bradycardia). He called anorexia "nervous atrophy," which he described as an "unnatural or morbid state of the nerves, of the stomach, and intestines." He also noted psychological changes that occur during severe fasting, such as melancholy (Silverman, 1987).
Physicians began noting these psychological motivations for self-starvation and attributed them to mental illnesses commonly diagnosed at the time, such as hysteria, nervous atrophy, and melancholy. In the 1870s, Gull and Lasègue were the first to recognize anorexia nervosa as a disease in and of itself (Stryer, 2009, pp. 16-22).
The history of bulimia is a bit more complicated than the history of anorexia, because the only accounts from Antiquity of intentional vomiting preceded by overeating may not describe the same experience as that of the modern bulimic (Garner & Garfinkel, 1997). Bulimia nervosa as we conceptualize it today was not medically recognized until 1903, when Pierre Janet described it and linked it to major depressive disorder and anxiety disorders. The diagnostic label "bulimia nervosa" was officially coined in 1979 (Pope, Hudson, & Mialet, 1985).
By the Industrial Revolution, society was changing drastically: economically, politically, and socially. As a result, an emerging middle class became more prosperous. Family dynamics shifted; men began working in factories to financially support the family while women stayed at home caring for the children, cooking, and doing household chores. Families now had more disposable income, which fed a growing interest among women in fashionable clothes and accessories, an interest further reinforced by capitalism. Ultimately, women became more concerned with their outer beauty.
The waves of the feminist movement brought about liberal change: they guaranteed women the right to vote, enabled them to pursue an advanced education, and increased female workforce participation. Fashion began to change, too, as society shifted from the sexually conservative Victorian era to a more sexually explicit post-Industrial era. Women once wore dresses consisting of multiple layers of clothing, such as petticoats, corsets, long skirts, frills, and lace, that emphasized the hourglass figure by accentuating the waist and hips. In the 1920s, the flapper-style dress was introduced, which looked best on small-busted, narrow-waisted, thin women; to fit into the dress, some women had to wear a flattening brassiere underneath. As fashion changed over the next century, women's clothing became more revealing and necessitated a slim figure (Rollin, 1999).
Marilyn Monroe, a model and actress with womanly curves, was known for her beauty in the mid-1900s. In 1965, America was introduced to the stick-thin model Lesley Lawson, who went by the name Twiggy. Since then, the modeling industry has recruited mostly thin models, and the definition of beauty has become associated with thinness. This, combined with our newfound understanding of the science behind food, led to a society obsessed with dieting. Mass media has also played a huge role in spreading the stick-thin ideal. As a result, the weight-loss industry became a multibillion-dollar enterprise.
In 1983, the death of the singer Karen Carpenter stunned Americans and made us realize that we, as a society, were out of control with our dieting. People began to blame eating disorders on our cultural ideals of extreme thinness. Although eating disorders are certainly far more complex than an obsession with thinness, there is no doubt that our culture has had a destructive influence on the way we view our bodies. Mortality rates for eating disorders are hard to calculate because many people who die from them were never diagnosed to begin with and so cannot be accounted for. In addition, eating disorders are rarely listed on death certificates; medical complications such as heart failure, or suicide, are usually listed as the cause of death instead. In 1995, one researcher calculated the crude mortality rate of anorexics to be 5.9%, based on a weighted linear regression that combined crude mortality proportions from 42 previously published studies indicating 178 deaths in 3,006 subjects (Sullivan, 1995).
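As a rough sanity check on that figure (a back-of-the-envelope calculation, not a reproduction of Sullivan's weighted analysis), the pooled counts alone imply a crude mortality proportion of

$$\text{crude mortality} = \frac{\text{deaths}}{\text{subjects}} = \frac{178}{3006} \approx 0.059 = 5.9\%.$$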
In the 2000s, organizations such as the National Eating Disorders Association (NEDA) recognized that there was a serious problem with how our culture worships the skinny ideal and how harmful this is to our mental and physical health. NEDA seeks to change the dialogue about how we talk about our bodies. Their efforts, thus far, have contributed to the modeling industry embracing plus-size models and, in some countries, to bans on models with a BMI below 18.5.
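For reference, and assuming the standard metric definition of body mass index (which the passage above does not spell out), BMI is weight divided by height squared, so the 18.5 cutoff translates into a minimum weight for a given height:

$$\text{BMI} = \frac{\text{weight (kg)}}{\text{height (m)}^2}, \qquad \text{e.g. a hypothetical 1.75 m model would need to weigh at least } 18.5 \times 1.75^2 \approx 56.7 \text{ kg.}$$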
Unfortunately, there is an underground pro-ana/mia movement on the Internet working against organizations like NEDA. This movement seeks to promote eating disorders as a lifestyle choice rather than a disease.
It is useful to view eating disorders through the paradigm of symbolic interactionism. Quite simply, people struggle with food and body image because they attach meaning to them. For those in Biblical times, restriction of food meant greater closeness to God; today, restriction of food largely symbolizes self-control. Many people who develop eating disorders also struggle with issues of control, obsessive compulsiveness, and rigidity. By restricting their food intake, they gain an illusory sense of control over their lives. People further reinforce this concept with statements such as "I've been good today" when referring to their diet, as if following a strict diet means that you have been good. Interestingly enough, the frequency of eating disorders appears to be directly linked to rates of dieting.
Sociocultural indoctrination into thinness begins in childhood. From a young age, girls are groomed with dolls such as Barbie to illustrate that you must be thin to be successful, beautiful, loved, and worthy. The implicit message is that if you are not thin, you are ugly, you fail at life, and nobody loves you because you are not worth it. This ideal form of beauty is reminiscent of Charles Cooley's looking-glass self and George Herbert Mead's generalized other, and it is further reinforced through our interactions with each other and with the media. Hence, the reason people starve themselves to achieve these thin ideals relates back to the positive meanings that we as a society have attached to them. In other words, our distorted cultural valuation of thinness positively reinforces unhealthy eating habits.
In addition, there is a common belief in the United States that "you are what you eat." Hence, people falsely associate laziness with being overweight or obese. Cultural role-stereotyping of thin and overweight people is a huge problem in our society; weight stigma happens in schools, in the workplace, within interpersonal relationships, and elsewhere. Even in the healthcare system, medical doctors have been found to act with bias toward their overweight patients, stereotyping them as lazy, unintelligent, and non-compliant and becoming frustrated with them more easily than with their thinner patients (Puhl, 2013). It follows that the drive to be thin may also be motivated by the desire to be seen as intelligent, hardworking, and successful because of this cultural stereotype.
Thinness can also be symbolically related to one's socioeconomic status. Up until the twentieth century, well-off people demonstrated their wealth by showing off how much food they could afford to eat. Henry VIII, for example, prided himself on how overweight he was, because at the time a large body was a symbol of affluence. Now, wealth is displayed by eating smaller portions, exercising, and staying slim. In other words, there is a subtle association between one's weight and one's socioeconomic status. Although people are more likely to show off their wealth in terms of material possessions, the elites in society prefer to be thin. This may explain why the upper and middle classes have higher rates of eating disorders than the working or lower class.
Feminist theories have emerged in an effort to explain why roughly 9 out of 10 people with eating disorders are women. One theory holds that "Anorexic behavior may represent the negotiation of personal and bodily control in a social context that otherwise relatively restricts opportunities for autonomy among young women. This struggle is ultimately expressed through extreme efforts to control an individual's body, a restraint that ironically escalates out of control into disorder" (Thompson, 2004, p. 579). This makes sense insofar as women have historically been marginalized and have struggled for their independence. However, in recent decades, women's opportunities have expanded such that they have become increasingly autonomous. If this premise were correct, we would expect to see a decrease in women developing eating disorders; on the contrary, we see a significant increase. Therefore, the feminist approach fails to explain the increased prevalence of eating disorders in our society.
The reason women develop eating disorders disproportionately compared to men can be better explained by psychology and by other sociological paradigms like symbolic interactionism. Feminist theory requires you to accept that women as a whole are repressed in Western society. While there certainly are cases of women being repressed, such as instances of sexual abuse, these cases are individual and are thus better addressed by psychoanalysis. There is a whole field of research that explores the effects of sexual trauma on body image.
Psychology also postulates that a woman's attempt to androgynize her appearance by starving herself is perhaps a subconscious way to avoid the mature demands of womanhood. This, then, explains why most women who develop eating disorders show initial symptoms in adolescence: it is a stress reaction to puberty and the demands of adulthood. Anorexics and bulimics consistently report a fear of growing up, even those who are chronologically adults in their 20s and 30s.
In conclusion, self-starvation has existed for centuries, but the reasoning behind it has largely reflected the sociocultural forces of specific historical epochs. The earliest cases of self-starvation originated in religious belief, while present-day struggles with food are typically associated with the thin ideal of beauty and its symbolic meanings.
We live in a society where we are constantly brainwashed with this thin ideal. However, it would be foolish to blame the prevalence of eating disorders entirely on the media and skinny models. Eating disorders encompass a variety of sociocultural factors as well as psychological dynamics and biological determinants.
Since beauty ideals are culturally bound and changeable, we have the power to help eliminate eating disorders. Organizations like NEDA can raise awareness of eating disorders, change the dialogue about beauty, and shift the conversation about nutrition and exercise toward health and wellness instead of weight loss.
Bemporad, J. R. (1996). Self-starvation through the ages: Reflections on the pre-history of anorexia nervosa. International Journal of Eating Disorders, 19(3), 217-237.
Garner, D. M., & Garfinkel, P. E. (1997). Handbook of treatment for eating disorders (2nd ed.). New York: Guilford Press.
Pope, H. G., Hudson, J. I., & Mialet, J.-P. (1985). Bulimia in the late nineteenth century: The observations of Pierre Janet. Psychological Medicine, 15, 739-743.
Morton, R. (1694). Treatise of consumptions: Wherein the difference, nature, causes, signs, and cure of all sorts of consumptions are explained. London.
Puhl, R. (2013). Weight discrimination and bullying. Best Practice & Research Clinical Endocrinology & Metabolism, 27(2), 117-127.
Rollin, L. (1999). Twentieth-century teen culture by the decades: a reference guide. Westport, Conn.: Greenwood Press.
Silverman, J. A. (1987). Robert Whytt, 1714-1766, eighteenth century limner of anorexia nervosa and bulimia, an essay. International Journal of Eating Disorders, 6(1), 143-146.
Stryer, S. B. (2009). Anorexia. Santa Barbara, Calif.: Greenwood Press.
Sullivan, P. F. (1995). Mortality in anorexia nervosa. The American Journal of Psychiatry, 152(7), 1073-1074.
Thompson, J. K. (2004). Sociocultural aspects of eating disorders. In Handbook of eating disorders and obesity (pp. 565-581). Hoboken, NJ: John Wiley & Sons.