Here’s an idea I found interesting. The author, Bruno Latour, calls it a “plausible fiction.” (emphasis added)
The enlightened elites—they do exist—realized, after the 1990s, that the dangers summed up in the word “climate” were increasing. Until then, human relationships with the earth had been quite stable. It was possible to grab a piece of land, secure property rights over it, work it, use it, and abuse it. The land itself kept more or less quiet.
The enlightened elites soon started to pile up evidence suggesting that this state of affairs wasn’t going to last. But even once elites understood that the warning was accurate, they did not deduce from this undeniable truth that they would have to pay dearly.
Instead they drew two conclusions, both of which have now led to the election of a lord of misrule to the White House: Yes, this catastrophe needs to be paid for at a high price, but it’s the others who will pay, not us; we will continue to deny this undeniable truth.
If this plausible fiction is correct, it enables us to grasp the “deregulation” and the “dismantling of the welfare state” of the 1980s, the “climate change denial” of the 2000s, and, above all, the dizzying increase in inequality over the past forty years. All these things are part of the same phenomenon: the elites were so thoroughly enlightened that they realized there would be no future for the world and that they needed to get rid of all the burdens of solidarity as fast as possible (hence, deregulation); to construct a kind of golden fortress for the tiny percent of people who would manage to get on in life (leading us to soaring inequality); and, to hide the crass selfishness of this flight from the common world, to completely deny the existence of the threat (i.e., deny climate change). Without this plausible fiction, we can’t explain the inequality, the skepticism about climate change, or the raging deregulation.
When it comes to how much water we should drink every day, Chinese medicine teaches that we should drink when we’re thirsty. None of this eight-glasses-of-water-a-day business — a misunderstanding of a 1940s US Food and Nutrition Board recommendation that’s been widely exposed (see How much water do we need?). For those of us who’ve always believed in drinking when thirsty, there’s no longer any need to be aware of our bodily sensations. We can simply wear digitized clothes that will notify us when we need to drink.
Listen to your shirt. Smart clothing could warn its wearers when they need a drink. Xsensio, based in Switzerland, is developing textiles that look for signs of dehydration by measuring body temperature, sweat and skin conductance. Sensors also take air temperature and humidity into account. As a person becomes weary and thirsty, the shirt will send alerts reminding them to drink – useful for sporty types.
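As described, the shirt is a straightforward sensor-fusion alarm: combine a few body readings, adjust for ambient conditions, and beep. Here’s a minimal sketch of what that logic might look like. The thresholds, the scoring scheme, and every number below are my own invention for illustration — Xsensio hasn’t published its algorithm.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    body_temp_c: float       # body temperature from the textile sensor
    sweat_rate: float        # sweat sensor output, arbitrary units
    skin_conductance: float  # microsiemens; drops as skin dries out
    air_temp_c: float        # ambient air temperature
    humidity_pct: float      # relative humidity

def should_alert(r: Reading) -> bool:
    """Return True if the combined readings suggest it's time to drink.

    A simple hypothetical score: each dehydration sign adds a point,
    and hot, dry air lowers the number of points needed to trigger.
    """
    score = 0
    if r.body_temp_c > 38.0:      # elevated body temperature
        score += 1
    if r.sweat_rate > 0.8:        # sustained heavy sweating
        score += 1
    if r.skin_conductance < 2.0:  # drying skin can indicate fluid loss
        score += 1
    # Hot, dry air accelerates fluid loss, so lower the bar.
    threshold = 2 if (r.air_temp_c > 30 and r.humidity_pct < 40) else 3
    return score >= threshold
```

In other words, the “smart” part is a handful of if-statements — which is worth keeping in mind when reading the commentary that follows.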
We are told that “as a person becomes weary and thirsty, the shirt will send alerts reminding them to drink.” Isn’t that what the sensation of thirst does? Talk about pointless, redundant, and wasteful technology. For their next trick, how about a hat that reminds you to breathe?
Well, interesting you should mention breathing. Read more
A complaint one often hears about electronic medical records (EMRs) is that the doctor pays more attention to the computer than the patient during an office visit. Among nations that use EMRs, is this a characteristically American problem?
I read an illuminating letter to the editor recently that compares the doctor/patient/EMR experience in the US and Canada. The letter was from Dr. Alan B. Astrow, a hematologist/oncologist who practices in Brooklyn, NY. He writes: (emphasis added)
Many American physicians agree that recording patient data electronically has interfered with “a deeply human, partly intuitive and empathetic process,” and has led to inefficient care. Since no one wants to revive illegible paper charts, however, the indictment encourages us to ascribe these harms to the price of progress.
A Canadian physician friend, though, says he uses an electronic record that does not disturb his rapport with patients. He also sees more patients hourly than American counterparts without compromising quality.
Why the difference? American physicians must choose from five levels of service when submitting bills. Of necessity, we tend to include extraneous information to justify higher levels and satisfy potential insurance company audits. Canada has only two levels, so doctors’ notes are short and succinct. Read more
[I don’t seem to be able to display that image anymore, but here’s a link to what I’m talking about.]
This superb graphic was created to dramatize what’s happening these days in the UK, where the National Health Service is being ruthlessly privatized. Here in the US, for-profit medicine is so taken for granted that we barely notice it. It’s true, we hear a good deal about conflicts of interest involving pharmaceuticals. Doctors get paid — in one way or another — to increase the profits of Big Pharma, a practice that is detrimental to the financial and/or medical interests of their patients. We hear less about scaring healthy patients into using doctors and services that increase hospital profits (also known as fear mongering). So it was nice to see a recent opinion piece in JAMA that discussed precisely this. Read more
The final post in this series on interrogating inequality is about another possible clue as to why we no longer seem to care about inequality. It’s from the book Excellent Sheep by William Deresiewicz.
The role of elite institutions of higher education
In his book, Deresiewicz argues that elite educational institutions reproduce a class system, exacerbate inequality, retard social mobility, and perpetuate privilege. Not only is the elite class that’s created by these institutions “isolated from the society that it’s supposed to lead”; it also runs society for its own exclusive benefit. (emphasis added)
Our educational system, it’s been suggested, is what America developed in lieu of a European-style social welfare state to mitigate inequality. Instead of “handouts,” opportunity. And once upon a time, it worked as advertised. Both the unprecedented expansion of public higher education and the equally unprecedented opening of access to the private sort were instrumental in creating a mass middle class, and a new upper and upper middle class, in the decades after World War II. But now instead of fighting inequality, the system has been captured by it.
I mention this not simply as another possible clue, but because the article (Rebooting Social Science) that prompted me to write this series of posts appeared in Harvard Magazine. That may or may not be relevant to the attitude it expresses towards inequality, an attitude I found troubling. Read more
Continuing my discussion of interrogating inequality, here is another post with a possible clue as to how we came not to care. This one considers a rather wide expanse of history.
We have neglected to cultivate a culture that cares
I recently struggled through the book Governmentality: Power and Rule in Modern Society by Mitchell M. Dean. It’s very clearly written — the publisher calls it “exceptionally clear and lucid,” and it is. It’s intended, however, for experts already familiar with Foucault’s writings and lectures, particularly those on governmentality.
I frequently found myself in a fog, but I persisted. I was hoping to find ideas that would explain the changes that produced the contemporary self, including why we have become a society that fails to care about increasing inequality. And I did find a brief reference to this development in a section where Dean asks: “Where do our notions of ‘care’ come from?” Why do we think the state should care for the welfare of its citizens? Read more
How did we become a society that passively accepts the injustice and discrimination inherent in inequality? How did we come not to care? It would undoubtedly take me a very long time to adequately address that question, but in this and the next two posts I offer a few small clues.
We are each the stars of our own lives
First up is Pierre Rosanvallon’s recent book The Society of Equals. In a review of the book, Paul Starr mentions what may be an impediment to a society of equals: We see ourselves not simply as individuals, but as unique singularities. (emphasis added in this and the following quotations)
The story that Rosanvallon tells here is that as new forms of knowledge and economic relations have emerged, people have come to think of their situation in less collective ways. Since the 1980s, he writes, capitalism has put “a new emphasis on the creative abilities of individuals,” and jobs increasingly demand that workers invest their personalities in their work. No longer assured of being able to stay at one company, employees have to develop their distinctive qualities—their “brand”—so as to be able to move nimbly from one position to another.
As a result of both cognitive and social change, “everyone implicitly claims the right to be considered a star, an expert, or an artist, that is, to see his or her ideas and judgments taken into account and recognized as valuable.” The demand to be treated as singular does not come just from celebrities. On Facebook and many other online sites millions are saying: here are my opinions, my music, my photos. The yearning for distinction has become democratized.
Rosanvallon does not criticize the society of singularities, with its “right to be considered a star.” Since it’s now a fact of life, we need to figure out how to deal with it. Read more
In a previous post (Interrogating inequality: An annoying article) I discussed an article about a group of interdisciplinary scholars who were “interrogating” the societal consequences of increasing inequality. While the group included individuals with backgrounds in psychology and history, it was dominated by academic scholars who specialized in economics, business, and public policy. (The first three individuals quoted in the article are a professor of business administration, a professor of management practice, and a senior lecturer at Harvard Business School.)
The concluding comments on inequality were offered by a professor of social policy. This particular individual “recently revealed” that he had given up on his long-term research on the social effects of inequality (a project he’d started in the 1960s) because there were no “convincing conclusions.” In other words, research had not been able to provide statistical proof that inequality is in any way harmful to society as a whole. As one of the social scientists put it: (emphasis in original)
The problem is, there is no consensus in the research on the consequences of inequality.
May I suggest that a more significant problem is that social scientists ask the wrong question. As Tony Judt writes (emphasis added): Read more
The previous post, this post, and the next four were provoked by an article that made two assertions I found troubling: one, that there is no consensus among researchers on the consequences of inequality, and two, that evidence of a “causal relationship” between income inequality and health is unclear. In the last post, I discussed those assertions and quoted Daniel Goldberg on whether health behaviors determine health. To continue …
Ground control, we have causation
Over the past few months, since I first read that annoying article, I keep coming across accounts that offer evidence of the harms that result from inequality (particularly in childhood), as well as actions that doctors and politicians are willing to take to address the problem. We’ve known for some time that there was a correlation between poverty and health. Now we’re finally discovering the mechanisms, the causation. Read more
I recently read an article that really annoyed me. It was called “Rebooting Social Science: The interdisciplinary Tobin Project addresses real-world problems.” I began to realize that I wouldn’t see eye to eye with this article when I got to the section that discussed the “real-world problem” of inequality. The section was titled “Interrogating Inequality.” Not “addressing” inequality. Interrogating. Shades of “doubt is our product,” as I’ll explain.
One of the scholars interviewed for this article characterized inequality as “the most contested of contemporary issues.” The evidence cited for said contestation was the lack of agreement on whether inequality contributed to the recent financial crisis. Some claim that it did. Others, however,
dismiss this argument, viewing rising inequality “as little more than a hiccup” or even celebrating it as “a favorable development … in the progress of American capitalism.”
As it turns out, the real issue being “contested” by these “social scientists” (economists, not sociologists) is not whether inequality exists or whether it’s just a hiccup or an inevitability of capitalism. No. (emphasis in the original)
The problem is, there is no consensus in the research on the consequences of inequality.
No consequences? What about childhood trauma, increased rates of disease, shorter lifespans, human dignity? Well, it turns out those things may affect individuals, but what these researchers are looking for are societal consequences. For example, is there a relationship between inequality and economic growth? Evidently, if we cannot detect a decrease in economic growth, there’s no reason to alleviate inequality. And it seems social scientists disagree among themselves about the quality of the evidence on that issue. Read more
Jill Lepore has an article in a recent New Yorker called The Disruption Machine: What the gospel of innovation gets wrong. Her target is Clayton M. Christensen’s book The Innovator’s Dilemma and, specifically, disruptive innovation. As usual with Lepore, her essay is personable and well-argued. What I liked most about it, though, was its brief discussion of how unfortunate it is that professions such as higher education and medicine are being privatized (if they’re not already) and administered to maximize efficiency, making profits more important than students or patients. (emphasis added) Read more
Animal species are going extinct at a rate thousands of times faster than was the case before there were humans. And this is a conservative estimate.
At least half the tortoises and turtles, a third of the amphibians, a quarter of the mammals, and an eighth of the birds on this planet face a risk of extinction in the near future. What’s worse, these numbers apply only to the small fraction of known species whose conservation status has actually been assessed. The overall picture is likely to be much worse.
It’s not just climate change. It’s our way of life.
It’s not just climate change that accounts for the increased rate of species extinction. (emphasis added in the following quotations)
The general tendency of our species—a tendency that seems to be intensifying all the time—is to decrease biological diversity on this planet. We do so by destroying habitats, overconsuming natural resources, and spreading invasive species, willingly or not. It’s tempting to say that this is the cost of consciousness. We like to imagine that cultural diversity is an adequate substitute for biological diversity—for ourselves, if not for other species. It isn’t.
In a recent essay on climate change, Zadie Smith touches on matters not usually mentioned in connection with this topic. “What’s missing from the account,” she says, “is how much of our reaction is emotional.”
Smith is the mother of two young children. She imagines how, in the year 2050, she would explain to a hypothetical granddaughter why previous generations failed to act. (emphasis added)
I don’t expect she will forgive me, but it might be useful for her to get a glimpse into the mindset, if only for the purposes of comprehension. What shall I tell her? Her teachers will already have explained that what was happening to the weather, in 2014, was an inconvenient truth, financially, politically—but that’s perfectly obvious, even now. A global movement of the people might have forced it onto the political agenda, no matter the cost. What she will want to know is why this movement took so long to materialize. So I might say to her, look: the thing you have to appreciate is that we’d just been through a century of relativism and deconstruction, in which we were informed that most of our fondest-held principles were either uncertain or simple wishful thinking, and in many areas of our lives we had already been asked to accept that nothing is essential and everything changes—and this had taken the fight out of us somewhat.
Chris Hayes sometimes gets dismissed as just another commentator on a failing liberal TV network, but I found his book Twilight of the Elites a perceptive, well-written account of how American meritocracy perpetuates inequality.
I especially liked this passage from the book:
Why, one might ask, in an economy in which 49 million Americans are poor and the median household income hovers around $51,000, should we care about the psychic plight of 23-year-olds making $90,000? Because these are the people who run our country, and the process by which their own empathetic faculties are destroyed is a key part of how this entire corrupt finance-state is maintained.
The occasion for the rambling reflections on neoliberalism in the previous post was three “perspective” articles on tobacco in a recent issue of The New England Journal of Medicine. Two of them concern the FDA’s attempt to place graphic warnings on cigarette packs. The other is on cigarette smoking among the homeless.
The First Amendment
Placing graphic warnings on cigarette packs was part of the 2009 Family Smoking Prevention and Tobacco Control Act. The tobacco industry sued the FDA (R.J. Reynolds Tobacco Co. v. FDA), claiming the warnings violated the industry’s First Amendment rights. In a case decided last year, the tobacco industry won.
David Orentlicher, in his article The FDA’s Graphic Tobacco Warnings and the First Amendment, writes that the decision is both surprising and not surprising. It’s not surprising “given the Supreme Court’s increased sympathy toward corporations and their First Amendment rights. Regulations of commercial speech often succumb to judicial scrutiny.” It’s surprising because, while the Supreme Court now restricts the government’s power to regulate corporate speech, it has not in the past interfered with the government’s authority when it comes to regulating matters of public health. Evidently, that’s not the case anymore.
The upshot: (emphasis added)
[C]ompanies today are better able to promote their products, and government is less able to promote health than was the case in the past. Ironically, early protection of commercial speech rested in large part on the need to serve consumers’ welfare. In 1976, for example, the Supreme Court struck down a Virginia law that prevented pharmacists from advertising their prices for prescription drugs. The law especially hurt persons of limited means, who were not able to shop around and therefore might not be able to afford their medicines. Today, by contrast, courts are using the First Amendment to the detriment of consumers’ welfare, by invalidating laws that would protect the public health.
This post became much too long, so I’ve divided it into two parts. The first part is mainly about neoliberalism; the second mainly about graphic warnings on cigarette packs (plus smoking among the homeless). When I read, in a recent NEJM article, “The Supreme Court’s increasing sympathy for corporate speech and decreasing deference to public health authorities makes it more difficult for government to protect the public’s health,” my first thought was: What a perfect example of neoliberalism in action.
No one would claim that neoliberalism strives for consistency when implementing its ideals. For example, neoliberalism blames individuals for the health consequences of cigarette smoking (“I cause disease”) and at the same time opposes legislation to reduce cigarette consumption (graphic warnings on cigarette packs). When there is a choice to be made, the deciding factor for neoliberalism will be the efficiency with which wealth can be upwardly redistributed.
Personal responsibility — including personal responsibility for health — is a fundamental principle of neoliberalism. David Harvey writes on this in the context of neoliberalism and labor: (emphasis added in this and subsequent quotations from Harvey)
[L]abour control and maintenance of a high rate of labour exploitation have been central to neoliberalization all along. The restoration or formation of [elite] class power occurs, as always, at the expense of labour.
It is precisely in such a context of diminished personal resources derived from the job market that the neoliberal determination to transfer all responsibility for well-being back to the individual has doubly deleterious effects. As the state withdraws from welfare provision and diminishes its role in arenas such as health care, public education, and social services, which were once so fundamental to embedded liberalism, it leaves larger and larger segments of the population exposed to impoverishment. The social safety net is reduced to a bare minimum in favour of a system that emphasizes personal responsibility. Personal failure is generally attributed to personal failings, and the victim is all too often blamed.
Personal responsibility for health — fundamental to healthism (a frequent topic on this blog) — serves the interests of neoliberalism in a number of ways. It can be used to justify reduced spending on health care and social services by the state. This is desirable in itself, according to neoliberals, but it also increases consumer spending on health care, which in turn benefits the health care, pharmaceutical, and insurance industries. Read more
What is a general health checkup? It’s when you visit a doctor not because of an ongoing chronic condition or because you’re concerned about new, unexplained physical or mental symptoms, but because you want a general evaluation of your health. The assumption behind such a visit is that if you do this regularly, you may prevent a future illness.
A recent issue of JAMA had two articles on general health checkups. One of them asked the question: What are the benefits and harms of general health checks for adult populations? It summarized a 2012 Cochrane review that addresses this question (it was written by three of the four authors of that review). The review concluded that health checkups were not associated with fewer deaths (reduced mortality), either from all causes or from cancer or cardiovascular disease in particular. Health checkups were associated with more diagnoses, more drug treatments, and possible (but probably infrequent) harm from unnecessary testing, treatment, and labeling. Read more
Rick Santorum, responding to Obama’s statement that “the middle class in America has really taken it on the chin,” said that he would never, ever, stoop to using the word “class.” (Dorothy Wickenden in The New Yorker)
Sociologist Annette Lareau has done extensive field work that involves unobtrusively inserting herself (or her field-worker assistants) into the homes and daily lives of families (treat us like “the family dog,” she recommends). Her observations have led her to identify a difference in the parenting styles of families from different social classes. Middle-class families practice what she calls concerted cultivation: parents teach their children skills that prepare them to engage successfully with the social institutions of adult, middle-class life. Working-class families value natural growth: parents give their children a great deal of unstructured time in which they must use their own creativity to plan and execute their activities.
Lareau’s work is described in her book Unequal Childhoods: Class, Race, and Family Life. Originally published in 2003, it was updated for a 2011 edition. It’s a wonderful book. I think of it whenever people argue – as they frequently do in the US – that America is the land of equal opportunity, therefore those who fail to exert themselves sufficiently have only themselves to blame.
I’d like to cite two stories from Lareau’s book that relate to health care. Read more
Attending to the social determinants of health is especially important for children, since children’s experiences – of poverty, poor nutrition, trauma, abuse, neglect, the prenatal environment – can affect physical and mental health for an entire lifetime. As the authors of a recent commentary in JAMA write: “Pediatrics … continues to evolve clinical practice aimed at addressing social determinants because of children’s exquisite vulnerability to the deleterious effects of the social and physical environment, especially the aggregation of social factors associated with poverty.”
The occasion for the commentary – titled Addressing the Social Determinants of Health Within the Patient-Centered Medical Home: Lessons From Pediatrics — is the imminent implementation of the Affordable Care Act. The medical home (also known as the patient-centered medical home) is a concept that originated in pediatrics. The basic idea is that when a team of providers — physicians, nurses, nutritionists, pharmacists, social workers – work together, they can best meet the needs of patients. The Affordable Care Act has several provisions designed to establish and promote medical homes, and the authors of this commentary (two pediatricians and a family medicine practitioner) ask: What has pediatrics learned about addressing social determinants that can be translated to medical homes for adults? Read more
I’ve started another blog called Basic research on the self. My intention is to write there about the social and cultural history of the self, aided by insights from sociology, anthropology, philosophy and psychology (especially critical psychology). This is a subject that relates to a number of topics I’ve written about here.
A while back I grouped together my interest in psychopharmaceuticals, cosmetic surgery, happiness/positive psychology, and self-help and labeled these topics “psychological and physical conformity.” When I’ve written about these subjects, I’ve talked about the way things are today. In my new blog, I’d like to step back and ask: How did the society I live in end up valuing self-actualization, self-improvement, and maximized happiness – as well as an impossibly ideal notion of physical appearance — above all else?
That question also relates to a number of my other interests here — healthism, the social determinants of health, inequality, neoliberalism. It’s much easier to convince people they’re personally responsible for their health and well-being (including their socioeconomic status) if they’ve already developed a self-concept based on the ideology of the self-contained, autonomous individual. Read more
This feels encouraging: Two Viewpoint articles in a recent issue of JAMA (The Journal of the American Medical Association) on improving population health (both behind a paywall, unfortunately).
What is population health? Apparently it depends on who you ask. If you ask those with a financial stake in the health care delivery system, population health means improving the health of patients who currently use (i.e., pay for) the system. You get a different answer if you ask those involved in public health, community development, or social services. They believe “population” should include everyone in the entire geographic community, whether or not those individuals are able to use or benefit from health care services. They also believe “health” should include quality of life and economic well-being – measures that prevent disease in the first place – and not just conditions addressed by the medical model of disease.
What I especially liked about Stephen Shortell’s article – Bridging the Divide between Health and Health Care – was its economic realism. I dearly wish that those with a financial interest in the health care industry, as well as politicians who control health policy, would acknowledge that the way to improve health is to address its social determinants. But trying to change the hearts and minds of stakeholders is like pushing against the tide. Read more
Continued from the previous post, where I noted that the Lalonde report — despite its good intentions — was followed by an emphasis on healthy lifestyles and personal responsibility for health, as well as increased health care costs.
Personal responsibility and social class
In Why Are Some People Healthy and Others Not?, Marmor et al, writing in 1994, were disappointed that the Lalonde report had not effectively prompted governments to address the underlying causes of health and disease. One reason for this, they believed, was that health policy reflects public opinion. If the public holds traditional views on what makes us sick (pathogens), what prevents disease (medical care), and what we can do to be healthy (take personal responsibility), new policies that include social determinants are unlikely. Those at the forefront of professional, scientific opinion may very well understand the importance of social determinants, but public opinion changes slowly. Without an education program, such as the relatively successful anti-smoking campaign, the public is unlikely to endorse change.
This is certainly true, although I believe there’s also something more fundamental at work here, namely, how a society accounts for the different life outcomes of its citizens. In Unequal Childhoods: Class, Race, and Family Life, Annette Lareau describes the assumptions people make when they hold others personally responsible for their life circumstances. Read more
Continued from the previous post, where I discussed the expansion of universal health care prior to the 1970s, how this created a growing demand for health care, and the problem health care costs posed for governments, especially when the economy suffered a downturn in the seventies. One response to the situation was to consider new ideas. Rather than limit strategies to what could be done by the health care industry, why not directly address the underlying causes of disease by considering the social determinants of health?
One such idea came from Canada: the 1974 Lalonde report, which has been described as:
[the] first modern government document in the Western world to acknowledge that our emphasis upon a biomedical health care system is wrong, and that we need to look beyond the traditional health care (sick care) system if we wish to improve the health of the public.
The US Congress emulated this thinking in 1976 by creating the Office of Prevention and Health Promotion. The US Department of Health, Education, and Welfare began publishing the document Healthy People: The Surgeon General’s Report on Health Promotion and Disease Prevention in 1979. The response in European countries — caught in the same bind of greater demand, increasing costs, and the financial consequences of a deteriorating economic landscape – was similar.
The common thread in these new perspectives on health was the assertion that health could be improved — without increasing health care costs — if we concentrated on such things as the work environment (occupational health), the physical environment (air and water pollution, pesticides and other carcinogens in food), genetics, and healthy lifestyles. The approach was broad: the environment was considered at least as important as the promotion of healthy lifestyles. Read more
In the 1970s, public health policies began to promote the idea that individuals are responsible for their health and therefore have an obligation to adopt healthy lifestyles. Over the ensuing decades, health became both an extremely popular topic for media coverage and a lucrative market for vendors of health-related products and services. What followed was a substantial increase in health consciousness and greater anxiety about all things that concern the body.
Do healthy lifestyles actually produce better health? That they should may seem like common sense, which is one reason it’s been so easy to promote the idea that they do. The question is difficult to answer with absolute certainty, however. For one thing, the behavior that counts towards a healthy lifestyle does not readily lend itself to the objective measurements required for reliable scientific evidence. Defining health is also tricky. Lifespan is often used to compare the ‘health’ of different nations, but this fails to capture the subjective sense of health that is meaningful to individuals. Perhaps most important, while in theory a healthy lifestyle might improve health, that does little good if – as is now obvious – it’s extremely difficult to maintain behaviors that require things like changing what we eat and how often we exercise.
A related question would be: Did the promotion of healthy lifestyles reduce health care costs? This too seems like a sensible assumption, and the assertion is quite popular, especially among politicians. Health care costs have increased to hand-wringing levels. Promoting healthy lifestyles costs governments next to nothing, while the cost of health care is all too easily quantified. Read more
Why is it so hard to convince policy makers worldwide to address the social determinants of health, including poverty, hunger, and income inequality? Judging by the excerpt below, we shouldn’t count on the US to champion this cause any time soon. It’s from a document called “The Future We Want,” issued by the Rio+20 conference last June. The US requested changes to the document, indicated in bold (additions) and strike-outs (deletions). Read more
Corporate medicine may achieve its goal of creating greater customer retention, loyalty, and repeat business. Patients are not well served, however, when the commercialized, privatized business model is applied to health care. The result is superficially satisfied patients who make greater use of the health care system at the expense of their own health. Read more
I was initially attracted to the subject of healthism because I felt I’d been a victim of health messaging. But I was also attracted by a sense that something deeper was going on. I now see that the taken-for-granted – the questions that don’t get asked in media coverage of health issues or in the policy positions of governments — unites my blogging topics. In whose interest is neoliberalism? Medicalization? Conformity? Non-holistic medicine? The commercialization of health? Healthism? More often than not the answer is that it’s not in my interest. Nor is it in the interests of the society I want to live in. And that makes these topics personally meaningful to me. Read more
~ Conformity and corporatism: Surgically altering one’s appearance (e.g., designer feet) presumably increases one’s chance of success in a society that commodifies bodies (i.e., in a society where salary, career advancement, social status and marriage prospects are influenced by appearance). Altering one’s personality with psychopharmaceuticals allows one to project the qualities necessary for success in a highly competitive society. Read more
Social determinants of health (often abbreviated SDOH) refers to unequally distributed social and economic conditions that correlate with unequal and inequitable distributions of health and disease. Presumably there is a causal relationship between the two, not merely a correlation. Definitively identifying the causal mechanisms, however, is difficult. A great many things influence our health, including things we’re not even aware of yet, and it’s difficult to isolate and scientifically study the ones we can identify. Read more