Why cats make the worst patients (and the dog ate my homework)

Charlie stopping to smell the flowers in healthier times

Charlie, one of the cats, was seriously ill and Lyme Disease (which was the designated subject for this post) went clear out of my head. It shall return. Meanwhile, I’ve been nursing Charlie, aka Houdini cat (who will literally disappear into the towel you think you’ve wrapped around him securely), reminding myself that nursing is a noble, noble profession. (That’s what you call professions that are bloody hard and nobody appreciates.) I’ll say one thing, taking a cat to the animal hospital does give one a quick lesson in the perils of for-profit medicine (my Visa may never recover) – especially in our risk-obsessed age where tests and scans trump individual history, personality and symptoms (human or animal). It also reminded me that one must be vigilant when faced with the ponderousness of Expertise.

In Charlie’s case it began with a neurological condition called Horner’s syndrome, an irritation of a nerve that runs down one side of the face, past the eye, down the neck and into the chest – not a disease but a symptom. Naturally Expertise immediately rushed to the worst possible diagnosis: lymphoma, or, in a pinch, brain tumour. (Do not pass ‘go’, just head for the hills.) I mildly posited inflammation or infection, probably ear related, particularly since Charlie’s had those before. But noooo.

Critical Care, human or animal, is rife with Expertise: grave, gravel toned and confident. Why? Because they have tech toys, that’s why. Cool devices and imaging technologies that purport to explain the mysteries of life. Even (ha ha) a cat scan. All of which push the patient into ever higher levels of care – because they can. Problem is, the patient often can’t.

I tried to hold my ground but it’s a slippery slope, that one; the surer they are the more one caves, especially when they start to say, well, with cats an elevated white blood cell count could mean X. I mean, what do I know from cat physiology?

So the cash register ka-chinged and Charlie looked steadily worse. Of course nobody looks good in ICU, between the ugly fluorescent light and the tubes, but there’s something especially pathetic about a small furry creature sitting in a cage. And Charlie, well, that cat could have taught Stanislavsky a thing or two about looking sad.

I kept getting calls to tell me things I already knew (he has a heart murmur). The last time I snapped, “I know. I have one too. Big deal.” That didn’t, naturally, stop them from getting a cardiology consult. Bear in mind that cats don’t hold still for much of this, so they need to be anaesthetized.

Finally, after every possible dire diagnosis had been ruled out, we came round to my original hypothesis: ear infection.

Don’t get me wrong. I have enormous respect for veterinary physicians. They study long and hard (far longer than human doctors) and by and large they are great. They deal with a diverse patient population that’s uncooperative and uncommunicative. And when I say diverse I’m talking species. And they need to make a living, I get that.

What they, and most of us, do not get, however, is that they are part of the culture at large, and the culture at large is obsessed with the “science” of medicine, leaving the art further and further behind. Watching Charlie work his way through the system reminded me of just how much medical focus has shifted away from the patient and towards disease and technology; towards what tend to be called “objective” results (versus the messy subjective ones patients bring).

I see this on a human level every time I go to the retinologist with my mother (that, by the bye, is a sub-specialty of ophthalmology). First, they get her to read the letters on the chart and are all impressed at how well she sees. Then they take their pictures and look grave: how could she possibly see that well with those terrible ridges in her retina? (To me they just look like the Alps.) Then they look puzzled. Scan says you can’t but you actually did see. What a gonzo dilemma. So, they go with the scan and give her the medication. Objective trumps subjective.

Question is, should it? Does it make sense for the patient to get lost in this morass of ostensibly objective ‘data’?

Not to my way of thinking. “Normal” – blood pressure, lipid level, whatever – is a best-guess average based on population statistics and what some committee has deemed appropriate. If you’re truly sick it shows up. C-reactive protein in the clouds – well, objective and subjective tend to match. Your joints hurt, you have some kind of inflammatory condition and the test backs you up. It’s that grey zone that’s problematic. Levels fluctuate in every individual, and tests can be wrong (some more than others) – error rates in some tests run as high as 75%. But we forget that.
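The grey zone is where the arithmetic really bites. Here’s a back-of-the-envelope sketch – the numbers are hypothetical, not from any particular test – of why even a reasonably accurate test misleads when most of the people being tested aren’t actually sick:

```python
# Toy illustration with invented numbers: even a test with 95% sensitivity
# and 95% specificity yields mostly false positives when the condition is
# rare in the population being tested.

def positive_predictive_value(prevalence, sensitivity, specificity):
    """Probability that a positive result reflects real disease."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# Screen a population where only 1% actually have the condition:
ppv = positive_predictive_value(prevalence=0.01, sensitivity=0.95, specificity=0.95)
print(f"Chance a positive result is real: {ppv:.0%}")  # roughly 16%
```

At a 1% prevalence, roughly five positives out of six are false alarms – which is precisely the trap of ordering the test before taking the history.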

So, cat or human we are lumped in with the many-too-many – and our individual narrative gets lost. In Charlie’s case nobody believed this pretty little cat who had only been ailing for a week could possibly “just” have a madly inflamed ear affecting his balance and appetite. An infection is no picnic. But it’s not a brain tumour. And of course Charlie’s Oscar-winning ability to look mournful didn’t help. This cat can look sad when he feels ignored; imagine how dreadful he looked when he was dizzy and queasy. It’s a gift. But it’s not diagnostic.

You need a proper history; the back story. The person with the disease is as important as the disease, said Hippocrates. Let’s say you end up in hospital with severe abdominal pain. It matters whether you’ve had this pain before, but less intense or of shorter duration. Sudden abdominal pain could be many dire things; a worsening of an existing problem is probably nothing that will kill you (otherwise you wouldn’t be in the ER in the first place). The clinical picture changes with the history. Someone has to factor it in.

Charlie’s doing better now. As for the rest of us – who knows. We may never survive the tech age.

Summer Reruns

Everyone may stream their entertainment on their teeny tiny phones but that’s just the tech; without a good story summer still means reruns.

So in the fine old tradition of reruns, I give you  a recap of the summertime blues.

(Coming soon, Lyme Disease. Stay tuned.)

Is there an epidemiologist in the house? *

There’s an appalling advert for one of the adjunct health unions/associations/whatever where someone collapses in a restaurant and the doctor starts to call for various “technologists” (x-ray, CT scan, etc.). I don’t doubt their word that health care today is complex; still, call me crazy, but if I collapsed somewhere I’d really rather have an actual health professional – a clinician, by which I mean a nurse or physician – at my side than someone who knows how to operate an ultrasound machine, thank you very much. Remember, machines don’t “know” whether something is a concern or not – it’s people, actual humans, who make that determination.

A determination that all too often relies on over-optimistic beliefs regarding the accuracy of “tests”.

(But as neurologist and author Oliver Sacks once sadly remarked: They don’t give Nobel prizes to clinicians, only medical researchers.)

Worse, everything from the images and data we get from those machines, not to mention the health information that’s flung at us from all sides, is based on statistics. More accurately, a statistical approximation of “normal”. The normal person, whatever or whomever that may be. Another term, whether one is being statistically accurate or not, is “average”.

Now I don’t know about you but I’ve never met an “average” person. Everyone I know is distinctive, sometimes eccentric, often interesting, funny and, well, different. People are a jumble of ethnicities and backgrounds, socio-economic and otherwise; their education and passions and hobbies and interests differ, as does everything from their diets to their bad habits. Er, risky behaviours to the epidemiologists in the house.

So here we all are, contorting ourselves into bizarre shapes trying to fit into the statistical moulds they’d have us fit, from the not-so-benign lipid levels and blood pressure (for which drugs are available should one not conform to aforesaid norm) to clothing sizes and availability in everything from lipstick colours to food. Oh, yeah, tell me you haven’t noticed that your favorite kind of frozen chips appeals to you and six other people so it’s been discontinued.

From supplements to ideal weight, glucose and you-name-it, normal follows us around like some malevolent mosquito, buzzing in our ear and biting us in the you-know-what when we try to ignore it. Whether it’s Dr. Google or the news items on everything from your phone to your TV.

But when we’re feeling off, or sick, or have had something bad happen, what we want and need is a clinician: someone who knows how to set that broken bone, do that tracheotomy or CPR, and knows just how much morphine to prescribe so we’ll keep breathing. Unfortunately, the spate of bad health news out there makes us all so nervous that all too often when we do end up at the clinic or the ER we’ve got nothing more terrifying than bronchitis or a particularly bad bout of cystitis. Not for us former generations’ stoicism; we race over ‘just in case’ for everything from a sore knee to a cough.

An American chap once disparagingly told me Canadian health care was simply dreadful. How did he know? Well, when he lived in Montreal he had a bad cold. One assumes in winter when people get those in cold climates. Then, late Saturday night he decides to head over to the Emerg because his cough was worrying him. Could he breathe? Yes. Was he running a temp? No. But he went anyway. And couldn’t figure out why the ER staff didn’t rush him to have tests and x-rays. Ah, d’you think you could have picked a less busy time? Of course not.

Where’s an epidemiologist when you need one?

* Not my line, though I wish it were. I read it in Gordon Clark’s column in The Vancouver Province on July 8. Laughed out loud as a matter of fact.

They got stones, I’ll give you that

I was going to call this post “nobody knows the trouble I seen” except that it seems ludicrously self indulgent to whine that one has been living through construction hell when the rest of the world has revolutions, civil wars, hurricanes and so on to contend with. (But, to paraphrase Will Rogers, everything is manageable provided it’s happening to someone else.) This isn’t to say my curmudgeonly instincts have been dormant. One particular item a while back had me seething.

“Sugary drinks are not so sweet” was the headline in the Health section of the Globe and Mail (24 May 2013). Apparently, drinking a sugar-sweetened drink a day not only rots your teeth and adds up to empty calories (with the added bonus that it makes New York’s Mayor Bloomberg crazy) but “may increase the risk of kidney stones”. Gasp. I had to pause to take a sip of my ginger ale* while that sank in.

I puzzled and I puzzled, to reference one of my favorite curmudgeons, the Grinch. Didn’t make sense. How on earth could fructose cause blobs of crystallized minerals to form in the kidney? (To reinforce the point that sugary drinks are Evil the accompanying photo was of a surgeon with a scalpel. Someone had fun with that.)

The research cited was from 2007, published in a journal called Kidney International (2008, 73; 207-212). The worthiest journal nobody’s ever heard of. My curiosity got the better of me and I downloaded the article and read through the cringe-inducing prospective study; yet another data-mining expedition hoping to find a “link” between X and Y. (For more on my distaste for the term, see post.) The data? From – wait for it – the appalling Nurses’ Health Study, formerly used to “prove” that taking estrogen was just a boffo idea. Here, the research cites some 19,000+ women along with some 46,000 men from the Health Professionals Follow-up Study. Impressive numbers. Pity the hypothesis is so feeble.

Not of course to our heroes, researchers Taylor and Curhan, of a renal division at Brigham and Women’s and Harvard, who engage in enough statistical jiggery-pokery to make the world go round. (Pity nobody blinks when data gets tortured.)

Just a few problems here. As I explained, in often far too exhaustive detail in The Estrogen Errors, extrapolating to the general population from the Nurses’ Study is massively problematic. For starters, there’s its basic design, biennial self-reports, which are notoriously unreliable. We’re all prone to error as any gibbon with half a brain knows: we forget, lie and generally get things wrong. Good grief, most of us stutter when they ask us how much we weigh when we get a new driver’s license. Plus, there’s the healthy user bias – people who respond to any questionnaire tend to be richer, smarter, better off, i.e., healthier than the average bear. Er, person. Often they’re white and frequently younger. All of which means they are not like the real at-risk population which, by and large, tends to be poorer, less educated, older, more diverse, less health and diet conscious, more stressed and sicker. Face it, d’you think you’d have time to sit around reading some blog if you had to work at two or more minimum wage jobs just to put food on the table and pay your rent? Could you even afford an iPad or even high speed internet?

This is on top of the fact that professionals in general can’t stand in for “everyone”, and basing one’s conclusions on what these people do (or say) is what’s popularly known as being spectacularly wrong.

What really interested me, though, was what these researchers thought might be going on physiologically. In other words, how did we get from basic sugar metabolism to blobs of crystallized minerals in the kidney? Gremlins? Evil spirits? The authors do obligingly admit that the underlying mechanisms are “unknown” (ah, ya think?!) but postulate various processes, none of which make sense. Hence their masterful use of language:

“Fructose may also increase urinary excretion of oxalate, an important risk factor for calcium oxalate nephrolithiasis. Carbohydrates, along with amino acids, provide the majority of the carbon for glyoxylate and oxalate synthesis, and fructose may be an important dietary sugar influencing the production of oxalate.” (emphasis mine)

The authors concede backing for their hypotheses is “sparse”; personally I would have said nonexistent. Rats make up the bulk of their research subjects in this section, and the one study they cite using humans consists of eleven – yes, 11 – men whose pee was analyzed for calcium loss (versus calcium intake). Fructose intake made no difference in the calcium these men excreted but the researchers still concluded that the reason fructose-laden drinks caused kidney stones “may be related to the effect of fructose intake on urine composition”. How they concluded this I have no idea. Maybe they were on a sugar high.

The only marginally plausible explanation had to do with uric acid metabolism and for a moment I thought, OK, this might make sense. Then I checked the reference and realized it only applied to people with gout, whose uric acid metabolism is already dysfunctional (that being the definition of gout). 

Kidney stones, by the bye, are hardly that common and rarely if ever life threatening. Even Wikipedia’s overblown, hyperventilating piece on the topic, that sounds as though it was written by a nephrologist who had just passed one, admits that the incidence or number of new cases a year is “0.5%”.  (Of course it doesn’t specify 0.5% of what which is rather an important point, but let’s not nitpick at this late point in the post.)

How did this 5-year-old study even make it into the health news section? Having spent some years as a medical writer and journalist, I can tell you exactly how. A group of people in an editorial meeting, drinking coffee – or pop – were bouncing around story ideas and someone suggested a piece vilifying soft drinks, currently Public Enemy No. 1 (see NYC, Bloomberg). So, they wrote the headline, then they contacted the hapless writer, who cast about for some new and nifty problem that could be blamed on aforesaid sugary drinks. Everyone knows full well that the majority of people only read the headline and the first paragraph; it’s only mutants such as myself who check the original research and parse the methods section.

If sugary drinks do give you kidney stones, these people didn’t prove it.

There are a lot of good reasons to consider soft drinks a treat, not a staple. They’re empty calories; they rot your teeth and many of them contain fairly high amounts of caffeine which can make you nervy and insomniac. But kidney stones? Really?! We think not. And it takes stones to say they do.




* oh get over it. It’s summer. There’s construction outside. Yes, I have the occasional ginger ale or Coke. Sometimes, when I’m especially cranky, two days in a row. Sue me.  

‘ear ‘ear (better yet, turn it down)

The Oscars were last night and no I didn’t watch. To paraphrase Ogden Nash, my interest in the subject would have to grow to be even cursory. (And you thought the curmudgeon in the title was just for alliteration.)

Frankly, I’d sooner mindlessly stare into space. At least it wouldn’t be deafening.

Well, not the Oscars so much; they are, after all, on television and one can turn the volume down. One cannot say the same thing about what the Oscars celebrate: movies.

When was it decided – and who did this deciding and why weren’t we consulted? – that we shouldn’t just hear the sound but feel it vibrate down from our toes to the top of our tinny tin heads? How did going to a movie turn into a full frontal assault on our senses, from movement and colour and flashing lights to that ubiquitous noise they are careful to remind us is the patented Dolby surround sound?

Once upon a time one could go to a movie, yes, even an action one, without going deaf. I was first in line when those early Star Trek movies came out and I saw the first Die Hard in the movie theatre. It was loud, but I don’t recall coming out feeling like I’d just been put through the wash cycle.

In those days I wasn’t flinching and stuffing my ears with Kleenex or covering my eyes to protect them from the flashing lights that would end up giving me a migraine. The last “action” movie I saw, one of the Johnny Depp pirate movies, well, I left that so dazed and battered that I barely remembered what I’d seen.

How did deafeningly loud become normal? Or is everyone just deaf? 

So much of the noise around us we barely notice. The hum of computers and air cleaners and refrigerators; the constant hum of traffic, the honking, the music blaring out of car stereos … And of course everyone is in their own little world of sound, with those earbuds.

I’m not a total Luddite; I got an MP3 player years ago. I filled it with music. I listened. Then I realized that as I was walking I kept turning the sound up to compensate for the noise all around me – and if I had been listening for a few hours at night there was a constant ringing in my ears. It would go away, but what I know about the sensitivity of the ear told me that if I kept doing this it eventually would not, and the result would be tinnitus. A ringing that simply never goes away.

I doubt most of the people on the subway or the bus or walking down the street really pay that much attention. Which suggests to me that in a decade or two anything relating to enhanced hearing will make a fortune since most people will be partially deaf. (We should all buy stock now.)

A number of big names – William Shatner, Jerome Groopman (author of the best-selling How Doctors Think) – have gone public with their tinnitus. Experts tell us that it’s probably the result of prolonged exposure to loud noise.

Then there’s the Who’s Pete Townshend, who is essentially deaf. The Who, as you might recall, is credited with having performed the loudest concert in history, at least at the time, circa 1976: decibel level 120. That’s about as loud as a jackhammer. Almost as loud as a jet engine. That’s loud.
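It’s worth spelling out what 120 decibels means, because the scale is logarithmic, not linear – every 10 dB step is ten times the sound power. A quick sketch (the 85 dB reference is the usual occupational threshold for hearing protection):

```python
# Decibels are logarithmic: every +10 dB is ten times the acoustic power.

def power_ratio(db_a, db_b):
    """How many times more sound power level db_a carries than level db_b."""
    return 10 ** ((db_a - db_b) / 10)

# The Who's reported 120 dB concert versus the 85 dB mark at which
# occupational guidelines typically require hearing protection:
print(round(power_ratio(120, 85)))  # ≈ 3162 times the power
```

In other words, what hit those eardrums carried on the order of three thousand times the power of a level already considered hazardous over a working day.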

Respect the ear – or it’ll give up on you

The human ear is an amazing thing. Inner, middle, outer ear: each has its function, each plays a role in funneling sound through the ear drum and tiny cilia (little hairs) to the brain, where it’s interpreted and experienced. A human ear can pick up the dripping of a faucet in the middle of the night on another floor, hear a symphony or the swish of even a piece of paper falling to the floor.

It’s amazingly balanced between the inside, conduit to the brain, and the outside world. Precisely balanced in between are three small bones (the smallest bones in the body) that are shaped somewhat like a stick figure or one of those triangles your music teacher would have you play if you weren’t musical enough to actually play something normal (like me). OK, I played piano but she was the one doing that.

These tiny bones reverberate in response to the ear drum (tympanic membrane) vibrating in response to sound waves in your environment – like the tiniest of precision percussion instruments, eventually producing an electrical pulse that is interpreted through the neurotransmitters in your brain. They can easily work their way out of alignment after a blow to the head or other trauma; I believe that today they can be repaired with microsurgery, but it’s complicated.

Culture often determines what we think of as noise versus music. That dripping tap of which I spoke earlier (which has been known to drive me insane and keep me awake) is, apparently, music to the Shinto-based Japanese mind. I don’t think that anyone truly enjoys a jackhammer however.

Currently I am surrounded by construction noise and have been for the last four months. It is exhausting, tiresome, intrusive and dreadful. And my ears hurt. Even with noise reduction headphones, the noise is unrelenting. The last thing I need is to go out for an evening’s “entertainment” to find myself surrounded by noise, be it the latest James Bond, war film or science fiction flick.

I hope you enjoyed the Oscars. And the nominated films. Just remember that once your hearing is gone, it’s gone. There’s no going back.

Pity Pity Bang Bang

It’s depressing, how often these ‘incidents’ involving high-powered weapons seem to occur in the United States, where there are almost as many guns as people. Everyone cried about Sandy Hook but there have easily been four or five more that I’ve read references to in the paper.

I couldn’t believe it when I read that federal health agencies cannot comment on the public health consequences of guns; haven’t been able to do so for well over 20 years.

I wrote on this in January 2011 and frankly, don’t see there’s much to add … Except that, at least according to this piece in the New England Journal of Medicine, such fatalities are actually in the minority; the majority of gun deaths occur quietly among family members and people one knows.

Hey, I get that. You get mad, you want to hit someone – if you have a gun, it’s damn easy to pick it up and shoot. But if all you have on hand is a stick or even a knife, well, you can do some damage but the death toll usually doesn’t rise to the double digits.

Just last week some fellow went postal in a downtown apartment building in Vancouver. I know the place, I used to live a block from there. Reports are mixed but there appears to have been a knife and a hammer involved. Several people were hurt but only one remains in critical condition in hospital. The rest were kept overnight and let go. They’re fine. Which they would not have been had the perpetrator carried an M16.

Guns don’t kill people, people do? Ah, no. People with guns kill people. People with sticks and stones, not so much.



So they continue being a pain ..

“Painkillers increase risk of car crashes” proclaims the headline in today’s Globe and Mail. Apparently, researchers at the Toronto-based Institute for Clinical Evaluative Sciences have found a correlation between even low-dose regular opioid use (two Tylenol 3’s three times a day) and an increased risk of car accidents.

Not a huge risk, the head researcher, David Juurlink, hastens to add; certainly nowhere near as high as alcohol, but a risk nonetheless.

Wonderful. Two of my favorite things – correlational studies and experts rambling on about opioids – in the same piece, with blinkered experts continuing on their merry way, all pleased and sending out press releases (don’t kid yourself, that’s the only way a paper from something called the Institute for Clinical Evaluative Sciences, which nobody has ever heard of, would get a piece in the Health section of the Globe and Mail).

Um, did it ever occur to these geniuses that the reason people take those drugs, namely pain, might have something to do with those slightly increased numbers of car crashes? I use the term slightly advisedly: the risk increases between 21 and 42% according to the “scientists”. (Scientists in quotes because surely any scientist worth his salt knows that unless you know what you’re comparing it to, a percentage – relative risk – is absolutely meaningless.)
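To see why a bare relative risk says so little, here’s a toy calculation – the baseline crash rates are purely hypothetical, for illustration:

```python
# A "42% increased risk" means very different things depending on the
# baseline rate, which the headline never gives. Rates below are invented.

def extra_events(baseline_rate, relative_increase):
    """Additional events per person implied by a relative risk bump."""
    return baseline_rate * relative_increase

for baseline in (0.001, 0.1):  # 0.1% vs 10% chance of a crash per year
    extra = extra_events(baseline, 0.42)
    print(f"baseline {baseline:.1%}: 42% more risk adds {extra:.3%} in absolute terms")
```

Same headline number, a hundredfold difference in what it actually means for any one driver.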

Surely pain – which means someone gets more easily fatigued and could become less alert – could have a thing or three to do with it?

Oh no, it’s the opioids.

Of course by the same token, ice cream causes an increased number of drownings. Think about it. In the summer people eat more ice cream – and more people drown. QED.

Last March I wrote a post on OxyContin and made some disparaging noises (OK, loud, angry noises) about the ado being made about addiction and painkillers. Notably, a Fifth Estate episode that had me virtually apoplectic with rage. Using largely American stories, the CBC newsmagazine insisted that addiction to OxyContin was a massive problem that we should all get worked up about, especially when it came to First Nations communities in the north.

By contrast, a few weeks ago I happened to come across a BBC mini-documentary about the same topic and the contrast could not have been more marked. I missed the start of the piece but what I did watch was superb. It was a program called “Our World” and the journalist’s name was Linda Sills. (I hope that’s how one spells it.) She had travelled to several communities in northern Ontario, spoken to various tribal elders, artists and addicted individuals and – wonder of wonders – had actually done some research and thought about the subject.

Sills and the people she spoke to all agreed that the problem was not opioids (in the ’80s it was alcohol and in the ’90s glue sniffing) but the situation. The environment. The socio-economic conditions. When people are unhappy and hopeless they take solace in drugs, whatever is around, whatever they can get. Solutions are complex, multi-factorial and must emerge from the grassroots of the community itself. An artist who looked to be in his forties, addicted to OxyContin himself, talked of how his art was helping him reduce his drug intake (even though he genuinely looked as though he was still in pain, physical and psychic).

Opioids have been around for thousands of years. Officially they were discovered around the time of the Trojan War (war has always been excellent for medicine) but no doubt people knew of the pain relieving properties of the poppy long before that. They are the single most effective agent in treating pain and although we’ve tried to come up with synthetic variants (Demerol, Fentanyl) and alternatives (non-steroidal anti-inflammatories) there simply has never been a drug that works as well, as consistently.

Treating pain with opioids allows people who suffer from chronic pain to function. To have lives. To work, interact with families and friends and feel like they are part of the world. But in recent years, perhaps with the rise of right wing moralizing in the U.S. and what some people call the rise of the nanny state, we have taken a sharp turn away from treating pain to calling individuals who need medication “addicts”.

Our reverence for numeric reasoning and bad statistics naturally hasn’t helped any; after all, what could be more qualitative and unmeasurable than pain, which, by definition, is whatever the person says it is?


I stress, Eustress

It’s become such a ubiquitous concept that it’s difficult to imagine how recent a term “stress” really is. When Hans Selye first proposed that all tension, all sources of anxiety, created the same kind of reaction within the body, it seemed ridiculous. And this was in the fifties, to the best of my recollection. Of course now we all know that too much stress is bad for us and that stress is a factor in disease.

[So so many things that we take for granted – cardiac risk factors, prevention, stress – are such recent concepts. But we think they’ve been around since the year dot.]

Stress is hard on the immune system, affects us hormonally and causes muscle tension and fatigue. It gets in the way of sleep, which causes its own set of problems ranging from poor concentration to anxiety, and depresses normal pain signals. Which is why soldiers and athletes often don’t feel the pain of a major injury and it is only later that they realize they’ve damaged something.

Then there’s the good side of stress, or eustress. Without some stress we would have zero motivation, zero reason to excel or create. That’s why there’s that old inverted-U graph (the Yerkes-Dodson curve) showing how some stress is good before a major task, say an exam; with some stress performance gets better. But if it gets too high, then performance suffers.

Which we all know from our own experience.

Wandering around Paris, what strikes me as well is the extent to which our actual, physical environment can create or reduce stress. When what is around us is beautiful, when we hear laughter, when the sun is shining – well, it’s hard to feel too unhappy or stressed. No accident that depressed areas inevitably are ugly.

It’s hard to be too stressed when one is a tourist in Paris, well, unless one tries too hard to make the French conform to one’s North American ideas of time, speed and interaction.  Personally, something that I think is rather wonderful here is the very formal aspect of saying ‘bonjour’ whenever one walks into a place, any place. It is a way, I think, of humanizing the service person, the waiter, the person in the store. When one stops to say ‘bonjour madame’ or ‘bonjour monsieur’, one has to pause and look at the person and realize this person is not simply part of the scenery, they are an actual human being. It adds a touch of humanity to what is often a rather soulless encounter.

The French are currently pilloried for their dislike of capitalism, their failing economy, their rising youth unemployment. Several august bodies are miffed that in spite of all of this money markets still love France, which can borrow money at brilliantly low rates – suggesting they’re not worried about France’s future. There’s a palpable sense of outrage about this on the part of business writers, The Economist, various commentators – usually Anglo-Saxon. Why? Why does everyone need to conform to the same ideas?

The French fought a revolution which had at its basis the value of the human being.  Extreme wealth, especially ostentatious wealth, is frowned on in France. I can think of worse things.

In any event, given the stress we all experience when all we focus on is money and making more of it, it seems to me that the French are on to something.

Republicans prepare for Brave New World (of 1835)

“I read somewhere that Mitt and I have a storybook marriage. Well, in the storybooks I read, there were never long, long rainy winter afternoons in a house with five boys screaming at once. And those storybooks never seemed to have chapters called [multiple sclerosis] or breast cancer,” offered Ms. Romney, who has battled both diseases. “A storybook marriage? No, not at all. What Mitt Romney and I have is a real marriage.”

That was a quote from Ann Romney speaking at the Republican Convention; apparently she is meant to humanize her husband who is perceived as wooden and clinical.

I did like the photo of her in the Globe – however much my evil twin wonders if that smooth forehead of hers has something to do with Botox or some other cosmetic filler. (Hey, we should all look this good at 63.)

It is ironic, though, that she brings up having been ill, when her husband’s party so demonizes government spending, in particular anything that might benefit the less fortunate, namely programs like Medicare, Medicaid and Social Security.

But, I suppose when you have a gold-plated health insurance plan it doesn’t occur to you that not everyone has the means or the employment (private insurance being so tied in with work in the U.S.) to afford breast cancer treatment or long-term therapy for MS. Or can get any care outside of an ER.

I am of course sorry that she has been so afflicted; one would not wish those diseases on anyone. But they happen, and it certainly helps if you can get reasonable care. As a friend of mine used to say, money may not buy happiness but it’s easier to cry yourself to sleep on satin sheets.

Meanwhile, a rather good piece in a recent Archives of Internal Medicine plaintively asks why nobody is questioning the “multiple examples of overdiagnosis that arise when technology, rather than clinical findings, are the catalyst for finding disease”. The authors, Jerome Hoffman and Richelle Cooper, both MDs, point out that incidental findings of “disease” that will never lead to anything are both a waste of time and money. Not to mention turning people into patients. Their italics. (Arch Intern Med, Vol 172, No. 15, Aug 13/27, 2012, pp. 1123-24)

I thought of this article because of what Ann Romney said about having had breast cancer. Mea culpa, but whenever I hear the term “cancer survivor” my first thought is: was that a real cancer (the kind that would actually kill you) or one that you would simply die with? An incidental finding when you went for that mammogram everyone assures you is part of good, pro-active care? Simply having a handful of cancerous cells in your body (or “pre” cancer) is no indication of anything. As you age, some cells mutate. Doesn’t mean anything. My grandmother had colon cancer for the last 25 years of her life; she died at 92. It wasn’t the colon cancer that did it, just old age.

But Americans are staunch believers in the power of technology, in the predictive power of science and the ascendancy of good old American know-how in controlling disease just as they have tried – with such success – in controlling various countries in the Middle East.

One of the signatories of the Declaration of Independence, Benjamin Rush, was a physician. He, like many of his ilk at the time, genuinely believed that American diseases were meaner and tougher than their effete European cousins. Rush’s claim to fame was treating yellow fever (which, a century later in Panama, was said to have killed more soldiers than the enemy did) by using humungous doses of mercury which, presumably, if it didn’t kill the patient effected a “cure”. Historians later questioned Rush’s prowess, suggesting that the individuals he was said to have cured actually weren’t that sick (hence were able to withstand his toxic treatment). Rush and his contemporaries, nevertheless, were convinced that massive purging, bloodletting and strong dosing were key to good medicine.

A handful of physicians like Oliver Wendell Holmes, who had studied in France (the French were much admired at the time for their diagnostic prowess), disagreed with Rush and his ilk. Holmes felt the American penchant for aggressive cures had more to do with cultural constructs than science. But it was Rush’s perspective that prevailed; it fit better with how Americans saw themselves.

Women were particular victims of this “heroic” ideology of strong medicine – from pregnancy and childbirth, where the use of forceps, episiotomies (cutting the perineum in childbirth) and later Caesarians became increasingly common, to midlife and menopause, when hysterectomies and oophorectomies (removal of the uterus and ovaries, respectively) became the norm, as these organs were no longer considered “useful” once the woman could no longer bear children. Ah yes, now we know what women’s utility consists of …

Oh, and thank you sociobiology so much for keeping that swell idea alive.

Hubris, children, hubris. How the hell do you know what’s useful and what’s not? Tonsils were routinely yanked out until we realized that they were actually part and parcel of natural immunity. We have an unfortunate tendency to dismiss any organ or part we can’t figure out – a case in point being the large chunk of genetic material we call “junk” DNA.

“An aggressive approach, of course, implies that the doctor can do something for the patients, and this ‘can do’ attitude is as much a characteristic of American medicine as it is of the American character in general,” writes Lynn Payer in Medicine and Culture.

You’d think by the 21st century we’d have evolved somewhat, but no. Romney, Ryan and the Republicans hang on to these nonsensical notions. Zero clue that today’s globalized, cyber-connected, micro-blogging world is hardly the same as the one Rush knew in the 1800s. (He died in 1813.)

I shan’t delve too deeply into the right-wing perspectives on women’s reproductive rights, abortion, rape and so on lest I self-combust. I will simply quote an old line from a Whoopi Goldberg one-woman show that I always loved. Her character, a wise-cracking, foul-mouthed former drug addict, walks past a group of marchers – men – holding anti-abortion signs. She (playing a he) pauses, thinks for a minute and says: Hey, you want to stop abortion? Shoot your d—k.

Boundless enthusiasm for Overtreatment

Last week Slate ran a piece, sent along by my friend Maryse (whose blog, Frogheart, covering nanotechnology, art, technology and so on, is immensely popular – one tries very hard not to be too envious of her close-to-a-million daily visitors), based on an update in the respected Cochrane Review: treatment of mild hypertension is essentially useless.

What neither piece points out is that what we call “mild hypertension” today (systolic 140-159) was considered essentially normal a scant fifteen years ago. Well, 140 anyway. Or that thoughtful (often older) clinicians would not consider this hypertensive in older patients today.

Ah, it’s just a number, people. A number determined by a group of individuals – often cardiologists, but also other “experts” (many of whom have ties to the drug companies that make antihypertensive drugs) – deciding what should be considered “normal”.

I’ve spent much of my research career debunking this notion of “normal”.  Particularly as it pertains to physiology, biology and humans, who, as we all know, tend to come in a variety of shapes and sizes and whose health status is determined by many variables, not the least of which is how much money they have and how happy they are in their lives.

Women, of course, have long been outside this matrix – normal consisting essentially of the male body without its circadian rhythms and cyclic hormonal elements, never mind pregnancy or menopause. The vast majority of clinical trials, the gold standard of evidence as it has been called, excluded women altogether, and even when they tried to bring them in, often women themselves wouldn’t play ball.

The reasons seemed complex: social, domestic, personal, economic and psychological. Women generally have been socialized to be risk averse, which means if they are told they have condition X then they want the damn treatment. They don’t have time to worry about whether or not they’re taking the placebo. Plus, large multi-centre trials require time, not to mention transportation, to get to those bi-weekly weigh-ins or tests or what-have-you, and women, particularly women over 40, tend to be overwhelmed with children and grandchildren and ageing parents and work and housework and life. “Who’s got the time to enter a trial?” most will ask. “I’ve barely got time to sit down, never mind volunteer my time at a clinical trial.”

No doubt there are other reasons but at this point I haven’t researched it. I just know that women are vastly underrepresented in what we optimistically consider evidence-based medicine.

I see something inherently male and American in this perspective, this enthusiasm for aggressive treatment (as the cultural critic Lynn Payer in her wonderful book Medicine and Culture once remarked, there has to be something culturally satisfying in the notion of ‘aggressive’ given how often the term is used in American medicine; even recommendations for gentler treatment of newborns were to be pursued aggressively). Or overtreatment.

Cross cultural studies have repeatedly shown that countries like Canada, which can’t afford as many cardiac surgeries and procedures as the U.S., as well as countries like Finland, which simply doesn’t believe in them, have the same outcomes as the U.S. In other words, Americans spend huge amounts of time and money doing things – cardiac bypass, cardiac catheterization, stents, etc. – but cardiac patients are no healthier than in countries where they do half the number per capita. All that activity doesn’t result in better health or lower morbidity or mortality.

Less is often more in medicine. And bodies are fragile. Drugs, surgeries, procedures, tests: these are not benign. They exact a toll on the body. And all for what?

All because somebody somewhere decided they knew what was best and what magic number was “normal” blood pressure. Or what an artery “should” look like in a person with no symptoms.

The worst part is that as patients we are complicit in this, increasingly believing that more is better – and rejecting the notion of watchful waiting, considering a physician who says, “just take it easy for a while, it’ll get better on its own,” a quack. So fewer and fewer physicians say such things. As a doctor once said at a conference I attended: it’s easier to just write the prescription than to take twenty minutes to explain to a patient (who’s not going to believe you anyway) why he or she doesn’t really need it.

But hey, we wouldn’t want to miss out on something that could be really terrific now, would we?!