Category Archives: Cranky Comments

The prism of life

In the immortal words of Yogi Berra, if you don’t know where you’re going you’re going to end up some place else. Which is where I seem to have landed, at least these past months.

Again with the AI

The fascination with AI continues to irk, given that every second thing I read seems to be extolling the magic of AI in medicine and how It Will Change Everything. Which it will not, trust me. The essential issue of illness remains perennial and revolves around an individual for whom no amount of technology will solve anything without human contact. The change-everything reference, by the bye, is not to the “singularity” here, that point at which some thinkers believe power will shift dramatically and we will all end up serving our robot overlords. That is another story. (One with major movie potential, thank you Skynet and Terminator.)

But in this world, or so we are told by AI proponents, radiologists will soon be obsolete. The adaptational learning capacities of AI mean that reading a scan or x-ray will soon be more ably done by machines than humans. The presupposition here is that we, the original programmers of this artificial intelligence, understand the vagaries of real life (and real disease) so wonderfully that we can deconstruct these much as we do the game of chess (where, let’s face it, Deep Blue ate our lunch) and that analyzing a two-dimensional image of a three-dimensional body, already problematic, can be reduced to a series of algorithms.

Attempting to extrapolate what some “shadow” on a scan might mean in a flesh-and-blood human isn’t really quite the same as bishop to knight seven. Never mind the false positives and negatives that are considered an acceptable risk, or the very real human misery they create.

Moravec called it

It’s called Moravec’s paradox: the inability of humans to realize just how complex basic physical tasks are – and the corresponding inability of AI to mimic them. As you walk across the room carrying a glass of water, talking to your spouse/friend/cat/child, then place the glass on the counter and open the dishwasher door with your foot while opening a jar of pickles, take a moment to consider just how many concurrent tasks you are doing and just how much computational power these ostensibly simple moves would require.

Researchers in Singapore taught industrial robots to assemble an Ikea chair. Essentially, screw in the legs. A person could probably do this in a minute. Maybe two. The preprogrammed robots took nearly half an hour. And I suspect programming those robots took considerably longer than that.

Commander Data (on Star Trek: The Next Generation) spent his life trying to emulate humans and understand the notion of having a “gut feeling” about something. Those, as most of us know, can be wrong but are usually based in experience. Something about the situation at hand reminds us of something; we may not remember the details, but the feeling lingers and something in the present situation cues that memory. Personally I have great respect for my intuition, especially when it’s telling me not to buy into the hype.

Ironically, even Elon Musk, who has had major production problems with the Tesla cars rolling out of his high-tech factory, has conceded (in a tweet) that “Humans are underrated.”

I wouldn’t necessarily go that far given the political shenanigans of Trump & Co. but in the grand scheme of things I tend to agree. But hey, who knows, perhaps soon we will all be Borg, far too involved in flying around the galaxy telling people resistance is useless to worry about petty nonsense like this.

Lean, mean and gene

On a somewhat similar note – given the extent to which genetics discourse has that same linear, mechanistic tone – it turns out all this fine talk of using genetics to determine health risk and whatnot is based on nothing more than clever marketing, since a lot of companies are making a lot of money off our belief in DNA. Truth is, half the time we don’t even know what a gene is, never mind what it actually does; geneticists still can’t agree on how many genes there are in a human genome, as this article in Nature points out.

Along the same lines, I was most amused to read about something called the Super Seniors Study, research following a group of individuals in their 80s, 90s and 100s who seem to be doing really well. Launched in 2002 and headed by Angela Brooks-Wilson, a geneticist at the BC Cancer Agency and SFU chair of biomedical physiology and kinesiology, this longitudinal work is examining possible factors involved in healthy ageing.

Turns out genes had nothing to do with it, the title of the Globe and Mail article notwithstanding. (“Could the DNA of these super seniors hold the secret to healthy aging?” The answer, a resounding “no”, was well hidden at the very end, the part most people wouldn’t even get to.) All of these individuals who were racing about exercising and working part time and living the kind of life that makes one tired just reading about it had the same “multiple (genetic) factors linked to a high probability of disease”. You know, the gene markers they tell us are “linked” to cancer, heart disease, etc., etc. These super seniors had all those markers but none of the diseases, demonstrating (pretty strongly) that the so-called genetic links to disease are a load of bunkum. Which (she said modestly) I have been saying for more years than I care to remember. You’re welcome.

The fundamental error in this type of linear thinking is in allowing our metaphors (genes are the “blueprint” of life) and propensity towards social ideas of determinism to overtake common sense. Biological and physiological systems are not static; they respond and change in response to life in its entirety, whether it’s diet and nutrition or toxic and traumatic insults. Immunity alters, endocrinology changes – even how we think and feel affects the efficiency and effectiveness of physiology. Which explains why, as we age, we become increasingly dissimilar.

This is important. It means that our personal histories matter more as we age and guidelines and evidence, as useful as they can be in a vague sort of way, need to be used with a large grain of salt, accompanied by a healthy dose of skepticism. Who we are, what we were like throughout our lives and what’s happened to us during that life are part and parcel of our health picture. As I’ve said before, much as we’d like to reduce medical decisions to a question of statistics and probabilities, it’s simply not possible. There are 89-year-olds for whom knee surgery is a perfectly viable alternative; 65-year-olds for whom it is not.

The circle of life seems to have come to a dead halt

Sadly, Super Seniors Study notwithstanding, our template for ageing is rather meagre. Pathetic, almost. As a sociologist told me many years ago when I was writing a story on geriatrics (when I myself was in my 30s and didn’t have a clue), our mental picture of a “good” old age is essentially that of a 20-year-old with a few wrinkles and grey hair. We admire seniors who run marathons and lift weights and do all the things they did decades earlier. We don’t value the attributes that actually accompany ageing, such as the ability to manage time, ideas and people better. Experience. Quicker thinking in deeper, more analytical subjects. Wisdom. Happiness.

Instead, we read that older folk react more slowly in tests where they’re shown some picture or asked to push a button on some video game; this is then used as some kind of bizarre proof that a 20-year-old brain is somehow superior. It may well be true that certain types of reaction time slow with age, but my suspicion is that that’s because these are usually irrelevant in the grand scheme of things. Then again, what do I know. I couldn’t remember names or certain kinds of details when I was 25 and I still can’t. And I’m still here. Still can’t remember details, but smart enough to look both ways before crossing the street (and to put my phone down before I step onto the curb, something I can’t say for way, way too many people).

Life isn’t about one’s ability to do tests. Psychologists and educators finally realized several decades ago that IQ tests weren’t all that good at predicting future performance; all they measured was how well one did on IQ tests. And one’s ability to do well on those tests, all too often, was determined by extraneous factors like class and culture. (Something that, I recall, appeared to annoy researchers mightily in the days of language/math IQ tests was that students in Sri Lanka did better on the language portion of the test than those in all other countries, including Britain and America. How could this be, researchers cried. Well, presumably Sri Lankan students spoke and read better English, geniuses.)

Just as we aim to declutter our living rooms and our lives, we want health and medical matters to be neat. Predictable. I wrote about this a while back but – amazingly – my one blog post doesn’t appear to have changed the world.

Ideally we will find a balance, us learning to live with the ambiguities of life – meanwhile, experts might try to realize that in the end people don’t care that you know – not unless they also know that you care.

He was a cool cat

Speaking of caring, Charlie, our noisy, furry friend, died at the ripe old age of 17, which in human years is somewhere in the late 80s. Not a bad age. He died peacefully at home, surrounded by the people he had bustled about managing all his life. For as we all know, while dogs have owners, cats have staff.

Charlie contemplating his domain.

I wrote about Charlie some years ago when he was ill. At the time I wasn’t sure he was going to survive the rigours of modern veterinary medicine. But he did, and had what I found out is called a hospice death, according to that fount of all wisdom, the internet. Apparently more of us are opting to let our pets go gently into that good night, contrary to Dylan Thomas’s exhortation. And so the prism of life continues.

Perhaps strong drink is the only solution

To end on what I consider a more pragmatic note, I just heard a Dean Martin song I had never heard before – the chorus of which sounds as though it should perhaps be our new theme song.


QED.


Artificial Intelligence and Natural Stupidity

We live in an irritating world, made all the more tiresome by the increasing amount of tech we contend with every day. There are those Facebook algorithms eerily sending you adverts geared to your “likes”, or LinkedIn knowing your fourth-grade classmates better than you do; that tinny voice offering to help you navigate tech support (god help you if you have an accent); or having to prove you’re not a robot to some web site. Never mind refrigerators and “smart” TVs that can watch you – and be hacked – or apps professing to predict your cardiac or Alzheimer’s risk that are about as reliable as a palm reader. Human contact, it would seem, is becoming irrelevant. The few times an actual human answers a call or is there to help, well, I don’t know about you but I want to weep: things go so much more smoothly.

Yes, I know, that’s just crazy talk. The future is now and it’s automated.

Then there’s manufacturing, where factories that used to employ hundreds if not thousands of people now have maybe twenty working alongside the robots that do the work. What would one call them, I wonder? It’s a pride of lions – so perhaps a clank of robots?

Danger, Will Robinson, danger.

Algorithms, chatbots, trolls and nameless, faceless tech of all sorts are so ubiquitous we barely notice them any more, up to and including in health care, where you’d think that in dealing with a sick, vulnerable person some human contact would be the basic requirement. Where’s Robot in Lost in Space to warn us, along with Will Robinson, of the danger?

In the health care realm, even when there is ostensible human contact it’s technology driven. Hang out in a hospital room for a day or two as I did recently and watch as a nurse or PN (I know they’re not called that any more but I can’t keep up with the changes) wanders in, checks the beeping machines, jots something down and wafts out, without a word. Glance down the corridor and pretty much everyone at the nurses’ station is staring vacantly at a monitor, oblivious to anyone waiting to ask a question. Honestly, it seems a pity to disturb those machines sometimes.

So why it should have come as a surprise to me I don’t know, but in Ontario, apparently, there’s some conflict going on about robots in Canada’s operating rooms. A handful of surgeons are seriously miffed that their toys are going to be taken away from them, since there’s no real evidence that they actually work any better than the old-fashioned kind of surgery done by actual humans.

(Reminds me of a conference I attended years ago where a urologist spoke for nearly an hour about some cool new, massively expensive “microwave” that was going to revolutionize prostate treatment and do away with prostate cancer forevermore. Well, it’s decades later and I haven’t noticed any revolution or reduction in cancer. What was noteworthy at the time, to me at least, was that a later speaker, a woman doctor explaining cervical cancer, was all about low-tech clinical matters and how to make the patient comfortable. Never noticed any patients being mentioned alongside the toys. Interestingly, a group of male prostate cancer patients made their own film around that time in which they related their experiences in the hope that more men would exercise caution before leaping into the high-tech and/or surgical options.)

Granted, tech can be a godsend to the injured and disabled. Some of those new digital prosthetic limbs are truly miraculous. The trouble is when the outcome doesn’t match the time, money and effort it takes to create some piece of hard/software. Take robots in the operating theatre: there has to be some serious proof that the tech is superior to current care. And I don’t think I’m being a Luddite when I say that if I were the person being cut open I’d just as soon have a thinking, adaptable being leaning over me, one whose intelligence was not artificial thank you very much, versus some robot programmed by a 17-year-old who actually believes everyone’s insides look just like the pictures in Gray’s Anatomy (the book, not the TV show). OK, I’m being facetious; presumably live human doctors have input into the development of whatever boffo artificial intelligence (AI) went into the thing. But, as with the cockpit, one would prefer to grump about legroom secure in the knowledge that there are biological beings flying the plane; humans who can react to the unknown based on expertise and experience, not rote. (Highly motivated humans who also understand that they go down with the plane, so it behooves them to figure out how to set the damn thing down safely, maybe on the Hudson River.)

Anyone (read: all of us) who has banged on a mouse in frustration as a drop-down menu or some program insists we haven’t done such-and-such even though we have, seventeen times, and there’s simply no recourse, can see the day coming when a surgical arm needs rebooting and gosh, nobody’s there.

No doubt AI aficionados will object and tell me (in no uncertain terms) that my understanding of artificial intelligence is flawed and as biased as Will Smith’s character in I, Robot. Maybe so. But as someone who read the original Asimov books way back when, I tend to think the potential for harm increases exponentially with the complexity of the task at hand. Deep Blue may have won at chess, and a machine has since beaten the champions at Go, but surgery and flying a plane aren’t games.

Artificial intelligence, oh my

Alas, the toys ‘r us crowd are happily moving along, research funding at the ready and PR teams salivating at the thought of the coolness of it all. In fact, any day now it won’t just be a robotic arm in the OR but dead silence as STAR (Smart Tissue Autonomous Robot – who thinks of these acronyms anyway?!) takes over.

At the moment STAR’s a bit slow: a simple gastric bypass operation that takes a surgeon approximately eight minutes takes it roughly an hour. But just as chatbots reply to the question you tweeted at some airline in that anodyne, generalized robotic tone, soon so will your surgery. Trouble is, things can go wrong in surgery and I’m not so sure STAR is up to the challenge. Then again, if something goes wrong you can always sue the manufacturer.

Honestly, I think I’d rather go with lions and tigers and bears. At least they’re real. And their babies are cute.

Here’s where it gets seriously creepy

I suspect most people have an image of AI and tech based on the movies, where our hero/heroine leaps in to land the plane or do that tracheotomy when evil terrorists have taken over, but those are, by definition, fiction: scripted, directed, edited. No computer ever crashes on any of those shows – people are being chased by clever villains (who love classical music) but every time they manage to download the vital clue in ten seconds flat and get away. No memory stick ever screws up; nobody ever has an issue finding the right port and naturally no software ever needs rebooting: everything from the GPS on up or down works perfectly. Then there’s the rest of us, simply trying to order a book online, who have spent the last half hour retyping our home address only to see that irritating red pop-up telling us our order can’t go through because our information is incomplete.

Inevitably, at times things get seriously creepy. Pornographic androids, and now this female android called Sophia who’s been given Saudi citizenship – I kid you not. I bet you anything this female, now Saudi, android has rights that real women in that Wahhabi country don’t have.

Then again, what can one expect from the desert kingdom, where the crown prince throws anybody and everybody in jail at whim and plans some robot-run economic free zone for the future? Maybe Sophia can work there.

Fake news?

Moving on – and keeping to the theme of technology taking over – androids aside, the so-called health news that social media zip our way is all too often dead wrong. And potentially dangerous. Not only is it almost inevitably disease oriented (which most minor problems are not) but its advice can lead people badly astray. I see catchy ads asking me to click on some supplement that will make me smarter, fitter, thinner and no doubt taller. Needless to say I don’t click, but somebody does, otherwise they’d stop posting those ads. Like spam: if nobody ever replied, they’d stop doing it.

Research in general has increased exponentially over the last decades, to an extent that’s difficult to conceptualize. What we used to call the Index Medicus in the 20th century was in the tens or, eventually, hundreds of thousands; now we are in numbers too large to calculate (and getting larger all the time). And the sad thing is that most of us don’t appreciate just how much the error rate goes up as the numbers increase.

My friend Frogheart, aka Maryse de la Giroday, whose gem of a blog contains everything nano you’d ever want to know, passed along this piece to me; it’s about a single issue: breast cancer cell lines used over the last four decades that actually aren’t breast cancer cells at all but melanoma.

The mind boggles at just how much research might have been based on this faulty methodology/thinking and just how many patients and clinicians have been led astray – not least because breast cancer is such an emotive topic.

Over the years I’ve read many, many research papers and articles; some I’ve been asked to review, others I’ve written about, still others I’ve read simply out of interest (or fury). There’s a lot of bad research out there, people. Fake news isn’t just about politics; it’s also about life, health and everything else. So please use caution when you read or hear that “scientists” have found X or Y may help with this or that, and don’t add supplements or alter your medications based on faith in science. Science, like pretty much everything else human beings engage in, is not only fallible but subject to the same human foibles as anything else: money, position, power, jealousy, idiocy… As the geneticist Lewontin once said, scientists come from the same world as the rest of us.

A world that these days is probably binary, digital – and not that bright.


Random Thoughts and Staircase Spirits

Time, said Auden, will say nothing but I told you so. Time also gives one the opportunity to brood – darkly – on so many of the idiocies out there in the ever-expanding world of health information. So here, in no particular order, is what’s been making me especially cranky:

Monster under the bed roams city streets  

Diabetes, the latest health scourge to hit the news, is now a City of Vancouver problem, at least according to a headline in a throw-away newspaper I threw away:

“Vancouver to track and attack diabetes”. With what, one idly wonders? Bicycle spokes dropped on those bicycle lanes? Pointed sticks? Stern warnings? Nothing so mundane, it turns out. This, apparently, is part of some international initiative (a word that sets my teeth on edge) and crème de la crème cities like Houston, Mexico City, Copenhagen, Shanghai and Tianjin (where?) are on board, tracking “people at risk of diabetes” as part of a campaign to promote “healthier cities”. Curiouser and curiouser. Who knew cities were sentient and could get sick.

So the plan is – what? Skulk behind anyone leaving Starbucks with a large, frothy coffee? Tap anyone who seems a bit plump on the shoulder and read them the health riot act? (Honestly officer, it’s this outfit. Makes me look fat.)

Someone with the unlikely title of managing director of social policy at, one assumes, the City of Vancouver will start “consultations” with Vancouver Coastal Health and – wait for it – Novo Nordisk, the sponsor of this demented plan.

Of course. Silly us, not to have realized a drug company had to be involved.

Must be diabetes lurking back there in them there bushes….

Novo Nordisk, a nominally Danish but probably multinational drug company, almost exclusively manufactures diabetes drugs (oral hypoglycemics) as well as some types of insulin. (The old insulin, by the way – the non-patentable kind that came from animal pancreases and was easily tolerated – isn’t around any more, at least on this continent. Banting, bless him, donated his discovery to the people of the world; he didn’t believe anyone should benefit financially from diabetes. Unfortunately he had no way of knowing that by the late 20th century pretty much anything could be “property”: manufactured and sold, up to and including a person’s genome.)

This diabetes sneak attack has already started up in Houston, where they “mapped” various areas (for what, one wonders) and went door to door to “educate” people about diabetes. Then, if their numbers don’t match some ideal level, no doubt they’ll need some of Novo Nordisk’s boffo drugs. (This class of drugs, by the bye, doesn’t tend to have a long shelf life, as they usually are fairly toxic to the liver and quite a few of them have come and gone.) These hapless people will be told to get their fasting glucose and A1C* checked and down the rabbit hole they will go. We will all go.

These days, after all, it has nothing to do with the actual human being who may be in there somewhere, but with the numbers. (There’s an American drug ad that doesn’t even pretend it’s about anything but “bringing your numbers down”.) I suppose racial profiling could play a part as well, given that, statistically, people of South Asian, Hispanic, Asian and First Nations background may be at greater “risk” – whatever that means.

What few people realize is that this ostensible epidemic of type 2 diabetes sweeping the world has much to do with the continual lowering of inclusion criteria. A few decades ago, “normal” glucose levels were around ten. Now they’re about half that. For people over 50 the latter number is especially problematic, as close to half of us, as we age, tend to have somewhat higher levels of glucose; and if you think about it, it simply makes no sense that a physiologic change affecting close to half the population in a particular demographic is a pathology. It’s what’s called, um, normal.
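To see just how much work those moving goalposts can do, here is a toy calculation – every number in it is invented purely for illustration, not taken from any real screening data. Assume glucose levels in some hypothetical population follow a bell curve; merely lowering the cutoff from ten to roughly half that flips the label on most of the population:

```python
from statistics import NormalDist

# Hypothetical population: glucose levels normally distributed
# around 6.0 mmol/L with a standard deviation of 1.5.
# All numbers are invented to illustrate the point, not real epidemiology.
glucose = NormalDist(mu=6.0, sigma=1.5)

old_cutoff = 10.0   # the "around ten" of a few decades ago
new_cutoff = 5.5    # roughly half that

# Fraction of this imaginary population flagged under each cutoff
flagged_old = 1 - glucose.cdf(old_cutoff)   # well under 1%
flagged_new = 1 - glucose.cdf(new_cutoff)   # over 60%

print(f"old cutoff flags {flagged_old:.1%} of the population")
print(f"new cutoff flags {flagged_new:.1%} of the population")
```

Same people, same physiology; the “epidemic” appears the moment the line moves.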

As for me, if anybody tries to corner me and talk to me about my diabetes risk, I plan to run shrieking into oncoming traffic. At least that’s a risk that makes sense.

Fight them on the Beaches

In that previous story, what initially struck me was the term “attack”. As though a glucose level that could potentially be problematic were some kind of enemy – not a fluctuating number based on a myriad of factors ranging from weight to diet to sleep. A number that moves up and down depending on the time of day and a host of other things.

Physiology is dynamic, not that you’d ever know it these days given how mesmerized we are with the numbers.

Oliver Sacks, RIP

Someone who understood the complexities of physiology – and stood up for clinical knowledge and patient narratives – was Oliver Sacks, who died last August.

Physician, author, eccentric and host of oddball characteristics, Sacks wrote some amazing books (Migraine, The Man Who Mistook His Wife for a Hat, An Anthropologist on Mars and A Leg to Stand On are some of the ones I enjoyed reading). Most important, his writing reminded us of the diversity and variations there are between us, not simply the similarities that clinical trials, statistical averages and guidelines exploit. Sick or well, we’re all different and, to paraphrase Hippocrates and Osler and other famous sorts, medically the person with the disease matters as much as the disease. Or ought to. Alas, the trajectory of modern medicine, whether it’s so-called preventive care, apps or genetics, has a tendency to iron out those differences and push us towards some mythical average or “normal” that few of us come close to.

Colourful, thoughtful clinicians like Sacks have become vanishingly rare. Perhaps it was Sacks’s own differences – Jewish, gay, former biker and user of psychoactive drugs, gefilte fish aficionado – that made him realize just how much one’s personal history and narrative play into one’s physiology. Or just how vital it is for clinicians to listen as well as talk.

Dem bones, dem bones

L’esprit de l’escalier is a French phrase referring to all the pithy remarks one ought to have made but which only come to mind some hours later. Usually as one’s interlocutor is long gone.

So, to the pleasant woman who came up to me after my CAIS (Canadian Association of Independent Scholars) talk last year to ask about vitamin supplements, more specifically calcium: what I omitted to mention was that calcium is not a vitamin, it’s a mineral. An element, if one wants to be pedantic – Ca, number 20 on the periodic table. Hence the “elemental calcium” you can buy in the drug store.

The notion that we all need to take calcium supplements for our bones is based on a somewhat simplistic idea, namely that simply ingesting this mineral will somehow magically increase bone density, which we are told we are losing at an alarming rate, especially if we are women over 50. Clever advertising ably preys on our fears of “weak” bones, metaphors being what they are.

Bone is an amazing substance. It is dynamic: the collagen demineralizes and then degrades even as other cells, in sync, remineralize the collagen that has just diminished, for want of a better word. It ebbs and flows (how else could a broken bone heal?) to achieve a balance, a balance that alters with age. When we are young and growing, bone builds to its apex, in our twenties. It then plateaus for a time; then, as we pass age 35 or thereabouts, we gradually lose bone density. This is what we used to recognize as normal development. And the bone in your body differs in form, hardness and elasticity depending on where it is and what it does – the vertebrae in your spine and the long bones are of a different consistency, and respond to changes in pressure differently, than the ribs or the wrist.

The calcium/vitamin D directive has become so ingrained, however, that most people believe what they are doing is somehow maintaining or feeding their bones with supplementation.

But our endocrine system monitors the blood level of calcium and maintains it at our personal set point, one that is different for each person. This means that taking in more calcium is generally pointless, as it simply cannot be absorbed. To quote Nortin Hadler, an MD, in his book The Last Well Person: “If the blood calcium level trends down, vitamin D is converted to an active metabolite, which makes the intestinal absorption of calcium more efficient and vice versa.” More is not better; it’s useless. And potentially harmful, as calcium can deposit in joints and other bits. As for vitamin D, it too has a set point that differs in each person; doses that are too large can build up and become toxic. So those generic amounts you’re advised to take may or may not apply to you. Probably don’t, in fact.

We tend to think of the supplements we take as a kind of top-up to diet, like adding oil to a car or salt to soup. Our bones rely on calcium, so we basically assume that bone density is improved by taking supplemental calcium: since our bones contain calcium, and as we get older our bones become less dense, we should “supplement”. It’s a mechanistic way of thinking about the body, one that took off after the Industrial Revolution when an “engineering mentality” took hold about physiology (in anthropologist Margaret Lock’s term). It certainly doesn’t hurt that the nice people at Bayer (who are taking over the world and now sell everything from vitamins to glucose meters) continually tell us we should. Alas, physiology is rarely so cut and dried, and our understanding of how bone (or anything else) works remains primitive.

The real advantage of dietary calcium comes when we are young and our bones are developing, in our teens. Unfortunately, short of building a time machine and going back, there’s not much we can do to change the bone mass we accrued before our twenties.

So for now the basics of health remain the same as they were in decades past. Relax, eat well, exercise and stop stressing out about supplements. Most important: stop listening to all that bogus advice out there. If all we do is obsess about our health, our diets, our bodies – well, we won’t actually live any longer but it sure will seem that way.


*A1C is a measure of glycated hemoglobin – the proportion of hemoglobin in red blood cells with glucose bound to it – that is said to provide a “snapshot” of your glucose levels over the previous three months. It’s rather elegant but is still a correlation. A good one, to be sure, but correlation is not, as we all know, causation.


Civil Scientific Discourse RIP

It’s no secret that I am not fond of hot weather in general and summer in particular. Making me especially cranky at the moment is the hyperbole surrounding the science/non-science discourse, e.g., around childhood diseases like chicken pox or measles, mumps and rubella (the three dire conditions the MMR vaccine is supposed to prevent). The crux appears to be that either you’re one of those unscientific, Jenny McCarthy-quoting loons who believes vaccines cause autism – or you’re a normal, nice, sane person who believes in science. Paradoxically, science appears to have gained the status of a deity in this discourse.

No need to get hysterical about skepticism, Hume might say.

Case in point, a headline last year: “Shun anti-vaccine talk, SFU urged”. Some anti-vaccine conference was going to take place at an SFU campus and an angry group of critics were hopping mad lest the event “lend credibility” to this “dangerous quackery”. This, er, quackery was a symposium discussing how “families are facing increasingly intense pressure from the vaccine lobby and big government to comply with vaccine mandates”, organized by something calling itself the “Vaccine Resistance Movement”. Hardly saving the free world from tyranny, but hey, the resistance carries on, large as life and flakier than thou.

The 18th century philosopher David Hume, the granddaddy of skepticism, would no doubt be turning in his grave at this hysterical, humourless assault.

BC’s Chief Medical Officer replied in his usual vein: “Vaccines, like any medicine, can have side effects, but the benefits … outweigh the risks.” By and large this would seem to be true. But surely it can’t be verboten to wonder whether suppressing all childhood diseases may perhaps also have consequences? Especially the trend towards vaccinations against diseases “such as chicken pox which cause only inconvenience rather than danger”, in the words of British sociologist and science and technology writer Trevor Pinch (in Dr. Golem: How to Think About Medicine by Harry Collins and Trevor Pinch, University of Chicago Press, 2005). Especially given the sheer number of jabs (approx. 20, I think) that infants now get.

SFU president Andrew Petter apparently refused to cancel anything, merely saying universities stand for freedom of expression and, as far as I know, the conference went ahead. I have no idea what was discussed but I suspect it was a lot of nonsense. That’s not the point. What’s perturbing is the vitriol of the protesting group and the smug suggestion that if one dares to question the many vaccines tiny babies are subjected to (or suggest these might, just might, have adverse immune or other effects) one has no right to speak. Either you toe the party line or you’re a crazy person. One who should be run out of town on a rail to coin a phrase. (I’ve never been sure why being run out on a rail – which to me implies a train – would be such a bad thing. Personally I am mega fond of trains.)

The photo of the conference protestor indicates that the group (“The Centre for Inquiry”) is just as obscure as the one they’re protesting. Maybe the whole thing was a publicity stunt or performance art, who knows.

Any child not vaccinated against the measles should not be allowed in school, someone said firmly to me last month. Measles can cause deafness and blindness, not to mention encephalitis, someone else said. I mildly agreed, merely pointing out that the numbers on these dire effects in the developed world were actually vanishingly small, at least based on the (admittedly limited) research I had done. Buried in the contradictory numbers, one small group of children was clearly at risk from measles: children undergoing cancer treatment.

Years ago, when I wrote a book on the immune system, I did a bit of desultory research on measles; there was some evidence that a natural bout of measles appeared to reduce the incidence of allergies and asthma in later life. (Operative word: appeared – the data was correlational and based on medical records; there is no way to know for sure if this was cause and effect. Bearing in mind that most health recommendations, from lowering cholesterol to vitamins, are based on correlation.)

Immunologically, measles appears to have a modulating effect; in a way allowing the immune system to become less inappropriately reactive, reducing the incidence of asthma and allergies. Perhaps this struck a chord with me because in my own case a natural bout of German measles (rubella) massively and obviously cleared the really bad eczema (also an autoimmune overreaction) I had suffered since I was two or three. Large, itchy welts covering my legs, arms and face, especially knees and elbows. Then poof, I get sick, high temp, general malaise, and my eczema essentially clears. I still occasionally get eczema, usually in reaction to an allergen (like aloe). But, by and large, I’m fine. The research I did years later gave me a context for that (better than my grandmother’s “well, the high fever burned it off”, which made the eczema sound like a forest fire – though, come to think of it, that’s not the worst description).

But when I wondered out loud some weeks ago whether maybe, just maybe, over-zealous vaccination programs could have anything to do with the increase in peanut allergies, you’d have thought I had suggested a plot for Criminal Minds. It was speculation, people. I’m not the vaccine police.

I’m not sure quite how this binary, myopic perspective evolved and became so ingrained, but it seems now that any questioning of standard medical dogma (“vaccines are good”, “sugar is bad”) ends up as some version of t’is/t’isn’t, t’is/t’is NOT: all the subtle dynamics of a nursery school. Either you’re a feeble-minded dweeb who fell for the fraudulent, discredited Wakefield Lancet article linking vaccines with autism (actually GI problems, not autism, but that’s lost in the mists of rhetoric) – or a sensible, right-thinking person who believes in science, good government and iPhones. (As it happens I now have a Blackberry Z10, which I think is far, far superior. Were we to pause for a commercial break.)

Science is a method. Neither static nor dogmatic. But I guess if you’re going to turn science into a religion then it will end up that way. Pity, since scientific inquiry was, to a large extent, what dragged us out of the Dark Ages.

Lyme Lies – Ticks me Off

Each season has its own medical threats, or so they tell us, so by rights I should be warning you about the flu – but I’ve already done that. Or I could warn you about carnivorous Christmas trees (sorry, old joke c/o the late Chuck Davis, who mocked a pamphlet referring to “deciduous and carnivorous* trees”) but I promised you Lyme Disease and Lyme Disease it shall be. As it happens, to my way of thinking, Lyme and flu may well share an immunologic link: as with the flu, where the virus is spoken of as though it’s a rampaging army, with Lyme Disease it is the original tick bite that has gained iconic status – the differences (biological, physiological, genetic) between people ironed out in the search for easy answers and someone to blame.

Lyme Disease, for anyone raised by wolves who’s missed the thousands of news items over the last 40 years, is a tick-borne disease that tends to cluster in areas such as New England where there are deer, which host the ticks that spread it. Named after the town in Connecticut where it is said to have originated, Lyme has garnered increasing attention as some patients seem to develop vague but debilitating symptoms, usually years after the original infection; symptoms that experts tend to dismiss as psychosomatic and unrelated to Lyme (even as conspiracy theorists maintain these medical denials are a plot and There Be Skullduggery afoot). Maybe aliens are involved, who knows.

(I use the term “disease” here, by the bye, with some disquiet; there seems to be much overlap in descriptions and discussions of Lyme between disease and illness – illness usually being defined as the patient’s subjective experience, versus the objective signs that are classified as a disease.)

It all begins with a bull’s eye – usually, maybe, sometimes

Ticks, said Aristotle, clearly not a fan, are “disgusting and parasitic”. Ugly too. These tiny thumbtack creatures survive by boring into a host organism such as a mouse, deer or human and – à la Twilight – sucking its blood. They’re vampires, in other words. Once the tick has sunk its, er, fangs, some patients develop a rash resembling a target or bull’s eye and a bacterial infection that may or may not have symptoms. This, it is said, results from the tick passing on a rare type of bacterium called a spirochete. Known as Borrelia burgdorferi (after Willy Burgdorfer, who painstakingly identified the spirochete in a tick in the early 1980s), a spirochete looks a bit like a coiled telephone cord, hence its name. I will not bore you with the intricacies of the different types of tick, or the link to another disease, babesiosis, a malaria-like illness also found in New England, though I could. Believe it or not, parasitology is actually quite fascinating.

The problem, at least from a purely scientific perspective, is that the spirochete hypothesis came after the realization that, in most cases, Lyme Disease responded quickly and well to antibiotics. This led researchers to work backwards to find the culprit bacterium. In other words, as physician Robert Aronowitz writes in Making Sense of Illness (Cambridge University Press, 1998), “To say that the discovery of the Lyme spirochete led to rational treatment is to put the cart before the horse [and] owes more to the idealization of the relationship between basic science and therapeutics than to the actual chronology of investigation.” It is, Aronowitz suggests, more like a Texas bull’s-eye: you shoot the gun, then draw the bull’s eye around the bullet hole.

This is especially problematic since early antibiotic treatment means that any trace of the bacterium is usually wiped out, so its existence becomes more abstract inference than demonstrated fact.

If you’re a disease, at least be new, modern and famous

Nevertheless, the narrative that’s evolved around Lyme Disease goes as follows, this quote from the recent New Yorker article that sparked my curmudgeonly instincts: “Lyme Disease was all but unknown until 1977 when Allen Steere, a Yale rheumatologist, produced the first definitive account of the infection.” Just one problem. It ain’t necessarily so.

If we want to nitpick (and you know I do), a disease called ECM (erythema chronicum migrans), which is uncannily similar to Lyme Disease, appears in European medical texts as far back as the late 19th century. Also characterized by a bull’s-eye rash (called erythema migrans, wouldn’t you know), ECM in some people also appeared to result in flu-like symptoms. It was never definitively demonstrated whether the cause was a tick (ticks are also common in northern Europe) or a virus, and since the majority of cases were mild and self-limiting, nobody paid that much attention.

Plus, ECM was described by a lowly branch of medicine, dermatology (think Lars, Phyllis’s husband on the Mary Tyler Moore Show, if you can remember that far back). Lyme Disease, though, was identified by the exalted ranks of a specialty with more nous, rheumatology, and then championed by a group of angry, well-off mothers in New England who were furious that their children seemed to be coming down with some kind of disease nobody knew much about; a disease, moreover, that seemed to mimic rheumatoid arthritis. Since the focus was children, the media immediately jumped on board (and the ringleader-mother, Polly Murray, appears to have been adept at channeling their interest). There may have been joint pain in the European ECM patients, but those patients were all adults, in whom joint pain may well have been considered more or less normal.

But in New England, well, there you had a veritable PR maelstrom: children being bitten by these vampiric creatures (ticks, for the record, are arachnids, not insects); distraught mothers and heroic scientists swooping in to figure out what this strange, dire new disease could be.

Why does this matter? It matters because new diseases are always more terrifying than old, known ones. Just as we all relax when we find out the potentially lethal symptoms keeping us up at night are actually shared by three quarters of the people in our office and are just what’s “going around”. But a new disease? Affecting children? With bizarre symptoms? That’s scary. And wherever there are descriptions of a disease, the incidence of that disease increases.

In the case of Lyme Disease, that interest hasn’t waned, with the end point always the same: a plea for more good science (not that bad kind of science people usually like to do).

Guidelines uber alles

The American Infectious Diseases Society guidelines maintain that Lyme Disease is usually easy to treat and cure. A few weeks of antibiotics does the trick in most cases and relapses are rare. Patients, advocates, as well as some rather strange conspiracy sorts, disagree and here’s where we run into one of my pet peeves, that objective/subjective, disease/illness demarcation that shouldn’t be a problem but all too often is.

Patients and their families and friends, at least in the fairly small number of Lyme sufferers who develop lingering and mysterious symptoms (ranging from unpleasant but benign ones like headache and insomnia to the weird and wonderful: “joints on fire”, “brain wrapped in a dense fog”), feel that the medical community has deserted them and is ignoring their very real pain, the very real fact that their lives have been horribly affected. As with chronic pain and other conditions that simply defy our reductionist explanations, the rhetoric descends into an either/or proposition. Either the disease exists as explained by the guidelines, or it does not. Either the tick bite leads to dreadful long-term symptoms in everyone – or it does not. Nothing in between.

Which is clearly nonsense.

Terms like “idiopathic” (of unknown origin) or “post” (post-viral, post-traumatic) have been coined to describe these symptoms, these patients, mostly because we simply don’t know what to do with them. And by “we” I mean everyone. Society. The culture at large. (I wrote about our issues around chronic pain in an earlier post.)

The biomedical model simply cannot explain the complexities of human experience, human disease, human illness. Not only are there vast differences between individuals in their physical and physiological selves, there are social, cultural, dietary and myriad other differences besides. It is simply not feasible to “fix” every underlying “cause” to get rid of a “disease”. Even an infectious disease that we know is caused by a virus or bacterium does not affect everyone. Necessary but not sufficient is the phrase. The TB (or any other) bacterium is necessary for TB but not sufficient. Other factors must be present.

So why is it so difficult to believe that in some people that tick bite, with or without the bull’s eye rash, might lead to long-term problems? Problems amplified by the individual, who also believes there is a problem that needs fixing and whose stress levels rise as a result. After all, if they feel so lousy it must be something terrible – cancer, maybe.

We believe in the magic of medicine, so when it fails us we are hurt, angry, disappointed. This explains why Lyme (or chronic fatigue etc.) activists so often sound like such loony tunes. Even as they decry the evils of the medical establishment they search for legitimacy from it, absolution, confirmation that what they are feeling is “real”. (Which will also translate into other institutions recognizing said condition, which then has other consequences, like disability.) True, there is the odd hypochondriac, Munchausen’s or factitious patient. But there are also people who suffer from pains and disabilities that medicine cannot explain – and abandons, using the term “psychosomatic” like a cudgel. So what if it’s psychosomatic? All psychosomatic means is that the illness or symptoms originate in the mind, not the body (at least insofar as we can tell – our imaging and tests and so on not being exactly infallible). Who cares where the problem originates when people need help? Isn’t medicine about exactly that: doing no harm, helping people feel better, function better? It seems logical that some people have the type of immune system that reacts, over time, to some kind of toxic insult, tick-related or otherwise. These are the folks who develop rheumatic and other symptoms over time, the ones that medicine refuses to countenance.

What I do not understand is why. Why does not having a diagnosis, a label, mean you have to deny that people even have a problem? (Some Hon. Members: Shame, Shame.)

* they meant coniferous

Why cats make the worst patients (and the dog ate my homework)

Charlie stopping to smell the flowers in healthier times

Charlie, one of the cats, was seriously ill and Lyme Disease (which was the designated subject for this post) went clear out of my head. It shall return. Meanwhile, I’ve been nursing Charlie, aka Houdini cat (who will literally disappear into the towel you think you’ve wrapped around him securely), reminding myself that nursing is a noble, noble profession. (That’s what you call professions that are bloody hard and nobody appreciates.) I’ll say one thing, taking a cat to the animal hospital does give one a quick lesson in the perils of for-profit medicine (my Visa may never recover) – especially in our risk-obsessed age where tests and scans trump individual history, personality and symptoms (human or animal). It also reminded me that one must be vigilant when faced with the ponderousness of Expertise.

In Charlie’s case it began with a neurological condition called Horner’s, an irritation of a nerve that runs down one side of the face, past the eye, down the neck and into the chest – not a disease but a symptom. Naturally Expertise immediately rushed to the worst possible diagnosis: lymphoma or, in a pinch, brain tumour. (Do not pass ‘go’, just head for the hills.) I mildly posited inflammation or infection, probably ear related, particularly since Charlie’s had those before. But noooo.

Critical Care, human or animal, is rife with Expertise: grave, gravel toned and confident. Why? Because they have tech toys, that’s why. Cool devices and imaging technologies that purport to explain the mysteries of life. Even (ha ha) a cat scan. All of which push the patient into ever higher levels of care – because they can. Problem is, the patient often can’t.

I tried to hold my ground, but it’s a slippery slope, that one; the surer they are, the more one caves, especially when they start to say, well, with cats an elevated white blood cell count could mean X. I mean, what do I know from cat physiology?

So the cash register ka-chinged and Charlie looked steadily worse. Of course nobody looks good in ICU, between the ugly fluorescent light and the tubes, but there’s something especially pathetic about a small furry creature sitting in a cage. And Charlie, well, that cat could have taught Stanislavsky a thing or two about looking sad.

I kept getting calls to tell me things I already knew (he has a heart murmur). The last time I snapped, “I know. I have one too. Big deal.” That didn’t, naturally, stop them from getting a cardiology consult. Bearing in mind that cats don’t hold still for much of this, so they need to be anaesthetized.

Finally, after every possible dire diagnosis had been ruled out, we came round to my original hypothesis: ear infection.

Don’t get me wrong. I have enormous respect for veterinary physicians. They study long and hard (far longer than human doctors) and by and large they are great. They deal with a diverse patient population that’s uncooperative and uncommunicative. And when I say diverse I’m talking species. And they need to make a living, I get that.

What they, and most of us, do not get, however, is that they are part of the culture at large, and the culture at large is obsessed with the “science” of medicine, leaving the art further and further behind. Watching Charlie work his way through the system reminded me of just how much medical focus has shifted away from the patient and towards disease and technology; towards what tend to be called “objective” results (versus the messy subjective ones patients bring).

I see this on a human level every time I go to the retinologist with my mother (that, by the bye, is a sub-specialty of ophthalmology). First, they get her to read the letters on the chart and are all impressed at how well she sees. Then they take their pictures and look grave: how could she possibly see that well with those terrible ridges in her retina? (To me they just look like the Alps.) Then they look puzzled. The scan says you can’t see, but you actually did. What a gonzo dilemma. So, they go with the scan and give her the medication. Objective trumps subjective.

Question is, should it? Does it make sense for the patient to get lost in this morass of ostensibly objective ‘data’?

Not to my way of thinking. “Normal” – blood pressure, lipid level, whatever – is a best-guess average based on population statistics and what some committee has deemed appropriate. If you’re truly sick it shows up. C-reactive protein in the clouds – well, objective and subjective tend to match. Your joints hurt, you have some kind of inflammatory condition and the test backs you up. It’s that grey zone that’s problematic. Levels fluctuate in every individual, and tests can be wrong (some more than others). Error rates in some tests are as high as 75%. But we forget that.
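Humour me with a bit of arithmetic on that grey zone – the numbers here are entirely made up and describe no particular test – because the result always surprises people: even a test that is right 95% of the time produces mostly false alarms when the condition it screens for is rare.

```python
# Hypothetical numbers, purely for illustration: a test that is 95%
# sensitive and 95% specific sounds reliable, but if only 1% of those
# tested actually have the condition, most positives are false.

def positive_predictive_value(sensitivity, specificity, prevalence):
    """Probability a positive result reflects real disease (Bayes' rule)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

ppv = positive_predictive_value(0.95, 0.95, 0.01)
print(f"Chance a positive result is real: {ppv:.0%}")  # roughly 16%
```

Five positives out of six are false, in other words – which is why the same result means something quite different in a patient with symptoms (high prevalence) than in a healthy person screened on a whim.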

So, cat or human, we are lumped in with the many-too-many – and our individual narrative gets lost. In Charlie’s case nobody believed this pretty little cat, who had only been ailing for a week, could possibly “just” have a madly inflamed ear affecting his balance and appetite. An infection is no picnic. But it’s not a brain tumour. And of course Charlie’s Oscar-winning ability to look mournful didn’t help. This cat can look sad when he feels ignored; imagine how dreadful he looked when he was dizzy and queasy. It’s a gift. But it’s not diagnostic.

You need a proper history; the back story. The person with the disease is as important as the disease, said Hippocrates. Let’s say you end up in hospital with severe abdominal pain. It matters whether you’ve had this pain before, but less intense or of shorter duration. Sudden abdominal pain could be many dire things; a worsening of an existing problem is probably nothing that will kill you (otherwise you wouldn’t be in the ER in the first place). The clinical picture changes with the history. Someone has to factor it in.

Charlie’s doing better now. As for the rest of us – who knows. We may never survive the tech age.

Summer Reruns

Everyone may stream their entertainment on their teeny tiny phones but that’s just the tech; without a good story summer still means reruns.

So in the fine old tradition of reruns, I give you  a recap of the summertime blues.

(Coming soon, Lyme Disease. Stay tuned.)

Pity Pity Bang Bang

It’s depressing how often these ‘incidents’ involving high-powered weapons seem to occur in the United States, where there are almost as many guns as people. Everyone cried about Sandy Hook, but there have easily been four or five more that I’ve read references to in the paper.

I couldn’t believe it when I read that federal health agencies cannot comment on the public health consequences of guns; haven’t been able to do so for well over 20 years.

I wrote on this in January 2011 and frankly don’t see that there’s much to add … except that, at least according to this piece in the New England Journal of Medicine, such fatalities are actually in the minority; the majority of gun deaths occur quietly, among family members and people one knows.

Hey, I get that. You get mad, you want to hit someone – if you have a gun, it’s damn easy to pick it up and shoot. But if all you have on hand is a stick or even a knife, well, you can do some damage but the death toll usually doesn’t rise to the double digits.

Just last week some fellow went postal in a downtown apartment building in Vancouver. I know the place, I used to live a block from there. Reports are mixed but there appears to have been a knife and a hammer involved. Several people were hurt but only one remains in critical condition in hospital. The rest were kept overnight and let go. They’re fine. Which they would not have been had the perpetrator carried an M16.

Guns don’t kill people, people do? Ah, no. People with guns kill people. People with sticks and stones, not so much.

I stress, Eustress

It’s become such a ubiquitous concept that it’s difficult to imagine how recent a term “stress” really is. When Hans Selye first proposed that all tension, all sources of anxiety, created the same kind of reaction within the body, it seemed ridiculous. And this was in the fifties, to the best of my recollection. Of course now we all know that too much stress is bad for us and that stress is a factor in disease.

[So so many things that we take for granted – cardiac risk factors, prevention, stress – are such recent concepts. But we think they’ve been around since the year dot.]

Stress is hard on the immune system, affects us hormonally and causes muscle tension and fatigue. It gets in the way of sleep, which causes its own set of problems ranging from poor concentration to anxiety, and depresses normal pain signals. Which is why soldiers and athletes often don’t feel the pain of a major injury and it is only later that they realize they’ve damaged something.

Then there’s the good side of stress, or eustress. Without some stress we would have zero motivation, zero reason to excel or create. That’s why there’s that old graph showing how some stress is good before a major task, say an exam; with some stress performance gets better. But, if it gets too high then performance suffers.

Which we all know from our own experience.
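For the visually minded, that old graph – the inverted U – can be sketched in a few lines. The curve below is invented purely to show the shape; nobody has ever measured performance on a zero-to-one scale like this.

```python
# A toy inverted-U, in the spirit of that old stress-performance graph:
# performance rises with arousal up to a point, then falls. The formula
# is made up for illustration; only the shape matters.

def performance(stress):
    """Peaks at stress = 1.0 and falls off on either side (floor of 0)."""
    return max(0.0, stress * (2.0 - stress))

for label, s in [("too little", 0.2), ("just right", 1.0), ("too much", 1.8)]:
    print(f"{label:>10}: stress={s}, performance={performance(s):.2f}")
```

Too little stress and nothing gets done; too much and the wheels come off – the exam-day experience, in one made-up parabola.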

Wandering around Paris, what strikes me as well is the extent to which our actual, physical environment can create or reduce stress. When what is around us is beautiful, when we hear laughter, when the sun is shining – well, it’s hard to feel too unhappy or stressed. No accident that depressed areas inevitably are ugly.

It’s hard to be too stressed when one is a tourist in Paris, well, unless one tries too hard to make the French conform to one’s North American ideas of time, speed and interaction.  Personally, something that I think is rather wonderful here is the very formal aspect of saying ‘bonjour’ whenever one walks into a place, any place. It is a way, I think, of humanizing the service person, the waiter, the person in the store. When one stops to say ‘bonjour madame’ or ‘bonjour monsieur’, one has to pause and look at the person and realize this person is not simply part of the scenery, they are an actual human being. It adds a touch of humanity to what is often a rather soulless encounter.

The French are currently pilloried for their dislike of capitalism, their failing economy, their rising youth unemployment.  Several august bodies are miffed that in spite of all of this money markets still love France, which can borrow money at brilliantly low rates, which suggests they’re not worried about France’s future. There’s a palpable sense of outrage about this on the part of business writers, The Economist, various commentators – usually Anglo Saxon. Why? Why does everyone need to conform to the same ideas?

The French fought a revolution which had at its basis the value of the human being.  Extreme wealth, especially ostentatious wealth, is frowned on in France. I can think of worse things.

In any event, given the stress we all experience when all we focus on is money and making more of it, it seems to me that the French are on to something.

Beware the Bandersnatch my son (aka the “link”)

If I read the word “link” one more time in some ostensibly serious health article I will – well, let’s just say that like Dorothy Parker’s Tonstant Weader I will thwow up.

Looks like a Bandersnatch to me …

Last week “scientists” apparently linked one’s gait as one aged to one’s likelihood of developing Alzheimer’s. Yet another observational study, casting about for some connection to something; naturally they eventually found some tenuous connection somewhere – at least one that they could write a press release about.

(As a researcher once described estrogen – “a drug in search of a disease”.)

No mention of whether this gait thing might have had something to do with other, perhaps undiagnosed, problems such as osteoarthritis or inner ear issues or what-have-you. No, one more thing for us to worry about as we get older – our damn gait.

That vile word “link” (plus variations like “linked”, “linking” and so on) always seems to land in the headline, which, of course, is all most people read. So we read that higher levels of Vitamin D3 are linked to all manner of marvelous things, from not getting cancer and heart disease to staying young and sharp and simply mah-velous. Never mind that when you simply test people who are well, compare them to people who are not, measure their “level” of D3 (as though all of us have the same ideal level) and then say, ‘oh look, high D means better health, so why don’t we all take a supplement’, you have no way of knowing which came first, the good health or the D3. For all we know, various diseases deplete the body of D3 and the lack of the vitamin is not the cause of the problem but its consequence.
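A toy simulation makes the reverse-causation trap concrete. All the numbers below are invented: in this little world illness strikes at random and *then* depletes D3, yet a naive comparison of sick and well people still produces a tidy “low D linked to disease” finding.

```python
# Reverse causation, simulated: the disease lowers D3, not the other way
# around, but the observational snapshot can't tell the difference.
import random

random.seed(42)
people = []
for _ in range(10_000):
    sick = random.random() < 0.2      # 20% fall ill, independent of D3
    d3 = random.gauss(75, 10)         # everyone starts around the same level
    if sick:
        d3 -= 20                      # illness depletes D3 (our made-up rule)
    people.append((sick, d3))

avg = lambda xs: sum(xs) / len(xs)
sick_d3 = avg([d for sick, d in people if sick])
well_d3 = avg([d for sick, d in people if not sick])
print(f"average D3 – sick: {sick_d3:.0f}, well: {well_d3:.0f}")
# The sick group's D3 is indeed lower — yet supplements would fix nothing,
# because low D3 is a consequence here, not a cause.
```

Run a study on this population and you would dutifully report that low vitamin D is “linked” to disease, write your press release, and be entirely wrong about what to do next.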

A number of more cautious researchers have been saying exactly this, to no avail. Various and sundry institutions from the Cancer Agency to the WHO have all decided to chime in with their recommendations that people take supplements.

This same kind of nonsense proliferated in the talk around estrogen for pretty much most of the 20th century. Researchers gushed that estrogen “replacement” therapy (later “hormone replacement therapy”, or HRT, after it was found that estrogen alone could cause endometrial cancer) kept women young and healthy and prevented heart disease and dementia and probably hives and hangnails.

Replacement is in quotes earlier, incidentally, because it makes no sense to consider the hormone level of a woman of 23 normal for a woman at all other stages of life, particularly midlife, when all women’s hormones naturally decline.

Observational study upon observational study found a correlation (“link”)  between women who took hormones and improved cardiac function, fewer heart attacks and strokes, better health, you-name-it.  Well, except for the smidgeon of extra risk relating to breast cancer which epidemiologists dismissed as irrelevant. Of course this was not irrelevant to women, who didn’t rush to take hormones in droves, much to the researchers’ dismay.

Then the other shoe dropped. The largest clinical trial in history, the Women’s Health Initiative, definitively showed that not only did estrogen not protect women from various and sundry age-related conditions, it actually could cause them. Cardiac disease was higher in women who took hormones and there was nothing “healthy” about HRT at all.

But hey, they had studies that “linked” estrogen use with health and who were we to argue?

A lot of people ask me about supplements, calcium and D3, this and that, largely, I think, because of those headlines linking this or that arcane nutrient with health. Which is where my problem with all of this lies.

You can print whatever nonsense you want, provided you don’t make it sound as though you know what you’re talking about. Especially in the headline. People actually change their behavior based on these things. People start taking things, adding things, subtracting things. Forgetting that health is multifactorial, complex and begins in the womb.

You won’t have strong bones as an adult if you were malnourished as a child. Wealth tends to lead to health. People are different. And the nutrients we ingest in food are in a balance and ratio that the body can absorb. Versus our best-guess estimate of what an ideal amount of D3 or B3 or T3* might be.

So beware the dreaded link as though it were the bandersnatch. On average, I think the latter is more benign.

*Tylenol 3