News and Media

The dangers of using Facebook to measure hospital quality

“The problem with measurement,” Dennis S. O’Leary once said, “is that it can be a loaded gun: dangerous if misused and at least threatening if pointed in the wrong direction.”

Today, in the era of social media and “Big Data”, researchers and consultants salivate because: a) data (e.g. ‘likes’, tweets, ‘shares’, bit.ly links) are freely available and easy to measure; and b) clients and governments have been primed for decades to believe that more data means better decisions.
As Gary Marcus of New York University, my colleague Eric Meerkamper and I have argued, the race to capture more and more data (“Hey, we’ve got a quintillion bytes! How much do you have?”) is a flawed investment model favored by a subset of faddish Silicon Valley venture capitalists more enamored of data size than of data quality. Yes, testosterone, not logic, is at play here.
As the political forecaster Nate Silver has noted, “Big Data will never replace thinking or hypothesis-testing.”
The strangest example of measurement misfire comes in the form of Facebook studies. Beware any study that measures “Likes” and correlates them with anything of significance, notably anything to do with healthcare. The capacity to game “Likes” has long been proven. My colleague and I received one “Like” on our 2011 book on obesity policy (drawing on data from over 100 countries), which is used at Harvard and in syllabi around the world. The book received global media acclaim, award nominations and sales. I felt bad about our Facebook flop until I learned that the best-selling author in the same category, David Kessler, had received no “Likes”!
Facebook, lest I remind readers, began as a game, originally designed to compare the beauty of freshman women at Harvard. To suggest that the number of “Likes” received by an attractive young woman, attractiveness being subjectively defined, correlated in any manner or form with her intelligence or studiousness is absurd. What would be a statistical certainty is that one could correlate her presumed beauty with something. Correlations, as any undergraduate student knows, do not equal causation, especially in the era of Big Data, where the sheer size of the data can generate correlations that may be the precise opposite of the truth.
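To make the statistical point concrete, here is a minimal Python sketch; the sample sizes, variable names and numbers are invented purely for illustration and are not drawn from any real study. Screen enough unrelated metrics against any outcome and some will look “significant” by chance alone.

    # Hypothetical illustration only: with enough variables, chance alone
    # yields "statistically significant" correlations with any outcome.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_hospitals = 40        # a small sample of hospitals
    n_noise_metrics = 1000  # made-up metrics: likes, shares, tweets, ...

    outcome = rng.normal(size=n_hospitals)                   # e.g. some quality score
    noise = rng.normal(size=(n_noise_metrics, n_hospitals))  # metrics unrelated to it

    p_values = [stats.pearsonr(metric, outcome)[1] for metric in noise]
    print("noise metrics 'significant' at p < 0.05:", sum(p < 0.05 for p in p_values))
    # Expect roughly 50 of the 1,000 to clear the bar; none of them means anything.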
In one of the oddest of such recent Facebook studies, published in the American Journal of Medical Quality by Alex Timian and colleagues at the Healthcare Innovation and Technology Lab, a quantitative analysis of the Facebook pages of 40 New York hospitals was used to determine whether Facebook “Likes” were associated with hospital quality and patient satisfaction. Results suggested that hospitals with lower 30-day mortality rates received more “Likes”, and that hospitals with more “Likes” were more likely to be recommended by patients. Both relationships were statistically significant, leading the authors to suggest that the number of “Likes” may serve as a proxy for patient satisfaction and an indicator of hospital quality.
I know a little about hospital indicators, academic publishing and the Internet.
Rule 1: Publish a study, or a press release about a study, with the word “Facebook” in the title and you are guaranteed media publicity. To wit: a press release once used the number of times Facebook was mentioned in divorce proceedings to suggest that Facebook was causally related to actual divorce rates (“1 in 5 Divorces Caused by Facebook!”). This media-grabbing strategy is effective public relations.
The study to which I am now referring, on hospital quality and “Likes”, was, in fact, distributed to me by the $48-billion, taxpayer-funded Ontario Ministry of Health and Long-Term Care, as if it were of grand global significance to data researchers.
Rule 2: 30-day mortality rates for heart attack vary vastly across hospitals, for reasons that are complex and partly unknown. Most hospitals do not have active Facebook pages; some do. Patient satisfaction measures, however they are collected, are prone to ceiling and floor effects: most hospitals cluster near the top of the scale (say, 90%), leaving little variance for anything to correlate with. Any statistically significant correlation between more “Likes” and higher satisfaction scores in New York-area hospitals is therefore, on its face, questionable.
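Why do ceiling effects matter so much here? A hypothetical Python sketch, with invented numbers, shows the mechanism: once scores pile up against the top of the scale, the variation that remains is small, and observed correlations shrink.

    # Hypothetical ceiling effect: true quality drives satisfaction, but the
    # survey scale tops out at 100, squeezing out most of the usable variation.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 1000  # deliberately large so the effect is easy to see
    true_quality = rng.normal(size=n)

    raw_scores = 96 + 6 * true_quality + rng.normal(scale=3, size=n)
    capped_scores = np.minimum(raw_scores, 100)  # scores cannot exceed 100

    print("correlation with quality, uncapped:", round(float(np.corrcoef(true_quality, raw_scores)[0, 1]), 2))
    print("correlation with quality, capped:  ", round(float(np.corrcoef(true_quality, capped_scores)[0, 1]), 2))
    # Capping pushes most scores toward 100 and attenuates the correlation.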
Rule 3: Facebook users are a subset of a subset of a subset of hospital users. They are generally younger, healthier and more active than typical hospital users. People who hit the “Like” button on a hospital page do so for all sorts of reasons, including the fact that their “friends” “Like” the hospital. Remember: Facebook is a social network. People who congregate in Toronto are more likely to say they like, or dislike, the same thing or person, such as the Mayor. And there is a tremendous social desirability bias in what you tell your friends you like, in perpetuity, on the Web.
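One more hypothetical sketch, again with made-up numbers, illustrates how patient mix alone could produce a “Likes”/satisfaction correlation: if hospitals serving younger, more wired patients collect both more “Likes” and higher satisfaction scores, the two will move together even when quality plays no role at all.

    # Hypothetical confounding by patient mix: "Likes" and satisfaction both
    # track how young and online a hospital's patients are, not its quality.
    import numpy as np

    rng = np.random.default_rng(2)
    n_hospitals = 40
    young_share = rng.uniform(0.1, 0.9, size=n_hospitals)  # share of young, wired patients
    quality = rng.normal(size=n_hospitals)                  # unrelated to patient mix

    likes = 500 * young_share + rng.normal(scale=30, size=n_hospitals)
    satisfaction = 70 + 20 * young_share + rng.normal(scale=3, size=n_hospitals)

    print("likes vs satisfaction:", round(float(np.corrcoef(likes, satisfaction)[0, 1]), 2))
    print("likes vs quality:     ", round(float(np.corrcoef(likes, quality)[0, 1]), 2))
    # By construction, quality influences neither number, yet the
    # "Likes"/satisfaction correlation looks impressive.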
It’s hard enough to explain hospital quality measurement, and its limitations, to very sophisticated researchers, even with the most rigorous, risk-adjusted data available. And now some are saying that we’re supposed to judge hospital quality for our loved ones by the number of “Likes” the hospitals have? Rubbish. My own research during the last pandemic, H1N1, showed that the most heavily shared and downloaded YouTube videos were scary-sounding, gross misinformation about the alleged lack of safety of the flu vaccine.
Social media is a game, not a serious decision tool.