Another Article Shows that Price Transparency Tools Don’t Work (Plus My Explanation Why)

I have a bunch of papers I’ve saved to read, and I think it’s time to start going through them.

This one is called Online Advertising Increased New Hampshire Residents’ Use of Provider Price Tool But Not Use of Lower-Price Providers. It’s in Health Affairs, 2021, and was written by Sunita Desai, Sonali Shambhu, and Ateev Mehrotra. I always look forward to reading studies that Dr. Mehrotra was involved in.

Based on the title, you can guess that this paper will be good to challenge my assertions that people can make value-sensitive decisions if they are given the right information. Let’s see what they found!

The background to this study is that price transparency efforts are “plagued by low consumer engagement.” Typically fewer than 15% of people actually use whatever price transparency tool (usually a website) is provided, and often the number is much lower. So the question is, are so few people using these price transparency tools because they don’t know they exist, or is it for some other reason? Possible other explanations the researchers provide include patients just plain not being interested in using the price transparency information, and patients wanting to use the information but being unable to for some reason.

To answer their question, the researchers launched a “large targeted online advertising campaign using Google Ads to increase awareness of and engagement with New Hampshire’s price transparency website.” Then, if the ads drove awareness of/traffic to the price transparency website but the claims data still did not reveal any changes in how often patients select lower-price providers, they could conclude that awareness isn’t the main contributor to the ineffectiveness of price transparency websites.

They created ads for three specific services: emergency department visits, physical therapy services, and imaging services, all chosen because they’re generally shoppable (yes, that includes emergency department visits, a high percentage of which would be more amenable to urgent care!) and because their cost is typically low enough that people will not meet their deductible by receiving the service, which means they’ll be paying out of pocket for some percentage of the service.

The ads took people straight to the NH HealthCost website, which uses as its source data actual negotiated prices between provider-insurer dyads (oh, the magic of all-payer claims databases!) (gag clauses be damned). It doesn’t plug a patient’s exact plan or year-to-date spending info into the website to give an exact out-of-pocket cost, though; the best it can offer is an “estimate of procedure cost,” along with a “precision of the cost estimate” (low or high).

After the researchers spent $39,000 on ads over the course of 6 months, website traffic went up from 265 visits/week to 1,931 visits/week. People who clicked on the ads spent, on average, about 30 seconds on the site. I just tried going to the site myself, and 30 seconds is definitely not enough time to truly search for a procedure and use the results to choose a provider. But maybe they returned later and spent more time on the website, which is plausible since the non-ad-related visitors typically spent more time than that.

Voila, awareness can drive traffic to a price transparency website! The researchers cannot exactly calculate what percent of all patients who used one of those three services during the study period visited the website, but their best-case estimates are 77% for ED visits, 13% for imaging services, and 54% for PT visits. The true numbers may be much lower. But still, it’s enough that you’d think there would at least be some measurable difference if going to the website impacted provider selection.

Nope. Not at all. When compared to other states (as a control group, using a difference-in-differences analysis), there was absolutely no measurable impact on provider selection.
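For readers unfamiliar with difference-in-differences, here is a minimal illustrative sketch of the logic (all numbers are made up for demonstration; this is not the authors’ actual model, which would use regression on claims data):

```python
# Difference-in-differences (DiD) compares the before/after change in a
# treated group (NH, which saw the ads) against the before/after change
# in a control group (other states). Subtracting the control group's
# change removes background trends, leaving an estimate of the
# intervention's effect.

# Hypothetical shares of patients choosing a lower-price provider:
nh_before, nh_after = 0.20, 0.21      # New Hampshire (ad campaign)
ctrl_before, ctrl_after = 0.19, 0.20  # control states (no campaign)

nh_change = nh_after - nh_before          # change in treated group
ctrl_change = ctrl_after - ctrl_before    # background trend

did_estimate = nh_change - ctrl_change    # estimated treatment effect

print(f"NH change:      {nh_change:+.3f}")
print(f"Control change: {ctrl_change:+.3f}")
print(f"DiD estimate:   {did_estimate:+.3f}")
```

In this toy example both groups improved by the same amount, so the DiD estimate is essentially zero, which is exactly the pattern the study found: whatever shift occurred in New Hampshire also occurred in the control states.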

Before reading this paper, I was a little surprised at the title. But now, having read the paper and checked out the website myself, I’m not surprised.

Think about it this way: We all compare Amazon prices when we’re shopping at Walmart, don’t we? Americans seem to be very interested in saving money and getting the best value as often as we can. But suppose we’re at Walmart and the best we can get when we pull up Amazon on our phones is, “This item will cost approximately $10 less than what you’re seeing right now in Walmart, and the precision of this estimate is low. Also, we can’t say anything useful about the quality of this product since there are no helpful reviews.” How influential do you think that would be for people? It almost seems like the risks of ending up with a more costly or lower-quality product are high enough that you might as well go for the sure and familiar thing right in front of your face. So it is with provider decisions.

The investigators nailed it when they said, “Our findings emphasize that awareness of prices does not simply translate into price shopping and lower spending. There are numerous barriers to using price information. People might not know the details of their benefit design to infer their out-of-pocket expenses. Customized out-of-pocket spending estimates may be critical.”

Agreed. They nailed it. And another thing is critical: easy-to-understand and relevant quality information. I’ve still never seen a research study that’s been able to give patients both exact out-of-pocket prices and relevant quality information side by side. Those two details, plus insurance plan designs that require patients to pay more if they choose a higher-priced provider, are all essential to actually impacting patient decisions about where they will receive care.

Think of these efforts to get patients to alter their provider decisions as a bridge. This bridge has to span a deep chasm 30 metres wide. If the bridge consists of three spans, each a 10-metre-long deck, the only way to cross to the other side is by having all three of those decks in place at the same time. Even if you build two of the three decks, nobody is going to cross that bridge (except, maybe, a few daredevils willing to risk their lives). Likewise, very few patients will risk choosing a different provider than the default one (the nearest one, the one their friend recommended, the one their doctor mentioned, the one they’ve gone to before, etc.) unless they have all three of those pieces in place.

Do these findings change my mind in any way? No. It was certainly a useful and well-performed study, but until I see a study that gives patients all three decks of that bridge (and makes them aware that the bridge exists!), I doubt I will ever see a study on price transparency tools show any significant impact on the number of value-sensitive decisions. Which makes me mourn all over again that CMS did not fund the study I designed for the Utah Department of Health. It would have been the first to build that bridge and then watch what happens to patients’ provider selection and prices and quality in the market.
