Health Outcomes: Building the next button

By David Windhausen, executive VP of Intouch Solutions.

 

[Image: amazon-dash-pill-bottle]

Last March 31st, Jeff Bezos and his merry pranksters in Seattle gave us all a good laugh – an ironic laugh, as it turned out. On that day, suspiciously close to April Fools’ Day, Amazon announced the launch of the Dash, a tiny brand-emblazoned device that took “one click” to the extreme – one touch of its button would bring a particular frequently used product right to your door, without any of the inconvenience of, say, going online and ordering it from Amazon’s website. Just place the Dash next to wherever you keep your laundry detergent or coffee or toilet paper, press the button when needed, and more magically appears.

Given the borderline surreality of the Dash and the curious date of its launch, plenty of folks thought it was another one of those ubiquitous techie jokes, like the Google Actual Cloud Platform, YouTube SnoopaVision, or Samsung’s Internet of Trousers.

But it wasn’t, as several embarrassed news sites were quick to clarify. The Dash button was quite real. A year later, several dozen consumable brands – Tide, Charmin, Glad, Red Bull, Doritos, Schick, Gerber, and many others – have their own Dash buttons, and when Amazon announced the addition of Trojan condoms to the list this March 31st, the world at large wasn’t quite as skeptical as last time.

Why does this matter to pharmaceutical marketers? Because the Dash may be the ultimate example of how a company has been able to use technology in such a way as to make a particular choice practically invisible to the customer. If I have a Tide Dash button stuck to the wall in my laundry room, I don’t have to add Tide to my shopping list, I don’t have to go to the store for it, I don’t have to go to Amazon’s website and order more. At the very moment that I’m pouring Tide into this week’s load of laundry and the container starts feeling a bit light, I just press the button. I don’t need to know a thing about the vast logistical complexities behind that button; I just press it and Tide shows up at the door. No more than a moment’s thought is required.

And if technology can be used to make consumer decisions invisible and instantaneous for Amazon, why can’t the same be done for healthcare treatment? Because, humans being what we are, we will be much more inclined to follow an adherence regimen in which all we are required to do is press the button.
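
To make the idea concrete, here is a minimal sketch of what the logic behind a one-press “adherence button” might look like. It is purely illustrative: the Patient record, the refill threshold, and the order_refill call are all invented for this example and reflect no real Amazon or pharmacy system.

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List


    @dataclass
    class Patient:
        name: str
        medication: str
        doses_on_hand: int
        refill_threshold: int = 5                 # reorder when supply runs this low
        dose_log: List[datetime] = field(default_factory=list)


    def order_refill(patient: Patient) -> None:
        # Stand-in for a call to a pharmacy fulfillment service (hypothetical).
        print(f"Refill of {patient.medication} ordered for {patient.name}")


    def on_button_press(patient: Patient) -> None:
        # The patient's entire job is one press; everything else stays invisible.
        patient.dose_log.append(datetime.now())   # log the adherence event
        patient.doses_on_hand -= 1                # decrement the on-hand supply
        if patient.doses_on_hand <= patient.refill_threshold:
            order_refill(patient)                 # trigger a refill automatically


    if __name__ == "__main__":
        me = Patient(name="Pat", medication="ExampleStatin 20 mg", doses_on_hand=6)
        on_button_press(me)                       # one press: dose logged, refill ordered if needed
        print(f"Doses logged: {len(me.dose_log)}; remaining: {me.doses_on_hand}")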

Imagine that. Imagine if positive health outcomes could be created just from pressing the button. And imagine, further, that a patient’s day-to-day life is filled with an entire tapestry of nearly invisible buttons, each associated with some point in the health care journey and tied to improving outcomes.

It is perfectly possible, and becoming more so each day. In fact, many of the precursor technologies already exist. The forward-thinking among us have made lots of hay of wearables like the Apple Watch or the Fitbit, devices that can passively collect health data, and there’s been plenty of speculation as to how our industry might integrate such things into outcomes-focused solutions for patients. But plenty more is out there. What about the Sleep Number Bed with Sleep IQ, which monitors restfulness, breathing, and heart rate? Or, yes, the Toto Intelligent Toilet II, which can record weight, BMI, blood pressure, blood sugar, and sodium levels with no change at all in patient behavior; everyone has to go to the potty, after all. Or the iTBra, which can help detect breast cancer and is only one of many first-generation articles of consumer smart clothing for all manner of health purposes – Ralph Lauren’s Polotech shirt, Hexoskin, and Athos are others. Or the Google/Novartis contact lens with embedded blood sugar monitoring. Or even the Samsung Family Hub Intelligent Refrigerator, which – while it can’t yet actually keep track of whether your milk is running low – offers a “family hub” touchscreen that could potentially be integrated with other data sources to offer relevant health messaging, perhaps a snapshot of last night’s sleep state for the user of the Sleep IQ bed, or an encouraging note to the dieting user of the Intelligent Toilet.
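
None of these devices exposes a common interface today, but the pattern they suggest is simple: readings flow in passively and pool into one picture of the patient. Below is a hedged sketch of that pattern, with the device names, metrics, and ingestion method invented purely for illustration.

    from dataclasses import dataclass
    from datetime import date
    from typing import List, Optional


    @dataclass
    class Reading:
        source: str      # e.g. "smart_bed", "smart_toilet", "wearable" (illustrative names)
        metric: str      # e.g. "resting_heart_rate", "weight_kg", "blood_pressure_systolic"
        value: float
        taken_on: date


    class PassiveHealthProfile:
        """Pools whatever the devices report, with no effort from the patient."""

        def __init__(self) -> None:
            self.readings: List[Reading] = []

        def ingest(self, reading: Reading) -> None:
            # In practice each device would push readings over its own API.
            self.readings.append(reading)

        def latest(self, metric: str) -> Optional[float]:
            matches = [r for r in self.readings if r.metric == metric]
            return matches[-1].value if matches else None


    if __name__ == "__main__":
        profile = PassiveHealthProfile()
        profile.ingest(Reading("smart_bed", "resting_heart_rate", 62, date.today()))
        profile.ingest(Reading("smart_toilet", "weight_kg", 81.4, date.today()))
        print(profile.latest("weight_kg"))        # 81.4, gathered with zero patient effort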

Then, underneath all these consumer-facing examples, we’ve seen extraordinary strides made in technology’s ability to understand and react to our natural means of communication through text, speech, or visual cues to predictively assist us in establishing the right course of action. Everyone with an iPhone has fiddled with Siri at one time or another, and here at Intouch we’ve explored specific applications within healthcare for natural speech and artificial intelligence technologies like Amazon Echo. But those technologies are only the beginning. Right now there are new advancements being made that go far beyond what we have experienced with Siri, Google Now, or Microsoft’s Cortana. In fact, some of the creators of Siri are working on “Viv,” a far more advanced version of their previous invention that can handle much more complicated questions – “Will it be warmer than 70 degrees near the Golden Gate Bridge after 5PM the day after tomorrow?” its creators asked Viv at the first public demo in May, and the app had no trouble responding accurately. And Apple itself recently purchased a company called VocalIQ that’s working on a similar offering. Both Viv and VocalIQ have the ability to associate location, time, and consumption traits to better understand and recommend or act upon a request. They can learn, they can remember previous queries, they can access a virtually unlimited number of data sources, they can handle multiple simultaneous complex requests across a variety of subject matter, they can act on behalf of the user by placing orders – and they can and will become the primary interfaces between users and the companies they patronize, from the pizza shop around the corner (“Get me the usual, except with olives this time, tell them to deliver at 7PM, and put it on my Visa”) to, yes, Amazon, and do so with – as the Washington Post’s Silicon Valley correspondent put it when discussing Viv – “the spontaneity and knowledge base of a human assistant.”
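
The “act on behalf of the user” part is worth pausing on. Here is a toy sketch of that behavior using the pizza example above; the keyword check and the order-placing call are crude stand-ins, nothing like the language understanding a Viv or VocalIQ actually relies on.

    from dataclasses import dataclass, field
    from typing import List


    @dataclass
    class Assistant:
        usual_order: List[str] = field(default_factory=lambda: ["large pepperoni pizza"])
        history: List[str] = field(default_factory=list)   # remembers previous requests

        def handle(self, request: str) -> str:
            self.history.append(request)                   # every request adds context for next time
            items = list(self.usual_order)
            if "olives" in request.lower():                # crude stand-in for real language understanding
                items.append("olives")
            return self.place_order(items, deliver_at="7 PM", pay_with="Visa")

        def place_order(self, items: List[str], deliver_at: str, pay_with: str) -> str:
            # Stand-in for a call to the pizza shop's ordering service (hypothetical).
            return f"Ordered {', '.join(items)} for {deliver_at}, charged to {pay_with}."


    if __name__ == "__main__":
        helper = Assistant()
        print(helper.handle("Get me the usual, except with olives this time, "
                            "tell them to deliver at 7PM, and put it on my Visa"))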

It honestly does not take much foresight to predict what all this might mean for our own business of health care – or what it could mean, if we in the business have the will to make it so. Imagine it – a patient surrounded by effectively invisible passive measuring devices, supported by an artificial intelligence that can answer complex questions (“Viv, how many calories have I burned in the last two weeks? How does my kidney function look this morning? Did I take my meds?”), proactively communicate (“Don’t forget to take your meds!”), and take real actions, and with access to a persistent health record that can be parsed by permitted health care providers, not to mention analytics algorithms throwing off timely suggestions, recommendations, and health tools of all flavors according to need and circumstance. The more advanced among health care marketers sometimes speak of the marketing web or tapestry – being able to touch the right patient at the right time with the right content in the right medium. But what I’m talking about goes one enormous step further; it goes beyond mere communications, beyond individual brands or disease states, and into the actual day-to-day lives and actions of our patients. And if we in health care aren’t excited about this possibility – we who speak of putting the patient’s needs first, who complain nonstop to each other about the costs and frustrations of noncompliance and the difficulty of getting through to patients and earning their trust – well, then, we just aren’t paying attention.
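
As one last sketch, here is roughly how the assistant’s side of that ecosystem could be wired: each question routed to whichever personal data source can answer it. The keyword routing and the canned data sources are invented for illustration; a real system would sit on actual device feeds, a real permissioned health record, and far better language understanding.

    from typing import Callable, Dict

    # Hypothetical personal data sources the assistant is permitted to query.
    def calories_burned_last_two_weeks() -> str:
        return "You have burned roughly 21,000 calories over the last two weeks."

    def kidney_function_this_morning() -> str:
        return "This morning's readings are within your usual range."

    def medication_taken_today() -> str:
        return "Yes, the 8:02 AM button press logged this morning's dose."

    ROUTES: Dict[str, Callable[[], str]] = {
        "calories": calories_burned_last_two_weeks,
        "kidney": kidney_function_this_morning,
        "meds": medication_taken_today,
    }

    def answer(question: str) -> str:
        # Route the question to the first data source whose keyword it mentions.
        lowered = question.lower()
        for keyword, source in ROUTES.items():
            if keyword in lowered:
                return source()
        return "I don't have a data source for that yet."

    if __name__ == "__main__":
        print(answer("Did I take my meds?"))
        print(answer("How does my kidney function look this morning?"))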

No one is doing this, really. Yes, a few pharma companies have experimented with wearables here and there, focusing on one disease state or another. But no one in health care has even begun to attempt combining “invisible” technologies, data collection, predictive analytics, communications, and artificial intelligence into an entire ecosystem of care and support for patients, covering the whole person and any health-related challenges she might face.

It will be done, because it can be done, and the human value, not to mention the business value, of doing it is just too great to be ignored forever, even by an industry like ours so adept at ignoring new technologies until forced to do otherwise. Someone – some company, some consortium, some mad scientist entrepreneur – is going to create the health-technology ecosystem that will change everything. Will we in pharma just be pressing the button? Or will we be building the button? That remains to be seen.