Insiders say data was mined to aid marketing, and that therapists were pushed to favor enterprise patients over others
By MATTHEW ROZSA
Former employees and therapists at Talkspace told The New York Times that the company regularly reviewed anonymized conversations between medical professionals and their clients in order to mine them for information. Because the text conversations are considered medical records, users are unable to delete the transcripts. One therapist claimed that after she referred a client to resources outside of the Talkspace app, a company representative told her she should advise clients to continue using Talkspace, even though she says she had not disclosed that conversation to anyone at the company. The company argued that the conversation may have been flagged by algorithmic review, according to The Times.
A pair of former employees claim that Talkspace data scientists reviewed clients’ transcripts to find common phrases and mine them to better target potential customers. Talkspace denied that it mines data for marketing purposes. Similarly, a number of therapists told The Times that Talkspace seemed to know when clients worked for “enterprise partners” like JetBlue, Google and Kroger, and would pay special attention to them. One therapist claims that when the company thought she was taking too long to respond to two clients from Google, it reached out to her and expressed concern. The Times also reported that the company was working on bots to catch cues that a client may be in distress, ostensibly to help therapists spot red flags they might otherwise miss.
A Talkspace spokesperson denied the allegations. “We pay special attention to all our corporate partners and their employees just as we do each consumer client,” John Reilly, general counsel at Talkspace, told Salon by email. “The key difference is onboarding a large corporate account is a bit more complicated than matching one person correctly, so we have extensive implementation protocols and implementation managers for each large corporate client to ensure a smooth transition at the start of each relationship.”
Regarding the bots, Reilly told Salon that “we provide our Therapist network with an array of analytical tools for their digital practice. One program will look at the encrypted text to alert the therapist to language that may indicate a client with emergent issues or escalating language trends.”
The Times highlighted the story of a man named Ricardo Lori, who was hired into the company’s customer service department after years as an avid user. When an executive asked whether excerpts of his therapy chat logs could be read in front of staff to give them a better sense of user experiences, assuring him that he would remain anonymous, he agreed. After the presentation, however, The Times reports, Lori’s confidence was betrayed.
As Mr. Lori drank a tall glass of red wine and watched, he noticed that a few employees kept glancing his way. Afterward, a member of the marketing department approached and asked if he was OK. Later, Oren Frank, Ms. [Roni] Frank’s husband and the chief executive, thanked him in the elevator. Somehow, word had gotten around that Mr. Lori was the client in the re-enactment.
In response to these accusations, Talkspace wrote in a post on Medium that many of its answers to The Times’ interview questions did not make it into the final story, and that the responses of a prominent clinician who supports Talkspace were also left out.
Despite the accounts of the former employees and therapists who spoke out, Reilly denied that Talkspace data-mines transcripts for marketing, claiming that only data purged of identifiable user information is used, and solely for quality control.
The Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule places strict limits on how health care providers may share patients’ information. Among other things, it permits practitioners to share medical information with one another only for purposes such as providing adequate care to their patients; it bars them from casually disclosing private medical information to the general public; it gives patients the right to see and, if necessary, correct their records; and it prohibits providers from using personal medical information for marketing without the patient’s authorization.
“If it is true that Talkspace used information from private therapy sessions for marketing purposes, that is a clear violation of trust with their customers,” Hayley Tsukayama, a legislative activist at the Electronic Frontier Foundation, told Salon by email. “All companies should be very clear with their customers about how they use personal information, make sure that they don’t use information in ways that consumers don’t expect, and give them the opportunity to withdraw consent for those purposes on an ongoing basis. Talkspace trades on its trustworthiness and mentions privacy frequently in its ad campaigns. Its actions should be in line with its promises.”
Talkspace has previously come under fire over its ethics, labor practices and efficacy. In 2016, the company was accused of forcing therapists to use scripts that promoted Talkspace services, of lacking sufficient plans for patients in danger, and of monitoring conversations between therapists and patients. (That story was originally reported by The Verge.) The company was also accused of threatening legal action against therapists who tried to establish relationships with clients off the platform, which CEO Oren Frank said happened only “in a few extremely unusual cases” because the therapists allegedly engaged in “serious ethics violations or potentially dangerous communications.”
Like other online therapy apps, Talkspace is not covered by most health plans, and a regular subscription generally costs hundreds of dollars out of pocket. Therapists who spoke with Salon last year about online therapy apps like Talkspace said the companies paid therapists poorly and concealed the actual wages. That mirrors what many gig workers have seen at similar contingent-labor companies like Uber, DoorDash and Lyft, where employers obscure real pay to make gig work seem more appealing.
This article originally appeared in Salon.