Your Health Data Is for Sale

 

Like it or not, data brokers have probably monetized your health data.

But how do they get it in the first place? And what measures can you take to safeguard your health data? In this enlightening episode, I talk to data privacy expert Lucia Savage to find out.

Lucia Savage is a nationally recognized thought leader on using HIPAA to advance digital health while protecting privacy. As Chief Privacy and Regulatory Officer, she drives Omada Health’s continued commitment to advancing digital health, including the safe, secure, and effective handling of participants’ personal health information. Prior to joining Omada, she served the Obama Administration as Chief Privacy Officer at the U.S. Department of Health and Human Services Office of the National Coordinator for Health IT. She first implemented HIPAA for Stanford University in 2000, and later oversaw the HIPAA compliance and data strategy for California’s pre-ACA health insurance exchange, PacAdvantage, where she served as General Counsel. Lucia has a B.A. with Honors from Mills College and received her Juris Doctor summa cum laude from New York University School of Law.

Topics covered:

  • Who is buying our health data and why

  • The implications of health data being sold

  • Understanding HIPAA, GDPR, and CPRA

  • If consumers should be compensated when their data is used for research

  • How to safeguard your healthcare data

  • Why GoodRx got fined by the FTC

Transcript

Halle: So maybe you can start by telling us the backstory about how you became so passionate about data privacy and digital health.

Lucia: Um, I would love to. It's a story that is very dear to me. It was around 2005, like the mid-aughts, no HITECH yet. And I was the primary health advocate for my mother, who was in her late seventies at the time. She had advanced COPD, but also bipolar disorder; I don't remember which type. And furthermore, she was allergic to steroids, which are a normal drug for treating COPD, because they would cause mania.

And she'd been taking lithium for her bipolar disorder. She had a situation where her primary care physician ran normal kidney tests, and they came back in a way that concerned the PCP. The PCP conferred with the psychiatrist, and they decided to take my mom off the lithium and gave her an on-patent medication.

And Halle, you and I are old enough to remember the donut hole, which would've had very significant financial consequences for our family, for this patented medication that in fact wasn't working very well. And so I asked my mom, did her doctor consult with a nephrologist, a kidney doctor? And my mom said no. And I said, well, let's go talk to the nephrologist.

So we got the consult, and we went. I drove, you know, the 90 miles to my mom's nephrology appointment, and he opened the EHR. So this is the mid-two-thousands, before HITECH, and he had like a 10-year history of her lab values and everything was color-coded. He shared the screen with us, and we could really see how hospitalizations had affected her labs, what drugs were affecting the kidney function tests, all the things.

And he explained really carefully that essentially the PCP read the results wrong, that certain drugs were making the lab tests look worse than they were, et cetera, et cetera, and that we could put her back on this non-patented, completely generic medication that cost $5 a month and actually worked. And so just looking at that patient interaction and the records, and the way they were visually designed from the data so that we could see them, I mean, I was completely sold on this as a way to help families really manage complicated medical conditions.

Halle: Thanks for sharing that -- Can you tell us more about what sort of health data is being bought and sold?

Lucia: Sure. Well, I think the bottom line is there's a lot of health data that is not from the traditional healthcare system. There are very specific rules within the traditional healthcare system about not only disclosing health information, but really prohibiting the sale of data when it's not de-identified.

So that's everything that's kind of in the HIPAA zone, and we can talk about that. But with the advent of smartphones and social media and people's digital interactions, and what we call their digital exhaust, you know, people talk on their social media about whether they had a cold, did they have COVID, is there cancer in their family?

Um, you know, all kinds of things about their health are exposed on social media, as well as in the digital exhaust of our own search histories. You might not be disclosing something expressly on social media, like posting on your platform saying "I have a fever of whatever," but you're literally just searching for cold medication, or searching for physicians who offer particular services in your area, using Google or whatever your browser of choice is.

So everyone who's online has these digital trails behind them, the trails are often a lot about health, and that data can freely be bought and sold.
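
To make the "digital exhaust" idea concrete, here is a minimal, illustrative Python sketch of how a handful of search queries could be mapped to inferred health-interest segments. The keyword lists and segment names are invented for this example; real ad-tech profiling uses far richer signals and machine-learned models.

```python
# Illustrative sketch: inferring health "interest segments" from search queries.
# Keyword lists and segment names are invented; real systems use many more
# signals (clicks, location, purchase history, ML models).

SEGMENT_KEYWORDS = {
    "asthma": ["inhaler", "albuterol", "asthma", "hepa filter"],
    "pregnancy": ["prenatal vitamins", "crib", "maternity", "ovulation"],
    "cold_flu": ["cough medicine", "fever reducer", "sore throat"],
}

def infer_segments(search_history: list[str]) -> set[str]:
    """Return the ad segments suggested by a user's search history."""
    segments = set()
    for query in search_history:
        q = query.lower()
        for segment, keywords in SEGMENT_KEYWORDS.items():
            if any(keyword in q for keyword in keywords):
                segments.add(segment)
    return segments

if __name__ == "__main__":
    history = [
        "best HEPA filter for bedroom",
        "albuterol inhaler coupon",
        "crib recall list 2023",
    ]
    print(infer_segments(history))  # {'asthma', 'pregnancy'}
```

Even this toy version shows how a few innocuous-looking searches add up to a health profile that can be attached to an advertising identifier.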

Halle: But would information on my asthma or my period be valuable to a company? It doesn’t seem that interesting.

Lucia: All right, so asthma's a great example. If a social media company knows that you have asthma, and their business model is getting revenue from people who want the social media company to push ads into your feed, now they know that you have asthma and it's all about what ads they can sell to you that might catch your eye because you have asthma. Clearly there are medication ads, you know, the latest and greatest asthma medications. But then there's all the related things: HEPA filters, air filters, allergy medication. I'm trying to think of all the asthma things, you know, humidifiers, whatever, right.

All the things that might go with treating an asthmatic. And I think, for anyone who's had a young child or a miscarriage, we actually see people talking about this, because you get pregnant, or your partner gets pregnant, you go online, you start looking for baby stuff, and then there's a miscarriage.

Now you have this digital trail indicating you were pregnant or going to have a baby, and there isn't a trail indicating the miscarriage, so you keep getting ads for the baby supplies.

Lucia: That's really tragic. It's kind of salt in the wound for those people. Uh, but that's sort of how the system works.

Halle: Yeah. How big of an industry is it, buying and selling health data?

Lucia: Well, I think you have to separate the literal buying and selling of the data from the monetization of the digital exhaust through ad placement. They're like two different revenue models. Does that make sense?

Halle: Okay.

Lucia: Right. I honestly don't know, because I tend to work more in the healthcare space itself, but I know it's in the...

Halle: Yeah. So recently GoodRx got fined by the FTC for sharing health data with Google and Facebook, presumably for these ads that you're talking about. Can you give us a little backstory of what happened here?

Lucia: Yeah. Apparently what happened is a couple of things. First of all, GoodRx never was within traditional healthcare. It wasn't a HIPAA covered entity, it wasn't providing services to a HIPAA covered entity, but they asserted in public that they were "HIPAA compliant," and your listeners should imagine the air quotes, including little seals on their webpages attesting to HIPAA compliance.

And HIPAA, you know, ranks pretty high on the list of well-known and completely misunderstood laws in America, probably right behind the First Amendment. So GoodRx was asserting that it was HIPAA compliant, but the activities in its own marketing processes were actually leaking all of this health information to social media, which would not have been allowed had they been in HIPAA.

And so there were two big problems. One is they weren't in HIPAA, but they implied in their advertising, very expressly with these symbols, that they were. And secondly, they didn't do what HIPAA would've required. So those are the two things that were part of the complaint.

Halle: Yeah. And so I assume they had to take these HIPAA seals off their website, and they were also fined.

Lucia: Yes. They were fined $1.5 million by the FTC. They had actually taken the seals off a while ago, but the FTC was kind of looking backwards at their conduct, which it found to be misleading. And, you know, GoodRx has a statement out on the internet about this, but honestly, they stipulated to the allegations in the complaint.

We can take those as true. But the other interesting thing about GoodRx, and we find this often with FTC settlements, is that there's kind of a trailing civil litigation that's now sprung up. A nationwide class action was filed last week, and that class action is not only alleging similar claims that GoodRx lied to its customers, basically misrepresentation, but also that GoodRx was covered by some state health privacy laws, specifically California's Confidentiality of Medical Information Act, and that its leakage of that data violated that act as well.

So it's a complicated legal space, and GoodRx is not outta the woods yet.

Halle: Yeah. And did they actually monetize that data when they were sharing it with Google and Facebook, or were they just sharing it for their own use, for retargeting or something?

Lucia: The allegations are that they were sharing it for their own use, for retargeting. There's not an allegation that they sold it, in the sense of receiving revenue for having disclosed the data. You know what I mean?

Halle: And so our listeners understand retargeting: this is essentially a way that websites can serve ads specifically to people who have already been on their website… to remind them to come back.

Lucia: Right, exactly. It's that digital exhaust we were talking about. The website owner says, oh, Lucia came to this website and seemed to have been interested in this part of my product; I wanna send her ads about related services.
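
For readers who want to see the mechanics, here is a deliberately simplified Python sketch of cookie-based retargeting: a site records which product categories a cookie ID has viewed, and an ad slot elsewhere later looks up that same ID to pick a "come back" ad. In reality this runs through tracking pixels, third-party cookies, and ad exchanges; everything here is collapsed into one in-memory store purely for illustration.

```python
# Simplified, self-contained sketch of cookie-based retargeting.
# Real retargeting involves tracking pixels and ad exchanges; here the
# whole flow is collapsed into one in-memory store for illustration.

from collections import defaultdict

# "cookie_id" -> set of product categories the visitor looked at
view_history: dict[str, set[str]] = defaultdict(set)

def record_page_view(cookie_id: str, product_category: str) -> None:
    """Called by the retailer's site when a visitor views a product page."""
    view_history[cookie_id].add(product_category)

def choose_ad(cookie_id: str) -> str:
    """Called later, when the same cookie ID shows up in another site's ad slot."""
    categories = view_history.get(cookie_id)
    if categories:
        return f"Still thinking about {sorted(categories)[0]}? Come back!"
    return "Generic ad"

record_page_view("cookie-123", "outdoor cushion covers")
print(choose_ad("cookie-123"))  # Still thinking about outdoor cushion covers? Come back!
print(choose_ad("cookie-999"))  # Generic ad
```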

Halle: But there are some more nefarious uses of buying and selling data. I was reading through a Senate Commerce Committee testimony on this topic and saw some of the lists the data brokers sold. They included a list of rape victims, a list of people suffering from genetic diseases, and a list of seniors with dementia, each being sold for under eight cents per name.

For a list of seniors with dementia, you can imagine that it could be sold to someone who could use it to really take advantage of those people.

You know, the case with GoodRx might not negatively impact your life. Sure, you'll see some annoying ads, and no, it's not ethical. But it could be worse.

Lucia: No, I think that's exactly right. And it's interesting, because there are uses of, say, that dementia list that could be really helpful, like ads to the family members of the people with dementia so that they understand where they can get social services resources to help with that. We might actually want that. But you're right, it could be sold to actors who are careless and actors who are malevolent.

And I don't think the brokers have... like, there's no rule that says you can't sell it to a malevolent actor once you can sell it.

Halle: Yeah. These data brokers are just incentivized to sell these lists to as many people as possible.

Lucia: That's right. But if I could just add one more thing. You know, I live and breathe this, and I have sort of been watching the evolution of consumer understanding, probably since Cambridge Analytica broke six years ago, seven years ago now, 2018, however long that is. And I think there are more consumers who are aware of how this works than there used to be.

Um, but I also think that people get a lot out of it. I mean, I use social media, and I use it to connect to my cousins and my friends, and I don't necessarily put my personal health information on there. But remember, social media sites, which are then obtaining this health information, can also be really important sources of support.

You know, community support for people with particular diseases and conditions, as they trade peer-support information, even across a country or across countries, you know, North to South America, et cetera. So there are pros and cons here to disclosing your health information in a non-healthcare setting.

Halle: Yeah. I think it's just knowing that there are third parties that are monetizing your data without any sort of oversight of who they're selling to and how it'll be used.

But phone numbers, addresses, those are the sort of things that feel a little more invasive. I had a situation, this is not healthcare related, where I was Googling replacement cushion covers for an outdoor sofa and came across a site of a brand I had never heard of before. Then two weeks later, a catalog for that company arrived in the mail, addressed to me. It was not a coincidence. When retargeting is just in my browser, that's one thing. But when they know where I live -- that totally spooked me.

Lucia: No, that's exactly right. But remember, you might have had a catalog subscription somewhere else, and then that address got sold to a broker, and that's kind of where the large-scale data analytics come in, right? It's the same data techniques that connect your IP address and your Google cushion-cover search to this other data and match it up.

And then you get mail. That's actually the same analytic technique we use to figure out complicated, multiple-comorbid-condition pharmacology results when we do research; it's the same level of data science. But yeah, it can totally creep you out.

I had to look up a new company for work once, and it was about 30 minutes before I started getting ads on LinkedIn for that company. 30 minutes. It was so fast.
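
As a hedged sketch of the kind of matching Lucia describes, the snippet below links an online browsing event to an offline mailing address through a shared, hashed email. The field names, the hash choice, and the two tiny data sets are illustrative only; real "identity graph" vendors match on many more keys (device IDs, IP addresses, address history) and use probabilistic techniques.

```python
# Illustrative sketch of record linkage: joining online browsing data to an
# offline mailing list via a shared hashed email. Field names and data are
# invented; real identity-resolution systems are far more elaborate.

import hashlib

def hashed(email: str) -> str:
    """Normalize an email address and hash it, a common matching key."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Data set A: offline catalog subscriptions (mailing address keyed by hashed email)
catalog_list = [
    {"email_hash": hashed("halle@example.com"), "address": "123 Main St"},
]

# Data set B: online events keyed by the same hashed email
browsing_events = [
    {"email_hash": hashed("halle@example.com"), "searched": "cushion covers"},
]

# The "join": for every browsing event, look up a mailing address.
addresses = {row["email_hash"]: row["address"] for row in catalog_list}
for event in browsing_events:
    address = addresses.get(event["email_hash"])
    if address:
        print(f"Mail a '{event['searched']}' catalog to {address}")
```

The point of the sketch is simply that once two data sets share any stable key, a broker can stitch them together and turn anonymous-feeling browsing into a physical mailing.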

Halle: So we know that data we are willingly giving to digital health apps is being bought and sold… but what about data we give our doctors?

Lucia: They should not be. That's the plan. So theoretically, this is what's supposed to be happening. First of all, just to be very clear: if it's your hospital, or your health plan, or your doctor's office, or even, you know, if you use an app like Omada's but it's in HIPAA, the sale of identifiable information is absolutely prohibited.

It's just, full stop, not allowed. You can de-identify data, and there are standards for what that means. And then you can release that de-identified data for free for research, and you can recover the cost of data preparation. So that activity also happens, but the data is supposed to be de-identified, and there are supposed to be contractual prohibitions against re-identification.

So there is a lot of crunching of the de-identified data; there's a lot of peer-reviewed literature about the good and bad of that. But it's not supposed to be sold identifiable if it's in HIPAA; that's full stop prohibited. It's different if you are using social media or an app that you might have downloaded from the app store that's not a HIPAA covered entity.
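
To give a flavor of what de-identification means in practice, here is a minimal Python sketch that strips direct identifiers from a record and coarsens a date of birth to a year. This is only an illustration of the concept; it is nowhere near a complete HIPAA Safe Harbor or Expert Determination process (Safe Harbor enumerates 18 identifier types), and the field names are made up.

```python
# Minimal illustration of de-identification: drop direct identifiers and
# generalize a quasi-identifier. NOT a complete HIPAA Safe Harbor
# implementation; field names are invented for this example.

DIRECT_IDENTIFIERS = {"name", "ssn", "email", "phone", "street_address"}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed and
    the date of birth coarsened to a birth year."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "date_of_birth" in clean:
        clean["birth_year"] = clean.pop("date_of_birth")[:4]  # keep year only
    return clean

record = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "date_of_birth": "1947-06-02",
    "street_address": "99 Elm St",
    "diagnosis": "COPD",
    "lab_creatinine": 1.4,
}
print(deidentify(record))
# {'diagnosis': 'COPD', 'lab_creatinine': 1.4, 'birth_year': '1947'}
```

The contractual prohibitions against re-identification that Lucia mentions exist precisely because stripped records like this can sometimes still be linked back to people, as the record-linkage sketch earlier suggests.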

And I hope you ask me how people can know the difference, because we could talk a little bit about that. An app should be doing what it promises to do about your data. That's where companies have really fallen down. Sephora fell down, Glow fell down, and GoodRx fell down. They made assertions to consumers that they didn't stand by.

So misrepresentation is illegal.

Lucia: If you had an app that said, absolutely, I'm gonna give your data to TikTok, and you signed up for that, that wouldn't be a misrepresentation.

Halle: Yeah. But it's usually small print. You have to scroll through it, and you just want to get to the app.

Lucia: Right. So, two things, just to get back to GoodRx.

So that's a very interesting fact about GoodRx: it had this symbol of, you know, an acronym that people tend to recognize and tend to associate with something valuable. And it was a lie, right, this little HIPAA symbol? It was a lie on multiple levels. So I think that's important.

But the other thing I think about GoodRx is, if you read the congressional testimony, you know that there's a weak spot, because the Federal Trade Commission, which really oversees the consumer space here, doesn't have all the authority we might wish it had to protect us. But what's interesting about GoodRx is that it's kind of the first shot across the bow of this fortified FTC.

Biden has appointed a bunch of much more aggressive commissioners, and they've empowered their staffs to go do stuff, and this is kind of the first thing they've done. And so I expect to see more. They have been very active in guidance documents about consumer health information; there's a new guidance document about how to handle health data in a non-HIPAA setting, et cetera, et cetera.

So I think GoodRx is intended to be both what we call a sentinel event, meaning everyone looks at it and goes, oh, I'd better stop doing that, as well as kind of the first shot across the bow. We'll have to see what comes next.

Halle: I wanna talk about the less nefarious ways that our data, de-identified, can be used, because that's really interesting to think about. If you’re providing health data that is being used to create lifesaving cures -- that’s great. But should you get a cut of the pharmaceutical company’s profits? Should you be rewarded for that?

Lucia: Yeah, that's a really great question. So the short answer is there are definitely people who believe that should happen, and it builds on the way the current research system works. Right now, if you are participating in a research activity directly and you sign up, you will no doubt receive some kind of compensation.

It probably won't be a lot, but you'll probably receive something for your time and effort. The thing is, with de-identified data, of course, the whole point is that the researchers don't know who the people are. That's the whole point. So there's a trade-off between paying a person and having the data be essentially de-identified, or anonymized to a certain level. There are trade-offs there.

There's also an ethical piece to the compensation part of it, which is, you know, we want people to be compensated for their time and trouble, but we don't want research to be financially coercive. So an example that you would find in an ethics textbook is a person who's incarcerated: if you pay them, can you pay them less and still get the information you need, because they're incarcerated and they don't have choices, or because they're impoverished?

So you have to think about the ethical side of what happens when you start paying individuals for their data. Right now people are mostly quite altruistic about participating in research if it's of interest to them.

Halle: Yeah. So 23andMe obviously has become famous for that, right? You're answering questions, you're not compensated for it, but it's kind of fun and you get some information back about your earwax or whether you have the cilantro gene. And so they kind of gamify it for us.

Lucia: That's right, that's right. And I've thought about 23andMe. I know they were doing a whole big long form for people whose families had a history of mental illness, because I have that in my family. I'm like, well, maybe I can help improve pharmacology and improve diagnostics by letting my genes be analyzed.

I decided not to do that. And the former privacy officer there was a friend of mine, so it wasn't about a privacy thing; I just decided that I wasn't informative enough to, you know, sign up.

Halle: Yeah. It's interesting. Thinking about our willingness to share this information on social media, on apps like 23andMe, do you feel like there's a generational divide in our willingness to share?

Lucia: You know, I don't really think so. Because I see people younger than me feeling more private and more protective about their social media than maybe I do, and there are people older than me who are private and protective. I have a really dear friend who has a very complicated medical situation, and she will literally shame her family members if they imply on social media that she has a condition. She's very protective of her privacy.

So I think, to me, it's more about your cultural background and what brings you joy. Let's be honest, sometimes social media is fun.

Halle: And sometimes having some sort of group that has been through what you've been through, when it feels like no one around you in your day-to-day understands, can give you a sense of not being alone.

Lucia: Exactly, a hundred percent. And again, I think what I see is that consumers now are more sophisticated about how everything works. So you and I were joking about me looking something up and having it show up on LinkedIn, and I do this for a living. But I think more people are aware of that process, right?

You look up something and pretty quickly you get ads about the thing you looked up.

Halle: Yeah. So if I wanted to see all of my health information and the exhaust that you speak of in one place, like what is out there that I'm vulnerable to, what have I willingly put out there, what is out there that I don't realize is out there, what could be bought and sold, is that possible?

Lucia: I a hundred percent believe it's possible. I think you really need some expert hackers. The Light Collective is doing some work, and, I'm just gonna give you all the women I know, Nina Ali is doing some great work at the Biohacking Village. I think you have to use the hacker tools to find out what the companies know about you and what they're collecting from your patterns.

Halle: Yeah. And so what are some things that listeners can do to safeguard information that they don't want out there?

Lucia: Well, I think number one is just be aware that your browsing history does create this digital exhaust. So if you wanna be really private about your browsing history, you know you're gonna have to go to the store and look on the shelves instead of having the convenience of looking it up online. And I think we all love that convenience.

So I don't know that that will change, but it would certainly not create a digital trail. Another thing you can do is just be cautious about how you're using social media relative to health information about you or your family members. And remember, we are the safekeepers of our family's information as well: who's having a baby, where a new niece or nephew was born, what the teenagers around you are going through.

So it's other people's data too. We've all seen that advice column where the mother-in-law posts pictures of the grandchildren on social media, but the daughter-in-law doesn't want the pictures on social media. So just think about that for yourself and the people around you, because once it's out there, it's pretty hard to get it back.

And then lastly, yes, you might have to read some terms of service. So if you're going to go look for an app in the app store, one key thing you can really look for is how that app makes money. I'll just contrast really quickly: the way Omada makes money is we bill your insurance company.

It's all arranged by a contract ahead of time, and we'll never monetize your data. And we're a hundred percent in HIPAA, because we bill your insurance company just like your doctor's office does. Not all apps are like that. So rather than reading terms of service, you might consider: how is this app making money?

Are they making money by selling the data? Are they making money because I'm paying them a particular fee, like a monthly subscription fee? Are they making money because they're billing my insurance company? If they're in that last category, HIPAA's gonna apply, so you can feel somewhat better. In the prior two categories, you have to think about how you wanna use that app, what it's going to know about you, and whether you are, in fact, the product.

Halle: Okay, so cash pay?

Lucia: Cash-pay apps are not in HIPAA.

Halle: Yeah. How would you approach thinking about using an app? Like, should you look up the developer and where they're based, who's behind it, whether they're venture funded? What are some things you should do in your diligence as a consumer?

Lucia: Well, I have noticed that the app stores have ratcheted down a little bit on the app developer registration process and want to know more about the app developer, and I think that's a good step in the right direction. What I would love to see the app stores do is kind of help the consumer categorize, right?

Like, if an app developer says, I'm in HIPAA because I bill insurance companies, let's make that transparent in the app store.

Halle: Yeah, that's a good idea.

Lucia: Right. Or, you know, if the app developer says, I comply with the GDPR, the European rules, but I'm actually based in, you know, Poughkeepsie, New York, well, then they don't have to comply with GDPR.

It's not a law that applies to them. And what if they change their mind on a go-forward basis? You have to just think about this. So yes, unfortunately, right now we're in a stage where lots of reading of terms of service is required if you really wanna be careful about this. And I think that sort of gets us back to the FTC.

You know, Congress hasn't acted here. I think they've heard a lot from industry, but I'm not sure there's a clamoring of consumers to fix this, and I wonder what would happen if consumers were really clamoring. But we have many other things in our lives that are also of deep concern to us.

And maybe this just isn't a priority.

Halle: Can you tell us about GDPR and what we can learn from the EU and everything that's happening?

Lucia: Sure. I think GDPR has some good benefits in that it has really helped people, and it created standards for who has custody of the data for whom, right? It's got this whole processors and controllers kind of rubric. And so if you think about the total tech stack of an internet-based company, it really helps a person understand who's in that stack.

I think we've all experienced cookie fatigue with the requirements that you consent to cookies and do your cookie settings, just like notice fatigue in the healthcare system. We all just wanna get to the page, so we say yes and don't really think about our settings and think it through.

And there's actually been a fair amount of testimony about how that's an unfortunate part of GDPR that probably didn't work as well as intended. But the other thing is, GDPR really rests a lot on the idea of notice and consent: we're gonna give you notice, and you are going to consent or not consent to how we do stuff. And that puts a burden on you as the user.

And the interesting thing about a structure like HIPAA and some of the state laws is that they create baseline requirements for the company that apply whether you consent or not,

Halle: Mm-hmm.

Lucia: and that, to me, is better for the consumer: it creates an environment where the consumer can have baseline trust and not have to think about it all the time.

Halle: Yeah. So I just got back on Monday from London, and I did notice that every time I was on a different website, a pop-up showed up and said, you know, this site uses cookies, which ones are you willing to opt into? And it would be marketing ones, analytical ones, functionality ones, or none.

And I felt like it was nice to have the choice to say none and feel like I was protecting my privacy a little more. But it gets old, as you said, cookie fatigue. Every time you go to a website, you kind of look for the quickest way to just hit none. Certainly there is gonna be a browser extension that'll do that for us soon.

Right.

Lucia: Yes. And people already are working on that. There are definitely companies that have browsers where they promise higher levels of privacy, but maybe the search results aren't as good. There are a lot of trade-offs in this space.

Halle: Yeah. And then what about the California Privacy Rights Act?

Lucia: So that's a great question; I've just spent a lot of time trying to implement that. Remember, the California Privacy Rights Act amends the earlier law, the California Consumer Privacy Act, so it's all CCPA. It's gonna be really interesting to see what happens, Halle, because it's applying for the first time to some areas that haven't typically been regulated.

One would be business-to-business data collection. So for example, for your company, if you had a webinar and you wanted potential business development partners to sign up for that webinar, then you would potentially be responsible for CCPA compliance for the broker information or the prospective customer information that you've collected. That's a first, so it's gonna be new for a lot of companies. Also, interestingly, I think if CCPA had been in effect at the time, GoodRx would've had some serious problems under CCPA. But it wasn't a law at the time

of the allegations in the FTC complaint. We'll see what happens going forward. And California is such a big economy that there's a lot of thought that maybe, because people have to comply if they're based in California or they're serving a certain quantity of California residents, that's gonna have a spillover effect and sort of improve behavior across the country.

That was also anticipated for GDPR, but I don't know the research on whether that spillover actually happened with GDPR or not.

Halle: Yeah. And a lot of these, I mean, I believe this is true for GDPR, but it's definitely true for the California Privacy Rights Act: startups aren't obligated, right? Like, they're too small, they don't have to comply. I don't remember the exact threshold, but it's something like you have to have 50,000 customers in California.

Lucia: It's a certain amount of customers or a certain amount of revenue.

Halle: There is a lot of fear that sensitive medical data could be used to identify pregnant people seeking termination. I’ve even read that even though law enforcement agencies may be prohibited from accessing company-held data without a warrant, they can just buy that data. Should women be worried?

Lucia: I think it's all women, not just young women. And yes, I think people should be super cautious. It's been good to see some of the period tracking apps try to update their privacy practices, and I definitely would endorse strengthening those practices by adopting policies and securing the data more rigorously against hackers.

But I also think this may be one of those things where a period tracking app might be really beneficial. But, and I said this at the time, I don't know, I'm old school: I kept track of my periods in a notebook. Pretty hard to hack a notebook. So there's that, right? There's how you track the information that you need, and the convenience of the app versus not using the app.

And I know people have left the apps.

Halle: Yeah.

Lucia: Yeah. And I think the hackers piece is a really important thing, because if you are, you know, a period tracking company and you have been selling the data, now it's out there, from a historical point of view. And maybe you've changed your business practices and you've moved to a subscription model instead of a monetize-the-data model.

But now you have to protect it from prying eyes, right? And the prying eyes could be, you know, international mafia, people trying to steal identities, and they could be hackers who are on a mission here in the US to find people who, um, should still be pregnant.

Lucia: So again, you know, the consumer should really be thinking about how did they find out about this app-based health thing?

Omada's apps are a hundred percent available in the app store, but you can't really get the program until you enroll, and we get paid by your insurance company. And that's a really key thing. Like, is this something that your doctor's office or your insurance company is paying for? That's definitely an important piece of information about what rules apply.

But as we saw when we talked at the very beginning, even hospitals can make mistakes about how they connect stuff up. And so I don't have a magic bullet for how you protect yourself. I think you protect yourself by being careful, and by thinking about what's important to you as you manage your digital life and your health information, or that of your family. For example, I know young people with children who don't use social media for their children at all.

They don't want their child to arrive at adulthood and have an 18-year history of pictures.

Halle: Yeah. Well, that's my family. My husband is a data scientist and convinced me that we shouldn’t post pictures of our child online for that exact reason. His argument is that our child has not consented to that.

But others are out there sharing intimate health details about an illness their child has, or the fact that their child was born via sperm or egg donation.

And you know, that is health information that is now out there. Even if that child grows up to become an adult who wants that kept private, you can’t ever really remove it once it’s out there.

Lucia: That's right. But you know, the thing is that there are some great things this technology can do. I mean, the advent of the smartphone basically changed the way we deliver healthcare, right? So you have programs like Omada, and we can help you track your blood pressure and your diabetes and keep you on a path to health.

And there are similar coaching-based programs for complex GI conditions and for asthma. I saw this very cool demo on asthma recently; we could talk about that offline, Halle. And the digital signaling we can collect from our bodies, delivered directly to our healthcare providers for input, is really a phenomenal thing that we never used to have.

You know, it used to be that if you wanted your glucose checked, you would go in, they'd stick a needle in your arm, you'd get your results a week later, and the week after that you'd have an appointment in your doctor's office, maybe, if it was even that prompt. Two weeks later, do you know what caused your sugar spike?

Probably not. So there are some really amazing things out there that this technology can do, and I think what we have to do as an industry is really fortify the clinically valid stuff and make sure it's being delivered in an appropriately private way, so we don't undermine trust in it, and then let the bad actors fall to the wayside.

Halle: Yeah, so what I'm hearing from you is that you're bullish on the use of technology and social media to help people manage their health and live a better, healthier life, and we really need these companies to step up and be super responsible so that people can trust using them and actually get the benefits without worrying that this data will be used against them.

Lucia: Absolutely. We need that. And you know, if the companies can't help themselves, then they need to lobby Congress harder for a law.

Halle: Yeah. Amazing. Well, Lucia, thank you so much for all of your insights today. We appreciate you.

Lucia: Well, thank you for having me. It was a delightful conversation.
