The Algorithms Fueling a Mental Health Crisis

 

“That insult you receive in the hallway of your school is ephemeral. But online it's permanent. And so now you can always look back on, oh, that's when my friends made fun of me. That's when that cruel statement happened. Which ends up having quite serious impacts.”

In this episode, we delve deep into the shadowy corridors of one of the world's most influential social media platforms with our guest, Frances Haugen. As a former Facebook product manager and internationally recognized whistleblower, Frances offers invaluable insights into how tech giants prioritize profits over public safety, and the profound implications this has for the mental health of young people.

We explore her brave journey, which took her from the inner workings of Meta to testifying before the world's parliaments, sparking a global conversation about the role and regulation of social media. Listen in as we shed light on the pressing issues of our digital era, and the steps we can take to safeguard vulnerable populations from social media's hidden impacts.

Topics covered:

  • How the design and algorithmic choices of social media platforms contribute to the youth mental health crisis

  • Her experience becoming a whistleblower, speaking out, and releasing “The Facebook Files”

  • The role of phone manufacturers and regulators

  • How parents and educators can help protect children and teens from the potential mental health harms of social media

About Frances Haugen

Frances Haugen is an advocate for accountability & transparency in social media. Born in Iowa City, Iowa, Frances is the daughter of two professors and grew up attending the Iowa caucuses with her parents, instilling a strong sense of pride in democracy and responsibility for civic participation.

Frances holds a degree in Electrical and Computer Engineering from Olin College and an MBA from Harvard University. She is a specialist in algorithmic product management, having worked on ranking algorithms at Google, Pinterest, Yelp, and Facebook. In 2019, she was recruited to Facebook to be the lead Product Manager on the Civic Misinformation team, which dealt with issues related to democracy and misinformation; she later also worked on counter-espionage.

During her time at Facebook, Frances became increasingly alarmed by the choices the company was making, prioritizing its own profits over public safety and putting people's lives at risk. As a last resort and at great personal risk, Frances made the courageous decision to blow the whistle on Facebook. The initial reporting was done by the Wall Street Journal in what became known as “The Facebook Files”.

Since going public, Frances has testified in front of the US Congress, UK and EU Parliaments, the French Senate and National Assembly, and has engaged with lawmakers internationally on how to best address the negative externalities of social media platforms.

Frances has filed a series of complaints with the US Federal Government relating to Facebook (now named ‘Meta’) claiming that the company has been misleading the public and investors on how it handles issues such as climate change, misinformation, and hate speech, and the impact of its services on the mental health of children and young adults.

Frances fundamentally believes that the problems we are facing today with social media are solvable, and is dedicated to uniting people around the world to bring about change. We can have social media that brings out the best in humanity.


Transcript:

Halle: Hello Heart of Healthcare listeners. Today I'm excited to introduce you to a friend of mine from business school, Frances Haugen, who has been at the heart of one of the biggest movements in social media and mental health. Frances is a former product manager at Facebook who made international headlines as a whistleblower shedding light on the inner workings of one of the most influential social media platforms of our time.

During her time at Facebook, Frances became increasingly alarmed by the choices the company was making, prioritizing its own profits over public safety. Armed with internal documents, Frances bravely stepped into the limelight to expose practices indicating, among other things, that Meta knew Instagram adversely affected the body image and mental health of teen girls, but buried its findings.

Her disclosures have sparked important global discussions about the role of social media in our lives, its impact on our mental health, and how we can better regulate these platforms to protect vulnerable populations. She has testified in front of the US Congress, the UK and EU Parliaments, and the French Senate and National Assembly, and has engaged with lawmakers internationally on how to best address the negative externalities of social media platforms.

Frances, welcome to the show.

Frances: Happy to be here. Thank you for inviting me.

Halle: To start, why don't we talk about the impact of social media on the mental health of children and young adults. Could you explain what these harms are and how prevalent they might be?

Frances: So if you go by the official policies of the large platforms, all of them say very clearly if you're under 13 years old, you're not allowed to use these products.

But if you go into basically any elementary school in the United States, except for, you know, a small number of quite privileged private schools, you're gonna find elementary students, kids as young as eight, on social media. One of the challenges is that safely using social media requires some pretty complicated what are known as metacognitive processes.

So you need to be able to know that what you're seeing isn't representative of your friends' lives, that people don't post a random sample of moments. They post the moments that make them look their best, which is a distinction that is actually hard even for adults to make, let alone, say, an 11-year-old.

The second issue is that when kids go through puberty, their brains literally begin changing to have more dopamine and oxytocin receptors, receptors for the social feel-good hormones, so that when you get criticized or complimented as a 12-year-old girl, it feels much, much more intense than it would if you or I got a compliment or an insult.

The last big issue, other than, say, sleep quality, which is an issue unto itself, is that if you're a middle school student, an eighth grader or a seventh grader, and you face a conflict, you know, maybe someone's making fun of you. Maybe you did something silly or stupid, and you're getting made fun of.

That insult you receive in the hallway of your school is ephemeral. You know, you can ruminate on it, you can dwell on that criticism, but online it's permanent. And so now you can always look back on, oh, that's when my friends made fun of me, that's when that cruel statement happened, which ends up having quite serious impacts.

Halle: How do you think the design and algorithmic choices of social media platforms contribute to mental health issues?

Frances: So one of the biggest issues, I would say, is that all of the major platforms today use something called engagement-based ranking. That means when they're trying to decide, should I show you this piece of content, they're trying to optimize for your likelihood to click, to like it, to reshare it, to leave a comment.

The content that gets more of those interactions is considered better content. And that can have some very interesting effects. So I'll give you an example that came to me from a journalist. He had just had a new baby, a cute, healthy, happy baby boy. That baby had an Instagram account 'cause he's a modern father.

That account had maybe four or five other healthy, happy baby friends. So remember, all the content being generated by these accounts is healthy, happy babies. The only thing the father had ever clicked on is healthy, happy babies. The only thing he put comments on is healthy, happy babies. And yet 10% of his feed was kids who had just suffered horrible, debilitating accidents.

Kids in hospital beds with tubes coming out of them, dying of cancer, kids with painful-looking deformities. And he was like, how did we get from healthy, happy babies to suffering children? Because think about how cruel that is. You're a new father. What's your worst fear in the world?

And the reality is that these algorithms don't really understand what they're showing you. They just know that if you like babies, you are going to dwell. You know, as you scroll through your feed, you're gonna be unable to just mindlessly scroll past content about suffering children.

And when it comes to things like depression or self-harm or eating disorders, the algorithms don't really understand the difference between content that is encouraging dangerous dieting and healthy recipes. All they know is you're interested in weight loss, and they know this extreme content is more likely to get a click.

And so you end up in these scenarios where, you know, you can be in a slightly blue mood, you can be feeling insecure about your body, and the algorithm will progressively show you more and more extreme content.
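To make "engagement-based ranking" a little more concrete, here is a minimal, purely illustrative sketch of the idea Frances describes: content is ordered by how much interaction a model predicts it will get, not by whether the viewer asked for it. The posts, weights, and scores below are invented for this example and do not represent any platform's actual code.

```python
# Illustrative sketch only: a toy "engagement-based ranking" feed.
# All posts, weights, and predicted scores are made up for the example.

posts = [
    {"topic": "babies",             "predicted_clicks": 0.10, "predicted_reshares": 0.02},
    {"topic": "babies",             "predicted_clicks": 0.12, "predicted_reshares": 0.03},
    {"topic": "suffering_children", "predicted_clicks": 0.35, "predicted_reshares": 0.15},
]

def engagement_score(post, w_click=1.0, w_reshare=2.0):
    """Score a post purely by how much interaction it is expected to get."""
    return w_click * post["predicted_clicks"] + w_reshare * post["predicted_reshares"]

# The feed is simply the posts sorted by that score, highest first.
feed = sorted(posts, key=engagement_score, reverse=True)

for post in feed:
    print(post["topic"], round(engagement_score(post), 2))
# The distressing post wins even though the user only ever engaged with happy babies,
# because the ranker optimizes predicted interaction, not what the user asked for.
```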

Halle: So those algorithms that you're talking about were written by probably very smart people working at these social media companies.

Why don't they just update the algorithm so that they're not so terrible?

Frances: So I just had a new memoir come out called The Power of One that talks about not just my whistleblowing experience, but also my experience working at Google and Pinterest and Yelp, in an attempt to kind of lay out, you know, where did these harmful decisions come from, and why is it so hard for Facebook to fix these on its own?

And I think the main issue is that Facebook is governed in a very specific way. Internal to the company, they want to be quote unquote objective. They want to be, quote, a level playing field. So whether you're someone straight out of college or someone with 10 years of experience, if your idea moves the metrics the most, your idea wins.

But the problem with an ideology like that, a philosophy like that, is that any given set of metrics can only capture as much complexity as those metrics are designed to capture. So there's always gonna be things that are left out. And unfortunately for Facebook, a lot of the things that were left out are things like, does this make people more depressed?

Does this cause eating disorders? None of those were within the goal metrics that were used to evaluate if an algorithm was good or not. And part of what makes it difficult is that in a world where these platforms don't have to be transparent about the outcomes of the algorithms, there's very little incentive to try to address these problems, because, let's be honest, most kids who use social media don't become suicidal.

Most kids who use social media don't get eating disorders. How do you weigh protecting that one or two percent when maybe your internal corporate culture has trouble even believing that there's causality?

Halle: Yeah. You have to look at it from a population health perspective, because maybe suicidality is a small piece of it, but if depression levels across all teens are going up, which we know they are, there's an impact.

Are all of the platforms equally problematic or are some of these algorithms worse than others? 

Frances: Yeah, great question. TikTok is so huge now: TikTok, YouTube, Instagram. So what's interesting is, Facebook's own researchers, when they were looking at teen wellbeing on Instagram, one of the documents from the Instagram researchers laid out pretty clearly that, based on the interviews they had conducted with teenagers and tweens, they found that things like TikTok are about performance, about doing fun things with your friends and family.

Snapchat is about faces and augmented reality. Reddit is at least vaguely about ideas. But Instagram is about comparing lives and comparing bodies. And so it's one of these questions around, when we design a platform, we can have a vision of what we want to encourage in those spaces. And in the case of TikTok, you know, it's a Chinese platform.

They want to be a happy, positive place, 'cause then you don't get revolutions. And so, you know, they just are spaces that encourage different kinds of behavior.

Halle: Fascinating. So in May, the US Surgeon General issued an advisory about the effects of social media use on young users. I have no doubt that your work and advocacy in this space influenced this advisory.

And in the same month, the Biden administration announced the creation of a new task force dedicated to looking at the impact of social media on children. What role do you think governments and regulators should play in mitigating these issues?

Frances: So one of the things that makes Facebook very different than, say, Google is, I know this is gonna sound like a stretch, but it's true: you and I could sit down together for three weeks, and I could teach you enough programming that you could ask basic questions about Google. You know, what's being shown on Google?

What kinds of sites are getting priority? Where on the page are things showing up? To do just that basic level of accountability 101 on Facebook, or on TikTok for that matter, we'd have to recruit 20,000 people and convince 'em to let us install software on their computers to send back what they are seeing.

Because each of us sees different things on these platforms, it becomes much, much harder to make any kind of statement that X is happening on this platform. So the place that I think we're at as a society, or as a civilization, is we need to require transparency, because we have been denied the opportunity to grow, you know, a public muscle of accountability, a public muscle of understanding.

We can ask only very, very basic questions today of these platforms, because we don't have access to what's actually happening. And Facebook is required to report publicly its profit and loss statements, its expenses for how it generated that profit, but it's not required to report the social costs it can take from the societal balance sheet to prop up its economic balance sheet and make it look more attractive.

Halle: Are there other industries that you look to as analogies for measuring the externalities and the impact? What do you look at when you're comparing?

Frances: So I often talk about the automotive industry, you know, like what our journey has been there. People take for granted how safe cars are. Right now the fatality rate for automobiles is about one third of what it was in the sixties, and, you know, people assume over time cars will only get safer.

What's kind of shocking is that in the 1960s, cars actually had their fatality rate go up for about five years before a book called Unsafe at Any Speed came out and caused a huge stir, because the book laid out that executives at automotive companies were afraid to be first movers on safety. They were worried that people would ask, why are you talking about safety and no one else is?

Are your cars more dangerous than their cars? And to give context for your listeners, you know, in 1965, no car on the market came with seat belts by default. It was an extra add-on that would be the equivalent of a couple hundred dollars today, and very, very few people opted for them.

And so the example I give is, we require physical products like cars to make available enough data that we can assess whether their claims are true. And right now there's a law that was passed in the European Union last spring, which I talk about in the book, called the Digital Services Act.

And the Digital Services Act says, hey, the fundamental problem of social media is not eating disorders. It's not depression, it's not extremism, it's not human trafficking. It's a power asymmetry. That's the fundamental problem: these platforms know that they operate in the dark, and as long as they get to see what's going on and we don't, we're never gonna be able to have a real conversation about what should be happening with these systems.

And so they established a really interesting right, which no one else in the world currently has, which is that if a platform knows there's a risk to its system, they have to disclose it. They have to say, this is how we're planning on trying to reduce this risk, and here's enough data that you can know that we are actually making progress.

And that sounds so basic, you know, the idea that you can't lie to me anymore. But we've entered this new era where opaque systems, you know, things that run in a data center, on a chip, are gonna operate more and more of our economy, things like AI. And unless we pass laws like that, we actually don't have a right to the truth.

Halle: Do you think that's the seatbelt, or is that going to be the seatbelt? Will that change things, at least for the platforms in the EU?

Frances: Sure. One of the things I walk through in the book is the idea that when that book came out, we had at least a hundred thousand automotive engineers in the world.

You know, you could get a graduate degree in automotive safety. There were professional organizations that had existed for 60 years. There were lots of people who could say, oh yeah, seat belts are real, this is a thing we've known about for 20 years, seat belts would save lives right now.

Nowhere in the world can you take even a single class on the design of the algorithms that run these systems, or on how you'd wanna design a social network. And so the analogy I often make is that there's a bunch of things like seat belts that I've talked about before. These are things like, you know, should you have to click on a link before you reshare it?

You know, it sounds really basic, but you get like 10 or 15% less misinformation when you have to click on a link first, or things like...

Halle: And Twitter's doing that now. I've gotten these popups. Yeah.

Frances: yeah. Um, or if you prioritize content that you asked for. So that's like, I picked my friends, I picked my pages groups.

If you prioritize what you actually consented to over what an algorithm thinks you want, you get less violence, less hate speech, less nudity. Your friends and family are not the problem. So those are things that are like seat belts. But right now we're so far behind that there's only a few hundred people in the world who even understand the, quote, physics of seat belts.

You know, why would a seatbelt work, theoretically? And so the nonprofit that I founded last year is committed to that goal, right? Like, how do we build out a world where there's a million people who really deeply understand, you know, the physics that would let us invent things like seatbelts?
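As a rough illustration of the two "seat belt" interventions mentioned here, click-before-reshare friction and prioritizing content the user explicitly chose, here is a small sketch. The data model and function names are hypothetical, not any platform's real API.

```python
# Illustrative sketch only: two "seat belt" style frictions described above.
# The data model and function names are invented for the example.

def can_reshare(user_opened_link: bool) -> bool:
    """Click-to-reshare friction: only allow resharing once the link was opened."""
    return user_opened_link

def rank_feed(posts, followed_sources):
    """Prefer content the user explicitly asked for (friends, pages, groups they chose)
    over content an algorithm merely predicts they will engage with."""
    consented = [p for p in posts if p["source"] in followed_sources]
    suggested = [p for p in posts if p["source"] not in followed_sources]
    return consented + suggested  # consented content always ranks first

posts = [
    {"source": "viral_page",  "title": "outrage bait"},
    {"source": "best_friend", "title": "weekend photos"},
]
print(can_reshare(user_opened_link=False))              # False: must open the link first
print(rank_feed(posts, followed_sources={"best_friend"}))  # friend's post comes first
```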

Halle: And is that Beyond the Screen? That's the nonprofit?

Frances: Yeah.

Halle: Beyond the Screen. Do you wanna tell us about it? Tell us about what you guys are doing.

Frances: Oh, thank you. So one of our projects is called Standard of Care, and it's based on an idea that came out of a conversation we had with a sovereign wealth fund, of all places.

So for context for your listeners, some nations have basically a giant bank account that is the wealth of the country. Often these are oil countries or countries with a lot of natural resources, and they invest that wealth around the world. And we were talking to the ethics board of this sovereign wealth fund, and they were like, we would love to sanction Facebook.

The only problem is we can only act on established ethical norms. So, like, we've had an ethical norm for 200 years that slavery is wrong. So if you use forced labor in your factories, we can sanction you. But when it comes to social media, there are no norms for what is good behavior.

And so we're doing a project that's devoted to documenting harms, creating kind of a central place where you can go and read about what's known about a range of harms, starting with harms to children and families. You can learn about what the opportunities are for preventing or mitigating those harms.

We call those levers. So, for example, a common lever across a lot of harms to kids is, let's keep under-13-year-olds off these platforms. And then the last part of that project, which I think is maybe the most important part, is that right now the people who understand the social problems of these platforms often don't understand what's possible with technology.

And what happens then is they latch onto a single strategy for pulling a lever. So, for example, in the case of kids again, they'll say, we wanna keep under-13-year-olds off these platforms, so we're gonna check IDs; every social media account will have to have a driver's license affiliated with it.

Which has huge civil liberties issues and just doesn't work unless you have something like China's Great Firewall. But if you'd gone to a technologist and said, I need to keep under-13-year-olds off this platform, they'd say, oh, here's 10 or 15 different ways to do that, everything from kids telling you "I'm an elementary school student" in their bio, to kids reporting each other's accounts to punish each other.

You know, like, you made me angry at recess, so I report your Instagram account. And once you find, you know, a thousand, ten thousand kids, you can find all the other kids; there's no reason why children need to be on these platforms. And so we're hoping that by making kind of a central menu, reasonable people can sit around a table and say, you know, we're not saying you have to do every single last possible thing, but there should be some floor where we say, hey, this is what it means to be responsible.

And then we have a sister project to that project. At least the children and families version of Standard of Care we're hoping to launch this September, maybe October. The sister project is meant to help guide investors, people bringing lawsuits, and regulators.

What information do companies need to make public such that we could track the magnitude of these harms and how hard companies are fighting to reduce those harms? We call that project Minimum Viable Queries. And then our last project, so we have three, is a simulated social network, so that we can do things like college classes or even, you know, high school clubs, imagine Model UN but for building social networks, because we want to give more people a hands-on intuition for what happens when you build a social network.

Halle: We will be right back after the break.

How else can parents and educators help protect children and teens from the potential mental health harms of social media?

Frances: You know, I think a great place to start is phones need to charge in the parents' bedroom at night. Like, I think the single lowest-hanging fruit when it comes to mental health and kids is sleep.

And, you know, it sounds really basic, but it's so easy when you're stressed to just sit there and doomscroll; I do it. So that's an easy solution. And then I think the secondary thing is, you know, sit down and come up with a family technology plan, a family media plan.

Say, like, our lives are so short and so precious, how do we wanna spend that time? And when you make it, you know, it's for the whole family, it's not just for you; parents need to play by the same rules.

Halle: Yeah. Where do the phone makers come in? Where does Apple come in at helping with this problem?

Frances: Oh, I love that question. So one of the things I've suggested, just to give you a sense of how easily some of these problems could be addressed, is this: we've known for 20 years that if you make things a little bit slower, you know, it takes a little bit longer to search on Google, it takes a little bit longer to scroll,

people use the products less. And they figured that out by, one, looking at experiments, but two, you can artificially add in slowness and see how much less people use it. And I imagine a world where, instead of what happens right now, if a platform has anything along these lines, usually their attempt at helping people go to bed will be: you pick a bedtime.

At that bedtime, a little thing pops up and says, hey, do you wanna go to bed? I don't know about you, but when I see that, I hit snooze, dismiss. It's not meant to work. But imagine a world instead where, you know, let's say you have a 16-year-old and they were up till two o'clock the night before, and now they're sitting in math class all grumpy and hungover, on Instagram.

And a little thing pops up on Facebook or Instagram saying, hey, I noticed you were up really late last night. When do you wanna go to bed tonight? And they say, 11; my mom wants me to go to bed at 10, but I wanna go to bed at 11. And imagine for two or three hours before 11, Instagram got a little bit slower, and a little bit slower, and a little bit slower.

You didn't even notice it was getting slower, and around your bedtime, you got bored and you just went to bed. What's crazy about that feature is, if you are stealing content from Instagram, that feature is already live. Facebook knows that if you are downloading and copying content off of Instagram, and they take away your account, you'll just get a new account.

And so instead they slow the app down.

Halle: They already do this for some people, in some situations.

Frances: So if you're, if you're, if you're, if you're a thief, if you're stealing from, from Instagram, this is already live. Yeah. So like part of what frustrates me so much is, you know, if, if Instagram really cared about helping kids sleep at night, you know, they could launch this in two weeks.
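Here is a minimal sketch of the progressive-slowdown idea described above: artificial latency that ramps up as a chosen bedtime approaches. The bedtime, ramp window, and delay values are invented for illustration; this is not how any platform actually implements it.

```python
# Illustrative sketch only: a "progressive slowdown before bedtime" ramp.
# The bedtime, window, and delay values are invented; no real platform behavior is shown.
from datetime import datetime, timedelta

def extra_delay_seconds(now: datetime, bedtime: datetime,
                        ramp_hours: float = 3.0, max_delay: float = 2.0) -> float:
    """Return how much artificial latency to add to each feed request.

    Zero until `ramp_hours` before bedtime, then ramping linearly up to
    `max_delay` seconds as bedtime approaches (and staying there after)."""
    ramp_start = bedtime - timedelta(hours=ramp_hours)
    if now <= ramp_start:
        return 0.0
    progress = (now - ramp_start) / (bedtime - ramp_start)
    return max_delay * min(progress, 1.0)

bedtime = datetime(2023, 9, 1, 23, 0)  # the teen in the example chose 11 pm
for hour in (19, 21, 22, 23):
    t = datetime(2023, 9, 1, hour, 0)
    print(t.strftime("%H:%M"), round(extra_delay_seconds(t, bedtime), 2), "seconds of delay")
# 19:00 -> 0.0, 21:00 -> ~0.67, 22:00 -> ~1.33, 23:00 -> 2.0
```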

Halle: Interesting. So in June, they actually announced new safety features aimed at protecting teens who use Instagram. What do you make of these new features? Are they a step in the right direction, or definitely not enough?

Frances: So all the features are optional. I think that's the first thing.

And that's problematic, because lots of kids have great parents, but not all kids have great parents. You know, we're expecting parents to know that these products could be dangerous. Unless you're really following along, it's surprising how many parents aren't aware

of, you know, what the emerging research is on these topics. But the second thing is, when we have uneven enforcement or uneven protections for kids, it makes it harder for parents to do what might be in the best interest of their kids. You know, if you get cut off at night and your friends don't, you have to leave them behind while they're still having a conversation.

You know, you're gonna whine at your parents. If you have the safer settings on your account, are your friends gonna make fun of you for having a little-kid account? It's one of these things. It's interesting: one of my godkids is very small, he's like three, and he's already learned that he likes adult YouTube more than kids' YouTube.

And so it becomes one of these things where we need to talk about making sure that there is systematic safety for children. And given that the consequences are so high, i.e., depression rates and suicide rates for kids are hockey-sticking, we need to have some very serious conversations.

Halle: Yeah. My son, who's five now, loves his iPad. We tried to put, you know, all educational things on there, but of course we have YouTube Kids on there, and I don't know how, but he also knows that there's another YouTube, 'cause I saw him going into Safari and opening up adult YouTube and watching these Pokemon videos on adult YouTube. And I frantically searched for parental controls on iPad, and it's actually not that intuitive.

Frances: You know, you asked a question earlier about platforms, like where do the platforms come in? I think there's a real conversation to be had on, you know, what are the rights of children online?

Because Apple could be going in there and saying, hey, you know, here's a really easy option: we'll basically set up a firewall for you on this iPad where your kid can only access sites that you chose, and here's a basket of sites to make it even easier for you. That's a trivial thing that could happen tomorrow.
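As a toy illustration of the parent-chosen "firewall" Frances describes, a device could check every request against an allowlist. The sites below are hypothetical placeholders a parent might pick, not a real product feature.

```python
# Illustrative sketch only: a per-device allowlist like the one described above.
# The sites listed are placeholder choices a parent might make.
ALLOWED_SITES = {"pbskids.org", "khanacademy.org", "youtubekids.com"}

def may_load(hostname: str) -> bool:
    """Block every site that the parent did not explicitly choose."""
    return hostname.lower() in ALLOWED_SITES

print(may_load("khanacademy.org"))  # True: on the parent-chosen list
print(may_load("youtube.com"))      # False: not on the list, so it is blocked
```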

Halle: Yeah. What can the public do to hold social media companies accountable? You know, is it supporting your nonprofit? Are there other ways that we can not just protect our own children, but protect society?

Frances: Well, we're never gonna turn down donations, so, you know, yes. Feel free to write us at francis@francishaugen.com or Francis@beyondthescreen.org.

But I think one of the most important things is to call your elected representatives. And that can be at the state level, it can be at the national level. If you engage in any way with politics, ask the people you're engaging with, hey, what's your plan on social media and kids?

Because right now, like, the reason why I wrote my book was, you know, transparency doesn't sound sexy. People are like, well, I wanna solve the problem now, I don't wanna just give nerds data. But I'll give you an example of how rapidly data can transform these products. So let's imagine a world where Instagram had to publish,

or TikTok for that matter, 'cause I think there's more compulsive use on TikTok, how many kids are online at 10, 11, midnight, 1, 2, 3 am. You know, if they had to publish that every week, people would very rapidly see, does Instagram have a worse compulsive use problem, or does TikTok? Which platform is taking these issues more seriously?

And that means advertisers can choose how to allocate their dollars, parents can choose what to watch out for, you can have advertiser boycotts, divestment campaigns, there's lots of things. So talk to your elected representatives and say, we need laws like the Digital Services Act. We all deserve to be safe, not just people who know how to hack their phones the right way.
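As a sketch of the kind of metric being described, a weekly published count of under-18 accounts active late at night could look something like this. The event log, field names, and hour thresholds are invented for illustration; no real reporting API is shown.

```python
# Illustrative sketch only: counting how many under-18 accounts were active
# late at night in a given week, the sort of figure a platform could publish.
from collections import Counter

events = [  # (user_id, age, hour_of_day) for one week of activity, made up for the example
    ("a1", 14, 23), ("a1", 14, 1), ("b2", 16, 22), ("c3", 35, 2), ("d4", 12, 0),
]

LATE_HOURS = {22, 23, 0, 1, 2, 3}

def late_night_minors(events):
    """Count distinct under-18 users seen during late-night hours, plus a per-hour breakdown."""
    users = {uid for uid, age, hour in events if age < 18 and hour in LATE_HOURS}
    by_hour = Counter(hour for uid, age, hour in events if age < 18 and hour in LATE_HOURS)
    return len(users), by_hour

total, by_hour = late_night_minors(events)
print(total, dict(by_hour))  # a weekly number the public could compare across platforms
```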

Halle: And is this a bipartisan issue? Are we all on the same side of this, or are you seeing different points of view on each side of the aisle?

Frances: I would say it should be a bipartisan issue, transparency at the very least. I would say the challenge is that it's not seen as a, quote, big enough issue.

And the reality is that we need to be able to begin building our public muscle of accountability. You know, we need to be able to teach college classes or give out graduate degrees. And this also applies to large language models, to generative AI, which is something we're looking at more and more.

As long as the platforms are allowed to keep the curtain drawn, as long as you're allowed to operate in the dark, you will hide the consequences from the public. And that should be a bipartisan issue.

Halle: So how can people follow your work? 

Frances: So The Power of One was actually NPR's Book of the Day last month, so it's not just me that enjoyed it. The other thing I'd say is, you know, follow me on Instagram or Twitter, ironically.

We also have a mailing list, so if you go to beyondthescreen.org, you can sign up. We will be engaging more with the public over the next year, 'cause we are still in the startup phase, and our team is growing quite quickly, so expect to see more from us in the future.

Halle: Fantastic. Frances, thank you for your hard work in this space, and thank you for joining us.

Frances: My pleasure.
