Big Leaky Tech with The Markup | Crooked Media
July 19, 2022
America Dissected
Big Leaky Tech with The Markup

In This Episode

The fall of Roe has opened up the risk that authorities could use data from period tracking apps or internet searches in legal proceedings over abortion ban violations. But Big Tech may already be tracking a lot more about your health than you know. Todd Feathers and Simon Fondrie-Teitler of The Markup join Abdul to share their reporting.





[sponsor note]


Dr. Abdul El-Sayed, narrating: Infections, hospitalizations, and deaths are rising as BA.5 spreads across the country. A new CDC study found that rates of infection with antibiotic-resistant bacteria have skyrocketed during the pandemic. And last Saturday, 988 became the new nationwide mental health crisis hotline number. This is America Dissected. I’m your host, Dr. Abdul El-Sayed. If you’re like me, the first thing that you do every morning is look at your phone. Look, I know it’s like the worst thing you can do, but it’s 2022, and I’m holding it anyway. After all, my phone is my alarm clock. It’s also my weather report, my newspaper, the Quran I read every morning, and my meditation app. And when it comes to my health, my phone is my personal trainer, my activity tracker, and my calorie counter. I use it to book medical appointments and to check any labs when they come back. We use one machine to do nearly everything, and that’s by design by all of the folks making money off of that machine. But when that becomes the case, when one machine does almost everything for you, that machine gets really personal. You start to think that it’s just you and your phone, but that phone is designed to make you forget that every single one of those apps–every single one–is constantly sending information back to a stream of servers. My meditation app, my weather app, my calorie counter, each collects, stores, and sells my data. And they do it to you, too. In fact, you don’t even have to be using an app for it to collect, store, and exchange your information. After all, all those services may feel free, but no, they’re not free. The ones that I don’t pay for, like Facebook or Google, they’re still making money off me. They’re just doing it by either selling my data or using it to push advertising at me for things I probably don’t need.
Think about this: how often have you had a conversation with someone about a very specific product, only to see advertisements for that exact same product pushed to you across a whole bunch of different services? It’s not that your phone is listening, it’s that apps are collecting so much information on you that they can identify what it is you may be interested in in the first place, and therefore what you’re probably talking about. It’s as uncanny as it is terrifying. But what happens when it’s not just selling you underwear or headphones? What happens when the intimate details you punch into your phone have to do with your health? What happens when the company trying to monetize you through your phone collects that information, too? And what happens when that information can be used against you, a loved one, or a provider in a court of law? Because that’s the nightmare scenario millions of people are currently facing since the fall of Roe. When I interviewed law professor and co-host of Crooked Media’s Strict Scrutiny, Kate Shaw, a few weeks back, I asked her specifically about whether or not people should consider deleting period tracking apps, or be careful about what they plug into search engines, for fear that the information could be subpoenaed and used in a prosecution. Here’s what she had to say:


[Prof. Kate Shaw] I think that these kind of warnings that we have seen circulating on social media in the last few days that people should delete those apps are probably right, actually. So I think that even if we’re not, you know, actually worried about some kind of panopticon-style surveillance state actually beginning immediately in the wake of the fall of Roe to track the kind of fertility activities of all, you know, women and childbearing-capable people in the United States, I still think that having this kind of, you know, personal medical information in the hands of a third party, and also stored, you know, on your own device, potentially vulnerable to, say, a subpoena request from a D.A. who is investigating an allegation of an abortion in violation of state law–I would be concerned about those records being available to potential subpoenas. And so I think that those warnings are probably well-founded.


Dr. Abdul El-Sayed, narrating: That is bone chilling, but all of this should remind us that the, quote, “private information” we share into our phones is actually being shared all the time, sometimes in ways it’s explicitly not supposed to be. Last month, Todd Feathers and Simon Fondrie-Teitler published a series of articles exposing the way that Facebook has been collecting and storing medical information just like that. The company was inadvertently tracking and importing protected health information off of hospital websites. The story opened up new questions: how much data is too much for Facebook to own? What role does HIPAA have in a world where websites are automatically sharing information? And so much more. And while all of that is bad enough on its own, it also raised a number of troubling implications about our new post-Roe hellscape. I invited Todd and Simon to share their story and its implications with us today. Here’s our conversation.


Dr. Abdul El-Sayed: All right. Can you introduce yourselves for the tape?


Todd Feathers: I’m Todd Feathers. I’m an enterprise reporter with The Markup.


Simon Fondrie-Teitler: And I’m Simon Fondrie-Teitler. I’m the infrastructure engineer at The Markup.


Dr. Abdul El-Sayed: Awesome. I really appreciate you all joining us today. This is really one of those topics that sits at the cusp of a whole set of anxieties–particularly right now–around the role of the Internet in our lives, some of the things that we take for granted about how the back end is set up, the way that the Supreme Court has politicized health care with its recent decision to end the right to choose, and the ways that sometimes what we do online can be weaponized against us. So I want to start from the top just with a question about the law that everyone thinks of when we think about health care data privacy, which is HIPAA. What is HIPAA?


Todd Feathers: Yeah, so HIPAA stands for the Health Insurance Portability and Accountability Act. And a lot of people, including myself until pretty recently, think of HIPAA mainly as a privacy law that protects our medical privacy and information. But its original intention, back when it was signed and enacted in 1996, was really to govern and make possible the transfer of medical information between insurers and providers and entities like that. The privacy rules didn’t go into effect until 2003, a number of years after the law was actually enacted. And Facebook itself wasn’t founded until 2004, so that kind of gives you a sense of how old the framework of this law is. Even though it’s been updated, it certainly wasn’t built for a world with Facebooks, with that kind of business model and technology.


Simon Fondrie-Teitler: And there are three main prongs to HIPAA. The first is the privacy aspect: controlling what a health system, doctor, or health insurance company can do with the information they receive, and who they can share it with. The second, as Todd was talking about, allows for transferring data between health systems, so patients can get their data out of one health system in order to move it to a different health system or doctor. And the third prong is around what marketing a covered entity under HIPAA is allowed to do. They can’t necessarily just take your data and send you ads for whatever the hell they want to. It limits, to some extent, the scope of what they can market.


Dr. Abdul El-Sayed: You know, it’s interesting because HIPAA in popular culture is sort of seen as the be-all end-all shield around health data privacy, to the point where you had a couple of not-so-impressively-bright Republican congresspeople talking about how HIPAA prevents them from publicly sharing their own vaccination status–which is decidedly not true. But what your reporting demonstrates is that even across those three prongs that you talked about, Simon, HIPAA’s got some really big holes in it. And a lot of this tends to stem from the fact that it’s a pretty old law, and a lot about the Internet has changed since it went into effect. Can you walk through some of the ways that HIPAA might not have foreseen data movement, and how that may be opening up some of the holes you reported on?


Simon Fondrie-Teitler: I don’t know whether that’s true, and I’m not sure that the issue here is that HIPAA isn’t ready for this sort of thing–that it hasn’t been updated in long enough and doesn’t apply to the Internet. There are tons of digital health care facets that it does apply to and does regulate, and HHS has announced they are going after companies for violating it. So I don’t actually know what the issue is. I don’t know if it’s a clarity issue. I don’t know if it’s an enforcement issue. I don’t know if this stuff isn’t covered. And I don’t necessarily know what the state of this privacy stuff was before the Internet. So I guess, from my perspective, it’s a little bit hard to say exactly why this is happening.


Todd Feathers: Yeah, if I could just add on to that–I totally agree with Simon. The experts that we talked to, including some former regulators with the office within Health and Human Services that enforces HIPAA, have told us that the kind of behavior that we documented–the collection of this data by Facebook from hospital websites–is most likely a HIPAA violation. I think where some of the bottleneck is, according to these people, is the ability to investigate and enforce these kinds of potential violations. The office within HHS that does this is not overflowing with resources and staff, from what we were told. And these kinds of investigations are hard and time-consuming. It took a lot of work for us to do this, and we weren’t processing hundreds of other complaints of various kinds at the same time.


Dr. Abdul El-Sayed: So how is this exactly happening? What exactly is going on here?


Simon Fondrie-Teitler: So the root of what’s happening here–there are two things that we looked at. One of them is hospital systems that have added this script called the Meta Pixel to their websites. When you browse the website, it pulls in some code from Facebook and runs that in your browser. This code is used for a number of different things. The main use seems to be tracking how well the ads that you’ve placed on Facebook are doing. So if you’re advertising to people, you want to know: do they click on my ad? And once they’ve clicked on my ad, what do they do on my website? It also helps with attribution–linking the people that click on an ad to the people on the website–and with answering, more generally, what the people browsing the website are doing. So the script will collect data on page views and other sorts of metrics, send it back to Facebook, and then Facebook provides a window into this for the site owner. Hospitals have been adding this presumably because they advertise on Facebook–that’s a thing that a lot of hospitals do–and so they added the script. But there are things that happen on hospital websites that are somewhat sensitive. One of them, we realized, was booking appointments, right? You search for a doctor on the hospital website, there’s a button somewhere that says “book appointment,” you click on that, enter some information, and now you have an appointment with the doctor at some time and some date.
What the script was doing on these sorts of sites was this: when the user clicked that button to schedule an appointment, the Facebook script was detecting the button click and sending an event back to Facebook saying, hey, this user at this IP address–in some cases with cookies identifying what account was logged into Facebook–clicked on this button on this doctor’s website to schedule an appointment. I should say, we checked the top 100 hospitals on Newsweek’s list, and we found 33 of them doing that. The other thing that we found: there’s a sort of patient electronic health record system that allows patients at these hospitals to log in and see things related to their care. There are a few big names in this. The biggest is Epic, with a product called MyChart that many, many large hospitals are running. And we found that seven hospital systems had added this same Meta Pixel script inside their EHR, and it was sending things that patients were clicking on inside the EHR to Facebook. So we were seeing things like doctors’ appointment times, but also medications, things like that. Whatever was in this portal that the patient could click on, and that Facebook thought was sort of like a button, it would send the text back over to Facebook.
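To make the mechanism concrete, here is a schematic sketch–not Meta’s actual code, and with hypothetical field names–of the kind of event a third-party tracking script can assemble from what the browser already knows when a visitor clicks a button on a hospital site:

```python
# Illustrative sketch of the kind of event a third-party tracking
# script can assemble on a button click. The field names here are
# hypothetical, not the Meta Pixel's real schema.

def build_click_event(button_text, page_url, ip_address, fb_cookie=None):
    """Bundle what the browser already exposes into a trackable event."""
    event = {
        "event_type": "button_click",
        "button_text": button_text,   # e.g. "Schedule Appointment"
        "page_url": page_url,         # reveals which doctor's page it was
        "ip_address": ip_address,     # ties the event to a device/household
    }
    if fb_cookie:
        # If the visitor is logged into Facebook, a cookie can link the
        # click directly to their account.
        event["fb_user_cookie"] = fb_cookie
    return event

event = build_click_event(
    "Schedule Appointment",
    "https://hospital.example/doctors/cardiology/dr-smith",
    "203.0.113.7",
    fb_cookie="c_user=12345",
)
print(event["button_text"], "on", event["page_url"])
```

The point of the sketch is that none of this requires the script to “understand” health data: the button text, the page URL, and the cookies are enough to reconstruct that a particular person booked a particular kind of appointment.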


Todd Feathers: Yeah, I think that was the perfect explanation. As kind of a non-technical person, the way that I had to conceptualize it was that the Meta Pixel creates basically a receipt of your activity on a website. And so when it’s a hospital website and you’re booking a doctor’s appointment, it gathers the information of what you fill into various forms and click on. It doesn’t know what that is, but it collects it anyway and sends it to Facebook. And when you’re inside your MyChart, or whatever the other EHR portal is, there’s a lot more sensitive information there. And as you navigate through it, you’re potentially–not that the fault is on you–but you’re potentially exposing that information, because this pixel is there collecting everything you click on.


Dr. Abdul El-Sayed: So how did you discover that this was happening?


Simon Fondrie-Teitler: So we have a partnership right now with Mozilla–they’re the nonprofit that makes the Firefox browser–and they have a project called Rally. Rally allows regular users of Firefox to install an add-on in their browser. It’s Firefox right now, plus I think we were the only study on Chrome, but there are other research institutions participating. The user can install this plug-in, look through the list of various ongoing studies, and say, yes, I want to join this one. It allows them to donate their browsing data, or information about their browsing data, to Mozilla. And then Mozilla, in a very secure environment, allows the institutions to access the data associated with their study. We actually just finished up data collection for a study where users were opting in to letting us see what data from their browser was being sent to Facebook. It was specifically targeting this script called the Meta Pixel–looking at all the network traffic going to Facebook associated with it, including information about what site it came from. And in looking through that–even just looking through the domains that were there–I saw MyChart.<something>.com. And I was like, well, that’s not good. And then you could see, in that bit of data that was being sent, someone’s name and the time and date of their appointment with a doctor. I think they had clicked on the little message center inside MyChart, and that information had gotten sent. At that point we were like, well, that seems like a problem, and we started looking into this some more. We found a few different places through that. And then, as we were looking for this, we were also looking more generally at what sort of data these hospital websites were sending to Facebook.
And we realized that this was also happening on the scheduling pages. We were using the built-in tools in Firefox and Chrome that allow you to see what information is being sent to various places as you’re browsing, and so we were using that to investigate this.
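Anyone can replicate a check like this with the browser’s developer tools: open the Network panel, browse the site, and look for requests going to Facebook’s endpoints. As a minimal sketch–assuming requests have already been exported in a simplified HAR-like shape with a "url" field–the filtering step amounts to something like:

```python
# Minimal sketch of filtering captured browser requests for traffic
# to Facebook. Assumes `captured_requests` is a list of dicts with a
# "url" key, a simplified version of a HAR export's entries.
from urllib.parse import urlparse

FACEBOOK_HOSTS = {"www.facebook.com", "connect.facebook.net"}

def requests_to_facebook(captured_requests):
    """Return the captured requests whose destination is a Facebook host."""
    hits = []
    for req in captured_requests:
        host = urlparse(req["url"]).hostname
        if host in FACEBOOK_HOSTS:
            hits.append(req)
    return hits

captured = [
    {"url": "https://hospital.example/portal/messages"},
    {"url": "https://www.facebook.com/tr/?ev=SubmitApplication"},
    {"url": "https://connect.facebook.net/en_US/fbevents.js"},
]
for hit in requests_to_facebook(captured):
    print("sent to Facebook:", hit["url"])
```

In a real check you would also inspect each matching request’s query string and POST body, since that is where the page URL, clicked-button text, and identifying cookies travel.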


Todd Feathers: I think a really cool aspect of this Mozilla Rally add-on is that it gives tech reporters like us a kind of unprecedented way to investigate data collection–that’s something we couldn’t really do before. If we had just happened on our own to have the idea to check these hospital websites and perform the manual tests that we did, we could have shown some of what we were able to in the article about the kind of information collected, but we wouldn’t have been able to show that real patient information was being collected from inside these password-protected pages. And for me, as a person who has used MyChart and been to hospitals, that was really compelling.


Dr. Abdul El-Sayed: So did Meta, or Facebook, know that their pixel was doing this and did they take any effort to try and fix it?


Todd Feathers: So Facebook did not respond to the detailed questions we sent for the story, but they did provide a statement and a little bit of information. I guess the brief answer to your question is: yes, Facebook is aware that the Meta Pixel, and other tools like it that track folks on the Internet and through apps, sometimes collect sensitive health information, and Facebook’s been aware of this for several years. The Wall Street Journal did a big investigation a couple of years ago about health information sent to Facebook from apps, which sparked an investigation by the New York State Department of Financial Services. That investigation found that Facebook was not doing a good job of making sure that sensitive health information wasn’t then being put into the same databases it used to target ads. And as part of that New York State investigation, Facebook built a machine learning system that was supposed to filter all incoming information from the Meta Pixel and similar tools and block any sensitive health information from being added to these advertising and user-profiling databases. But Facebook itself admitted to the New York State investigators that, at least as of February 2021, that machine learning filtering system was not completely accurate. It was generating tens of millions of flags every day for sensitive health information that should be blocked, and that was still only a microscopic percentage of the overall interactions Facebook was collecting from health apps–which gives you a sense of the scope of this problem. And you can imagine, even without being a machine learning expert, how hard it would be to filter out and block every kind of information that could potentially be sensitive health information in a particular context.


[ad break]


Dr. Abdul El-Sayed: What’s so scary about this is that, from a user-experience standpoint, you have no sense of it. You’ve been on Facebook, and then you get an email from your MyChart to go log in and look at the results of a lab or something, and you have no sense that, in effect, Facebook is still tracking you into that website and collecting that third-party information, which goes who-knows-where to be seen by who-knows-whom. That’s a pretty scary thing. And at the end of the day, the liability, particularly under HIPAA, actually sits with the hospitals. So I’m wondering what effort hospitals have taken to try and address the situation.


Todd Feathers: Yeah, well, during the reporting process we reached out to every hospital that we had tested–Newsweek’s top hospitals in the U.S., as well as hospitals whose MyChart portals were sending the patient information that we found through these Mozilla Rally users. All told, this ended up being around 110 hospitals and health systems, I think. Of those, about a dozen–a little less than that, actually–had removed the Pixel from their website or from their MyChart portal at the time of publication, after we reached out and shared our evidence. More removed it after the article published. I haven’t gone back and looked at the full list very recently, so I don’t know where it stands now, but quite a few hospitals have removed this. In most cases, the language has been something like: we’ve removed this while we investigate further. Nobody’s saying, oh crap, we shouldn’t have done this at all, this was a huge mistake–and there are various legal reasons why they might not be saying that. But other hospitals and health systems have defended this and said, listen, when you click the button to book a doctor’s appointment on a website, that’s not protected health information. You might not follow through and actually confirm that appointment, or you might be booking it for a friend or a relative, and so it’s not your PHI. I’m not a legal expert, and I won’t argue whether or not that’s correct, but that’s what we heard from hospitals.


Dr. Abdul El-Sayed: Hmm. Sounds a little dubious. I’m trying to think back to my hospital-provided HIPAA training, and I don’t think they would have thought that way, given what they say in the training. It seems to me like a corporate attempt to minimize the impact here. I want to switch tack, because, of course, the implications of these kinds of data transfers have gotten a lot more severe after the fall of Roe, considering that abortion providers can be litigated against and held accountable for actual crimes in states where abortion has been banned, and it’s plausible that the kind of information being collected or tracked on the Internet–through a MyChart, an app, or something else–could be leveraged in those kinds of cases. I was hoping you could reflect on the possibility of law enforcement harvesting this kind of information, and what it means for people who increasingly conduct their health transactions on or via the Internet, via apps, etc. What are the implications of this overall?


Todd Feathers: Yes, I think it’s actually kind of a tricky subject, and a hard line to walk there. I’ve spoken to a couple of reproductive law experts on this, in particular Cynthia Conti-Cook of the Ford Foundation. And what she said is that you don’t want to say there is no threat of that happening–of law enforcement subpoenaing Facebook en masse for some kind of information–because Roe was just overturned, states are passing laws, and we don’t know how prosecutors are going to proceed in the future. So it’s not that there’s no risk of that happening, but that, in fact, there are a lot easier ways for prosecutors to get information about providers who provide abortions and patients who receive them. You can seize somebody’s phone and look through it. In the case of trying to find out whether a doctor provides abortions, you don’t have to subpoena Facebook–you can Google it, you can find other ways to do that. So the most acute risk from this kind of data collection, according to experts we’ve spoken to, is really not mass subpoenas and warrants from law enforcement. It’s perhaps the kind of thing that Grace Oldham and our colleagues at Reveal found when they conducted a very similar investigation to ours–a joint investigation with The Markup–looking at the data that the Meta Pixel collected from people who visited crisis pregnancy centers. These are organizations that don’t provide abortions, and in many, if not most, cases their purpose is to encourage people not to have abortions.
And Grace and Reveal found that a lot of these crisis pregnancy centers had Meta Pixels on their sites sending this information to Facebook, which potentially enables crisis pregnancy centers or other organizations to target advertisements at people who are pregnant and thinking about abortion, or thinking about what to do next, and send them misinformation, or just bombard them with messaging that encourages them to keep their pregnancy. That seems to be the most acute fear that’s been conveyed to us.


Dr. Abdul El-Sayed: As you reflect on your work and what you’ve discovered, how should consumers be thinking about the choices that they make about how they transact their health information on the Internet?


Simon Fondrie-Teitler: It’s such a hard question, right? It’s hard to conceptualize the risks here, and it’s all very technical and hard to understand exactly what’s going on. And so–I mean, it’s sad, but leaving this to consumers–I don’t know if there’s a good answer here. I don’t have any particularly good advice. The answer, I don’t think, is: don’t use the Internet at all for any health stuff, only call your doctor and only go in person–because that’s a lot more work, and there are downsides to that as well. So I’m not sure what the individual actions are here that could be helpful. I think this might only be changed via policy changes.


Todd Feathers: Yeah, I would totally agree with that. I mean, there are things that you as an individual can do to decrease the amount that you’re tracked on the Internet. You can use privacy extensions; I personally use a browser that doesn’t allow cookies and other kinds of tracking. But I would agree with Simon that this is not something that individuals are going to solve by simply doing that. It’s going to require systems and institutions to change their practices.


Dr. Abdul El-Sayed: So, considering that it’s very difficult not to live on the Internet in 2022–and that clicking the link from your email to go check your labs is the obvious next thing to do–I guess it’s worth asking: what kind of pressure should we be putting on our hospitals, and even on Big Tech, to get them to take this seriously? Because it’s frustrating to me: you know that there’s a lot happening behind the scenes in every Internet transaction you engage in, but you trust it anyway. And what your reporting suggests is that it’s actually less trustworthy than you thought it was. As I think about what I want from my Internet experience, who are the institutions that we can really hold accountable? Do we go and talk to our hospitals and say, listen, if you’re using my information to track whether or not I click on your Facebook ad, and it’s leaking my health information, I’m going to make different decisions about where I go? Or is it the kind of thing where we just lump it into the broader set of grievances that we all ought to hold against Facebook, Twitter, and everyone else that has made Internet 2.0 kind of a living hell?


Todd Feathers: One thing that experts and former regulators told us folks can do, if they’re concerned about this or think their medical privacy may have been violated, is file a HIPAA complaint with their hospital. I don’t know exactly how that process works, and I couldn’t promise what the result of that would be, if anything, but that’s one avenue. I think we should also note that right now the U.S. doesn’t have an overarching privacy law like the GDPR in Europe, as some other countries have, and so health care, because of HIPAA, is one of the few areas where there is actually some regulation and enforcement of data privacy. But there are states that are passing their own privacy laws, and almost every day, it seems, there are bills in Congress that seem to have legs in ways that previous iterations of them haven’t. So I think there is movement on this, but the devil is always in the details–especially on stuff like this, where so much of what’s happening is actually invisible. I wouldn’t set high expectations that regulation is going to solve all of this.


Dr. Abdul El-Sayed: No, I mean, I’m going to interpret what you shared as: let’s lump it into the broader set of grievances against Big Tech more generally–one of them being, you’re stealing my personal health information without me even knowing it, alongside everything else. You know, as I think about this, and what attracted me to the story and why I think it’s really critical that folks understand this: when you don’t think through all of the implications of what you’re doing, these kinds of things tend to happen. And the problem with the Internet is that it is somewhat beyond most of our technical comprehension. If you’re not a software engineer, and you don’t actually understand what all those odd things that happen before you hit a website actually are, you kind of assume that you’re just doing what you want to do, without really appreciating everything that could be happening in the background. And when you’re talking about things that you trust to be protected, that implication becomes all the more dangerous. So there’s a world here where all of this should, again, just force us to step back and ask: well, how much of what we do, do we actually want to do via the Internet? I know it’s onerous to pull out your phone and make a phone call instead of pulling out your phone and going to a website, but maybe that’s the better approach if you don’t actually know what’s happening on the back end. The second point, though, is that we need to demand a lot more oversight of what Big Tech companies are doing, particularly when it comes to things like this. And then the last point–which is a beat that we hit all the time on the show–is that in an effort to monetize sick people, our hospitals tend to be willing to play fast and loose with things like our health care data. Right?
The entire point of putting the Meta Pixel on your website is so that you can advertise better on Facebook and get more patients to choose your hospital–even though that now means you’ve opened up a window where Facebook is siphoning off those people’s health information. And so in the effort to win at that corporate health care game, patients end up losing yet again. And this is only one way of many. So I really appreciate you all taking the time to join us to talk through this story and share your reporting with us. That was Todd Feathers and Simon Fondrie-Teitler from The Markup. Thank you guys so much for joining us today.


Todd Feathers: Yeah, thank you so much.


Simon Fondrie-Teitler: Thank you for having us.


Dr. Abdul El-Sayed, narrating: As usual, here’s what I’m watching right now. BA.5 is spreading rapidly across the country: cases jumped 17%, hospitalizations 19%, and deaths 10% over the past two weeks. That jump has been driven by the leaps the virus has taken in both transmissibility and immune evasion. Indeed, BA.5 is up to four times as resistant to mRNA vaccines, and the Mayo Clinic has called it “hyper contagious.” To that end, the administration is working to expand access to a fourth booster shot, currently only authorized for people 50 or older, to all adults. I hate to break it to you, but fall is only about six weeks away. Fall has brought a COVID spike every single year of the pandemic, and officials are trying desperately to get ahead of the current upswing. Remember, three months ago, the dominant variants were BA.2 and BA.2.12.1. Both were quickly overtaken by BA.4 and BA.5, so it’s hard to say what COVID will look like three months from now. For its part, vaccine manufacturer Novavax is testing vaccines that specifically target BA.4 and BA.5, the results of which it plans to release in the late summer or early fall. Remember, the Novavax vaccine is a more traditional vaccine. Rather than using a piece of mRNA, which the body translates into viral proteins, Novavax introduces the proteins themselves so that our immune system can recognize them. The thinking is that this will yield a more robust T-cell response and therefore longer-lasting immunity. And a reminder that the consequences of the pandemic aren’t limited to COVID itself. Rates of antibiotic-resistant bacteria, or superbugs, have skyrocketed; they increased 15% between 2019 and 2020 alone. I want to offer a quick primer here on how antibiotic resistance develops. Imagine you treat a thousand bacteria with an antibiotic, but you kill only 999 of them. By definition, that last one is the most resistant to the antibiotic.
So if you stop treatment before that last one dies, it doesn’t have any other bacteria to compete with, because you just killed the other 999, so it starts replicating unopposed, making a thousand new bacteria that are all resistant to that antibiotic. COVID was a perfect storm. First, you had hospitals filled to the brim with extremely ill patients who required intubation, PICC lines, and catheters, all of which gave bugs a way to infect deeper tissues in the body. Second, early in the pandemic, when doctors didn’t quite know how to treat COVID, patients were treated with broad-spectrum antibiotics they probably didn’t need, given that COVID is a virus, not a bacterium. You also had limited PPE, so hospital personnel were often reusing PPE that could carry these superbugs between patients. But the other side of the problem, of course, is that drug manufacturers just don’t focus on making advanced antibiotics because, well, they’re not very lucrative. After all, if you’re a drug maker, you want to make drugs that people use a lot, and these are made to be conserved. So we’re losing the war against superbugs. It’s key that we steward our antibiotics effectively: don’t use them unless we need them, and then use them fully when we do. And then we have to incentivize drug makers to do the research and development that has to go into making new ones.


Finally, on Saturday, 988 became the country’s new mental health crisis hotline number, replacing the old 11-digit suicide prevention hotline. Experts believe that the new number, because of how easy it is to remember, will increase overall use up to threefold. And that’s really important: in fact, 12% of callers who were experiencing suicidal thoughts report that talking to someone prevented them from acting. Even with the old 11-digit number, wait times were already too long, and with a surge in call volume, there’s a fear that there may not be enough staff to support the added volume, which may leave people on hold in their time of greatest need. It’s a reminder that even with massive increases in mental health funding post-COVID, we are so far from where we need to be.


Dr. Abdul El-Sayed: That’s it for today. On your way out, do me a favor: please rate and review the show. I know it seems like a little thing, but it goes a long way, especially if a lot of you do it. So please, today, make sure you do that. Also, if you love the show, I hope you’ll check out my Substack. It’s called The Incision. You can find me at Abdul El-Sayed dot substack dot com. And don’t forget to check out my YouTube channel at slash Abdul El-Sayed. Also, if you want to rep America Dissected, drop by the Crooked store for some merch. We’ve got our logo mugs and T-shirts. Our dad caps are available on sale, and our Safe and Effective tees are $20 off while supplies last. America Dissected is a product of Crooked Media. Our producer is Austin Fisher. Our associate producer is Tara Terpstra. Veronica Simonetti mixes and masters the show. Production support from Ari Schwartz, Inez Maza, and Ella Price. Our theme song is by Yasuzawa and Alex Sugiura, and our executive producers are Sarah Geismer, Sandy Girard, Michael Martinez, and me, Dr. Abdul El-Sayed, your host. Thanks for listening.