Engelberg Center Live!

COVID-19 Contact Tracing App Privacy

Episode Summary

COVID-19 contact tracing through mobile devices and applications has become prevalent and popular around the globe, but questions remain: How will these apps affect human rights and civil liberties? Are they secure, or even technologically feasible? Will they contribute to or detract from our public health? We have gathered experts in law and privacy, cybersecurity, and epidemiology to address all aspects of these issues.

Episode Notes

COVID-19 Contact Tracing Apps was originally held on May 1, 2020 to discuss the privacy implications of technology-based contact tracing applications. The event was co-hosted by Marc Canellas and Rights Over Tech, the Engelberg Center, the Information Law Institute, and the NYU Center for Cybersecurity. The discussion features:

-- Rachel Levinson-Waldman, Senior Counsel, Liberty and National Security, NYU Brennan Center for Justice (Moderator).

-- Lorna Thorpe, Professor of Epidemiology, Director of the Division of Epidemiology, NYU Langone School of Medicine.

-- Philip Alston, John Norton Pomeroy Professor of Law, NYU School of Law; UN Special Rapporteur on extreme poverty and human rights.

-- Ed Amoroso, Distinguished Research Professor, NYU Tandon School of Engineering; CEO, TAG Cyber LLC.

Episode Transcription

Marc Canellas (00:00:04):

Good afternoon and welcome, everyone, to COVID-19 Contact Tracing Apps, a panel discussion on privacy and public health in the coronavirus age. My name is Marc Canellas, and I am a second-year law student and Vice President of Rights Over Tech, a student organization here at NYU School of Law. We as an organization believe that answers to questions at the intersection of human and civil rights and technology require input from diverse stakeholders. To that end, we could not be happier about the wealth of expertise our moderator and panelists will bring to the discussion today. But before I introduce them, I'd like to thank our supporters here at NYU: the Center for Cybersecurity, the Engelberg Center on Innovation Law and Policy, and the Information Law Institute. And thank you also to my fellow Rights Over Tech-ers Cassie Carly, Lorna Mosher, and Elisa Yup Cardinal, who were absolutely instrumental in putting this all together. So let me start out by introducing our moderator and each of our panelists, and then we'll get right into the conversation. It's my pleasure to introduce Rachel Levinson-Waldman. She is Senior Counsel to the Liberty and National Security Program at the Brennan Center at NYU School of Law. She's an influential voice on issues ranging from law enforcement and government surveillance to ensuring that our national security policies respect constitutional values and the rule of law. She's a prolific author and commentator in everything from law reviews and Brennan Center reports to the Washington Post, Wired, and the Atlantic. Earlier in her career she was a trial attorney in the Civil Rights Division of the Department of Justice. On to our panelists. First, we have Dr. Lorna Thorpe. Lorna is a Professor of Epidemiology at the NYU Grossman School of Medicine and the Director of the Division of Epidemiology.

 

Marc Canellas (00:01:41):

Within the School of Medicine's Department of Public Health, she has served at just about every level of epidemiology and public policy. She began her career on the front lines of public health, battling tuberculosis as an officer in the CDC's Epidemic Intelligence Service. She has since been the Deputy Commissioner of Epidemiology at the New York City Department of Health and Mental Hygiene, and has served on committees for the National Academy of Medicine and as an advisor to the CDC on population health surveillance issues. Next, Ed Amoroso. Ed is a Research Professor at NYU's Tandon School of Engineering. He is currently the Chief Executive Officer of TAG Cyber LLC, a global cybersecurity advisory, training, consulting, and media services company. Ed recently retired from AT&T after 31 years, including a 12-year stint from 2004 to 2016 as their Senior Vice President and Chief Security Officer. He also worked directly with four presidential administrations on national security and cyber policy, and has served on the NSA advisory board.

 

Marc Canellas (00:02:46):

Finally, I will introduce Philip Alston. Philip is the John Norton Pomeroy Professor of Law at the NYU School of Law. He is one of the foremost scholars and practitioners of international human rights law, having held a range of appointments at the UN for over two decades. Most recently he served as the UN Special Rapporteur on extreme poverty and human rights, a position that, I understand, ended yesterday. So congratulations, Philip. He has written on the intersection of human rights and technology as well, notably in a report to the UN General Assembly just this fall about his concerns about the digital welfare state, intrusive government surveillance systems, and private corporate interests. So again, thank you all for being a part of this conversation. Rachel, I'll leave it to you. Could you lead off our conversation with a description of the proposal that has the most traction right now, the Google and Apple API?

 

Rachel Levinson-Waldman (00:03:38):

It's really an API, an application programming interface, meaning the apps would be built on top of it. It would allow phones that are close to each other for a period of time to log that contact with each other by exchanging anonymous identifier keys, going directly from phone to phone, not through a centralized system. If a user who has one of these apps downloaded later tests positive for coronavirus, they would enter a code saying, in effect, I've been diagnosed with coronavirus. Entering that code would upload a certain window, probably 14 days' worth, of those proximity keys that have been getting exchanged to a cloud server, which would then push those keys to other users' phones. And those other phones could check and see: was I in close proximity to one of those codes? If yes,

 

Rachel Levinson-Waldman (00:04:32):

then now I know maybe I need to self-isolate. It wouldn't identify who the person is, but it would say, hey, it looks like you were close enough to somebody for a long enough period of time that you may have been exposed. The idea is that those keys would be randomized and would change intermittently, probably pretty frequently, and they would be generated in pretty significant volume, which would make it maybe not impossible, but close to impossible, to associate a key with a particular phone. There are other proposals out there, including one from MIT, that would make use of more location information: where somebody is, rather than their proximity to another user. And overseas we're certainly seeing a significant number of tools that are already being rolled out, which people are either being asked to download and use, as in Singapore, or required to download and use, as in China.
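
To make the flow just described concrete, here is a minimal sketch of that kind of decentralized proximity-key scheme. It is illustrative only: the key sizes, the 10-minute rotation schedule, and all function names are assumptions made for the sketch, not the actual Apple/Google Exposure Notification specification.

```python
import os
import hmac
import hashlib

# Minimal sketch of a decentralized proximity-key scheme of the kind
# described above. Names and parameters are illustrative assumptions.

def new_daily_key() -> bytes:
    """Each phone generates a fresh random key for the day."""
    return os.urandom(16)

def rolling_id(daily_key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast identifier from the daily key.
    Rotating it each interval keeps broadcasts unlinkable to one phone."""
    return hmac.new(daily_key, interval.to_bytes(4, "big"),
                    hashlib.sha256).digest()[:16]

# Alice's phone beacons rolling IDs over Bluetooth; Bob's phone logs them.
alice_key = new_daily_key()
bobs_log = {rolling_id(alice_key, i) for i in range(144)}  # 10-min slots

# If Alice tests positive, she uploads her daily keys (the "diagnosis
# keys"). Bob's phone downloads them and re-derives IDs locally to check.
def exposed(diagnosis_keys, heard) -> bool:
    return any(rolling_id(k, i) in heard
               for k in diagnosis_keys for i in range(144))

print(exposed([alice_key], bobs_log))        # True: possible exposure
print(exposed([new_daily_key()], bobs_log))  # False: no match
```

The point of this structure is that matching happens on the phone: the server only ever sees the diagnosis keys of people who choose to report, never who was near whom.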

 

Rachel Levinson-Waldman (00:05:26):

So that very briefly sets a little bit of the stage in terms of some of the tools that we're seeing. And with that, I would like to turn first to Lorna, because I think it would be really useful to start out with some historical background on contact tracing. This isn't a new idea; it's a practice with a long and storied history in the public health context. And I would love to ask you to set the stage on what contact tracing means and what it accomplishes, and maybe some initial thoughts on questions that these kinds of apps raise.

 

Lorna Thorpe (00:06:01):

Sure. And thanks, Rachel, for that introduction. Contact tracing, as you say, is not new. Let's start with what it is. It's a process by which a person who has been identified with an infectious disease is approached and their contacts are solicited. The tracing aspect is that the contacts are informed that they have been exposed to somebody who has an infectious disease. So they're informed about their exposure, and information is imparted to them that can help them identify if they're at risk, or take protective measures for themselves. Sometimes, if you've been exposed but are not yet infected, there's medicine you can take, and you can be tested to identify if you have the condition. And broadly, the main purpose is to interrupt the transmission of the infectious disease spreading in a community.

 

Lorna Thorpe (00:07:07):

And it has been a central part of public health practice for many, many years. You can find historical references to it as far back as the 1500s and 1600s; it's been a mainstay. It was a mainstay of the global eradication of smallpox. We had a vaccine, but the vaccine didn't do it by itself. It was vaccinating and contact tracing around infectious cases to ensure that those individuals were identified, tested, and, if negative, vaccinated. Smallpox was a tremendous, longstanding illness that the world faced, and we eradicated it through contact tracing. It's a mainstay of public health with respect to reducing sexually transmitted diseases in the community, tuberculosis, HIV. And it's often used in the context of new outbreaks, such as MERS and SARS.

 

Lorna Thorpe (00:08:10):

When those outbreaks occurred, contact tracing was a primary function to reduce spread to other countries and to reduce spread within countries where cases existed. So it's not new. And in two or three sentences, it really involves the process of communicating with an infected individual about their infection and how long they may have had it, trying to figure out their incubation period, figuring out who they came in contact with during that time period, and really reassuring that individual that any information collected will be held to the utmost standards of privacy. Then it means thinking through carefully which of the contacts they've had might have resulted in a real exposure. And for those real exposures, trained individuals go out and inform the contacts anonymously that they've been exposed, help them gain access to testing, and get them the services that they need.

 

Rachel Levinson-Waldman (00:09:23):

Thank you. That's incredibly helpful. I think that's a really important place to start the discussion. Ed, let me turn to you now. With that introduction to public health contact tracing, can you offer a more detailed technical explanation of the Google-Apple proposal than the one that I gave? And certainly, if you want to share any initial thoughts about it, we can also circle back on that.

 

Ed Amoroso (00:09:49):

Well, certainly, I think a detailed technical description is probably way beyond the scope of our discussion, but I suspect most people are interested in the security and privacy implications here. From my technical, infrastructure level, there's good news and there's bad news. So we'll start with the good news. The good news is that Apple and Google are extraordinarily talented in this area, as is the team from MIT. It's basic cryptography 101, along the lines of what we would teach undergraduates at NYU. These sort of producer-consumer, or advertise-and-scan, protocols are very well known and very well understood. That's why it was so easy for the social media discussion to explode: we already know the weaknesses of these types of protocols. There was nothing really invented here, which is good. You don't want to standardize on something you've invented.

 

Ed Amoroso (00:10:44):

You'd rather standardize on something that we've all been using. So in looking at the protocol, I don't necessarily see anything that would specifically cause any more risk than, say, using email. Look, I could scare the life out of any one of you into never wanting to use email again. That's ridiculous; everything has risk. So the good news is that these are professionally done, and if you trace through the protocol steps, as I have, and look at the frameworks, they're fine. That's good. I give them an A on that. Some people may not; they might have a higher bar. Ross Anderson and some other colleagues of mine have argued that there's a high possibility of false positives. Like, if you're on the other side of a wall, you might get a false positive from Bluetooth. But whatever. Now here's the bad news.

 

Ed Amoroso (00:11:32):

The bad news is at the infrastructure level, meaning all the things around these systems: the registration processes, the presumed help desk that you might use if you have a question, the social media discussion, the potential phishing messages that might come in. These could, if not properly managed, create a spectacular mess. So for example, anyone listening here: imagine you are enthusiastic about this. You decide you're going to opt in. You decide you want to use your mobile device to do this, to beacon out these things called rolling proximity identifiers; that's your device ID, randomized, to Rachel's point. Let's say you decide you want to do that, and now you're in. And then you come home, and in your Gmail there's an email from somebody saying, oh my gosh, I just got a note saying that we were in proximity. Were you on, you know, the C train to Brooklyn yesterday?

 

Ed Amoroso (00:12:33):

You might have been on the C train to Brooklyn, and it may seem pretty plausible. And before you know it, you've got a whole rap going on with some fraudster. We've seen that in places where these have been rolled out. So my concern is not with the cryptography. There are probably a lot of people on the call here who came to this webinar thinking, well, maybe there are some problems with the underlying technology. That's not the issue. The issue is that, if not properly managed, the infrastructure and all the things around this could create spectacular chaos. And it's the thing I worry about the most as I think through the implementation of this, which would be non-trivial in the United States.

 

Rachel Levinson-Waldman (00:13:17):

Two quick follow-up questions on those points. One is: you described the scenario where you get some kind of phishing or fraudster email, you know, oh, I was near you, you may be positive. Can you spin that out a little bit? What do you see happening next that would put somebody at risk? Is it being asked to share sensitive health information, being asked to share account information of some kind? What happens next?

 

Ed Amoroso (00:13:42):

Oh, it could be an infinity of possible things. We've all been exposed to phishing messages, and it could be preposterous. It could be asking for money; just about anything the imagination can conjure would match up with the types of risks that you find in phishing. And everybody listening here is an expert in what phishes look like, because we get 50 of them a day. So you know how that works. I've argued that the way to do this in the US is not to have 20 or 30 different apps from small companies and different providers. Rather, and the egalitarian in me maybe, you know, chokes a little when I say this, I think this works best if there is a centralized, trusted, unbiased authority, generally recognized among our citizens to be trustworthy, to run this thing centrally, where there's one look, one skin, one user interface, one help desk line, one registration process.

 

Ed Amoroso (00:14:48):

It's repeated daily, there's a lot of communication, and anything that's different is obviously fake. It's that kind of thing that we need. And quite frankly, in my opinion, I don't think our federal government right now has the ability to do that. So it would have to be done in concert with some organization or individual. In my own mind, Bill Gates comes to mind. I know he's busy, but someone like that would seem to me to be someone that I, as an American citizen, would trust. And if you laid out a team and said, here's the infrastructure, let's say Apple and Google design it and Microsoft runs it, I would personally be okay with that. I know maybe some people on the call would not be, but I'm just offering my experience: back when I was at AT&T, I was involved in all the early iPhone rollouts, including the first one. So I know what these big events are like. And unless you know that and assume the worst in terms of infrastructure support, you'll have something surprise you. So this requires ten ounces of planning around infrastructure for every one ounce of protocol.

 

Rachel Levinson-Waldman (00:16:01):

Thanks. That's really helpful. I think this is a really good time to turn to Philip. Philip, I'd love to get your thoughts, whether we're talking about one national, unitary plan or these proposals more generally, whether it's proximity tracing or location tracking: what are your thoughts on the human rights implications of these kinds of proposals? And in particular, how do you think they're likely to impact poor populations and other marginalized groups?

 

Philip Alston (00:16:36):

Thanks, Rachel. Just one preliminary point, I guess. I have a strong interest in the broader set of issues that this implicates, but I'm not a specialist, and I've been following obsessively a lot of the analysis, which is now exploding. I must say that from what I've read in the last few days, there is great skepticism within the technical community as to the capacity of any of these apps to really be particularly effective. And I don't think we should take that aspect for granted. I think the problems of over-inclusion, under-inclusion, false alarms, false panic, and a whole range of other things are really very important and need to be addressed. The case actually needs to be made rather than taken for granted, as with so many things in the tech area.

 

Philip Alston (00:17:42):

We tend to be children in a toy shop and say, oh my God, that's terrific, tech will be able to do it and have the solutions. And I think we need to look much more carefully at that. One of the things that I suppose is good in some ways is that there isn't a single initiative. There's the Google-Apple one that we've talked about, but the United Kingdom has rejected that and is developing its own, which it thinks will be in place in just a few weeks. Australia has developed its own, which is already in place and has 3 million users. So there's a very broad array of different programs being used. From a human rights perspective, the traditional way of looking at this is to say, well, there's a right to life, there's a right to health.

 

Philip Alston (00:18:33):

Therefore, anything that protects those must be a good idea. But hang on a minute: there's also a right to privacy. What are the implications? You've then got people coming in and saying, look, any intrusion on privacy would be trivial compared to the grand advantages. Now, that is not at all a given, even though it's often stated. You then do have to look at what the intrusions into privacy amount to. And I think the bigger picture is the one that concerns me most. If you read Shoshana Zuboff and others writing about surveillance capitalism and the way in which the system as a whole has evolved, the way in which a lot of these initiatives are driven essentially by commercial interests, it's very hard to separate this out. So in Australia, for example, where, as I said, it's already in practice, it's not quite legislation yet, but there's a governmental directive, and they've got all these precautions put in.

 

Philip Alston (00:19:37):

It's not centralized, it can't be mandatory, the police can't have access, and so on. But I think what we've seen is that the really dramatic breakthrough is the idea that whenever you go out, Lorna, you're going to take your telephone with you, and messages are going to be sent back about all sorts of things. And you're going to normalize that, and so are your friends. No one's going to say, hang on a minute, Lorna, what's that in your handbag? What are those beeps that are going off? What's being registered, et cetera? Once you've normalized it, under the promises that the police will never get access, this data won't be shared, and so on, you've got the software in place, you've got the mentality in place, and it's a very small step then to the sort of Chinese-type system of constant surveillance using a whole range of different technologies. Not because we want the police to track you; it's because of your health, your social wellbeing, and the community interest. And I think we have to be very wary of the deeper and broader implications of the step that we're being asked to take.

 

Rachel Levinson-Waldman (00:20:48):

Thanks, Philip. That's a really powerful point. And I feel like there are so many different directions that we could go in; having heard initial comments from the three of you, there are so many more questions that arise. I guess, Ed, let me turn back to you briefly, and then I'd like to turn, I think, to Lorna.

 

Rachel Levinson-Waldman (00:21:16):

Two questions that are more on the technical side. One is your thoughts on the question Philip raised of whether this would work in the first place. If it doesn't, that obviously very much plays into determinations or judgments about whether it's even worth going down this path of much more surveillance. Do you have a sense of what percentage of the population would need to download an app like this for it to work? I know there was a study that had come out of Oxford suggesting around 60%; I'm curious what your take is on that. And then also, you had spoken to the cryptographic infrastructure. You'd said, look, the specs look perfectly fine, that's not a concern, and I'm wondering if you see any concerns in that realm. One that's come up is: what if you combine this proximity information, if we're talking about Google-Apple, with other publicly available information, with information, say, from surveillance cameras? How likely is it, do you think, that that could become more of a privacy intrusion?

 

Ed Amoroso (00:22:33):

Well, there was a lot packed into that, three or four questions. I'll try to unravel them. You asked if it works, and that's sort of like asking if your car works. Sure, my car works, but I can still go crash it into a wall somewhere. So looking at just the basics of the protocol, meaning when an advertiser and a scanner are interacting, is the cryptography well designed? Of course it is. It's very nice. The secret generation is done properly. It's cryptography 101. So that piece of it, the engine of the cryptography, was well done. But that's just a little piece of it. And again, there's something profound in the analogy: your car may work, but you can still crash it into a wall. So we do have that problem.

 

Ed Amoroso (00:23:20):

Now, this issue of the 60% you mentioned: I know Lorna is the epidemiologist, but in cybersecurity we have little games that we play as well, where we try to figure out infections between computers, and 60% is an important number. We hear that all the time from epidemiologists, and also in computer security: if you vaccinate 60% of a population whose members bounce around randomly, it has quite an ameliorative effect. So you don't have to stop everybody; I don't have to have everybody download this thing. And I have no idea what the take rate will be. Show me who's running it, show me the marketing program, and tell me whether, like, the Kardashians are doing it, and then I'll tell you whether people are going to download it and use it. It's that kind of nonsense that will determine the take rate and adoption rate.
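
For a rough sense of where a figure like 60% can come from, here is a back-of-the-envelope sketch. The reproduction number and the adoption rates are illustrative assumptions, not figures given on the panel.

```python
# Back-of-the-envelope look at the oft-cited ~60% figure.
# Assumption for illustration: a basic reproduction number R0 of 2.5.

R0 = 2.5

# Classic threshold: the fraction of transmission you must block so that
# the effective reproduction number R0 * (1 - blocked) drops below 1.
threshold = 1 - 1 / R0
print(f"fraction of transmission to block: {threshold:.0%}")  # 60%

# A wrinkle specific to proximity apps: a contact is logged only when
# BOTH phones run the app, so adoption p covers roughly p**2 of contacts.
for p in (0.4, 0.6, 0.8):
    print(f"adoption {p:.0%} -> contacts covered {p * p:.0%}")
```

The second loop hints at why adoption targets for these apps are so demanding: both parties to a contact need the app, so coverage of contacts falls off roughly as the square of adoption.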

 

Ed Amoroso (00:24:08):

Now, this issue of combining proximity with other things, like a camera: it sounds like you've been on social media, because there are all these collisions all over social media, with people dreaming up a bunch of different scenarios, and they're all kind of right. But it's the same thing as saying, don't get in your car, because if you're driving down the road, someone can just drive across the yellow line and kill you. And what are you supposed to say? You don't say, no, that couldn't happen. You say, well, yes, that could happen, but I do this risk-reward calculation and I decide that's an acceptable risk. So the idea that somebody is going to have a camera on the PATH train to Hoboken and try to correlate downloaded rolling proximity identifiers with camera images of the people on it: yeah, I guess in theory there's probably a hack there. But at some point I think there's this opinion that many like myself would have, which is, that seems like a pretty okay edge case to me.

 

Ed Amoroso (00:25:13):

And there are ways around that. You know, forget about COVID: if somebody is on the PATH train with a camera, I'm thinking that's a little weird to begin with. So I'm just saying that in cybersecurity we have this process where a technology is proposed, and then there's this platter of different problems that are pointed out, and then the dust settles. Most of it's space junk, but occasionally there are big problems that don't settle. And I think that the registration issues and phishing issues that we talked about earlier are substantive risks. Those are ones that I think are, well, almost showstoppers if they're not properly attended to. I mean, I have great respect for Philip's point, and Lorna knows a lot about this, so they would have commentary on whether there are serious privacy implications.

 

Ed Amoroso (00:26:08):

Like, does this create a tailwind for these types of things, the way we saw post-9/11 with some privacy legislation? And to Lorna's earlier point, does the epidemiology support this? If you could give America assurance, and I know you can't, but if you said, we all adopt this and in a year this is eradicated, who wouldn't adopt it? You'd probably say, that's pretty compelling, let's do this. But because it's somewhat unclear, and you don't know who's signing up, the risk equation is not as clear. I'm just saying, from the security community, which is where I play: the crypto's okay; infrastructure and phishing, that's a problem. And all these things you see on social media, you know, I could hack it this way, I could hack it that way? Yeah, it sounds like you were in one of our classes and we showed you that that was a problem five years ago. Those are acceptable; they're not showstoppers. Phishing is a showstopper.

 

Rachel Levinson-Waldman (00:27:11):

Thanks. That's really helpful. And I think this is a perfect time to turn to Lorna, to have you share some general thoughts, your analysis from an epidemiologist's point of view, about these kinds of proposals. In addition to those general questions, one question that comes up: if something like this doesn't work, is that a negative from a public health point of view? Is it not just that we haven't communicated to people that they're potentially infected and should isolate; does it actually undermine some element of trust in the public health system overall? How do you assess that?

 

Lorna Thorpe (00:27:51):

Right. You know, I think about the fundamental goal. The fundamental goal of contact tracing and contact tracing programs is to interrupt transmission in a very timely manner and to reduce the impact of an outbreak. And the contact tracing program blueprint involves, as we said earlier, identifying contacts, testing contacts, and, if they're positive, identifying their contacts and testing their contacts. It's a very iterative process, and it requires a great amount of human contact. And I have very little faith in a technology-led solution for this, because even in the countries that have used some of these apps, Singapore, Korea, Germany, and elsewhere, they are the first to say that these technologies may support and aid in this effort, but they don't drive it. A very carefully executed plan drives it.

 

Lorna Thorpe (00:29:00):

And in a country like ours, I don't think it can be a national program, and I don't actually think one trusted broker could really lead this. I think it has to be at the local level, led by public health departments at the state and local level. And it can be supported by having one app. If we support those local health departments in their initiatives, having apps to support that work makes a lot of sense; having a proliferation of apps carries great risk, as Ed clearly articulated. But I also think the understanding of what's really working only happens when you have really careful monitoring of what's happening: are the steps being taken to identify contacts, test contacts, quarantine contacts, test them, and identify their contacts if they're positive? That whole circle is a critically orchestrated strategy that can't happen when people may or may not react to their phones, may or may not go get tested, may or may not notify someone that they're positive. It's not going to happen by people opting in the way they choose to opt in, because we will all have different reactions to the apps and what they're telling us to do.

 

Rachel Levinson-Waldman (00:30:29):

Thanks. I wonder if you could actually talk a little bit more about what the human tracing process has looked like. Say I come down with COVID, or with some other reportable infectious disease for which there needs to be contact tracing, under normal circumstances where there's capacity for contact tracing. What happens? What does that actually look like?

 

Lorna Thorpe (00:30:58):

Well, I started to describe this earlier. Clearly the first thing is to gain the trust of the actual person who's infected and have them understand the process; that's a critical step. If they understand why we're eliciting their contacts and how we're going to handle that information, then they are cooperative and they work with us. The questions really need to be carefully asked and non-judgmental. People may come from all sorts of walks of life and have all sorts of privacy risks and concerns. So the nonjudgmental, culturally appropriate, and sensitive ways to ask questions about contacts, the grounding in actual dates and times, having them walk through their incubation period starting with the most recent events and working their way backwards: it takes some training, and it takes some skill. Sometimes it gets into areas that are complicated, because it may involve illegal practices, it may involve substance abuse, it may involve partners who are not so nice, and all sorts of aspects that really require careful handling of that information. And if we're going to really stamp out disease in a community, you need to handle each case very carefully, because each case matters tremendously.

 

Lorna Thorpe (00:32:23):

Let me just say one more thing, Rachel, I'm so sorry. The reason I think it really is a health department function is because health departments have been doing this for a very long time. They have HIV registries; they have registries of people who have reportable sexually transmitted diseases. And health departments know that the moment you break the trust of the public with that information, you've lost all credibility to be able to perform this function in the future. So there's a longstanding understanding of how to hold that information, sometimes imperfectly, and use it for only appropriate purposes.

 

Rachel Levinson-Waldman (00:33:06):

Thanks. That's helpful. And then, as a practical matter, if I'm contacted, for instance, by a staff member from the department of public health who's asking those questions and getting that information, where is that information then typically held? Is it just within that health department? Is it shared out with federal authorities? Are there other government agencies that it could be shared with? What does that look like?

 

Lorna Thorpe (00:33:30):

Right. So in the world of public health surveillance, there are reportable conditions and non-reportable conditions. And even within reportable conditions, there are tiers: should we report this in aggregate, because it's not that critical to know every single individual? So individual names are not transported from a local health department to a state health department or to the Centers for Disease Control. For certain types of highly concerning infectious diseases that may have large implications, individual names are sometimes elevated and reported upward to the state or federal level, but only if necessary, and those cases are usually very carefully thought through. There are laws that have been developed to decide what's reportable at an individual level and within what period of time: is this reportable within 24 hours, like botulism, or is this reportable on a monthly basis, like chlamydia? And these laws, after most outbreaks or pandemics or experiences like the one we're having now, are often rethought and revised accordingly. I'm not saying it's a perfect process, but it is, I think, one that is borne of scientific need, in foresight and hindsight.

 

Rachel Levinson-Waldman (00:34:55):

Thank you. That's really helpful. And I think that raises a question, maybe for some combination of you and Ed, which is: the idea with one of these apps is that you self-report if you're infected, right? That's what would trigger the process. And I see Ed shaking his head; perhaps the idea is that it's a health department that really has control of that. So I guess that's one of the questions: what does that reporting look like? Is there a single model being proposed, or are there multiple threads?

 

Ed Amoroso (00:35:28):

Yeah, I can explain. So Lorna is right: you want the local communities to do what they're good at, which is making us and keeping us healthy. That's what we want them to do. And the way the Google-Apple proposal works is that there are these cryptographic diagnosis keys that get blasted out to the device, presumably from the cloud. In fact, they're kind of amorphous in the description; they say some cloud thing, almost that general, the way they describe it. But the idea is that your local hospital, your healthcare, that whole healthcare ecosystem, would, through this Apple-Google API, send these diagnosis keys. And then there's a resolution function on your device that helps notify you that you've been near someone. So there are no names, no obvious privacy issue there. You just get notifications through that trusted interface covering some window of time they give you, and they also have other things in that resolution function around the amount of contact, the amount of time you were in contact, metadata like that around it. I think that, at least in the way Google and Apple designed this, they did in some sense try, and I don't know how perfect it is, Lorna would have more to say about this, but they did try to accommodate the idea that the healthcare providers should keep us healthy, and the app providers should just make sure that we're not spilling data, doing it wrong, and having our own kind of outbreak, the kinds of things computer scientists worry about. We're not healthcare providers; that's not our job. But Lorna is right, and Philip alluded to it as well: you do want decentralized, trusted organizations in your community participating here. What I was saying, simply, is that if you have ten different places where I can download apps, and these diagnosis keys have to get blasted out to a hundred different apps from your local hospital, it can get kind of chaotic, and that's where the computer science becomes more of a risk. So by centralized I by no means mean the healthcare is centralized; I mean the place you download the app and the place you call if you have a problem. If there are a hundred different lines, I can guarantee you half of them will be fraudulent.
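
A minimal sketch of the kind of health-authority gating Ed describes, in which the health system, not a bare self-report, controls what gets published. Every name here is hypothetical; real deployments used comparable one-time verification codes or signed certificates issued alongside a positive test result.

```python
import os
import secrets

# Sketch of a health department gating diagnosis-key uploads so that
# only lab-confirmed cases can trigger exposure notifications.
# All names and formats here are hypothetical.

issued_codes = {}    # one-time codes handed out with positive results
published_keys = []  # diagnosis keys the server pushes to all phones

def issue_verification_code() -> str:
    """Health department issues a single-use code to a confirmed case."""
    code = secrets.token_hex(4)
    issued_codes[code] = True
    return code

def upload_diagnosis_keys(code: str, keys: list) -> bool:
    """Server accepts keys only with a valid, unused code, so a bare
    self-report cannot flood the system with false alerts."""
    if not issued_codes.pop(code, False):
        return False
    published_keys.extend(keys)
    return True

code = issue_verification_code()
print(upload_diagnosis_keys(code, [os.urandom(16) for _ in range(14)]))  # True
print(upload_diagnosis_keys("forged-code", [os.urandom(16)]))            # False
```

The design choice this illustrates is that the server never learns who was near whom; it only checks that an upload is backed by a code a health department actually issued.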

 

Rachel Levinson-Waldman (00:38:03):

Thanks, Ed. So Philip, I want to bring you back into the conversation. Obviously we're talking about this through the lens of these technologically driven proposals, but of course, at bottom, what's going on here is a global health pandemic that is threatening the lives, stability, and livelihoods of hundreds of millions of people around the globe. And this is something into which you have had an almost unmatched perspective, and something I know you've been very vocal about recently. I'd love to hear from you, even stepping away a little bit from the proposals, although certainly incorporating those as well. What are you seeing? And this certainly relates to whether these proposals even help in the first place. What do you think are the most important first measures, or among the most important measures, to both bring this to heel and to try to mitigate some of the suffering that's happening as a result?

 

Philip Alston (00:39:10):

Okay. Well, that's a small question, Rachel. I guess what you're really asking is where this sort of discussion about an app fits into the much broader array of issues involved in responding to COVID-19. I think the starting point, from my perspective, is that this is obviously a major, fundamental public health crisis, and if we could get any sort of tracing app that would really be effective, there could be no possible objection to it. It's just that all of the technical analysis that I've seen suggests that it would be much less reliable and much more potentially chaotic than we would hope for, and therefore we need to think about it very carefully. There is, of course, just in relation to the app, the poverty dimension, if you like. The figures are that 81% of people in the US have smartphones, and I suspect that not all those smartphones would have Bluetooth, which is important for these apps. So you're going to have a lot of people who are not actually going to be covered, and that already raises major problems, because they're also going to be the ones who are most vulnerable. What we've seen is that the rates of deaths and infections among African Americans and the Hispanic community are much higher. And those communities, because they're poor, because their health status is neglected by the government, are also not going to have the smartphones, and thus not be reporting and reported on. So that raises big problems. So my general response, though it's maybe not very helpful on this panel, is that while we need to do everything we can from the narrow epidemiological point of view to trace people and try to reduce the spread,

 

Philip Alston (00:41:29):

and so on, the longer-term actions that are required go much beyond that. The reality in the United States is that the healthcare system has been run down so much, and the administration has wasted no opportunity to try to strike millions, tens of millions, of people off health insurance rolls if it possibly can, to reduce food benefits, and so on. With all of that, whatever your politics, you are setting up a much less secure, much more vulnerable population, which is going to be very ill-equipped to deal with the future pandemics that we're now all the more confidently expecting to afflict us.

 

Rachel Levinson-Waldman (00:42:17):

Sorry, I muted myself. Thank you. That is insightful and bracing. One of the things that you touched on a bit, and we haven't really gotten into, but which seems like an incredibly important part of this, is the rate of testing. We know that the rate varies wildly from country to country, and the US has had a really poor showing with testing. It seems it's getting better, to some extent; I can really only speak to that locally. I know we're starting to get messages now saying, if you have reason to believe you have symptoms, you can actually get a test. But it's taken a long time, and testing is not widely available. So maybe this is a question first to Lorna as the epidemiologist, but I can also open it up generally. Part of the question is: is it even worth talking about these kinds of apps with such insufficient testing? And what level of testing do we need to see, generally, for contact tracing to be useful, and for these methods of contact tracing to be useful?

 

Lorna Thorpe (00:43:16):

Right. Again, if we think about the fundamental goal of interrupting transmission, contact tracing and testing go hand in hand; they can't be separated. The whole purpose is to find cases, isolate them, take them out of the community, and identify what other cases are out there, and the other cases come from the contacts. We are in a situation where the testing capacity around the country is inadequate overall, and in some parts of the country more inadequate than others. I think we are beginning to see important collaborations to improve that, across state boundaries and between public and private institutions, and we need to continue to make that a top priority in this country. And as for the contact tracing workforce, as both Philip and Ed have said, the health department infrastructures know what to do.

 

Lorna Thorpe (00:44:23):

They don't necessarily have the resources anymore, especially over the last 10 to 20 years, to be able to do all of this contact tracing by themselves. So it is going to take partnerships, potentially including partnerships with the tech industry. I think there is a role for apps on many levels. Even for the individuals who are home and quarantined, it takes person power to call them, to check in on them and make sure they're okay, to see if they're developing symptoms, and to set up testing if necessary. That could be a very nice niche for apps: monitoring symptoms on a daily basis through symptom logs and things like that. And there are other ways we can use technology to guide us. The information from cell phone users, monitored in aggregate, has been invaluable to us in terms of really understanding the actual uptake of social distancing policies, and it allows us to feed that back into our mathematical models to understand whether we are having an impact on transmission in real time.

 

Rachel Levinson-Waldman (00:45:41):

That's great. And I wonder if you could speak to that. We've been talking a lot about contact tracing to notify individuals when they might be at risk or might have been exposed, but there is this other piece of it: broader social distancing measures, and the extent to which location information can show whether those have been effective. The way I've understood it is, before coronavirus hit and before people started social distancing, if you looked at my neighborhood, you would see people going all over the place, a lot to downtown, various other places. Now, hopefully, you see fewer of those population-wide movements from one part of the city to another, and you can generally gauge that. Which in part, I think, raises the question of public health education campaigns: being able to draw on some of that information to say, great job, Des Moines, it looks like you're staying home a lot, keep doing that; and, ooh, some other city, look, we can tell you're still going out a bunch to wherever it is. I wonder how you see that playing out, and whether there are models for those kinds of public education campaigns drawing on some of this data.

 

Lorna Thorpe (00:46:59):

My colleagues can also speak to this. But, you know, with every major disaster comes a lot of innovation, and we've seen some really fast innovation in this space. There are some wonderful websites that track social distancing uptake using cell phone data and other data sources. I'm involved with a colleague at the Marron Institute here at NYU, and we just received an NSF award to use cell phone information to try to measure, at a very granular level within cities, how encounter and contact density changes over time. So that is, I think, a real opportunity. I think my colleagues will want to comment.

 

Rachel Levinson-Waldman (00:47:49):

Absolutely. Ed or Philip, do you have anything to add on that score?

 

Ed Amoroso (00:47:53):

Well, Rachel, if you're asking about awareness: in the world that I live in, we're always trying to get human beings to make good decisions about things, and frankly, our track record has not been all that good. For these types of schemes, whether it's medical professionals making good decisions around an app and infrastructure, or people making good decisions or interpreting things properly, my experience, sadly, is that far too many people completely misinterpret and make bad decisions about how technology is used. So that does have to be factored into any type of deployment. Just walk through a typical scenario in your own mind: let's say you sign up for this, you get it on your phone, you hop on a subway, you go home, and then you get a notification that you were near someone. Now, what do you do?

 

Ed Amoroso (00:48:52):

Do you trust it? You say, wow, if I hadn't downloaded this, then I could go to work tomorrow, but I did download it. And now, do I even trust this stupid thing, or do I just delete this dumb app, because I really would like to go to work tomorrow? That's the point; we see that in cybersecurity all the time. And that equation in your mind can only be made properly if you've got really good communication about how these things work, and you really trust it, and you see a lot of examples, and the people around you are supportive of the tough life decisions that you'd be making based on a notification from an app. Like: I'm not going to open my bakery tomorrow because I got a little blip from an app. You see what I mean? So we have that insecurity, and again, that was my point earlier: there's all that stuff that makes these types of things very difficult. It's not the crypto. It's whether you trust that little note from the app enough to shut down your bakery tomorrow, yes or no. A lot of people are going to say, you know what, forget this stupid app, I feel fine. And they're going to go to the bakery.

 

Rachel Levinson-Waldman (00:50:14):

Yep. I think that also tees up really well a question that's come through, which I think is well taken in this context. We've been talking about this with the assumption that the app would be voluntarily downloaded, and I don't think there's another possible model in this country; I don't think there's a scenario in which something mandatory is put onto everyone's phones. And frankly, even if there were, as we've already talked about, cell phone use and ownership aren't uniform, and the capacities of those cell phones aren't uniform. But say there were some sort of mandatory scheme, where, I assume, the idea is that there's much more information coming in, and it has to be on your phone. Obviously that begs the question of whether you're then required to take your phone with you everywhere you go. But it's on your phone, and there's medical information that's going to be uploaded to it if you test positive. I'm wondering what people think the implications are; there could be privacy implications, ethical implications, and just efficacy implications. If it's involuntarily on your phone, do you then, and Ed, this very much goes to your point, make a decision: well, absolutely not, I'm not taking my phone with me wherever I go, and I don't care what comes through on it, because I didn't want to put this on in the first place?

 

Philip Alston (00:51:46):

Yeah. I think, first of all, yes, we're in the United States, and that's where the debate is taking place, and it's going to be much harder, for constitutional and federalism and other reasons, to make it mandatory. But we should keep in mind that there are a lot of countries that have already done something mandatory, and that have both the capacity and the intention of doing so. And before we get smug and say, well, that can't happen here in the United States, I think it's very important to take Ed's example. I'm going to work out some way of keeping him from that goddamn bakery tomorrow morning. I don't want him going there; I've had some indication that he might have been exposed, and I don't want him to go. I'm going to find some way, not of sending the police around to lock his door, but of exposing the fact that he has got this notification and is ignoring it, or has turned off his cellphone. There are different ways in which one can imagine scenarios that fall short of mandatory application but can stigmatize, can create pariahs of, people who have been notified in some way.

 

Philip Alston (00:53:14):

And this is a big public health issue; we can't just keep it quiet if we know that this damned Amoroso is heading off to his bakery every day despite the beeping on our app. So I think it's a very good example of a problematic scenario, in fact, and it doesn't necessarily have to go to the extreme of Andrew Weissmann's question of whether, if it could be made mandatory, we would immediately opt for that.

 

Lorna Thorpe (00:53:51):

And I might add something. I mean, historically, going back to the role of contact tracing, it has always had an educative component. It's educative at the individual level, but it's educative in a broader way as well. So I don't think you get the buy-in without that interaction with the individuals, number one. And number two, we haven't raised on this call the fact that the countries using apps now, even tiny countries such as Singapore that have very, very strong infrastructure to roll out something like this, have not achieved the penetrance that they need to make it effective. So we don't have effective models. In Singapore, I don't know where we're at now, but the last numbers I saw, they were under 20% uptake. There may be something more recent, but that's far from 60%, in a tiny country. And so we don't have evidence that this will work.

 

Rachel Levinson-Waldman (00:54:59):

And speaking to Philip's point about the shaming that could come out of it, that is something that we've seen. In the South Korean model, there have been reports that that was one of the consequences, because there was so much specific, individualized information being shared. You could see somebody's path of locations; it was either explicitly clear or effectively clear who individuals were, which was then leading to a lot of blowback and backlash against individuals.

 

Lorna Thorpe (00:55:31):

Right. And so some prominent leaders and physicians in South Korea have been talking about the need to rethink this and move back to a more aggregate model of using all these data, rather than the individual, targeted shaming model that they began rolling out very aggressively in response.

 

Rachel Levinson-Waldman (00:56:01):

So I think we have time for maybe just one, possibly two, more questions; I know we're coming up on five o'clock. There's an interesting question that came in, and Philip, this one might be best directed to you. The question is: what role would a more robust social assistance infrastructure, so income assistance, something like that, play in people's willingness to adopt some kind of app that would make them stay at home? Speaking to Ed's point: this could have a major effect on my life, but I don't really want to pay attention to it. And obviously part of the reason for that is the absence, especially in this country, of any kind of significant social support system. So I wonder what your, or anyone else's, thoughts are on how those might play together.

 

Philip Alston (00:56:48):

Yeah. I think that's a really key point. I've been doing quite a lot of work on the development of biometric identification systems in different countries around the world, and very often these are sold on the basis that we're really dedicated to providing better social protection and better financial inclusion for people. But of course, the reality is that once these systems are set up, they are used largely in a negative way: in other words, to monitor people, to discipline people, to take benefits away, to sell financial services, and so on. It's not really part of an effort to provide a much better service for the citizen. And in the United States, and this goes to the points that both Lorna and Ed have made, if this was really part of a package, where, okay, we're going to have to rethink community health, and community health is going to have some of this dimension with the tracking and so on, but it's also going to have a support system so the people who are alerted will then have a fallback, then you're going to get buy-in at a much greater level. But in fact, that's not what's being proposed, and there's always this sort of punitive, semi-mandatory dimension in the background. And that makes it infinitely less attractive.

 

Rachel Levinson-Waldman (00:58:22):

Let me ask one quick question; there's one more that came through, and I think this is also an interesting question, although I may show my hand a little bit in my concerns about how this might play out. The question was whether there is some way that an app like this could be used to address specific settings: specific workplaces, specific facilities, some kind of communal living facility, something like that. It could help make decisions about what sort of place of business or place of support stays open, and under what circumstances workers are required to come in, encouraged to come in, or allowed to come in. Obviously, even right now, there are a lot of essential workers who are still going to their jobs. So, ways to use one of these kinds of apps for that.

 

Rachel Levinson-Waldman (00:59:11):

My concern, a little bit, is that I'm not sure there's any way to make those kinds of determinations without also having an individual aspect to it. It makes me worry a little bit that we would be looking more towards a China-type model of green, yellow, red: I wake up in the morning and look at my smartphone to see if I'm allowed to go someplace. But that being said, the question might be more directed at: okay, if we know that there's been an exposure in a particular place, do we then make different kinds of decisions about that?

 

Ed Amoroso (00:59:45):

Rachel, I suspect the inmates at Rikers Island would love to be issued iPhones to carry around. So I'm going to guess that would be a very popular proposal in most prisons.

 

Rachel Levinson-Waldman (00:59:57):

Well, I mean, that does speak to something we probably don't have time for now: the incredible impact of this on detention facilities, facilities in which people aren't allowed freedom of movement or freedom of separation, and are being kept despite...

 

Ed Amoroso (01:00:12):

This wouldn't be the way to protect them; this is not the kind of infrastructure for that. Obviously, in nursing homes and in prisons and, apparently, in meat processing plants where people are so close to each other, it strikes me that we can more or less presume that there's a catastrophe going on in those places now, and contact tracing doesn't seem like the type of solution for the problem that they've got. There's a much deeper problem there. Again, Lorna and Philip might have a much better answer, I'm a computer scientist, but it seems pretty obvious that where you have that clustering of people, you don't really need a contact tracing app to tell somebody that there's likely a problem. So that doesn't jump off the page as the highest-return place to deploy this type of app and infrastructure. It seems more suited to the person commuting and casually going out, buying lunch, going to the office, and going back. Like all of us, when we get back to some semblance of normalcy, we'll have to decide whether we want this sort of thing on our phones. You get home every night and you see if you came into contact with someone, were in proximity, and you're probably wearing a mask, and maybe you just note that, and maybe once a month the thing dings and you're comfortable with that.

 

Ed Amoroso (01:01:40):

But if it dings 50 times in a day, then maybe that's something to attend to. Again, those are the ops concepts that I think lend themselves to this, not giving one to everybody in a Tyson plant. And I suspect that the Tyson workers are probably not going in with their iPhones jammed in their pockets either; I think that's usually not the kind of thing they carry around a factory floor.

 

Rachel Levinson-Waldman (01:02:06):

I think that's probably right. Well, let me ask: Lorna, Philip, do you have any last words that you would like to offer?

 

Lorna Thorpe (01:02:15):

I think these are really critically important questions. Thank you for putting together this panel. And I think that what we've learned, maybe, among the three or four of us here, is that there are some promising opportunities, but really the overarching strategy needs to be laid out before we can understand what the role of these apps is; it's sort of a cart-and-horse issue. I think we need to think that through.

 

Philip Alston (01:02:47):

Yeah, I would just underscore that. As I said earlier, the risk of thinking there's a magic bullet, and this is it, is very dangerous. Unless it's part of a really comprehensive strategy, which would then make it much more attractive, we shouldn't be rushing into it.

 

Rachel Levinson-Waldman (01:03:10):

Well, thank you so much to the three of you. This was a really fascinating discussion. I'm so grateful to have been able to learn from you, and to have the three of you really move this conversation forward. Thank you again to Marc and to the sponsoring organizations for inviting me to be part of this, and again, thank you to everyone for tuning in. I'll quickly turn it back to Marc to close us out.

 

Marc Canellas (01:03:35):

Thank you, Rachel. And thank you also to Lorna, Philip, and Ed for taking time out of your very busy schedules and helping us understand what we should be asking as these contact tracing apps start to be deployed. Thank you also to our sponsors here at NYU Law: the Center for Cybersecurity, the Engelberg Center on Innovation Law & Policy, and the Information Law Institute. Once again, I am Marc Canellas from NYU Rights Over Tech. Please stay safe and healthy.