PROFESSOR: All right. Let's get started. Welcome, ladies and gentlemen. Today it's my pleasure to introduce two guest speakers who will talk about the regulation of AI and machine learning, both at the federal FDA level and in terms of IRB oversight within institutions. The first speaker is Andy Coravos. Andy is the CEO and founder of Elektra Labs, which is a small company that's doing digital biomarkers for health care. And Mark is a data and software engineer for the Institute for Next Generation Healthcare at Mount Sinai in New York and was kind enough to come up to speak to us today. So with that, I'm going to introduce them and sit back and enjoy.

ANDY CORAVOS: Thank you. Thank you for having us. So I am working on digital biomarkers, and I'm also a research collaborator at the Harvard-MIT Center for Regulatory Science. So you all have a center that is looking just at how regulators should think about some of these problems. And then I'm also an advisor at the Biohacking Village at DEFCON, which we can talk a little bit more about. My background: I'm a software engineer, I worked with the FDA as an entrepreneur-in-residence in the digital health unit, and then spent some time in corporate land.

MARK SHERVEY: I'm Mark Shervey. I work at the Institute for Next Generation Healthcare at Mount Sinai. I've been there about three years now. My background is in software and data engineering, coming mostly from banking and media, so this is a new spot. Most of my responsibilities focus on data security and IRB and ethical responsibilities.

ANDY CORAVOS: We also know how much people generally love regulatory conversations, so we will try to make this very fun and exciting for you. If you do have questions, since regulations are weird and they're constantly changing, you can also shoot us a note on Twitter. We'll respond if things come up.
Also, the regulatory community on Twitter is amazing. When somebody comes out with, say, what does real world data actually mean, everybody is talking to one another. So once you start tapping into it-- I'm sure you have your own Twitter communities, but if you tap into the regulatory Twitter community, it is a very good one. The FDA's digital health unit tweets a lot.

OK. Disclaimers: these are our opinions, and the information that you'll see here does not necessarily reflect the views of the United States government or the institutions that we are affiliated with. And policies and regulations are constantly changing, so by the time we have presented this to you, most likely parts of it are already wrong. You should definitely interact early and often with the relevant regulatory institutions. Your lawyers might say not to do that; there are definitely different ways, and we can talk through how you'd want to do it. But especially as a software engineer developing anything on the data side, if you spend too much time developing a product that is never going to get through, it is really a wasted period of time. So working with the regulators, and given how open they are right now to getting feedback, as you saw with the paper that you read, is going to be important.

And then the last thing, which Mark and I talk a lot about, is that many of these definitions and frameworks have not actually been settled yet. And so when somebody says biomarker, they might actually not mean a biomarker; they might mean a measurement. I'm sure you know this: when someone says, I work in AI, you ask, what does that actually mean? So you should ask us questions. And if you think about it, the type of knowledge that you have is a very specific, rare set of knowledge compared to almost everybody else in the country. And so as the FDA and other regulators start thinking about how to regulate and oversee these technologies, you can have a really big amount of influence.
And so what we're going to do is a little bit of the dry stuff around regulatory, and then I am going to somewhat plead with you and also teach you how to submit public comments so that you can be part of this regulatory process. And then--

MARK SHERVEY: I will speak about the Institutional Review Board. How many people in here have worked with an IRB or are aware of them? OK, good. That's a good mix. So it'll be a quick thing, just reviewing when to involve the IRB, how to involve the IRB, things you need the IRB for and some things that you don't, as an alternative to taking the FDA approach.

ANDY CORAVOS: All right, good. I'll go first, then we'll go through IRBs, and then we'll leave the last part for your impressions of the paper.

OK. So before I start, I'll ground us in some ideas around algorithmically driven health care products. As you know, these can have wide ranges of what they can do. A general framework that I like to use is products that measure, diagnose, or treat. So measurement products might include things like digital biomarkers or clinical decision support. Diagnostics might take that measurement and then say whether or not somebody has some sort of condition, given those metrics. And then treatment is the set of ideas around digital therapeutics. How many people here think that software can treat a person? A few, maybe. OK.

And one thing that people don't always think about when they have these sorts of tools, and you all probably think about this a lot more, is that even something as simple as a step count is an algorithm. It takes your gyroscope, accelerometer, height, weight, and age, and then it predicts whether or not you've made a step. And if you think about the types of steps that different people make, older people drag their feet a little bit more than younger people.
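To make that concrete, here is a minimal sketch of the kind of algorithm hiding behind a step count, assuming a toy peak-over-threshold detector on accelerometer magnitude. The thresholds and the age adjustment are invented for illustration and are not any vendor's actual method.

```python
import numpy as np

def count_steps(accel_xyz, sample_rate_hz=50.0, age_years=30):
    """Toy step counter over a 3-axis accelerometer trace of shape (n, 3), in m/s^2.

    Real products also fold in gyroscope data, height, weight, and so on;
    every number here is illustrative, not a validated algorithm.
    """
    magnitude = np.linalg.norm(accel_xyz, axis=1)
    magnitude = magnitude - magnitude.mean()      # crude removal of gravity / DC offset

    # Hypothetical demographic adjustment: a shuffling gait produces smaller
    # peaks, so the detector lowers its threshold for older users.
    threshold = 1.2 if age_years < 65 else 0.8    # m/s^2, made-up values

    # Count upward crossings of the threshold, with a refractory period so a
    # single heel strike is not counted twice.
    min_gap = int(0.3 * sample_rate_hz)
    steps, last_step_idx = 0, -min_gap
    for i in range(1, len(magnitude)):
        if magnitude[i - 1] < threshold <= magnitude[i] and i - last_step_idx >= min_gap:
            steps += 1
            last_step_idx = i
    return steps
```

Even in this toy version, the demographic inputs change what gets counted as a step, which is exactly the point about different populations that follows.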
So an algorithm for counting steps in older people looks very different from an algorithm for counting steps in younger people. And so all of these tools have some level of error, and they're all algorithms, effectively.

One of my favorite frameworks as you start thinking about this: a lot of people are very interested in the measurement side, around what's called digital biomarkers. And it turns out that the FDA realized that many people, even within their own agency, didn't know what a biomarker was, and everyone was using the term slightly differently, probably the way people approach you slightly differently about what machine learning actually is. So there is a really good framework around the seven different types of biomarkers that I'd highly recommend you read if you go into this area. A digital biomarker, in my definition and in how other people have started to use the term, only describes the way that the measurement is collected. So you might have a monitoring biomarker or a diagnostic biomarker, but it is collected in an ambulatory, remote way that is collecting digital data.

And this type of data is very tricky. To give you an example of why this is particularly difficult to regulate, think about a couple of products that just look at something that would seem simple, like AFib. AFib is an abnormal heart condition, and you might have seen in the news that a number of different companies are coming out with AFib products. Put simply, there is obviously a large stack of different types of data, and one person's raw data is another person's processed data. So what you see on this chart is a list of five different companies that are all developing AFib products, from whether or not they develop the product internally, which is the green part, versus whether or not they might use a third-party product, so developing an app on top of somebody else's product. And so, in a broad way, you're thinking about going from the operating system down to the sensor data.
So somebody might be using something like a PPG sensor and collecting this sort of data from their watch, then doing some sort of signal processing, then making another algorithm that produces some sort of diagnostic, and then you have some sort of user interface on top of that. So if you are the FDA, where would you draw the line? Which part of this product, when somebody says my product is validated, should actually be validated? And then think about what it actually means if something is verified versus validated. Verified being: if I walk 100 steps, does this thing measure 100 steps? And validated being: does 100 steps mean something for my patient population or for my clinical use case?

And so one of the things that the FDA has started to think through is how you might decouple the hardware components from the software components, where you think about some of the hardware components as, effectively, the supply chain for collecting that data, and then you would be using something on top. And so maybe you have certain types of companies that do some sort of verification or validation lower down the stack, and then you can innovate higher up.

And these measurements have pretty meaningful impacts. In the past, with a lot of these tools, you really had to go into the clinic, and it was very expensive to get these sorts of measurements. More and more, a number of different companies are getting their products cleared for use in some care settings with a doctor, or possibly to inform decisions at home.

All right. And the last of these examples is around digital therapeutics. I had worked with a company that was using a technology based out of UCSF that is developing, effectively, a video game for pediatric ADHD. And so when kids play the game, they reduce their ADHD symptoms. And one of the things that's pretty exciting about this game is that it is a 30-day protocol.
And unlike something like Ritalin or Adderall, where you have to take that drug every day for the rest of your life as you reduce the symptoms, this seems to have an effect that, after 30 days, is more long-term: when you test somebody months down the line, they still retain the effects of the treatment. So this technology was taken out of UCSF and licensed to a company called Akili, who decided, hey, we should just structure ourselves like a drug company. So they raised venture capital like a drug company, they ran clinical trials like a drug company, and they're now submitting to the FDA and might be the first prescription video game. So anybody who was told that video games might rot your brain, you now have revenge, maybe.

So the FDA has been looking at more and more of these tools. I don't have to tell you; you're probably thinking a lot about it. And the FDA has been clearing a number of these different types of algorithms. And one of the questions that has come up is, what part of the agency should you think about? What are you claiming when you use these sorts of algorithms? Which ones should be cleared and which should not? And how should we really think about the regulatory oversight for them?

And a lot of these technologies enable things that are really quite exciting, too. So it's not just about the measurement, but what you can do with it. One thing that has a lot of people really excited is the idea of decentralized clinical trials. No blockchains here. You might be able to build it with a blockchain, but it's not necessary. So on the y-axis, you can think about where the data are collected: is it collected at a clinical site, or is it collected remotely? And then the method is how it's collected: do you need a human to do the interaction, or is it fully virtual? So at the top you can think about somebody doing telemedicine, where they call into somebody at home and then they might ask some questions and fill out a survey.
On the bottom, you can imagine being in a research facility where I'm using a number of different instruments; perhaps I'm in a Parkinson's study and you're measuring my tremor with some sort of accelerometer. And so the challenge is that a lot of people use all of these terms for different things when they mean decentralized trials. Is it telemedicine? Is it somebody who's instrumented with a lot of wearables? How do you know that the data are accurate? But this is, I think, in many instances really exciting, because the number one reason people don't want to enroll in a clinical trial is the chance of getting a placebo; nobody really wants to participate in research if they're not getting the actual drug. And the other reason is location. People don't want to have to drive in, find parking, and participate. This allows people to participate from home. And the FDA has been doing a lot of work on how to rethink the clinical trial design process and incorporate some of this real world data into decision-making.

Before I jump into some of the regulatory things, I want to set a framework for how to think about what these tools can do. These are three different scenarios of how you might use software in a piece of clinical research. Imagine that somebody has Parkinson's and you want to measure how their Parkinson's is changing over time using a smartphone-based test. You have a standard Parkinson's drug that they would take, and then you would collect the endpoint data, which is how you see whether that drug has performed, using a piece of software. Another idea would be, say you have an insulin pump, and then you have a CGM that is measuring your blood sugar levels, and you want to dose the insulin based on those readings. You might have software both on the interventional side and on the endpoint side.
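As a rough illustration of software sitting on both the interventional side and the endpoint side, here is a minimal sketch of that insulin pump and CGM scenario. The names and numbers are assumptions made up for illustration; this is not a medical dosing algorithm, just the shape of the closed loop being described.

```python
from dataclasses import dataclass

@dataclass
class CgmReading:
    minutes: int           # time since the start of the session
    glucose_mg_dl: float   # continuous glucose monitor value

def propose_dose(reading: CgmReading,
                 target_mg_dl: float = 110.0,
                 sensitivity_mg_dl_per_unit: float = 50.0) -> float:
    """Very rough proportional rule: units of insulin suggested for one reading.

    A real closed-loop system is a regulated combination product with trend
    estimation, insulin-on-board tracking, hard safety limits, alarms, and
    human oversight; none of that is modeled here.
    """
    error = reading.glucose_mg_dl - target_mg_dl
    return round(max(0.0, error / sensitivity_mg_dl_per_unit), 1)

def time_in_range(readings, low=70.0, high=180.0):
    """The same CGM stream can double as endpoint data, e.g. time in range."""
    if not readings:
        return 0.0
    return sum(low <= r.glucose_mg_dl <= high for r in readings) / len(readings)
```

Here the dosing function is the interventional software and the time-in-range summary is the endpoint software, which is why a product like this touches both the drug and the device sides of the agency.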
Or, like the company we talked about, which has a digital product, they said: the only thing we want to change in the study is that the intervention is digital, but we want you to compare us like any other intervention for pediatric ADHD. So we want to use standard endpoints and not make that an innovation.

The challenge here is that the first one, most likely, would go to the drug side of the FDA. The second one would go to both the drug and the device side of the FDA as a combination product. And the final one would just go to devices, which has generally been handling software. We've never really had products at the FDA, in my opinion, where this happens; we don't have drugs that can measure, diagnose, and treat and change in all these different ways. And so you now have software hitting multiple different parts of the system. Or it might even be the same product, but in one instance it's used as an intervention, in another instance it's used as a diagnostic, and in another it's used to inform or expand labeling. And so the lines are not as clean anymore about how you would manage this product.

So how do you manage these? There are a couple of agencies that are responsible for thinking through and overseeing health care software. The big one that we'll spend most of our time on is the FDA, but it's also worth thinking about how it interacts with some of the other ones, including ONC, FCC, and FTC. The FDA is responsible for safety and effectiveness, for facilitating medical product innovation, and for ensuring that patients have access to high quality products. The ONC is responsible for health information technology. And you can imagine where the lines between storing data and making a diagnosis on that data start to get really vague; it really might be the exact same product, with just a change in what claim you're making on it.
And most of these products have some level of connectivity to them, so they also are working with the FCC and have to abide by the ways that these tools are regulated by that agency. And then finally, and probably most interesting, is the FTC, which is really focused on informing consumer choice. And if you think about the FDA and the FTC, they're actually really similar. Both of these agencies are responsible for consumer protection, and the FDA really takes that from a public health perspective. So in many instances, if you've seen some of the penalties around somebody having deceptive practices, it actually wasn't the FDA who stepped in, it was the FTC. And I think some of the agencies are thinking about where their lines end and where others begin. And in many instances, as we've seen with a lot of the probably bad behavior that happens in tech, there are gaps across multiple places where nobody's stepping in.

And then there are also some non-regulatory agencies to think about. An important one is around standards and technology (NIST). You probably think about this all the time with interoperability and whether or not you can actually import the data. There are people who spend a lot of time thinking about standards. It is a very painful and very important job to promote innovation.

OK. So the FDA has multiple centers. I'm going to use a lot of acronyms, so you might want to write this down or take a picture, and I'll try to minimize my acronyms. There are three centers that will be the most interesting for you. CDER is the one for drugs, and this is the one where you would have a regular drug and possibly use a software product to see how that drug is performing. CDRH is for devices. And CBER is for biological products. I will probably use drugs and biologics in a very similar sort of way.
And the distinctions that we'll spend most of our time on are around drugs versus devices.

There's a bunch of policy coming out that is both exciting and making things change. One of the big ones is the 21st Century Cures Act. This has accelerated a lot of innovation in health care. It's also changed the definition of what a device is, which has a pretty meaningful impact on software. And the FDA has been thinking a lot about how you would actually incorporate these products. I think there are a lot of people who are really excited about them. There's a lot of innovation, and so how do we create standards both to expand labeling, to be able to actually ingest digital data, and to have these sorts of digital products actually under FDA oversight and not just weird snake oil on the app store?

But what is a medical device? Pretty much, a device is anything that isn't handled by the other centers, which makes the device center a big catch-all for all the other components. And so one of the big challenges for people is thinking about what a device is. If you think about what the FDA generally does, it doesn't always make sure that your products are safe and effective; it checks the claims you make about them being safe and effective. So it's really all about claims management: what you're claiming that this product can do, and evaluating that for marketing. Obviously, if your product causes very significant harm, that is an issue. But the challenge really happens when the product can do something that it doesn't necessarily claim to do, and then you are able to imply that it does other things. Most people don't really have a good understanding of the difference between a product that informs versus one that diagnoses, and so I think in many instances for the public, it gets a bit confusing.
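To illustrate how much of this turns on the claim rather than the code, here is a toy sketch. The keyword lists and categories are invented for illustration and are not the FDA's decision logic; the point is only that the same algorithm, described two ways, reads very differently.

```python
# Invented keyword lists, purely to illustrate claims-based thinking.
DIAGNOSTIC_WORDS = {"diagnose", "detect", "treat", "atrial fibrillation", "afib"}
WELLNESS_WORDS = {"track", "log", "general wellness", "fitness"}

def rough_claim_category(intended_use_statement: str) -> str:
    text = intended_use_statement.lower()
    if any(word in text for word in DIAGNOSTIC_WORDS):
        return "likely a medical device claim: talk to regulators and counsel"
    if any(word in text for word in WELLNESS_WORDS):
        return "reads more like a wellness claim, but still get advice"
    return "unclear: the wording, not the algorithm, is what matters"

# The same heart-rate algorithm, described two different ways:
print(rough_claim_category("Tracks your heart rate during workouts"))
print(rough_claim_category("Detects atrial fibrillation from your heart rhythm"))
```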
So as we talked about before, the FDA has been thinking about how you decouple the hardware from the software, and they've come up with a concept called software as a medical device: software that is effectively defined as having no hardware components, where you can evaluate just that product. This is pronounced SaMD.

And SaMDs are pretty interesting. This slide is very hard to read, but I pulled it straight from the legal documents, so you know I'm not changing it. Something that's interesting about SaMD: if you go all the way to the end, if you have electronic health care software that's just storing health data, that is not a SaMD, and it often can go straight to market and is not regulated by the FDA. If you have a piece of software that is embedded in a system, something like a pacemaker or a blood infusion pump, then that is software in a medical device, and that's not a SaMD either. So there's a line between these based on what the functionality of the product is, and then how serious it is, and that informs how you would be evaluated for that product.

And if you haven't noticed, I try to almost never use the term device. So when I talk about these connected wearables and other sorts of tools, I will use the word tool and not device, because device has a very specific meaning for the FDA. And so if you're curious whether your product is a device, or your algorithm is a device: first, you should talk to the regulators and talk to your lawyer. And we'll play a little game. There are two products here. One is an Apple product and one is a Fitbit product. Which one is a device? I'm going to call on somebody randomly, or someone can raise their hand and offer as tribute. OK, which one?

AUDIENCE: I think Apple received 510(k) clearance, so I'd say the Apple Watch is the device. I'm not sure about the Fitbit, but if it's one or the other, then it's probably not.

ANDY CORAVOS: That's very sharp. So we'll talk about this.
Apple did submit to the FDA for clearance, and they submitted for a de novo, which is very similar to a 510(k). And they submitted two products, two SaMDs. One was based on the signal from their PPG, and the second was the app. So it has two devices, neither of which is hardware. And the Fitbit has, today, no devices.

How about now? Is it a device, or is it not a device? Trick question, obviously, because there are two devices there, and then a number of things that are not devices. So it really just depends on what you are claiming the product does. And back to that idea of modularity: what is actually the product? Is the product a signal processing algorithm? Is the product an app? Is the product the whole entire system? And so people are thinking, strategically, frankly, about which parts are devices, because you might want somebody else to be building on your system. So maybe you want to make your hardware a device, and then other people can build off of it. There are strategic ways of thinking about it.

So the crazy thing here, if you can imagine this, is that the exact same product can be a device or not a device through just a change of words, with no change in hardware or code. So whether or not my product is a device is actually generally not the most useful question. The more useful question is: what is the intended use of the product? And so, are you making a medical device claim with what your product is doing?

Obviously this is all a little bit overwhelming, I think, in trying to figure out how to navigate it. And the FDA recognizes that, and their goal is to increase innovation. Particularly for products like software, which have constant updates, it seems a little bit difficult if you're constantly figuring out all the different words and how you're going to get these products to market.
So something that I think is really innovative by the FDA is a pilot; this is one example of a program that they're thinking through, and it's working with nine different companies. The idea is: can you pre-certify an entire company that is developing software as an excellent company across a series of objectives, and then allow them to ship additional updates? Today, if you had an update and you wanted to make a change, you have to go through an entire 510(k) or de novo process or other type of process, which is pretty wild. If you imagine that we would only let Facebook ship one update a year, that would be crazy, and we don't expect Facebook to maintain or sustain a human life. And so being able to have updates in a more regular fashion is very important. But how do you know whether a given change is going to have a big impact or not?

And I'll pause on this, but you all read the document. I'm actually very glad that you read this document without talking to us, because you were the exact audience of somebody who would not necessarily have the background, and it needs to be written in a way that is readable for people who are developing these types of products, so they know how to go into them. We'll save some time at the end for discussion, because I'm curious how you perceived the piece. But you should definitely trust your first reading as an honest, good reading. You also probably read it way more intensely than any other person who is reading it, so the notes that you took are valid. And I'm curious what you saw.

OK. Another thing to help you be cool at cocktail hour: FDA cleared is not the same thing as FDA approved. So for devices, there are three pathways to think about. One is the 510(k), the next is the de novo, and the next is a premarket approval, also known as a PMA. They're generally stratified by how risky something is.
And the type of data that you have to submit to be able to get one of these clearances varies: the riskier you are, the more data you have to have. So de novos are granted, but people often will say cleared. 510(k)s are cleared. Very few products that you've seen go through a PMA process.

AUDIENCE: I have a question.

ANDY CORAVOS: Tell me.

AUDIENCE: Do you know why Apple chose to do a de novo instead of a 510(k)?

ANDY CORAVOS: I am not Apple, but if I had to guess: once you create a de novo, you can then become a predicate for other things. And so if they wanted to create a new class of predicates that they could then build on over time, and they didn't want to get stuck in an old type of predicate system; I think, strategically, the fact that they picked the PPG and their app-- I don't know what they'll eventually do over time, but I think it's part of their long-term strategy. Great question.

OK. So the tools are safe and effective, perhaps, depending on how much data is submitted. But what about the information that's collected from the tools? Today, our health care system has pretty strong protections for biospecimens: your blood, your stool, your genomic data. But we really don't have any protections around digital specimens. You can imagine how many data breaches we constantly have and what ads get served to us on Facebook. A lot of this is considered wellness data, not actually health data. But in many instances, you can learn quite a lot of health information about somebody from it.

And I have a lot more; we can nerd out about this forever. But generally, there are a couple of things that are good to know. With most of this data, you can't really de-identify it anymore. Who here thinks I could de-identify my genome? You can't, right? My genome's unique to me.
Maybe you can strip out personally identifiable information, but you're not really going to de-identify it. I am uniquely identifiable with 30 seconds of walk data. All of these biometric signatures are pretty specific. And so there are some agencies today that are thinking about how you might handle these sorts of tools, but in the end there is, I think, a pretty substantial gap. In general, the FDA is really focused on safety and efficacy, and safety is considered much more as bodily safety, not the kind of safety that accounts for how programmable we are as humans through the information we see. So for the data that we collect, the FTC could have a lot of power here, but they're a much smaller agency that isn't as well resourced. And there are a couple of different organizations that are trying to think through how to do rulemaking for the Internet of Things and how that data is being used. But generally, in my opinion, we probably need some sort of congressional action around non-discrimination on the basis of digital specimen data, which would require a Congress that could think through what is, I think, a really difficult problem of how you would handle data rights and management.

OK. So I'll go through a couple of examples of how government agencies are interacting with members of the public, which I think you might find interesting. Many of the government agencies are realizing that they are not necessarily the experts in their field and are really thinking through how they get the data that they need. So here are a couple of pieces that will be interesting for you, I think. One is a joint group between the FDA and Duke, where they're thinking through what are called novel endpoints. So if you are working on a study today where you realize that you're measuring something better than the quote-unquote gold standard, and the gold standard is actually quite a terrible gold standard, how do you create and develop a novel metric that might not have a reference standard or a legacy standard?
And this is a way of thinking through that. The second is around selecting a mobile technology. This used to be called mobile devices, and they changed it for the same reason: not calling things a device unless it is a device. And so this is thinking through what type of connected tech you would want to use to generate the patient data that you might use in your study.

All right. Who here knows what DEFCON is? Three of you. OK. So DEFCON is a hacker conference. It is probably one of the biggest hacker conferences. It is a conference where, if you do have the joy of going, you should not bring your phone, you should not bring your computer, and you should definitely not connect to the internet, because there is a group called the Wall of Sheep, and they will just straight-up stream all your Gmail passwords in plain text, and your account logins, and anything that you are putting on the internet. This group is amazing. You may have also heard about them because they bought a number of voting machines last year, hacked them, found the voting records, and sent them back to Congress and said, hey, you should probably fix this.

DEFCON has a number of villages that sit under the main DEFCON. One of them is called the Biohacking Village. And there is some biohacking, like doing RFID chipping and citizen science. But there's also a set of people at the Biohacking Village who do what's called white hat hacking. For people who know about this, there's black hat hacking, where you might encrypt somebody's website and then hold them for ransom and do things that are disruptive. White hat hackers are considered ethical hackers, who are doing security research on a product. So the hackers in the Biohacking Village started to do a lot of work on pacemakers, which are connected technologies.
An easy way to think about how the pacemaker companies approach this is that they are generally trying to optimize for battery life. They don't want to do anything that's computationally expensive. Turns out, encrypting things is computationally expensive. The researchers did a relatively trivial exploit where they were able to reverse engineer the protocol. Pacemakers stay in a low power mode as long as they can; if you ping one, it will turn into high power mode, so you can drain a multi-year pacemaker battery in a couple of days or weeks. They were also able to reverse engineer the shock that a pacemaker can deliver upon a cardiac event. And so this exploit has pretty significant implications.

With any normal tech company, when you have an exploit of this type, you can go to Facebook, you can go to Amazon, there is something called coordinated disclosure, you might have a bug bounty, and then you share the update, you can submit the update, and then you're done. With the device companies, what was generally happening is that the researchers were going to the device companies saying, hey, we found this exploit, and the device companies were saying, thank you, we are going to sue you now. And the security researchers were like, why are you suing us? And they said, you're tampering with our product, we are regulated by agencies, we can't just ship updates whenever we want, and so we have to sue you.

Turns out that is not true. And the FDA found out about this and said, you can't just sue security researchers; if you have a security issue, you have to fix it. And so the FDA did something that was pretty bold, which was, three years ago, they went to DEFCON. And if anyone has actually gone to DEFCON, you would know that you do not go to DEFCON if you are part of the government, because there is a game called Spot the Fed, and you do not want to be found.
772 00:33:47,770 --> 00:33:55,250 And of course, NSA, CIA, a lot of members of the government 773 00:33:55,250 --> 00:33:57,290 will go to DEFCON, but it is generally 774 00:33:57,290 --> 00:33:59,330 not a particularly friendly environment. 775 00:33:59,330 --> 00:34:03,400 The Biohacking Village said, hey, we will protect you, 776 00:34:03,400 --> 00:34:05,360 we will give you a speaker slot, we really 777 00:34:05,360 --> 00:34:06,860 want to work together with you. 778 00:34:06,860 --> 00:34:09,110 And so over the last three years, 779 00:34:09,110 --> 00:34:10,942 the agency has been working closely 780 00:34:10,942 --> 00:34:12,650 with security researchers to really think 781 00:34:12,650 --> 00:34:15,980 through the best ways of doing cybersecurity, 782 00:34:15,980 --> 00:34:17,844 particularly for connected devices. 783 00:34:17,844 --> 00:34:19,969 And so if you look at the past couple of guidances, 784 00:34:19,969 --> 00:34:21,802 there's a premarket and post-market guidance 785 00:34:21,802 --> 00:34:23,330 where they've been collaborating, 786 00:34:23,330 --> 00:34:26,429 and they're very good and strong guidances. 787 00:34:26,429 --> 00:34:29,639 So the FDA did something really interesting, 788 00:34:29,639 --> 00:34:31,949 which was in January, they announced 789 00:34:31,949 --> 00:34:34,139 a new initiative, which I think is quite amazing, 790 00:34:34,139 --> 00:34:36,900 called #WeHeartHackers. 791 00:34:36,900 --> 00:34:40,949 And if you go to WeHeartHackers.org, 792 00:34:40,949 --> 00:34:44,790 the FDA has been encouraging device manufacturers, 793 00:34:44,790 --> 00:34:47,670 like Medtronic and BD, and Philips, and Thermo 794 00:34:47,670 --> 00:34:51,150 Fisher and others, to bring their devices 795 00:34:51,150 --> 00:34:55,650 and work together with security researchers. 796 00:34:55,650 --> 00:34:57,900 Another group that is probably worth knowing 797 00:34:57,900 --> 00:34:59,670 is that if you think about what a lot 798 00:34:59,670 --> 00:35:02,880 of these connected products do, they, in many instances, 799 00:35:02,880 --> 00:35:06,210 might augment or change the way that a clinician does 800 00:35:06,210 --> 00:35:07,050 their work. 801 00:35:07,050 --> 00:35:10,140 And so today, if you are a clinician 802 00:35:10,140 --> 00:35:11,970 and you graduate from med school, 803 00:35:11,970 --> 00:35:14,730 you would take something like a Hippocratic oath to do no harm. 804 00:35:14,730 --> 00:35:17,340 Should the software engineers and the manufacturers 805 00:35:17,340 --> 00:35:19,980 of these products also take some sort of oath to do no harm? 806 00:35:19,980 --> 00:35:22,690 And would that oath look similar or different? 807 00:35:22,690 --> 00:35:25,710 And that line of thinking helped people 808 00:35:25,710 --> 00:35:29,190 realize that there are entire professional communities 809 00:35:29,190 --> 00:35:32,190 and societies for people who do this sort of thing 810 00:35:32,190 --> 00:35:33,880 for doctors in their specialties, 811 00:35:33,880 --> 00:35:36,930 so a society for neuro oncology, society for radiology. 812 00:35:36,930 --> 00:35:38,940 But there's really no society for people 813 00:35:38,940 --> 00:35:40,530 who practice digital medicine. 
814 00:35:40,530 --> 00:35:43,470 So there is a group that is starting now, which you all 815 00:35:43,470 --> 00:35:45,510 might like to join because I think you would all 816 00:35:45,510 --> 00:35:48,280 be part of this type of community, 817 00:35:48,280 --> 00:35:51,240 which is the society for-- 818 00:35:51,240 --> 00:35:52,530 it's called the DIME Society. 819 00:35:52,530 --> 00:35:54,105 And so if you're thinking through, 820 00:35:54,105 --> 00:35:55,980 how do I do informed consent with these sorts 821 00:35:55,980 --> 00:35:57,990 of digital products, what are the new ways 822 00:35:57,990 --> 00:35:59,690 that I need to think through regulation, 823 00:35:59,690 --> 00:36:02,130 how am I going to work with my IRB, 824 00:36:02,130 --> 00:36:05,310 this society could be a resource for you. 825 00:36:05,310 --> 00:36:05,810 All right. 826 00:36:05,810 --> 00:36:09,350 So how do you participate in the rulemaking process? 827 00:36:09,350 --> 00:36:11,210 One is, I would highly encourage, 828 00:36:11,210 --> 00:36:13,940 if you get a chance to, to serve some time in government. 829 00:36:13,940 --> 00:36:16,550 There are more opportunities to do that through organizations 830 00:36:16,550 --> 00:36:18,217 like the Presidential Innovation Fellow, 831 00:36:18,217 --> 00:36:20,420 to be an entrepreneur resident somewhere, 832 00:36:20,420 --> 00:36:23,240 to be part of the US Digital Service. 833 00:36:23,240 --> 00:36:29,360 The payment system of CMS is millions of line of COBOL, 834 00:36:29,360 --> 00:36:32,930 and so that obviously needs some fixing. 835 00:36:32,930 --> 00:36:37,460 And so if you want to do a service, 836 00:36:37,460 --> 00:36:39,440 I think this is a really important way. 837 00:36:39,440 --> 00:36:40,940 Another way that you can do it is 838 00:36:40,940 --> 00:36:42,973 submitting to a public docket. 839 00:36:42,973 --> 00:36:45,140 And so this is something I will be asking you to do, 840 00:36:45,140 --> 00:36:48,770 and we'll talk about it after, is how can you 841 00:36:48,770 --> 00:36:51,470 take what you learned in that white paper and ways 842 00:36:51,470 --> 00:36:56,330 that you can share back with the agency of how you would think 843 00:36:56,330 --> 00:37:01,667 about developing rules and laws around AI and machine learning. 844 00:37:01,667 --> 00:37:04,250 There's a much longer resource that you can look at, my friend 845 00:37:04,250 --> 00:37:06,098 Mina wrote, which is that-- 846 00:37:06,098 --> 00:37:07,640 these are a couple of things to know. 847 00:37:07,640 --> 00:37:10,160 So anyone can comment, you will be heard. 848 00:37:10,160 --> 00:37:14,678 If you write a very long comment, someone at the agency, 849 00:37:14,678 --> 00:37:16,970 probably multiple, will have to read every single thing 850 00:37:16,970 --> 00:37:20,720 that you write, so please be judicious in how you do that. 851 00:37:20,720 --> 00:37:22,460 But you will be heard. 852 00:37:22,460 --> 00:37:26,360 And most of the time comments come from big organizations 853 00:37:26,360 --> 00:37:28,940 and people who have come together and not 854 00:37:28,940 --> 00:37:31,773 from the people who are experiencing and using 855 00:37:31,773 --> 00:37:32,690 a lot of the products. 856 00:37:32,690 --> 00:37:34,790 So in my opinion, I think someone 857 00:37:34,790 --> 00:37:37,220 like you is a really important comment and voice 858 00:37:37,220 --> 00:37:41,805 for the agency to have, and to have a technical perspective. 
859 00:37:41,805 --> 00:37:43,430 Another way that you can do this, which 860 00:37:43,430 --> 00:37:45,890 I'm going to put Irene on the spot, is we 861 00:37:45,890 --> 00:37:47,910 need new regulatory paradigms. 862 00:37:47,910 --> 00:37:51,690 And so when you are out at beers or ice cream, 863 00:37:51,690 --> 00:37:57,200 or whatever you do for fun, you can think through new models. 864 00:37:57,200 --> 00:38:00,920 And so we were kicking around an idea of, 865 00:38:00,920 --> 00:38:02,810 could you use a clinical trial framework 866 00:38:02,810 --> 00:38:04,880 to think about AI in general? 867 00:38:04,880 --> 00:38:07,580 So algorithms perform differently 868 00:38:07,580 --> 00:38:10,280 on different patient populations and different groups. 869 00:38:10,280 --> 00:38:12,027 You need inclusion/exclusion criteria. 870 00:38:12,027 --> 00:38:13,610 Should this be something maybe we even 871 00:38:13,610 --> 00:38:15,980 expand beyond health care algorithms 872 00:38:15,980 --> 00:38:19,220 to how you decide whether or not someone gets bail or teacher 873 00:38:19,220 --> 00:38:19,820 benefits? 874 00:38:19,820 --> 00:38:22,190 And then the fun thing about putting your ideas online, 875 00:38:22,190 --> 00:38:24,320 if you do that, is then people start coming to you. 876 00:38:24,320 --> 00:38:27,500 And we realized there was a group in Italy who 877 00:38:27,500 --> 00:38:29,890 had proposed a version of FDA for algorithms, 878 00:38:29,890 --> 00:38:31,670 and you start to collect people who 879 00:38:31,670 --> 00:38:34,800 are thinking about things that you're thinking about. 880 00:38:34,800 --> 00:38:36,330 And now we will dig into the thing 881 00:38:36,330 --> 00:38:38,038 that you most likely will spend more time 882 00:38:38,038 --> 00:38:42,030 with than the government, which is your IRB. 883 00:38:42,030 --> 00:38:43,030 MARK SHERVEY: Thank you. 884 00:38:45,700 --> 00:38:46,310 OK. 885 00:38:46,310 --> 00:38:50,190 I could probably not give the rest of this talk 886 00:38:50,190 --> 00:38:52,030 if you just follow the thing on the bottom. 887 00:38:52,030 --> 00:38:55,860 If you don't know if you're doing human subject research, 888 00:38:55,860 --> 00:38:59,978 ask the IRB, ask your professor, ask somebody. 889 00:38:59,978 --> 00:39:01,520 I think most of what I'm going to say 890 00:39:01,520 --> 00:39:05,390 is going to be a lot softer, squishier than what Andy went 891 00:39:05,390 --> 00:39:06,860 around, and it's really just to try 892 00:39:06,860 --> 00:39:10,490 to get the thought process going through your head of if we're 893 00:39:10,490 --> 00:39:16,850 doing actual human research, if the IRB has to be involved, 894 00:39:16,850 --> 00:39:20,450 what actually constitutes human research? 895 00:39:20,450 --> 00:39:23,210 And just to be sure that you're aware of what's 896 00:39:23,210 --> 00:39:25,350 going on there all the time. 897 00:39:25,350 --> 00:39:26,040 We've done this. 898 00:39:28,820 --> 00:39:32,720 So research is systematic investigation 899 00:39:32,720 --> 00:39:34,880 to develop or contribute generalizable knowledge. 900 00:39:34,880 --> 00:39:37,550 So you can do that on a rock. 901 00:39:37,550 --> 00:39:40,070 What's important about human subjects 902 00:39:40,070 --> 00:39:42,920 research is that people's lives are on the line. 
903 00:39:48,250 --> 00:39:50,020 Generally, the easiest thing to know 904 00:39:50,020 --> 00:39:53,470 is if there's any sort of identifiable information 905 00:39:53,470 --> 00:39:55,630 with the data that you're working with, 906 00:39:55,630 --> 00:39:58,630 that is going to fall under human subjects research. 907 00:40:01,300 --> 00:40:04,270 Things that won't are publicly available, anonymous data. 908 00:40:04,270 --> 00:40:08,130 There's all sorts of imaging training data 909 00:40:08,130 --> 00:40:11,660 sets that you can use that are anonymized 910 00:40:11,660 --> 00:40:14,200 to what is an acceptable level. 911 00:40:14,200 --> 00:40:16,450 But to Andy's point, there's really 912 00:40:16,450 --> 00:40:21,960 no way to truly de-identify a data set. 913 00:40:21,960 --> 00:40:25,740 And with the amount of data that we're working with all right 914 00:40:25,740 --> 00:40:29,630 now in the world, it's becoming impossible 915 00:40:29,630 --> 00:40:32,640 to de-identify any data set if you have any other reference 916 00:40:32,640 --> 00:40:33,140 data set. 917 00:40:33,140 --> 00:40:37,340 So anytime you're working with any people, 918 00:40:37,340 --> 00:40:39,500 you are almost certainly going to have 919 00:40:39,500 --> 00:40:42,350 to involve the IRB, again. 920 00:40:47,060 --> 00:40:49,610 So why the IRB is there, it's not specifically 921 00:40:49,610 --> 00:40:53,120 to slap you on the wrists. 922 00:40:53,120 --> 00:40:55,100 It's not that anything's expected 923 00:40:55,100 --> 00:40:56,630 to purposely do anything wrong. 924 00:40:56,630 --> 00:40:59,960 Although that has happened, that's such a small amount 925 00:40:59,960 --> 00:41:02,450 that it's just unhelpful to think 926 00:41:02,450 --> 00:41:06,560 that everybody is malicious. 927 00:41:06,560 --> 00:41:10,550 So you're not going to do anything particularly wrong, 928 00:41:10,550 --> 00:41:12,740 but there are things that you just may not know. 929 00:41:12,740 --> 00:41:17,090 And this is not the IRB's 1,000th rodeo, 930 00:41:17,090 --> 00:41:18,920 so if you bring something up to them, 931 00:41:18,920 --> 00:41:21,980 they'll know almost immediately. 932 00:41:21,980 --> 00:41:25,460 Participants are giving up their time and information, 933 00:41:25,460 --> 00:41:31,255 so the IRB, more than keeping the institution from harm, 934 00:41:31,255 --> 00:41:32,630 is really protecting the patients 935 00:41:32,630 --> 00:41:35,420 first and the institution at the same time. 936 00:41:35,420 --> 00:41:40,440 But the main role is to protect the participants. 937 00:41:40,440 --> 00:41:43,070 Specifically, here's something that might not 938 00:41:43,070 --> 00:41:45,950 go through everybody's head, research 939 00:41:45,950 --> 00:41:49,310 that may be questionable or overly manipulative. 940 00:41:49,310 --> 00:41:53,120 That gets into compensation for studies. 941 00:41:53,120 --> 00:41:57,950 You can imagine certain places in an impoverished nation 942 00:41:57,950 --> 00:42:03,200 that you say, we'll pay $50,000 per person 943 00:42:03,200 --> 00:42:06,860 to come participate in this study, 944 00:42:06,860 --> 00:42:09,260 you can imagine people want to be in that study 945 00:42:09,260 --> 00:42:11,150 and it can become a problem. 946 00:42:11,150 --> 00:42:15,500 So the IRB is also a huge part of making sure 947 00:42:15,500 --> 00:42:20,890 that the studies aren't actually affecting anybody negatively 948 00:42:20,890 --> 00:42:22,130 in that kind of sense. 
949 00:42:25,100 --> 00:42:28,040 Now, before I do, this next slide gets dark for a second, 950 00:42:28,040 --> 00:42:29,780 so we'll try to move through it. 951 00:42:29,780 --> 00:42:33,570 But it talks about how the IRB came about. 952 00:42:33,570 --> 00:42:38,150 So we start with the Nuremberg Code, human research conducted 953 00:42:38,150 --> 00:42:45,630 on prisoners and others, not participants 954 00:42:45,630 --> 00:42:49,190 but subjects of research. 955 00:42:49,190 --> 00:42:52,410 Tuskegee experiment, another thing 956 00:42:52,410 --> 00:43:01,240 that people were not properly consented into the studies. 957 00:43:01,240 --> 00:43:04,310 They didn't know what they were actually being tested for, 958 00:43:04,310 --> 00:43:08,530 so they couldn't possibly have consented. 959 00:43:08,530 --> 00:43:11,170 The study went for 40 years instead of six months. 960 00:43:11,170 --> 00:43:15,700 And even after a standard of care had been established, 961 00:43:15,700 --> 00:43:18,940 the study continued on. 962 00:43:18,940 --> 00:43:23,320 That essentially is what began the National Commission 963 00:43:23,320 --> 00:43:28,870 for Protection of Human Subjects, which led to the IRB 964 00:43:28,870 --> 00:43:33,120 being the requirement for research. 965 00:43:33,120 --> 00:43:36,540 And then five years later, the Belmont Report 966 00:43:36,540 --> 00:43:40,500 came out, essentially enumerating 967 00:43:40,500 --> 00:43:47,280 these three basic principles, respect for participants, 968 00:43:47,280 --> 00:43:50,910 beneficence as far as do no harm, don't take 969 00:43:50,910 --> 00:43:53,190 extra blood if it just makes it more convenient, 970 00:43:53,190 --> 00:43:58,800 don't add extra drug if you just want to see what happens, 971 00:43:58,800 --> 00:44:02,520 and then just making sure that participants 972 00:44:02,520 --> 00:44:09,940 are safe outside of any other harm that you can do. 973 00:44:09,940 --> 00:44:12,520 So we follow the Belmont Report. 974 00:44:12,520 --> 00:44:14,650 That's essentially the state of the art 975 00:44:14,650 --> 00:44:19,010 that we have now with modernization moving forward. 976 00:44:21,650 --> 00:44:23,770 This is not something to really worry about, 977 00:44:23,770 --> 00:44:27,220 but HHS has a great site that has 978 00:44:27,220 --> 00:44:31,270 a flow chart for just about any circumstance that you can think 979 00:44:31,270 --> 00:44:38,610 of to decide if you're actually doing human subjects research 980 00:44:38,610 --> 00:44:39,960 or not. 981 00:44:39,960 --> 00:44:44,550 This is pretty much the most basic one. 982 00:44:44,550 --> 00:44:46,710 You can go through it on your own. 983 00:44:46,710 --> 00:44:50,730 Just to highlight the main thing that I think you guys will all 984 00:44:50,730 --> 00:44:53,520 probably be worried about, is you 985 00:44:53,520 --> 00:44:57,120 will be collecting identifiable data, which just immediately 986 00:44:57,120 --> 00:45:00,510 puts you in IRB land. 987 00:45:00,510 --> 00:45:03,270 So anytime you can identify that that's 988 00:45:03,270 --> 00:45:05,460 a thing that's happening, you're just there, 989 00:45:05,460 --> 00:45:07,585 so you don't really have to go through any of this. 990 00:45:09,830 --> 00:45:11,670 What is health data? 991 00:45:11,670 --> 00:45:13,310 So you have names obviously. 992 00:45:13,310 --> 00:45:20,370 Most of these are either identifications or some sort 993 00:45:20,370 --> 00:45:21,270 of identifying thing. 
994 00:45:24,487 --> 00:45:26,070 The two, I guess, that a lot of people 995 00:45:26,070 --> 00:45:31,080 maybe gloss over that aren't so obvious are zip codes. 996 00:45:31,080 --> 00:45:34,320 You have to limit them to the first three numbers of a zip 997 00:45:34,320 --> 00:45:37,950 code, which gives a generalizable area 998 00:45:37,950 --> 00:45:45,120 without actually dialing in on a person's place. 999 00:45:45,120 --> 00:45:48,480 Dates are an extremely sensitive topic. 1000 00:45:48,480 --> 00:45:52,230 So anytime you're working with actual dates-- 1001 00:45:52,230 --> 00:45:55,178 and with wearable technologies 1002 00:45:55,178 --> 00:45:56,970 I assume you're going to be dealing with time series 1003 00:45:56,970 --> 00:45:58,810 data and that kind of stuff-- 1004 00:45:58,810 --> 00:46:06,460 there are different ways of making that less sensitive. 1005 00:46:06,460 --> 00:46:09,820 But anytime you're dealing with research, anytime 1006 00:46:09,820 --> 00:46:12,430 we're dealing with the electronic health records, 1007 00:46:12,430 --> 00:46:18,020 we deal in years, not in actual dates, which can-- 1008 00:46:18,020 --> 00:46:20,240 it creates problems if you are trying 1009 00:46:20,240 --> 00:46:24,320 to do time series analysis for somebody's entire health 1010 00:46:24,320 --> 00:46:29,540 record, in which case you can get further clearance to work 1011 00:46:29,540 --> 00:46:33,050 with more identifiable data. 1012 00:46:33,050 --> 00:46:36,010 But that is as progressive as it can be. 1013 00:46:36,010 --> 00:46:39,380 There's no reason to start with that kind of data if you don't need to. 1014 00:46:39,380 --> 00:46:42,710 So it's always on a need to know. 1015 00:46:42,710 --> 00:46:47,600 Finally, if you're working with patients older than 90, 1016 00:46:47,600 --> 00:46:50,810 90 or older, they are just generalized as a category 1017 00:46:50,810 --> 00:46:53,480 of greater than 90. 1018 00:46:53,480 --> 00:46:59,040 The rest of these, I think, are fairly guessable, 1019 00:46:59,040 --> 00:47:00,540 so we don't have to go through them. 1020 00:47:00,540 --> 00:47:06,790 But those are the tricky ones that some people don't catch. 1021 00:47:06,790 --> 00:47:11,080 Again, just limit the collection of PHI as strictly as possible. 1022 00:47:11,080 --> 00:47:13,720 If you don't need it, don't get it. 1023 00:47:13,720 --> 00:47:16,450 If you're sharing the data, instead 1024 00:47:16,450 --> 00:47:22,780 of sharing an entire data set if you do have strong PHI, 1025 00:47:22,780 --> 00:47:27,520 limit what you're giving or sharing to another researcher. 1026 00:47:27,520 --> 00:47:32,380 That's just a hygiene issue, and it's really 1027 00:47:32,380 --> 00:47:35,650 limiting the amount of errors that can happen. 1028 00:47:39,360 --> 00:47:40,800 So why is this so important? 1029 00:47:40,800 --> 00:47:46,490 The IRB, again, is particularly interested in protecting 1030 00:47:46,490 --> 00:47:50,530 patients and making sure that there's 1031 00:47:50,530 --> 00:47:55,500 as little harm, if any, done as possible to patients. 1032 00:47:55,500 --> 00:47:59,940 Just general human decency and respect. 1033 00:47:59,940 --> 00:48:03,000 There's institutional risk if something 1034 00:48:03,000 --> 00:48:06,960 is done without an IRB, and you can't 1035 00:48:06,960 --> 00:48:09,720 publish if you have done human subjects 1036 00:48:09,720 --> 00:48:11,400 research without an IRB.
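To make the Safe Harbor-style transformations just described concrete (three-digit zip codes, year-only dates, ages of 90 and above collapsed into one category), here is a minimal Python sketch. The record layout and field names are assumptions for illustration; it covers only the identifiers discussed in the talk, not the full list of HIPAA identifiers, and it is not a substitute for your IRB's guidance.

from datetime import date

def safe_harbor_lite(record: dict) -> dict:
    """Rough sketch of the transformations described above; hypothetical
    field names, not a complete Safe Harbor implementation."""
    out = dict(record)

    # Drop direct identifiers outright rather than trying to transform them.
    for field in ("name", "mrn", "email", "phone"):
        out.pop(field, None)

    # Zip codes: keep only the first three digits. (Strict Safe Harbor also
    # zeroes out certain low-population three-digit prefixes.)
    if "zip" in out:
        out["zip"] = str(out["zip"])[:3]

    # Dates: reduce to the year only.
    if isinstance(out.get("visit_date"), date):
        out["visit_date"] = out["visit_date"].year

    # Ages 90 and above are collapsed into a single category.
    if isinstance(out.get("age"), int) and out["age"] >= 90:
        out["age"] = "90+"

    return out

# Example (hypothetical record):
# safe_harbor_lite({"name": "A. Patient", "zip": "02139", "age": 94,
#                   "visit_date": date(2019, 4, 16), "heart_rate": 72})

Even with transformations like these, as Mark notes, a data set can often be re-identified by joining it against another data set, which is why limiting collection in the first place is the stronger control.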
1037 00:48:11,400 --> 00:48:16,290 Those last two, institutional risk and not being able to publish, are kind of the stick, but the carrot really 1038 00:48:16,290 --> 00:48:18,648 should be the top two, as far as just human decency 1039 00:48:18,648 --> 00:48:20,190 and making sure that you've protected 1040 00:48:20,190 --> 00:48:22,140 any patients or any participants that you 1041 00:48:22,140 --> 00:48:24,180 have involved in your research. 1042 00:48:27,195 --> 00:48:28,570 These are a couple of violations. 1043 00:48:28,570 --> 00:48:30,112 We don't have to get too far into it, 1044 00:48:30,112 --> 00:48:33,560 but they were both allegedly conducted 1045 00:48:33,560 --> 00:48:36,420 without any IRB approval. 1046 00:48:36,420 --> 00:48:39,760 There's possible fraud involved, and it 1047 00:48:39,760 --> 00:48:41,270 ruined both of their careers. 1048 00:48:41,270 --> 00:48:43,630 But it put people at huge exposures 1049 00:48:43,630 --> 00:48:48,800 to unhealthy conditions. 1050 00:48:48,800 --> 00:48:53,590 This is probably a much more common issue 1051 00:48:53,590 --> 00:48:54,920 that you're going to have. 1052 00:48:54,920 --> 00:48:58,270 PHI data breaches, they happen a lot. 1053 00:48:58,270 --> 00:49:00,940 They're generally not breaches from the outside. 1054 00:49:00,940 --> 00:49:02,085 They're accidents. 1055 00:49:02,085 --> 00:49:03,460 Somebody will set up a web server 1056 00:49:03,460 --> 00:49:09,310 on the machine serving PHI because they found it easier 1057 00:49:09,310 --> 00:49:11,920 to work at home one day. 1058 00:49:11,920 --> 00:49:15,160 It could just be they don't know how software is set up. 1059 00:49:15,160 --> 00:49:17,660 So anytime you're working with PHI, 1060 00:49:17,660 --> 00:49:20,833 you've really got to overdo it on knowing exactly how you're 1061 00:49:20,833 --> 00:49:21,500 working with it. 1062 00:49:24,180 --> 00:49:27,270 Other breaches are losing unencrypted computers, 1063 00:49:27,270 --> 00:49:30,270 putting data on a thumb drive and losing it. 1064 00:49:35,710 --> 00:49:39,420 The vast majority of data breaches 1065 00:49:39,420 --> 00:49:43,110 happen just from negligence and not 1066 00:49:43,110 --> 00:49:45,900 being as careful as you want to be. 1067 00:49:45,900 --> 00:49:47,790 So that's always good to keep in mind. 1068 00:49:50,500 --> 00:49:55,240 I guess a new thing with the IRB and digital research 1069 00:49:55,240 --> 00:50:02,610 is things have been changing now from face to face recruitment 1070 00:50:02,610 --> 00:50:07,740 and research into being able to consent online 1071 00:50:07,740 --> 00:50:12,440 to be able to reach millions of people across the world 1072 00:50:12,440 --> 00:50:15,540 and allowing them to consent on their own. 1073 00:50:15,540 --> 00:50:21,220 So this has become, obviously, a new thing since the Belmont 1074 00:50:21,220 --> 00:50:25,240 Report, and it's something that we are working closely 1075 00:50:25,240 --> 00:50:31,190 with our IRB to make sure that we're being as respectful as we 1076 00:50:31,190 --> 00:50:33,080 can to the patients, but also making sure 1077 00:50:33,080 --> 00:50:37,530 that we can develop software solutions that 1078 00:50:37,530 --> 00:50:43,440 are not hurting anybody and develop into swim lanes. 1079 00:50:43,440 --> 00:50:47,130 So what we've come up with is a framework where there's 1080 00:50:47,130 --> 00:50:49,320 a project, which is, say, 1081 00:50:49,320 --> 00:50:54,310 we're studying all cancers.
1082 00:50:54,310 --> 00:50:59,390 So you can post reports about different research that's going 1083 00:50:59,390 --> 00:51:01,460 on, things that seem important. 1084 00:51:01,460 --> 00:51:03,170 A study is where an actual person has 1085 00:51:03,170 --> 00:51:10,640 consented to a protocol, which is human research 1086 00:51:10,640 --> 00:51:13,760 and subject to IRB. 1087 00:51:13,760 --> 00:51:20,390 Then we'll have a platform that the users will use, 1088 00:51:20,390 --> 00:51:24,590 and that will be like a website or an iPhone app 1089 00:51:24,590 --> 00:51:27,170 where they can get literature and information about what's 1090 00:51:27,170 --> 00:51:28,398 going on in the project. 1091 00:51:28,398 --> 00:51:30,440 And then we'll have a participant who is actually 1092 00:51:30,440 --> 00:51:35,510 part of a study, who's, again, covered under IRB 1093 00:51:35,510 --> 00:51:36,185 through consent. 1094 00:51:40,520 --> 00:51:44,720 So why this kind of development has been important, 1095 00:51:44,720 --> 00:51:46,250 the old way of software development 1096 00:51:46,250 --> 00:51:50,060 was the waterfall approach, where you work for three weeks, 1097 00:51:50,060 --> 00:51:52,670 implement something, work for three weeks, 1098 00:51:52,670 --> 00:51:57,050 implement something, where we have moved to an Agile approach 1099 00:51:57,050 --> 00:51:58,250 in software. 1100 00:51:58,250 --> 00:52:01,850 And so while Agile makes our lives a lot easier as far 1101 00:52:01,850 --> 00:52:07,140 as development, we can't be sure what 1102 00:52:07,140 --> 00:52:09,780 we're doing isn't going to affect patients 1103 00:52:09,780 --> 00:52:11,410 in certain contexts. 1104 00:52:11,410 --> 00:52:15,360 So within a study, working Agile makes no sense. 1105 00:52:19,180 --> 00:52:22,150 We want to work with the IRB to approve things, 1106 00:52:22,150 --> 00:52:24,940 but IRB approval takes between two and four 1107 00:52:24,940 --> 00:52:27,550 weeks for expedited things. 1108 00:52:27,550 --> 00:52:30,820 When we talk about projects and stuff, 1109 00:52:30,820 --> 00:52:34,330 that's where we want to work safely in an Agile environment 1110 00:52:34,330 --> 00:52:37,360 and try to figure out places where the IRB doesn't 1111 00:52:37,360 --> 00:52:41,800 necessarily have to be involved or doesn't want to be involved 1112 00:52:41,800 --> 00:52:44,817 and that there isn't any added patient risk 1113 00:52:44,817 --> 00:52:46,900 whatsoever in working in that kind of environment. 1114 00:52:46,900 --> 00:52:51,160 So it's working with software products versus studies, 1115 00:52:51,160 --> 00:52:53,110 and so working with the IRB to be sure 1116 00:52:53,110 --> 00:52:55,180 that we can separate those things 1117 00:52:55,180 --> 00:52:57,070 and make sure that things move on 1118 00:52:57,070 --> 00:53:01,250 as well as possible without any added harm. 1119 00:53:01,250 --> 00:53:05,320 So that's these categories again. 1120 00:53:05,320 --> 00:53:16,720 So project activity would be social media outreach, 1121 00:53:16,720 --> 00:53:20,650 sharing content that is relevant to the project and kind of just 1122 00:53:20,650 --> 00:53:23,350 informing about a general idea.
1123 00:53:23,350 --> 00:53:25,810 A study activity is what you would generally 1124 00:53:25,810 --> 00:53:32,120 be used to with consent, data sharing, 1125 00:53:32,120 --> 00:53:34,080 actually participating in a study, 1126 00:53:34,080 --> 00:53:41,600 whether it's through a wearable, answering questions, and then 1127 00:53:41,600 --> 00:53:43,130 withdrawing in the process. 1128 00:53:43,130 --> 00:53:46,340 And the study activities are 100% IRB, 1129 00:53:46,340 --> 00:53:49,100 where the project activities that aren't directly 1130 00:53:49,100 --> 00:53:54,610 dealing with the study can hopefully 1131 00:53:54,610 --> 00:54:00,350 be separated in most cases. 1132 00:54:00,350 --> 00:54:03,880 So the three takeaways really are just if you don't know, 1133 00:54:03,880 --> 00:54:08,140 ask, limit the collection of PHI as strictly as possible, 1134 00:54:08,140 --> 00:54:15,260 and working in Agile developments are great but it 1135 00:54:15,260 --> 00:54:19,080 is unsafe in a lot of human research, 1136 00:54:19,080 --> 00:54:23,220 so we have to focus on where that can be used and where it 1137 00:54:23,220 --> 00:54:24,390 can't. 1138 00:54:24,390 --> 00:54:25,230 And that's it. 1139 00:54:25,230 --> 00:54:28,200 Thank you. 1140 00:54:28,200 --> 00:54:30,200 [APPLAUSE] 1141 00:54:30,200 --> 00:54:32,200 Oh. 1142 00:54:32,200 --> 00:54:35,920 AUDIENCE: I have a question about how it's actually done. 1143 00:54:35,920 --> 00:54:39,700 So as the IRB, how do you make sure 1144 00:54:39,700 --> 00:54:42,130 that your researcher is complying? 1145 00:54:42,130 --> 00:54:45,790 Is that, like, writing a report, doing a PDF, 1146 00:54:45,790 --> 00:54:48,718 or is there a third party service? 1147 00:54:48,718 --> 00:54:49,760 MARK SHERVEY: Yeah, yeah. 1148 00:54:49,760 --> 00:54:52,380 So we certify all of our researchers 1149 00:54:52,380 --> 00:54:58,890 with human research and HIPAA compliance, just blanket. 1150 00:54:58,890 --> 00:55:06,180 And if you provide that and your certifications are up to date, 1151 00:55:06,180 --> 00:55:09,422 it's an understanding that the researcher 1152 00:55:09,422 --> 00:55:11,130 knows what they should be looking out for 1153 00:55:11,130 --> 00:55:13,935 and that the IRB understands. 1154 00:55:13,935 --> 00:55:16,042 AUDIENCE: So is that a third party? 1155 00:55:16,042 --> 00:55:17,250 MARK SHERVEY: Oh, yeah, yeah. 1156 00:55:17,250 --> 00:55:18,125 We use a third party. 1157 00:55:18,125 --> 00:55:18,990 You can have-- 1158 00:55:21,890 --> 00:55:23,935 I don't-- we use a third party. 1159 00:55:23,935 --> 00:55:25,060 PROFESSOR: Can I just add-- 1160 00:55:25,060 --> 00:55:25,540 MARK SHERVEY: Oh, yeah. 1161 00:55:25,540 --> 00:55:27,370 PROFESSOR: So at MIT, there's something 1162 00:55:27,370 --> 00:55:30,940 called COUHES, the Committee on Use 1163 00:55:30,940 --> 00:55:34,390 of Humans as Experimental Subjects, 1164 00:55:34,390 --> 00:55:38,800 and they are our official IRB. 1165 00:55:38,800 --> 00:55:40,210 It used to be all paper. 1166 00:55:40,210 --> 00:55:42,520 Now there's an electronic way where you can 1167 00:55:42,520 --> 00:55:46,900 apply for a COUHES protocol. 
1168 00:55:46,900 --> 00:55:50,590 And it's a reasonably long document 1169 00:55:50,590 --> 00:55:54,550 in which you describe the purpose of the experiment, what 1170 00:55:54,550 --> 00:55:56,440 you're going to do, what kind of people 1171 00:55:56,440 --> 00:55:59,680 you're going to recruit, what recruiting material you're 1172 00:55:59,680 --> 00:56:02,860 going to use, how you will handle the data, 1173 00:56:02,860 --> 00:56:07,120 what security provisions you have. 1174 00:56:07,120 --> 00:56:08,650 Of course, if you're doing something 1175 00:56:08,650 --> 00:56:11,380 like injecting people with toxins, 1176 00:56:11,380 --> 00:56:14,740 then that's a much more serious kind of thing, 1177 00:56:14,740 --> 00:56:18,490 and you have to describe the preliminary data on why you 1178 00:56:18,490 --> 00:56:20,770 think this is safe and so on. 1179 00:56:20,770 --> 00:56:23,470 And that gets reviewed at, essentially, 1180 00:56:23,470 --> 00:56:25,150 one of three levels. 1181 00:56:25,150 --> 00:56:27,970 There is exempt review, which is-- 1182 00:56:27,970 --> 00:56:32,170 you can't exempt yourself, but they can exempt you. 1183 00:56:32,170 --> 00:56:36,010 And what they would say is, this is a minimal risk 1184 00:56:36,010 --> 00:56:37,430 kind of problem. 1185 00:56:37,430 --> 00:56:42,370 So let's say you're doing a data-only study using MIMIC data, 1186 00:56:42,370 --> 00:56:43,900 and you've done the CITI training, 1187 00:56:43,900 --> 00:56:48,070 you've signed the data use agreement. 1188 00:56:48,070 --> 00:56:51,400 You're supposed to get IRB permission for it. 1189 00:56:51,400 --> 00:56:53,470 There is an exception for students 1190 00:56:53,470 --> 00:56:57,760 in a classroom, in which case I'm responsible rather 1191 00:56:57,760 --> 00:57:00,260 than making you responsible. 1192 00:57:00,260 --> 00:57:03,040 But if you screw it up, I'm responsible. 1193 00:57:05,810 --> 00:57:12,980 The second level is an expedited approval, 1194 00:57:12,980 --> 00:57:16,190 which is a low risk kind of approval, 1195 00:57:16,190 --> 00:57:18,980 typically data-only studies. 1196 00:57:18,980 --> 00:57:23,940 But it may involve things like using limited data sets, 1197 00:57:23,940 --> 00:57:25,520 where, for example, if you're trying 1198 00:57:25,520 --> 00:57:29,840 to study the geographical distribution of disease, 1199 00:57:29,840 --> 00:57:34,370 then you clearly need better geographical identifiers 1200 00:57:34,370 --> 00:57:36,920 than a three-digit zip code, or if you're 1201 00:57:36,920 --> 00:57:40,460 trying to study a time series, as Mark was talking about, 1202 00:57:40,460 --> 00:57:42,500 you need actual dates. 1203 00:57:42,500 --> 00:57:46,880 And so you can get approval to use that kind of data. 1204 00:57:46,880 --> 00:57:49,160 And then there's the full on review, 1205 00:57:49,160 --> 00:57:52,250 which takes much longer, where they do actually 1206 00:57:52,250 --> 00:57:55,760 bring in people to evaluate the safety of what 1207 00:57:55,760 --> 00:57:58,010 you're proposing to do.
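As a rough mnemonic for the three review levels just described, here is a small Python sketch. The predicate names and the branching logic are assumptions made for illustration; in practice the IRB, not the researcher, makes this determination ("you can't exempt yourself"), so treat this as a summary of the lecture, not a compliance tool.

def likely_review_level(data_only: bool,
                        deidentified: bool,
                        needs_limited_dataset: bool,
                        more_than_minimal_risk: bool) -> str:
    """Hypothetical sketch of the exempt / expedited / full review tiers."""
    if more_than_minimal_risk:
        # e.g. an interventional study: full committee review, with
        # preliminary safety data described in the protocol.
        return "full review"
    if needs_limited_dataset:
        # Data-only, but needing real dates or finer geography than a
        # three-digit zip code: typically expedited review.
        return "expedited review"
    if data_only and deidentified:
        # e.g. a data-only study on a de-identified set like MIMIC, with
        # training and a data use agreement in place: may be ruled exempt,
        # but only the IRB can make that call.
        return "possibly exempt (the IRB decides)"
    return "ask the IRB"

# likely_review_level(data_only=True, deidentified=True,
#                     needs_limited_dataset=False, more_than_minimal_risk=False)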
1208 00:57:58,010 --> 00:58:01,040 So far, my experience is that mostly 1209 00:58:01,040 --> 00:58:02,780 with the kinds of studies that we 1210 00:58:02,780 --> 00:58:05,840 do that are representative of the material we're studying 1211 00:58:05,840 --> 00:58:09,980 in class, we don't have to get into that third category 1212 00:58:09,980 --> 00:58:12,320 because we're not actually doing anything that 1213 00:58:12,320 --> 00:58:15,530 is likely to harm individual patients, 1214 00:58:15,530 --> 00:58:20,090 except in a kind of reputational or data-oriented sense, 1215 00:58:20,090 --> 00:58:23,330 and that doesn't require the full blown review. 1216 00:58:23,330 --> 00:58:25,130 So that's the local situation. 1217 00:58:25,130 --> 00:58:26,380 MARK SHERVEY: Yeah, thank you. 1218 00:58:26,380 --> 00:58:31,340 Yeah, I think I misunderstood the full range of the question. 1219 00:58:31,340 --> 00:58:33,500 Yeah, and that's roughly our same thing. 1220 00:58:33,500 --> 00:58:35,480 So we have-- 1221 00:58:35,480 --> 00:58:38,990 Eddie Golden is our research project manager, 1222 00:58:38,990 --> 00:58:42,050 who is my favorite person in the office for this kind of stuff. 1223 00:58:42,050 --> 00:58:45,650 She keeps on top of everything and makes 1224 00:58:45,650 --> 00:58:48,530 sure that the right people are listed on research 1225 00:58:48,530 --> 00:58:51,680 and that people are taken off, that kind of stuff. 1226 00:58:51,680 --> 00:58:54,710 But it's a good relationship with the IRB 1227 00:58:54,710 --> 00:58:56,730 on that kind of stuff. 1228 00:58:56,730 --> 00:58:57,670 Yeah? 1229 00:58:57,670 --> 00:59:00,990 AUDIENCE: So I'm somewhat unfamiliar with Agile software 1230 00:59:00,990 --> 00:59:02,130 development practices. 1231 00:59:02,130 --> 00:59:03,963 On a high level, it's just more parallelized 1232 00:59:03,963 --> 00:59:05,778 and we update more frequently? 1233 00:59:05,778 --> 00:59:06,820 MARK SHERVEY: Yeah, yeah. 1234 00:59:06,820 --> 00:59:08,490 I don't know if I took that slide out, 1235 00:59:08,490 --> 00:59:10,620 but there's something where Amazon 1236 00:59:10,620 --> 00:59:13,900 will deploy 50 million updates per year or something 1237 00:59:13,900 --> 00:59:14,400 like that. 1238 00:59:14,400 --> 00:59:19,980 So it's constantly on an update frequency 1239 00:59:19,980 --> 00:59:23,100 instead of just building everything up and then dropping 1240 00:59:23,100 --> 00:59:25,580 it. 1241 00:59:25,580 --> 00:59:28,290 And that's just been a new development in software. 1242 00:59:30,650 --> 00:59:32,650 AUDIENCE: Can we ask questions to both you guys? 1243 00:59:32,650 --> 00:59:34,570 ANDY CORAVOS: Yeah. 1244 00:59:34,570 --> 00:59:37,263 AUDIENCE: Can you tell us more about Elektra Labs? 1245 00:59:37,263 --> 00:59:38,430 I couldn't fully understand. 1246 00:59:38,430 --> 00:59:42,915 Are you guys more a consultancy for all these, we'll call them, 1247 00:59:42,915 --> 00:59:43,540 tool companies? 1248 00:59:43,540 --> 00:59:47,440 Or is it more like a lobbying kind of thing? 1249 00:59:47,440 --> 00:59:49,750 The reason I ask this is also because I wonder what 1250 00:59:49,750 --> 00:59:55,570 your opinion is on a third party source for determining 1251 00:59:55,570 --> 00:59:57,970 whether these things are a good or bad kind of thing 1252 00:59:57,970 --> 00:59:59,560 because it seems like the FDA would 1253 00:59:59,560 --> 01:00:01,775 have trouble understanding.
1254 01:00:01,775 --> 01:00:05,290 So if you had some organic certified kind of thing, 1255 01:00:05,290 --> 01:00:06,733 would that be a useful solution? 1256 01:00:06,733 --> 01:00:07,900 Or where does that go wrong? 1257 01:00:07,900 --> 01:00:08,983 ANDY CORAVOS: Mm-hm, yeah. 1258 01:00:08,983 --> 01:00:11,020 So what we're building with Elektra 1259 01:00:11,020 --> 01:00:13,730 is effectively a pharmacy for connected technologies. 1260 01:00:13,730 --> 01:00:16,492 So the way that today you have pharmacies 1261 01:00:16,492 --> 01:00:18,700 that have a formulary of all the different drugs that 1262 01:00:18,700 --> 01:00:20,720 are available, this is effectively 1263 01:00:20,720 --> 01:00:22,720 like a digital pharmacy, like a Kelley Blue Book 1264 01:00:22,720 --> 01:00:24,040 of all the different tools. 1265 01:00:24,040 --> 01:00:27,040 And then we're building out a label for each of them 1266 01:00:27,040 --> 01:00:29,843 based on as much objective data as we can, 1267 01:00:29,843 --> 01:00:31,510 so that we're not scoring whether or not 1268 01:00:31,510 --> 01:00:32,510 something's good or bad. 1269 01:00:32,510 --> 01:00:34,270 Because in most instances, things 1270 01:00:34,270 --> 01:00:36,280 aren't good or bad in the absolute, they're 1271 01:00:36,280 --> 01:00:38,380 good or bad for a purpose. 1272 01:00:38,380 --> 01:00:41,530 And so you can imagine something-- maybe 1273 01:00:41,530 --> 01:00:43,310 you need really high levels of accuracy, 1274 01:00:43,310 --> 01:00:44,230 so you need to know whether or not 1275 01:00:44,230 --> 01:00:45,938 that tool has been verified and validated 1276 01:00:45,938 --> 01:00:49,157 in certain contexts in certain patient populations. 1277 01:00:49,157 --> 01:00:50,740 Even if the tool's accurate, if you have 1278 01:00:50,740 --> 01:00:52,525 to recharge it all the time or you can't wear it 1279 01:00:52,525 --> 01:00:55,120 in the shower, you won't have the usability, or if the APIs 1280 01:00:55,120 --> 01:00:56,650 are really hard to work with. 1281 01:00:56,650 --> 01:00:59,860 And then security profile, whether or not they 1282 01:00:59,860 --> 01:01:02,410 have coordinated disclosure, how the tool companies 1283 01:01:02,410 --> 01:01:06,130 handle things like a software bill of materials 1284 01:01:06,130 --> 01:01:07,570 and what kind of software is used. 1285 01:01:07,570 --> 01:01:09,580 And then even if the tool is accurate, 1286 01:01:09,580 --> 01:01:12,070 even if it's relatively usable, even if it's secure, 1287 01:01:12,070 --> 01:01:14,740 that doesn't solve the Cambridge Analytica problem, 1288 01:01:14,740 --> 01:01:18,130 so how tools are doing a third party transfer. 1289 01:01:18,130 --> 01:01:21,970 And so one of the philosophies is we don't score, 1290 01:01:21,970 --> 01:01:24,140 but we are building out the data set 1291 01:01:24,140 --> 01:01:26,455 so when you are evaluating a certain tool, 1292 01:01:26,455 --> 01:01:27,713 it's like a nutrition label. 1293 01:01:27,713 --> 01:01:29,380 Sometimes you need more sugar, sometimes 1294 01:01:29,380 --> 01:01:30,580 you need more protein. 1295 01:01:30,580 --> 01:01:32,350 Maybe you need more security, maybe 1296 01:01:32,350 --> 01:01:34,267 you really need to think about the data rates. 1297 01:01:34,267 --> 01:01:37,330 Maybe you can take a leave on some of the accuracy levels.
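One way to picture the "nutrition label" Andy describes is as a structured record with separate fields for validation, usability, security, and data-governance evidence, and deliberately no single overall score. The following Python sketch is hypothetical; the field names are assumptions for illustration, not an actual product schema.

from dataclasses import dataclass, field

@dataclass
class ConnectedToolLabel:
    """Hypothetical 'nutrition label' for a connected sensor."""
    name: str
    # Verification/validation evidence, listed per context of use and
    # patient population rather than as a single accuracy number.
    validated_contexts: list = field(default_factory=list)
    # Usability: battery life, waterproofing, API quality, and so on.
    battery_life_days: float = 0.0
    shower_safe: bool = False
    documented_api: bool = False
    # Security posture of the manufacturer.
    coordinated_disclosure_policy: bool = False
    software_bill_of_materials: bool = False
    # Data governance: who else the data flows to (the "Cambridge
    # Analytica problem" is about third-party transfer, not accuracy).
    third_party_data_recipients: list = field(default_factory=list)

def fits_study(label: ConnectedToolLabel, required_context: str,
               require_disclosure_policy: bool = True) -> bool:
    # A study team weighs the label against its own needs; there is no
    # universal good/bad verdict, only fitness for a stated purpose.
    ok_validation = required_context in label.validated_contexts
    ok_security = label.coordinated_disclosure_policy or not require_disclosure_policy
    return ok_validation and ok_security

The design point, as in the talk, is that different studies weight these fields differently, the way different diets weight sugar and protein differently.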
1298 01:01:37,330 --> 01:01:39,760 And so we're all building out this ability 1299 01:01:39,760 --> 01:01:42,460 to evaluate the tools, and then also to deploy them 1300 01:01:42,460 --> 01:01:45,970 like the way that a pharmacy would deploy them out. 1301 01:01:45,970 --> 01:01:48,260 One thing I would like to do with the group, 1302 01:01:48,260 --> 01:01:52,200 if you all are down for it, out of civic duty-- 1303 01:01:52,200 --> 01:01:53,400 and I'm serious, though. 1304 01:01:56,200 --> 01:02:00,160 Voting is very important and submitting your comments 1305 01:02:00,160 --> 01:02:02,600 to the public register is very important. 1306 01:02:02,600 --> 01:02:06,180 And I read all your comments because Irene sent them to me, 1307 01:02:06,180 --> 01:02:07,180 and they were very good. 1308 01:02:07,180 --> 01:02:10,180 And I know probably people who came 1309 01:02:10,180 --> 01:02:12,910 here want to polish everything and make them perfect. 1310 01:02:12,910 --> 01:02:15,640 You can submit them exactly how they are. 1311 01:02:15,640 --> 01:02:19,060 And I am very much hoping that we get about 95% 1312 01:02:19,060 --> 01:02:21,130 of you to submit, and the 5% of you that 1313 01:02:21,130 --> 01:02:25,390 didn't, like, your internet broke or something. 1314 01:02:25,390 --> 01:02:27,340 You can submit tonight. 1315 01:02:27,340 --> 01:02:31,120 I will email Irene because you already have done the work, 1316 01:02:31,120 --> 01:02:32,410 and you can submit it. 1317 01:02:32,410 --> 01:02:34,660 But I would like to just hear some of your thoughts. 1318 01:02:34,660 --> 01:02:36,360 So what I'm going to do is I'm going 1319 01:02:36,360 --> 01:02:39,850 to use that same framework around, what would you keep? 1320 01:02:39,850 --> 01:02:41,950 What would you change? 1321 01:02:41,950 --> 01:02:43,810 And then change can also include, like, 1322 01:02:43,810 --> 01:02:46,510 what was so confusing in there, that it didn't even 1323 01:02:46,510 --> 01:02:47,500 really make sense? 1324 01:02:47,500 --> 01:02:53,770 Part of the confusion might be that it was-- 1325 01:02:53,770 --> 01:02:55,120 some regulations are confusing. 1326 01:02:55,120 --> 01:02:57,790 But some of the confusion is that part of that document 1327 01:02:57,790 --> 01:02:59,770 was not written by people who-- 1328 01:02:59,770 --> 01:03:03,010 some people have technical backgrounds and some do not. 1329 01:03:03,010 --> 01:03:04,660 And so sometimes some of the language 1330 01:03:04,660 --> 01:03:06,202 might not actually be used in the way 1331 01:03:06,202 --> 01:03:08,920 that industry is using it today, so refining the language. 1332 01:03:08,920 --> 01:03:10,760 And then what did you see that was missing? 1333 01:03:10,760 --> 01:03:12,760 So here's what we're going to do. 1334 01:03:12,760 --> 01:03:29,560 Keep, change slash confusing, and then start or add. 1335 01:03:33,510 --> 01:03:36,818 And before I ask you, I want you to look 1336 01:03:36,818 --> 01:03:38,360 at the person next to you, seriously, 1337 01:03:38,360 --> 01:03:40,068 and if there's three of you, that's fine, 1338 01:03:40,068 --> 01:03:42,000 and I want you to tell them when you 1339 01:03:42,000 --> 01:03:43,390 will be submitting the comment. 1340 01:03:43,390 --> 01:03:46,660 Is it tonight, tomorrow, or you are choosing not to? 1341 01:03:49,700 --> 01:03:51,640 Just look at them and talk. 1342 01:03:51,640 --> 01:03:54,418 [INDISTINCT CHATTER] 1343 01:03:57,880 --> 01:03:58,960 There will be a link. 
1344 01:03:58,960 --> 01:04:01,030 I will send you all links. 1345 01:04:01,030 --> 01:04:04,580 I will make this very easy. 1346 01:04:04,580 --> 01:04:05,080 OK. 1347 01:04:10,060 --> 01:04:10,560 All right. 1348 01:04:10,560 --> 01:04:12,035 Who wants to start? 1349 01:04:12,035 --> 01:04:13,410 We got three things on the board. 1350 01:04:16,980 --> 01:04:17,480 Yes? 1351 01:04:17,480 --> 01:04:18,740 AUDIENCE: So one thing that-- 1352 01:04:18,740 --> 01:04:21,740 I don't know if this is confusing or just intentionally 1353 01:04:21,740 --> 01:04:25,430 vague, but for things like quality systems and machine 1354 01:04:25,430 --> 01:04:28,100 learning practices, who sets those standards 1355 01:04:28,100 --> 01:04:29,893 and how can they be adapted or changed? 1356 01:04:29,893 --> 01:04:30,726 ANDY CORAVOS: Mm-hm. 1357 01:04:33,570 --> 01:04:35,640 I don't also know that answer, and so I 1358 01:04:35,640 --> 01:04:36,960 would like you to submit-- 1359 01:04:36,960 --> 01:04:39,850 one of the things that is nice is then 1360 01:04:39,850 --> 01:04:42,090 that you have to respond, yeah. 1361 01:04:42,090 --> 01:04:44,010 And I think it's also a little bit confusing, 1362 01:04:44,010 --> 01:04:44,790 even the language. 1363 01:04:44,790 --> 01:04:48,690 So people are using different things. 1364 01:04:48,690 --> 01:04:51,360 People call it GXP, good manufacturing practice, 1365 01:04:51,360 --> 01:04:52,390 good clinical practice. 1366 01:04:52,390 --> 01:04:54,390 These are maintained, I think, in some instances 1367 01:04:54,390 --> 01:04:56,710 by different orgs. 1368 01:04:56,710 --> 01:05:01,500 I wonder if good algorithm practice gap or good machine 1369 01:05:01,500 --> 01:05:02,977 learning practice-- 1370 01:05:02,977 --> 01:05:04,060 yeah, that's a good thing. 1371 01:05:04,060 --> 01:05:05,310 So who owns GXP? 1372 01:05:10,683 --> 01:05:11,183 OK. 1373 01:05:14,141 --> 01:05:16,130 Yes? 1374 01:05:16,130 --> 01:05:17,290 You didn't have a question? 1375 01:05:17,290 --> 01:05:17,790 No. 1376 01:05:17,790 --> 01:05:19,498 AUDIENCE: Just wanted to share something. 1377 01:05:19,498 --> 01:05:20,340 ANDY CORAVOS: Yes. 1378 01:05:20,340 --> 01:05:22,730 AUDIENCE: One of the things I found really to keep 1379 01:05:22,730 --> 01:05:24,900 were the examples in the appendix. 1380 01:05:24,900 --> 01:05:26,580 I don't know [INAUDIBLE]. 1381 01:05:37,110 --> 01:05:39,690 --general guidelines, and so it's 1382 01:05:39,690 --> 01:05:42,210 more that the language itself is more generalized 1383 01:05:42,210 --> 01:05:44,620 and so the examples are really hopeful for what 1384 01:05:44,620 --> 01:05:48,112 is a specific situation that's analogous to make. 1385 01:05:48,112 --> 01:05:50,580 ANDY CORAVOS: Yep. 1386 01:05:50,580 --> 01:05:51,375 Like that? 1387 01:05:51,375 --> 01:05:52,740 Yeah, examples are helpful. 1388 01:05:52,740 --> 01:05:53,270 Yep? 
1389 01:05:53,270 --> 01:05:55,853 AUDIENCE: Speaking of specifics, I thought around transparency 1390 01:05:55,853 --> 01:05:57,520 they could have been much more specific 1391 01:05:57,520 --> 01:06:01,500 and that we should generally adhere to guidelines as opposed 1392 01:06:01,500 --> 01:06:05,110 to the exact set of data that is-- 1393 01:06:05,110 --> 01:06:07,670 this algorithm is exactly what's coming out of it, 1394 01:06:07,670 --> 01:06:10,220 the exact quality metrics, things 1395 01:06:10,220 --> 01:06:14,331 like that that hold people accountable as opposed to there 1396 01:06:14,331 --> 01:06:16,230 are many instances to not be transparent. 1397 01:06:16,230 --> 01:06:18,100 And so if those aren't as specific, 1398 01:06:18,100 --> 01:06:22,280 I worry that not that much really would happen there. 1399 01:06:22,280 --> 01:06:25,010 The analog like I thought of was when 1400 01:06:25,010 --> 01:06:26,640 Facebook asks for your data, they 1401 01:06:26,640 --> 01:06:29,937 say here are the things that we need or that we're using, 1402 01:06:29,937 --> 01:06:30,895 and it's very explicit. 1403 01:06:30,895 --> 01:06:33,300 And then you can have a choice of whether or not 1404 01:06:33,300 --> 01:06:35,402 you actually want that. 1405 01:06:35,402 --> 01:06:36,110 ANDY CORAVOS: OK. 1406 01:06:36,110 --> 01:06:38,318 AUDIENCE: So seeing something like that [INAUDIBLE].. 1407 01:06:40,757 --> 01:06:42,590 ANDY CORAVOS: So part of it is transparency, 1408 01:06:42,590 --> 01:06:45,653 but also user choice in data selection or-- 1409 01:06:45,653 --> 01:06:47,570 AUDIENCE: Yeah, I think that was, for me, more 1410 01:06:47,570 --> 01:06:49,980 of an analog because choice in the medical setting 1411 01:06:49,980 --> 01:06:52,460 is a bit more complex. 1412 01:06:52,460 --> 01:06:54,410 Someone who doesn't have the ability 1413 01:06:54,410 --> 01:06:57,310 in that case or the knowledge to actually make that choice. 1414 01:06:57,310 --> 01:06:58,720 ANDY CORAVOS: Yeah. 1415 01:06:58,720 --> 01:07:01,130 AUDIENCE: I think at the very least saying 1416 01:07:01,130 --> 01:07:05,142 this algorithm is using this, and maybe some sort of choice. 1417 01:07:05,142 --> 01:07:07,100 So you can work with someone, and maybe there's 1418 01:07:07,100 --> 01:07:09,170 some parameters around what you would or would not 1419 01:07:09,170 --> 01:07:09,810 have that choice. 1420 01:07:09,810 --> 01:07:10,560 ANDY CORAVOS: Yep. 1421 01:07:18,180 --> 01:07:19,378 Yes? 1422 01:07:19,378 --> 01:07:22,731 AUDIENCE: What if you added something about algorithm bias? 1423 01:07:22,731 --> 01:07:25,923 Because I know that that's been relevant for a lot 1424 01:07:25,923 --> 01:07:28,340 of other industries in terms of confidence 1425 01:07:28,340 --> 01:07:32,466 within the legal system, and then also 1426 01:07:32,466 --> 01:07:36,735 in terms of facial recognition not working fully across races. 1427 01:07:36,735 --> 01:07:40,088 So I think that breaking things down by population 1428 01:07:40,088 --> 01:07:43,441 and ensuring equitable across different populations 1429 01:07:43,441 --> 01:07:46,110 is important. 1430 01:07:46,110 --> 01:07:49,110 ANDY CORAVOS: Yep. 1431 01:07:49,110 --> 01:07:50,950 I don't know if I slept enough, so if I just 1432 01:07:50,950 --> 01:07:53,450 gave this example-- but a friend of mine called me last week 1433 01:07:53,450 --> 01:07:58,190 and asked for PPGs, so the sensor on the back. 
1434 01:07:58,190 --> 01:08:01,940 She was asking me if it works on all skin colors 1435 01:08:01,940 --> 01:08:03,732 and whether or not it responds differently. 1436 01:08:03,732 --> 01:08:05,648 And if it responds differently, whether or not 1437 01:08:05,648 --> 01:08:06,890 somebody has a tattoo. 1438 01:08:06,890 --> 01:08:08,960 And so for some of the big registries that 1439 01:08:08,960 --> 01:08:10,940 are doing bring your own device data, 1440 01:08:10,940 --> 01:08:13,940 you can have unintended biases in the data sets 1441 01:08:13,940 --> 01:08:16,330 just because of how that it's processing. 1442 01:08:16,330 --> 01:08:16,830 So yeah. 1443 01:08:19,590 --> 01:08:20,859 What do you think? 1444 01:08:20,859 --> 01:08:22,720 What are ways-- 1445 01:08:22,720 --> 01:08:25,828 I think Irene's worked with some of this. 1446 01:08:25,828 --> 01:08:27,370 How do you think about whether or not 1447 01:08:27,370 --> 01:08:30,609 something is-- what would be a good system for the agency 1448 01:08:30,609 --> 01:08:33,897 to consider around bias? 1449 01:08:33,897 --> 01:08:35,939 AUDIENCE: I think maybe coming into consideration 1450 01:08:35,939 --> 01:08:39,490 with [INAUDIBLE] system might be part of the GNLP. 1451 01:08:39,490 --> 01:08:42,650 But I think it would be the responsibility of the designer 1452 01:08:42,650 --> 01:08:44,373 to assess [INAUDIBLE]. 1453 01:08:57,626 --> 01:08:59,610 ANDY CORAVOS: OK. 1454 01:08:59,610 --> 01:09:01,260 AUDIENCE: As a note, bearing this 1455 01:09:01,260 --> 01:09:04,260 is our next lecture, so anyone who might be confused or want 1456 01:09:04,260 --> 01:09:07,927 to talk about it more, we will have plenty material next time. 1457 01:09:12,149 --> 01:09:14,415 ANDY CORAVOS: You want to pick someone? 1458 01:09:14,415 --> 01:09:17,873 MARK SHERVEY: I'm sorry. 1459 01:09:17,873 --> 01:09:19,355 Go ahead. 1460 01:09:19,355 --> 01:09:20,048 AUDIENCE: Me? 1461 01:09:20,048 --> 01:09:20,840 MARK SHERVEY: Yeah. 1462 01:09:20,840 --> 01:09:22,295 ANDY CORAVOS: Cold call. 1463 01:09:22,295 --> 01:09:25,223 [LAUGHTER] 1464 01:09:28,640 --> 01:09:29,710 Yeah? 1465 01:09:29,710 --> 01:09:32,218 AUDIENCE: Just adding off at another place, so it looked 1466 01:09:32,218 --> 01:09:33,760 like there was a period for providing 1467 01:09:33,760 --> 01:09:36,819 periodic reportings to the FDA on updates and all that. 1468 01:09:36,819 --> 01:09:39,250 There could also be like a scorecard 1469 01:09:39,250 --> 01:09:41,939 of bias on subpopulations or something to that effect. 1470 01:09:41,939 --> 01:09:43,349 ANDY CORAVOS: Mm-hm. 1471 01:09:43,349 --> 01:09:45,229 That's cool. 1472 01:09:45,229 --> 01:09:48,529 Have you seen any places that do something like that? 1473 01:09:48,529 --> 01:09:51,380 AUDIENCE: I remember when I read Weapons of Math Destruction 1474 01:09:51,380 --> 01:09:54,189 from Cathy O'Neil, she mentioned some sort of famous audit. 1475 01:09:54,189 --> 01:09:55,981 But I don't really remember the details. 1476 01:09:55,981 --> 01:09:56,773 ANDY CORAVOS: Yeah. 
1477 01:10:09,050 --> 01:10:11,900 When you do submit your comment, if you have ideas or links, 1478 01:10:11,900 --> 01:10:14,420 or it can be posts or blogs or whatever, 1479 01:10:14,420 --> 01:10:18,050 just link them in because one thing that you'll find 1480 01:10:18,050 --> 01:10:20,600 is that we read a lot of things, probably 1481 01:10:20,600 --> 01:10:23,570 the same things on Twitter, but other groups don't necessarily 1482 01:10:23,570 --> 01:10:24,360 see all of that. 1483 01:10:24,360 --> 01:10:28,060 So I think that Cathy O'Neil is really interesting work, 1484 01:10:28,060 --> 01:10:29,600 but yeah, just tag stuff. 1485 01:10:29,600 --> 01:10:33,300 It doesn't have to be formatted amazingly. 1486 01:10:33,300 --> 01:10:36,680 PROFESSOR: So in some of the communities 1487 01:10:36,680 --> 01:10:43,430 that I follow, not on Twitter but email and on the web, 1488 01:10:43,430 --> 01:10:47,450 there's been a lot of discussion about really terrible design 1489 01:10:47,450 --> 01:10:50,300 of information systems in hospitals 1490 01:10:50,300 --> 01:10:52,460 and how these lead to errors. 1491 01:10:52,460 --> 01:10:54,500 Now, I know from your slide, Andy, 1492 01:10:54,500 --> 01:10:59,660 that the FDA has defined those to be out of its purview. 1493 01:10:59,660 --> 01:11:02,210 But it seems to me that there's probably, 1494 01:11:02,210 --> 01:11:06,830 at the moment, more harm being done by information 1495 01:11:06,830 --> 01:11:10,040 systems that encourage really bad practice 1496 01:11:10,040 --> 01:11:12,590 or that allow bad practice than there 1497 01:11:12,590 --> 01:11:18,560 is by retinopathy, AI, machine learning 1498 01:11:18,560 --> 01:11:21,180 techniques that make mistakes. 1499 01:11:21,180 --> 01:11:23,420 So just this morning, for example, somebody 1500 01:11:23,420 --> 01:11:26,870 posted a message about a patient who 1501 01:11:26,870 --> 01:11:34,901 had a heart rate of 12,000, which seems extremely unlikely. 1502 01:11:34,901 --> 01:11:37,306 [LAUGHTER] 1503 01:11:37,306 --> 01:11:38,750 ANDY CORAVOS: Yep. 1504 01:11:38,750 --> 01:11:40,760 PROFESSOR: And the problem is that when 1505 01:11:40,760 --> 01:11:43,850 you start automating processes that 1506 01:11:43,850 --> 01:11:47,030 are based on the information that is collected 1507 01:11:47,030 --> 01:11:50,900 in these systems, things can go really screwy when 1508 01:11:50,900 --> 01:11:52,890 you get garbage data. 1509 01:11:52,890 --> 01:11:53,810 ANDY CORAVOS: Yeah. 1510 01:11:53,810 --> 01:11:56,870 Have you thought about that with your system? 1511 01:11:56,870 --> 01:11:58,625 MARK SHERVEY: We cannot get good data. 1512 01:11:58,625 --> 01:12:01,250 I mean, you're not going to get good data out of those systems. 1513 01:12:01,250 --> 01:12:06,730 What you're seeing is across the board, 1514 01:12:06,730 --> 01:12:08,810 and there's not much you can do about 1515 01:12:08,810 --> 01:12:12,780 it other than validate good ranges and go from there. 1516 01:12:12,780 --> 01:12:14,970 PROFESSOR: Well, I can think of things to do. 1517 01:12:14,970 --> 01:12:21,350 For example, if FDA were interested in regulating 1518 01:12:21,350 --> 01:12:23,250 such devices-- 1519 01:12:23,250 --> 01:12:25,058 oh, sorry, such tools-- 1520 01:12:25,058 --> 01:12:25,850 ANDY CORAVOS: Yeah. 1521 01:12:25,850 --> 01:12:27,390 Well, they would regulate devices. 
1522 01:12:27,390 --> 01:12:29,233 So one of the funny things with FDA is-- 1523 01:12:29,233 --> 01:12:30,650 and I should have mentioned this-- 1524 01:12:30,650 --> 01:12:34,670 is the FDA does not regulate the practice of medicine. 1525 01:12:34,670 --> 01:12:39,320 So doctors can do whatever they want. 1526 01:12:39,320 --> 01:12:41,650 They regulate-- well, you should look up 1527 01:12:41,650 --> 01:12:45,950 exactly-- the way I interpret it is they regulate the marketing 1528 01:12:45,950 --> 01:12:47,660 that a manufacturer would do. 1529 01:12:47,660 --> 01:12:51,920 So I actually wonder if the EHRs would be considered practice 1530 01:12:51,920 --> 01:12:56,660 of medicine or if it would be a marketing from the EHR company, 1531 01:12:56,660 --> 01:12:59,803 and maybe that's how it could be under their purview. 1532 01:12:59,803 --> 01:13:00,470 PROFESSOR: Yeah. 1533 01:13:00,470 --> 01:13:01,262 ANDY CORAVOS: Yeah. 1534 01:13:05,600 --> 01:13:06,403 Yes? 1535 01:13:06,403 --> 01:13:07,820 AUDIENCE: I guess something that I 1536 01:13:07,820 --> 01:13:09,403 was surprised not to see as much about 1537 01:13:09,403 --> 01:13:11,507 were privacy issues in this. 1538 01:13:11,507 --> 01:13:13,840 I know there's ways where you can train machine learning 1539 01:13:13,840 --> 01:13:17,438 models and extract information that the data was trained on. 1540 01:13:17,438 --> 01:13:18,980 At least I'm pretty sure that exists. 1541 01:13:18,980 --> 01:13:20,880 It's not my expertise. 1542 01:13:20,880 --> 01:13:25,210 But I was wondering if anything like that [INAUDIBLE] have 1543 01:13:25,210 --> 01:13:27,205 someone try to extract the data that you can't. 1544 01:13:27,205 --> 01:13:29,705 But you talked about that a lot in your section of the talk, 1545 01:13:29,705 --> 01:13:33,500 but I don't remember it as much [INAUDIBLE].. 1546 01:13:33,500 --> 01:13:36,430 ANDY CORAVOS: OK. 1547 01:13:36,430 --> 01:13:36,930 Yep. 1548 01:13:47,985 --> 01:13:49,360 Realistically, how many of you do 1549 01:13:49,360 --> 01:13:52,105 think you'll actually submit a comment? 1550 01:13:52,105 --> 01:13:53,400 A couple. 1551 01:13:53,400 --> 01:13:56,400 So if you're thinking you wouldn't submit a comment, just 1552 01:13:56,400 --> 01:13:58,940 out of curiosity, I won't argue with you, 1553 01:13:58,940 --> 01:14:00,787 I'm just curious, what would hold you back 1554 01:14:00,787 --> 01:14:01,870 from submitting a comment? 1555 01:14:05,810 --> 01:14:08,300 If you didn't raise your hand now, I get to cold call you. 1556 01:14:08,300 --> 01:14:09,980 Yes? 1557 01:14:09,980 --> 01:14:12,250 AUDIENCE: I raised my hand before. 1558 01:14:12,250 --> 01:14:13,840 We were just talking. 1559 01:14:13,840 --> 01:14:16,060 Most of us have our computers open now. 1560 01:14:16,060 --> 01:14:19,447 If you really want us to submit it as is, if you put it up, 1561 01:14:19,447 --> 01:14:20,280 we could all submit. 1562 01:14:20,280 --> 01:14:20,988 ANDY CORAVOS: OK. 1563 01:14:20,988 --> 01:14:21,623 OK, OK. 1564 01:14:21,623 --> 01:14:23,102 Wow. 1565 01:14:23,102 --> 01:14:24,820 AUDIENCE: We are 95% [INAUDIBLE].. 1566 01:14:24,820 --> 01:14:25,820 ANDY CORAVOS: All right. 1567 01:14:32,800 --> 01:14:36,730 PROFESSOR: So while Andy is looking that up, 1568 01:14:36,730 --> 01:14:39,980 I should say when the HIPAA regulations, the privacy 1569 01:14:39,980 --> 01:14:45,620 regulations were first proposed, the initial version got 70,000 1570 01:14:45,620 --> 01:14:48,470 public comments about it. 
1571 01:14:48,470 --> 01:14:53,900 And it is really true that the regulatory agency, 1572 01:14:53,900 --> 01:14:56,600 in that case, it was Health and Human Services, 1573 01:14:56,600 --> 01:15:00,240 had to respond to every one of those by law. 1574 01:15:00,240 --> 01:15:03,890 And so they published reams of paper 1575 01:15:03,890 --> 01:15:06,240 about responding to all those requests. 1576 01:15:06,240 --> 01:15:10,130 So they will take your comments seriously because they have to. 1577 01:15:15,610 --> 01:15:18,530 AUDIENCE: I was going to say, is there any way of anonymously 1578 01:15:18,530 --> 01:15:19,030 commenting? 1579 01:15:19,030 --> 01:15:21,926 Or does it have to be tied to us, out of curiosity? 1580 01:15:25,645 --> 01:15:26,770 ANDY CORAVOS: I don't know. 1581 01:15:26,770 --> 01:15:30,338 I think it's generally-- 1582 01:15:30,338 --> 01:15:32,130 I don't know, I'd have to look at it again. 1583 01:15:32,130 --> 01:15:34,135 I think most of them are public comments. 1584 01:15:34,135 --> 01:15:35,510 I mean, I guess if you wanted to, 1585 01:15:35,510 --> 01:15:40,220 maybe you could coordinate your comments and you could-- 1586 01:15:40,220 --> 01:15:41,960 yeah, OK. 1587 01:15:41,960 --> 01:15:45,542 Irene is willing to group comment. 1588 01:15:45,542 --> 01:15:47,750 So you can also send yours along if you'd like to do it that way, 1589 01:15:47,750 --> 01:15:50,350 and it can be a set of class comments, if you would prefer. 1590 01:15:55,880 --> 01:16:02,620 The Bitly is capital MIT, all lowercase loves FDA, and it will 1591 01:16:02,620 --> 01:16:09,690 send you over to the docket. 1592 01:16:09,690 --> 01:16:12,120 I'm amazed that that Bitly has not been taken already. 1593 01:16:15,577 --> 01:16:17,660 PROFESSOR: [INAUDIBLE] has been asleep on the job. 1594 01:16:22,800 --> 01:16:25,330 ANDY CORAVOS: What other questions do you all have? 1595 01:16:25,330 --> 01:16:26,190 Yes? 1596 01:16:26,190 --> 01:16:32,990 AUDIENCE: So what is the line between an EHR and a SaMD? 1597 01:16:32,990 --> 01:16:36,760 Because it said earlier that EHRs are exempted, 1598 01:16:36,760 --> 01:16:38,930 but then it also says, oh, for example, with SaMD, 1599 01:16:38,930 --> 01:16:43,590 it could be collecting physiological signals, 1600 01:16:43,590 --> 01:16:45,300 and then they might send an audible alarm 1601 01:16:45,300 --> 01:16:46,825 to indicate [INAUDIBLE]. 1602 01:16:46,825 --> 01:16:48,700 And my understanding is some EHRs do that. 1603 01:16:48,700 --> 01:16:49,250 ANDY CORAVOS: Mm-hm. 1604 01:16:49,250 --> 01:16:51,792 AUDIENCE: And so would they need to be retroactively approved 1605 01:16:51,792 --> 01:16:53,350 and partially SaMD-ified? 1606 01:16:53,350 --> 01:16:54,578 Or how's that work? 1607 01:16:54,578 --> 01:16:56,120 ANDY CORAVOS: So I'm not a regulator, 1608 01:16:56,120 --> 01:16:58,270 so you should ask your regulator. 1609 01:16:58,270 --> 01:17:01,757 A couple of resources could help you decide this. Again, 1610 01:17:01,757 --> 01:17:04,090 it's about what you're claiming the product does, 1611 01:17:04,090 --> 01:17:07,060 perhaps not what it actually does. 1612 01:17:07,060 --> 01:17:09,340 The next thing, which I don't think-- 1613 01:17:09,340 --> 01:17:13,180 I mean, I think if it really does that, you should also 1614 01:17:13,180 --> 01:17:15,280 claim that it does what it does, especially 1615 01:17:15,280 --> 01:17:17,200 if it's confusing for people. 1616 01:17:17,200 --> 01:17:20,390 There's a couple of regulations that might be helpful.
1617 01:17:20,390 --> 01:17:22,630 One is called Clinical Decision Support. 1618 01:17:22,630 --> 01:17:25,780 And if you read any FDA things, they love their algorithms-- 1619 01:17:25,780 --> 01:17:27,380 I mean, they love their algorithms, 1620 01:17:27,380 --> 01:17:28,900 but they also love their acronyms. 1621 01:17:28,900 --> 01:17:36,730 So Clinical Decision Support is CDS, and then also 1622 01:17:36,730 --> 01:17:40,160 Patient Decision Support. 1623 01:17:40,160 --> 01:17:42,100 There's a guidance that just came out 1624 01:17:42,100 --> 01:17:45,040 around the two types of decision support tools, 1625 01:17:45,040 --> 01:17:47,860 and I would guess maybe that EHR is supporting 1626 01:17:47,860 --> 01:17:49,000 a decision. 1627 01:17:49,000 --> 01:17:51,310 So it might actually be considered something 1628 01:17:51,310 --> 01:17:53,560 that would be regulated. 1629 01:17:53,560 --> 01:17:56,080 There's also a lot of weird-- 1630 01:17:56,080 --> 01:17:58,390 we didn't go into it, but there are many instances 1631 01:17:58,390 --> 01:18:00,445 where something might actually be a device 1632 01:18:00,445 --> 01:18:03,070 and the FDA says it's a device, but it will do something called 1633 01:18:03,070 --> 01:18:05,290 enforcement discretion, which says it's a device 1634 01:18:05,290 --> 01:18:07,960 but we will not regulate it as such. 1635 01:18:07,960 --> 01:18:10,840 Which is actually a little bit risky for a manufacturer 1636 01:18:10,840 --> 01:18:14,110 because you are a device, but you can now go straight to market. 1637 01:18:14,110 --> 01:18:16,360 In some instances, you still have to register and list 1638 01:18:16,360 --> 01:18:19,000 the product, but you don't have to necessarily get reviewed. 1639 01:18:19,000 --> 01:18:22,150 And it also could eventually be reviewed. 1640 01:18:22,150 --> 01:18:25,103 So the line of, is it a device, is it a device 1641 01:18:25,103 --> 01:18:26,770 and you have to register, is it a device 1642 01:18:26,770 --> 01:18:29,830 and you have to get cleared or approved, 1643 01:18:29,830 --> 01:18:31,310 is why you should engage early and often-- 1644 01:18:33,910 --> 01:18:35,170 yes? 1645 01:18:35,170 --> 01:18:37,640 AUDIENCE: I enjoyed your game with regards 1646 01:18:37,640 --> 01:18:38,520 to Fitbit and Apple. 1647 01:18:38,520 --> 01:18:41,210 And I have a question about the app. 1648 01:18:41,210 --> 01:18:42,810 I know that you're not Apple either, 1649 01:18:42,810 --> 01:18:48,922 but why do you think that Apple went for FDA approval versus 1650 01:18:48,922 --> 01:18:51,135 Fitbit who didn't? 1651 01:18:51,135 --> 01:18:53,870 What were the motivations for the companies to do that? 1652 01:18:53,870 --> 01:18:57,670 ANDY CORAVOS: I would say, in public documents, 1653 01:18:57,670 --> 01:19:01,090 Fitbit has expressed an interest in working with the FDA. 1654 01:19:01,090 --> 01:19:04,000 I don't know at what point they have decided what they 1655 01:19:04,000 --> 01:19:05,620 submitted or had their package. 1656 01:19:05,620 --> 01:19:07,990 They're also working with the pre-cert program. 1657 01:19:07,990 --> 01:19:13,040 So I don't know what's happening behind the scenes. 1658 01:19:13,040 --> 01:19:14,690 Yeah? 1659 01:19:14,690 --> 01:19:17,840 AUDIENCE: Does it give them a business edge, perhaps, 1660 01:19:17,840 --> 01:19:19,395 to get FDA approval? 1661 01:19:19,395 --> 01:19:21,020 ANDY CORAVOS: I cannot comment on that. 1662 01:19:21,020 --> 01:19:21,700 AUDIENCE: OK, no worries.
1663 01:19:21,700 --> 01:19:22,492 ANDY CORAVOS: Yeah. 1664 01:19:27,660 --> 01:19:29,430 I would say, generally, people want 1665 01:19:29,430 --> 01:19:31,170 to use tools that are trustworthy, 1666 01:19:31,170 --> 01:19:34,770 and developing more tools that have some body of evidence 1667 01:19:34,770 --> 01:19:36,070 is a really important thing. 1668 01:19:36,070 --> 01:19:38,550 I think the FDA is one way of having evidence. 1669 01:19:38,550 --> 01:19:41,970 I think there are other ways that tools and devices can 1670 01:19:41,970 --> 01:19:43,567 continue to build evidence. 1671 01:19:43,567 --> 01:19:45,900 My hope is, over time, that a lot of these things that we 1672 01:19:45,900 --> 01:19:49,620 consider to be wellness tools also have evidence around them. 1673 01:19:49,620 --> 01:19:52,590 Maybe in some instances we don't always regulate vitamins, 1674 01:19:52,590 --> 01:19:54,900 but you want to still trust that your vitamin doesn't 1675 01:19:54,900 --> 01:19:58,060 have sawdust in it, right, and that it's a real product. 1676 01:19:58,060 --> 01:20:01,860 And so the more that we, I think, 1677 01:20:01,860 --> 01:20:04,680 push companies to have evidence and that we 1678 01:20:04,680 --> 01:20:09,140 use products that do that, I hope over time this helps us. 1679 01:20:09,140 --> 01:20:11,230 PROFESSOR: Does it give them any legal protection 1680 01:20:11,230 --> 01:20:16,737 to have it be classified as an FDA device? 1681 01:20:16,737 --> 01:20:18,320 ANDY CORAVOS: I'm not sure about that. 1682 01:20:21,860 --> 01:20:24,240 Historically, it has helped with reimbursement. 1683 01:20:24,240 --> 01:20:28,730 So a Class II product has been easier to reimburse. 1684 01:20:28,730 --> 01:20:31,100 That also is generally changing, but that 1685 01:20:31,100 --> 01:20:32,923 helps with the business model around that. 1686 01:20:32,923 --> 01:20:33,590 PROFESSOR: Yeah. 1687 01:20:36,670 --> 01:20:40,080 Well, I want to thank you both very much. 1688 01:20:40,080 --> 01:20:41,410 That was really interesting. 1689 01:20:41,410 --> 01:20:45,190 And I do encourage all of you to participate 1690 01:20:45,190 --> 01:20:49,510 in this regulatory process by submitting your comments. 1691 01:20:49,510 --> 01:20:52,440 And I enjoyed the presentations. 1692 01:20:52,440 --> 01:20:52,940 Thank you. 1693 01:20:52,940 --> 01:20:53,390 ANDY CORAVOS: Yeah. 1694 01:20:53,390 --> 01:20:54,030 MARK SHERVEY: Thank you. 1695 01:20:54,030 --> 01:20:55,030 ANDY CORAVOS: Thank you. 1696 01:20:55,030 --> 01:20:58,280 [APPLAUSE]