PETER SZOLOVITS: Fortunately, I have a guest today, Dr. Adam Wright, who will be doing an interview-style session and will answer questions for you. This is Adam's bread and butter, exactly how to translate this kind of technology into the clinic. He's currently in the Partners system at the Brigham, I guess. But he's about to become a traitor and leave us in Boston and occupy a position at Vanderbilt University, for which we wish him luck. But I'm glad that we caught him before he leaves this summer.

OK, so quite frankly, I wish that I could tell you a much happier story than the one that you're going to hear from me during the prepared part of my talk. And maybe Adam will cheer us up and make us more optimistic, based on his experience.

So you may have noticed that AI is hot. HIMSS, for example, is the Healthcare Information and Management Systems Society. They hold annual meetings that consist of a lot of vendors and a lot of academics. It's one of these huge trade show kinds of things, with balloons hanging over booths and big open spaces. And for example, they're now talking about AI-powered health care.

On the other hand, it's important to remember this graph. This is the sort of technology adoption graph, and it's called the hype cycle. What you see here is that R&D-- that's us-- produces some wonderful, interesting idea. And then all of a sudden, people get excited about it. So who are the people that get most excited about it? It's the people who think they're going to make a fortune from it. And these are the so-called vulture capitalists-- venture capitalists.

And so the venture capitalists come in and they encourage people like us to go out and found companies-- or if not us, then our students-- and figure out how to turn this nascent idea into some important moneymaking enterprise. Now the secret of venture capital is that they know that about 90% of the companies that they fund are going to tank.
They're going to do very badly. And so as a result, what they hope for and what they expect-- and what the good ones actually get-- is that the one in 10 that becomes successful makes so much money that it makes up for all of the investment that they poured into the nine out of 10 that do badly.

I actually remember in the 1990s, I was helping a group pitch a company to Kleiner Perkins, which is one of the big venture capital funds in Silicon Valley. We walked into their boardroom and they had a copy of the San Jose Mercury News, which is the local newspaper for Silicon Valley, on their table. And they were just beaming, because there was an article that said that in the past year, the two best and the two worst investments in Silicon Valley had been by their firm. But that's pretty good, right? If you get two winners and two really bad losers, you're making tons and tons of money. So they were in a good mood and they funded us. We didn't make them any money.

So what you see on this curve is that there is a kind of set of rising expectations that comes from the development of these technologies. You have some early adopters. And then you have the newspapers writing about how this is the revolution and everything will be different from here on out. Then you have some additional activity beyond the early adopters. And then people start looking at this and going, well, it really isn't as good as it's cracked up to be. Then you have the steep decline, where there's some consolidation and some failures, and people have to go back to venture capital to try to get more money in order to keep their companies going. And then there's a kind of trough, where people go, oh well, this was another of these failed technological innovations.

Then gradually, you start reaching what this author calls the slope of enlightenment, where people realize that, OK, it's not really as bad as we thought it was when it didn't meet our lofty expectations.
And then gradually, if it's successful, you get multiple generations of the product and it does achieve adoption. The adoption almost never reaches the peak that it was expected to reach at the top of the hype cycle. But it becomes useful. It becomes profitable. It becomes productive.

Now I've been around long enough to see a number of these cycles go by. So in the 1980s, for example, at a time that is now jokingly referred to as the AI summer-- when people were building expert systems, and these expert systems were going to just revolutionize everything-- I remember going to a conference where the Campbell Soup Company had built an expert system based on the expertise of some old-timers who were retiring. What this expert system did is it told you how to clean the vats of soup-- you know, these giant million-gallon things where they make soup-- when you're switching from making one kind of soup to another. So if you're making beef consomme and you switch to making beef barley soup, you don't need to clean the vat at all. Whereas if you're switching from something like clam chowder to a consomme, then you need to clean it really well. So this was exactly the kind of thing that they were doing. And there were literally thousands of these applications being built.

At the top of the hype cycle, all kinds of companies-- Campbell's Soup, the airlines, everybody-- were investing huge amounts of money into this. And then there was a kind of failure of expectations. These didn't turn out to be as good, or as valuable, as people thought they were going to be. And then all of a sudden came AI winter. So AI winter followed AI summer. There was no AI fall, except in a different sense of the word "fall." And all of a sudden, funding dried up and the whole thing was declared a failure.
But in fact, today, if you go out there and you look-- Microsoft Excel has an expert system-based help system bundled inside it. And there are tons of such applications. It's just that now they're no longer considered cutting-edge applications of artificial intelligence. They're simply considered routine practice. So they've become incorporated, without the hype, into all kinds of existing products, and they're serving a very useful role. But they didn't make those venture capital firms the tons of money that they had hoped to make.

There was a similar boom and bust cycle around 2000 with the creation of the World Wide Web and e-commerce. Again, there was this unbelievably inflated set of expectations. Then around the year 2000, there was a big crash, where all of a sudden people realized that the value in these applications was not as high as what they expected it to be. Nevertheless, Amazon is doing just fine, and there are plenty of online e-commerce sites that are in perfectly good operating order today. But there's no longer the same hype about this technology. It's just become an accepted part of the way that you do business in almost everything. Yeah.

AUDIENCE: When you speak of expert systems, does that mean rule-based systems?

PETER SZOLOVITS: They were either rule-based or pattern matching systems. There were two basic kinds. I think a week from today, I'm going to talk about some of that and how it relates to modern machine learning. So we'll see some examples.

OK, well, a cautionary tale is IBM's Watson Health. I assume most of you remember when Watson hit the big time by beating the Jeopardy champions. This was back in the early 2010s or something; I don't remember exactly which year.
And they had, in fact, built a really impressive set of technologies that went out and read all kinds of online sources and distilled them into a kind of representation in which they could very quickly look things up when they were challenged with a Jeopardy question. And then it had a sophisticated set of algorithms that would try to find the best answer to some question. They even had all kinds of bizarre special-purpose things. I remember there was a probabilistic model that figured out where the Daily Double squares were most likely to be on the Jeopardy board. And then they did a utility-theoretic calculation to figure out, if they did hit the Daily Double, what was the optimum amount of money to bet, based on the machine's performance. They decided that humans typically don't bet enough when they have a chance on the Daily Double. So there was a lot of very special-purpose stuff done for this.

So this was a huge publicity bonanza. And IBM decided that next they were going to tackle medicine. They were going to take this technology and apply it to medicine. They were going to read all of the medical journals and all of the electronic medical records that they could get their hands on. And somehow this technology would again distill the right information, so that they could answer questions like a Jeopardy question, except not stated in its funny backward way. You might say, OK, for this patient, what is the optimum therapy? And it would go out and use the same technology to figure that out.

Now that was a perfectly reasonable thing to try. The problem they ran into was this hype cycle: the people who made this publicly known were their marketing people and not their technical people. And the marketing people overpromised like crazy. They said surely this is just going to solve all these problems, and we won't need any more research in this area, because, man, we've got it.
I'm overstating it, even from the marketing point of view. And so Watson for Oncology used this cloud-based supercomputer to digest massive amounts of data. That data included all kinds of different things. So I'm going to go into a little bit of detail about what some of their problems were. This is from an article in STAT News, which did an investigative piece on what happened with Watson.

So they say what I just said: breathlessly promoting its signature brand, IBM sought to capture the world's imagination and quickly zeroed in on a high-profile target, which was cancer. So this was going to solve the problem of: some patient shows up, is diagnosed with cancer, and you want to know how to treat this person. It would use all of the literature and everything that it had gathered from previous treatments of previous patients, and it would give you the optimal solution.

Now it has not been a success. There are a few dozen hospitals that have adopted the system-- very few of them in the United States, more of them abroad. And the foreign users complain that its advice is biased toward American patients and American approaches. To me, the biggest problem is that they haven't actually published anything that validates, in a scientific sense, that this is a good idea-- that it's getting the right answers. My guess is the reason for this is that it's not getting the right answers a lot of the time. But that doesn't prevent marketing from selling it.

The other problem is that they made a deal with Memorial Sloan Kettering-- which is one of the leading cancer hospitals in the country-- to say, we're going to work with you guys and your oncologists in order to figure out what really is the right answer. So I think they tried to do what their marketing says they're doing, which is to really derive the right answer from reading all of the literature and looking at past cases. But I don't think that worked well enough.
And so what they wound up doing is turning to real oncologists and asking, what would you do under these circumstances? So what they wound up building is something like a rule-based system that says, if you see the following symptoms and you have the following genetic defects, then this is the right treatment. So the promise that this was going to be a machine learning system that revolutionized cancer care by finding the optimal treatment really is not what they delivered. And as the article says, the system doesn't really create new knowledge. So it's AI only in the sense of providing a search engine that, when it makes a recommendation, can point you to articles that are a reasonable reflection of what it's recommending.

Well, I'm going to stop going through this litany, but you'll see it in the slides, which we'll post. They had a big contract with M.D. Anderson, which is another leading cancer center in the United States. M.D. Anderson spent about $60 million on this contract, implementing it. And they pulled the plug on it, because they decided that it just wasn't doing the job.

Now by contrast, there was a much more successful attempt years ago, which was less driven by marketing and more driven by medical need. The idea here was CPOE, which stands for Computerized Physician Order Entry. The idea behind CPOE was that if you want to affect the behavior of clinicians in ordering tests or drugs or procedures, what you want to do is to make sure that they are interacting with the computer. So that when they order, for example, some insanely expensive drug, the system can come back and say, hey, do you realize that there's a drug that costs 1/100 as much, which according to the clinical trials that we have on record is just as effective as the one that you've ordered? And so for example, here at the Beth Israel many years ago, they implemented a system like that.
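To make that kind of order check concrete, here is a minimal sketch of the logic; the drug names, prices, equivalence table, and threshold are invented for illustration and are not taken from any real formulary or CPOE system.

```python
# Sketch of a CPOE-style cost check at order entry.
# The formulary, the equivalence table, and the drug names are hypothetical.
from typing import Optional

FORMULARY = {
    # drug name: cost per dose in dollars (invented numbers)
    "brandostatin": 120.00,   # hypothetical brand-name drug
    "genostatin": 1.20,       # hypothetical generic equivalent
}

# Hypothetical table of therapeutically equivalent alternatives.
EQUIVALENTS = {"brandostatin": ["genostatin"]}


def check_order(drug: str) -> Optional[str]:
    """Return a suggestion if a much cheaper, equivalent drug exists."""
    if drug not in FORMULARY:
        return None
    cost = FORMULARY[drug]
    for alt in EQUIVALENTS.get(drug, []):
        alt_cost = FORMULARY[alt]
        if alt_cost <= cost / 100:  # alternative costs 1/100 as much or less
            return (f"{alt} costs ${alt_cost:.2f} per dose versus ${cost:.2f} "
                    f"for {drug}; trials on record suggest it is just as "
                    f"effective. Consider switching.")
    return None


print(check_order("brandostatin"))
```

In a real CPOE system, an alert like this appears inside the ordering workflow, and the clinician remains free to override it.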
In the first year at the Beth Israel, they showed that they saved something like $16 million in the pharmacy, just by ordering cheaper variants of drugs that could have been very expensive. And they also found that the doctors who were doing the ordering were perfectly satisfied with that, because they just didn't know how expensive these drugs were. That's not one of the things that they pay attention to. So there are many applications like that that are driven by this.

And again, here are some statistics. You can reduce error rates by half. You can reduce severe medication errors by 88%. You can have a 70% reduction in antibiotic-related adverse drug events. You can reduce length of stay, which is another big goal that people go after. And at least if you're an optimist, you can believe these extrapolations that say, well, we could prevent 3 million adverse drug events at big-city hospitals in the United States if everybody used systems like this.

So the benefits are that it prompts with warnings against possible drug interactions, allergies, or overdoses. It can be kept up to date by some sort of mechanism where people read the literature and keep updating the databases this is driven from. And it can do mechanical things like eliminate confusion about drug names that sound similar, stuff like that. So the Leapfrog Group, which does a lot of meta-analyses and studies of what's effective, really is behind this and is pushing it very strongly.

Potential future benefits, of course, are that if the kinds of machine learning techniques that we talk about become widely used, then these systems can be updated automatically rather than by manual review, and you can gain the advantage of immediate feedback as new information becomes available.

Now the adoption of CPOE was recommended by the National Academy of Medicine. They wanted every hospital to use this by 1999. And of course, it hasn't happened.
So I couldn't find current data, but 2014 data shows that CPOE, for example for medication orders, was only being used in about 25% of hospitals. And at that time, people were extrapolating and saying, well, it's not going to reach 80% penetration until the year 2029. So it's a very slow adoption cycle. Maybe it's gotten better.

The other problem-- and one of the reasons for resistance-- is that it puts additional stresses on people. So for example, this is a study of how pharmacists spend their time. Clinical time is useful. That's when they're consulting with doctors, helping them figure out appropriate dosages for patients, or they're talking to patients, explaining to them how to take their medications, what side effects to watch out for, et cetera. These distributive tasks-- it's a funny term-- mean the non-clinical part of what they're doing. And what you see is that in hospitals that have adopted CPOE, pharmacists wind up spending a little bit more time on the distributive tasks and a little bit less time on the clinical tasks. Which is probably not the right direction, in terms of what pharmacists were hoping for out of systems like this.

Now people have studied the diffusion of new medical technologies, and I think I'll just show you the graph. This is in England, but this is the adoption curve for statins-- statins are the drugs that keep your cholesterol low. From the time they were introduced until they were being used at essentially 100% of places was about five and a half, six years. So reasonably fast.

If you look at the adoption of magnetic resonance imaging technology, it took five years for it to have any adoption whatsoever. And that's because it was insanely expensive. So there were all kinds of limitations. You know, even in Massachusetts, you have to get permission from some state committee to buy a new MRI machine.
And if another hospital in your town already had one, then they would say, well, you shouldn't buy one, because you should be able to use this other hospital's MRI machine. The same thing happened with CT. But as soon as those limitations were lifted, boom. It went up and then continues to go up. Whereas stents-- I actually don't know why they were delayed that long. But this is for people with blockages in coronary arteries or other arteries. You can put in a little mesh tube that just keeps that artery open. And that adoption was incredibly quick. So different things get adopted at different rates.

Now the last topic I want to talk about before-- yeah.

AUDIENCE: So what happens in those years where you just have spikes? What's doing it?

PETER SZOLOVITS: So according to those authors, in the case of stents, there were some champions of the idea of stenting who went around and convinced their colleagues that this was the right technology to use. So there was just an explosive growth in it. In the other technologies-- in the MRI case, money mattered a lot because they're so expensive. Stents are relatively cheap. And in the case of statins, those are also relatively cheap. Or they've become cheap since they went off patent; originally, they were much more expensive.

But there are still adoption problems. So for example, there was a recommendation-- I think about 15, maybe even 20 years ago-- that said that anybody who has had a heart attack or coronary artery disease should be taking beta blockers. And I don't remember what the adoption rate is today, but it's only on the order of a half. And so why? This is a dirt cheap drug. For reasons not quite understood, it reduces the probability of having a second heart attack by about 35%. So it's a really cheap protective way of keeping people healthier.
And yet it just hasn't suffused practice as much as people think it should have.

All right. So how do we assure the quality of these technologies before we foist them on the world? This is tricky. John Ioannidis, a Stanford professor, has made an extremely successful career out of pointing out that most biomedical research is crap. It can't be reproduced. There are some famous publications in which people have taken some area of biomedicine, looked at a bunch of well-respected published studies, gone to the lab, and tried to replicate those studies. Half the time, or three-quarters of the time, they fail to do so. You go, oh my god, this is horrible. It is horrible. Yeah.

AUDIENCE: You mean they failed to do so in the sense that they can't reproduce the exact same results? Or what exactly--

PETER SZOLOVITS: Worse than that. It's not that there are slight differences. It's that, for example, a result that was shown to be statistically significant in one study, when they repeat the study, is no longer statistically significant. That's bad, if you base policy on that kind of result.

So Ioannidis has a suggestion which would probably help a lot. And that is, basically, make known to everybody all the studies that have failed. The problem is that if you give me a big data set and I start mining it, I'm going to find tons and tons of interesting correlations in the data. And as soon as I get one that has a good p-value, my students and I go, fantastic, time to publish.

Now consider the fact that I'm not the only person in this role. David's group is doing the same thing, and John Guttag's and Regina Barzilay's, and all of our colleagues at every other major university and hospital in the United States. So there may be hundreds of people who are mining this data.
And each of us has slightly different ways of doing it. We select our cases differently. We preprocess the data differently. We apply different learning algorithms to them. But just by random chance, some of us are going to find interesting results, interesting patterns. And of course, those are the ones that get published. Because if you don't find an interesting result, you're not going to submit it to a journal and say, you know, I looked for the following phenomenon and I was unable to find it. The journal says, well, that's not interesting to anybody.

So Ioannidis is recommending that, basically, every study that anybody undertakes should be registered. And if you don't get a significant result, that should be known. This would allow us to make at least some reasonable estimate of whether the significant results that were obtained are just the statistical outliers that happened to reach p equal 0.05, or whatever your threshold is, or whether it's a real effect, because not that many people have been trying this. Yeah.

AUDIENCE: [INAUDIBLE] why do you think this is? Is it because of the size of the patient cohort? Or bias in the assay? Or just pure randomness in the study?

PETER SZOLOVITS: It could be any of those. It could be that your hospital has some biased data collection, and so you find an effect; my hospital doesn't, and so I don't find it. It could be that we just randomly sub-sampled a different sample of the population.

So it's very interesting. Last year I was invited to a meeting by Jeff Drazen, the editor-in-chief of the New England Journal. And he's thinking about-- he has not decided-- but he's thinking about a policy for the New England Journal, which is like the top medical journal, that says that he will not publish any result unless it's been replicated on two independent data sets. So that's interesting.
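To make the multiple-comparisons point concrete (many groups mining similar data, with only the "significant" results submitted for publication), here is a small simulation. The numbers it uses, 1,000 analyses run on pure noise and tested at p = 0.05, are invented for illustration.

```python
# Sketch: why many independent analyses of noise still yield "significant"
# findings. All parameters here are illustrative, not from the lecture.
import random
from math import erf, sqrt
from statistics import mean, stdev


def two_sample_p(x, y):
    """Rough two-sample z-test p-value (normal approximation)."""
    se = sqrt(stdev(x) ** 2 / len(x) + stdev(y) ** 2 / len(y))
    z = (mean(x) - mean(y)) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))


random.seed(0)
n_analyses, n_patients, alpha = 1000, 200, 0.05
false_positives = 0
for _ in range(n_analyses):
    # Two groups drawn from the *same* distribution: no real effect exists.
    cases = [random.gauss(0, 1) for _ in range(n_patients)]
    controls = [random.gauss(0, 1) for _ in range(n_patients)]
    if two_sample_p(cases, controls) < alpha:
        false_positives += 1

print(f"{false_positives} of {n_analyses} null analyses reached p < {alpha}")
```

Roughly 5% of these null analyses come out "significant"; if only those hits get written up, the literature fills with findings that won't replicate, which is why registering every attempted study restores the missing denominator.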
The New England Journal policy is an attempt to fight back against this problem, and it's a different solution than what Ioannidis is recommending.

So this was a study by Enrico Coiera, and he's talking about what it means to replicate. Again, I'm not going to go through all of this. But there's the notion that replication might mean exact replication, i.e., you do exactly the same thing on exactly the same kind of data, but in a different data set. Then there's partial replication; conceptual replication, which says you follow the same procedures but in a different environment; and then quasi replication, either partial or conceptual. And these have various characteristics that you can look at. It's an interesting framework.

So this is not a new idea. The first edition of this book, Evaluation Methods in Biomedical Informatics, was called Evaluation Methods in Medical Informatics, by the same authors, and was published a long time ago-- I can't remember when. This one is relatively recent. They do a multi-hundred-page, very detailed treatment of exactly how one should evaluate clinical systems like this. And it's very careful and very cautious, but it's also very conservative. So for example, one of the things that they recommend is that the people doing the evaluation should not be the people who developed the technique, because there's an innate bias: I want my technique to succeed. So they say, hand it off to somebody else who doesn't have that same vested interest, and then you're going to get a more careful evaluation.

So Steve Pauker and I wrote a response to one of their early papers recommending this that said, well, that's so conservative that it sort of throws the baby out with the bathwater. Because if you make it so difficult to do an evaluation, you'll never get anything past it.
So we proposed instead a kind of staged evaluation. First of all, you should do regression testing, so that every time you use these agile development methods, you keep the set of cases that your program has worked on before, automatically rerun them, and see which ones you've made better and which ones you've made worse. That will give you some insight into whether what you're doing is reasonable. Then you might also build tools that automate looking for inconsistencies in the models that you're building.

Then you have retrospective review, judged by clinicians. So you run a program that you like over a whole bunch of existing data, like what you're doing with MIMIC or with MarketScan. And then you do it prospectively, but without actually affecting patients. You do it in real time as the data is coming in, but you don't tell anybody what the program's results are. You just ask them to evaluate, in retrospect, whether it was right. And you might say, well, what's the difference between collecting the data in real time and collecting the data retrospectively? Historically, the answer is there is a difference. Circumstances differ, and the mechanisms that you have for collecting the data differ. So this turns out to be an important issue.

And then you can run a prospective controlled trial, where you're interested in evaluating both the answer that you get from the program and, ultimately, the effect on health outcomes. So if I have a decision support system, the ultimate proof of the pudding is if I run that decision support system, I give advice to clinicians, the clinicians change their behavior sometimes, and the patients get a better outcome. Then I'm convinced that this is really useful. But you have to get there slowly, because you don't want to give them worse outcomes. That's unethical and probably illegal. And you want to compare this to the performance of unaided doctors.
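As one illustration of that first stage, a regression-testing harness for a decision support model might look like the sketch below; the case file format, the model objects, and their predict method are hypothetical, not part of any particular system.

```python
# Sketch of regression testing for a decision support model: rerun a stored
# library of past cases after every change and report which cases the new
# version gets right that the old one missed, and vice versa.
# File names, the case format, and the model interface are hypothetical.
import json


def compare_versions(case_file, old_model, new_model):
    """Return (improved, regressed) case IDs between two model versions."""
    with open(case_file) as f:
        cases = json.load(f)  # [{"id": ..., "features": {...}, "label": ...}]

    improved, regressed = [], []
    for case in cases:
        truth = case["label"]
        old_ok = old_model.predict(case["features"]) == truth
        new_ok = new_model.predict(case["features"]) == truth
        if new_ok and not old_ok:
            improved.append(case["id"])
        elif old_ok and not new_ok:
            regressed.append(case["id"])
    return improved, regressed


# Usage (hypothetical models exposing a .predict() method):
# improved, regressed = compare_versions("case_library.json", v1, v2)
# print(f"improved on {len(improved)} cases, regressed on {len(regressed)}")
```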
So the Food and Drug Administration has been dealing with this issue for many, many years. I remember talking to them in about 1976, when they were reading about the very first expert system programs for diagnosis and therapy selection. And they said, well, how should we regulate these? And my response at the time was, God help us, keep your hands off. Because if you regulate it, then you're going to slow down progress. And in any case, none of these programs are being used. These programs are being developed as experimental programs in experimental settings. They're not coming anywhere close to being used on real patients. And so there is not a regulatory issue.

And about every five years, the FDA has revisited that question. And they have continued to make essentially the same decision, based on the rationale that, for example, they don't regulate books. If I write a textbook that explains something about medicine, the FDA is not going to check whether it's correct or not. And the reason is that the expectation is that the textbook is making recommendations, so to speak, to clinical practitioners, who are responsible experts themselves. So the ultimate responsibility for how they behave rests with them and not with the textbook. And they said, we're going to treat these computer programs as if they were dynamic textbooks, rather than colleagues who are acting independently and giving advice.

Now as soon as you try to give that advice not to a professional but to a patient, then you are immediately under the regulatory auspices of the FDA. Because now there is no professional intermediary who can evaluate the quality of that advice.

So what the FDA has done, just in the past year, is they've said that we're going to treat these AI-based, quote-unquote, "devices" as medical devices. And we're going to apply the same regulatory requirements that we have for those devices, except we don't really know how to do this.
So there's a kind of experiment going on right now where they're saying, OK, submit applications for review of these devices to us. We will review them. And we will use these criteria-- product quality, patient safety, clinical responsibility, cybersecurity responsibility, and a so-called proactive culture in the organization that's developing them-- in order to make a judgment of whether or not to let you proceed with marketing one of these things.

So if you look, there are in fact about 10 devices, quote-unquote-- these are all software-- that have been approved so far by the FDA. And almost all of them are imaging devices. They're things that run convolutional networks over one thing or another. So here are just a few examples. Imagen has OsteoDetect, which analyzes two-dimensional X-ray images for signs of distal radius fracture. So if you break your wrist, this system will look at the X-ray and decide whether or not you've done that. Here's one from IDx, which looks at photographs of your retina and decides whether you have diabetic retinopathy. And actually, they've published a lot of papers showing that they can also identify heart disease and stroke risk and various other things from those same photographs. So the FDA has granted them approval to market this thing. Another one is Viz, which automatically analyzes CT scans for ER patients, looking for blockages in major brain blood vessels. A blockage like that can obviously lead to a stroke, and this is an automated technique that detects it. Here's another one: Arterys measures and tracks tumors or potential cancers in radiology images.

So these are the ones that have been approved. And then I just wanted to remind you that there's actually plenty of literature about this kind of stuff. The book on the left actually comes out next week. I got to read a pre-print of it. It's by Eric Topol, who's one of these doctors who writes a lot about the future of medicine.
And he actually goes through tons and tons of examples, not only of the systems that have been approved by the FDA, but also of things that are in the works, and he's very optimistic that these will again revolutionize the practice of medicine. Bob Wachter, who wrote the other book a couple of years ago, is a little bit more cautious, because he's chief of medicine at UC San Francisco. And he wrote his book in response to them almost killing a kid by giving him a 39x overdose of a medication. They didn't quite succeed in killing the kid, so it turned out OK. But he was really concerned about how this wonderful technology led to such a disastrous outcome. And so he spent a year studying how these systems were being used, and he writes a more cautionary tale.

So let me turn to Adam, who, as I said, is a professor at the Brigham and Harvard Medical School. Please come and join me, and we can have a conversation.

ADAM WRIGHT: So my name is Adam Wright. I'm an associate professor of medicine at Harvard Medical School. In that role, I lead a research program and I teach the introduction to biomedical informatics courses at the medical school. So if you're interested in the topics that Pete was talking about today, you should definitely consider cross-registering in BMI 701 or 702. The medical school certainly could always use a few more enthusiastic and technically minded machine learning experts in our courses.

And then I have an operational job at Partners. Partners is the health system that includes Mass General Hospital and the Brigham and some community hospitals. And I work on Partners eCare, which is our kind of cool brand name for Epic. Epic is the EHR that we use at Partners. And I help oversee the clinical decision support there. So we have a decision support team, and I'm the clinical lead for monitoring and evaluation.
776 00:42:10,388 --> 00:42:12,430 And so I help make sure that our decision support 777 00:42:12,430 --> 00:42:15,920 systems of the type that Pete's talking about work correctly. 778 00:42:15,920 --> 00:42:18,338 So that's my job at the Brigham and at Partners. 779 00:42:18,338 --> 00:42:19,255 PETER SZOLOVITS: Cool. 780 00:42:19,255 --> 00:42:21,167 And I appreciate it very much. 781 00:42:21,167 --> 00:42:22,000 ADAM WRIGHT: Thanks. 782 00:42:22,000 --> 00:42:23,167 I appreciate the invitation. 783 00:42:23,167 --> 00:42:24,350 It's fun to be here. 784 00:42:24,350 --> 00:42:27,310 PETER SZOLOVITS: So Adam, the first obvious question 785 00:42:27,310 --> 00:42:30,340 is what kind of decision support systems 786 00:42:30,340 --> 00:42:32,420 have you guys actually put in place? 787 00:42:32,420 --> 00:42:33,420 ADAM WRIGHT: Absolutely. 788 00:42:33,420 --> 00:42:36,640 So we've had a long history at the Brigham and Partners 789 00:42:36,640 --> 00:42:38,680 of using decision support. 790 00:42:38,680 --> 00:42:41,330 Historically, we developed our own electronic health record, 791 00:42:41,330 --> 00:42:43,480 which was a little bit unusual. 792 00:42:43,480 --> 00:42:45,040 About three years ago, we switched 793 00:42:45,040 --> 00:42:47,507 from our self-developed system to Epic, 794 00:42:47,507 --> 00:42:49,840 which is a very widely-used commercial electronic health 795 00:42:49,840 --> 00:42:50,340 record. 796 00:42:50,340 --> 00:42:52,618 And to the point that you gave, we really 797 00:42:52,618 --> 00:42:54,660 started with a lot of medication-related decision 798 00:42:54,660 --> 00:42:55,160 support. 799 00:42:55,160 --> 00:42:57,655 So that's things like drug interaction, alerting. 800 00:42:57,655 --> 00:43:00,280 So you prescribe two drugs that might interact with each other. 801 00:43:00,280 --> 00:43:02,120 And we use a table-- 802 00:43:02,120 --> 00:43:04,660 no machine learning or anything too complicated-- that says, 803 00:43:04,660 --> 00:43:06,190 we think this drug might interact with this. 804 00:43:06,190 --> 00:43:08,315 We raise an alert to the doctor, to the pharmacist. 805 00:43:08,315 --> 00:43:10,512 And they make a decision, using their expertise 806 00:43:10,512 --> 00:43:12,220 as the learned intermediary, that they're 807 00:43:12,220 --> 00:43:13,928 going to continue with that prescription. 808 00:43:13,928 --> 00:43:16,240 Let's have some dosing support, allergy checking, 809 00:43:16,240 --> 00:43:17,510 and things like that. 810 00:43:17,510 --> 00:43:19,510 So our first set of decision support 811 00:43:19,510 --> 00:43:21,160 really was around medications. 812 00:43:21,160 --> 00:43:23,680 And then we turned to a broader set of things 813 00:43:23,680 --> 00:43:25,150 like preventative care reminders, 814 00:43:25,150 --> 00:43:28,240 so identifying patients that are overdue for a mammogram 815 00:43:28,240 --> 00:43:30,760 or a pap smear or that might benefit from a statin 816 00:43:30,760 --> 00:43:31,870 or something like that. 817 00:43:31,870 --> 00:43:35,440 Or a beta blocker, in the case of acute myocardial infarction. 818 00:43:35,440 --> 00:43:37,630 And we make suggestions to the doctor 819 00:43:37,630 --> 00:43:40,060 or to other members of the care team to do those things. 820 00:43:40,060 --> 00:43:42,820 Again, those historically have largely been rule-based. 821 00:43:42,820 --> 00:43:46,540 So some experts sat down and wrote Boolean if-then rules, 822 00:43:46,540 --> 00:43:48,910 using variables that are in a patient's chart. 
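To give a flavor of the Boolean if-then rules Adam describes, here is a minimal sketch of a preventive-care reminder. The field names, cutoffs, and the rule itself are invented for illustration; real rules of this kind live inside the EHR's own rule engine, not in standalone Python.

    # Hypothetical rule-based reminder in the spirit of the if-then logic
    # described above. Field names and thresholds are invented, not taken
    # from Epic or Partners.

    def mammogram_reminder(patient):
        """Return an alert string if the (hypothetical) rule fires, else None."""
        is_female = patient.get("sex") == "F"
        in_age_range = 50 <= patient.get("age", 0) <= 74
        months_since = patient.get("months_since_last_mammogram", 999)
        if is_female and in_age_range and months_since >= 24:
            return "Patient appears overdue for screening mammography."
        return None

    if __name__ == "__main__":
        example = {"sex": "F", "age": 63, "months_since_last_mammogram": 30}
        print(mammogram_reminder(example))  # prints the reminder text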
823 00:43:48,910 --> 00:43:50,882 We have increasingly, though, started 824 00:43:50,882 --> 00:43:52,840 trying to use some predictive models for things 825 00:43:52,840 --> 00:43:55,990 like readmission or whether a patient is at risk of falling 826 00:43:55,990 --> 00:43:57,798 down in the hospital. 827 00:43:57,798 --> 00:43:59,590 A big problem that patients often encounter 828 00:43:59,590 --> 00:44:02,630 is they're in the hospital, they're kind of delirious. 829 00:44:02,630 --> 00:44:03,880 The hospital is a weird place. 830 00:44:03,880 --> 00:44:04,360 It's dark. 831 00:44:04,360 --> 00:44:05,780 They get up to go to the bathroom. 832 00:44:05,780 --> 00:44:08,830 They trip on their IV tubing, and then they 833 00:44:08,830 --> 00:44:09,820 fall and are injured. 834 00:44:09,820 --> 00:44:11,820 So we would like to prevent that from happening. 835 00:44:11,820 --> 00:44:13,760 Because that's obviously kind of a bad thing 836 00:44:13,760 --> 00:44:15,160 to happen to you once you're in the hospital. 837 00:44:15,160 --> 00:44:17,230 So we have some machine learning-based tools 838 00:44:17,230 --> 00:44:19,520 for predicting patients that are at risk for falls. 839 00:44:19,520 --> 00:44:21,250 And then there is a set of interventions 840 00:44:21,250 --> 00:44:24,190 like putting the bed rails up or putting an alarm that buzzes 841 00:44:24,190 --> 00:44:25,870 when if they get out of bed. 842 00:44:25,870 --> 00:44:28,625 Or in more extreme cases, having a sitter, like a person 843 00:44:28,625 --> 00:44:30,250 who actually sits in the room with them 844 00:44:30,250 --> 00:44:32,127 and tries to keep them from getting up 845 00:44:32,127 --> 00:44:33,460 or assists them to the bathroom. 846 00:44:33,460 --> 00:44:35,668 Or calls someone who can assist them to the bathroom. 847 00:44:35,668 --> 00:44:37,420 So we have increasingly started using 848 00:44:37,420 --> 00:44:39,610 those machine learning tools. 849 00:44:39,610 --> 00:44:41,470 Some of which we get from third parties, 850 00:44:41,470 --> 00:44:42,910 like from our electronic health record vendor, 851 00:44:42,910 --> 00:44:45,220 and some of which we sort of train ourselves 852 00:44:45,220 --> 00:44:46,750 on our own data. 853 00:44:46,750 --> 00:44:49,728 That's a newer pursuit for us, is this machine learning. 854 00:44:49,728 --> 00:44:51,520 PETER SZOLOVITS: So when you have something 855 00:44:51,520 --> 00:44:53,410 like a risk model, how do you decide 856 00:44:53,410 --> 00:44:55,390 where to set the threshold? 857 00:44:55,390 --> 00:45:00,650 You know, if I'm at 53% risk of falling, 858 00:45:00,650 --> 00:45:02,868 should you get a sitter to sit by my bedside? 859 00:45:02,868 --> 00:45:04,410 ADAM WRIGHT: It's complicated, right? 860 00:45:04,410 --> 00:45:06,202 I mean, I would like to say that what we do 861 00:45:06,202 --> 00:45:08,110 is a full kind of utility analysis, 862 00:45:08,110 --> 00:45:10,600 where we say, we pay a sitter this much per hour. 863 00:45:10,600 --> 00:45:12,190 And the risk of falling is this much. 864 00:45:12,190 --> 00:45:13,780 And the cost of a fall-- 865 00:45:13,780 --> 00:45:15,340 most patients who fall aren't hurt. 866 00:45:15,340 --> 00:45:16,270 But some are. 867 00:45:16,270 --> 00:45:20,350 And so you would calculate the cost-benefit 868 00:45:20,350 --> 00:45:22,510 of each of those things and figure out 869 00:45:22,510 --> 00:45:25,060 where on the ROC curve you want to place yourself. 
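As a concrete illustration of the utility analysis Adam sketches here, the sitter decision reduces to comparing the expected savings from prevented falls against the sitter's cost, which implies a risk threshold. Every number below (sitter cost, injury cost, risk reduction) is invented for illustration and is not an institutional figure.

    # Hypothetical expected-cost calculation for the sitter decision.
    SITTER_COST = 300.0       # assumed cost of a sitter for one night
    INJURY_COST = 10000.0     # assumed expected cost of an injurious fall
    RISK_REDUCTION = 0.6      # assumed fraction of falls a sitter prevents

    def order_sitter(p_fall):
        """Order a sitter when expected savings exceed the sitter's cost."""
        expected_savings = p_fall * RISK_REDUCTION * INJURY_COST
        return expected_savings > SITTER_COST

    # The implied decision threshold on predicted fall risk:
    threshold = SITTER_COST / (RISK_REDUCTION * INJURY_COST)
    print(round(threshold, 3))   # 0.05 with these made-up numbers
    print(order_sitter(0.53))    # True: the 53% example clears the threshold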
870 00:45:25,060 --> 00:45:29,230 In practice, I think we often just play it by ear, 871 00:45:29,230 --> 00:45:31,300 in part because a lot of our things 872 00:45:31,300 --> 00:45:33,270 are intended to be suggestions. 873 00:45:33,270 --> 00:45:35,582 So our threshold for saying to the doctor, 874 00:45:35,582 --> 00:45:37,540 hey, this patient is at elevated risk for fall, 875 00:45:37,540 --> 00:45:40,270 consider doing something, is pretty low. 876 00:45:40,270 --> 00:45:42,840 If the system were, say, automatically ordering 877 00:45:42,840 --> 00:45:45,030 a sitter, we might set it higher. 878 00:45:45,030 --> 00:45:48,300 I would say that's an area of research. 879 00:45:48,300 --> 00:45:50,730 I would also say that one challenge we have is we 880 00:45:50,730 --> 00:45:53,750 often set and forget these kinds of systems. 881 00:45:53,750 --> 00:45:56,220 And so there is kind of feature drift and patients 882 00:45:56,220 --> 00:45:57,048 change over time. 883 00:45:57,048 --> 00:45:59,340 We probably should do a better job of then looking back 884 00:45:59,340 --> 00:46:01,007 to see how well they're actually working 885 00:46:01,007 --> 00:46:02,635 and making tweaks to the thresholds. 886 00:46:02,635 --> 00:46:03,510 Really good question. 887 00:46:03,510 --> 00:46:05,260 PETER SZOLOVITS: But these are, of course, 888 00:46:05,260 --> 00:46:07,260 very complicated decisions. 889 00:46:07,260 --> 00:46:13,230 I remember 50 years ago talking to some people in the Air Force 890 00:46:13,230 --> 00:46:18,120 about how much should they invest in safety measures. 891 00:46:18,120 --> 00:46:22,060 And they had a utility theoretic model 892 00:46:22,060 --> 00:46:26,340 that said, OK, how much does it cost to replace 893 00:46:26,340 --> 00:46:28,600 a pilot if you kill them? 894 00:46:28,600 --> 00:46:29,460 ADAM WRIGHT: Yikes. 895 00:46:29,460 --> 00:46:30,700 Yeah. 896 00:46:30,700 --> 00:46:34,118 PETER SZOLOVITS: And this was not publicized a lot. 897 00:46:34,118 --> 00:46:35,910 ADAM WRIGHT: I mean, we do calculate things 898 00:46:35,910 --> 00:46:37,410 like quality-adjusted life-years and 899 00:46:37,410 --> 00:46:38,760 disability-adjusted life-years. 900 00:46:38,760 --> 00:46:43,040 So there is-- in all of medicine as people deploy resources, 901 00:46:43,040 --> 00:46:43,800 this calculus. 902 00:46:43,800 --> 00:46:46,030 And I think we tend to assign a really high weight 903 00:46:46,030 --> 00:46:48,660 to patient harm, because patient harm is-- 904 00:46:48,660 --> 00:46:51,510 if you think about the oath the doctors swear, 905 00:46:51,510 --> 00:46:52,450 first do no harm. 906 00:46:52,450 --> 00:46:55,050 The worst thing we can do is harm you in the hospital. 907 00:46:55,050 --> 00:46:58,410 So I think we have a pretty strong aversion to do that. 908 00:46:58,410 --> 00:47:00,240 But it's very hard to weigh these things. 909 00:47:00,240 --> 00:47:02,430 I think one of the challenges we often run into 910 00:47:02,430 --> 00:47:05,292 is that different doctors would make different decisions. 911 00:47:05,292 --> 00:47:07,500 So if you put the same patient in front of 10 doctors 912 00:47:07,500 --> 00:47:09,592 and said, does this patient need a sitter? 913 00:47:09,592 --> 00:47:11,550 Maybe half would say yes and half would say no. 
914 00:47:11,550 --> 00:47:13,050 So it's especially hard to know what 915 00:47:13,050 --> 00:47:15,030 to do with a decision support system 916 00:47:15,030 --> 00:47:17,070 if the humans can't agree on what you 917 00:47:17,070 --> 00:47:18,870 should do in that situation. 918 00:47:18,870 --> 00:47:20,995 PETER SZOLOVITS: So the other thing we talked about 919 00:47:20,995 --> 00:47:24,010 on the phone yesterday is I was concerned-- a few years ago, 920 00:47:24,010 --> 00:47:30,210 I was visiting one of these august Boston-area hospitals 921 00:47:30,210 --> 00:47:35,940 and asked to see an example of somebody interacting with this 922 00:47:35,940 --> 00:47:38,630 Computerized Physician Order Entry system. 923 00:47:38,630 --> 00:47:44,700 And the senior resident who was taking me around went up 924 00:47:44,700 --> 00:47:46,170 to the computer and said, well, I 925 00:47:46,170 --> 00:47:49,470 think I remember how to use this. 926 00:47:49,470 --> 00:47:52,650 And I said, wait a minute. 927 00:47:52,650 --> 00:47:56,340 This is something you're expected to use daily. 928 00:47:56,340 --> 00:47:59,730 But in reality, what happens is that it's not 929 00:47:59,730 --> 00:48:03,960 the senior doctors or even the medium senior doctors. 930 00:48:03,960 --> 00:48:06,690 It's the interns and the junior residents 931 00:48:06,690 --> 00:48:08,007 who actually use the systems. 932 00:48:08,007 --> 00:48:09,090 ADAM WRIGHT: This is true. 933 00:48:09,090 --> 00:48:11,640 PETER SZOLOVITS: And the concern I had 934 00:48:11,640 --> 00:48:17,820 was that it takes a junior resident with a lot of guts 935 00:48:17,820 --> 00:48:21,810 to go up to the chief of your service 936 00:48:21,810 --> 00:48:26,610 and say, doctor x, even though you asked me to order 937 00:48:26,610 --> 00:48:29,500 this drug for this patient, the computer 938 00:48:29,500 --> 00:48:33,000 is arguing back that you should use this other one instead. 939 00:48:33,000 --> 00:48:34,450 ADAM WRIGHT: Yeah, it does. 940 00:48:34,450 --> 00:48:36,658 And in fact, I actually thought of this a little more 941 00:48:36,658 --> 00:48:37,860 after we chatted about it. 942 00:48:37,860 --> 00:48:39,443 We've heard from residents that people 943 00:48:39,443 --> 00:48:41,400 have said to them, if you dare page me 944 00:48:41,400 --> 00:48:43,770 with an Epic suggestion in the middle of the night, 945 00:48:43,770 --> 00:48:45,150 I'll never talk to you again. 946 00:48:45,150 --> 00:48:48,150 So just override all of those alerts. 947 00:48:48,150 --> 00:48:51,990 I think that one of the challenges is-- 948 00:48:51,990 --> 00:48:55,080 and some culpability on our part-- 949 00:48:55,080 --> 00:48:57,840 is that a lot of these alerts we give 950 00:48:57,840 --> 00:49:00,930 have a PPV of like, 10 or 20%. 951 00:49:00,930 --> 00:49:02,727 They are usually wrong. 952 00:49:02,727 --> 00:49:04,560 We think it's really important, so we really 953 00:49:04,560 --> 00:49:06,060 raise these alerts a lot. 954 00:49:06,060 --> 00:49:09,240 But people experience this kind of alert fatigue, or what 955 00:49:09,240 --> 00:49:10,440 people call alarm fatigue. 956 00:49:10,440 --> 00:49:11,923 You see this in cockpits, too. 957 00:49:11,923 --> 00:49:13,590 But people get too many alerts, and they 958 00:49:13,590 --> 00:49:15,120 start ignoring the alerts. 959 00:49:15,120 --> 00:49:16,435 They assume that they're wrong. 
960 00:49:16,435 --> 00:49:18,060 They tell the resident not to page them 961 00:49:18,060 --> 00:49:20,602 in the middle of the night, no matter what the computer says. 962 00:49:20,602 --> 00:49:23,340 So I do think that we have some responsibility to improve 963 00:49:23,340 --> 00:49:25,110 the accuracy of these alerts. 964 00:49:25,110 --> 00:49:26,940 I do think machine learning could help us. 965 00:49:26,940 --> 00:49:28,440 We're actually just having a meeting 966 00:49:28,440 --> 00:49:30,740 about a pneumococcal vaccination alert. 967 00:49:30,740 --> 00:49:32,850 This is something that helps people remember 968 00:49:32,850 --> 00:49:36,030 to prescribe this vaccination to help you not get pneumonia. 969 00:49:36,030 --> 00:49:39,537 And it takes four or five variables into account. 970 00:49:39,537 --> 00:49:41,370 We started looking at the cases where people 971 00:49:41,370 --> 00:49:42,630 would override the alert. 972 00:49:42,630 --> 00:49:44,280 And they were mostly appropriate. 973 00:49:44,280 --> 00:49:47,568 So the patient is in a really extreme state right now. 974 00:49:47,568 --> 00:49:49,860 Or conversely, the patient is close to the end of life. 975 00:49:49,860 --> 00:49:52,050 And they're not going to benefit from this vaccination. 976 00:49:52,050 --> 00:49:53,675 If the patient has a phobia of needles, 977 00:49:53,675 --> 00:49:55,390 if the patient has an insurance problem. 978 00:49:55,390 --> 00:49:58,110 And we think there's probably more like 30 or 40 variables 979 00:49:58,110 --> 00:50:00,540 that you would need to take into account to make 980 00:50:00,540 --> 00:50:01,800 that really accurate. 981 00:50:01,800 --> 00:50:04,170 So the question is, when you have that many variables, 982 00:50:04,170 --> 00:50:08,400 can a human develop and maintain that logic? 983 00:50:08,400 --> 00:50:11,550 Or would we be better off trying to use a machine learning 984 00:50:11,550 --> 00:50:12,660 system to do that? 985 00:50:12,660 --> 00:50:15,353 And would that really work or not? 986 00:50:15,353 --> 00:50:16,770 PETER SZOLOVITS: So how far are we 987 00:50:16,770 --> 00:50:20,320 from being able to use a machine learning system to do that? 988 00:50:20,320 --> 00:50:23,460 ADAM WRIGHT: I think that the biggest challenge, honestly, 989 00:50:23,460 --> 00:50:25,230 relates to the availability and accuracy 990 00:50:25,230 --> 00:50:27,280 of the data in our systems. 991 00:50:27,280 --> 00:50:29,940 So Epic, which is the EHR that we're using-- 992 00:50:29,940 --> 00:50:32,610 and Cerner and Allscripts and most of the major systems-- 993 00:50:32,610 --> 00:50:36,480 have various ways to run even sophisticated machine 994 00:50:36,480 --> 00:50:38,970 learning models, either inside of the system 995 00:50:38,970 --> 00:50:42,460 or bolted onto the system and then feeding model inferences 996 00:50:42,460 --> 00:50:44,337 back into the system. 997 00:50:44,337 --> 00:50:46,420 When I was giving that example of the pneumococcal 998 00:50:46,420 --> 00:50:48,003 vaccination, one of the major problems 999 00:50:48,003 --> 00:50:51,100 is that there's not always a really good structured way 1000 00:50:51,100 --> 00:50:53,350 in the system that we indicate that a patient is 1001 00:50:53,350 --> 00:50:55,990 at the end of life and receiving comfort measures only. 
1002 00:50:55,990 --> 00:50:58,300 Or that the patient is in a really extreme state, 1003 00:50:58,300 --> 00:51:00,760 that we're in the middle of a code blue 1004 00:51:00,760 --> 00:51:02,470 and that we need to pause for a second 1005 00:51:02,470 --> 00:51:04,720 and stop giving these kind of friendly preventive care 1006 00:51:04,720 --> 00:51:05,350 suggestions. 1007 00:51:05,350 --> 00:51:08,140 So I would actually say that the biggest barrier 1008 00:51:08,140 --> 00:51:10,540 to really good machine-learning-based decision 1009 00:51:10,540 --> 00:51:14,110 support is just the lack of good, reliably documented, 1010 00:51:14,110 --> 00:51:16,720 coded usable features. 1011 00:51:16,720 --> 00:51:19,120 I think that the second challenge, obviously, 1012 00:51:19,120 --> 00:51:20,080 is workflow. 1013 00:51:20,080 --> 00:51:23,530 You said-- it's sometimes hard to know in the hospital who 1014 00:51:23,530 --> 00:51:24,720 a patient's doctor is. 1015 00:51:24,720 --> 00:51:25,720 The patient is admitted. 1016 00:51:25,720 --> 00:51:28,660 And on the care team is an intern, a junior resident, 1017 00:51:28,660 --> 00:51:31,540 and a fellow, an attending, several specialists, 1018 00:51:31,540 --> 00:51:32,347 a couple of nurses. 1019 00:51:32,347 --> 00:51:34,680 Who should get that message or who should get that page? 1020 00:51:34,680 --> 00:51:37,703 I think workflow is second. 1021 00:51:37,703 --> 00:51:39,370 This is where I think you may have said, 1022 00:51:39,370 --> 00:51:40,245 I have some optimism. 1023 00:51:40,245 --> 00:51:42,910 I actually think that the technical ability of our EHR 1024 00:51:42,910 --> 00:51:45,430 software to run these models is better 1025 00:51:45,430 --> 00:51:47,213 than it was three or five years ago. 1026 00:51:47,213 --> 00:51:49,630 And it's, actually, usually not the barrier in the studies 1027 00:51:49,630 --> 00:51:51,130 that we've done. 1028 00:51:51,130 --> 00:51:54,160 PETER SZOLOVITS: So there were attempts-- 1029 00:51:54,160 --> 00:51:56,350 again, 20 years ago-- 1030 00:51:56,350 --> 00:51:59,380 to create formal rules about who gets 1031 00:51:59,380 --> 00:52:01,930 notified under what circumstances. 1032 00:52:01,930 --> 00:52:05,500 I remember one of the doctors I worked with at Tufts Medical 1033 00:52:05,500 --> 00:52:11,020 Center was going crazy, because when they implemented a new lab 1034 00:52:11,020 --> 00:52:16,932 information system, it would alert on every abnormal lab. 1035 00:52:16,932 --> 00:52:19,300 And this was crazy. 1036 00:52:19,300 --> 00:52:22,840 But there were other hospitals that said, well, 1037 00:52:22,840 --> 00:52:25,180 let's be a little more sophisticated about when 1038 00:52:25,180 --> 00:52:27,010 it's necessary to alert. 1039 00:52:27,010 --> 00:52:30,550 And then if somebody doesn't respond to an alert 1040 00:52:30,550 --> 00:52:33,320 within a very short period of time, 1041 00:52:33,320 --> 00:52:36,520 then we escalate it to somebody higher up or somebody else 1042 00:52:36,520 --> 00:52:37,840 on the care team. 1043 00:52:37,840 --> 00:52:40,060 And that seemed like a reasonable idea to me. 1044 00:52:40,060 --> 00:52:42,252 But are there things like that in place now? 1045 00:52:42,252 --> 00:52:43,210 ADAM WRIGHT: There are. 1046 00:52:43,210 --> 00:52:45,700 It works very differently in the inpatient and the outpatient 1047 00:52:45,700 --> 00:52:46,200 setting. 
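To make the pneumococcal example concrete: if features like "comfort measures only" or "active code blue" were reliably coded, which per the discussion above is the real barrier, replacing a hand-maintained thirty-or-forty-variable rule with a learned model might look roughly like the sketch below. The feature names, toy data, and threshold are all invented; scikit-learn is used only for illustration.

    # Minimal sketch: learn when the vaccination reminder is appropriate,
    # instead of maintaining the logic by hand. All features and data are
    # invented; the real obstacle is getting these features coded at all.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    FEATURES = ["age_over_65", "immunocompromised", "comfort_measures_only",
                "active_code_blue", "needle_phobia", "already_vaccinated"]

    # Toy training rows; label = 1 if the reminder was acted on,
    # 0 if it was overridden for an appropriate reason.
    X = np.array([[1, 0, 0, 0, 0, 0],
                  [1, 1, 0, 0, 0, 0],
                  [1, 0, 1, 0, 0, 0],
                  [0, 0, 0, 1, 0, 0],
                  [1, 0, 0, 0, 0, 1],
                  [1, 1, 0, 0, 0, 0]])
    y = np.array([1, 1, 0, 0, 0, 1])

    model = LogisticRegression().fit(X, y)

    new_patient = np.array([[1, 0, 0, 0, 0, 0]])
    p_appropriate = model.predict_proba(new_patient)[0, 1]
    fire_alert = p_appropriate > 0.5   # the threshold itself would be tuned
    print(round(p_appropriate, 2), fire_alert)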
1048 00:52:46,200 --> 00:52:48,850 In the inpatient setting, we're providing very acute care 1049 00:52:48,850 --> 00:52:49,660 to a patient. 1050 00:52:49,660 --> 00:52:52,750 And so we have processes where people sign in and out 1051 00:52:52,750 --> 00:52:55,090 of the care team. 1052 00:52:55,090 --> 00:52:57,640 In fact, the prevalence of these automated messages 1053 00:52:57,640 --> 00:53:00,320 is an incentive to do that well. 1054 00:53:00,320 --> 00:53:03,033 If I go home, I better sign myself out of that patient, 1055 00:53:03,033 --> 00:53:05,200 otherwise I'm going to get all these pages all night 1056 00:53:05,200 --> 00:53:05,810 about them. 1057 00:53:05,810 --> 00:53:07,870 And the system will always make sure 1058 00:53:07,870 --> 00:53:10,840 that somebody is the responding provider. 1059 00:53:10,840 --> 00:53:13,210 It becomes a little thornier in the outpatient setting, 1060 00:53:13,210 --> 00:53:15,820 because a lot of the academic doctors at the Brigham 1061 00:53:15,820 --> 00:53:18,070 only have clinic half a day a week. 1062 00:53:18,070 --> 00:53:20,860 And so the question is, if an abnormal result comes back, 1063 00:53:20,860 --> 00:53:22,780 should I send it to that doctor? 1064 00:53:22,780 --> 00:53:25,870 Should I send it to the person that's on call in that clinic? 1065 00:53:25,870 --> 00:53:27,910 Should I send it to the head of the clinic? 1066 00:53:27,910 --> 00:53:32,050 There are also these edge cases that mess us up a lot. 1067 00:53:32,050 --> 00:53:34,810 So a classic one is a patient is in the hospital. 1068 00:53:34,810 --> 00:53:36,340 I've ordered some lab tests. 1069 00:53:36,340 --> 00:53:38,920 They're looking well, so I discharge the patient. 1070 00:53:38,920 --> 00:53:41,890 The test is still pending at the time the patient is discharged. 1071 00:53:41,890 --> 00:53:43,210 And now, who does that go to? 1072 00:53:43,210 --> 00:53:45,610 Should it go to the patient's primary care doctor? 1073 00:53:45,610 --> 00:53:47,140 Do they have a primary care doctor? 1074 00:53:47,140 --> 00:53:49,182 Should it go to the person that ordered the test? 1075 00:53:49,182 --> 00:53:51,310 That person may be on vacation now, 1076 00:53:51,310 --> 00:53:53,560 if it's a test that takes a few weeks to come back. 1077 00:53:53,560 --> 00:53:55,602 So we still struggle with-- we call those TPADs-- 1078 00:53:55,602 --> 00:53:56,727 tests pending at discharge. 1079 00:53:56,727 --> 00:53:58,730 We still struggle with some of those edge cases. 1080 00:53:58,730 --> 00:54:02,572 But I think in the core, we're pretty good at it. 1081 00:54:02,572 --> 00:54:04,780 PETER SZOLOVITS: So one of the things we talked about 1082 00:54:04,780 --> 00:54:10,360 is an experience I've had and you've probably had that-- 1083 00:54:10,360 --> 00:54:12,160 for example, a few years ago I was 1084 00:54:12,160 --> 00:54:16,120 working with the people who run the clinical labs at Mass 1085 00:54:16,120 --> 00:54:17,860 General. 1086 00:54:17,860 --> 00:54:22,480 And they run some ancient laboratory information systems 1087 00:54:22,480 --> 00:54:25,700 that, as you said, can add and subtract but not multiply 1088 00:54:25,700 --> 00:54:26,200 or divide. 1089 00:54:26,200 --> 00:54:27,680 ADAM WRIGHT: They can add and multiply, but not subtract 1090 00:54:27,680 --> 00:54:28,285 or divide. 1091 00:54:28,285 --> 00:54:28,785 Yes. 1092 00:54:28,785 --> 00:54:30,600 And it doesn't support negative numbers. 1093 00:54:30,600 --> 00:54:33,820 Only unsigned integers.
1094 00:54:33,820 --> 00:54:37,270 PETER SZOLOVITS: So there are these wonderful legacy systems 1095 00:54:37,270 --> 00:54:41,500 around that really create horrendous problems, 1096 00:54:41,500 --> 00:54:44,050 because if you try to build anything-- 1097 00:54:44,050 --> 00:54:47,710 I mean, even a risk prediction calculator-- 1098 00:54:47,710 --> 00:54:54,040 it really helps to be able to divide as well as multiply. 1099 00:54:54,040 --> 00:54:57,820 So we've struggled in that project. 1100 00:54:57,820 --> 00:55:01,930 And I'm sure you've had similar experiences with how do we 1101 00:55:01,930 --> 00:55:07,030 incorporate a decision support system into some 1102 00:55:07,030 --> 00:55:11,890 of this squeaky old technology that just doesn't support it? 1103 00:55:11,890 --> 00:55:13,555 So what's the right approach to that? 1104 00:55:13,555 --> 00:55:15,430 ADAM WRIGHT: There are a lot of architectures 1105 00:55:15,430 --> 00:55:17,240 and they all have pros and cons. 1106 00:55:17,240 --> 00:55:19,490 I'm not sure if any one of them is the right approach. 1107 00:55:19,490 --> 00:55:24,100 I think we often do favor building within these systems, whether it's the creaky old technology 1108 00:55:24,100 --> 00:55:25,590 or the new technology. 1109 00:55:25,590 --> 00:55:28,660 So Epic has a built-in rule engine. 1110 00:55:28,660 --> 00:55:32,110 That laboratory system you talked about has a basic calculation engine 1111 00:55:32,110 --> 00:55:35,260 with some significant limitations to it. 1112 00:55:35,260 --> 00:55:37,030 So where we can, we often will try 1113 00:55:37,030 --> 00:55:39,460 to build rules internally using these systems. 1114 00:55:39,460 --> 00:55:42,620 Those tend to have real-time availability of data, the best 1115 00:55:42,620 --> 00:55:45,220 ability to sort of push alerts to the person 1116 00:55:45,220 --> 00:55:48,335 right in their workflow and make those alerts actionable. 1117 00:55:48,335 --> 00:55:50,460 In cases where we can't do that-- like for example, 1118 00:55:50,460 --> 00:55:53,430 a model that's too complex to execute in the system-- 1119 00:55:53,430 --> 00:55:57,170 one thing that we've often done is run that model 1120 00:55:57,170 --> 00:55:58,470 against our data warehouse. 1121 00:55:58,470 --> 00:56:00,178 So we have a data warehouse that extracts 1122 00:56:00,178 --> 00:56:01,928 the data from the electronic health record 1123 00:56:01,928 --> 00:56:02,930 every night at midnight. 1124 00:56:02,930 --> 00:56:07,020 So if we don't need real-time data, it's possible to run-- 1125 00:56:07,020 --> 00:56:09,270 extract the data, run a model, and then actually write 1126 00:56:09,270 --> 00:56:12,890 a risk score or a flag back into the patient's record 1127 00:56:12,890 --> 00:56:14,930 that can then be shown to the clinician, 1128 00:56:14,930 --> 00:56:17,060 or used to drive an alert or something like that. 1129 00:56:17,060 --> 00:56:21,110 That works really well, except that a lot of things that 1130 00:56:21,110 --> 00:56:23,040 happen-- particularly in an inpatient setting, 1131 00:56:23,040 --> 00:56:24,230 like predicting sepsis-- 1132 00:56:24,230 --> 00:56:25,842 depend on real-time data. 1133 00:56:25,842 --> 00:56:27,050 Data that we need right away. 1134 00:56:27,050 --> 00:56:30,500 And so we run into the challenge where that particular approach 1135 00:56:30,500 --> 00:56:36,170 only works on a 24-hour kind of retrospective basis. 1136 00:56:36,170 --> 00:56:40,580 We have also developed systems that depend on messages.
1137 00:56:40,580 --> 00:56:42,830 So there's this-- HL7 is a standard format 1138 00:56:42,830 --> 00:56:45,260 for exchanging data with an electronic health record. 1139 00:56:45,260 --> 00:56:49,160 There are various versions and profiles of HL7. 1140 00:56:49,160 --> 00:56:50,660 But you can set up an infrastructure 1141 00:56:50,660 --> 00:56:53,780 that sits outside of the EHR and gets messages in real time 1142 00:56:53,780 --> 00:56:54,560 from the EHR. 1143 00:56:54,560 --> 00:56:59,000 It makes inferences and sends messages back into the EHR. 1144 00:56:59,000 --> 00:57:01,970 Increasingly, EHRs also do support kind 1145 00:57:01,970 --> 00:57:03,500 of web service approaches. 1146 00:57:03,500 --> 00:57:05,570 So that you can register a hook and say, 1147 00:57:05,570 --> 00:57:07,280 call my hook whenever this thing happens. 1148 00:57:07,280 --> 00:57:09,980 Or you can poll the EHR to get data out and use another web 1149 00:57:09,980 --> 00:57:11,720 service to write data back in. 1150 00:57:11,720 --> 00:57:14,850 That's worked really well for us. 1151 00:57:14,850 --> 00:57:20,030 You can also ask the EHR to embed an app that you develop. 1152 00:57:20,030 --> 00:57:21,800 So people here may have heard-- or should 1153 00:57:21,800 --> 00:57:23,540 hear at some point-- about SMART on FHIR, 1154 00:57:23,540 --> 00:57:27,560 which is an open kind of API that allows you to develop 1155 00:57:27,560 --> 00:57:29,840 an application and embed that application 1156 00:57:29,840 --> 00:57:31,580 into an electronic health record. 1157 00:57:31,580 --> 00:57:35,220 We've increasingly been building some of those applications. 1158 00:57:35,220 --> 00:57:37,110 The downside right now of the SMART apps 1159 00:57:37,110 --> 00:57:39,110 is that they're really good for reading data out 1160 00:57:39,110 --> 00:57:41,900 of the record and sort of visualizing or displaying it. 1161 00:57:41,900 --> 00:57:44,150 But they don't always have a lot of capability 1162 00:57:44,150 --> 00:57:47,600 to write data back into the record or take actions. 1163 00:57:47,600 --> 00:57:51,590 Most of the EHR vendors also have a proprietary approach, 1164 00:57:51,590 --> 00:57:52,520 like an app store. 1165 00:57:52,520 --> 00:57:54,270 So Epic calls theirs the App Orchard. 1166 00:57:54,270 --> 00:57:56,490 And most of the EHRs have something similar, 1167 00:57:56,490 --> 00:57:58,670 where you can join a developer program 1168 00:57:58,670 --> 00:58:01,460 and build an application. 1169 00:58:01,460 --> 00:58:04,910 And those are often more full-featured. 1170 00:58:04,910 --> 00:58:06,210 They tend to be proprietary. 1171 00:58:06,210 --> 00:58:08,020 So if you build one Epic app, you 1172 00:58:08,020 --> 00:58:10,228 have to then build a Cerner app and an Allscripts app 1173 00:58:10,228 --> 00:58:12,410 and an eClinicalWorks app separately. 1174 00:58:12,410 --> 00:58:16,760 There are often heavy fees for joining those programs, 1175 00:58:16,760 --> 00:58:18,470 although the EHR vendors-- 1176 00:58:18,470 --> 00:58:21,613 Epic in particular-- have lowered their prices a lot.
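A minimal sketch of the overnight warehouse-scoring pattern described a moment ago: pull the previous day's extract, score it, and stage a risk flag to be filed back to the chart. The database, table names, columns, cutoff, and the write-back step are all placeholders, not the actual Partners plumbing, and the model is assumed to have been trained elsewhere.

    # Hypothetical nightly batch scoring against a data-warehouse extract.
    import sqlite3
    import pickle

    def nightly_readmission_scores(db_path="warehouse.db", model_path="model.pkl"):
        with open(model_path, "rb") as f:
            model = pickle.load(f)                  # model trained elsewhere
        conn = sqlite3.connect(db_path)
        rows = conn.execute(
            "SELECT patient_id, age, num_prior_admits, num_active_meds "
            "FROM inpatient_snapshot"               # hypothetical extract table
        ).fetchall()
        results = []
        for patient_id, *features in rows:
            score = float(model.predict_proba([features])[0][1])
            results.append((patient_id, score, score > 0.3))  # illustrative cutoff
        conn.close()
        # In practice the flag would be filed back through an interface or API
        # so it appears in the chart; here we simply return it.
        return results

As Adam notes, this only works for decisions that tolerate a 24-hour lag; anything like sepsis prediction needs the real-time message or web-service routes described next.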
1177 00:58:21,613 --> 00:58:23,030 The federal government, the Office 1178 00:58:23,030 --> 00:58:24,767 of the National Coordinator of Health IT, 1179 00:58:24,767 --> 00:58:27,350 just about a week and a half ago released some new regulations 1180 00:58:27,350 --> 00:58:32,600 which really limit the rate at which vendors can charge 1181 00:58:32,600 --> 00:58:36,050 application developers for API access 1182 00:58:36,050 --> 00:58:38,450 basically to almost nothing, except for 1183 00:58:38,450 --> 00:58:42,020 incremental computation costs or special support. 1184 00:58:42,020 --> 00:58:43,893 So I think that may change everything now 1185 00:58:43,893 --> 00:58:45,560 that that regulation's been promulgated. 1186 00:58:45,560 --> 00:58:47,000 So we'll see. 1187 00:58:47,000 --> 00:58:51,020 PETER SZOLOVITS: So contrary to my pessimistic beginning, 1188 00:58:51,020 --> 00:58:54,830 this actually is the thing that makes me most optimistic. 1189 00:58:54,830 --> 00:58:57,320 That even five years ago, if you looked 1190 00:58:57,320 --> 00:59:02,360 at many of these systems, they essentially locked you out. 1191 00:59:02,360 --> 00:59:06,350 I remember in the early 2000s, I was 1192 00:59:06,350 --> 00:59:09,200 at the University of Pittsburgh, where 1193 00:59:09,200 --> 00:59:13,490 they had one of the first centers that was 1194 00:59:13,490 --> 00:59:17,210 doing heart-lung transplants. 1195 00:59:17,210 --> 00:59:22,370 So their people had built a special application 1196 00:59:22,370 --> 00:59:26,420 for supporting heart-lung transplant patients, 1197 00:59:26,420 --> 00:59:30,440 in their own homemade electronic medical records system. 1198 00:59:30,440 --> 00:59:35,580 And then UPMC went to Cerner at the time. 1199 00:59:35,580 --> 00:59:38,110 And I remember I was at some meeting 1200 00:59:38,110 --> 00:59:41,660 where the doctors who ran this heart-lung transplant unit 1201 00:59:41,660 --> 00:59:45,260 were talking to the Cerner people and saying, 1202 00:59:45,260 --> 00:59:49,310 how could we get something to support our special needs 1203 00:59:49,310 --> 00:59:50,960 for our patients? 1204 00:59:50,960 --> 00:59:53,840 And Cerner's answer was, well, commercially it doesn't 1205 00:59:53,840 --> 00:59:55,340 make sense for us to do this. 1206 00:59:55,340 --> 00:59:58,610 Because at the time there were like four hospitals 1207 00:59:58,610 --> 01:00:00,820 in the country that did this. 1208 01:00:00,820 --> 01:00:04,010 And so it's not a big money maker. 1209 01:00:04,010 --> 01:00:07,280 So their offer was, well, you pay us an extra $3 million 1210 01:00:07,280 --> 01:00:12,290 and within three years we will develop 1211 01:00:12,290 --> 01:00:15,040 the appropriate software for you. 1212 01:00:15,040 --> 01:00:17,000 So that's just crazy, right? 1213 01:00:17,000 --> 01:00:20,000 I mean, that's a totally untenable way 1214 01:00:20,000 --> 01:00:21,590 of going about things. 1215 01:00:21,590 --> 01:00:25,040 And now that there are systematic ways for you 1216 01:00:25,040 --> 01:00:29,430 either to embed your own code into one of these systems, 1217 01:00:29,430 --> 01:00:33,200 or at least to have a well-documented, reasonable way 1218 01:00:33,200 --> 01:00:36,440 of feeding data out and then feeding results back 1219 01:00:36,440 --> 01:00:41,130 into the system, that makes it possible to do 1220 01:00:41,130 --> 01:00:44,790 special-purpose applications like this. 1221 01:00:44,790 --> 01:00:48,270 Or experimental applications or all kinds of novel things. 
1222 01:00:48,270 --> 01:00:49,420 So that's great. 1223 01:00:49,420 --> 01:00:51,420 ADAM WRIGHT: That's what we're optimistic about. 1224 01:00:51,420 --> 01:00:54,867 And I think it's worth adding that there's two barriers you 1225 01:00:54,867 --> 01:00:55,950 have to get through right. 1226 01:00:55,950 --> 01:00:57,540 One is Epic has to sort of let you 1227 01:00:57,540 --> 01:01:00,163 into their App Orchard, which is the barrier that 1228 01:01:00,163 --> 01:01:01,080 is increasingly lower. 1229 01:01:01,080 --> 01:01:03,288 And then you need to find a hospital or a health care 1230 01:01:03,288 --> 01:01:05,400 provider that wants to use your app, right. 1231 01:01:05,400 --> 01:01:08,190 So you have to clear both of those, 1232 01:01:08,190 --> 01:01:10,320 but I think it's increasingly possible. 1233 01:01:10,320 --> 01:01:11,820 You've got smart people here at MIT, 1234 01:01:11,820 --> 01:01:17,047 or at the hospitals that we have in Boston always wanting 1235 01:01:17,047 --> 01:01:17,880 to build these apps. 1236 01:01:17,880 --> 01:01:21,300 And I would say five years ago we would've told people, sorry, 1237 01:01:21,300 --> 01:01:22,150 it's not possible. 1238 01:01:22,150 --> 01:01:24,025 And today we're able, usually, to tell people 1239 01:01:24,025 --> 01:01:26,190 that if there's clinical interest, 1240 01:01:26,190 --> 01:01:28,180 the technical part will fall into place. 1241 01:01:28,180 --> 01:01:30,165 So that's exciting for us. 1242 01:01:30,165 --> 01:01:31,170 PETER SZOLOVITS: Yeah 1243 01:01:31,170 --> 01:01:31,878 ADAM WRIGHT: Yeah 1244 01:01:31,878 --> 01:01:33,240 AUDIENCE: Question about that. 1245 01:01:33,240 --> 01:01:33,570 ADAM WRIGHT: Absolutely 1246 01:01:33,570 --> 01:01:35,362 AUDIENCE: Some of the applications that you 1247 01:01:35,362 --> 01:01:37,650 guys develop in house, do you also 1248 01:01:37,650 --> 01:01:40,150 put those on the Epic Orchard, or do you 1249 01:01:40,150 --> 01:01:42,827 just sort of implement it one time within your own system? 1250 01:01:42,827 --> 01:01:44,910 ADAM WRIGHT: Yeah, there's a lot of different ways 1251 01:01:44,910 --> 01:01:46,850 that we share these applications, right. 1252 01:01:46,850 --> 01:01:48,270 So a lot of us are researchers. 1253 01:01:48,270 --> 01:01:50,370 So we will release an open source 1254 01:01:50,370 --> 01:01:54,240 version of the application or write a paper and say, 1255 01:01:54,240 --> 01:01:54,990 this is available. 1256 01:01:54,990 --> 01:01:56,340 And we'll share it with you. 1257 01:01:56,340 --> 01:01:59,550 The App Orchard is particularly focused on applications 1258 01:01:59,550 --> 01:02:00,945 that you want to sell. 1259 01:02:00,945 --> 01:02:02,820 So our hospital hasn't decided that we wanted 1260 01:02:02,820 --> 01:02:03,900 to sell any applications. 1261 01:02:03,900 --> 01:02:05,642 We've given a lot of applications away. 1262 01:02:05,642 --> 01:02:08,100 Epic also has something called the Community Library, which 1263 01:02:08,100 --> 01:02:11,465 is like the AppOrchard, but it's free instead of costing money. 1264 01:02:11,465 --> 01:02:12,840 And so we released a ton of stuff 1265 01:02:12,840 --> 01:02:15,440 through the Community Library. 1266 01:02:15,440 --> 01:02:18,420 To the point that I was poking out before, 1267 01:02:18,420 --> 01:02:21,780 one of the challenges is that if we build a Smart on FHIR app, 1268 01:02:21,780 --> 01:02:23,750 we're able to sort of share that publicly. 
1269 01:02:23,750 --> 01:02:25,917 And we can post that on the web or put it on GitHub. 1270 01:02:25,917 --> 01:02:27,480 And anybody can use it. 1271 01:02:27,480 --> 01:02:34,710 Epic has a position that their APIs are proprietary. 1272 01:02:34,710 --> 01:02:37,230 And they represent Epic's valuable intellectual property 1273 01:02:37,230 --> 01:02:38,280 or trade secrets. 1274 01:02:38,280 --> 01:02:41,640 And so we're only allowed to share those apps 1275 01:02:41,640 --> 01:02:44,200 through the Epic ecosystem. 1276 01:02:44,200 --> 01:02:46,300 And so, we often now, when we get a grant-- 1277 01:02:46,300 --> 01:02:47,850 most of my work is through grants-- 1278 01:02:47,850 --> 01:02:49,230 we'll have an Epic site. 1279 01:02:49,230 --> 01:02:50,790 And we'll share that through the Community Library. 1280 01:02:50,790 --> 01:02:51,998 And we'll have a Cerner site. 1281 01:02:51,998 --> 01:02:54,030 And we'll share it through Cerner's equivalent. 1282 01:02:54,030 --> 01:02:57,830 But I think until the capability of the open APIs, 1283 01:02:57,830 --> 01:03:00,030 like Smart on FHIR, reaches the same level 1284 01:03:00,030 --> 01:03:02,365 as the proprietary APIs, we're still somewhat locked 1285 01:03:02,365 --> 01:03:03,990 into having to build different versions 1286 01:03:03,990 --> 01:03:06,660 and distribute them through each EHR's separate channels. 1287 01:03:06,660 --> 01:03:08,645 Really, really good question. 1288 01:03:08,645 --> 01:03:10,800 PETER SZOLOVITS: And so what's lacking 1289 01:03:10,800 --> 01:03:13,080 in things like Smart on FHIR-- 1290 01:03:13,080 --> 01:03:14,055 ADAM WRIGHT: Yeah. 1291 01:03:14,055 --> 01:03:16,513 PETER SZOLOVITS: --that you get from the native interfaces? 1292 01:03:16,513 --> 01:03:19,290 ADAM WRIGHT: So it's very situational, right. 1293 01:03:19,290 --> 01:03:22,680 So, for example, in some EHR implementations, 1294 01:03:22,680 --> 01:03:25,063 the Smart on FHIR API will give you a list of the patient's 1295 01:03:25,063 --> 01:03:26,730 current medications but may not give you 1296 01:03:26,730 --> 01:03:28,020 historical medications. 1297 01:03:28,020 --> 01:03:30,240 Or it will tell you that the medicine is ordered, 1298 01:03:30,240 --> 01:03:32,620 but it won't tell you whether it's been administered. 1299 01:03:32,620 --> 01:03:36,880 So one half of the battle is less complete data. 1300 01:03:36,880 --> 01:03:40,920 The other one is that most EHRs are not 1301 01:03:40,920 --> 01:03:42,990 implementing, at this point, the sort 1302 01:03:42,990 --> 01:03:47,220 of write-back capabilities, or the actionable capabilities, 1303 01:03:47,220 --> 01:03:49,050 that Smart on FHIR is sort of working on. 1304 01:03:49,050 --> 01:03:50,633 And it's really a standards issue for us. 1305 01:03:50,633 --> 01:03:52,530 So if we want to build an application that 1306 01:03:52,530 --> 01:03:55,325 shows how a patient fits on a growth curve, that's fine. 1307 01:03:55,325 --> 01:03:56,950 If we want to build an application that 1308 01:03:56,950 --> 01:04:00,040 suggests ordering medicines, that can be really challenging. 1309 01:04:00,040 --> 01:04:03,180 Whereas the internal APIs that the vendors provide typically 1310 01:04:03,180 --> 01:04:04,940 have both read and write capabilities. 1311 01:04:04,940 --> 01:04:05,960 So that's the other challenge.
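For readers who have not seen SMART on FHIR, here is a minimal sketch of the read side Adam is describing: once an app has an access token from the OAuth2 launch, it reads standard FHIR resources over REST. The endpoint, patient id, and token below are placeholders; MedicationRequest is a standard FHIR resource, but exactly which fields a given EHR populates varies by site, which is Adam's point.

    # Minimal FHIR read of the kind a SMART on FHIR app performs.
    import requests

    FHIR_BASE = "https://ehr.example.org/fhir"     # placeholder endpoint
    TOKEN = "..."                                  # obtained via the SMART launch
    PATIENT_ID = "12345"                           # placeholder patient id

    resp = requests.get(
        f"{FHIR_BASE}/MedicationRequest",
        params={"patient": PATIENT_ID, "status": "active"},
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Accept": "application/fhir+json"},
    )
    resp.raise_for_status()
    bundle = resp.json()                            # a FHIR Bundle resource

    for entry in bundle.get("entry", []):
        med = entry["resource"]
        text = med.get("medicationCodeableConcept", {}).get("text", "unknown")
        print(med.get("status"), text)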
1312 01:04:05,960 --> 01:04:07,418 PETER SZOLOVITS: And do the vendors 1313 01:04:07,418 --> 01:04:11,117 worry about, I guess two related things, 1314 01:04:11,117 --> 01:04:13,595 one is sort of cognitive overload. 1315 01:04:13,595 --> 01:04:16,770 Because if you build 1,000 Smart on FHIR apps, 1316 01:04:16,770 --> 01:04:20,040 and they all start firing for these inpatients, 1317 01:04:20,040 --> 01:04:22,260 you're going to be back in the same situation 1318 01:04:22,260 --> 01:04:24,270 of over-alerting. 1319 01:04:24,270 --> 01:04:27,520 And the other question is, are they worried about liability? 1320 01:04:27,520 --> 01:04:30,270 Since if you were using their system 1321 01:04:30,270 --> 01:04:35,040 to display recommendations, and those recommendations turn out 1322 01:04:35,040 --> 01:04:37,350 to be wrong and harm some patient, 1323 01:04:37,350 --> 01:04:39,620 then somebody will reach out to them 1324 01:04:39,620 --> 01:04:41,440 legally because they have a lot of money. 1325 01:04:41,440 --> 01:04:42,440 ADAM WRIGHT: Absolutely. 1326 01:04:42,440 --> 01:04:44,065 They're worried about both of those. 1327 01:04:44,065 --> 01:04:45,690 Related particularly to the second one, 1328 01:04:45,690 --> 01:04:48,450 they're also worried about just sort of corruption or integrity 1329 01:04:48,450 --> 01:04:49,500 of the data, right. 1330 01:04:49,500 --> 01:04:52,140 So somehow if I can write a medication order directly 1331 01:04:52,140 --> 01:04:55,920 to the database, and it may bypass certain checks 1332 01:04:55,920 --> 01:04:57,390 that would be done normally. 1333 01:04:57,390 --> 01:05:02,500 And I could potentially enter a wrong or dangerous order. 1334 01:05:02,500 --> 01:05:04,560 The other thing that we're increasingly hearing 1335 01:05:04,560 --> 01:05:08,280 is concerns about protection of data, sort of Cambridge 1336 01:05:08,280 --> 01:05:10,470 Analytica style worries, right. 1337 01:05:10,470 --> 01:05:15,390 So if I, as an Epic patient, authorize the Words 1338 01:05:15,390 --> 01:05:17,910 With Friends app to see my medical record, 1339 01:05:17,910 --> 01:05:20,190 and then they post that on the web, 1340 01:05:20,190 --> 01:05:23,370 or monetize it in some sort of a tricky way, 1341 01:05:23,370 --> 01:05:25,320 what liability, if any, does my health care 1342 01:05:25,320 --> 01:05:27,826 provider organization, or my-- 1343 01:05:27,826 --> 01:05:30,130 the EHR vendor, have for that? 1344 01:05:30,130 --> 01:05:32,340 And the new regulations are extremely strict, right. 1345 01:05:32,340 --> 01:05:36,120 They say that if a patient asks you to, and authorizes an app 1346 01:05:36,120 --> 01:05:40,560 to access their record, you may not block that access, 1347 01:05:40,560 --> 01:05:42,770 even if you consider that app to be a bad actor. 1348 01:05:42,770 --> 01:05:46,760 So that's I think an area of liability that is just 1349 01:05:46,760 --> 01:05:47,930 beginning to be sorted out. 1350 01:05:47,930 --> 01:05:50,857 And it is, I think, some cause for concern. 1351 01:05:50,857 --> 01:05:52,940 But at the same time, you could imagine a universe 1352 01:05:52,940 --> 01:05:55,370 where, I think, there are conservative health 1353 01:05:55,370 --> 01:05:57,980 organizations that would choose to never authorize 1354 01:05:57,980 --> 01:06:01,230 any application to avoid risk. 1355 01:06:01,230 --> 01:06:03,980 So how you balance that is not yet solved. 1356 01:06:03,980 --> 01:06:05,855 PETER SZOLOVITS: Well-- and to avoid leakage. 
1357 01:06:05,855 --> 01:06:06,855 ADAM WRIGHT: Absolutely. 1358 01:06:06,855 --> 01:06:09,050 PETER SZOLOVITS: So I remember years ago there 1359 01:06:09,050 --> 01:06:13,160 was a lot of reluctance, even among Boston area hospitals, 1360 01:06:13,160 --> 01:06:15,140 to share data, because they were worried 1361 01:06:15,140 --> 01:06:17,480 that another hospital could cherry 1362 01:06:17,480 --> 01:06:21,950 pick their most lucrative patients by figuring out 1363 01:06:21,950 --> 01:06:23,000 something about them. 1364 01:06:23,000 --> 01:06:26,300 So I'm sure that that hasn't gone away as a concern. 1365 01:06:26,300 --> 01:06:27,550 ADAM WRIGHT: Absolutely, yeah. 1366 01:06:27,550 --> 01:06:30,710 PETER SZOLOVITS: OK, we're going to try to remember to repeat 1367 01:06:30,710 --> 01:06:31,960 the questions you're asking-- 1368 01:06:31,960 --> 01:06:33,290 ADAM WRIGHT: Oh great, OK. 1369 01:06:33,290 --> 01:06:34,680 PETER SZOLOVITS: --because of the recording. 1370 01:06:34,680 --> 01:06:35,597 ADAM WRIGHT: Happy to. 1371 01:06:35,597 --> 01:06:36,575 PETER SZOLOVITS: Yeah. 1372 01:06:36,575 --> 01:06:39,230 AUDIENCE: So how does a third party vendor 1373 01:06:39,230 --> 01:06:42,530 deploy a machine learning model on your system? 1374 01:06:42,530 --> 01:06:44,530 So is that done through Epic? 1375 01:06:44,530 --> 01:06:46,700 Obviously, there's the App Orchard kind of thing, 1376 01:06:46,700 --> 01:06:49,160 but are there ways to go around that and go directly 1377 01:06:49,160 --> 01:06:50,650 into Partners and whatnot? 1378 01:06:50,650 --> 01:06:51,150 And how does that work? 1379 01:06:51,150 --> 01:06:51,530 ADAM WRIGHT: Yeah. 1380 01:06:51,530 --> 01:06:53,630 So the question is how does a third party vendor 1381 01:06:53,630 --> 01:06:55,640 deploy an application or a machine 1382 01:06:55,640 --> 01:06:57,290 learning model or something like that? 1383 01:06:57,290 --> 01:07:01,550 And so with Epic, there's always a relationship 1384 01:07:01,550 --> 01:07:05,420 between the vendor of the application and the health care 1385 01:07:05,420 --> 01:07:06,630 provider organization. 1386 01:07:06,630 --> 01:07:09,510 And so we could work together directly. 1387 01:07:09,510 --> 01:07:12,140 So if you had an app that the Brigham wanted to use, 1388 01:07:12,140 --> 01:07:16,160 you could share that app with us in a number of ways. 1389 01:07:16,160 --> 01:07:19,550 So Epic supports this thing called Predictive Model 1390 01:07:19,550 --> 01:07:21,230 Markup Language, or PMML. 1391 01:07:21,230 --> 01:07:24,160 So if you train a model, you can export a PMML model. 1392 01:07:24,160 --> 01:07:26,990 And I can import it into Epic and run it natively. 1393 01:07:26,990 --> 01:07:31,320 Or you can produce a web service that I call out to and that gives me 1394 01:07:31,320 --> 01:07:31,820 an answer. 1395 01:07:31,820 --> 01:07:34,320 We could work together directly. 1396 01:07:34,320 --> 01:07:36,620 However, there are some limitations 1397 01:07:36,620 --> 01:07:39,770 in what I'm allowed to tell you or share with you about Epic's 1398 01:07:39,770 --> 01:07:41,840 data model and what Epic perceives to be 1399 01:07:41,840 --> 01:07:43,530 their intellectual property. 1400 01:07:43,530 --> 01:07:48,300 And it is facilitated by you joining this program. 1401 01:07:48,300 --> 01:07:49,700 Because if you join this program, 1402 01:07:49,700 --> 01:07:52,200 you get access to documentation that you would otherwise not 1403 01:07:52,200 --> 01:07:53,223 have access to.
1404 01:07:53,223 --> 01:07:55,640 You may get access to a test harness or a test system that 1405 01:07:55,640 --> 01:07:57,930 lets you sort of validate your work. 1406 01:07:57,930 --> 01:07:59,720 However, people who join the program often 1407 01:07:59,720 --> 01:08:01,430 think that means that I can then just 1408 01:08:01,430 --> 01:08:03,050 run my app at every customer, right. 1409 01:08:03,050 --> 01:08:04,820 But with Epic, in particular, you 1410 01:08:04,820 --> 01:08:08,240 have to then make a deal with me to use it at the Brigham 1411 01:08:08,240 --> 01:08:11,750 and make a deal with my colleague to use it at Stanford. 1412 01:08:11,750 --> 01:08:14,120 Other EHR vendors have developed a more sort 1413 01:08:14,120 --> 01:08:16,069 of centralized model where you can actually 1414 01:08:16,069 --> 01:08:17,450 release it and sell it, and I can 1415 01:08:17,450 --> 01:08:21,290 pay for it directly through the app store and integrate it. 1416 01:08:21,290 --> 01:08:23,180 I think that last mile piece hasn't really 1417 01:08:23,180 --> 01:08:24,729 been standardized yet. 1418 01:08:24,729 --> 01:08:26,271 AUDIENCE: I guess one of my questions 1419 01:08:26,271 --> 01:08:28,040 there is, what happens in the case 1420 01:08:28,040 --> 01:08:30,020 that I don't want to talk to Epic at all? 1421 01:08:30,020 --> 01:08:32,600 And I just looked at your data, just 1422 01:08:32,600 --> 01:08:34,217 the Brigham and Women's stuff. 1423 01:08:34,217 --> 01:08:35,550 And I built a really good model. 1424 01:08:35,550 --> 01:08:38,060 You saw how it works, and we just want to deploy it. 1425 01:08:38,060 --> 01:08:40,410 ADAM WRIGHT: Epic would not stop us from doing that. 1426 01:08:40,410 --> 01:08:42,260 The only real restriction is that Epic 1427 01:08:42,260 --> 01:08:45,710 would limit my ability to tell you stuff about Epic's guts. 1428 01:08:45,710 --> 01:08:48,490 And so you would need a relatively sophisticated health 1429 01:08:48,490 --> 01:08:51,240 care provider organization who could 1430 01:08:51,240 --> 01:08:54,229 map between some kind of platonic clinical data 1431 01:08:54,229 --> 01:08:56,609 model and Epic's internal data model. 1432 01:08:56,609 --> 01:08:58,560 But if you had that, you could. 1433 01:08:58,560 --> 01:09:01,670 And at the Brigham, we have this iHub Innovation Program. 1434 01:09:01,670 --> 01:09:04,880 And we're probably working with 50 to 100 startups 1435 01:09:04,880 --> 01:09:06,710 doing work like that, some of whom 1436 01:09:06,710 --> 01:09:08,600 are members of the Epic App Orchard 1437 01:09:08,600 --> 01:09:10,905 and some who choose not to be members of the Epic App 1438 01:09:10,905 --> 01:09:11,210 Orchard. 1439 01:09:11,210 --> 01:09:12,793 It's worth saying that joining the App 1440 01:09:12,793 --> 01:09:14,870 Orchard or these programs entails revenue sharing 1441 01:09:14,870 --> 01:09:16,609 with Epic and some complexity. 1442 01:09:16,609 --> 01:09:18,745 That may go way down with these new regulations. 1443 01:09:18,745 --> 01:09:20,120 But right now, some organizations 1444 01:09:20,120 --> 01:09:22,037 have chosen not to partner with the vendors 1445 01:09:22,037 --> 01:09:23,620 and work directly with the health care 1446 01:09:23,620 --> 01:09:24,578 provider organizations.
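The PMML route Adam mentions can be sketched as follows. One open-source way to produce a PMML file from a scikit-learn model is the sklearn2pmml package, which shells out to a Java converter under the hood; the feature set and model below are toy placeholders, and whether and how a given EHR installation actually imports the resulting file is a site-specific question.

    # Hypothetical PMML export of a simple risk model using sklearn2pmml.
    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn2pmml import sklearn2pmml
    from sklearn2pmml.pipeline import PMMLPipeline

    X, y = load_breast_cancer(return_X_y=True)      # stand-in training data
    pipeline = PMMLPipeline([("classifier", LogisticRegression(max_iter=1000))])
    pipeline.fit(X, y)

    # Writes an XML file describing the fitted model; this is the artifact
    # that would be handed to the EHR team for import.
    sklearn2pmml(pipeline, "readmission_risk.pmml")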
1447 01:09:24,578 --> 01:09:27,470 PETER SZOLOVITS: So on the quality side of that question, 1448 01:09:27,470 --> 01:09:31,970 if you do develop an application and field it at the Brigham, 1449 01:09:31,970 --> 01:09:35,330 will Stanford be interested in taking it? 1450 01:09:35,330 --> 01:09:37,580 Or are they going to be concerned about the fact 1451 01:09:37,580 --> 01:09:40,910 that somehow you've fit it to the patient population 1452 01:09:40,910 --> 01:09:43,575 in Boston, and it won't be appropriate to their data? 1453 01:09:43,575 --> 01:09:45,950 ADAM WRIGHT: Yeah, I think that's a fundamental question, 1454 01:09:45,950 --> 01:09:48,859 right, is to what extent do these models generalize, right? 1455 01:09:48,859 --> 01:09:50,990 Can you train a model at one place 1456 01:09:50,990 --> 01:09:52,700 and transfer it to another place? 1457 01:09:52,700 --> 01:09:54,950 We've generally seen that many of them 1458 01:09:54,950 --> 01:09:56,240 transfer pretty well, right. 1459 01:09:56,240 --> 01:09:58,370 So if they really have more to do 1460 01:09:58,370 --> 01:10:00,080 with kind of core human physiology, 1461 01:10:00,080 --> 01:10:02,420 that can be pretty similar between organizations. 1462 01:10:02,420 --> 01:10:04,700 If they're really bound up in a particular workflow, 1463 01:10:04,700 --> 01:10:07,550 right, they assume that you're doing this task, this task, 1464 01:10:07,550 --> 01:10:10,360 this task in this order, they tend to transfer really, really 1465 01:10:10,360 --> 01:10:10,950 poorly. 1466 01:10:10,950 --> 01:10:13,250 So I would say that our general approach has 1467 01:10:13,250 --> 01:10:15,230 been to take a model that somebody has, 1468 01:10:15,230 --> 01:10:17,158 run it retrospectively on our data warehouse, 1469 01:10:17,158 --> 01:10:18,200 and see if it's accurate. 1470 01:10:18,200 --> 01:10:19,950 And if it is, we might go forward with it. 1471 01:10:19,950 --> 01:10:22,340 If it's not, we would try to retrain it on our data, 1472 01:10:22,340 --> 01:10:23,840 and then see how much improvement we 1473 01:10:23,840 --> 01:10:24,963 get by retraining it. 1474 01:10:24,963 --> 01:10:26,630 PETER SZOLOVITS: And so have you in fact 1475 01:10:26,630 --> 01:10:28,745 imported such models from other places? 1476 01:10:28,745 --> 01:10:29,870 ADAM WRIGHT: We have, yeah. 1477 01:10:29,870 --> 01:10:32,140 Epic provides five or six models. 1478 01:10:32,140 --> 01:10:35,300 And we've just started using some of them at the Brigham 1479 01:10:35,300 --> 01:10:37,800 or just kind of signed the license to begin using them. 1480 01:10:37,800 --> 01:10:40,920 And I think Epic's guidance and our experience 1481 01:10:40,920 --> 01:10:44,097 is that they work pretty well out of the box. 1482 01:10:44,097 --> 01:10:45,590 PETER SZOLOVITS: Great. 1483 01:10:45,590 --> 01:10:47,423 AUDIENCE: So could you say a little bit more 1484 01:10:47,423 --> 01:10:49,660 about these risk scores that are being deployed, 1485 01:10:49,660 --> 01:10:50,390 maybe they work. 1486 01:10:50,390 --> 01:10:51,345 Maybe they don't. 1487 01:10:51,345 --> 01:10:54,400 How can you really tell whether they're working, even 1488 01:10:54,400 --> 01:10:57,170 just beyond patient shift over time, 1489 01:10:57,170 --> 01:10:58,902 just like how people react to the scores. 1490 01:10:58,902 --> 01:11:00,860 Like I know a lot of the bias and fairness work 1491 01:11:00,860 --> 01:11:03,625 is like, people, if a score agrees with their intuition, 1492 01:11:03,625 --> 01:11:04,590 they'll trust it.
1493 01:11:04,590 --> 01:11:06,560 And if it doesn't, they ignore the score. 1494 01:11:06,560 --> 01:11:08,540 So like how-- what does the process look 1495 01:11:08,540 --> 01:11:11,022 like before you deploy the score thing 1496 01:11:11,022 --> 01:11:12,730 and then see whether it's working or not? 1497 01:11:12,730 --> 01:11:13,260 ADAM WRIGHT: Yeah, absolutely. 1498 01:11:13,260 --> 01:11:14,927 So the question is, we get a risk score, 1499 01:11:14,927 --> 01:11:16,660 or we deploy a new risk score that says, 1500 01:11:16,660 --> 01:11:18,368 patient has a risk of falling, or patient 1501 01:11:18,368 --> 01:11:20,830 has a risk of having sepsis or something like that. 1502 01:11:20,830 --> 01:11:22,960 We tend to do several levels of evaluation, right. 1503 01:11:22,960 --> 01:11:25,085 So the first level is, when we show the score, what 1504 01:11:25,085 --> 01:11:25,990 do people do, right? 1505 01:11:25,990 --> 01:11:27,940 If we-- typically we don't just show a score, 1506 01:11:27,940 --> 01:11:28,982 we make a recommendation. 1507 01:11:28,982 --> 01:11:30,540 We say, based on the score we think 1508 01:11:30,540 --> 01:11:32,665 you should order a lactate to see if the patient is 1509 01:11:32,665 --> 01:11:33,772 at risk of having sepsis. 1510 01:11:33,772 --> 01:11:35,980 First we look to see if people do what we say, right. 1511 01:11:35,980 --> 01:11:38,890 So we think it's a good sign if people follow the suggestions. 1512 01:11:38,890 --> 01:11:40,678 But ultimately, we view ourselves 1513 01:11:40,678 --> 01:11:42,220 as sort of clinical trialists, right. 1514 01:11:42,220 --> 01:11:45,340 So we deploy this model with an intent to move something, 1515 01:11:45,340 --> 01:11:48,440 to reduce the rate of sepsis, or to reduce the rate of mortality 1516 01:11:48,440 --> 01:11:48,940 in sepsis. 1517 01:11:48,940 --> 01:11:51,652 And so we would try to sort of measure, if nothing else, 1518 01:11:51,652 --> 01:11:53,110 do a before and after study, right, 1519 01:11:53,110 --> 01:11:55,510 measure the rates before, implement this intervention, 1520 01:11:55,510 --> 01:11:57,580 and measure the rates after. 1521 01:11:57,580 --> 01:11:59,823 In cases where we're less sure, or where we really 1522 01:11:59,823 --> 01:12:01,240 care about the results, we'll even 1523 01:12:01,240 --> 01:12:02,448 do a randomized trial, right. 1524 01:12:02,448 --> 01:12:04,912 So half of the units will get the alert, 1525 01:12:04,912 --> 01:12:06,370 half the units won't get the alert. 1526 01:12:06,370 --> 01:12:08,890 And we'll compare the effect on a clinical outcome 1527 01:12:08,890 --> 01:12:10,310 and see what the difference is. 1528 01:12:10,310 --> 01:12:12,910 In our opinion, unless we can show an effect 1529 01:12:12,910 --> 01:12:16,180 on these clinical measures, we shouldn't 1530 01:12:16,180 --> 01:12:17,360 be bothering people, right. 1531 01:12:17,360 --> 01:12:19,600 Pete made this point that what's the purpose 1532 01:12:19,600 --> 01:12:21,152 of having-- if we have 1,000 alerts, 1533 01:12:21,152 --> 01:12:22,360 everyone will be overwhelmed. 1534 01:12:22,360 --> 01:12:23,700 So we should only keep alerts on if we 1535 01:12:23,700 --> 01:12:25,090 can show that they're making a real clinical difference. 1536 01:12:25,090 --> 01:12:27,520 AUDIENCE: And are those sort of like just internal checks, 1537 01:12:27,520 --> 01:12:30,160 are there papers of some of these deployments?
1538 01:12:30,160 --> 01:12:31,180 ADAM WRIGHT: It's our-- 1539 01:12:31,180 --> 01:12:32,860 it's our intent to publish everything, right. 1540 01:12:32,860 --> 01:12:34,200 I mean, I think we're behind. 1541 01:12:34,200 --> 01:12:35,915 But I'd say, we publish everything. 1542 01:12:35,915 --> 01:12:37,540 We have some things that we've finished 1543 01:12:37,540 --> 01:12:38,790 that we haven't published yet. 1544 01:12:38,790 --> 01:12:40,950 They're sort of the next thing to sort of come out. 1545 01:12:40,950 --> 01:12:43,290 Yeah. 1546 01:12:43,290 --> 01:12:45,470 AUDIENCE: I guess, so earlier we were talking 1547 01:12:45,470 --> 01:12:50,040 about how the models are just used to give recommendations 1548 01:12:50,040 --> 01:12:51,620 to doctors. 1549 01:12:51,620 --> 01:12:55,340 Do you have any metric, in terms of how often the model 1550 01:12:55,340 --> 01:12:58,770 recommendation matches with the doctor's decision? 1551 01:12:58,770 --> 01:13:00,020 ADAM WRIGHT: Yeah, absolutely. 1552 01:13:00,020 --> 01:13:00,530 AUDIENCE: Can you repeat the question? 1553 01:13:00,530 --> 01:13:01,150 ADAM WRIGHT: Oh yeah. 1554 01:13:01,150 --> 01:13:01,730 Thanks, David. 1555 01:13:01,730 --> 01:13:03,050 So the question is, do we ever check 1556 01:13:03,050 --> 01:13:05,092 to see how often the model recommendation matches 1557 01:13:05,092 --> 01:13:06,448 what the doctor does? 1558 01:13:06,448 --> 01:13:08,240 And so there's sort of two ways we do that. 1559 01:13:08,240 --> 01:13:11,240 We'll often retrospectively backtest the model. 1560 01:13:11,240 --> 01:13:12,920 I think Pete shared a paper from Cerner 1561 01:13:12,920 --> 01:13:14,840 where they looked at these sort of suggestions 1562 01:13:14,840 --> 01:13:16,310 that they made to order lactates or to do 1563 01:13:16,310 --> 01:13:17,393 other sort of sepsis work. 1564 01:13:17,393 --> 01:13:20,120 And they looked to see whether the recommendations that they 1565 01:13:20,120 --> 01:13:22,790 made matched what the doctors had actually done. 1566 01:13:22,790 --> 01:13:24,717 And they showed that they, in many cases, did. 1567 01:13:24,717 --> 01:13:26,550 So that'll be the first thing that we do is, 1568 01:13:26,550 --> 01:13:29,120 before we even turn the model on, we'll run it in silent mode 1569 01:13:29,120 --> 01:13:31,115 and see if the doctor does what we suggest. 1570 01:13:31,115 --> 01:13:33,240 Now the doctor is not a perfect source of supervision, right, 1571 01:13:33,240 --> 01:13:35,368 because the doctor may neglect to do something 1572 01:13:35,368 --> 01:13:36,410 that would be good to do. 1573 01:13:36,410 --> 01:13:38,120 So then when we turn it on, we actually 1574 01:13:38,120 --> 01:13:39,620 look to see whether the doctor takes 1575 01:13:39,620 --> 01:13:41,883 the action that we suggested. 1576 01:13:41,883 --> 01:13:43,800 And if we're doing it in this randomized mode, 1577 01:13:43,800 --> 01:13:45,950 we would then look to see whether the doctor takes 1578 01:13:45,950 --> 01:13:49,280 the action we suggested more often in the case where we show 1579 01:13:49,280 --> 01:13:52,280 the alert, than where we generate the alert but just 1580 01:13:52,280 --> 01:13:54,634 log it and don't-- don't show it. 1581 01:13:54,634 --> 01:13:55,134 Yeah. 1582 01:13:58,960 --> 01:13:59,750 Yes, sir? 1583 01:13:59,750 --> 01:14:05,020 AUDIENCE: So you'd mentioned something 1584 01:14:05,020 --> 01:14:06,505 kind of related to fatigue-- 1585 01:14:06,505 --> 01:14:06,800 ADAM WRIGHT: Yeah.
1586 01:14:06,800 --> 01:14:08,130 AUDIENCE: --if it's a code blue, these alarms will-- 1587 01:14:08,130 --> 01:14:09,080 ADAM WRIGHT: Right. 1588 01:14:09,080 --> 01:14:10,930 AUDIENCE: And you said that cockpits have-- 1589 01:14:10,930 --> 01:14:11,180 pilots now-- 1590 01:14:11,180 --> 01:14:11,610 ADAM WRIGHT: Yeah. 1591 01:14:11,610 --> 01:14:13,235 AUDIENCE: --that have similar problems. 1592 01:14:13,235 --> 01:14:15,690 My very limited understanding of aviation 1593 01:14:15,690 --> 01:14:18,120 is that if you're flying, say, below 10,000 feet, 1594 01:14:18,120 --> 01:14:19,120 then almost all of the-- 1595 01:14:19,120 --> 01:14:19,430 ADAM WRIGHT: Yeah. 1596 01:14:19,430 --> 01:14:20,890 AUDIENCE: --alarms get turned off, and-- 1597 01:14:20,890 --> 01:14:21,190 ADAM WRIGHT: Yeah. 1598 01:14:21,190 --> 01:14:22,630 AUDIENCE: --I don't know if there seems to be an analog 1599 01:14:22,630 --> 01:14:23,430 for that, for-- 1600 01:14:23,430 --> 01:14:23,740 ADAM WRIGHT: Yeah. 1601 01:14:23,740 --> 01:14:24,040 AUDIENCE: --hospitals yet. 1602 01:14:24,040 --> 01:14:26,340 And is that just because the technology workflow 1603 01:14:26,340 --> 01:14:28,395 is not mature enough yet, only 10 years old? 1604 01:14:28,395 --> 01:14:29,145 ADAM WRIGHT: Yeah. 1605 01:14:29,145 --> 01:14:31,145 AUDIENCE: Or is that kind of the team's question 1606 01:14:31,145 --> 01:14:34,960 about the incentives between if you build the tool 1607 01:14:34,960 --> 01:14:36,610 and it doesn't flag this thing-- 1608 01:14:36,610 --> 01:14:37,000 ADAM WRIGHT: Yeah. 1609 01:14:37,000 --> 01:14:38,440 AUDIENCE: --the patient dies, then they could get sued. 1610 01:14:38,440 --> 01:14:39,523 And so they're just very-- 1611 01:14:39,523 --> 01:14:41,290 ADAM WRIGHT: Yeah, no, we try, right? 1612 01:14:41,290 --> 01:14:43,660 But we often don't know about the situations 1613 01:14:43,660 --> 01:14:45,460 in a structured way in the EHR. 1614 01:14:45,460 --> 01:14:48,440 And most of our alerts are suppressed in the operating 1615 01:14:48,440 --> 01:14:48,940 room, right? 1616 01:14:48,940 --> 01:14:52,270 So when a patient is under anesthesia, 1617 01:14:52,270 --> 01:14:54,280 their physiology is being sort of manually 1618 01:14:54,280 --> 01:14:56,040 controlled by a doctor. 1619 01:14:56,040 --> 01:14:59,327 And so we often suppress the alerts in those situations. 1620 01:14:59,327 --> 01:15:01,660 I guess I didn't repeat the question, but the question was, 1621 01:15:01,660 --> 01:15:03,610 do we try to take situations into account, 1622 01:15:03,610 --> 01:15:05,010 and how much can we? 1623 01:15:05,010 --> 01:15:07,218 We didn't use to know that a code blue was going on, 1624 01:15:07,218 --> 01:15:09,593 because we used to do most of our code blue documentation 1625 01:15:09,593 --> 01:15:10,210 on paper. 1626 01:15:10,210 --> 01:15:11,810 We now use this code narrator, right? 1627 01:15:11,810 --> 01:15:13,150 So we can tell when a code blue starts 1628 01:15:13,150 --> 01:15:14,260 and when a code blue ends. 1629 01:15:14,260 --> 01:15:17,280 A code blue is a cardiac arrest and resuscitation of a patient. 1630 01:15:17,280 --> 01:15:18,970 And so we actually do increasingly 1631 01:15:18,970 --> 01:15:22,030 turn a lot of alerting off during a code blue. 1632 01:15:22,030 --> 01:15:24,310 I get an email or a page whenever 1633 01:15:24,310 --> 01:15:27,520 a doctor overrides an alert and writes a cranky message.
1634 01:15:27,520 --> 01:15:30,610 And they'll often say something like, this patient is dying 1635 01:15:30,610 --> 01:15:32,668 of a myocardial infarction right now, 1636 01:15:32,668 --> 01:15:34,960 and you're bothering me about this influenza vaccination. 1637 01:15:34,960 --> 01:15:36,585 And then what I'll do is I'll go back-- 1638 01:15:36,585 --> 01:15:38,260 no, seriously, I had that yesterday. 1639 01:15:38,260 --> 01:15:40,677 And so what I'll do is I'll go back and look in the record 1640 01:15:40,677 --> 01:15:42,940 and say, what signs did I have that this patient was sort 1641 01:15:42,940 --> 01:15:43,780 of in extremis? 1642 01:15:43,780 --> 01:15:45,970 And in that particular case, it was 1643 01:15:45,970 --> 01:15:48,820 a patient who came into the ED and very little documentation 1644 01:15:48,820 --> 01:15:50,445 had been started, and so there actually 1645 01:15:50,445 --> 01:15:53,020 were very few signs that the patient was in an acute state. 1646 01:15:53,020 --> 01:15:55,380 I think this, someday, could be sorted out 1647 01:15:55,380 --> 01:15:58,630 by integrating monitor data and device data. 1648 01:15:58,630 --> 01:16:01,053 But at that point, we didn't have good, structured data 1649 01:16:01,053 --> 01:16:02,470 in the chart that 1650 01:16:02,470 --> 01:16:05,050 said this patient is so ill that it's 1651 01:16:05,050 --> 01:16:07,300 offensive to suggest an influenza vaccination right 1652 01:16:07,300 --> 01:16:08,382 now. 1653 01:16:08,382 --> 01:16:10,090 PETER SZOLOVITS: Now, there are hospitals 1654 01:16:10,090 --> 01:16:13,930 that have started experimenting with things like acquiring data 1655 01:16:13,930 --> 01:16:16,930 from the ambulance as the patient is coming in 1656 01:16:16,930 --> 01:16:20,230 so that the ED is already primed with preliminary data. 1657 01:16:20,230 --> 01:16:20,980 ADAM WRIGHT: Yeah. 1658 01:16:20,980 --> 01:16:23,530 PETER SZOLOVITS: And in that circumstance, you could tell. 1659 01:16:23,530 --> 01:16:25,330 ADAM WRIGHT: So this is the interoperability challenge, 1660 01:16:25,330 --> 01:16:25,830 right? 1661 01:16:25,830 --> 01:16:27,985 So we actually get the run sheet, all 1662 01:16:27,985 --> 01:16:29,860 of the ambulance data, sent to us. 1663 01:16:29,860 --> 01:16:34,630 It comes in as a PDF that's transmitted from the ambulance 1664 01:16:34,630 --> 01:16:37,510 emergency management system to our EHR. 1665 01:16:37,510 --> 01:16:40,780 And so it's not coming in in a way that we can read well. 1666 01:16:40,780 --> 01:16:43,750 But to your point, exactly, if we were better 1667 01:16:43,750 --> 01:16:44,840 at interoperability-- 1668 01:16:44,840 --> 01:16:47,350 I've also talked to hospitals that use things 1669 01:16:47,350 --> 01:16:50,440 like video cameras and people's badges, 1670 01:16:50,440 --> 01:16:52,630 and if there are 50 people hovering around a patient, 1671 01:16:52,630 --> 01:16:54,547 that's a sign that something bad is happening. 1672 01:16:54,547 --> 01:16:58,240 And so we might be able to use something like that. 1673 01:16:58,240 --> 01:17:02,000 But yeah, we'd like to be better at that. 1674 01:17:02,000 --> 01:17:05,340 PETER SZOLOVITS: So why did HL7 version 3 not solve 1675 01:17:05,340 --> 01:17:06,795 all of these problems? 1676 01:17:06,795 --> 01:17:08,920 ADAM WRIGHT: This is a good philosophical question. 1677 01:17:08,920 --> 01:17:13,240 Come to BMI 701 and 702 and we'll talk about the standards.
1678 01:17:13,240 --> 01:17:15,740 HL7 version-- to his question-- version 2 1679 01:17:15,740 --> 01:17:16,990 was a very practical standard. 1680 01:17:16,990 --> 01:17:19,490 Version 3 was a very deeply philosophical standard-- 1681 01:17:19,490 --> 01:17:20,740 PETER SZOLOVITS: Aspirational. 1682 01:17:20,740 --> 01:17:24,310 ADAM WRIGHT: --aspirational, that never quite caught on. 1683 01:17:24,310 --> 01:17:25,340 And it did catch on in pieces. 1684 01:17:25,340 --> 01:17:29,030 I mean, FHIR is a simplification of that. 1685 01:17:29,030 --> 01:17:29,947 PETER SZOLOVITS: Yeah. 1686 01:17:29,947 --> 01:17:30,863 ADAM WRIGHT: Yes, sir? 1687 01:17:30,863 --> 01:17:33,430 AUDIENCE: So I think usually, the machine learning models 1688 01:17:33,430 --> 01:17:35,050 evaluate the difficult [INAUDIBLE]. 1689 01:17:35,050 --> 01:17:36,170 ADAM WRIGHT: Yes, sir. 1690 01:17:36,170 --> 01:17:38,170 AUDIENCE: When it comes to a particular patient, 1691 01:17:38,170 --> 01:17:40,930 is there a way to know how reliable the model is? 1692 01:17:40,930 --> 01:17:43,220 ADAM WRIGHT: Yeah, I mean, there's calibration, right? 1693 01:17:43,220 --> 01:17:44,690 So we can say this model works particularly 1694 01:17:44,690 --> 01:17:47,150 well in these patients, or not as well in these patients. 1695 01:17:47,150 --> 01:17:48,712 There are some very simple equations 1696 01:17:48,712 --> 01:17:50,420 or models that we use, for example, where 1697 01:17:50,420 --> 01:17:53,480 we use a different model in African-American patients 1698 01:17:53,480 --> 01:17:55,617 versus non-African-American patients, 1699 01:17:55,617 --> 01:17:57,950 because there's some data that says this model is better 1700 01:17:57,950 --> 01:18:01,230 calibrated in this subgroup of patients versus another. 1701 01:18:01,230 --> 01:18:02,900 I do think, though, to your point, 1702 01:18:02,900 --> 01:18:06,020 that there's a suggestion, an inference 1703 01:18:06,020 --> 01:18:08,570 from a model-- this patient is at risk of a fall. 1704 01:18:08,570 --> 01:18:14,550 And then there's this whole set of value judgments and beliefs 1705 01:18:14,550 --> 01:18:18,310 and knowledge and understanding of a patient's circumstances 1706 01:18:18,310 --> 01:18:19,460 that are very human. 1707 01:18:19,460 --> 01:18:21,290 And I think that that's largely why 1708 01:18:21,290 --> 01:18:24,380 we deliver these suggestions to a doctor or to a nurse. 1709 01:18:24,380 --> 01:18:28,010 And then that human uses that information 1710 01:18:28,010 --> 01:18:30,740 plus their expertise and their relationship 1711 01:18:30,740 --> 01:18:35,300 and their experience to make a decision, rather than just 1712 01:18:35,300 --> 01:18:38,978 having the computer adjust the knob on the ventilator itself. 1713 01:18:38,978 --> 01:18:40,520 A question that people always ask me, 1714 01:18:40,520 --> 01:18:42,603 and that you should ask me, is, will we eventually 1715 01:18:42,603 --> 01:18:43,780 not need that human? 1716 01:18:43,780 --> 01:18:47,000 And I think I'm more optimistic than some people 1717 01:18:47,000 --> 01:18:49,550 that there are cases where the computer is good enough, 1718 01:18:49,550 --> 01:18:51,380 or the human is poor enough, that it 1719 01:18:51,380 --> 01:18:55,260 would be safe to have something close to a closed loop. 1720 01:18:55,260 --> 01:18:57,712 However, I think those cases are not the norm.
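To make the calibration point above concrete, here is a minimal sketch that compares mean predicted risk against the observed event rate within each subgroup; a large gap in one subgroup is the kind of evidence that leads to recalibrating, or swapping in a different model for that group. The scores, outcomes, and group labels are invented for illustration, not data from any real model.

    # Hedged sketch: invented risk scores, outcomes, and subgroup labels.
    from collections import defaultdict

    # (predicted_risk, observed_event, subgroup) for a held-out population
    predictions = [
        (0.10, 0, "group_a"), (0.30, 1, "group_a"), (0.20, 0, "group_a"),
        (0.15, 0, "group_b"), (0.25, 1, "group_b"), (0.40, 1, "group_b"),
    ]

    by_group = defaultdict(list)
    for risk, event, group in predictions:
        by_group[group].append((risk, event))

    for group, rows in sorted(by_group.items()):
        mean_predicted = sum(r for r, _ in rows) / len(rows)
        observed_rate = sum(e for _, e in rows) / len(rows)
        # A big gap between mean predicted risk and the observed event rate
        # in a subgroup suggests the model is miscalibrated there.
        print(f"{group}: mean predicted {mean_predicted:.2f}, "
              f"observed {observed_rate:.2f}")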
1721 01:18:57,712 --> 01:19:00,170 I think that there'll be more cases where human doctors are 1722 01:19:00,170 --> 01:19:02,290 still very much needed. 1723 01:19:02,290 --> 01:19:04,040 PETER SZOLOVITS: So just to add that there 1724 01:19:04,040 --> 01:19:09,740 are tasks where patients are fungible, in the words 1725 01:19:09,740 --> 01:19:12,980 that I used a few lectures ago. 1726 01:19:12,980 --> 01:19:14,990 So for example, a lot of hospitals 1727 01:19:14,990 --> 01:19:19,850 are developing models that predict whether a patient will 1728 01:19:19,850 --> 01:19:24,770 show up for their optional surgery, because then they 1729 01:19:24,770 --> 01:19:27,680 can do a better job of over-scheduling the operating 1730 01:19:27,680 --> 01:19:32,600 room in the same way that the airlines oversell seats. 1731 01:19:32,600 --> 01:19:36,260 Because, statistically, you could win doing that. 1732 01:19:36,260 --> 01:19:39,380 Those are very safe predictions, because the worst thing that 1733 01:19:39,380 --> 01:19:41,480 happens is you get delayed. 1734 01:19:41,480 --> 01:19:43,340 But it's not going to have a harmful outcome 1735 01:19:43,340 --> 01:19:45,406 on an individual patient. 1736 01:19:45,406 --> 01:19:47,240 ADAM WRIGHT: Yeah, and conversely, there 1737 01:19:47,240 --> 01:19:49,532 are people that are working on machine learning systems 1738 01:19:49,532 --> 01:19:52,940 for dosing insulin or adjusting people's ventilator settings, 1739 01:19:52,940 --> 01:19:54,762 and those are high-- 1740 01:19:54,762 --> 01:19:56,470 PETER SZOLOVITS: Those are the high risk. 1741 01:19:56,470 --> 01:19:57,330 ADAM WRIGHT: --risk jobs. 1742 01:19:57,330 --> 01:19:58,350 PETER SZOLOVITS: Yep. 1743 01:19:58,350 --> 01:20:01,290 All right, last question because we have to wrap up. 1744 01:20:01,290 --> 01:20:04,340 AUDIENCE: You had alluded to some of the [INAUDIBLE] 1745 01:20:04,340 --> 01:20:04,872 problems-- 1746 01:20:04,872 --> 01:20:05,580 ADAM WRIGHT: Yes. 1747 01:20:05,580 --> 01:20:07,080 AUDIENCE: --of some of these models. 1748 01:20:07,080 --> 01:20:12,931 I'm, one, curious how long [INAUDIBLE]. 1749 01:20:12,931 --> 01:20:13,681 ADAM WRIGHT: Yeah. 1750 01:20:13,681 --> 01:20:16,545 AUDIENCE: And I guess, two, once it's 1751 01:20:16,545 --> 01:20:19,819 been determined that actually a significant issue has occurred, 1752 01:20:19,819 --> 01:20:22,110 what are some of the decisions that you made regarding 1753 01:20:22,110 --> 01:20:24,290 tradeoffs of using the out-of-date model that 1754 01:20:24,290 --> 01:20:27,160 looks at [INAUDIBLE] signal versus the cost of retraining? 1755 01:20:27,160 --> 01:20:28,160 ADAM WRIGHT: Retraining? 1756 01:20:28,160 --> 01:20:29,172 Yeah. 1757 01:20:29,172 --> 01:20:29,880 Yeah, absolutely. 1758 01:20:29,880 --> 01:20:32,208 So the question is about set-and-forget, right? 1759 01:20:32,208 --> 01:20:33,000 We build the model. 1760 01:20:33,000 --> 01:20:34,620 The model may become stale. 1761 01:20:34,620 --> 01:20:35,880 Should we update the model? 1762 01:20:35,880 --> 01:20:37,440 And how do we decide to do that? 1763 01:20:37,440 --> 01:20:38,432 I mean, we're using-- 1764 01:20:38,432 --> 01:20:40,140 it depends on what you define as a model. 1765 01:20:40,140 --> 01:20:42,440 We're using tables and rules that we've 1766 01:20:42,440 --> 01:20:45,090 developed since the 1970s. 1767 01:20:45,090 --> 01:20:48,720 I think we have a pretty high desire 1768 01:20:48,720 --> 01:20:50,225 to empirically revisit those.
1769 01:20:50,225 --> 01:20:51,600 There's a problem in the practice 1770 01:20:51,600 --> 01:20:54,100 called knowledge management or knowledge engineering, right? 1771 01:20:54,100 --> 01:20:56,070 How do we remember which of our knowledge bases 1772 01:20:56,070 --> 01:20:58,170 need to be checked again or updated? 1773 01:20:58,170 --> 01:21:02,300 And we'll often, just as a standard, retrain a model 1774 01:21:02,300 --> 01:21:05,460 or re-evaluate a knowledge base every six months or every year 1775 01:21:05,460 --> 01:21:08,810 because it's both harmful to patients 1776 01:21:08,810 --> 01:21:10,713 if this stuff is out-of-date, and it also 1777 01:21:10,713 --> 01:21:11,880 makes us look stupid, right? 1778 01:21:11,880 --> 01:21:13,963 So if there's a new paper that comes out and says, 1779 01:21:13,963 --> 01:21:15,670 beta blockers are terrible poison, 1780 01:21:15,670 --> 01:21:17,970 and we keep suggesting them, then people no longer 1781 01:21:17,970 --> 01:21:21,990 believe the suggestions that we make. That said, 1782 01:21:21,990 --> 01:21:23,257 we still make mistakes, right? 1783 01:21:23,257 --> 01:21:24,840 I mean, things happen all of the time. 1784 01:21:24,840 --> 01:21:27,670 A lot of my work has focused on malfunctions in these systems. 1785 01:21:27,670 --> 01:21:31,170 And so, as an example, empirically, the pharmacy 1786 01:21:31,170 --> 01:21:33,870 might change the code or ID number for a medicine, 1787 01:21:33,870 --> 01:21:36,120 or a new medicine might come on the market, 1788 01:21:36,120 --> 01:21:38,700 and we have to make sure to continually update 1789 01:21:38,700 --> 01:21:40,650 the knowledge base so that we're not suggesting an old medicine 1790 01:21:40,650 --> 01:21:42,817 or overlooking the fact that the patient has already 1791 01:21:42,817 --> 01:21:44,490 been prescribed a new medicine. 1792 01:21:44,490 --> 01:21:47,100 And so we try to do that prospectively or proactively. 1793 01:21:47,100 --> 01:21:49,770 But then we also try to listen to feedback from users 1794 01:21:49,770 --> 01:21:51,255 and fix things as we go. 1795 01:21:51,255 --> 01:21:52,410 Cool. 1796 01:21:52,410 --> 01:21:55,560 PETER SZOLOVITS: And just one more comment on that. 1797 01:21:55,560 --> 01:21:58,300 So some things are done in real time. 1798 01:21:58,300 --> 01:21:59,850 There was a system, many years ago, 1799 01:21:59,850 --> 01:22:05,130 at Intermountain Health in Salt Lake City, where 1800 01:22:05,130 --> 01:22:08,610 they were looking at what bugs were growing out 1801 01:22:08,610 --> 01:22:12,150 of microbiology samples in the laboratory. 1802 01:22:12,150 --> 01:22:14,940 And of course, that can change on an hour-by-hour or 1803 01:22:14,940 --> 01:22:16,520 day-to-day basis. 1804 01:22:16,520 --> 01:22:19,440 And so they were updating those systems that warned you 1805 01:22:19,440 --> 01:22:23,880 about the possibility of that kind of infection in real time 1806 01:22:23,880 --> 01:22:26,358 by taking feeds directly from the laboratory. 1807 01:22:26,358 --> 01:22:27,400 ADAM WRIGHT: That's true. 1808 01:22:27,400 --> 01:22:28,680 PETER SZOLOVITS: All right, thank you very much. 1809 01:22:28,680 --> 01:22:30,055 ADAM WRIGHT: No, thank you, guys. 1810 01:22:30,055 --> 01:22:32,990 [APPLAUSE]
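As a closing illustration of the periodic re-checking discussed in the last exchange, here is a minimal sketch of a scheduled drift check: recompute discrimination (AUC) on a recent window of scored patients and flag the model for review if it has slipped from its deployment baseline. The baseline, tolerance, and data are assumptions for illustration, not values from any deployed system.

    # Hedged sketch of a periodic model re-check; all numbers are made up.
    def auc(scores, labels):
        """Probability a random positive outranks a random negative (ties = 0.5)."""
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
                   for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    BASELINE_AUC = 0.80      # performance measured when the model was deployed
    REVIEW_MARGIN = 0.05     # how much drift is tolerated before flagging

    # Most recent window of risk scores and observed outcomes (illustrative).
    recent_scores = [0.9, 0.7, 0.4, 0.2, 0.8, 0.3]
    recent_labels = [1, 1, 0, 0, 0, 1]

    current = auc(recent_scores, recent_labels)
    if current < BASELINE_AUC - REVIEW_MARGIN:
        print(f"AUC {current:.2f} has drifted; flag model for retraining/review")
    else:
        print(f"AUC {current:.2f} within tolerance of baseline {BASELINE_AUC:.2f}")

A real version of this check would also track calibration and the kinds of upstream changes mentioned above, such as new medication codes, since those can break a model or knowledge base without any visible change in the score distribution.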