1 00:00:00,090 --> 00:00:02,430 The following content is provided under a Creative 2 00:00:02,430 --> 00:00:03,820 Commons license. 3 00:00:03,820 --> 00:00:06,030 Your support will help MIT OpenCourseWare 4 00:00:06,030 --> 00:00:10,120 continue to offer high-quality educational resources for free. 5 00:00:10,120 --> 00:00:12,660 To make a donation or to view additional materials 6 00:00:12,660 --> 00:00:16,620 from hundreds of MIT courses, visit MIT OpenCourseWare 7 00:00:16,620 --> 00:00:17,992 at ocw.mit.edu. 8 00:00:21,080 --> 00:00:23,930 WILLIAM BONVILLIAN: So I try to do 9 00:00:23,930 --> 00:00:27,380 this at least every other class so we remember the context. 10 00:00:27,380 --> 00:00:30,140 But class 1, as you remember, was 11 00:00:30,140 --> 00:00:32,810 all about the economic growth theory 12 00:00:32,810 --> 00:00:38,240 and the economic growth context for innovation policy. 13 00:00:38,240 --> 00:00:43,310 And we did growth economics and how it broke away 14 00:00:43,310 --> 00:00:45,140 from classical economics around developing 15 00:00:45,140 --> 00:00:47,010 a new theory of growth. 16 00:00:47,010 --> 00:00:51,860 So Solow argued that there's technological and related 17 00:00:51,860 --> 00:00:52,580 innovation. 18 00:00:52,580 --> 00:00:59,700 And we concluded that was a key direct innovation factor. 19 00:00:59,700 --> 00:01:02,570 So we could loosely, although inaccurately, 20 00:01:02,570 --> 00:01:08,030 translate that concept as saying, you've got to do R&D. 21 00:01:08,030 --> 00:01:09,380 And then we read Romer. 22 00:01:09,380 --> 00:01:12,170 And Romer argued that behind that R&D system 23 00:01:12,170 --> 00:01:14,510 is human capital engaged in research. 24 00:01:14,510 --> 00:01:17,880 That's another foundational direct innovation factor. 25 00:01:17,880 --> 00:01:19,730 You can't build an innovation system 26 00:01:19,730 --> 00:01:22,790 without these two elements arguably. 
27 00:01:22,790 --> 00:01:24,710 So it's the R&D and the talent behind it. 28 00:01:24,710 --> 00:01:26,330 Those give us two direct factors. 29 00:01:26,330 --> 00:01:30,620 Then in class 2, we talked about the indirect elements 30 00:01:30,620 --> 00:01:33,410 in this larger ecosystem. 31 00:01:33,410 --> 00:01:35,360 And government controls some. 32 00:01:35,360 --> 00:01:38,060 Private sector controls more. 33 00:01:38,060 --> 00:01:40,580 But that gives us an idea of looking at innovation 34 00:01:40,580 --> 00:01:44,420 as a system, which Richard Nelson contributed to. 35 00:01:44,420 --> 00:01:47,210 And then within that system, you look at the strength 36 00:01:47,210 --> 00:01:49,040 of the innovation actors. 37 00:01:49,040 --> 00:01:51,980 And you can begin to think of your innovation organization 38 00:01:51,980 --> 00:01:55,430 as a third direct innovation factor. 39 00:01:55,430 --> 00:01:57,320 And that leads us into the whole problem 40 00:01:57,320 --> 00:01:59,750 of how do you cross the valley of death 41 00:01:59,750 --> 00:02:02,860 if you've got a disconnected model. 42 00:02:02,860 --> 00:02:05,800 Classes 3 and 4 were case studies on manufacturing. 43 00:02:05,800 --> 00:02:09,310 So we took a deep dive into a very current set 44 00:02:09,310 --> 00:02:13,090 of practical, ongoing problems to try and think about some 45 00:02:13,090 --> 00:02:15,700 of those innovation organization lessons 46 00:02:15,700 --> 00:02:17,500 in a manufacturing context. 47 00:02:17,500 --> 00:02:24,130 Class 5 was explicitly about innovation organization. 48 00:02:24,130 --> 00:02:28,700 And we looked at David Hart's analysis 49 00:02:28,700 --> 00:02:35,240 of the ideology behind looking at innovation 50 00:02:35,240 --> 00:02:38,600 in the political system and the conservative, 51 00:02:38,600 --> 00:02:41,080 the associationalist, which is the public-private, 52 00:02:41,080 --> 00:02:44,000 the national security models. 
53 00:02:44,000 --> 00:02:48,870 All those issues are still very much with us. 54 00:02:48,870 --> 00:02:50,720 And in that class, we also talked 55 00:02:50,720 --> 00:02:57,770 about Donald Stokes' work where he critiqued the split 56 00:02:57,770 --> 00:02:59,900 in the US innovation organizational system 57 00:02:59,900 --> 00:03:03,380 between front-end innovation, the R 58 00:03:03,380 --> 00:03:06,300 side, and the later stages. 59 00:03:06,300 --> 00:03:13,190 So when we organized, essentially, a research 60 00:03:13,190 --> 00:03:16,400 model at the end of World War II under that pipeline model 61 00:03:16,400 --> 00:03:21,660 that Vannevar Bush gave us, we missed 62 00:03:21,660 --> 00:03:23,700 issues that subsequently arose in the economy 63 00:03:23,700 --> 00:03:26,670 and became really deep sources of problems 64 00:03:26,670 --> 00:03:30,570 about the connections of follow-on implementation 65 00:03:30,570 --> 00:03:33,240 back into the innovation system stages. 66 00:03:33,240 --> 00:03:36,690 Class 6 was about crossing this valley of death. 67 00:03:36,690 --> 00:03:39,030 How do you build the bridging models 68 00:03:39,030 --> 00:03:42,210 that get you from a pipeline innovation model 69 00:03:42,210 --> 00:03:43,660 across to the later stages? 70 00:03:43,660 --> 00:03:46,660 That's not easy in our system. 71 00:03:46,660 --> 00:03:49,170 And then we also talked about the fact 72 00:03:49,170 --> 00:03:52,170 that we run two parallel systems in the US. 73 00:03:52,170 --> 00:03:55,770 So Vernon Ruttan's book, Is War Necessary for Economic Growth?, 74 00:03:55,770 --> 00:03:59,470 introduced us to the whole defense innovation system, 75 00:03:59,470 --> 00:04:01,500 which is not disconnected. 76 00:04:01,500 --> 00:04:03,840 That's a pretty connected model. 77 00:04:03,840 --> 00:04:05,610 And the Defense Department will typically 78 00:04:05,610 --> 00:04:08,160 perform the research, the development. 
79 00:04:08,160 --> 00:04:10,160 It will do the demonstration. 80 00:04:10,160 --> 00:04:12,480 It will do the test bed. 81 00:04:12,480 --> 00:04:14,280 It will move to the advanced prototypes 82 00:04:14,280 --> 00:04:17,459 and will often provide an initial market 83 00:04:17,459 --> 00:04:18,720 for emerging technologies. 84 00:04:18,720 --> 00:04:20,850 That's a very different organizational system, 85 00:04:20,850 --> 00:04:25,650 as we discussed, from what the civilian agencies are up to. 86 00:04:25,650 --> 00:04:28,490 Class 7 was innovation at the face-to-face level. 87 00:04:28,490 --> 00:04:30,190 That's our whole great groups theory. 88 00:04:30,190 --> 00:04:35,813 So if innovation has to do with institutions, 89 00:04:35,813 --> 00:04:37,980 which it certainly does-- how are those institutions 90 00:04:37,980 --> 00:04:39,360 organized and how do they connect 91 00:04:39,360 --> 00:04:42,210 and how are the handoffs between the actors? 92 00:04:42,210 --> 00:04:47,070 But in the end, people own innovation, not institutions. 93 00:04:47,070 --> 00:04:48,888 It's very face to face. 94 00:04:48,888 --> 00:04:50,430 And then in the great groups classes, 95 00:04:50,430 --> 00:04:52,560 you remember well we talked about some 96 00:04:52,560 --> 00:04:57,570 of the rule sets that tend to guide 97 00:04:57,570 --> 00:05:00,960 the way in which great group-based innovation 98 00:05:00,960 --> 00:05:02,890 operates. 99 00:05:02,890 --> 00:05:05,870 And again, that drives us back to this third innovation 100 00:05:05,870 --> 00:05:07,040 factor-- 101 00:05:07,040 --> 00:05:09,410 what does innovation look like from an organizational point 102 00:05:09,410 --> 00:05:12,120 of view, from a systems point of view, 103 00:05:12,120 --> 00:05:15,710 from an actor's point of view, but also, 104 00:05:15,710 --> 00:05:19,580 how does innovation look at the face-to-face level? 
105 00:05:19,580 --> 00:05:21,503 And then in class 8, we talked about DARPA, 106 00:05:21,503 --> 00:05:23,420 which is an institution that actually attempts 107 00:05:23,420 --> 00:05:26,430 to do both on a good day. 108 00:05:26,430 --> 00:05:28,550 It will support that connectedness 109 00:05:28,550 --> 00:05:34,160 of innovation institutions 110 00:05:34,160 --> 00:05:35,870 using that defense model. 111 00:05:35,870 --> 00:05:43,280 But it will also attempt to build great groups. 112 00:05:43,280 --> 00:05:45,890 And we talked about JCR Licklider and the evolution 113 00:05:45,890 --> 00:05:49,260 of the IT revolution, a lot of the origins 114 00:05:49,260 --> 00:05:52,590 of which came out of DARPA-supported research, 115 00:05:52,590 --> 00:05:53,670 as an example of that. 116 00:05:53,670 --> 00:06:00,570 So now, we're going to do some more case studies in today's 117 00:06:00,570 --> 00:06:04,695 class, really looking at the life science innovation system 118 00:06:04,695 --> 00:06:05,820 and how it functions. 119 00:06:05,820 --> 00:06:07,653 And we're going to take a lot of the lessons 120 00:06:07,653 --> 00:06:10,020 that we've been learning and try and fit them 121 00:06:10,020 --> 00:06:14,350 into analyzing this really important piece 122 00:06:14,350 --> 00:06:18,120 of this territory. 123 00:06:18,120 --> 00:06:21,690 Let me give you a quick NIH historical backdrop. 124 00:06:21,690 --> 00:06:25,290 Remember the pre-World War II and the World War II context? 125 00:06:25,290 --> 00:06:29,970 NIH just stood up as a Vannevar Bush basic research 126 00:06:29,970 --> 00:06:31,560 organization. 127 00:06:31,560 --> 00:06:34,710 So Franklin Roosevelt and Vannevar Bush 128 00:06:34,710 --> 00:06:37,080 see what happens in World War II. 129 00:06:37,080 --> 00:06:42,120 They see the development of antibiotics led by penicillin. 
130 00:06:42,120 --> 00:06:44,580 And the effects are breathtaking, absolutely 131 00:06:44,580 --> 00:06:45,960 breathtaking. 132 00:06:45,960 --> 00:06:47,370 So suddenly, we fight a war. 133 00:06:47,370 --> 00:06:49,800 And you probably won't die of disease. 134 00:06:49,800 --> 00:06:52,440 You'll actually die of battlefield injuries, which 135 00:06:52,440 --> 00:06:55,620 has never happened before. 136 00:06:55,620 --> 00:06:58,027 And that immediately has an effect. 137 00:06:58,027 --> 00:06:59,610 It moves pneumonia from the number 1 138 00:06:59,610 --> 00:07:01,568 cause of death in the United States to considerably 139 00:07:01,568 --> 00:07:03,750 further down the list. 140 00:07:03,750 --> 00:07:05,190 It's very dramatic. 141 00:07:05,190 --> 00:07:07,740 Bush and Roosevelt see this and understand 142 00:07:07,740 --> 00:07:09,870 how powerful that model is. 143 00:07:09,870 --> 00:07:13,170 And so one of the theories in The Endless Frontier 144 00:07:13,170 --> 00:07:15,497 that Vannevar Bush puts out, as you remember, 145 00:07:15,497 --> 00:07:17,580 is that there's going to be a war against disease. 146 00:07:17,580 --> 00:07:20,420 And they've just seen what this model is. 147 00:07:20,420 --> 00:07:23,020 So they want to do it. 148 00:07:23,020 --> 00:07:28,020 Now, Truman, as you remember, vetoes the one-agency, 149 00:07:28,020 --> 00:07:31,410 one-tent National Science Foundation proposal. 150 00:07:31,410 --> 00:07:35,980 So he vetoes that in like, is it '47 or '49? 151 00:07:35,980 --> 00:07:39,510 And that leaves a void. 152 00:07:39,510 --> 00:07:41,380 So voids tend to get filled. 153 00:07:41,380 --> 00:07:44,310 So this little research outfit that's 154 00:07:44,310 --> 00:07:47,680 part of the Public Health Service, the National 155 00:07:47,680 --> 00:07:50,030 Institute of Health, becomes the National Institutes of Health 156 00:07:50,030 --> 00:07:52,750 and really starts scaling up. 
157 00:07:52,750 --> 00:07:58,390 So it follows a Vannevar Bush basic research model. 158 00:07:58,390 --> 00:08:04,990 But it evolves in this gap of organizational leadership 159 00:08:04,990 --> 00:08:08,920 in a way and fills that void. 160 00:08:08,920 --> 00:08:13,670 And then over time, disease groups 161 00:08:13,670 --> 00:08:16,760 tend to be very active politically. 162 00:08:16,760 --> 00:08:19,310 If you have a disease or if a loved one has a disease, 163 00:08:19,310 --> 00:08:22,340 you tend to be very active in the disease group. 164 00:08:22,340 --> 00:08:28,070 And these disease groups push their disease in Congress. 165 00:08:28,070 --> 00:08:31,730 And Congress keeps adding institutes and centers 166 00:08:31,730 --> 00:08:33,380 onto the model. 167 00:08:33,380 --> 00:08:38,270 When it reaches an unmanageable number of 27, 168 00:08:38,270 --> 00:08:41,015 finally, Congress actually tries to restrain itself. 169 00:08:43,650 --> 00:08:48,820 But it creates an organizational model that's problematic. 170 00:08:48,820 --> 00:08:50,620 Now, another feature here is-- 171 00:08:50,620 --> 00:08:51,870 here's a line from Tony Fauci. 172 00:08:51,870 --> 00:08:52,840 It's one of the-- 173 00:08:52,840 --> 00:08:57,220 He's the director of NIAID, quite famous. 174 00:08:57,220 --> 00:09:00,910 NIAID was one of the leaders in taking 175 00:09:00,910 --> 00:09:04,330 on all kinds of infectious diseases, including AIDS. 176 00:09:04,330 --> 00:09:09,160 But Tony Fauci writes in 2003, "the path 177 00:09:09,160 --> 00:09:12,610 to product development has not been a part of NIAID's research 178 00:09:12,610 --> 00:09:15,665 strategy." 179 00:09:15,665 --> 00:09:19,460 In other words, that's not on the table. 180 00:09:19,460 --> 00:09:22,700 They're worried about their basic research results. 
181 00:09:22,700 --> 00:09:25,610 So what did NIH-- 182 00:09:25,610 --> 00:09:29,720 and arguably, the biotech model that is spawned from it-- 183 00:09:29,720 --> 00:09:30,770 what did they get right? 184 00:09:30,770 --> 00:09:32,228 And they get a lot of things right. 185 00:09:32,228 --> 00:09:33,230 It's an amazing system. 186 00:09:33,230 --> 00:09:35,450 It's a remarkable story of success. 187 00:09:35,450 --> 00:09:38,330 So even though we're going to be critical today, 188 00:09:38,330 --> 00:09:41,340 it is an amazing success story. 189 00:09:41,340 --> 00:09:43,440 NIH trained everybody. 190 00:09:43,440 --> 00:09:50,000 It created this huge education system that put a lot of talent 191 00:09:50,000 --> 00:09:53,900 on the task in Romer's terms. 192 00:09:53,900 --> 00:09:58,670 This knowledge base helped spawn a remarkable entrepreneurial 193 00:09:58,670 --> 00:10:00,800 biotech startup model. 194 00:10:00,800 --> 00:10:04,490 And we talked about Genentech during our great groups class. 195 00:10:04,490 --> 00:10:05,480 That's the first. 196 00:10:05,480 --> 00:10:10,280 But that's a very good example of what then flourished. 197 00:10:10,280 --> 00:10:15,180 And Boyer and Swanson were the founders. 198 00:10:15,180 --> 00:10:20,550 Boyer comes out of that NIH-funded system. 199 00:10:20,550 --> 00:10:24,160 Biotechs have been able to get venture capital support 200 00:10:24,160 --> 00:10:29,718 even though they have a 10 to 15 year stand-up model, which 201 00:10:29,718 --> 00:10:31,010 is a remarkable accomplishment. 202 00:10:31,010 --> 00:10:33,800 No other sector is able to get venture capital money 203 00:10:33,800 --> 00:10:35,300 unless they're no more than a couple 204 00:10:35,300 --> 00:10:37,310 of years out from production. 205 00:10:37,310 --> 00:10:43,250 Biotechs have been able to break that and get long-term support 206 00:10:43,250 --> 00:10:46,130 for R&D. 207 00:10:46,130 --> 00:10:49,960 And the key to this has been the value of IP. 
208 00:10:49,960 --> 00:10:54,860 So there is a working monopoly model in the biotech sector. 209 00:10:54,860 --> 00:11:01,460 So if you're first to patent a new drug, 210 00:11:01,460 --> 00:11:04,820 you are given a monopoly rent for a 20-year time 211 00:11:04,820 --> 00:11:07,790 period minus the time it takes you 212 00:11:07,790 --> 00:11:09,860 to get through clinical trials with FDA, 213 00:11:09,860 --> 00:11:11,000 which can be seven years. 214 00:11:11,000 --> 00:11:13,910 So it's a 13-year run where you're 215 00:11:13,910 --> 00:11:15,890 assured of monopoly rents. 216 00:11:15,890 --> 00:11:19,130 And unlike the hard technology sector, 217 00:11:19,130 --> 00:11:23,300 it's harder to stand up, in effect, 218 00:11:23,300 --> 00:11:28,100 copy-cat fixes, copy-cat drugs. 219 00:11:28,100 --> 00:11:32,240 So you really do get a significant run typically 220 00:11:32,240 --> 00:11:33,830 if you're first to patent and you 221 00:11:33,830 --> 00:11:37,530 get ahead first through the clinical trial process. 222 00:11:37,530 --> 00:11:42,620 So that's been the enabler that allows 223 00:11:42,620 --> 00:11:44,490 this financing system to work. 224 00:11:44,490 --> 00:11:46,550 The other key feature is FDA's approval. 225 00:11:46,550 --> 00:11:48,880 So FDA gives you-- 226 00:11:48,880 --> 00:11:53,152 in stage 1, stage 2, stage 3 clinical trials, 227 00:11:53,152 --> 00:11:54,610 if you're developing a cancer drug, 228 00:11:54,610 --> 00:11:56,260 you know exactly what your chances 229 00:11:56,260 --> 00:11:59,590 are of getting through from stage 1 230 00:11:59,590 --> 00:12:04,000 to stage 2 and stage 2 to stage 3 and from stage 3 231 00:12:04,000 --> 00:12:05,440 to final drug approval. 232 00:12:05,440 --> 00:12:07,330 You know just what your chances are. 
233 00:12:07,330 --> 00:12:10,000 So that enables venture capitalists 234 00:12:10,000 --> 00:12:13,000 to tie their stages of financing 235 00:12:13,000 --> 00:12:15,940 to your success in the clinical trials process. 236 00:12:15,940 --> 00:12:19,635 There is no benchmarking system like this in any other sector 237 00:12:19,635 --> 00:12:20,260 in the economy. 238 00:12:20,260 --> 00:12:22,330 It works amazingly well. 239 00:12:22,330 --> 00:12:25,000 And then another big difference between life science 240 00:12:25,000 --> 00:12:28,900 and everything else is that, if FDA approves your drug, 241 00:12:28,900 --> 00:12:31,240 you are guaranteed a market. 242 00:12:31,240 --> 00:12:32,800 In fact, our wonderful legal system 243 00:12:32,800 --> 00:12:36,910 is such that, if a doctor fails to prescribe the medicine which 244 00:12:36,910 --> 00:12:42,130 is fitted to your problem, the next day, the doctor gets sued. 245 00:12:42,130 --> 00:12:46,240 So there's this huge forcing mechanism in the system. 246 00:12:46,240 --> 00:12:48,542 There's nothing like a certification that guarantees 247 00:12:48,542 --> 00:12:49,750 you a market on the next day. 248 00:12:49,750 --> 00:12:52,000 There's nothing like this anywhere else in our system. 249 00:12:52,000 --> 00:12:54,480 It's a remarkable thing. 250 00:12:54,480 --> 00:12:56,230 And we can start to think about how do you 251 00:12:56,230 --> 00:13:00,700 get analogous certification processes and benchmarking 252 00:13:00,700 --> 00:13:02,245 processes in place in more physical 253 00:13:02,245 --> 00:13:03,370 science-based technologies. 254 00:13:03,370 --> 00:13:05,560 It would be very interesting. 255 00:13:05,560 --> 00:13:06,770 We haven't done it. 256 00:13:06,770 --> 00:13:08,770 But there's a lot to be learned from this sector 257 00:13:08,770 --> 00:13:10,145 because of the system it's set up. 
258 00:13:13,180 --> 00:13:14,978 Upstairs-downstairs has historically 259 00:13:14,978 --> 00:13:17,020 been a problem, particularly in European science, 260 00:13:17,020 --> 00:13:20,620 but also in US science, where the academic researchers have 261 00:13:20,620 --> 00:13:24,250 disdain for the company researchers. 262 00:13:24,250 --> 00:13:27,670 That has really broken up and broken apart 263 00:13:27,670 --> 00:13:29,710 on the life science side. 264 00:13:29,710 --> 00:13:34,030 So outstanding academics, think Boyer, 265 00:13:34,030 --> 00:13:37,510 spent time in a biotech. 266 00:13:37,510 --> 00:13:40,450 That was not seen as a sound career path. 267 00:13:40,450 --> 00:13:41,710 Boyer had to break the ground. 268 00:13:41,710 --> 00:13:44,300 He got a lot of flak for it at the time. 269 00:13:44,300 --> 00:13:46,450 But over time, it's become accepted 270 00:13:46,450 --> 00:13:49,900 that you can have an academic career, move to a biotech, 271 00:13:49,900 --> 00:13:51,910 move back to an academic career. 272 00:13:51,910 --> 00:13:53,407 That's OK. 273 00:13:53,407 --> 00:13:55,990 Obviously, there were conflict-of-interest issues throughout that whole process 274 00:13:55,990 --> 00:13:57,550 that had to be dealt with. 275 00:13:57,550 --> 00:14:00,200 But that's an acceptable pathway. 276 00:14:00,200 --> 00:14:02,530 So the whole upstairs-downstairs relationship 277 00:14:02,530 --> 00:14:04,780 between the academy and commercial development 278 00:14:04,780 --> 00:14:06,850 has really been broken up. 279 00:14:06,850 --> 00:14:12,220 So that's another thing that this sector got right. 280 00:14:12,220 --> 00:14:14,080 NIH also has a huge amount of money. 281 00:14:14,080 --> 00:14:16,450 It's by far the largest R&D agency. 282 00:14:16,450 --> 00:14:18,610 It's got over $30 billion a year. 283 00:14:18,610 --> 00:14:19,360 Nobody is close. 284 00:14:22,870 --> 00:14:25,630 NSF, by comparison, has $7 billion a year. 
285 00:14:25,630 --> 00:14:28,120 The Department of Energy Office of Science has $5 billion. 286 00:14:28,120 --> 00:14:30,010 DARPA has $3 billion. 287 00:14:30,010 --> 00:14:35,110 So this is almost an order of magnitude different from other R&D 288 00:14:35,110 --> 00:14:36,020 agencies. 289 00:14:36,020 --> 00:14:39,940 So the political constituencies, including the disease groups, 290 00:14:39,940 --> 00:14:43,240 have built a very powerful political base 291 00:14:43,240 --> 00:14:47,350 for sustained funding in this sector. 292 00:14:47,350 --> 00:14:49,380 Now, that's the good news. 293 00:14:49,380 --> 00:14:50,593 Here's the problem. 294 00:14:50,593 --> 00:14:52,510 We're going to talk about the innovation train 295 00:14:52,510 --> 00:14:54,710 wreck that lies ahead here. 296 00:14:54,710 --> 00:14:57,130 So the economic model for biotechs and pharmas 297 00:14:57,130 --> 00:14:59,980 really requires blockbuster markets. 298 00:14:59,980 --> 00:15:03,210 It's a huge problem. 299 00:15:03,210 --> 00:15:06,100 So in other words, third-world diseases, infectious diseases, 300 00:15:06,100 --> 00:15:10,610 small-population diseases, it doesn't make sense 301 00:15:10,610 --> 00:15:12,820 to pursue those. 302 00:15:12,820 --> 00:15:17,090 To get a drug through the FDA clinical process, counting 303 00:15:17,090 --> 00:15:20,110 the development time before and during the clinical trial 304 00:15:20,110 --> 00:15:25,860 approval process itself, that's around a $1.4-billion-or-more 305 00:15:25,860 --> 00:15:27,060 proposition. 306 00:15:27,060 --> 00:15:30,162 So unless you're selling it to a major market, 307 00:15:30,162 --> 00:15:34,740 it doesn't make any sense to develop the remedy. 308 00:15:34,740 --> 00:15:38,600 So there is a statistic that some cite, 309 00:15:38,600 --> 00:15:47,260 which is that 80% of our R&D money in life science 310 00:15:47,260 --> 00:15:51,540 is spent on 10% of the diseases. 311 00:15:51,540 --> 00:15:52,950 That's a problem. 
312 00:15:52,950 --> 00:15:54,520 I don't think the number is that bad. 313 00:15:54,520 --> 00:15:56,520 I think that's exaggerated. 314 00:15:56,520 --> 00:16:00,260 But we've got a problem here because of 315 00:16:00,260 --> 00:16:02,540 this blockbuster drug model. 316 00:16:02,540 --> 00:16:06,560 Only if you're developing a drug that sells into a major market 317 00:16:06,560 --> 00:16:11,170 opportunity is it going to get into the market. 318 00:16:11,170 --> 00:16:13,680 So what does that leave behind? 319 00:16:13,680 --> 00:16:16,860 In addition to the things I've mentioned, 320 00:16:16,860 --> 00:16:20,490 the precision medicine or personalized drug model 321 00:16:20,490 --> 00:16:21,340 gets left behind. 322 00:16:21,340 --> 00:16:23,820 In other words, if each one of you 323 00:16:23,820 --> 00:16:27,930 is going to have their own remedy variant that's 324 00:16:27,930 --> 00:16:30,930 particularly adapted to your genetic structure or metabolism 325 00:16:30,930 --> 00:16:33,120 or whatever, what are we going to do? 326 00:16:33,120 --> 00:16:37,020 Run a $1.4-billion clinical trial process for you? 327 00:16:37,020 --> 00:16:39,000 It's not going to work. 328 00:16:39,000 --> 00:16:40,980 So how do we deal with this? 329 00:16:40,980 --> 00:16:46,260 There's a huge looming train wreck problem in the system 330 00:16:46,260 --> 00:16:48,030 because that's where it's going. 331 00:16:48,030 --> 00:16:50,550 There are real benefits to being able to use big data 332 00:16:50,550 --> 00:16:52,530 and analytics to develop personalized medicine. 333 00:16:52,530 --> 00:16:56,450 But if there's not an economic model to get it there, 334 00:16:56,450 --> 00:16:58,980 it won't happen. 335 00:16:58,980 --> 00:17:02,730 The litigation threat makes drug companies risk averse. 336 00:17:02,730 --> 00:17:04,560 So we leave a lot of promising stuff 337 00:17:04,560 --> 00:17:06,900 on the table because of that threat. 
338 00:17:06,900 --> 00:17:12,329 Often, a drug will cure, let's say, 339 00:17:12,329 --> 00:17:16,920 20%, 30%, maybe 40% of a disease group's problem. 340 00:17:16,920 --> 00:17:19,050 But if it kills some people, it'll 341 00:17:19,050 --> 00:17:22,391 never go to market because we don't understand the precision 342 00:17:22,391 --> 00:17:23,849 medicine, the personalized medicine 343 00:17:23,849 --> 00:17:25,650 implications of these things. 344 00:17:25,650 --> 00:17:27,780 And until we do, we're just leaving a lot of stuff 345 00:17:27,780 --> 00:17:31,890 on the shelf that has potential value because Americans, 346 00:17:31,890 --> 00:17:37,965 for good reason, have very low tolerance for risk from drugs. 347 00:17:41,310 --> 00:17:47,530 Health care spending in the overall system by 2025 348 00:17:47,530 --> 00:17:51,300 may account for 9% of GDP or more. 349 00:17:51,300 --> 00:17:54,040 And we'll talk about this in a minute. 350 00:17:54,040 --> 00:17:58,210 But we can't afford a health care bill of those dimensions 351 00:17:58,210 --> 00:17:59,770 and still have other things going 352 00:17:59,770 --> 00:18:02,275 on in the society and the economy and the government. 353 00:18:06,330 --> 00:18:09,190 This is what we spend money on in the federal government. 354 00:18:12,050 --> 00:18:14,690 This is what we think of as government. 355 00:18:14,690 --> 00:18:18,530 So that's like the national parks and federal research, 356 00:18:18,530 --> 00:18:20,900 transportation, highways. 357 00:18:20,900 --> 00:18:22,628 That's this. 358 00:18:22,628 --> 00:18:24,270 That's defense. 359 00:18:24,270 --> 00:18:25,910 So that's actually about 16%. 360 00:18:25,910 --> 00:18:30,350 And that's now about maybe a little below 20%. 361 00:18:30,350 --> 00:18:31,520 That's Social Security. 362 00:18:31,520 --> 00:18:35,900 That's Medicaid, Medicare, and SCHIP, 363 00:18:35,900 --> 00:18:40,220 the big health-care-oriented entitlement programs. 
364 00:18:40,220 --> 00:18:42,560 That's other safety net programs, 9%. 365 00:18:42,560 --> 00:18:45,110 And then we have interest on the debt, which is actually now 366 00:18:45,110 --> 00:18:47,850 over 9%. 367 00:18:47,850 --> 00:18:52,020 So as you can see, the federal government is mostly-- 368 00:18:52,020 --> 00:18:55,860 we're talking 60% plus interest on the debt-- 369 00:18:55,860 --> 00:18:59,820 a check-writing kind of organization. 370 00:18:59,820 --> 00:19:01,570 And that's what we think of as government. 371 00:19:01,570 --> 00:19:03,570 And that's what we think of as the domestic side 372 00:19:03,570 --> 00:19:04,220 of government. 373 00:19:04,220 --> 00:19:05,680 So these are pretty small pieces. 374 00:19:05,680 --> 00:19:06,500 Rasheed? 375 00:19:06,500 --> 00:19:08,470 AUDIENCE: What year is this for? 376 00:19:08,470 --> 00:19:11,540 WILLIAM BONVILLIAN: This is probably about four years old 377 00:19:11,540 --> 00:19:12,040 now. 378 00:19:12,040 --> 00:19:16,180 But the numbers are slightly even more problematic 379 00:19:16,180 --> 00:19:17,170 than these. 380 00:19:17,170 --> 00:19:19,910 So again, this category is now 16%. 381 00:19:19,910 --> 00:19:22,910 So the numbers are probably off by a couple of percent. 382 00:19:22,910 --> 00:19:27,670 But it's a problem. 383 00:19:27,670 --> 00:19:33,440 And as we'll see in a second, these 384 00:19:33,440 --> 00:19:35,610 are components of federal spending. 385 00:19:35,610 --> 00:19:36,890 So that's health. 386 00:19:36,890 --> 00:19:39,150 And that blue line is everything else. 387 00:19:39,150 --> 00:19:42,320 So you can see where taxpayer dollars are going. 388 00:19:45,660 --> 00:19:47,910 This kind of demonstrates that Social Security is not 389 00:19:47,910 --> 00:19:48,960 the real problem. 390 00:19:48,960 --> 00:19:50,760 It's really the health care costs related 391 00:19:50,760 --> 00:19:53,130 to Medicare and Medicaid that are the problem. 
392 00:19:56,160 --> 00:20:01,257 This shows historic levels of taxation. 393 00:20:01,257 --> 00:20:03,590 Remember, the United States was founded on a tax revolt. 394 00:20:03,590 --> 00:20:06,695 So we're not going to have European-style levels 395 00:20:06,695 --> 00:20:08,070 of taxation in the United States. 396 00:20:08,070 --> 00:20:09,990 It's just not going to happen. 397 00:20:09,990 --> 00:20:12,720 And we start to run into real political trouble 398 00:20:12,720 --> 00:20:17,520 once you get a little bit above, say, the 20% level of taxation 399 00:20:17,520 --> 00:20:20,220 as a percentage of GDP. 400 00:20:20,220 --> 00:20:29,720 So we're headed way up against that threshold largely because 401 00:20:29,720 --> 00:20:32,210 of the demographics and the health care spending, 402 00:20:32,210 --> 00:20:41,530 which means that, broken down this way, the percentage of GDP 403 00:20:41,530 --> 00:20:45,770 we spend on what we consider most of government 404 00:20:45,770 --> 00:20:49,450 is just not going to be affordable 405 00:20:49,450 --> 00:20:54,510 given the acceptable levels, around 19%, 18%, 406 00:20:54,510 --> 00:20:57,680 the acceptable tax range in our political system. 407 00:20:57,680 --> 00:20:59,258 So Max? 408 00:20:59,258 --> 00:21:00,550 AUDIENCE: What are [INAUDIBLE]? 409 00:21:00,550 --> 00:21:01,930 WILLIAM BONVILLIAN: Expenditures by the federal government. 410 00:21:01,930 --> 00:21:03,580 What they're spending in a given year. 411 00:21:03,580 --> 00:21:07,120 What they were actually outlaying to meet their costs, 412 00:21:07,120 --> 00:21:09,320 to meet their obligations. 413 00:21:09,320 --> 00:21:12,235 So these are some of the problems that lie ahead. 414 00:21:12,235 --> 00:21:15,310 We've got a real fiscal train wreck because 415 00:21:15,310 --> 00:21:17,440 of the cost of this health care system driven 416 00:21:17,440 --> 00:21:20,080 by the demographics. 
417 00:21:20,080 --> 00:21:23,050 My generation attempted to make you 418 00:21:23,050 --> 00:21:25,540 spend all of your money on me. 419 00:21:25,540 --> 00:21:27,040 That's essentially what's going on. 420 00:21:27,040 --> 00:21:31,670 It's a massive intergenerational transfer of wealth. 421 00:21:31,670 --> 00:21:35,690 And you guys need to wake up to that because it forecloses 422 00:21:35,690 --> 00:21:38,600 opportunities that you have. 423 00:21:38,600 --> 00:21:41,330 And at the heart of this is a really serious problem 424 00:21:41,330 --> 00:21:44,380 with health care expenditures. 425 00:21:44,380 --> 00:21:49,650 So let's now get back into our main themes. 426 00:21:49,650 --> 00:21:54,148 This was a classic 2003 report by what is now 427 00:21:54,148 --> 00:21:55,440 called the National Academy of Medicine. 428 00:21:55,440 --> 00:21:57,773 It used to be called the Institute of Medicine. 429 00:21:57,773 --> 00:21:59,700 It's part of the National Academies-- 430 00:21:59,700 --> 00:22:05,230 a report they did evaluating NIH. 431 00:22:05,230 --> 00:22:09,740 And it was a pretty startling criticism. 432 00:22:09,740 --> 00:22:14,480 And they looked hard at this NIH system. 433 00:22:14,480 --> 00:22:17,930 It really began to identify some of the problems. 434 00:22:17,930 --> 00:22:23,330 Interestingly, this report drove some interesting congressional 435 00:22:23,330 --> 00:22:26,390 reform attempts. 436 00:22:26,390 --> 00:22:31,145 Let me see if I can summarize a few key points here. 437 00:22:31,145 --> 00:22:32,520 And Chloe, did you have this one? 438 00:22:35,350 --> 00:22:36,430 You've got it, Martin? 439 00:22:36,430 --> 00:22:38,910 OK. 440 00:22:38,910 --> 00:22:40,590 All right. 441 00:22:40,590 --> 00:22:46,860 So as I said, NIH is more like a feudal barony. 442 00:22:46,860 --> 00:22:49,380 It always has a famous director. 
443 00:22:49,380 --> 00:22:51,870 But they don't have that much control over these institutes 444 00:22:51,870 --> 00:22:52,410 and centers. 445 00:22:52,410 --> 00:22:55,440 There is no centralized budgeting. 446 00:22:55,440 --> 00:22:57,840 And the institutes and centers battle 447 00:22:57,840 --> 00:23:03,270 to protect their percentage share of NIH's total budget. 448 00:23:03,270 --> 00:23:06,180 So Harold Varmus, a famous former director 449 00:23:06,180 --> 00:23:09,870 of NIH, who later came back in the Obama administration 450 00:23:09,870 --> 00:23:14,750 as head of the National Cancer Institute, said, in 2001, 451 00:23:14,750 --> 00:23:17,960 "NIH would be more efficient and more manageable 452 00:23:17,960 --> 00:23:22,370 if a far smaller number of larger institutes 453 00:23:22,370 --> 00:23:27,080 existed organized around broad science areas." 454 00:23:27,080 --> 00:23:30,730 Fewer institutes, organized around broad science areas-- 455 00:23:30,730 --> 00:23:33,328 those are his twin points here. 456 00:23:33,328 --> 00:23:35,120 And there's a lot to be said for that case. 457 00:23:35,120 --> 00:23:36,787 Instead, as we discussed, the institutes 458 00:23:36,787 --> 00:23:39,710 got set up essentially at the behest of disease groups 459 00:23:39,710 --> 00:23:43,320 to solve their disease problem. 460 00:23:43,320 --> 00:23:46,380 It turns out there isn't a separate pathway 461 00:23:46,380 --> 00:23:47,430 for each disease. 462 00:23:47,430 --> 00:23:49,620 It turns out there are a lot of common pathways 463 00:23:49,620 --> 00:23:52,090 across a lot of diseases. 464 00:23:52,090 --> 00:23:55,580 And we don't have a very good mechanism in NIH 465 00:23:55,580 --> 00:23:58,560 for getting groups of institutes to collaborate 466 00:23:58,560 --> 00:24:00,880 on a cross-cutting set of problems. 467 00:24:00,880 --> 00:24:05,160 So it's a fundamental organizational issue. 
468 00:24:05,160 --> 00:24:08,070 Now, there is another side of this argument, which 469 00:24:08,070 --> 00:24:10,980 is, you've got a lot of institutes 470 00:24:10,980 --> 00:24:14,670 with a lot of freedom to do a lot of stuff. 471 00:24:14,670 --> 00:24:17,910 Maybe a disaggregated model is not all bad. 472 00:24:17,910 --> 00:24:21,120 There's certainly a case to be made for that. 473 00:24:21,120 --> 00:24:23,070 But overall, there's a challenge here. 474 00:24:23,070 --> 00:24:30,120 And NIH's budget doubled from 1998 to 2003. 475 00:24:30,120 --> 00:24:36,300 So there was a federal budget surplus 476 00:24:36,300 --> 00:24:38,310 during the IT revolution, which was generating 477 00:24:38,310 --> 00:24:40,710 huge amounts of tax revenue. 478 00:24:40,710 --> 00:24:46,890 And two US senators saw this emerging surplus. 479 00:24:46,890 --> 00:24:50,670 They were on the appropriations subcommittee that handled NIH. 480 00:24:50,670 --> 00:24:52,230 And they went to the Senate floor 481 00:24:52,230 --> 00:24:55,980 and took a noticeable part of that surplus 482 00:24:55,980 --> 00:24:57,385 and gave it to NIH. 483 00:24:57,385 --> 00:25:02,190 It was a fascinating political development, very shrewd. 484 00:25:02,190 --> 00:25:05,910 Arlen Specter of Pennsylvania and Tom Harkin of Iowa 485 00:25:05,910 --> 00:25:09,720 were chairman and ranking member on the appropriations subcommittee. 486 00:25:09,720 --> 00:25:12,000 And they created the doubling. 487 00:25:12,000 --> 00:25:14,538 It's a remarkable story. 488 00:25:14,538 --> 00:25:16,080 They were able to do it because there 489 00:25:16,080 --> 00:25:17,910 was so much excitement at that time 490 00:25:17,910 --> 00:25:19,350 over the genomics revolution. 491 00:25:19,350 --> 00:25:23,670 So we talked about Venter and NIH's [? Collins ?] 492 00:25:23,670 --> 00:25:26,880 dueling over who was going to get to the genome first. 493 00:25:26,880 --> 00:25:28,920 That was creating huge excitement. 
494 00:25:28,920 --> 00:25:31,440 There was a sense that this medical research 495 00:25:31,440 --> 00:25:34,560 could lead to all kinds of new fundamental understandings. 496 00:25:34,560 --> 00:25:36,630 Everybody was excited about the idea. 497 00:25:36,630 --> 00:25:40,110 And that enabled those two senators to double NIH. 498 00:25:40,110 --> 00:25:43,020 No other agency has ever been able to do anything like this. 499 00:25:43,020 --> 00:25:46,590 The rest were stagnating. 500 00:25:46,590 --> 00:25:48,120 But the demographics are changing. 501 00:25:48,120 --> 00:25:50,830 The patterns of illness are changing. 502 00:25:50,830 --> 00:25:53,820 We face a growing set of biothreats. 503 00:25:53,820 --> 00:25:57,120 Is NIH too fragmented to cope with all these challenges? 504 00:25:57,120 --> 00:25:59,010 Can it respond quickly enough? 505 00:25:59,010 --> 00:26:02,820 These are all issues that the report raises. 506 00:26:02,820 --> 00:26:05,055 It wants to avoid a proliferation of new institutes. 507 00:26:05,055 --> 00:26:07,590 And in fact, Congress did that as part 508 00:26:07,590 --> 00:26:09,330 of its reform legislation. 509 00:26:09,330 --> 00:26:11,130 You can't have a new institute in NIH 510 00:26:11,130 --> 00:26:12,610 unless you close down another one. 511 00:26:18,410 --> 00:26:21,050 The Institute of Medicine, now the National Academy of Medicine, 512 00:26:21,050 --> 00:26:26,570 urged in this report that NIH focus on its capabilities. 513 00:26:26,570 --> 00:26:30,200 And it needed to do so because it concluded that NIH is not 514 00:26:30,200 --> 00:26:33,500 only imperfect, but nobody would have ever designed NIH this way 515 00:26:33,500 --> 00:26:35,370 at the outset. 516 00:26:35,370 --> 00:26:37,710 And in fact, Elias Zerhouni once told me 517 00:26:37,710 --> 00:26:43,280 and others in a meeting, if only we 518 00:26:43,280 --> 00:26:45,290 had thought through the organizational model 519 00:26:45,290 --> 00:26:48,320 before we doubled. 
520 00:26:48,320 --> 00:26:50,275 So we doubled first. 521 00:26:50,275 --> 00:26:51,650 And then there was a mad scramble 522 00:26:51,650 --> 00:26:54,258 to grab the money from the existing institutes. 523 00:26:54,258 --> 00:26:56,550 If only they had thought about the organizational model 524 00:26:56,550 --> 00:26:57,830 up front. 525 00:26:57,830 --> 00:27:02,390 Zerhouni led a great effort to try and get 526 00:27:02,390 --> 00:27:05,930 cross-cutting research efforts going across the institutes. 527 00:27:05,930 --> 00:27:08,230 So he created what he called the roadmap. 528 00:27:08,230 --> 00:27:10,070 Essentially, Congress later 529 00:27:10,070 --> 00:27:12,402 institutionalized that as the common fund. 530 00:27:12,402 --> 00:27:14,360 Take a little bit of money from each institute. 531 00:27:14,360 --> 00:27:17,660 Create a common fund in the control of the director, who 532 00:27:17,660 --> 00:27:19,340 can then allocate it to the highest 533 00:27:19,340 --> 00:27:21,955 cross-cutting priorities. 534 00:27:21,955 --> 00:27:23,330 So that was a significant reform. 535 00:27:23,330 --> 00:27:25,205 But it's still a very modest amount of money. 536 00:27:29,510 --> 00:27:33,410 NIH needs to pursue an ability to go 537 00:27:33,410 --> 00:27:36,360 after high-risk, high-return projects. 538 00:27:36,360 --> 00:27:38,048 That reaches a certain stage. 539 00:27:38,048 --> 00:27:39,590 And we'll talk about this more later. 540 00:27:39,590 --> 00:27:45,210 But peer review is not good at taking risk. 541 00:27:45,210 --> 00:27:46,980 If there's a certain level of competition, 542 00:27:46,980 --> 00:27:51,720 if the award rate gets lower than about one out of three, 543 00:27:51,720 --> 00:27:58,080 your ability to identify breakthrough opportunities that 544 00:27:58,080 --> 00:28:01,370 are feasible gets harder. 
545 00:28:01,370 --> 00:28:04,760 So it tends to default to-- 546 00:28:04,760 --> 00:28:07,310 when there's a lot of competition for an award, 547 00:28:07,310 --> 00:28:10,100 the one that the peer community is most confident 548 00:28:10,100 --> 00:28:14,330 will yield results, albeit incremental results. 549 00:28:14,330 --> 00:28:16,535 Why take risks on high flyers when you've 550 00:28:16,535 --> 00:28:17,660 got to get the basics done? 551 00:28:17,660 --> 00:28:19,940 And so many people are fighting for the money. 552 00:28:19,940 --> 00:28:21,920 So this is a huge problem at NIH. 553 00:28:21,920 --> 00:28:25,010 How does it take the necessary risks that 554 00:28:25,010 --> 00:28:28,010 need to be taken in innovation? 555 00:28:28,010 --> 00:28:29,960 So since then, NIH has been working 556 00:28:29,960 --> 00:28:34,400 on creating categories of higher-risk projects 557 00:28:34,400 --> 00:28:36,110 that have a separate kind of review 558 00:28:36,110 --> 00:28:38,790 process from normal projects. 559 00:28:38,790 --> 00:28:41,510 So there are positive NIH capabilities. 560 00:28:41,510 --> 00:28:43,670 We talked about how decentralization 561 00:28:43,670 --> 00:28:44,660 can be an advantage. 562 00:28:44,660 --> 00:28:49,690 We talked about having many hands setting R&D priorities. 563 00:28:49,690 --> 00:28:52,850 You get a lot of safety nets here. 564 00:28:52,850 --> 00:28:57,200 There are benefits to investigator-initiated grants. 565 00:28:57,200 --> 00:28:59,690 You don't want to just have large-scale projects. 566 00:28:59,690 --> 00:29:02,418 You want bottom-up opportunities. 567 00:29:02,418 --> 00:29:04,460 The issue for NIH, though, is, historically, it's 568 00:29:04,460 --> 00:29:06,170 been totally dominated by bottom up 569 00:29:06,170 --> 00:29:09,140 and has a modest number of cross-cutting projects. 
570 00:29:12,560 --> 00:29:15,650 Peer review in some ways is like Winston Churchill's definition 571 00:29:15,650 --> 00:29:20,120 of democracy, the worst possible system except 572 00:29:20,120 --> 00:29:21,590 for all the others. 573 00:29:21,590 --> 00:29:24,800 So peer review remains competitive. 574 00:29:24,800 --> 00:29:27,380 And that's a very important feature to build in. 575 00:29:27,380 --> 00:29:30,650 So [? IOM ?] recommended much more centralized management, 576 00:29:30,650 --> 00:29:34,730 giving more authority to the director of NIH, 577 00:29:34,730 --> 00:29:37,370 getting the director to engage in strategic planning 578 00:29:37,370 --> 00:29:43,190 across the institutes so they get onto cross-cutting plans, 579 00:29:43,190 --> 00:29:45,020 doing cross-cutting budgets. 580 00:29:45,020 --> 00:29:48,170 So this idea of take 10% of the budget 581 00:29:48,170 --> 00:29:50,270 and fund the strategic plans, actually, Congress 582 00:29:50,270 --> 00:29:53,090 worked on implementing that through the common fund. 583 00:29:53,090 --> 00:29:57,620 Strengthening the control of the director 584 00:29:57,620 --> 00:30:00,980 over these cross-agency initiatives was key. 585 00:30:00,980 --> 00:30:04,310 They also recommended creating a DARPA within NIH 586 00:30:04,310 --> 00:30:10,550 to go after the high-risk, high-reward projects. 587 00:30:10,550 --> 00:30:13,910 They wanted improvements in NIH's intramural programs. 588 00:30:13,910 --> 00:30:16,600 So NIH is also a major research entity itself. 589 00:30:16,600 --> 00:30:19,963 It's not just funding universities. The intramural research historically 590 00:30:19,963 --> 00:30:22,130 has tended to be somewhat weaker than the university 591 00:30:22,130 --> 00:30:24,350 research, which is much more competition-based 592 00:30:24,350 --> 00:30:26,870 rather than entitlement-funding-based. 593 00:30:26,870 --> 00:30:29,160 They wanted to strengthen that. 
594 00:30:29,160 --> 00:30:31,190 There is a desperate need for standardization 595 00:30:31,190 --> 00:30:33,320 of data and information systems across 596 00:30:33,320 --> 00:30:35,030 these different institutes. 597 00:30:35,030 --> 00:30:38,960 Because a lot of these diseases are related, 598 00:30:38,960 --> 00:30:41,270 the institutes could learn from each other-- but without 599 00:30:41,270 --> 00:30:42,860 that standardization, you can't do cross-cutting data analytics. 600 00:30:45,580 --> 00:30:50,090 So there is a series of reform steps here. 601 00:30:50,090 --> 00:30:53,570 Going back to some earlier points we've made in the class, 602 00:30:53,570 --> 00:30:57,710 NIH is not a connected organization. 603 00:30:57,710 --> 00:31:00,890 It's an early-stage basic research organization. 604 00:31:00,890 --> 00:31:02,900 It's not connected to the following stages. 605 00:31:02,900 --> 00:31:08,480 Now, it has managed to create a model through biotechs 606 00:31:08,480 --> 00:31:11,450 that are venture-funded that is able to get 607 00:31:11,450 --> 00:31:17,120 across the valley of death if you've got a blockbuster drug. 608 00:31:17,120 --> 00:31:19,430 That's pretty creative. 609 00:31:19,430 --> 00:31:23,180 Other agencies haven't come up with that. 610 00:31:23,180 --> 00:31:26,120 It's not as if NIH was sitting around figuring it out. 611 00:31:26,120 --> 00:31:29,930 But in effect, it occurred and the training of the talent 612 00:31:29,930 --> 00:31:32,760 helped it occur through people like Boyer and Swanson. 613 00:31:32,760 --> 00:31:33,260 Martine? 614 00:31:33,260 --> 00:31:34,840 AUDIENCE: Would we define blockbuster drugs 615 00:31:34,840 --> 00:31:36,632 as ones that make a lot of money or ones 616 00:31:36,632 --> 00:31:39,050 that impact a disease that a lot of people have? 617 00:31:39,050 --> 00:31:41,500 WILLIAM BONVILLIAN: One that makes a lot of money. 618 00:31:41,500 --> 00:31:43,190 Right. 
619 00:31:43,190 --> 00:31:46,400 I mean, in theory, it won't get through FDA unless it actually 620 00:31:46,400 --> 00:31:47,210 solves a problem. 621 00:31:52,930 --> 00:31:55,300 NIH is primarily small-grant research 622 00:31:55,300 --> 00:31:58,270 and lacks the capability to set up 623 00:31:58,270 --> 00:31:59,852 initiatives across the stovepipes 624 00:31:59,852 --> 00:32:01,060 as we've talked about before. 625 00:32:01,060 --> 00:32:04,400 It tends to be fairly slow moving. 626 00:32:04,400 --> 00:32:06,190 There's a risk in peer review of avoiding 627 00:32:06,190 --> 00:32:09,070 high-risk, high-payoff approaches. 628 00:32:09,070 --> 00:32:11,380 There are limited connections to industry. 629 00:32:11,380 --> 00:32:15,190 In other words, that translational research effect 630 00:32:15,190 --> 00:32:19,510 is hard to do at NIH, although Collins, 631 00:32:19,510 --> 00:32:22,540 who's the current director of NIH, 632 00:32:22,540 --> 00:32:26,170 has worked on creating a special translational medicine 633 00:32:26,170 --> 00:32:28,240 institute aimed at that problem. 634 00:32:28,240 --> 00:32:30,100 So NIH has been taking steps on this. 635 00:32:30,100 --> 00:32:31,970 I think that's a pretty good-- 636 00:32:31,970 --> 00:32:34,690 I'll give you one more example-- 637 00:32:34,690 --> 00:32:35,600 nanotechnology. 638 00:32:38,830 --> 00:32:41,117 The earliest beneficiary of nanotechnology 639 00:32:41,117 --> 00:32:42,700 was probably the semiconductor sector. 640 00:32:42,700 --> 00:32:45,340 But the life science sector, the health research sector, 641 00:32:45,340 --> 00:32:47,440 was very close. 642 00:32:47,440 --> 00:32:51,220 Huge potential opportunity spaces 643 00:32:51,220 --> 00:32:53,170 in understanding things at the nanoscale here. 
644 00:32:58,150 --> 00:33:04,290 Because of the disaggregated structure of NIH, 645 00:33:04,290 --> 00:33:07,380 for many, many years after the nanotechnology initiative was 646 00:33:07,380 --> 00:33:09,630 created, NIH-- 647 00:33:09,630 --> 00:33:12,720 by far, the largest research organization-- 648 00:33:12,720 --> 00:33:15,900 was only spending a fraction of what the National Science 649 00:33:15,900 --> 00:33:19,170 Foundation was spending on nanotechnology, even 650 00:33:19,170 --> 00:33:23,220 though the gains for the health research system 651 00:33:23,220 --> 00:33:27,450 were, and clearly are, phenomenal. 652 00:33:27,450 --> 00:33:31,800 So that's an example of a problem 653 00:33:31,800 --> 00:33:34,650 in being able to attack a whole new approach to 654 00:33:34,650 --> 00:33:39,060 disease across this very disaggregated model. 655 00:33:42,320 --> 00:33:46,040 So that's the IOM report of 2003. 656 00:33:46,040 --> 00:33:48,110 We've learned some lessons from this. 657 00:33:48,110 --> 00:33:49,740 We tried to make some changes. 658 00:33:49,740 --> 00:33:51,740 And Congress has actually provided a fair amount 659 00:33:51,740 --> 00:33:54,200 of leadership on that in creating the common fund 660 00:33:54,200 --> 00:33:57,682 and capping the number of institutes. 661 00:33:57,682 --> 00:34:00,140 Martin, you want to take us through some questions in this? 662 00:34:00,140 --> 00:34:02,928 AUDIENCE: Do you actually want to do the Cooke-Deegan one? 663 00:34:02,928 --> 00:34:04,720 Because that way I can just integrate them. 664 00:34:04,720 --> 00:34:05,750 And it might be easier. 665 00:34:05,750 --> 00:34:06,230 WILLIAM BONVILLIAN: All right. 666 00:34:06,230 --> 00:34:07,760 We can do Deegan and do it together. 667 00:34:07,760 --> 00:34:09,650 And we don't need to spend a lot of time 668 00:34:09,650 --> 00:34:11,790 on this because last week's class was on DARPA. 669 00:34:11,790 --> 00:34:15,830 So I'm not going to recap this. 
670 00:34:15,830 --> 00:34:20,690 Robert Cooke-Deegan was deputy when Watson was heading 671 00:34:20,690 --> 00:34:22,770 the Genome Project at NIH. 672 00:34:22,770 --> 00:34:25,409 So he knows NIH from the inside. 673 00:34:25,409 --> 00:34:29,860 He later went on to head Duke's Genome Institute. 674 00:34:29,860 --> 00:34:33,670 So he's an outstanding researcher. 675 00:34:33,670 --> 00:34:39,100 And now, he is teaching science and technology policy 676 00:34:39,100 --> 00:34:43,449 at the Washington branch of Arizona State. 677 00:34:43,449 --> 00:34:47,230 So he's gone over to the dark side of science policy 678 00:34:47,230 --> 00:34:50,620 where people like me inhabit the world. 679 00:34:50,620 --> 00:34:54,130 But he's got real talents, unlike me actually, 680 00:34:54,130 --> 00:34:56,949 on the technology side. 681 00:34:56,949 --> 00:35:00,580 And he wrote this piece back in 1996 fairly fresh 682 00:35:00,580 --> 00:35:05,890 off his own NIH experience, Does NIH Need a DARPA? 683 00:35:05,890 --> 00:35:08,680 And I can't tell you how controversial this was. 684 00:35:08,680 --> 00:35:11,650 NIH has always viewed itself as better than everybody else. 685 00:35:11,650 --> 00:35:13,560 And the fact that one of their own, 686 00:35:13,560 --> 00:35:16,240 the deputy director of their Genome Project, 687 00:35:16,240 --> 00:35:18,728 was telling them there was a better model out there that 688 00:35:18,728 --> 00:35:20,770 they ought to at least consider adopting for part 689 00:35:20,770 --> 00:35:23,200 of their operation-- certainly, not for all-- 690 00:35:23,200 --> 00:35:26,470 was a powerful message. 691 00:35:26,470 --> 00:35:28,250 And we know the DARPA story. 692 00:35:28,250 --> 00:35:30,790 So I'm not going to recap that. 693 00:35:30,790 --> 00:35:34,090 Robert Cooke-Deegan underscores that the DARPA model 694 00:35:34,090 --> 00:35:38,320 suggests that peer review is not the only way of organizing 695 00:35:38,320 --> 00:35:40,150 research. 
696 00:35:40,150 --> 00:35:43,720 DARPA has far lower transaction costs than NIH does, 697 00:35:43,720 --> 00:35:46,657 a lot lower review costs per science direction. 698 00:35:49,520 --> 00:35:54,680 He introduces this issue of how conservative peer review gets 699 00:35:54,680 --> 00:35:58,370 when it gets to an award rate that's 700 00:35:58,370 --> 00:36:03,510 much worse than about one grant awarded per three applications. 701 00:36:03,510 --> 00:36:04,010 702 00:36:06,860 --> 00:36:08,750 NIH has got a very limited ability 703 00:36:08,750 --> 00:36:14,790 to pursue the grand challenge model. 704 00:36:14,790 --> 00:36:16,700 It's got limited ability to take high risks 705 00:36:16,700 --> 00:36:19,220 and get high rewards. 706 00:36:19,220 --> 00:36:22,820 It's got a lower risk acceptance capability. 707 00:36:22,820 --> 00:36:26,240 Those are DARPA traits that NIH-- not all of NIH, 708 00:36:26,240 --> 00:36:31,190 but certainly some part of it-- might adopt, he argues. 709 00:36:31,190 --> 00:36:32,450 He cites a lot of examples. 710 00:36:36,540 --> 00:36:38,150 The Human Genome Initiative actually 711 00:36:38,150 --> 00:36:40,790 originated in the Department of Energy 712 00:36:40,790 --> 00:36:45,080 because the Department of Energy understood supercomputers. 713 00:36:45,080 --> 00:36:47,210 So for the first five years of the Genome Project, 714 00:36:47,210 --> 00:36:51,740 it was carried by the supercomputer gurus 715 00:36:51,740 --> 00:36:53,240 at DOE, because they understood what 716 00:36:53,240 --> 00:36:55,950 the potential of supercomputing was going to be in figuring out 717 00:36:55,950 --> 00:36:57,920 the genetic structure. 718 00:36:57,920 --> 00:37:02,570 NIH only picked it up when it saw the promise of the model. 719 00:37:02,570 --> 00:37:05,720 NIH had terrible trouble coping with the advances made 720 00:37:05,720 --> 00:37:06,650 by Leroy Hood. 
721 00:37:06,650 --> 00:37:09,380 And we'll talk later about Venter 722 00:37:09,380 --> 00:37:12,590 and how you adapt and incorporate 723 00:37:12,590 --> 00:37:15,860 a computational model in medical research. 724 00:37:15,860 --> 00:37:18,560 They're locked into a biology-only drug development 725 00:37:18,560 --> 00:37:19,820 model pretty much. 726 00:37:19,820 --> 00:37:22,450 It's hard for them to do other territories, 727 00:37:22,450 --> 00:37:25,280 as we'll talk about when we get to the convergence report. 728 00:37:25,280 --> 00:37:30,650 So the lessons from DARPA for handling 729 00:37:30,650 --> 00:37:34,640 multidisciplinary approaches-- bringing in a group of talent 730 00:37:34,640 --> 00:37:37,670 from different fields around a problem-- there are 731 00:37:37,670 --> 00:37:42,660 lessons here for NIH. 732 00:37:42,660 --> 00:37:47,240 I mean, that's the heart of Robert Cooke-Deegan's 733 00:37:47,240 --> 00:37:48,200 recommendations. 734 00:37:48,200 --> 00:37:52,297 And Martine, it's yours. 735 00:37:52,297 --> 00:37:54,755 AUDIENCE: So I think you did a good analysis of the papers. 736 00:37:54,755 --> 00:37:57,800 So I won't really summarize too much more. 737 00:37:57,800 --> 00:38:00,530 I think we can just go straight into discussion. 738 00:38:00,530 --> 00:38:03,860 And so I think a good first question would be-- we 739 00:38:03,860 --> 00:38:06,320 know that there's a lot of cost in the health care industry, 740 00:38:06,320 --> 00:38:09,220 and the structure they have isn't perfect. 741 00:38:09,220 --> 00:38:11,720 And so how do we create a system so that these problems that 742 00:38:11,720 --> 00:38:14,240 need to get solved but don't have an economic incentive 743 00:38:14,240 --> 00:38:15,740 because they're not blockbusters, 744 00:38:15,740 --> 00:38:19,010 how do you combine NIH with maybe some DARPA-like abilities 745 00:38:19,010 --> 00:38:20,750 and industry? 
746 00:38:20,750 --> 00:38:24,687 And if anybody's specialty is higher research, 747 00:38:24,687 --> 00:38:26,770 it would be interesting to get your insight first. 748 00:38:30,418 --> 00:38:32,210 AUDIENCE: What do you mean by higher research? 749 00:38:32,210 --> 00:38:33,627 AUDIENCE: Well, if you're focusing 750 00:38:33,627 --> 00:38:37,980 on an academic career in a national lab, 751 00:38:37,980 --> 00:38:39,590 you know more about that than I would. 752 00:38:39,590 --> 00:38:40,700 AUDIENCE: Where is Lilly when you need her? 753 00:38:40,700 --> 00:38:42,400 AUDIENCE: Yes, you got it. 754 00:38:46,390 --> 00:38:49,750 AUDIENCE: Yeah, I think you probably, 755 00:38:49,750 --> 00:38:52,150 as Cooke-Deegan laid it out, you just really struggle 756 00:38:52,150 --> 00:38:55,510 in this peer review process if you have an idea for maybe 757 00:38:55,510 --> 00:38:59,680 not a blockbuster drug but a new approach 758 00:38:59,680 --> 00:39:02,320 to a system that hasn't really been tried out yet. 759 00:39:02,320 --> 00:39:03,460 So that's what you see. 760 00:39:03,460 --> 00:39:06,490 [INAUDIBLE] a mentor have this huge problem. 761 00:39:06,490 --> 00:39:08,260 And I think it really is just kind 762 00:39:08,260 --> 00:39:13,120 of allocating a part of that NIH budget 763 00:39:13,120 --> 00:39:18,040 that they get to this DARPA-like high-impact, maybe 764 00:39:18,040 --> 00:39:22,410 high-risk research and then moving from there. 765 00:39:22,410 --> 00:39:25,410 But I'm not sure because it takes 766 00:39:25,410 --> 00:39:28,953 so long to develop these things, a three- to five-year timeline. 767 00:39:28,953 --> 00:39:31,370 And then you have to figure out clinical trials and things 768 00:39:31,370 --> 00:39:32,430 like that. 
769 00:39:32,430 --> 00:39:34,430 Would the time scale need to be adjusted 770 00:39:34,430 --> 00:39:35,930 to account for doing the research 771 00:39:35,930 --> 00:39:37,430 and then moving into clinical trials 772 00:39:37,430 --> 00:39:41,840 to be able to actually support these high-risk initiatives? 773 00:39:41,840 --> 00:39:43,438 And a lot of these I feel like will 774 00:39:43,438 --> 00:39:45,480 be building on research that doesn't really exist 775 00:39:45,480 --> 00:39:46,610 or isn't really there yet. 776 00:39:46,610 --> 00:39:50,803 So we'll need more time to stand up [INAUDIBLE]. 777 00:39:50,803 --> 00:39:52,470 AUDIENCE: Yeah, I think there's probably 778 00:39:52,470 --> 00:39:55,380 two main ways that maybe some of the later readings 779 00:39:55,380 --> 00:39:56,110 touch on more. 780 00:39:56,110 --> 00:40:00,630 But definitely, there are incentives on the FDA side 781 00:40:00,630 --> 00:40:04,110 to extend and make that process a little bit more 782 00:40:04,110 --> 00:40:08,490 profitable-- say, if it's an orphan drug where you have 783 00:40:08,490 --> 00:40:12,660 a very small market that could be addressed when 784 00:40:12,660 --> 00:40:16,890 the drug comes out, by increasing the patent life, 785 00:40:16,890 --> 00:40:19,020 so they have extended patent provisions 786 00:40:19,020 --> 00:40:20,760 for those kinds of drugs. 787 00:40:20,760 --> 00:40:24,060 That definitely helps make it more profitable. 788 00:40:24,060 --> 00:40:27,870 And also, a lot of the later papers talk about this. 789 00:40:27,870 --> 00:40:30,390 But they have a lot of flexibility 790 00:40:30,390 --> 00:40:32,910 in the way they can price their drugs. 791 00:40:32,910 --> 00:40:37,710 So I think that kind of helps a lot because if they can charge 792 00:40:37,710 --> 00:40:43,050 a lot of money for a niche drug, that kind of compensates 793 00:40:43,050 --> 00:40:46,565 for the fact that you're not giving it to as many people. 
794 00:40:46,565 --> 00:40:48,690 AUDIENCE: And I guess my concern with that last one 795 00:40:48,690 --> 00:40:50,882 is what about a Martin Shkreli situation? 796 00:40:50,882 --> 00:40:51,840 AUDIENCE: I mean, yeah. 797 00:40:51,840 --> 00:40:53,940 Obviously, that's going to come up 798 00:40:53,940 --> 00:40:55,770 because all these companies will want 799 00:40:55,770 --> 00:40:58,230 to recoup their costs because they're spending 800 00:40:58,230 --> 00:41:02,180 millions and billions of dollars on these drugs to develop them. 801 00:41:02,180 --> 00:41:04,830 And there is kind of an argument for both ways 802 00:41:04,830 --> 00:41:08,010 that it's kind of justified but, I mean, obviously, bad 803 00:41:08,010 --> 00:41:10,730 for society. 804 00:41:10,730 --> 00:41:14,040 AUDIENCE: And also, you could make a very similar argument 805 00:41:14,040 --> 00:41:17,280 for any other drug-- one of these blockbuster drugs, 806 00:41:17,280 --> 00:41:18,630 as we say. 807 00:41:18,630 --> 00:41:21,090 Because well, you can still charge 808 00:41:21,090 --> 00:41:22,650 an exorbitant price for them. 809 00:41:22,650 --> 00:41:24,630 You can get tons of money for them. 810 00:41:24,630 --> 00:41:27,155 And then you're back to square one where, OK, let's just 811 00:41:27,155 --> 00:41:28,530 keep funding the drugs that we've 812 00:41:28,530 --> 00:41:33,380 been doing because don't fix what isn't broken, right? 813 00:41:33,380 --> 00:41:39,140 So I'm not sure how it would incentivize people to focus 814 00:41:39,140 --> 00:41:40,280 on these smaller drugs. 815 00:41:40,280 --> 00:41:44,570 Now, one way you could do it, maybe tax incentives 816 00:41:44,570 --> 00:41:45,830 or something. 817 00:41:45,830 --> 00:41:48,860 Maybe force insurance companies to foot the bill a bit more. 818 00:41:48,860 --> 00:41:53,177 But I don't really know the details on how that would work. 
819 00:41:53,177 --> 00:41:55,760 WILLIAM BONVILLIAN: This is one of the great societal dilemmas 820 00:41:55,760 --> 00:41:57,593 that we've got in health care at the moment. 821 00:41:57,593 --> 00:42:03,050 You want to create enormous incentives for researchers 822 00:42:03,050 --> 00:42:06,080 and the biotech companies they may create 823 00:42:06,080 --> 00:42:07,890 to go after these big problems. 824 00:42:07,890 --> 00:42:11,280 You really want to incentivize that process. 825 00:42:11,280 --> 00:42:13,460 So if you start to disincentivize it, 826 00:42:13,460 --> 00:42:17,030 you're going to have fewer remedies on the table. 827 00:42:17,030 --> 00:42:20,330 On the other hand, you've got serious cost problems 828 00:42:20,330 --> 00:42:22,610 for some portion of the population that's 829 00:42:22,610 --> 00:42:25,113 not going to be able to cope with the cost structure. 830 00:42:25,113 --> 00:42:26,780 And you're going to have real reluctance 831 00:42:26,780 --> 00:42:28,220 on the part of government agencies 832 00:42:28,220 --> 00:42:32,630 to sign up for a huge cost burden that 833 00:42:32,630 --> 00:42:35,330 only helps a relatively small number of people. 834 00:42:35,330 --> 00:42:39,410 So this is a dilemma that is now upon us big time 835 00:42:39,410 --> 00:42:42,680 that we have not sorted through in an intelligent way. 836 00:42:42,680 --> 00:42:49,500 And there's constant reviling of drug and biotech companies. 837 00:42:49,500 --> 00:42:51,583 But on the other hand, they're the ones 838 00:42:51,583 --> 00:42:53,250 that created these opportunities for us. 839 00:42:53,250 --> 00:42:55,400 So it's a very contradictory-- 840 00:42:55,400 --> 00:42:59,140 the politics doesn't seem to recognize the importance 841 00:42:59,140 --> 00:43:00,380 of the innovation system. 842 00:43:00,380 --> 00:43:03,556 And yet we don't have the right balance between the two. 843 00:43:03,556 --> 00:43:04,306 AUDIENCE: Yeah, [? 
844 00:43:04,306 --> 00:43:06,150 I was just going to answer ?] [INAUDIBLE] the policy side, 845 00:43:06,150 --> 00:43:06,780 or whatever [? you want to say. ?] 846 00:43:06,780 --> 00:43:07,660 AUDIENCE: Oh, I actually was going 847 00:43:07,660 --> 00:43:10,308 to say that also I think the dimension that's also missing 848 00:43:10,308 --> 00:43:12,100 is something we touched on last week, which 849 00:43:12,100 --> 00:43:13,540 is the ethical components. 850 00:43:13,540 --> 00:43:16,300 I think there's really a rising bioethics community surrounding 851 00:43:16,300 --> 00:43:20,230 issues of utilitarianism versus issues of human suffering. 852 00:43:20,230 --> 00:43:22,290 Do we seek to address the person who 853 00:43:22,290 --> 00:43:25,000 is suffering the most or the number of people 854 00:43:25,000 --> 00:43:26,350 who are suffering? 855 00:43:26,350 --> 00:43:29,830 And that is a really difficult, I think, 856 00:43:29,830 --> 00:43:31,720 ethical question that has yet to be 857 00:43:31,720 --> 00:43:34,490 addressed on a national level. 858 00:43:34,490 --> 00:43:38,575 And I think it's easier-- 859 00:43:41,308 --> 00:43:43,350 I guess, I think about it in terms of heuristics, 860 00:43:43,350 --> 00:43:45,460 like short cuts for decision making. 861 00:43:45,460 --> 00:43:48,000 If we understand what our values are nationally, 862 00:43:48,000 --> 00:43:51,150 we can understand where to begin to allocate money. 
863 00:43:51,150 --> 00:43:53,910 And if we sort out our national values 864 00:43:53,910 --> 00:43:56,610 in terms of the life sciences and what 865 00:43:56,610 --> 00:43:58,800 we think are our national priorities, 866 00:43:58,800 --> 00:44:01,873 perhaps through a referendum-like system, 867 00:44:01,873 --> 00:44:03,540 I think it would actually be a much more 868 00:44:03,540 --> 00:44:06,450 efficient and accountable way for the government 869 00:44:06,450 --> 00:44:08,740 to make funding priority decisions 870 00:44:08,740 --> 00:44:12,540 rather than for life sciences organizations and the NIH 871 00:44:12,540 --> 00:44:15,420 to sort of make those decisions for themselves 872 00:44:15,420 --> 00:44:16,680 or for the citizens. 873 00:44:16,680 --> 00:44:22,080 Because ultimately, this is an issue at the human level. 874 00:44:22,080 --> 00:44:24,600 And that implies an understanding 875 00:44:24,600 --> 00:44:27,308 of bioethical considerations. 876 00:44:27,308 --> 00:44:29,850 WILLIAM BONVILLIAN: I think that tends to run into a problem, 877 00:44:29,850 --> 00:44:33,570 however, with root attitudes of the American population. 878 00:44:33,570 --> 00:44:36,270 And I would say that a lot of the American population 879 00:44:36,270 --> 00:44:38,820 believes that we should be allowed to live forever 880 00:44:38,820 --> 00:44:40,230 because we're Americans. 881 00:44:40,230 --> 00:44:43,350 So we should be eternal. 882 00:44:43,350 --> 00:44:44,640 I'm exaggerating a bit here. 883 00:44:44,640 --> 00:44:46,230 But there's a lot of thinking along 884 00:44:46,230 --> 00:44:48,780 those lines and a lot of assumptions 885 00:44:48,780 --> 00:44:53,520 that any cost, whatever it is, is justified in terms 886 00:44:53,520 --> 00:44:55,660 of the outcome for my health. 887 00:44:55,660 --> 00:44:59,760 So these are really tough decisions. 888 00:44:59,760 --> 00:45:02,520 And the life science system is now right up against this.
889 00:45:02,520 --> 00:45:04,940 And the whole innovation model is right up against these. 890 00:45:04,940 --> 00:45:09,555 So these are important considerations. 891 00:45:09,555 --> 00:45:10,180 AUDIENCE: Yeah. 892 00:45:10,180 --> 00:45:14,170 Just my concern with the proposal as far as putting it 893 00:45:14,170 --> 00:45:17,350 in the hands of the American people is 894 00:45:17,350 --> 00:45:21,490 what questions are appropriate for letting people 895 00:45:21,490 --> 00:45:23,925 answer versus the ones that are for experts. 896 00:45:23,925 --> 00:45:25,300 [INAUDIBLE] A lot of people would 897 00:45:25,300 --> 00:45:28,317 debate that the Brexit question wasn't really 898 00:45:28,317 --> 00:45:30,400 something that the average person could understand 899 00:45:30,400 --> 00:45:32,515 well enough to make an informed decision. 900 00:45:32,515 --> 00:45:34,480 And so I guess if you're asking strictly 901 00:45:34,480 --> 00:45:36,670 what are your ethical beliefs, vote on that. 902 00:45:36,670 --> 00:45:38,712 I guess that's something that people can vote on. 903 00:45:38,712 --> 00:45:40,690 But if you're given options of should we 904 00:45:40,690 --> 00:45:45,250 fund something that will save x number of people at this cost, 905 00:45:45,250 --> 00:45:48,130 weighing that information is more difficult than I think 906 00:45:48,130 --> 00:45:51,640 the average person is willing to put the time in to understand. 907 00:45:51,640 --> 00:45:54,490 So I think that, to some extent, NIH 908 00:45:54,490 --> 00:45:55,990 and these kinds of organizations need 909 00:45:55,990 --> 00:45:59,580 to make the decisions on what our national priorities are 910 00:45:59,580 --> 00:46:02,437 but, ideally, with some input from us.
911 00:46:02,437 --> 00:46:04,020 WILLIAM BONVILLIAN: I want to push you 912 00:46:04,020 --> 00:46:08,440 back to the issue that's before us in this class, which 913 00:46:08,440 --> 00:46:12,170 is the innovation organization model that we've got here. 914 00:46:12,170 --> 00:46:15,070 So we're looking at an innovation organization that 915 00:46:15,070 --> 00:46:17,410 evolved in a very historical set of ways 916 00:46:17,410 --> 00:46:19,990 where, at the foundational level, 917 00:46:19,990 --> 00:46:22,960 the scientific problems turned out 918 00:46:22,960 --> 00:46:26,860 to be different than the way we see them now. 919 00:46:26,860 --> 00:46:29,350 We tend to see that there are a lot 920 00:46:29,350 --> 00:46:31,780 of cross-cutting scientific approaches 921 00:46:31,780 --> 00:46:34,870 that cut across many disease pathways that may 922 00:46:34,870 --> 00:46:36,880 be critical for those diseases. 923 00:46:36,880 --> 00:46:39,490 Whereas back then, we saw, oh, there 924 00:46:39,490 --> 00:46:42,730 will be one remedy for each disease, 925 00:46:42,730 --> 00:46:46,210 a much earlier-stage scientific notion of disease. 926 00:46:46,210 --> 00:46:48,070 And we're caught up in that model. 927 00:46:48,070 --> 00:46:52,930 And how do we bring change here? 928 00:46:52,930 --> 00:46:54,895 We're up against a legacy-sector problem, 929 00:46:54,895 --> 00:46:57,160 as we'll talk about in a little bit. 930 00:46:57,160 --> 00:47:00,900 How do we bring change to that legacy sector 931 00:47:00,900 --> 00:47:04,050 from an organizational point of view? 932 00:47:04,050 --> 00:47:07,650 AUDIENCE: I have not an answer, but a non-answer. 933 00:47:07,650 --> 00:47:11,620 I am going back to the attack on peer review, as it were. 934 00:47:11,620 --> 00:47:16,727 I think that's a very odd place to start in terms of something 935 00:47:16,727 --> 00:47:17,310 to get rid of.
936 00:47:17,310 --> 00:47:22,500 I think there were definitely a lot of reasonable arguments 937 00:47:22,500 --> 00:47:25,730 presented in favor of adopting something like a DARPA model. 938 00:47:25,730 --> 00:47:27,510 And then going back to your ethics-- 939 00:47:27,510 --> 00:47:29,820 you raised the point of bioethics considerations. 940 00:47:29,820 --> 00:47:33,290 It seems to me that peer review in the system-- 941 00:47:33,290 --> 00:47:35,040 or the values of peer review-- 942 00:47:35,040 --> 00:47:39,960 are so deeply ingrained in the biopharmaceutical-- 943 00:47:39,960 --> 00:47:44,687 well, more on the biological and medical research community. 944 00:47:44,687 --> 00:47:46,770 It seems like it's very much part of their values. 945 00:47:46,770 --> 00:47:48,840 And it would be very odd to me to see 946 00:47:48,840 --> 00:47:51,090 them separated out because I can recognize 947 00:47:51,090 --> 00:47:53,260 all of the detriments of the peer review system 948 00:47:53,260 --> 00:47:56,140 and how it can favor incremental, 949 00:47:56,140 --> 00:48:02,100 unimportant research over big innovations and risk takers. 950 00:48:02,100 --> 00:48:04,890 But it still seems like, if you remove that barrier, 951 00:48:04,890 --> 00:48:07,200 then you might have a flood of ethical problems 952 00:48:07,200 --> 00:48:08,940 that would just start cropping up. 953 00:48:08,940 --> 00:48:10,950 WILLIAM BONVILLIAN: Look, I mean, 954 00:48:10,950 --> 00:48:13,770 the DARPA model does not work in all circumstances. 955 00:48:13,770 --> 00:48:16,950 In other words, the DARPA model is very much a top-down model. 956 00:48:16,950 --> 00:48:20,580 A bunch of elite program managers 957 00:48:20,580 --> 00:48:22,890 are looking at things they want out 958 00:48:22,890 --> 00:48:25,140 of the end of the innovation pipeline 959 00:48:25,140 --> 00:48:28,940 and designing projects to get there.
960 00:48:28,940 --> 00:48:31,970 And we can certainly see why there's room for this model, 961 00:48:31,970 --> 00:48:35,750 for that kind of projects-based, challenge-based model. 962 00:48:35,750 --> 00:48:39,200 NIH offers the other side of the coin. 963 00:48:39,200 --> 00:48:41,360 Researchers propose projects, and they 964 00:48:41,360 --> 00:48:44,660 get funded based upon the interest and quality of what 965 00:48:44,660 --> 00:48:46,100 they're proposing. 966 00:48:46,100 --> 00:48:47,770 That's very much a bottom-up model. 967 00:48:47,770 --> 00:48:50,020 In other words, a lot of people can see a lot of stuff 968 00:48:50,020 --> 00:48:52,700 at the bottom level that a top-down model 969 00:48:52,700 --> 00:48:54,110 can't necessarily see. 970 00:48:54,110 --> 00:48:55,880 So Chloe, you're right. 971 00:48:55,880 --> 00:49:00,800 It's not that you want to replace NIH with DARPA. 972 00:49:00,800 --> 00:49:02,480 I think what Robert Cooke-Deegan would 973 00:49:02,480 --> 00:49:07,620 say is maybe there's room within a large NIH 974 00:49:07,620 --> 00:49:09,870 entity for a DARPA-like piece. 975 00:49:09,870 --> 00:49:13,320 Just as in the defense innovation system, 976 00:49:13,320 --> 00:49:16,590 DARPA is only one element of a much larger defense research 977 00:49:16,590 --> 00:49:17,670 portfolio. 978 00:49:17,670 --> 00:49:20,290 But it does provide some interesting capability. 979 00:49:23,830 --> 00:49:29,880 And peer review does have advantages 980 00:49:29,880 --> 00:49:33,660 from that bottom-up perspective that a top-down, strong program 981 00:49:33,660 --> 00:49:36,990 manager perspective will not necessarily be able to capture.
982 00:49:41,247 --> 00:49:42,955 AUDIENCE: So from a business perspective, 983 00:49:42,955 --> 00:49:44,080 the thing I'm wondering about is-- 984 00:49:44,080 --> 00:49:45,913 because there is this trend right now called 985 00:49:45,913 --> 00:49:47,825 social bonds, where they 986 00:49:47,825 --> 00:49:50,200 look at the parts of society that are really wasting 987 00:49:50,200 --> 00:49:52,000 a lot of money and then they create 988 00:49:52,000 --> 00:49:53,590 businesses to solve them. 989 00:49:53,590 --> 00:49:56,850 And then they give a return. 990 00:49:56,850 --> 00:49:58,780 That's why I asked about the blockbuster drug. 991 00:49:58,780 --> 00:50:00,530 Because if you can find a blockbuster drug, 992 00:50:00,530 --> 00:50:02,830 it's also a drug that reduces a ton of costs 993 00:50:02,830 --> 00:50:03,630 for the government. 994 00:50:03,630 --> 00:50:05,755 And the business model is the government is saying, 995 00:50:05,755 --> 00:50:08,410 oh, we're spending $100 million here to solve this problem. 996 00:50:08,410 --> 00:50:11,370 We're going to send some portion of that money to you. 997 00:50:11,370 --> 00:50:14,500 And that's a good way to cut the fat. 998 00:50:14,500 --> 00:50:16,120 Because if the major problem is we're 999 00:50:16,120 --> 00:50:18,730 going to use all our budget to solve these diseases-- 1000 00:50:18,730 --> 00:50:22,887 there's probably a disease that's 80% or a huge proportion, 1001 00:50:22,887 --> 00:50:24,970 because it's not going to be a linear relationship 1002 00:50:24,970 --> 00:50:26,805 with the cost. 1003 00:50:26,805 --> 00:50:28,180 My other concern too was when you 1004 00:50:28,180 --> 00:50:30,408 said that they doubled the budget, 1005 00:50:30,408 --> 00:50:31,700 this happens a lot in software. 1006 00:50:31,700 --> 00:50:33,880 So it's hard and unintuitive to understand. 1007 00:50:33,880 --> 00:50:36,580 But it's like, say we're going to have a lunch.
1008 00:50:36,580 --> 00:50:39,040 And I say, our budget is $100,000. 1009 00:50:39,040 --> 00:50:40,040 Then I have to justify-- 1010 00:50:40,040 --> 00:50:42,130 WILLIAM BONVILLIAN: That would be quite a lunch. 1011 00:50:42,130 --> 00:50:44,530 AUDIENCE: Yeah, that's the thing because we're not just 1012 00:50:44,530 --> 00:50:45,130 eating, right? 1013 00:50:45,130 --> 00:50:49,690 It's like, OK, now, we have to justify the $100,000. 1014 00:50:49,690 --> 00:50:51,293 Or a cool example is right now there 1015 00:50:51,293 --> 00:50:53,950 is a startup in Silicon Valley that spent like $120 million 1016 00:50:53,950 --> 00:50:55,540 to develop a juicer, 1017 00:50:55,540 --> 00:50:59,620 so that you could only juice with the machine. 1018 00:50:59,620 --> 00:51:02,387 And then a reporter was like, I call your bluff. 1019 00:51:02,387 --> 00:51:03,970 And they squeezed it with their hands. 1020 00:51:03,970 --> 00:51:05,730 And they did it. 1021 00:51:05,730 --> 00:51:08,170 So my concern is when you have too much money, 1022 00:51:08,170 --> 00:51:09,910 you tend to have to justify things. 1023 00:51:09,910 --> 00:51:11,723 And you come up with this weird logic tree. 1024 00:51:11,723 --> 00:51:13,390 And you don't really fundamentally solve 1025 00:51:13,390 --> 00:51:14,300 the problem. 1026 00:51:14,300 --> 00:51:16,840 In code, if there's a piece of code that one coder could do 1027 00:51:16,840 --> 00:51:19,007 and you put five people on it, it actually complicates it. 1028 00:51:19,007 --> 00:51:20,210 It makes it way harder. 1029 00:51:20,210 --> 00:51:22,300 And now, you've got people infighting. 1030 00:51:22,300 --> 00:51:25,770 So that was my big concern with that too. 1031 00:51:25,770 --> 00:51:28,800 AUDIENCE: I think-- and Martin may be trying to get at it.
1032 00:51:28,800 --> 00:51:31,080 This kind of group theory doesn't really 1033 00:51:31,080 --> 00:51:33,665 exist, or kind of breaks down, in NIH 1034 00:51:33,665 --> 00:51:35,790 because it's really hard to get these cross-cutting 1035 00:51:35,790 --> 00:51:38,140 technologies and encourage that collaboration. 1036 00:51:38,140 --> 00:51:42,420 And I think what I really want to see out of the DARPA 1037 00:51:42,420 --> 00:51:45,100 model that they pull in is not so much getting rid of the peer review 1038 00:51:45,100 --> 00:51:47,280 but start-- 1039 00:51:47,280 --> 00:51:50,130 because DARPA has the ability to pull these great groups 1040 00:51:50,130 --> 00:51:52,697 together and just call on a lot of different folks 1041 00:51:52,697 --> 00:51:54,780 from a lot of different areas and then direct them 1042 00:51:54,780 --> 00:51:58,500 towards a specific research problem. 1043 00:51:58,500 --> 00:52:00,600 And I think where NIH struggles is, yeah, 1044 00:52:00,600 --> 00:52:02,700 they might have a call to say we want to get rid 1045 00:52:02,700 --> 00:52:05,442 of cancer or heart disease. 1046 00:52:05,442 --> 00:52:07,650 But there are a lot of different folks working on it. 1047 00:52:07,650 --> 00:52:11,880 And not a lot of them, I would say, probably work together 1048 00:52:11,880 --> 00:52:14,190 in tandem to really figure out and solve 1049 00:52:14,190 --> 00:52:18,540 maybe one aspect of that problem for maybe 10 years. 1050 00:52:18,540 --> 00:52:20,280 And then they focus on a different aspect 1051 00:52:20,280 --> 00:52:23,670 of how a disease would work in a different lab. 1052 00:52:23,670 --> 00:52:27,070 And I think NIH really struggles in pulling together 1053 00:52:27,070 --> 00:52:29,070 these cross-cutting technologies because there's 1054 00:52:29,070 --> 00:52:32,490 no one pulling them together.
1055 00:52:32,490 --> 00:52:36,350 And they just leave it up to the discretion of the researchers 1056 00:52:36,350 --> 00:52:38,220 and scientists and pharmaceutical companies 1057 00:52:38,220 --> 00:52:40,740 to come up with incremental advances 1058 00:52:40,740 --> 00:52:47,730 rather than both issuing big calls for research 1059 00:52:47,730 --> 00:52:49,830 projects and also funding and making 1060 00:52:49,830 --> 00:52:52,080 it possible for great groups to work on these research 1061 00:52:52,080 --> 00:52:53,400 projects. 1062 00:52:53,400 --> 00:52:54,310 AUDIENCE: Max? 1063 00:52:54,310 --> 00:52:54,850 WILLIAM BONVILLIAN: Martin, you want 1064 00:52:54,850 --> 00:52:57,570 to give us some closing thoughts on these two pieces? 1065 00:52:57,570 --> 00:52:58,390 Oh, I'm sorry, Max. 1066 00:52:58,390 --> 00:52:58,890 Did you-- 1067 00:52:58,890 --> 00:53:00,940 AUDIENCE: I actually had a quick question. 1068 00:53:00,940 --> 00:53:07,020 So given that this was published 21 years ago, give or take, 1069 00:53:07,020 --> 00:53:07,600 I'm curious-- 1070 00:53:07,600 --> 00:53:10,017 WILLIAM BONVILLIAN: Robert Cooke-Deegan's piece, you mean. 1071 00:53:10,017 --> 00:53:10,690 AUDIENCE: Yeah. 1072 00:53:10,690 --> 00:53:14,050 Did NIH ever attempt anything like this? 1073 00:53:14,050 --> 00:53:16,592 Because we've had some significant time period. 1074 00:53:16,592 --> 00:53:17,800 Did they at least look at it? 1075 00:53:17,800 --> 00:53:18,865 Or did they decide, no? 1076 00:53:18,865 --> 00:53:20,740 AUDIENCE: And also, how much criticism did he 1077 00:53:20,740 --> 00:53:23,773 get for publishing that? 1078 00:53:23,773 --> 00:53:25,690 WILLIAM BONVILLIAN: It was a threatening piece 1079 00:53:25,690 --> 00:53:29,360 to the NIH and life science community, 1080 00:53:29,360 --> 00:53:33,590 frankly, arguing that there may be an additional model that you 1081 00:53:33,590 --> 00:53:35,480 all need to consider.
1082 00:53:35,480 --> 00:53:37,970 So it was not received with open arms. 1083 00:53:37,970 --> 00:53:46,010 Now, in fact, the current NIH director, Collins, 1084 00:53:46,010 --> 00:53:50,390 early on in his tenure as director, 1085 00:53:50,390 --> 00:53:52,400 latched onto the problem that NIH 1086 00:53:52,400 --> 00:53:55,410 has got in doing translational work-- in other words, 1087 00:53:55,410 --> 00:53:57,320 moving a technology from the basic stage 1088 00:53:57,320 --> 00:53:59,990 into follow-on sectors and doing the handoff 1089 00:53:59,990 --> 00:54:01,010 to the private sector. 1090 00:54:01,010 --> 00:54:04,280 So Collins began to focus on that problem. 1091 00:54:04,280 --> 00:54:08,450 And there was discussion, at that time, of doing something 1092 00:54:08,450 --> 00:54:09,320 like a DARPA. 1093 00:54:09,320 --> 00:54:14,030 So in his effort to create NCATS to do translational medicine, 1094 00:54:14,030 --> 00:54:17,090 the director that he hired to head that new institute-- 1095 00:54:17,090 --> 00:54:18,860 and he had to close another one to do it-- 1096 00:54:22,630 --> 00:54:25,000 actually really began thinking seriously 1097 00:54:25,000 --> 00:54:29,080 about should there be DARPA-like elements in this new NCATS 1098 00:54:29,080 --> 00:54:30,213 entity. 1099 00:54:30,213 --> 00:54:31,630 Chris Austin, who is that director-- 1100 00:54:31,630 --> 00:54:38,290 a very talented and very interesting person-- however, 1101 00:54:38,290 --> 00:54:41,080 just didn't have the resources to set up a whole new entity 1102 00:54:41,080 --> 00:54:44,560 within his NCATS piece. 1103 00:54:44,560 --> 00:54:46,150 But it's an idea that continues 1104 00:54:46,150 --> 00:54:48,370 to kick around here and there.
1105 00:54:48,370 --> 00:54:52,390 And frankly, I would view it as an interesting additional 1106 00:54:52,390 --> 00:54:56,380 feature for NIH to do some things that it can't really 1107 00:54:56,380 --> 00:54:59,920 do without this kind of organizational model. 1108 00:54:59,920 --> 00:55:03,220 So I think there are organizational lessons here 1109 00:55:03,220 --> 00:55:05,590 that we can take from the issues we've been reviewing 1110 00:55:05,590 --> 00:55:11,500 and apply them to a long-established research entity 1111 00:55:11,500 --> 00:55:15,110 to create new things in the model. 1112 00:55:15,110 --> 00:55:17,147 How about some closing thoughts, Martine? 1113 00:55:17,147 --> 00:55:18,730 AUDIENCE: I mean, the closing thoughts 1114 00:55:18,730 --> 00:55:20,800 are that we discussed the NIH model 1115 00:55:20,800 --> 00:55:22,780 and how it does work for what they're doing 1116 00:55:22,780 --> 00:55:25,120 but also how there are a lot of blind spots and areas 1117 00:55:25,120 --> 00:55:29,483 that aren't being touched or solved. 1118 00:55:29,483 --> 00:55:31,150 And then we discussed DARPA, which might 1119 00:55:31,150 --> 00:55:32,317 be a good way of solving it. 1120 00:55:32,317 --> 00:55:36,250 But it seems like it won't fit in well with NIH. 1121 00:55:36,250 --> 00:55:39,130 But there is a structural dissonance, I would say, 1122 00:55:39,130 --> 00:55:41,920 in terms of that this form factor does not work 1123 00:55:41,920 --> 00:55:43,450 for these kinds of problems. 1124 00:55:43,450 --> 00:55:47,200 And it might also be the groupthink 1125 00:55:47,200 --> 00:55:49,360 in the area that affects their ability to solve 1126 00:55:49,360 --> 00:55:50,690 these kinds of problems. 1127 00:55:50,690 --> 00:55:53,440 So it might be better for there to be a DARPA-like 1128 00:55:53,440 --> 00:55:57,460 NIH entity, but not at NIH or nearby, because it might be too 1129 00:55:57,460 --> 00:56:00,100 difficult.
And also, how do you create natural incentives 1130 00:56:00,100 --> 00:56:01,140 for the researchers? 1131 00:56:01,140 --> 00:56:05,110 Because it seems like they avoid these kinds of leapfrogging 1132 00:56:05,110 --> 00:56:08,300 research initiatives because it's a lot simpler 1133 00:56:08,300 --> 00:56:10,270 not to and they have a lot to lose if they do fail. 1134 00:56:10,270 --> 00:56:12,790 And so how do you create this kind of comfort zone 1135 00:56:12,790 --> 00:56:15,550 for researchers so that, even if they do fail, 1136 00:56:15,550 --> 00:56:17,750 they get some kind of reward? 1137 00:56:17,750 --> 00:56:21,017 And how do they get recognition for that sacrifice of what 1138 00:56:21,017 --> 00:56:22,350 you call their "academic career"? 1139 00:56:24,927 --> 00:56:26,010 WILLIAM BONVILLIAN: Right. 1140 00:56:26,010 --> 00:56:28,230 I mean, that's another issue too, 1141 00:56:28,230 --> 00:56:32,660 which is having multiple PIs on the problem. 1142 00:56:32,660 --> 00:56:35,460 It's complicated in an R01 award process 1143 00:56:35,460 --> 00:56:38,070 that focuses on the single PI. 1144 00:56:38,070 --> 00:56:41,850 Let me go onto the next couple of readings. 1145 00:56:41,850 --> 00:56:44,490 And we'll do them as a pair. 1146 00:56:44,490 --> 00:56:47,280 So the Infectious Diseases Society of America 1147 00:56:47,280 --> 00:56:51,630 has this report called Bad Bugs, No Drugs, which essentially 1148 00:56:51,630 --> 00:56:54,750 summarizes the whole problem. 1149 00:56:54,750 --> 00:56:56,640 And then the Food and Drug Administration 1150 00:56:56,640 --> 00:57:00,570 has a paper on innovation or stagnation, 1151 00:57:00,570 --> 00:57:04,080 which focuses on some of the problems they face. 1152 00:57:04,080 --> 00:57:08,580 So let's do those. 1153 00:57:08,580 --> 00:57:10,332 And Chloe, do you have those? 1154 00:57:10,332 --> 00:57:11,790 Or Martin, do you have one of those?
1155 00:57:11,790 --> 00:57:12,748 AUDIENCE: Do you have-- 1156 00:57:12,748 --> 00:57:13,630 I know I have-- 1157 00:57:13,630 --> 00:57:14,550 AUDIENCE: Which one? 1158 00:57:14,550 --> 00:57:15,720 WILLIAM BONVILLIAN: Bad Bugs, No Drugs, 1159 00:57:15,720 --> 00:57:17,012 the Infectious Diseases Society? 1160 00:57:19,986 --> 00:57:21,414 AUDIENCE: I can do this one. 1161 00:57:24,567 --> 00:57:25,970 I'm prepared for it, yeah. 1162 00:57:25,970 --> 00:57:26,350 WILLIAM BONVILLIAN: All right. 1163 00:57:26,350 --> 00:57:27,808 Well, Chloe, which one do you have? 1164 00:57:27,808 --> 00:57:28,930 You have the FDA one? 1165 00:57:28,930 --> 00:57:30,303 Innovation, stagnation. 1166 00:57:30,303 --> 00:57:30,970 All right, fine. 1167 00:57:30,970 --> 00:57:33,340 OK. 1168 00:57:33,340 --> 00:57:36,080 I mean, the title gives this story away. 1169 00:57:36,080 --> 00:57:38,800 And this is the Infectious Diseases Society. 1170 00:57:38,800 --> 00:57:48,280 And they note that resistance in bacteria, I think, 1171 00:57:48,280 --> 00:57:51,070 as everybody in this classroom knows, is very much on the rise. 1172 00:57:54,355 --> 00:57:56,480 And again, this report was written a few years ago. 1173 00:57:56,480 --> 00:57:58,150 Two million people in US hospitals 1174 00:57:58,150 --> 00:58:00,400 are going to get bacterial infections in the hospital. 1175 00:58:00,400 --> 00:58:03,600 And 90,000 of them are going to die. 1176 00:58:03,600 --> 00:58:09,310 That is a staggering number, right? 1177 00:58:09,310 --> 00:58:14,320 We lose 30,000 people a year in automobile accidents. 1178 00:58:14,320 --> 00:58:18,030 We're losing three times that many in hospitals. 1179 00:58:18,030 --> 00:58:21,280 So talk about tolerance for risk. 1180 00:58:21,280 --> 00:58:24,160 Americans haven't fully woken up to this. 1181 00:58:24,160 --> 00:58:28,420 So hospitals are increasingly a dangerous place to be.
1182 00:58:28,420 --> 00:58:34,120 And only two classes of antibiotics 1183 00:58:34,120 --> 00:58:36,460 had been developed at the time this was written 1184 00:58:36,460 --> 00:58:37,660 in the previous 30 years. 1185 00:58:37,660 --> 00:58:39,550 And one of those is already facing resistance 1186 00:58:39,550 --> 00:58:43,150 in this kind of endless cycle of buildup 1187 00:58:43,150 --> 00:58:48,340 of resistance that bacterial sources go through. 1188 00:58:48,340 --> 00:58:55,420 And by the late 1960s, 80% of staph bacteria 1189 00:58:55,420 --> 00:58:58,030 were penicillin-resistant. 1190 00:58:58,030 --> 00:59:00,910 And in pneumonia, 40% of the infections 1191 00:59:00,910 --> 00:59:05,920 were resistant to one drug and 15% to the next three. 1192 00:59:05,920 --> 00:59:09,820 So this is a serious, growing problem. 1193 00:59:09,820 --> 00:59:15,650 Yet because of the blockbuster drug model, 1194 00:59:15,650 --> 00:59:18,920 there's no incentive for drug companies or biotechs 1195 00:59:18,920 --> 00:59:25,790 to go after these antibiotics because they cure the problem. 1196 00:59:28,310 --> 00:59:30,110 So you take the antibiotic for two weeks, 1197 00:59:30,110 --> 00:59:31,880 and the problem is solved. 1198 00:59:31,880 --> 00:59:35,330 What you want, under the blockbuster drug model, 1199 00:59:35,330 --> 00:59:37,100 is something that I'm going to have 1200 00:59:37,100 --> 00:59:40,910 to take for the rest of my life for $100,000 a year, right? 1201 00:59:40,910 --> 00:59:42,320 You don't really want to cure it. 1202 00:59:42,320 --> 00:59:45,230 You want to create incremental advances that 1203 00:59:45,230 --> 00:59:46,820 manage the problem. 1204 00:59:46,820 --> 00:59:51,260 Whereas the antibiotic actually cures the problem. 1205 00:59:51,260 --> 00:59:53,720 So there's very little incentive since there's 1206 00:59:53,720 --> 00:59:57,122 no economic return model that works.
1207 00:59:57,122 --> 00:59:58,580 There's very little incentive to go 1208 00:59:58,580 --> 01:00:01,220 after these antibiotic problems. 1209 01:00:01,220 --> 01:00:03,260 They work too well too fast. 1210 01:00:03,260 --> 01:00:07,040 So it's a very weak return on investment. 1211 01:00:07,040 --> 01:00:08,720 And successful antibiotics are just 1212 01:00:08,720 --> 01:00:11,690 too successful to justify the investment cost. 1213 01:00:11,690 --> 01:00:15,350 So everybody is aware of this. 1214 01:00:15,350 --> 01:00:22,460 Elias Zerhouni, who preceded Francis Collins as head of NIH, 1215 01:00:22,460 --> 01:00:24,530 had this roadmap model. 1216 01:00:24,530 --> 01:00:27,530 And this was certainly on that list of cross-cutting issues 1217 01:00:27,530 --> 01:00:29,570 that need to be dealt with. 1218 01:00:29,570 --> 01:00:31,310 But how do you get around the 1219 01:00:31,310 --> 01:00:35,690 problematic economics of the blockbuster drug 1220 01:00:35,690 --> 01:00:36,380 model here? 1221 01:00:36,380 --> 01:00:41,780 So, I mean, there've been a variety of ideas advanced. 1222 01:00:41,780 --> 01:00:46,190 There was bioshield legislation to deal with biothreats. 1223 01:00:46,190 --> 01:00:49,610 But it could also bear on infectious diseases. 1224 01:00:49,610 --> 01:00:53,140 The idea there was, for a biothreat, 1225 01:00:53,140 --> 01:00:56,790 why would anybody develop a biothreat remedy? 1226 01:00:56,790 --> 01:01:00,940 Because the drug would only be sold 1227 01:01:00,940 --> 01:01:03,370 if there was a completely unpredictable, 1228 01:01:03,370 --> 01:01:05,642 terrible national disaster. 1229 01:01:05,642 --> 01:01:07,850 So are you going to take all the risks and go through 1230 01:01:07,850 --> 01:01:12,110 the $1.4-billion clinical trial process to develop a remedy 1231 01:01:12,110 --> 01:01:14,720 for something that may well never be used? 1232 01:01:14,720 --> 01:01:19,370 It's a similar kind of problem for antibiotic drugs.
1233 01:01:19,370 --> 01:01:24,272 So the Infectious Diseases Society said, wait a minute. 1234 01:01:24,272 --> 01:01:26,480 The same model that would deal with biothreats could 1235 01:01:26,480 --> 01:01:28,130 work here, which is that the government would agree 1236 01:01:28,130 --> 01:01:32,280 to buy a certain number of dosages, 1237 01:01:32,280 --> 01:01:36,440 a certain volume of the remedy, at a set 1238 01:01:36,440 --> 01:01:39,752 price if you developed the remedy. 1239 01:01:39,752 --> 01:01:41,460 So the risk on the part of the government 1240 01:01:41,460 --> 01:01:42,418 is actually pretty low. 1241 01:01:42,418 --> 01:01:45,660 It only has to buy something if you solve the problem. 1242 01:01:45,660 --> 01:01:47,820 But then of course, that's what biotechs do anyway. 1243 01:01:47,820 --> 01:01:50,650 That's the economic model they work off of. 1244 01:01:50,650 --> 01:01:52,710 So could the government intervene in this sector 1245 01:01:52,710 --> 01:01:56,820 and, in effect, really change around the risk-reward model? 1246 01:01:56,820 --> 01:01:59,953 So ideas like this come to bear here. 1247 01:01:59,953 --> 01:02:01,620 We certainly haven't solved this problem, 1248 01:02:01,620 --> 01:02:04,170 and it remains very much with us, but it illustrates 1249 01:02:04,170 --> 01:02:06,030 what the issues are. 1250 01:02:06,030 --> 01:02:09,000 This other report is from the FDA, Innovation/Stagnation-- 1251 01:02:09,000 --> 01:02:10,980 Challenge and Opportunity on the Critical Path 1252 01:02:10,980 --> 01:02:14,220 to New Medical Products. 1253 01:02:14,220 --> 01:02:16,980 And both these reports, by the way, have been updated. 1254 01:02:16,980 --> 01:02:19,320 I put the originals in, which are fairly hard-hitting. 1255 01:02:19,320 --> 01:02:22,840 But they've been updated since then by these organizations, 1256 01:02:22,840 --> 01:02:24,090 so you can get later versions. 1257 01:02:27,740 --> 01:02:31,880 FDA does not have its own substantial research arm.
1258 01:02:31,880 --> 01:02:35,430 NIH does the medical research. 1259 01:02:35,430 --> 01:02:43,040 Yet we're not doing research on how 1260 01:02:43,040 --> 01:02:50,240 to get a better, more reliable, and certainly speedier 1261 01:02:50,240 --> 01:02:53,360 evaluation and approval process for FDA. 1262 01:02:53,360 --> 01:02:54,860 We don't have that. 1263 01:02:54,860 --> 01:02:56,900 There's nobody on that problem. 1264 01:02:56,900 --> 01:02:58,958 NIH does not view that as their problem. 1265 01:02:58,958 --> 01:03:00,500 They view their problem as developing 1266 01:03:00,500 --> 01:03:05,780 new drugs, new therapies, not figuring out a safety-approval 1267 01:03:05,780 --> 01:03:07,230 set of problems. 1268 01:03:07,230 --> 01:03:09,500 So nobody's really on that problem 1269 01:03:09,500 --> 01:03:14,150 except for FDA's own fairly modest research budget. 1270 01:03:14,150 --> 01:03:17,060 So the picture that this report portrays 1271 01:03:17,060 --> 01:03:22,520 is ongoing breakthrough scientific discoveries 1272 01:03:22,520 --> 01:03:27,590 that get nailed because the drug approval process isn't 1273 01:03:27,590 --> 01:03:30,350 receiving new science and new technology 1274 01:03:30,350 --> 01:03:33,088 advances that would enable it to keep up 1275 01:03:33,088 --> 01:03:34,130 with these breakthroughs. 1276 01:03:34,130 --> 01:03:36,020 And we mentioned this earlier. 1277 01:03:36,020 --> 01:03:40,310 But the most serious one that's ahead is in precision medicine 1278 01:03:40,310 --> 01:03:44,510 and personalized medicine, where it's developing a therapy 1279 01:03:44,510 --> 01:03:48,530 or a remedy that's uniquely appropriate for you 1280 01:03:48,530 --> 01:03:53,060 as opposed to me, a personalized medicine approach. 1281 01:03:53,060 --> 01:03:55,520 How are we going to do the approval process for that? 1282 01:03:55,520 --> 01:03:58,430 How is FDA going to manage that?
1283 01:03:58,430 --> 01:04:06,850 So it is a dilemma here in the system. 1284 01:04:06,850 --> 01:04:11,330 And again, it's an innovation organization problem. 1285 01:04:11,330 --> 01:04:13,160 We've got an innovation organization 1286 01:04:13,160 --> 01:04:15,110 that's focused on one set of the problems. 1287 01:04:15,110 --> 01:04:18,050 And they've got the resources to do the R&D on it. 1288 01:04:18,050 --> 01:04:20,060 And yet we've got a parallel big problem that's 1289 01:04:20,060 --> 01:04:22,060 jeopardizing the other system. 1290 01:04:22,060 --> 01:04:24,110 And we don't have the organizational capability 1291 01:04:24,110 --> 01:04:25,400 of acting on it. 1292 01:04:25,400 --> 01:04:27,560 So this is yet another innovation 1293 01:04:27,560 --> 01:04:29,030 organization problem: when 1294 01:04:29,030 --> 01:04:31,310 you think about these things as systems, 1295 01:04:31,310 --> 01:04:33,230 you begin to identify what the gaps are. 1296 01:04:33,230 --> 01:04:35,243 And here is one. 1297 01:04:35,243 --> 01:04:36,785 That's probably enough on this topic. 1298 01:04:39,350 --> 01:04:43,830 So why don't we go right into some Q&A on this. 1299 01:04:43,830 --> 01:04:46,243 Martine, you want to lead us on Bad Bugs, No Drugs? 1300 01:04:46,243 --> 01:04:46,868 AUDIENCE: Yeah. 1301 01:04:51,850 --> 01:04:53,560 So I guess the main theme of this 1302 01:04:53,560 --> 01:04:56,470 is their way of using policy or subsidies to incentivize 1303 01:04:56,470 --> 01:04:59,445 the private sector to take on this problem. 1304 01:04:59,445 --> 01:05:01,570 Or is the private sector even the right entity 1305 01:05:01,570 --> 01:05:02,640 to handle this problem? 1306 01:05:09,990 --> 01:05:12,620 AUDIENCE: I feel like the suggestion 1307 01:05:12,620 --> 01:05:15,200 of having the government guarantee to pay 1308 01:05:15,200 --> 01:05:17,630 for a certain number of dosages seems 1309 01:05:17,630 --> 01:05:20,260 like a pretty good fix for it. 
1310 01:05:20,260 --> 01:05:24,275 If you leave the free market alone, these companies, 1311 01:05:24,275 --> 01:05:25,900 they currently don't have an incentive. 1312 01:05:25,900 --> 01:05:30,310 So I don't think that's really an option. 1313 01:05:30,310 --> 01:05:32,930 AUDIENCE: A professor of mine, an economics professor 1314 01:05:32,930 --> 01:05:36,830 at Wellesley and Cornell, is very supportive of patent 1315 01:05:36,830 --> 01:05:39,450 extensions, which is something that was cited in the report. 1316 01:05:39,450 --> 01:05:41,690 And she felt, at least in my understanding, 1317 01:05:41,690 --> 01:05:43,610 that would very much motivate bio and pharma 1318 01:05:43,610 --> 01:05:45,780 companies to do research, primarily 1319 01:05:45,780 --> 01:05:47,780 because they're concerned that their patents are 1320 01:05:47,780 --> 01:05:50,660 going to run out in the lifecycle of the innovation 1321 01:05:50,660 --> 01:05:51,720 process. 1322 01:05:51,720 --> 01:05:54,242 WILLIAM BONVILLIAN: And explain how that would work, Steph? 1323 01:05:54,242 --> 01:05:56,700 AUDIENCE: Oh, man, I don't know that I could do it justice. 1324 01:05:56,700 --> 01:05:58,970 WILLIAM BONVILLIAN: Just briefly. 1325 01:05:58,970 --> 01:06:00,150 Give us a snapshot. 1326 01:06:00,150 --> 01:06:00,650 Or Chris. 1327 01:06:00,650 --> 01:06:02,960 AUDIENCE: So I think, my understanding 1328 01:06:02,960 --> 01:06:05,820 of the patent-extension process is that, 1329 01:06:05,820 --> 01:06:10,160 especially for orphan drugs, where it's under 1330 01:06:10,160 --> 01:06:12,560 10,000 people or something like that-- 1331 01:06:12,560 --> 01:06:16,880 so a smaller market size-- or it's targeting children 1332 01:06:16,880 --> 01:06:19,280 or adolescents, you get patent extensions. 1333 01:06:19,280 --> 01:06:21,440 And that could be three to five years extra, 1334 01:06:21,440 --> 01:06:26,810 which is pretty significant in terms of drug companies. 
1335 01:06:26,810 --> 01:06:30,020 And yeah, I think those are the main ones. 1336 01:06:30,020 --> 01:06:33,560 And there are also breakthrough therapy designations. 1337 01:06:33,560 --> 01:06:36,740 And those not only help speed up the approval process, which 1338 01:06:36,740 --> 01:06:39,050 is really valuable for these companies that really want 1339 01:06:39,050 --> 01:06:41,600 to push these drugs through really quickly 1340 01:06:41,600 --> 01:06:43,400 and get them to market. 1341 01:06:43,400 --> 01:06:47,280 And then they also extend it a little bit in certain cases. 1342 01:06:47,280 --> 01:06:49,940 So there are a lot of different small ways 1343 01:06:49,940 --> 01:06:52,230 FDA is trying to create more incentives, 1344 01:06:52,230 --> 01:06:54,050 which is nice to see. 1345 01:06:54,050 --> 01:06:55,500 I think it's a good policy. 1346 01:06:55,500 --> 01:06:58,510 AUDIENCE: And to add a lower-order analysis 1347 01:06:58,510 --> 01:07:02,106 from microeconomics, I think the way 1348 01:07:02,106 --> 01:07:04,790 that she explained it was essentially-- and all of you 1349 01:07:04,790 --> 01:07:05,510 may know this. 1350 01:07:05,510 --> 01:07:07,400 I did not know this until two years ago. 1351 01:07:07,400 --> 01:07:09,600 So maybe this is new to someone-- 1352 01:07:09,600 --> 01:07:13,023 is that researchers, I mean, from the time 1353 01:07:13,023 --> 01:07:14,690 that they apply for a patent, they have, 1354 01:07:14,690 --> 01:07:16,590 what is it, 15, 17 years or something 1355 01:07:16,590 --> 01:07:18,090 like that to commercialization where 1356 01:07:18,090 --> 01:07:19,230 their patent is protected. 1357 01:07:19,230 --> 01:07:19,970 AUDIENCE: I thought it was 20. 1358 01:07:19,970 --> 01:07:20,450 AUDIENCE: 20? 1359 01:07:20,450 --> 01:07:21,170 AUDIENCE: It changed. 1360 01:07:21,170 --> 01:07:22,545 WILLIAM BONVILLIAN: Yeah, it did. 1361 01:07:22,545 --> 01:07:25,520 [INAUDIBLE] is now 20, from 17. 
1362 01:07:25,520 --> 01:07:26,900 AUDIENCE: So it used to be 17. 1363 01:07:26,900 --> 01:07:28,490 And now, it is 20. 1364 01:07:28,490 --> 01:07:30,410 And so the way that she explained it to us 1365 01:07:30,410 --> 01:07:35,390 is that, because a lot of the research-and-development 1366 01:07:35,390 --> 01:07:38,090 process could take 20 years, by the time 1367 01:07:38,090 --> 01:07:39,972 they get a drug to market, they may 1368 01:07:39,972 --> 01:07:41,180 have already lost the patent. 1369 01:07:41,180 --> 01:07:45,320 So it no longer becomes economically advantageous 1370 01:07:45,320 --> 01:07:47,270 for the company to pursue commercialization 1371 01:07:47,270 --> 01:07:50,450 of a particular therapy, and thus what Chris was saying. 1372 01:07:50,450 --> 01:07:54,140 So those are just sort of the small connections. 1373 01:07:54,140 --> 01:07:58,030 AUDIENCE: I have separate concerns, sort of from a scientific 1374 01:07:58,030 --> 01:07:59,910 or a biological perspective. 1375 01:07:59,910 --> 01:08:03,860 So I wonder, is it that these drugs are super effective, 1376 01:08:03,860 --> 01:08:05,690 or that antibiotics are the best mechanism 1377 01:08:05,690 --> 01:08:07,315 to get rid of these bacteria in the way 1378 01:08:07,315 --> 01:08:09,740 that we traditionally think 1379 01:08:09,740 --> 01:08:14,990 about them? 1380 01:08:14,990 --> 01:08:16,993 Is that actually the most effective means? 1381 01:08:16,993 --> 01:08:18,410 Or is that actually how they work? 1382 01:08:18,410 --> 01:08:20,090 Or do we not understand the pathways 1383 01:08:20,090 --> 01:08:21,640 of how they're actually becoming more 1384 01:08:21,640 --> 01:08:24,319 resistant to these technologies? 
1385 01:08:24,319 --> 01:08:30,200 So my question: is there just a kind of groupthink in the way 1386 01:08:30,200 --> 01:08:34,550 that we think about antibiotics and bacterial theory 1387 01:08:34,550 --> 01:08:39,560 in biology or in medicine? 1388 01:08:39,560 --> 01:08:41,750 We're just kind of following along the same path. 1389 01:08:41,750 --> 01:08:44,600 So I would love to talk to a bacteriologist. 1390 01:08:44,600 --> 01:08:48,770 And then, two, is this problem one that we will ever solve? 1391 01:08:48,770 --> 01:08:50,510 Or will it just get exponentially worse? 1392 01:08:50,510 --> 01:08:52,218 Because I can imagine, even if we come up 1393 01:08:52,218 --> 01:08:54,500 with a faster and faster mechanism 1394 01:08:54,500 --> 01:08:56,990 to get these antibiotics out and reach more people, 1395 01:08:56,990 --> 01:08:59,840 will these bacteria just kind of keep growing 1396 01:08:59,840 --> 01:09:01,220 and get faster and more resistant, 1397 01:09:01,220 --> 01:09:03,710 and we'll always have this problem? 1398 01:09:03,710 --> 01:09:05,479 And then thirdly, does that mean that we 1399 01:09:05,479 --> 01:09:12,680 should have an established kind of entity within the FDA 1400 01:09:12,680 --> 01:09:16,640 that just deals with antibiotics, if they deem it 1401 01:09:16,640 --> 01:09:19,680 a serious enough problem and it's going to be persistent? 1402 01:09:19,680 --> 01:09:24,420 AUDIENCE: [INAUDIBLE] 1403 01:09:24,420 --> 01:09:26,441 AUDIENCE: Yeah, I can try to speak to that. 1404 01:09:26,441 --> 01:09:28,149 AUDIENCE: The question I was going to ask 1405 01:09:28,149 --> 01:09:29,877 is-- you can think about this theoretically. 
1406 01:09:29,877 --> 01:09:32,210 But I was going to ask, so if you had to start a company 1407 01:09:32,210 --> 01:09:34,521 right now, and you're relatively young in your career 1408 01:09:34,521 --> 01:09:36,229 and this might be a career-ender in terms 1409 01:09:36,229 --> 01:09:39,140 of if there's no success, what incentives would incentivize 1410 01:09:39,140 --> 01:09:43,060 you to start a company to solve this problem, or an organization 1411 01:09:43,060 --> 01:09:44,540 to work on this, exactly? 1412 01:09:44,540 --> 01:09:47,359 AUDIENCE: Presuming you have the equipment and resources. 1413 01:09:47,359 --> 01:09:51,470 AUDIENCE: So antibiotics are pretty difficult, first of all, 1414 01:09:51,470 --> 01:09:55,848 because they're becoming too effective for their own good. 1415 01:09:55,848 --> 01:09:57,890 I think it was Pfizer or one of the big companies 1416 01:09:57,890 --> 01:10:01,680 that produced a very famous kind of antibiotic drug. 1417 01:10:01,680 --> 01:10:06,080 And now, it's just so effective that they've kind of become 1418 01:10:06,080 --> 01:10:07,760 a victim of their own success. 1419 01:10:07,760 --> 01:10:11,120 And it's also really hard to pass antibiotic drugs 1420 01:10:11,120 --> 01:10:14,330 through clinical trials, disproportionately so 1421 01:10:14,330 --> 01:10:15,750 compared to other diseases. 1422 01:10:15,750 --> 01:10:19,310 So that's also kind of a barrier to entry. 1423 01:10:19,310 --> 01:10:24,470 I think, honestly, if I were to want to get into this field, 1424 01:10:24,470 --> 01:10:27,080 you have to come up with a new approach, 1425 01:10:27,080 --> 01:10:29,650 something better than is being done right now. 1426 01:10:29,650 --> 01:10:35,440 Because antibiotics-- it's a very broad field. 1427 01:10:35,440 --> 01:10:37,270 There's so many different strains. 1428 01:10:37,270 --> 01:10:39,400 And if they mutate a bit, your drug 1429 01:10:39,400 --> 01:10:41,680 can already be rendered ineffective. 
1430 01:10:41,680 --> 01:10:45,500 So it's just one of those fields that is always evolving. 1431 01:10:45,500 --> 01:10:47,875 So not only do you have to have very strong basic science 1432 01:10:47,875 --> 01:10:51,280 to keep up with what's happening with these bacteria-- 1433 01:10:51,280 --> 01:10:55,090 how they're mutating and how they're evolving-- 1434 01:10:55,090 --> 01:10:58,000 but then you have to come up with drugs to help target them. 1435 01:10:58,000 --> 01:11:00,340 So it's a very twofold problem 1436 01:11:00,340 --> 01:11:03,280 that you need to simultaneously target. 1437 01:11:03,280 --> 01:11:05,290 And then obviously, they're evolving 1438 01:11:05,290 --> 01:11:09,250 so rapidly that you have to kind of pivot very quickly. 1439 01:11:09,250 --> 01:11:13,320 So I think it's one of those fast-changing fields that 1440 01:11:13,320 --> 01:11:14,830 are kind of hard to target. 1441 01:11:14,830 --> 01:11:16,540 But definitely important. 1442 01:11:16,540 --> 01:11:18,670 I'm not really sure what mechanisms 1443 01:11:18,670 --> 01:11:22,120 they have in place, organization-wise, 1444 01:11:22,120 --> 01:11:24,430 to target this specifically. 1445 01:11:24,430 --> 01:11:27,020 But I think it definitely should be a focus. 1446 01:11:27,020 --> 01:11:32,110 AUDIENCE: You talked about your pipeline getting approved. 1447 01:11:32,110 --> 01:11:34,310 What if they could speed that up very quickly 1448 01:11:34,310 --> 01:11:35,890 so it's a priority? 1449 01:11:35,890 --> 01:11:37,930 AUDIENCE: I mean, speeding it up would help. 1450 01:11:37,930 --> 01:11:43,020 But also, it's just hard to produce a good clinical trial 1451 01:11:43,020 --> 01:11:44,065 for these drugs. 1452 01:11:44,065 --> 01:11:45,940 So I think that's also a fundamental problem. 1453 01:11:45,940 --> 01:11:47,410 It's just like psychiatric drugs. 
1454 01:11:47,410 --> 01:11:51,160 Those are notoriously hard to prove because the placebo 1455 01:11:51,160 --> 01:11:53,655 effect is really a big problem. 1456 01:11:53,655 --> 01:11:55,030 WILLIAM BONVILLIAN: So that leads 1457 01:11:55,030 --> 01:11:58,660 us right into our next reading, which 1458 01:11:58,660 --> 01:12:03,100 is this FDA problem and the fact that we haven't put resources 1459 01:12:03,100 --> 01:12:05,800 on the drug approval process. 1460 01:12:05,800 --> 01:12:08,670 So Chloe, it's yours. 1461 01:12:08,670 --> 01:12:11,880 AUDIENCE: So if we're starting off 1462 01:12:11,880 --> 01:12:15,730 with tackling that problem of how we reform 1463 01:12:15,730 --> 01:12:19,900 and revamp the way in which we evaluate these products, 1464 01:12:19,900 --> 01:12:23,470 one question for you guys would be-- 1465 01:12:23,470 --> 01:12:25,570 so the reading mentioned that one 1466 01:12:25,570 --> 01:12:27,760 of the things you could attribute this mismatch to 1467 01:12:27,760 --> 01:12:32,710 was a mismatch between how far along research 1468 01:12:32,710 --> 01:12:35,440 in basic science and applied science have gotten. 1469 01:12:35,440 --> 01:12:37,240 They're very mismatched, in terms of one 1470 01:12:37,240 --> 01:12:39,760 of them has just shot ahead and the other can't even keep up. 1471 01:12:39,760 --> 01:12:46,173 So what are your thoughts on how or if an agency such as the FDA 1472 01:12:46,173 --> 01:12:47,590 should be responsible for ensuring 1473 01:12:47,590 --> 01:12:50,440 the development of these sciences is evenly matched? 1474 01:12:50,440 --> 01:12:53,560 For example, should the agency bottleneck 1475 01:12:53,560 --> 01:12:56,418 the funding for basic sciences until applied sciences 1476 01:12:56,418 --> 01:12:56,960 can catch up? 1477 01:12:56,960 --> 01:12:59,620 Or should they aggressively stimulate opportunities 1478 01:12:59,620 --> 01:13:01,690 on the other side? 
1479 01:13:01,690 --> 01:13:03,730 AUDIENCE: I suppose, in an ideal scenario, 1480 01:13:03,730 --> 01:13:08,200 you just stimulate-- that is, more funding to research drugs. 1481 01:13:08,200 --> 01:13:11,620 But I understand that there isn't always money for that. 1482 01:13:11,620 --> 01:13:14,038 AUDIENCE: I forget-- but to that point, 1483 01:13:14,038 --> 01:13:15,080 there is another reading. 1484 01:13:15,080 --> 01:13:16,538 I forget if it was this one or not. 1485 01:13:16,538 --> 01:13:20,110 But it said, just throwing money and giving infinite budgets 1486 01:13:20,110 --> 01:13:25,030 isn't always the solution because, in that case, 1487 01:13:25,030 --> 01:13:27,490 it takes away the element of good, 1488 01:13:27,490 --> 01:13:29,138 strategic planning. 1489 01:13:29,138 --> 01:13:31,430 WILLIAM BONVILLIAN: That's the point of innovation organization. 1490 01:13:31,430 --> 01:13:33,347 If you don't have the innovation organization 1491 01:13:33,347 --> 01:13:36,742 that's going to enable you to avoid big gaps in the system, 1492 01:13:36,742 --> 01:13:38,200 you're just not going to get there. 1493 01:13:38,200 --> 01:13:41,440 So throwing money at a problem without tackling the innovation 1494 01:13:41,440 --> 01:13:46,280 organization problems is problematic. 1495 01:13:46,280 --> 01:13:47,572 AUDIENCE: Ideally, yes. 1496 01:13:47,572 --> 01:13:48,530 Money in, money would-- 1497 01:13:48,530 --> 01:13:50,720 AUDIENCE: Yeah, assuming that you can use the money properly. 1498 01:13:50,720 --> 01:13:51,220 Yeah. 1499 01:13:55,675 --> 01:13:57,550 AUDIENCE: One thing that I was curious about. 1500 01:13:57,550 --> 01:14:00,160 I was confused how antibiotics can simultaneously 1501 01:14:00,160 --> 01:14:02,620 be working too well while you can have 1502 01:14:02,620 --> 01:14:05,350 tons of antibiotic-resistant bacteria, which 1503 01:14:05,350 --> 01:14:08,230 implies that antibiotics are not working too well. 
1504 01:14:08,230 --> 01:14:12,103 So if someone could clarify that, I would appreciate that. 1505 01:14:12,103 --> 01:14:13,770 WILLIAM BONVILLIAN: That's yours, Chris. 1506 01:14:13,770 --> 01:14:18,750 AUDIENCE: So I think the problem is that, say, Pfizer's drug is 1507 01:14:18,750 --> 01:14:19,660 doing really well. 1508 01:14:19,660 --> 01:14:24,330 So they give these antibiotics to their patients. 1509 01:14:24,330 --> 01:14:26,200 And these patients are getting cured. 1510 01:14:26,200 --> 01:14:28,893 And so, if they're getting cured 1511 01:14:28,893 --> 01:14:30,810 and they're not really getting these diseases, 1512 01:14:30,810 --> 01:14:32,790 that means that drug is not going 1513 01:14:32,790 --> 01:14:34,620 to be used as much. 1514 01:14:34,620 --> 01:14:37,650 And then at the same time, because these patients 1515 01:14:37,650 --> 01:14:41,580 have used the drug, there is antibiotic resistance growing. 1516 01:14:41,580 --> 01:14:44,580 And just because it's been around for so long, 1517 01:14:44,580 --> 01:14:47,752 it's inevitable that resistance is 1518 01:14:47,752 --> 01:14:49,710 going to build up to the point that the drug is 1519 01:14:49,710 --> 01:14:50,970 no longer effective. 1520 01:14:50,970 --> 01:14:55,177 Because it's been out for 10, 15 years. 1521 01:14:55,177 --> 01:14:56,760 AUDIENCE: But that resistance can only 1522 01:14:56,760 --> 01:14:59,050 develop if lots of people are using the drug-- 1523 01:14:59,050 --> 01:15:01,467 AUDIENCE: I mean, obviously, a lot of people have used it. 1524 01:15:01,467 --> 01:15:04,080 And obviously, a critical mass has used it. 1525 01:15:04,080 --> 01:15:05,050 It's been successful. 1526 01:15:07,620 --> 01:15:09,090 It's a really effective drug. 1527 01:15:09,090 --> 01:15:14,550 So maybe there are not repeat people in the way 1528 01:15:14,550 --> 01:15:16,930 that other drugs might have a lot of repeat users. 
1529 01:15:16,930 --> 01:15:19,560 Maybe it's treating a large population, 1530 01:15:19,560 --> 01:15:22,350 but people don't really get the same kind 1531 01:15:22,350 --> 01:15:25,500 of strain of disease multiple times, something like that. 1532 01:15:25,500 --> 01:15:27,583 WILLIAM BONVILLIAN: And Chris, part of the problem 1533 01:15:27,583 --> 01:15:29,160 is that we overprescribe antibiotics 1534 01:15:29,160 --> 01:15:30,720 to an absurd extent. 1535 01:15:30,720 --> 01:15:33,990 And we're sticking antibiotics in everything, like hand soap. 1536 01:15:33,990 --> 01:15:39,600 And we're virtually guaranteeing our own Darwinian dilemma. 1537 01:15:39,600 --> 01:15:41,080 AUDIENCE: It seems-- to apply 1538 01:15:41,080 --> 01:15:44,620 the language of this class in previous lectures-- 1539 01:15:44,620 --> 01:15:48,280 we're playing catch-up with an innovation problem instead 1540 01:15:48,280 --> 01:15:50,350 of purely innovating, which doesn't 1541 01:15:50,350 --> 01:15:54,520 seem like a very American way to tackle this problem. 1542 01:15:54,520 --> 01:15:56,230 It's kind of odd because it's almost 1543 01:15:56,230 --> 01:15:58,540 like we're being outplayed by nature, which, I mean, 1544 01:15:58,540 --> 01:16:01,767 most of our big successes over the last 200 or 300 years 1545 01:16:01,767 --> 01:16:03,850 have been us figuring out how to manipulate nature 1546 01:16:03,850 --> 01:16:04,730 to our advantage. 1547 01:16:04,730 --> 01:16:07,666 So it's kind of an interesting [INAUDIBLE]. 1548 01:16:07,666 --> 01:16:09,781 AUDIENCE: I left my American flag at home. 1549 01:16:12,402 --> 01:16:14,110 AUDIENCE: If that's all we have for that, 1550 01:16:14,110 --> 01:16:18,760 I do have one more question for the FDA reading, 1551 01:16:18,760 --> 01:16:21,590 more on the standard-setting sorts of things. 
1552 01:16:21,590 --> 01:16:24,220 What do you guys think, again, are the roles 1553 01:16:24,220 --> 01:16:28,180 and responsibilities of such an agency that 1554 01:16:28,180 --> 01:16:30,820 sets these standards to encourage people 1555 01:16:30,820 --> 01:16:33,770 to take risks and bring the slightly 1556 01:16:33,770 --> 01:16:35,128 riskier drugs to market? 1557 01:16:35,128 --> 01:16:37,420 Because the reading mentioned that a lot of researchers 1558 01:16:37,420 --> 01:16:39,503 won't even go down that path because they know how 1559 01:16:39,503 --> 01:16:43,730 arduous and tough [INAUDIBLE]. 1560 01:16:43,730 --> 01:16:45,878 So is there something the FDA could do to-- 1561 01:16:45,878 --> 01:16:47,920 AUDIENCE: I like what Bill said in terms of, 1562 01:16:47,920 --> 01:16:49,700 OK, it's a very hard process. 1563 01:16:49,700 --> 01:16:51,325 But once you do it, you're kind of set. 1564 01:16:51,325 --> 01:16:53,858 The only thing I would question is I would tier it. 1565 01:16:53,858 --> 01:16:55,900 Because there are probably different kinds of drugs 1566 01:16:55,900 --> 01:16:57,233 that have different specialties. 1567 01:16:57,233 --> 01:16:58,930 And there are probably some that-- 1568 01:16:58,930 --> 01:17:00,347 there are probably drugs that I really 1569 01:17:00,347 --> 01:17:01,980 want to get tested really, really well. 1570 01:17:01,980 --> 01:17:03,910 Or there are some that, even if I do test them, 1571 01:17:03,910 --> 01:17:07,330 I still don't know, and some drugs that can probably get 1572 01:17:07,330 --> 01:17:08,860 passed faster. 1573 01:17:08,860 --> 01:17:11,560 So I would figure out how this scheme works. 
1574 01:17:11,560 --> 01:17:14,908 Look at the data in terms of how these drugs are being passed, 1575 01:17:14,908 --> 01:17:17,200 how long it takes, which ones were relatively quick, 1576 01:17:17,200 --> 01:17:20,110 and see if I could restructure the organization so that I 1577 01:17:20,110 --> 01:17:22,920 can optimize for the stuff that really matters, 1578 01:17:22,920 --> 01:17:25,570 and the stuff that doesn't really matter as much, 1579 01:17:25,570 --> 01:17:27,060 it's not a focus-- 1580 01:17:27,060 --> 01:17:30,580 rather than having a linear kind of everything-is-the-same approach. 1581 01:17:30,580 --> 01:17:33,880 AUDIENCE: So are you saying maybe make the funding 1582 01:17:33,880 --> 01:17:37,090 or whatever a function of, say, quality and the amount of time 1583 01:17:37,090 --> 01:17:40,240 it would take to get passed, so then, if you need a drug that 1584 01:17:40,240 --> 01:17:43,840 maybe would only work for a few people, you can make it 1585 01:17:43,840 --> 01:17:45,340 so it would be really high quality. 1586 01:17:45,340 --> 01:17:46,250 Whereas if you have a drug that needs 1587 01:17:46,250 --> 01:17:49,180 to work for a lot of people that you need really quickly, 1588 01:17:49,180 --> 01:17:52,420 then you can make it so that it's 1589 01:17:52,420 --> 01:17:54,440 less robust or has fewer restraints, et cetera-- 1590 01:17:54,440 --> 01:17:55,690 something like that? 1591 01:17:55,690 --> 01:17:57,732 AUDIENCE: Yeah, something like that. 1592 01:17:57,732 --> 01:17:59,690 Because you also think about compound interest. 1593 01:17:59,690 --> 01:18:01,357 So say I know that, if I make this drug, 1594 01:18:01,357 --> 01:18:03,960 I'm going to save the government $50 million a year 1595 01:18:03,960 --> 01:18:07,828 or $80 million a year over four or five years. 
1596 01:18:07,828 --> 01:18:10,120 So if it's something that's going to make a big impact, 1597 01:18:10,120 --> 01:18:12,550 how about we go faster and we can iterate faster? 1598 01:18:12,550 --> 01:18:15,030 So we have priorities for drugs that really, really matter. 1599 01:18:15,030 --> 01:18:17,530 I don't like the argument of saying, oh, not a lot of people 1600 01:18:17,530 --> 01:18:20,020 are going to use it, so we're not going to prioritize it 1601 01:18:20,020 --> 01:18:21,980 and let's put it in the back-- because it's kind of false. 1602 01:18:21,980 --> 01:18:24,147 Or maybe you should create another organization that 1603 01:18:24,147 --> 01:18:26,050 only focuses on those kinds of drugs, 1604 01:18:26,050 --> 01:18:28,360 and split up the FDA so that one part focuses on 1605 01:18:28,360 --> 01:18:31,010 that and it gets equal review. 1606 01:18:31,010 --> 01:18:33,760 But that's kind of on the same wavelength. 1607 01:18:33,760 --> 01:18:36,220 WILLIAM BONVILLIAN: So just to kind of summarize 1608 01:18:36,220 --> 01:18:40,870 here, from all your good presentations and good 1609 01:18:40,870 --> 01:18:46,620 questions: the FDA is sitting on a really critical part 1610 01:18:46,620 --> 01:18:47,440 of the problem. 1611 01:18:47,440 --> 01:18:52,200 So if we could significantly speed the drug approval process 1612 01:18:52,200 --> 01:18:54,990 and if we could significantly lower its cost, 1613 01:18:54,990 --> 01:18:58,300 we solve a lot of problems here. 1614 01:18:58,300 --> 01:19:01,680 We can be much less reliant on a blockbuster drug model. 1615 01:19:01,680 --> 01:19:05,340 So it seems to me that, in terms of the panoply of fixes 1616 01:19:05,340 --> 01:19:08,850 to this gap in the innovation system, 1617 01:19:08,850 --> 01:19:12,020 really paying attention to how to accelerate-- 1618 01:19:12,020 --> 01:19:14,070 Martine, as you were pointing out-- 1619 01:19:14,070 --> 01:19:16,140 the review process at FDA is key. 
1620 01:19:16,140 --> 01:19:23,090 And it's very hard to reduce safety requirements. 1621 01:19:23,090 --> 01:19:25,540 It's just not going to be acceptable to the public. 1622 01:19:25,540 --> 01:19:30,100 But if there are new ways of using big data and analytics 1623 01:19:30,100 --> 01:19:34,720 and simulation and modeling, those potentially 1624 01:19:34,720 --> 01:19:38,680 present very significant improvements to this process 1625 01:19:38,680 --> 01:19:40,930 here that could really help tackle this problem. 1626 01:19:40,930 --> 01:19:43,870 So putting some money on that one, I think, 1627 01:19:43,870 --> 01:19:45,770 could be really key. 1628 01:19:45,770 --> 01:19:48,490 Then we go back to NIH. NIH is not 1629 01:19:48,490 --> 01:19:51,870 organized around those kinds of technology problems. 1630 01:19:51,870 --> 01:19:53,233 So how are we going to do this? 1631 01:19:53,233 --> 01:19:54,650 So we have another dilemma as soon 1632 01:19:54,650 --> 01:19:57,025 as we arrive at the answer. 1633 01:19:57,025 --> 01:19:57,995 Anything else? 1634 01:20:02,640 --> 01:20:05,360 AUDIENCE: I mean, just as we transition into Bill's reading 1635 01:20:05,360 --> 01:20:10,580 about this being a legacy sector, or exhibiting a lot of legacy 1636 01:20:10,580 --> 01:20:15,290 features, I'm curious about what you, Bill, or the class 1637 01:20:15,290 --> 01:20:19,290 thought about the political viability-- 1638 01:20:19,290 --> 01:20:21,440 I guess in terms of Martine's proposal 1639 01:20:21,440 --> 01:20:22,760 toward a tiered approach. 
1640 01:20:22,760 --> 01:20:26,090 If we thought it was more politically viable to do this 1641 01:20:26,090 --> 01:20:31,700 for either therapeutic drugs or for cure-all interventions-- 1642 01:20:31,700 --> 01:20:36,440 there seems, in my very limited study of the life sciences, 1643 01:20:36,440 --> 01:20:41,000 to be a proclivity towards the therapeutic drugs 1644 01:20:41,000 --> 01:20:42,590 and interventions because they're more 1645 01:20:42,590 --> 01:20:44,870 sustainable and profitable. 1646 01:20:44,870 --> 01:20:48,950 So do we think that that could be a potential market 1647 01:20:48,950 --> 01:20:51,230 opportunity to test a DARPA-like system, 1648 01:20:51,230 --> 01:20:57,980 or to test the sort of improved speed of acceptance 1649 01:20:57,980 --> 01:20:58,790 of the drug? 1650 01:20:58,790 --> 01:21:02,163 WILLIAM BONVILLIAN: Let me throw that back to the group here. 1651 01:21:02,163 --> 01:21:03,330 AUDIENCE: I like the point-- 1652 01:21:03,330 --> 01:21:06,710 I forget when we mentioned it-- about how drug prices were too low. 1653 01:21:06,710 --> 01:21:08,810 So they want to have recurring revenue. 1654 01:21:08,810 --> 01:21:11,390 But I think, as the government, 1655 01:21:11,390 --> 01:21:12,440 you end up having to pay a lot of that. 1656 01:21:12,440 --> 01:21:13,730 So it would be interesting if the government 1657 01:21:13,730 --> 01:21:15,740 is like, OK, well, we know, over the lifetime, 1658 01:21:15,740 --> 01:21:17,323 if we don't solve this now, it's going 1659 01:21:17,323 --> 01:21:20,130 to cost us $1 million for this person. 1660 01:21:20,130 --> 01:21:22,420 So if you solve it today, we're going 1661 01:21:22,420 --> 01:21:25,190 to give you $20k, which is way cheaper for us 1662 01:21:25,190 --> 01:21:26,480 overall in the long term. 1663 01:21:26,480 --> 01:21:30,360 But you're not getting paid $10 per drug or $100 per drug. 
1664 01:21:30,360 --> 01:21:32,720 And so it's kind of a win-win because it's really 1665 01:21:32,720 --> 01:21:34,250 good revenue for the company and it 1666 01:21:34,250 --> 01:21:37,760 justifies having a super drug that really, really works. 1667 01:21:37,760 --> 01:21:40,350 And it gives them enough money on their balance sheet. 1668 01:21:40,350 --> 01:21:42,200 I don't know if you guys know how Warren Buffett got rich. 1669 01:21:42,200 --> 01:21:43,610 But he bought an insurance company. 1670 01:21:43,610 --> 01:21:45,443 And it gives him a lot of cash so that he 1671 01:21:45,443 --> 01:21:46,808 can invest in other companies. 1672 01:21:46,808 --> 01:21:49,100 And it's really good for companies to have a lot of cash 1673 01:21:49,100 --> 01:21:51,290 on hand. 1674 01:21:51,290 --> 01:21:52,680 AUDIENCE: Who'd have thought? 1675 01:21:52,680 --> 01:21:54,680 AUDIENCE: Well, no, it's just that a lot of them 1676 01:21:54,680 --> 01:21:57,840 don't have a lot of cash on hand at any given time. 1677 01:21:57,840 --> 01:22:00,800 So they have their assets distributed across physical assets. 1678 01:22:00,800 --> 01:22:03,380 And so it's really good to be able to move quickly and buy 1679 01:22:03,380 --> 01:22:04,160 all this stuff. 1680 01:22:04,160 --> 01:22:06,770 And it's just not a thing that happens that easily. 1681 01:22:06,770 --> 01:22:08,870 There's also a lot of tax benefits. 1682 01:22:08,870 --> 01:22:11,190 That's why people do it.