The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high-quality educational resources for free. To make a donation or to view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

WILLIAM BONVILLIAN: This video that we saw by David Mindell is about his new book, which is called Our Robots, Ourselves: Robotics and the Myths of Autonomy. And he means exactly that: autonomy really is a myth and will remain so. And David has a unique perspective on this question, because this is his life. In other words, he has been deeply involved with robotics at cutting-edge levels for the last 30-plus years.

He was deeply involved in a lot of the early undersea exploration, working with Bob Ballard and Alvin and the pods that came off Alvin. Then he did a lot of work with NASA and was involved in the robotics on Mars and the planetary explorations using robotics. And then he spent a lot of time with the US military, which had been doing drone technology, so he has a real hands-on feel. He's also a pilot and flies extensively, both helicopters and airplanes. He's been involved in a helicopter-drone company, so he knows exactly what those are and how they work.

In addition, he's just taken leave from MIT and set up his own robotics company, including with people from Lincoln Labs, which has the greatest radar expertise. And the issue there is, how do you create robots that can work in really intimate proximity with people? Not at arm's length distance, much less locked in on a floor, but in really close proximity. So they're developing interesting technologies to enable that. So he's got a whole personal sense of what this robotics movement is all about.
And he kind of takes us back from the literature that has portrayed this ongoing nightmare of job displacement from robotics, and I think, frankly, puts us in a much more realistic posture about what it's actually going to be like when robotics starts to scale up to a greater extent than it has so far.

We read J.C.R. Licklider when we took up the book The Dream Machine, his biography by Mitch Waldrop. And Licklider, as you remember, gave us this picture of how people and machines were going to work together. Licklider, in a way, was responding to a period of technological concern that computers were going to replace people, because they could think better than people. And Licklider, although a great technologist, was trained as a psychologist. His field was the man-machine interface: how do people work with machines? How does that actually function? And so he comes to computing with this vision of a symbiosis-- computers are going to do what they're good at, people are going to do what they're good at, and we're going to have a symbiosis between the two that optimizes both territories. And indeed, that's exactly the direction that computing goes in. In other words, I'm not spending sleepless nights wondering when this thing is going to replace me, right? We thought it was. Norbert Wiener at MIT had invented the term cybernetics and wrote the first book on the topic. But Wiener had a very dark vision of what computing was going to mean for the future of human thinking. It didn't turn out to happen.

So Mindell is arguing that even with the most threatening technology, which arguably is robotics, it's just not going to work out that way. A symbiosis is going to be what's achieved here.

And he takes apart each one of these territories in the book. I had you do the video, but it's a very rich and very well-written book. It's a tremendously fun read, and he tells a lot of great stories.
But his vision is that there can be a richer space when people and robots are joined. There's a richer understanding and ability to make perceptions and judgments that comes from that mix. And that turns out to be exactly his experience undersea and doing space robotics: a different kind of environment evolves here. And it's better than a machine alone or a person alone-- there's a new symbiosis that enhances both sides in a way that we can draw on.

And what he finds is that when people, in effect, start to operate the robotics-- and in the end, in these systems, fully autonomous systems don't work very well; people really need to be part of the process with the robot-- the robot becomes an extension of themselves. In fact, all their perceptions are in the robot. So even if the robot is a mile deep undersea, or hundreds of thousands of miles away in space running around the moon, the person operating the robot and cooperating with the robot is in the robot. The sensory systems that are available to the robot really become theirs. In effect, we get extended with a whole new kind of reach. And that's really his vision of what occurs here. That's the UAV story, that's the undersea exploration story, that's the space story. And, he argues, this is going to be the driverless car story, too.

So we have been proceeding-- and he goes through this at rich length in his book-- under the assumption that we're going to get replaced by driverless cars. But he argues, wait a minute, 40 years of experience with robotics says that's not what happens. Instead, this symbiosis starts to occur, right? Because people are not going to want to give up certain levels of control and certain kinds of decision-making. They're going to want to be involved in those.
So he argues that, just as with undersea exploration and with the other robotics fields he's talking about, a richer driver experience-- one that involves much more engagement with the surroundings, much greater information and knowledge access, much better safety and comfort, and a lower-stress environment-- can occur in a driverless car setting. People will still be in command of the car. They'll be able to cede a lot of lower-end territory to the robotic activities, but they will retain certain kinds of charge and be able to operate at a different level of perception. We'll see, you know? Time will tell here.

The driverless car movement has been pushed by a whole community that wants to get people out of the driver's seat and into the trunk, or at least watching movies in the back seat. Mindell argues that's actually probably not going to occur. And then he lays out a series of technological challenges that have to be met.

So LIDAR, which is the fundamental laser-based sensing system we're using for driverless cars-- and David knows a lot about radar, because that's what his new company is organizing around right now-- LIDAR happens to be very problematic on wet surfaces and snowy surfaces. It fails, right? That's why Google has based its driverless car operation out in sunny California. Uber is having much more trouble in Pittsburgh. And I'm sure your friends, Steph, are going to have trouble when they worry about Boston in the wintertime with their driverless car initiatives. It's highly problematic. You can get around that by using ground-penetrating radar, which avoids the surface water problem and the surface slickness problem. But at the moment, that's another $70,000 per car. So the prices will get driven down the cost curve, and maybe we can do that. But these are significant barriers here.
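To make the sensor point concrete: the degradation Bonvillian describes is, at bottom, a sensor-fusion problem. When LIDAR returns collapse in rain or snow, the vehicle has to discount them and lean on other sources, such as the ground-penetrating radar he mentions. What follows is a minimal Python sketch of that weighting logic; the sensor names, confidence numbers, and the 0.4 cutoff are illustrative assumptions, not any vendor's actual system.

# Minimal sketch of confidence-weighted sensor fusion for localization.
# Sensor names, confidence values, and the 0.4 cutoff are illustrative
# assumptions, not a real autonomous-driving API.

def fuse_position_estimates(estimates):
    """Combine (x, y) position estimates, weighted by sensor confidence.

    estimates: list of dicts like
        {"sensor": "lidar", "position": (x, y), "confidence": 0.0-1.0}
    Sensors whose confidence has collapsed (e.g., LIDAR in heavy rain
    or snow) are dropped, so ground-penetrating radar and GPS dominate.
    """
    usable = [e for e in estimates if e["confidence"] >= 0.4]
    if not usable:
        # No trustworthy sensor left: hand control back to the driver.
        raise RuntimeError("no reliable localization -- driver takeover required")
    total = sum(e["confidence"] for e in usable)
    x = sum(e["position"][0] * e["confidence"] for e in usable) / total
    y = sum(e["position"][1] * e["confidence"] for e in usable) / total
    return (x, y)

# Example: LIDAR is washed out by road spray, so the fused estimate
# leans on ground-penetrating radar and GPS instead.
print(fuse_position_estimates([
    {"sensor": "lidar", "position": (10.2, 4.1), "confidence": 0.1},
    {"sensor": "ground_penetrating_radar", "position": (10.6, 4.4), "confidence": 0.8},
    {"sensor": "gps", "position": (11.0, 4.0), "confidence": 0.5},
]))

The fallback branch, which hands control back to the driver when no sensor is trustworthy, is exactly the human-in-the-loop posture Mindell argues for.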
I was at a presentation recently by Gill Pratt and John Leonard, who are leading the Toyota driverless car initiative. And it was an interesting discussion. The leader of the discussion said, well, you know, we're going to have driverless cars within five years. And Gill Pratt-- who led the whole robotics effort at DARPA, a former MIT faculty member, one of the most respected experts in robotics in the world, and named by Toyota to lead their effort-- Gill had this line. He said, you know, people tend to completely overestimate what we're going to accomplish in the next five years and completely underestimate what we're going to accomplish in the next 50. We're not good at striking that balance. And in effect, what he was saying is, the driverless car problem is not a five-year problem.

And then John Leonard, who is running MIT's part of the Toyota initiative, one of MIT's famous robotics experts, put up a video. And the video was him taking his kids to school. They drive down a residential street-- in one of these kind of urbanized suburbs around here-- and there's a T-intersection. So they leave the residential street, and they have to make a left turn to get to the school. The kids are sitting in the back. And in the lane immediately in front of him, going to the right, are cars coming across erratically at about 40 to 50 miles an hour. And the lane that he wants to get into is bumper to bumper, completely stalled. And he's got to turn left.

He finally is able to solve the problem, he indicates, by rolling down his window, making a judgment about which driver might be the most sympathetic, waving, establishing eye contact-- and this driver is willing to let him pull in in front of her. But then he's got to wait: she's got to be at the right point in the line, and the erratic 50-mile-an-hour cars have got to let up a little bit so he can get across.
He's finally able to achieve it. He does this every morning. He said, look, I can't write the algorithm for the left-hand turn into heavy traffic. I can't do it. I've been thinking about it for a long time. That is at least a 10-year problem. How is my driverless car going to establish eye contact with the driver in that far lane? How is it going to signal out the window? How is it going to make that personal connection? All of us, when we drive, are making constant judgments about other drivers. John Leonard says, I can't write the algorithms to evaluate the crazy 18-year-old hot-rod driver versus me versus an elderly 65-year-old driver. I can't make those distinctions in my algorithms to make this thing work. So the problem of driving in an urbanized setting is going to be profound. He said, look, we can probably figure out interstate driving-- the variables are much more manageable. But an urbanized setting is really, really complicated.

So I guess what Mindell is telling us-- and those are just examples that are not in his book, but that I'm pulling from recent experience-- Mindell is telling us that this autonomy is not necessarily at hand. And even when it is available-- he uses the example of the moon landing-- the astronauts preempt it, because they see things, such as a small crater, that would have prevented the landing in the right location. So they have to take over the controls and move the thing. People are not going to want to give up that level of involvement, he argues, and shouldn't, because there are some things they're going to be better at judging than even the best machinery we can come up with.

There's an example that David uses, which is that a combination of a chess master and a computer always beats another chess master alone or a computer alone. And so far, that's our experience. Now, all of these things could change over time, and technology will definitely advance.
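Leonard's unprotected left turn is worth pausing on, because the part of it that can be written down is small. Here is a toy Python sketch of a gap-acceptance rule; the four-second threshold and the perfectly known distances and speeds are made-up assumptions, and nothing in it captures the eye contact, waving, and judgment of other drivers' intent that actually resolves his morning commute.

# Toy gap-acceptance rule for an unprotected left turn across traffic.
# The 4-second gap and the perfectly known speeds/distances are
# illustrative assumptions; real planners face noisy perception, and
# none of this models the human signaling Leonard describes
# (eye contact, a wave, guessing which driver will yield).

def can_turn_left(oncoming_cars, time_to_clear_intersection=4.0):
    """oncoming_cars: list of (distance_m, speed_m_per_s) tuples."""
    for distance, speed in oncoming_cars:
        if speed <= 0:
            continue  # stopped car: assume it is yielding (a big assumption)
        time_gap = distance / speed
        if time_gap < time_to_clear_intersection:
            return False  # gap too small, keep waiting
    return True

# Erratic 40-50 mph cross traffic (~20 m/s) rarely leaves a 4-second gap,
# so the rule mostly says "wait" -- which is exactly Leonard's morning.
print(can_turn_left([(60, 20.0), (150, 22.0)]))   # False: first car is 3 s away
print(can_turn_left([(120, 20.0), (200, 22.0)]))  # True: all gaps exceed 4 s

Everything the rule leaves out is exactly what Leonard says he cannot write the algorithm for.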
But what David is arguing is that this robotics revolution is not going to be a revolution around autonomy. It's going to be a revolution-- like that earlier one-- around symbiosis. And I think that's a really important thing for us to keep in mind as we think about the speed at which these changes are going to start to happen-- the complexity is really high-- and the nature of the changes themselves. And in David Mindell's world of cobotics and assistive robotics, there is much more room for people than in a truly, totally autonomous system. And that presents-- again, back to our education discussion, back to our training discussion-- a whole new set of ways of thinking about what this is going to look like that we probably ought to keep in mind. And it also means that the timetable of these changes is not as abrupt as the media has been telling us in the last couple of years. It's a more manageable, David Mindell would argue, and more gradual kind of timetable.

All right. Sanam, it's all yours.

SANAM: Yeah. So I think that the question about symbiosis between people and machines is really interesting, especially considering the debate about whether people and robots are more like substitute goods or complementary goods. And I think that the consensus here is that they are largely complementary. But I think that also has some interesting implications for what that could do to the human environment. So even with goods that are complementary, it's likely that this will push out the demand for the types of labor that are more well-versed in high-skill [INAUDIBLE] in working with these types of technology, and reduce demand for labor that's less so. So I think that there are really important considerations, when you're integrating technology into a human environment, about how it's going to affect that environment.
So I think a question that a lot of you had was: when you are looking at building these autonomous or semi-autonomous technologies into an existing human environment, it's going to require a lot of knowledge about that environment in order to make these important predictive models. So what are some of the biggest challenges in doing that? And where do you see the potential failings or pitfalls?

CHLOE: So one of the things I was interested in-- I really agree with a lot of the points that he makes. I think he lays out a really optimistic, but also appropriately realistic, future for our symbiotic relationships with robots in a lot of different fields. But [INAUDIBLE] your question, he raises the issue of the pilot, the soldier, and the astronaut, who are all, to some extent, a little reticent to fully adopt complete autonomy in their own respective fields. So I guess one of the places where you could consider a potential failure of that symbiosis is: is that hesitation to adopt the fully autonomous system something we should train out of these humans? Or should we adapt our technological development to match it? Is that a failure point, or is that OK?

MAX: It's possible that the want or the desire to move away from autonomy-- maybe they have some sort of instinctual realization that the computer cannot do everything that they can. It's not as good at pattern recognition, that kind of thing. So maybe it's not a failure point. Maybe instead it's a--

[SNEEZING]

I'm not sure what you would call it.

CHLOE: Interesting.

MAX: Bless you.

CHLOE: Like an actual benefit to that.

MAX: Yeah, almost like a warning system. Yeah?

AUDIENCE: Yeah. And on top of that, I feel like there's a whole-- especially thinking about cars, but I'm sure for pilots and other people as well-- there's a lot of culture around being a driver or being a pilot. You can define yourself as a pilot.
And I think by taking away any control that you have over an aircraft or something, that can be something that people don't want to do, just because people like to drive-- they like the open road, they want to feel like they're in control. And taking that away could be a cultural issue as well.

RASHEED: I thought autopilot was pretty well-accepted and kind of integrated by now. I know, especially for long-distance flights, cross-Atlantic, I definitely don't want the pilot of this airplane reading and charting the navigation across the Atlantic Ocean. Like, I'm good. We can definitely have more integrated technology to allow machines to do that. But at least within the pilot community, I thought this was pretty accepted. And with fighter pilots and things like that, you're going so fast, it doesn't make sense to have a human interpret all these signals that are coming in.

CHLOE: Sorry.

RASHEED: Yeah, no, go ahead.

CHLOE: No, you're right. It is very widely accepted and integrated. But also, I think you make an interesting point in terms of how people define themselves, and that job being actually part of their identity. Also, I think as we've developed more and more complex machines and those machines have become integrated into our lives, we interact with each other through those machines. So even just taking the example of air traffic: there's a very complex communication network between air traffic controllers and different pilots, from commercial aviation to general aviation. And even though that system has a lot of autonomous components in it in today's day and age, there are still generations of protocol, both very officially delineated and also sort of adopted as practice. And I think that the humans who are willingly part of that system as controllers and pilots know that very well.
And so they know how to interact with the other pilots, both in terms of manners and in how to make well-informed, safe decisions based on what other people are doing. So I think-- yeah. I think that even though they might be flying, technically, a completely autonomous plane, they still serve as the endpoint of the communication with other humans who are in charge of other completely autonomous systems.

WILLIAM BONVILLIAN: And Chloe, I think you're absolutely right on that, from what I understand from David's book and discussions with him. In addition, he gives a remarkable example. It's a startling example that starts the book off. When you get a chance, the book is really worth reading. He tells the story of an Air France crash over the Atlantic several years ago. Chloe, I see you're smiling, so you're thinking about the story.

CHLOE: Yeah, I was just thinking about this.

WILLIAM BONVILLIAN: And you can extrapolate and see if I get the story right. But essentially, the story that he tells is that there's a moment in the flight when they're on autopilot. As you point out, Rasheed, it can be a very useful tool for pilots and is used in many parts of a flight pattern. And they're on autopilot, and the plane enters an area of very low temperatures, and ice is forming on the aircraft. So the autopilot system's default mechanism, at that point, is to remove the autopilot control and switch it to pilot control. And what has happened is that the captain of the aircraft is back in the restroom, because they're on autopilot over the middle of the ocean. And the flight engineer and the co-pilot are at the controls. They're relaxing. And suddenly, they're in charge. And each has a different reaction. And it ends up completely destabilizing the aircraft and throwing it into a stall, right into the Atlantic Ocean, where all the lives and the aircraft are lost. They had the opposite problem-- they're not ready.
So this moment of people having to be ready to take control again, after having been out of the loop, turns out to be a nightmare in all kinds of other settings, right? It's very unmanageable. How do you keep the person who needs to be in overall control completely in the game? And how do you keep them completely involved? That's one of the most complicated design problems for autonomous vehicles in general, and driverless cars in particular. Since we're probably not going to get to fully driverless cars soon, how is that relationship going to work? How are we going to play that out? And how do you keep just enough of a level of involvement?

And interestingly, Rasheed, back to your point, autopilot systems in aircraft have actually now evolved to the point where they keep that balance right between the machine and the person-- the person stays in overall control, and the person's desire to retain control is respected. They've figured out what the signaling is so that it actually works well on both sides. But that's a really complicated balance to work out. Have I got that right, Chloe, would you say?

CHLOE: Yeah, yeah.

WILLIAM BONVILLIAN: Anything you have to say more on this? Are you studying this?

CHLOE: Well, in AeroAstro we do a little bit of failure investigation. But I also fly personally, and I've had a lot of experiences that validate that exact thing. So I used to fly a pretty small four-seater aircraft.

MAX: A Cessna?

CHLOE: Yeah, well, a Columbia, which is basically the same size as a Cessna, but a lot faster. And I used to fly with my dad a lot. And one of the stories that we would repeatedly live out when we were landing at a new airport was that the air traffic controller, based on the size and type of our aircraft, would vastly underestimate the speed at which we were making our descent. And a full descent into most of these airports is a multi-leg thing that you start up to 20 minutes before you're actually on the ground. So it's pretty complicated.
There are other airplanes in the air. You have to match their speeds and rates of descent, so you don't have two people colliding on the runway. And repeatedly, how fast we were going would be underestimated, and we would be coming down much more rapidly than the air traffic controller would think. So we'd be going faster than a much, much larger aircraft was. So that was just another case where establishing that human contact with other people-- to let them know where you were, and with the people whose instrumentation was telling them something different-- it's important to have people who know their own missions--

WILLIAM BONVILLIAN: Interesting example.

CHLOE: --involved in the process.

MARTIN: I think there's also an opportunity for using autonomy in areas where there's human bias. There was a case study of a Japanese flight where the co-pilot knows that what the pilot is doing is wrong. But because the pilot is an authority figure, he won't contradict him. So using some autonomous system to say, no, this is wrong, we're definitely doing something wrong, or at least to put up some kind of flag, is useful. In sci-fi it was really cool, because it talked about human bias in justice-- like when you make a decision. So there was this one science fiction story where, in the future, an AI was the judge, because it would be unbiased. I don't know. I think that stuff's pretty interesting.

AUDIENCE: That's already kind of happening. They've been using algorithms in some jurisdictions to determine whether someone should get bail or not based on their backgrounds. It's proprietary software, which is kind of scary to begin with, because people don't really understand--

MARTIN: How it works.

AUDIENCE: --how it's making its decisions. But you'll put in different factors about the person, and that can help inform the judge as to whether they should be let out on bail or not.
And so I think there's a big-- someone just filed a lawsuit about that, since you're not allowed to face your accuser if your accuser is an algorithm. So we're already getting into this--

[INTERPOSING VOICES]

RASHEED: Well, why would you even start down that road?

AUDIENCE: Well, because-- the justification--

MARTIN: It's more fair.

AUDIENCE: --whatever you think of it--

MAX: And it's faster.

AUDIENCE: --is that there are a lot of judges who are seeing hundreds of cases a day. A lot of things are affecting their judgment-- I think the paper I read said it could be, like, therapy, or whether their team won the night before, or whether they're hungry-- they're influenced by these outside factors. So if we automate it, then we take out some of those biases, and other biases that judges just have inherently from their upbringing.

RASHEED: Or, like, compassion, maybe? I don't know if there's a compassion--

[INTERPOSING VOICES]

AUDIENCE: It's a very hard--

MARTIN: I mean, the way I would also think about it is, great, because you get a nice heuristic. And if there's ever an issue, that's the one that goes to court and gets the full process. Because there are probably a lot of issues that are very quick cases. But there's a whole thing at MIT about this now-- lawyers and tech. It's around Sloan and around the Media Lab. It's a field that MIT is looking into, but I don't know if we'll come up with a law school.

STEPH: But there was a really cool study that I just read for my paper. I pulled it up. It got published literally this week by Carnegie Mellon, entitled "A Human-Centered Approach to Algorithmic Services: Considerations for Fair and Motivating Smart Community Service Management that Allocates Donations to Nonprofit Organizations." So it's a very long title. But it's essentially--

[LAUGHTER]

WILLIAM BONVILLIAN: Yes, it is.
The title of your paper is going to be shorter, I'm sure, Steph.

STEPH: I hope so. So there are three people. One of them is from the Center for Machine Learning, another is from the School of Design, and then there's a stakeholder from an organization called Food Rescue, which serves the greater Pittsburgh area. And what was really cool about this was essentially that they not only talked to the computer scientists who are designing the algorithms for who gets the food allocations, but they also evaluated, I guess, the ethos that was driving the way the programmers were devising their algorithm and the decisions that they had made-- the cost-benefit analysis, who actually gets the food, why they should get it, what considerations their children should have, et cetera. And then from there, they started thinking about the ways in which the algorithm was designed, and how that would impact, in implementation, the lives of the people receiving the food, and whether or not that was fair.

And interestingly enough-- and I think this is really, really crucial not only in the design process, but generally in policymaking-- there's the question that they bring up about empathy. Like, to what extent are the algorithms that are being created reflective of the individual programmers' ethics and personal values? And that was something that they really bring to light. And this, I think, is an incredibly innovative case study, and it's being applied to the nonprofit sector in a really small, piloted way. But I could see studies like this scaling up to other algorithms and really striving to understand the ways in which they prejudice decision-making, even if they act as heuristics. Heuristics are just shortcuts in decision-making, based on your values and intuition in the decision-making process. So that could be really cool.
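One way to see Steph's point about programmers' values is that in any allocation algorithm, the "fairness" lives in hand-picked weights. The sketch below is an invented toy in Python, not the method of the Carnegie Mellon study she cites; the factors and weights are assumptions chosen only to show how shifting them changes who gets the donation.

# Toy donation-matching score. The factors and weights are invented for
# illustration; this is NOT the method of the CMU study discussed above.
# The point: the trade-off between need, distance, and past allocations
# is a value judgment made by whoever sets these numbers.

WEIGHTS = {"need": 0.5, "distance": 0.3, "recent_allocations": 0.2}

def score_recipient(org):
    """org: dict with fields normalized to [0, 1]; higher score wins."""
    return (WEIGHTS["need"] * org["need"]
            - WEIGHTS["distance"] * org["distance"]
            - WEIGHTS["recent_allocations"] * org["recent_allocations"])

candidates = [
    {"name": "food pantry A", "need": 0.9, "distance": 0.8, "recent_allocations": 0.1},
    {"name": "shelter B", "need": 0.6, "distance": 0.2, "recent_allocations": 0.7},
]

# With these weights, "food pantry A" wins; put more weight on distance
# and "shelter B" wins instead. The programmer's trade-off decides.
print(max(candidates, key=score_recipient)["name"])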
But at the same time, for me, as someone who studies both design and politics-- and I guess my religious studies also influence this a bit-- there's a huge movement in religious studies to move away from empathy, because it can be utilized as a tool for manipulation. Because if you have insight into what matters to an individual and what motivates them to action, then that can be misused. But what, indeed, you should have-- as was stated by a divinity professor at Yale, who I think also does some work in philosophy-- what you should strive for, and what I think the algorithm should strive for, is compassion, which is distinct from empathy. The difference is that in empathy, you seek to feel the way that the other person feels, whereas in compassion, you seek to understand how they feel, but you don't truly try to access those emotions. And so I think that compassion, intellectually, is more sustainable and is something that you can actively implement in an algorithm. Whereas if you try to make your algorithm empathetic, it starts getting into the realm of making value judgments, which is ultimately going to be more damaging and more prejudicial than if an individual is making a decision. Because then you can leave the prejudice to the machine and not place the onus of whatever discrimination is happening on an individual.

MARTIN: Also, hacks.

LILY: Always hacks.

RASHEED: Thanks.

[INTERPOSING VOICES]

STEPH: Yeah, the Microsoft-- just quickly on that point.

LILY: Mm-hmm.

STEPH: There was just an article in The New York Times, I think it was yesterday, on the hack that happened in England on the NHS. And they were wondering, what is the role of Microsoft in all of this? Should they be to blame? So, a question.

LILY: Well, the whole time I was watching this, Mindell-- he's brilliant, obviously. But I also think that he-- I couldn't help but think that he was being a little hypocritical.
Because I think his whole message and premise was: don't worry about job displacement by autonomy or robotics-- autonomous robots. It'll never happen, because the symbiosis of humans and machines is more powerful than the machines alone or the humans alone. But I don't think that we can deny that certain jobs have been displaced or replaced by automated machines, robotics, et cetera. And although more jobs can be created in the future, or as a result of improved technologies, we can't really get around the fact, I think, that the people who do lose those jobs often have families. And then, as we've been talking about earlier today, they're out of the game. They don't go back for a four-year or an advanced degree because, oh, now I need a PhD or a master's degree. They're often permanently unemployed or displaced into other service sectors. So, yeah, David Mindell, you're right-- we're probably never going to be completely autonomous. But then again, you can't deny the fact that there are jobs that are replaced.

CHLOE: I had the exact same feeling. I wouldn't necessarily say hypocritical, but maybe biased.

LILY: He's never going to be out of a job. He's the person who invents these things. Like, yeah, of course he's the one who's staying ahead of the technological race, because he is the technological race, you know?

[LAUGHS]

CHLOE: But the symbiosis is very real for the scientists and engineers whose limits have been extended by these types of endeavors. But for the people who have never gotten to that level of education in the first place, yeah, [INAUDIBLE].

MAX: The thing is, he pointed out, OK, yes, there are people in manufacturing jobs who get replaced.
But it's similar to how-- I think it was in one of the other readings-- people who had to light lamps in the street got replaced by electric lights, or people who would shout the news to factory workers were replaced by radios. You know, it's going to happen.

LILY: Yeah, it's going to happen. But I think he's taking a--

MAX: It sucks, but it happens. That's how progress works.

LILY: --a self-centric view.

MAX: I don't really know what the alternative is-- stop progressing? Because then other countries start progressing, and then you're left behind. And then everyone's unemployed, because now your country doesn't have money. That's not good.

MARTIN: Yeah, [INAUDIBLE] or somebody else will.

STEPH: [INAUDIBLE], I just wanted to make a quick point about the concept of fetishization.

MAX: Fetishization?

STEPH: Fetishization, yeah.

MARTIN: Technology fetishization or product fetishization?

STEPH: Yeah, because I think, at least in my understanding, fetishization in post-modern philosophy is the process by which you love the idea of something so much that you relentlessly pursue it, and then you continue operating in the pursuit of that, right? And the question is-- as much as I love technology and am fascinated by it, and as much as I someday maybe want to be an engineer-- I ask myself, how is it that we have come so far in autonomous vehicle technology, and we can't get more public bus routes? It doesn't seem like a trade-off to people, but to me, that's an enormous concern. And I'm constantly looking up questions like this on Quora. And people are like, well, it doesn't matter, because they don't have to be mutually exclusive.

WILLIAM BONVILLIAN: Well, this is the whole problem of change in legacy sectors. How do you introduce innovation into legacy sectors? That's at the heart of that kind of question.
754 00:34:10,929 --> 00:34:15,670 And changing a public transportation system, 755 00:34:15,670 --> 00:34:18,699 an established sector, may, in many ways, 756 00:34:18,699 --> 00:34:21,940 be far harder than introducing the completely disruptive 757 00:34:21,940 --> 00:34:24,909 technology that's completely outside the scope 758 00:34:24,909 --> 00:34:26,453 of an existing realm. 759 00:34:26,453 --> 00:34:28,120 MAX: Also, doesn't this fit in with what 760 00:34:28,120 --> 00:34:31,918 Bill was saying before, where city driving is unbelievably 761 00:34:31,918 --> 00:34:33,460 difficult, and we probably will never 762 00:34:33,460 --> 00:34:35,400 get autonomous city driving. 763 00:34:35,400 --> 00:34:37,559 WILLIAM BONVILLIAN: Well, I'm not saying never. 764 00:34:37,559 --> 00:34:38,770 I'm just saying it's-- 765 00:34:38,770 --> 00:34:41,030 MAX: Well, it'll take a really long time. 766 00:34:41,030 --> 00:34:43,530 WILLIAM BONVILLIAN: This is more than a decade of a problem. 767 00:34:43,530 --> 00:34:49,793 STEPH: But I guess I'm curious, why do we view these-- 768 00:34:49,793 --> 00:34:51,460 I'm not going to say nearly impossible-- 769 00:34:51,460 --> 00:34:56,199 but these incredibly technically challenging problems as more 770 00:34:56,199 --> 00:34:59,620 solvable than getting more public buses on the route? 771 00:34:59,620 --> 00:35:02,380 Why do we have more persistence and resilience when it 772 00:35:02,380 --> 00:35:03,010 comes to technical innovation? 773 00:35:03,010 --> 00:35:04,750 WILLIAM BONVILLIAN: Because it's a frontier territory, 774 00:35:04,750 --> 00:35:06,580 and the barriers aren't in the way. 775 00:35:06,580 --> 00:35:07,630 MARTIN: Yeah, there's a lot more barriers 776 00:35:07,630 --> 00:35:09,150 for making a train and how much it costs. 777 00:35:09,150 --> 00:35:09,510 WILLIAM BONVILLIAN: Right. 778 00:35:09,510 --> 00:35:11,177 STEPH: But I'm not talking about trains. 779 00:35:11,177 --> 00:35:12,842 Literally, public buses. 780 00:35:12,842 --> 00:35:14,800 RASHEED: No, I think you were trying to get at, 781 00:35:14,800 --> 00:35:18,055 it's like there's no barrier to figuring out 782 00:35:18,055 --> 00:35:20,680 what the new autonomous vehicle is, because there's no existing 783 00:35:20,680 --> 00:35:22,222 infrastructure on autonomous vehicles 784 00:35:22,222 --> 00:35:23,650 that I have to overcome. 785 00:35:23,650 --> 00:35:26,260 Whereas in order to get new public bus routes, 786 00:35:26,260 --> 00:35:29,170 I have to contend with the established 787 00:35:29,170 --> 00:35:31,760 system of public bus routes and kind of go through that. 788 00:35:31,760 --> 00:35:33,230 WILLIAM BONVILLIAN: Interesting. 789 00:35:33,230 --> 00:35:35,230 MARTIN: There was a start-up that was like Uber, 790 00:35:35,230 --> 00:35:37,853 but for buses, which is optimized for people's-- huh? 791 00:35:37,853 --> 00:35:38,520 AUDIENCE: BRIDJ? 792 00:35:38,520 --> 00:35:38,930 MARTIN: Something like that. 793 00:35:38,930 --> 00:35:40,555 AUDIENCE: I think they just went under. 794 00:35:40,555 --> 00:35:43,930 MARTIN: Yeah, it's just a lot of the financing-- 795 00:35:43,930 --> 00:35:48,270 the economic cycle doesn't churn continuously. 796 00:35:48,270 --> 00:35:49,390 It needs to get spun. 797 00:35:49,390 --> 00:35:50,370 But a quick point-- 798 00:35:50,370 --> 00:35:51,470 I think you've brought a really good point 799 00:35:51,470 --> 00:35:53,720 in terms of the people writing these papers. 
800 00:35:53,720 --> 00:35:56,670 I was at a talk by a diplomat on Trump in Mexico. 801 00:35:56,670 --> 00:35:58,045 And he had this really funny term 802 00:35:58,045 --> 00:36:00,400 where he's like, yeah, it's kind of interesting to read 803 00:36:00,400 --> 00:36:02,317 a lot of these papers, because a lot of them-- 804 00:36:02,317 --> 00:36:04,090 or like when tech billionaires talk about, 805 00:36:04,090 --> 00:36:06,640 yeah, we're going to do this and help the [INAUDIBLE] go 806 00:36:06,640 --> 00:36:08,500 to their dinner, he's like, yeah, they're 807 00:36:08,500 --> 00:36:10,600 kind of like limousine liberals, which 808 00:36:10,600 --> 00:36:12,035 I thought was a really funny term, 809 00:36:12,035 --> 00:36:14,410 not to criticize liberals. 810 00:36:14,410 --> 00:36:17,020 But I think the people that read 10,000-- there's also 811 00:36:17,020 --> 00:36:21,760 a Chinese saying, which is, much better to walk 10,000 miles 812 00:36:21,760 --> 00:36:25,213 and talk to the people on the way than to read 10,000 books, 813 00:36:25,213 --> 00:36:26,380 that I think is really true. 814 00:36:26,380 --> 00:36:27,330 Where it's like-- 815 00:36:27,330 --> 00:36:28,760 WILLIAM BONVILLIAN: Are you familiar with that one, Luyao? 816 00:36:28,760 --> 00:36:29,260 Good. 817 00:36:29,260 --> 00:36:30,625 [LAUGHTER] 818 00:36:30,625 --> 00:36:31,500 I wanted to certify-- 819 00:36:31,500 --> 00:36:33,920 [INTERPOSING VOICES] 820 00:36:34,420 --> 00:36:35,850 LILY: Verified! 821 00:36:35,850 --> 00:36:36,790 WILLIAM BONVILLIAN: It's a great line, Martin. 822 00:36:36,790 --> 00:36:39,332 I wanted to use it myself, but I wanted to validate it first. 823 00:36:39,332 --> 00:36:39,890 [LAUGHTER] 824 00:36:39,890 --> 00:36:40,990 MARTIN: I liked the limousine liberals. 825 00:36:40,990 --> 00:36:42,190 And I was like, yeah, that was pretty funny. 826 00:36:42,190 --> 00:36:42,910 WILLIAM BONVILLIAN: That's an old term. 827 00:36:42,910 --> 00:36:44,160 That term's been around for [INAUDIBLE].. 828 00:36:44,160 --> 00:36:44,290 MARTIN: I didn't know. 829 00:36:44,290 --> 00:36:45,200 STEPH: I'd never heard of that before, 830 00:36:45,200 --> 00:36:46,510 but it makes total sense. 831 00:36:46,510 --> 00:36:50,790 MARTIN: But it does really classify the Silicon Valley 832 00:36:50,790 --> 00:36:52,180 kind of-- 833 00:36:52,180 --> 00:36:53,120 STEPH: Libertarianism? 834 00:36:53,120 --> 00:36:54,470 MARTIN: Yeah. 835 00:36:54,470 --> 00:36:56,610 But I think the big issue is we really 836 00:36:56,610 --> 00:36:59,140 need to start talking to the people on the street 837 00:36:59,140 --> 00:37:01,390 and what they're thinking-- 838 00:37:01,390 --> 00:37:04,385 what they think about the issue, rather than just like, 839 00:37:04,385 --> 00:37:05,380 this is what could be. 840 00:37:05,380 --> 00:37:07,000 Also, put it into practice, because in practice it could 841 00:37:07,000 --> 00:37:08,360 be a lot of different things. 842 00:37:08,360 --> 00:37:10,720 And most people just want to do good. 843 00:37:10,720 --> 00:37:12,150 And sometimes their words-- 844 00:37:12,150 --> 00:37:14,110 I'm clouded by the way I view those words. 845 00:37:14,110 --> 00:37:16,120 You know what I mean? 846 00:37:16,120 --> 00:37:18,970 STEPH: I'm just really perplexed by the fact 847 00:37:18,970 --> 00:37:22,220 that people will spend 50 years trying to develop a technology. 
848 00:37:22,220 --> 00:37:24,550 But in two months or two years, you 849 00:37:24,550 --> 00:37:28,000 can't increase support for public transit. 850 00:37:28,000 --> 00:37:30,625 That, I think, is to me, what's insane about the public sector. 851 00:37:30,625 --> 00:37:32,083 AUDIENCE: Maybe [INAUDIBLE] because 852 00:37:32,083 --> 00:37:33,250 of the nature of the good. 853 00:37:33,250 --> 00:37:35,050 One is public good and the other one 854 00:37:35,050 --> 00:37:42,365 is marketable, tradable, private goods, and there's profit. 855 00:37:42,365 --> 00:37:44,740 CHLOE: Also, to tie it back to one of your earlier points 856 00:37:44,740 --> 00:37:48,280 about how we have these Snapchat-esque companies, 857 00:37:48,280 --> 00:37:51,520 Facebook-esque companies, before we have the big players. 858 00:37:51,520 --> 00:37:53,440 And also because it's easier maybe now 859 00:37:53,440 --> 00:37:56,740 for a group of 20 to 30 scientists 860 00:37:56,740 --> 00:37:58,690 and engineers to make a Mars rover 861 00:37:58,690 --> 00:38:01,620 and have that to be a successful mission-- 862 00:38:01,620 --> 00:38:04,500 to have a lot of little things like that, 863 00:38:04,500 --> 00:38:07,730 where they can define how they want their innovation 864 00:38:07,730 --> 00:38:12,180 to operate before we can then take that infrastructure that 865 00:38:12,180 --> 00:38:14,740 is established by those pioneers and apply it 866 00:38:14,740 --> 00:38:15,967 to public good work. 867 00:38:15,967 --> 00:38:17,800 WILLIAM BONVILLIAN: You know, interestingly, 868 00:38:17,800 --> 00:38:21,700 Sanjay Sarma makes the argument-- of MIT-- 869 00:38:21,700 --> 00:38:25,900 he makes the argument that we're not 870 00:38:25,900 --> 00:38:29,410 going to get to anything resembling autonomous vehicles 871 00:38:29,410 --> 00:38:33,340 until we integrate the systems we're developing 872 00:38:33,340 --> 00:38:34,690 into the infrastructure itself. 873 00:38:34,690 --> 00:38:39,490 In other words, suppose rather than trying to make the vehicle 874 00:38:39,490 --> 00:38:42,640 entirely independent of everything that's around it-- 875 00:38:42,640 --> 00:38:45,730 people, everything that's around-- 876 00:38:45,730 --> 00:38:48,580 suppose we integrated the vehicle into a new set 877 00:38:48,580 --> 00:38:52,340 of smart infrastructure. 878 00:38:52,340 --> 00:38:58,480 So in effect, there are rails for every car, right? 879 00:38:58,480 --> 00:39:01,120 They just happen to be cyber rails. 880 00:39:01,120 --> 00:39:05,750 And that would simplify all kinds of things in the autonomy 881 00:39:05,750 --> 00:39:07,265 project. 882 00:39:07,265 --> 00:39:09,140 LILY: Or like parking in a parking structure, 883 00:39:09,140 --> 00:39:12,897 where you just get to a dock and the parking structure 884 00:39:12,897 --> 00:39:13,480 deals with it. 885 00:39:13,480 --> 00:39:13,546 WILLIAM BONVILLIAN: Right. 886 00:39:13,546 --> 00:39:15,520 So in other words, making the infrastructure 887 00:39:15,520 --> 00:39:19,600 smart in parallel with making the vehicle smart 888 00:39:19,600 --> 00:39:23,890 radically resolves a lot of the problems here. 889 00:39:23,890 --> 00:39:25,810 And yet, we haven't even thought about that. 
890 00:39:25,810 --> 00:39:29,850 We're so busy embarked on making the vehicle totally independent 891 00:39:29,850 --> 00:39:32,350 that we haven't even thought about a much more logical path, 892 00:39:32,350 --> 00:39:35,262 he would argue, which is to upgrade 893 00:39:35,262 --> 00:39:36,970 the infrastructure at the same time we're 894 00:39:36,970 --> 00:39:38,390 upgrading the vehicles. 895 00:39:38,390 --> 00:39:42,520 And he argues, what's going to happen to people? 896 00:39:42,520 --> 00:39:44,650 Well, you've got a massive infrastructure 897 00:39:44,650 --> 00:39:48,400 upgrade, which is going to be heavy employment focused. 898 00:39:48,400 --> 00:39:49,820 It's a significant kind of offset. 899 00:39:49,820 --> 00:39:51,700 And we haven't thought about doing that. 900 00:39:51,700 --> 00:39:54,997 And that gets into, in a way, some of the issues 901 00:39:54,997 --> 00:39:56,080 that you're talking about. 902 00:39:56,080 --> 00:39:56,470 STEPH: Yeah. 903 00:39:56,470 --> 00:39:58,780 And what's interesting is that in the interview that I conducted 904 00:39:58,780 --> 00:40:00,730 with the co-founder of New Urban Mechanics, 905 00:40:00,730 --> 00:40:04,810 he said that the firms that he has talked to, and NuTonomy 906 00:40:04,810 --> 00:40:07,270 specifically-- not to, so to speak, 907 00:40:07,270 --> 00:40:08,623 throw them under the bus-- 908 00:40:08,623 --> 00:40:10,540 they don't think that those are considerations 909 00:40:10,540 --> 00:40:11,635 that fall in their domain. 910 00:40:11,635 --> 00:40:13,510 They think that the market is, at some point, 911 00:40:13,510 --> 00:40:15,130 going to decide these things for them, 912 00:40:15,130 --> 00:40:17,470 and so they should just worry about fixing the technology. 913 00:40:17,470 --> 00:40:19,797 But if there's anything that this class is teaching 914 00:40:19,797 --> 00:40:21,880 us, it's that we need to include these considerations 915 00:40:21,880 --> 00:40:23,800 within the research and development process, 916 00:40:23,800 --> 00:40:26,400 even at the basic or early stages. 917 00:40:26,400 --> 00:40:27,670 WILLIAM BONVILLIAN: Right. 918 00:40:27,670 --> 00:40:30,610 So, Sanam, how about a closing thought on David Mindell? 919 00:40:30,610 --> 00:40:32,120 We've had a robust discussion. 920 00:40:32,120 --> 00:40:33,460 Thank you all. 921 00:40:33,460 --> 00:40:35,410 I think everybody got into it this time, too. 922 00:40:35,410 --> 00:40:36,910 MAX: I love the under the bus thing. 923 00:40:36,910 --> 00:40:37,982 [LAUGHTER] 924 00:40:38,710 --> 00:40:40,960 SANAM: Yeah, I think it's interesting that this debate 925 00:40:40,960 --> 00:40:44,260 about autonomous, semi-autonomous technology 926 00:40:44,260 --> 00:40:46,923 raises some really philosophical questions, first, 927 00:40:46,923 --> 00:40:48,340 about what we were talking about-- 928 00:40:48,340 --> 00:40:51,880 human trust and agency when it comes to technology. 929 00:40:51,880 --> 00:40:54,580 And then also, when we talk 930 00:40:54,580 --> 00:40:57,490 about integrating technology 931 00:40:57,490 --> 00:41:02,620 into deeper and deeper parts of life, what should the goals be? 932 00:41:02,620 --> 00:41:04,550 How much intervention should there be? 933 00:41:04,550 --> 00:41:07,300 And how could we improve the overall infrastructure?
934 00:41:07,300 --> 00:41:12,715 So I think Mindell, at some point, 935 00:41:12,715 --> 00:41:14,215 talked about how this is engineering 936 00:41:14,215 --> 00:41:15,760 at its philosophical best. 937 00:41:15,760 --> 00:41:17,438 So I think that's an interesting point. 938 00:41:17,438 --> 00:41:18,730 WILLIAM BONVILLIAN: Good point. 939 00:41:18,730 --> 00:41:20,880 Good closing points, Sanam. 940 00:41:20,880 --> 00:41:26,728 All right, one more to go, and you're back to me. 941 00:41:26,728 --> 00:41:28,270 RASHEED: You saved the best for last? 942 00:41:28,270 --> 00:41:30,190 [LAUGHTER] 943 00:41:30,190 --> 00:41:32,403 WILLIAM BONVILLIAN: You could be the judge, Rasheed. 944 00:41:32,403 --> 00:41:33,820 MARTIN: That was such a good line. 945 00:41:33,820 --> 00:41:34,816 [LAUGHTER] 946 00:41:34,816 --> 00:41:37,900 WILLIAM BONVILLIAN: But that is the cover of my next book 947 00:41:37,900 --> 00:41:39,100 written with Peter Singer. 948 00:41:39,100 --> 00:41:40,400 These are both Peter and I out hiking. 949 00:41:40,400 --> 00:41:40,976 MARTIN: Nice pictures too, though. 950 00:41:40,976 --> 00:41:42,008 [LAUGHTER] 951 00:41:42,008 --> 00:41:44,550 RASHEED: So he took one of you, and then you took one of him? 952 00:41:44,550 --> 00:41:46,470 WILLIAM BONVILLIAN: No, I think his mother took that picture. 953 00:41:46,470 --> 00:41:46,870 MARTIN: Their new jackets. 954 00:41:46,870 --> 00:41:49,420 WILLIAM BONVILLIAN: My wife took that picture of me. 955 00:41:49,420 --> 00:41:52,660 But this is, as you saw, a chapter that-- 956 00:41:52,660 --> 00:41:54,900 it's actually going through a fair amount of revisions 957 00:41:54,900 --> 00:41:56,200 from the version you've got. 958 00:41:56,200 --> 00:42:01,330 But it's an attempt to kind of wrestle with this movement 959 00:42:01,330 --> 00:42:02,950 here around the future of work. 960 00:42:05,620 --> 00:42:08,680 And this concept that people are going 961 00:42:08,680 --> 00:42:10,720 to get displaced by automation, this 962 00:42:10,720 --> 00:42:12,940 has been with us for a long time. 963 00:42:12,940 --> 00:42:15,520 So the first major episode is really 964 00:42:15,520 --> 00:42:22,250 in Britain in 1815 around the Luddite movement, 965 00:42:22,250 --> 00:42:27,190 with weavers smashing automated looms that were putting them out 966 00:42:27,190 --> 00:42:28,990 of work. 967 00:42:28,990 --> 00:42:33,370 And eventually, a whole division of the British Army, 968 00:42:33,370 --> 00:42:36,280 like 15,000 British soldiers, get 969 00:42:36,280 --> 00:42:41,110 pulled in to put down this pretty significant movement 970 00:42:41,110 --> 00:42:42,130 in Britain. 971 00:42:42,130 --> 00:42:45,610 So that's the first episode. 972 00:42:45,610 --> 00:42:49,400 We regularly go through this debate about every 30 years, 973 00:42:49,400 --> 00:42:49,900 roughly. 974 00:42:52,540 --> 00:42:55,400 In 1950, I mentioned this before, 975 00:42:55,400 --> 00:42:57,490 but Norbert Wiener painted this very dark vision 976 00:42:57,490 --> 00:43:00,160 of computers displacing people. 977 00:43:00,160 --> 00:43:03,850 It took us a long time to evolve into Licklider's vision. 978 00:43:03,850 --> 00:43:06,340 In the 1960s, there was huge anxiety 979 00:43:06,340 --> 00:43:09,160 about workforce automation, as a number of new technologies 980 00:43:09,160 --> 00:43:11,620 were being introduced into the industrial processes.
981 00:43:14,950 --> 00:43:17,820 In 2015, that concern came right back, 982 00:43:17,820 --> 00:43:20,650 that a mix of artificial intelligence, machine learning, 983 00:43:20,650 --> 00:43:24,610 and robotics was going to be very threatening. 984 00:43:24,610 --> 00:43:25,640 This is an old debate. 985 00:43:25,640 --> 00:43:29,860 So the famous economist John Maynard Keynes once 986 00:43:29,860 --> 00:43:32,770 wrote, "Thus we have been expressly evolved 987 00:43:32,770 --> 00:43:36,280 by nature with all our impulses and deepest 988 00:43:36,280 --> 00:43:41,170 instincts for the purpose of solving the economic problem. 989 00:43:41,170 --> 00:43:43,480 If the economic problem is solved, 990 00:43:43,480 --> 00:43:47,660 mankind will be deprived of its traditional purpose." 991 00:43:47,660 --> 00:43:49,400 In other words, our lives are organized 992 00:43:49,400 --> 00:43:53,818 around our work in many, many important kinds of ways. 993 00:43:53,818 --> 00:43:55,360 And that's how our lives get meaning. 994 00:43:55,360 --> 00:44:01,150 And if we're blowing up the work model, what's going to happen? 995 00:44:01,150 --> 00:44:03,370 But the important point to realize here 996 00:44:03,370 --> 00:44:06,450 is, there's no sign of that yet. 997 00:44:06,450 --> 00:44:09,220 The American workweek is 47 hours at this point. 998 00:44:09,220 --> 00:44:12,820 So we're some distance away from this. 999 00:44:12,820 --> 00:44:15,370 We have time to reflect on this, if we ever indeed 1000 00:44:15,370 --> 00:44:19,750 come close to that kind of level of realization. 1001 00:44:19,750 --> 00:44:22,120 I just wanted to expose you to this famous Keynes point, 1002 00:44:22,120 --> 00:44:24,250 because Keynes is thinking about this 1003 00:44:24,250 --> 00:44:26,350 in the midst of the Depression. 1004 00:44:26,350 --> 00:44:30,150 He's thinking about the whole future of work. 1005 00:44:30,150 --> 00:44:33,440 So the backdrop that we're facing now 1006 00:44:33,440 --> 00:44:35,540 is significant work disruption. 1007 00:44:35,540 --> 00:44:38,330 And we've talked a lot about that today. 1008 00:44:38,330 --> 00:44:40,670 Half of the manufacturing jobs, as we talked about back 1009 00:44:40,670 --> 00:44:44,690 in class number three, were lost between 2000 and 2010. 1010 00:44:44,690 --> 00:44:49,710 And we saw the data on median income decline, 1011 00:44:49,710 --> 00:44:53,330 and we talked about the barbell problem. 1012 00:44:53,330 --> 00:44:58,940 And Brynjolfsson and McAfee paint a picture 1013 00:44:58,940 --> 00:45:02,330 of technological job displacement 1014 00:45:02,330 --> 00:45:06,650 and describe the accelerating round of IT technologies 1015 00:45:06,650 --> 00:45:09,170 that are headed towards the workplace. 1016 00:45:09,170 --> 00:45:12,680 Now, when we read about advanced manufacturing, 1017 00:45:12,680 --> 00:45:14,510 the fix for the manufacturing sector 1018 00:45:14,510 --> 00:45:17,930 is full of new technologies like 3D printing 1019 00:45:17,930 --> 00:45:23,180 and digital production and advanced sensors and photonics 1020 00:45:23,180 --> 00:45:29,390 and advanced materials-- a raft of new technologies 1021 00:45:29,390 --> 00:45:31,520 that are also going to be entering that sector 1022 00:45:31,520 --> 00:45:33,290 to make it significantly more efficient 1023 00:45:33,290 --> 00:45:34,850 and get the US back in the game. 1024 00:45:34,850 --> 00:45:37,460 That's the whole object of that exercise, right?
1025 00:45:37,460 --> 00:45:40,100 So there is a raft of new technologies 1026 00:45:40,100 --> 00:45:43,340 that is headed from the IT world and elsewhere 1027 00:45:43,340 --> 00:45:44,720 into the workplace that we're all 1028 00:45:44,720 --> 00:45:47,990 going to have to reckon with. 1029 00:45:47,990 --> 00:45:50,060 And there are technological dystopians. 1030 00:45:50,060 --> 00:45:55,700 So Martin Ford argues that there is a system of "winner take all 1031 00:45:55,700 --> 00:45:59,390 distribution" that's evolving, that the tendency of software 1032 00:45:59,390 --> 00:46:03,530 towards monopoly and the ability of computers to do more than 1033 00:46:03,530 --> 00:46:06,583 they're programmed for-- deep learning, as it's called-- 1034 00:46:06,583 --> 00:46:08,000 is going to push out a whole lower 1035 00:46:08,000 --> 00:46:11,900 end, that this "winner take all distribution" 1036 00:46:11,900 --> 00:46:14,390 is going to be very disruptive in this society. 1037 00:46:14,390 --> 00:46:18,560 Tyler Cowen argues that the country 1038 00:46:18,560 --> 00:46:21,290 will be divided by this technological advance 1039 00:46:21,290 --> 00:46:22,335 into two countries. 1040 00:46:22,335 --> 00:46:23,960 We'll have a developed world, and we'll 1041 00:46:23,960 --> 00:46:27,120 have an undeveloped world in the United States. 1042 00:46:27,120 --> 00:46:31,640 And indeed, Peter Temin of MIT, the economist, 1043 00:46:31,640 --> 00:46:36,310 has just written a book arguing that current inequality 1044 00:46:36,310 --> 00:46:38,810 in the United States has reached such a level 1045 00:46:38,810 --> 00:46:40,550 that the right comparison is to look 1046 00:46:40,550 --> 00:46:43,990 at developing-world economics in a US context 1047 00:46:43,990 --> 00:46:45,350 to understand it. 1048 00:46:45,350 --> 00:46:49,100 Just for example, in Germany, the lowest 1049 00:46:49,100 --> 00:46:56,090 20% compared to the highest 20%, it's 1 to 4. 1050 00:46:56,090 --> 00:46:57,860 In the United States, the lowest 20% 1051 00:46:57,860 --> 00:47:02,660 of the population income to the highest 20% is now 1 to 8. 1052 00:47:02,660 --> 00:47:05,570 That's pretty dramatic income inequality. 1053 00:47:05,570 --> 00:47:09,440 That's a pretty-- that's different than how the US was 1054 00:47:09,440 --> 00:47:13,850 for the previous century. 1055 00:47:13,850 --> 00:47:19,300 In their Second Machine Age book, 1056 00:47:19,300 --> 00:47:21,690 Erik Brynjolfsson and Andrew McAfee argue that low-skilled jobs 1057 00:47:21,690 --> 00:47:23,580 replaced the middle-skilled jobs that 1058 00:47:23,580 --> 00:47:26,220 were displaced by technology. 1059 00:47:26,220 --> 00:47:28,710 They argue for tax policy as a fix here. 1060 00:47:28,710 --> 00:47:32,850 Although, as we've discussed and Rasheed led us into it, 1061 00:47:32,850 --> 00:47:36,850 that politically is a very tough proposition. 1062 00:47:36,850 --> 00:47:39,960 There are studies that project very high levels 1063 00:47:39,960 --> 00:47:41,860 of technological job displacement. 1064 00:47:41,860 --> 00:47:45,450 So out of Oxford, Frey and Osborne 1065 00:47:45,450 --> 00:47:47,640 look at occupational descriptions 1066 00:47:47,640 --> 00:47:51,840 and conclude, oh, 47% of all US jobs 1067 00:47:51,840 --> 00:47:56,790 have a high likelihood of being replaced by automation. 1068 00:47:56,790 --> 00:48:00,870 Rob Atkinson at ITIF has given a very strong critique 1069 00:48:00,870 --> 00:48:02,550 of that study.
1070 00:48:02,550 --> 00:48:08,280 He argues that it assumes a highly unlikely 3% 1071 00:48:08,280 --> 00:48:11,887 labor productivity growth rate from the advent of all this automation. 1072 00:48:11,887 --> 00:48:13,720 We haven't seen that since the 19th century. 1073 00:48:13,720 --> 00:48:19,050 So don't expect it to happen anytime soon, Atkinson argues. 1074 00:48:19,050 --> 00:48:22,950 They also argue that Frey and Osborne engage 1075 00:48:22,950 --> 00:48:25,290 in a lump of labor fallacy. 1076 00:48:25,290 --> 00:48:30,630 In other words, they assume that there's a fixed amount of work 1077 00:48:30,630 --> 00:48:33,300 and that the amount of work doesn't grow, spurred 1078 00:48:33,300 --> 00:48:34,440 by these new technologies. 1079 00:48:34,440 --> 00:48:36,905 That's what David Autor led us to understand better, 1080 00:48:36,905 --> 00:48:38,030 this whole complementarity. 1081 00:48:38,030 --> 00:48:41,430 In other words, there isn't a fixed amount of work. 1082 00:48:41,430 --> 00:48:44,670 The complementarity can create more, 1083 00:48:44,670 --> 00:48:47,520 so that there are larger net potential gains here 1084 00:48:47,520 --> 00:48:50,290 across the economy. 1085 00:48:50,290 --> 00:48:53,350 The most realistic study of technological displacement 1086 00:48:53,350 --> 00:48:57,480 so far has been one done by the OECD last summer. 1087 00:48:57,480 --> 00:49:01,200 They looked at 22 OECD nations. 1088 00:49:01,200 --> 00:49:06,240 And rather than look at occupational descriptions, 1089 00:49:06,240 --> 00:49:10,440 they decided, let's go talk to the workers that are actually 1090 00:49:10,440 --> 00:49:11,580 doing those jobs. 1091 00:49:11,580 --> 00:49:16,230 And this went through 22 different OECD countries. 1092 00:49:16,230 --> 00:49:18,690 They actually went out and talked to the workforce. 1093 00:49:18,690 --> 00:49:20,600 And they would talk to people about, 1094 00:49:20,600 --> 00:49:22,210 you may have a particular job title, 1095 00:49:22,210 --> 00:49:24,030 but what are you actually doing? 1096 00:49:24,030 --> 00:49:25,890 What does your work actually consist of? 1097 00:49:25,890 --> 00:49:29,850 And of course, it turns out, and we all surmise this in a way, 1098 00:49:29,850 --> 00:49:32,250 people are doing many things that aren't necessarily 1099 00:49:32,250 --> 00:49:34,170 captured in their job title. 1100 00:49:34,170 --> 00:49:37,740 Heaven forbid if somebody held me to my job title. 1101 00:49:37,740 --> 00:49:39,730 Nothing would ever get done. 1102 00:49:39,730 --> 00:49:41,940 And we all know this, right? 1103 00:49:41,940 --> 00:49:44,140 And that's what the OECD found-- people 1104 00:49:44,140 --> 00:49:47,010 are doing much more stuff than their particular narrow job 1105 00:49:47,010 --> 00:49:49,440 description may say that they're doing. 1106 00:49:49,440 --> 00:49:52,710 So they concluded over an extended period of time, 1107 00:49:52,710 --> 00:49:55,680 across those developed countries, 1108 00:49:55,680 --> 00:49:58,830 the effect of technological displacement was maybe 9%. 1109 00:49:58,830 --> 00:50:00,420 Now, that's a significant number, 1110 00:50:00,420 --> 00:50:03,120 even over an extended period of time. 1111 00:50:03,120 --> 00:50:04,800 That's a significant number. 1112 00:50:04,800 --> 00:50:08,520 But it's not 47%. 1113 00:50:08,520 --> 00:50:10,230 In the United States, the number was 10%.
1114 00:50:10,230 --> 00:50:12,245 The range was 6% to 12%, depending 1115 00:50:12,245 --> 00:50:14,370 on what kind of employment you had in your country. 1116 00:50:17,310 --> 00:50:20,010 That means we have a job ahead of us, 1117 00:50:20,010 --> 00:50:22,530 but it's not a nightmare. 1118 00:50:22,530 --> 00:50:26,070 It's not something we should be sleepless over every night 1119 00:50:26,070 --> 00:50:28,170 from now on. 1120 00:50:28,170 --> 00:50:31,980 There's a recent effort on-- this is not in the book, 1121 00:50:31,980 --> 00:50:32,700 in the chapter-- 1122 00:50:32,700 --> 00:50:36,390 but Daron Acemoglu, another very noted MIT economist 1123 00:50:36,390 --> 00:50:39,180 who does a lot of work with David Autor 1124 00:50:39,180 --> 00:50:41,370 and is a real student of innovation, 1125 00:50:41,370 --> 00:50:46,680 he's a growth economist, Daron looked at robotics. 1126 00:50:46,680 --> 00:50:48,900 And he looked at job displacement 1127 00:50:48,900 --> 00:50:49,920 in the robotics sector. 1128 00:50:49,920 --> 00:50:51,930 And he had a pretty useful-- 1129 00:50:51,930 --> 00:50:54,390 had a good database that he was working from. 1130 00:50:54,390 --> 00:51:02,500 He concluded that over a 17-year period ending in 2007, 1131 00:51:02,500 --> 00:51:04,510 a total for the entire United States 1132 00:51:04,510 --> 00:51:08,680 economy of technological job displacement caused by robotics 1133 00:51:08,680 --> 00:51:12,100 was somewhere between 300,000 and 600,000 jobs 1134 00:51:12,100 --> 00:51:14,020 over a 17-year period. 1135 00:51:14,020 --> 00:51:16,600 That is not a big number. 1136 00:51:16,600 --> 00:51:18,580 Job churn in the United States 1137 00:51:18,580 --> 00:51:23,290 is something like 75,000 jobs per week. 1138 00:51:23,290 --> 00:51:25,190 Look at it in that kind of context. 1139 00:51:25,190 --> 00:51:29,170 So these changes are going to be significant. 1140 00:51:29,170 --> 00:51:32,320 And I would argue they will occur. 1141 00:51:32,320 --> 00:51:36,220 But we may have a time period in which we're able to look at them 1142 00:51:36,220 --> 00:51:40,240 and think and plan and act, rather than having them be completely 1143 00:51:40,240 --> 00:51:42,430 disruptive overnight. 1144 00:51:42,430 --> 00:51:48,280 So there's a quote by the vice president 1145 00:51:48,280 --> 00:51:52,960 of the Danish Confederation of Trade Unions, named 1146 00:51:52,960 --> 00:51:55,780 Nanna Højlund. 1147 00:51:55,780 --> 00:51:59,080 Her comment at an OECD conference 1148 00:51:59,080 --> 00:52:03,550 I went to last fall was, "New technology 1149 00:52:03,550 --> 00:52:07,270 is not the enemy of workers, old technology is." 1150 00:52:07,270 --> 00:52:11,500 And what she's saying here is that the companies most 1151 00:52:11,500 --> 00:52:14,950 likely to fail, where all jobs will be lost, 1152 00:52:14,950 --> 00:52:18,520 are those who are not keeping up with the technology. 1153 00:52:18,520 --> 00:52:22,150 Workers have a stake in keeping their firms current 1154 00:52:22,150 --> 00:52:23,380 with these realities. 1155 00:52:29,970 --> 00:52:32,440 And here's a chart that just shows 1156 00:52:32,440 --> 00:52:34,000 the rate at which robots are being 1157 00:52:34,000 --> 00:52:35,500 installed into production. 1158 00:52:35,500 --> 00:52:38,680 It's actually slowed fairly significantly.
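A rough back-of-the-envelope sketch of that displacement-versus-churn comparison, using only the approximate figures quoted above (300,000 to 600,000 jobs over 17 years, and something like 75,000 jobs of churn per week); the arithmetic is purely illustrative:

    # Rough comparison of the robotics job-displacement range quoted above
    # with ordinary weekly job churn. All figures are the approximations
    # cited in the discussion, not precise statistics.
    displaced_low, displaced_high = 300_000, 600_000   # jobs over the 17-year period
    years = 17
    churn_per_week = 75_000                            # approximate US weekly job churn

    per_year_low = displaced_low / years               # roughly 18,000 jobs a year
    per_year_high = displaced_high / years             # roughly 35,000 jobs a year

    # How many weeks of ordinary churn equal the entire 17-year displacement?
    weeks_low = displaced_low / churn_per_week         # 4 weeks
    weeks_high = displaced_high / churn_per_week       # 8 weeks

    print(f"Displacement per year: {per_year_low:,.0f} to {per_year_high:,.0f} jobs")
    print(f"Equivalent weeks of ordinary churn: {weeks_low:.0f} to {weeks_high:.0f}")

On those figures, the whole 17-year displacement from robotics adds up to roughly one to two months of ordinary labor-market churn, which is the sense in which it is described above as significant but not a big number.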
1159 00:52:38,680 --> 00:52:40,390 Now, look, we're in a process of moving 1160 00:52:40,390 --> 00:52:43,690 from old-style, industrial robotics, which 1161 00:52:43,690 --> 00:52:48,210 used to weigh tons and they were fixed in place 1162 00:52:48,210 --> 00:52:49,960 and they had to be fenced off, because you 1163 00:52:49,960 --> 00:52:52,480 didn't want to be anywhere near them, because they were unsafe. 1164 00:52:52,480 --> 00:52:53,938 And they would do the perfect weld, 1165 00:52:53,938 --> 00:52:56,967 and that's all they could do, on an auto line. 1166 00:52:56,967 --> 00:52:59,050 We're moving to a whole new generation of robotics 1167 00:52:59,050 --> 00:53:00,700 that are much more flexible, that 1168 00:53:00,700 --> 00:53:04,770 are subject to voice command, that 1169 00:53:04,770 --> 00:53:06,625 are a radically different kind of robotics 1170 00:53:06,625 --> 00:53:08,000 than the old industrial robotics. 1171 00:53:08,000 --> 00:53:12,070 So some of this data may reflect the fact 1172 00:53:12,070 --> 00:53:14,080 that that new generation of robotics 1173 00:53:14,080 --> 00:53:17,350 is only now just starting to be thought about and put together. 1174 00:53:17,350 --> 00:53:17,850 Max? 1175 00:53:17,850 --> 00:53:19,392 MAX: Do you think that this has to do 1176 00:53:19,392 --> 00:53:21,190 with some of the anxieties that workers 1177 00:53:21,190 --> 00:53:23,620 have expressed around being replaced by robotics? 1178 00:53:23,620 --> 00:53:26,185 Like there might be some sort of correlation, relation there? 1179 00:53:26,185 --> 00:53:27,810 WILLIAM BONVILLIAN: This slowdown rate? 1180 00:53:27,810 --> 00:53:28,485 MAX: Yeah. 1181 00:53:28,485 --> 00:53:29,860 WILLIAM BONVILLIAN: I don't know. 1182 00:53:29,860 --> 00:53:32,410 I don't think so, because we don't have 1183 00:53:32,410 --> 00:53:34,360 an organized workforce anymore. 1184 00:53:34,360 --> 00:53:35,860 We're down to about 11% unionization 1185 00:53:35,860 --> 00:53:37,630 in the United States. 1186 00:53:37,630 --> 00:53:38,170 MAX: Really? 1187 00:53:38,170 --> 00:53:39,170 WILLIAM BONVILLIAN: Yes. 1188 00:53:39,170 --> 00:53:43,390 So workers don't have a lot to say at this point in the US 1189 00:53:43,390 --> 00:53:44,638 about a lot of these changes. 1190 00:53:44,638 --> 00:53:46,930 They don't have a lot of control over their employment, 1191 00:53:46,930 --> 00:53:52,550 frankly, which is another set of issues. 1192 00:53:52,550 --> 00:53:58,840 But the chapter argues that the most immediate 1193 00:53:58,840 --> 00:54:04,900 near-term problem is not technological displacement. 1194 00:54:04,900 --> 00:54:07,000 That's going to be a longer-term problem. 1195 00:54:07,000 --> 00:54:10,600 The most immediate near-term problem 1196 00:54:10,600 --> 00:54:14,440 is what economists call "secular stagnation." 1197 00:54:14,440 --> 00:54:16,240 And that's the problem we're in now. 1198 00:54:16,240 --> 00:54:21,580 And this term was developed by Larry Summers in 2013 1199 00:54:21,580 --> 00:54:24,100 to essentially describe this situation 1200 00:54:24,100 --> 00:54:26,680 that we've got at the moment-- 1201 00:54:26,680 --> 00:54:29,260 interest rates around zero and US output 1202 00:54:29,260 --> 00:54:31,600 is insufficient to support full employment. 1203 00:54:31,600 --> 00:54:32,600 He's writing in 2013. 1204 00:54:32,600 --> 00:54:33,850 It's gotten better since then.
1205 00:54:33,850 --> 00:54:36,370 But we do have this structural unemployment, 1206 00:54:36,370 --> 00:54:40,330 where a significant number of people who had jobs no longer 1207 00:54:40,330 --> 00:54:41,590 are in the workforce. 1208 00:54:41,590 --> 00:54:44,440 That's the problem that's still with us. 1209 00:54:44,440 --> 00:54:46,510 He argues that there are a series of factors 1210 00:54:46,510 --> 00:54:49,030 that affect this, that are hampering the investment 1211 00:54:49,030 --> 00:54:50,020 demand. 1212 00:54:50,020 --> 00:54:52,630 So decreasing population growth-- 1213 00:54:52,630 --> 00:54:55,660 we can all see that-- a relative decrease in the cost of capital 1214 00:54:55,660 --> 00:54:58,450 goods, excess money being retained and not used 1215 00:54:58,450 --> 00:55:00,610 by large corporations-- so retained earnings 1216 00:55:00,610 --> 00:55:03,370 by large companies are very high at this point. 1217 00:55:03,370 --> 00:55:06,520 On the savings side, he would argue, 1218 00:55:06,520 --> 00:55:10,990 excessive reserve holdings in developing countries 1219 00:55:10,990 --> 00:55:15,550 due in part to post-crisis financial regulation, 1220 00:55:15,550 --> 00:55:20,200 inequality, and increasing intermediation costs 1221 00:55:20,200 --> 00:55:24,520 are all causative factors in this period of low growth. 1222 00:55:24,520 --> 00:55:28,780 We have very low GDP growth, and we have very low productivity growth. 1223 00:55:28,780 --> 00:55:32,770 So economy-wide, our productivity growth is about 1.12%. 1224 00:55:32,770 --> 00:55:34,265 Not a good level, right? 1225 00:55:34,265 --> 00:55:36,340 MAX: But we still have low unemployment. 1226 00:55:36,340 --> 00:55:36,590 WILLIAM BONVILLIAN: Right? 1227 00:55:36,590 --> 00:55:38,200 MAX: But we still have low unemployment. 1228 00:55:38,200 --> 00:55:40,370 WILLIAM BONVILLIAN: Yeah, we have 4.9% unemployment. 1229 00:55:40,370 --> 00:55:43,510 But go back to Rasheed's barbell example-- 1230 00:55:43,510 --> 00:55:44,980 a lot of that is the thinning out 1231 00:55:44,980 --> 00:55:48,440 of the center being pushed to lower-end services jobs. 1232 00:55:48,440 --> 00:55:51,310 RASHEED: Also, the labor force participation rate is too low. 1233 00:55:51,310 --> 00:55:52,390 WILLIAM BONVILLIAN: Yeah. 1234 00:55:52,390 --> 00:55:56,710 So we have low productivity, low capital investment, 1235 00:55:56,710 --> 00:56:00,160 and a low growth rate, growing inequality, and a declining 1236 00:56:00,160 --> 00:56:01,570 middle class. 1237 00:56:01,570 --> 00:56:04,040 These are big problems. 1238 00:56:04,040 --> 00:56:07,930 Now, low productivity rate, low capital investment, 1239 00:56:07,930 --> 00:56:13,360 that tells us that automation is not happening at the moment, 1240 00:56:13,360 --> 00:56:16,840 because automation is designed to increase productivity 1241 00:56:16,840 --> 00:56:19,060 and is based on increased capital investment. 1242 00:56:19,060 --> 00:56:20,750 We're not seeing either one of those, 1243 00:56:20,750 --> 00:56:23,680 so that tells us that we're not entering 1244 00:56:23,680 --> 00:56:27,030 a period of radical growth in automation. 1245 00:56:27,030 --> 00:56:27,530 Martin? 1246 00:56:27,530 --> 00:56:29,800 MARTIN: So is this just the US or is it global? 1247 00:56:29,800 --> 00:56:32,590 And also, is it taking into account the industrial 1248 00:56:32,590 --> 00:56:34,096 jobs leaving? 1249 00:56:34,096 --> 00:56:35,830 WILLIAM BONVILLIAN: Yeah.
1250 00:56:35,830 --> 00:56:37,540 It does take into account the decline 1251 00:56:37,540 --> 00:56:39,650 in manufacturing employment. 1252 00:56:39,650 --> 00:56:41,830 This is a US picture that Summers is painting, 1253 00:56:41,830 --> 00:56:44,500 and he's going to propose a US solution in a second, which is 1254 00:56:44,500 --> 00:56:46,720 a major infrastructure package. 1255 00:56:46,720 --> 00:56:51,910 But the secular stagnation is a phenomenon now 1256 00:56:51,910 --> 00:56:55,300 in European economies as well, and in the Japanese economy. 1257 00:56:55,300 --> 00:56:58,420 So this is a worldwide phenomenon of the developed 1258 00:56:58,420 --> 00:57:00,040 world at the moment. 1259 00:57:00,040 --> 00:57:02,680 And it's a problem that you all have. 1260 00:57:02,680 --> 00:57:05,440 You're going to have this future work problem for sure, 1261 00:57:05,440 --> 00:57:06,890 but now you've got this one. 1262 00:57:06,890 --> 00:57:09,400 AUDIENCE: Do we also have an aging workforce? 1263 00:57:09,400 --> 00:57:11,560 WILLIAM BONVILLIAN: Yes, aging workforce, right. 1264 00:57:11,560 --> 00:57:14,030 And that's a powerful part of this story, too. 1265 00:57:14,030 --> 00:57:14,530 OK? 1266 00:57:14,530 --> 00:57:23,160 So if you have a population increase, a 1%-a-year population 1267 00:57:23,160 --> 00:57:27,120 increase, that guarantees you one additional percentage 1268 00:57:27,120 --> 00:57:29,130 point in your growth rate. 1269 00:57:29,130 --> 00:57:30,810 If you have zero population increase, 1270 00:57:30,810 --> 00:57:34,860 you just pull one percentage point of growth out of your growth rate, 1271 00:57:34,860 --> 00:57:36,450 out of your GDP rate. 1272 00:57:36,450 --> 00:57:39,200 So that's a phenomenon that a lot of these developing countries 1273 00:57:39,200 --> 00:57:40,590 are starting to wrestle with. 1274 00:57:40,590 --> 00:57:42,673 And China's going to have to wrestle with this one 1275 00:57:42,673 --> 00:57:44,730 fairly soon. 1276 00:57:44,730 --> 00:57:47,580 So here's what Summers is talking 1277 00:57:47,580 --> 00:57:51,180 about in terms of excessive savings over investment. 1278 00:57:51,180 --> 00:57:53,910 Look at the split between savings and investment levels. 1279 00:57:53,910 --> 00:57:58,140 That means we're not investing our savings in the society, 1280 00:57:58,140 --> 00:57:59,640 in the economy. 1281 00:57:59,640 --> 00:58:01,380 And Summers identifies this gap. 1282 00:58:01,380 --> 00:58:06,680 RASHEED: And in 2008 it's like where? 1283 00:58:06,680 --> 00:58:08,350 WILLIAM BONVILLIAN: Right there, right? 1284 00:58:08,350 --> 00:58:10,960 Well, that's when savings falls apart. 1285 00:58:10,960 --> 00:58:14,140 But there's still a big gap in the system. 1286 00:58:14,140 --> 00:58:17,170 And investment takes a hit then, too. 1287 00:58:17,170 --> 00:58:19,630 I want you to know of this economist named Robert Gordon, 1288 00:58:19,630 --> 00:58:22,210 because you all are hanging out at MIT, 1289 00:58:22,210 --> 00:58:25,930 and we believe that technological advance is 1290 00:58:25,930 --> 00:58:28,360 the most powerful thing ever. 1291 00:58:28,360 --> 00:58:31,420 But Robert Gordon paints a different picture. 1292 00:58:31,420 --> 00:58:33,460 And you just ought to be aware of this debate-- 1293 00:58:33,460 --> 00:58:35,500 not that he's necessarily right. 1294 00:58:35,500 --> 00:58:39,310 This was by far the best-selling book on economics 1295 00:58:39,310 --> 00:58:41,170 for the last two years.
1296 00:58:41,170 --> 00:58:43,840 And Gordon's book is called The Rise and Fall 1297 00:58:43,840 --> 00:58:44,770 of American Growth. 1298 00:58:44,770 --> 00:58:47,875 He argues that the current low-growth secular stagnation, 1299 00:58:47,875 --> 00:58:51,280 and he uses the term, is the result 1300 00:58:51,280 --> 00:58:56,680 not of insufficient demand, but of insufficient supply, i.e. 1301 00:58:56,680 --> 00:58:59,380 insufficient technological supply. 1302 00:58:59,380 --> 00:59:04,330 So he argues that the IT revolution is fading 1303 00:59:04,330 --> 00:59:07,120 and that we never got as much out of the IT revolution 1304 00:59:07,120 --> 00:59:11,140 as we did out of prior innovation waves. 1305 00:59:11,140 --> 00:59:13,870 This one just wasn't that big compared to electricity 1306 00:59:13,870 --> 00:59:16,480 and compared to railroads and some of the other big waves 1307 00:59:16,480 --> 00:59:18,250 that we've been through. 1308 00:59:18,250 --> 00:59:22,930 And that we're locked on to an innovation wave that 1309 00:59:22,930 --> 00:59:25,030 is just not, at this stage, producing 1310 00:59:25,030 --> 00:59:28,060 that much productivity growth. 1311 00:59:28,060 --> 00:59:31,990 And he goes into enormous depth about each one 1312 00:59:31,990 --> 00:59:32,930 of these technologies. 1313 00:59:32,930 --> 00:59:36,010 So a complete rarity for an economist-- he 1314 00:59:36,010 --> 00:59:39,850 actually has read into the technology literature 1315 00:59:39,850 --> 00:59:42,190 and is looking at what these inventions actually 1316 00:59:42,190 --> 00:59:44,660 were in the second half of the 19th century 1317 00:59:44,660 --> 00:59:46,370 and what they meant. 1318 00:59:46,370 --> 00:59:50,748 So it's an intriguing alternative story, 1319 00:59:50,748 --> 00:59:52,540 one that's contrary to the story that we've 1320 00:59:52,540 --> 00:59:55,280 been telling ourselves for a long period of time. 1321 00:59:55,280 --> 00:59:56,410 How do we fix this? 1322 00:59:59,440 --> 01:00:03,760 Gordon says that it involves-- fixing societal headwinds is 1323 01:00:03,760 --> 01:00:05,920 the best way out of this box. 1324 01:00:05,920 --> 01:00:10,830 And improving growth will depend on educational attainment. 1325 01:00:10,830 --> 01:00:12,580 And remember when we read Goldin and Katz, 1326 01:00:12,580 --> 01:00:17,080 we talked about how a key to the historic American growth rate 1327 01:00:17,080 --> 01:00:20,920 was that there was always a rise in the 1328 01:00:20,920 --> 01:00:23,507 technical requirements of the economy? 1329 01:00:23,507 --> 01:00:25,090 And the genius of the American system, 1330 01:00:25,090 --> 01:00:29,140 at least till the mid-'70s, was that we kept an education curve-- 1331 01:00:29,140 --> 01:00:31,630 high school, and particularly mass college 1332 01:00:31,630 --> 01:00:34,660 education-- out ahead of that, right? 1333 01:00:34,660 --> 01:00:37,720 That, in turn-- those curves are related to each other. 1334 01:00:37,720 --> 01:00:40,480 That talent base, as we know from Romer, 1335 01:00:40,480 --> 01:00:42,310 helps drive the technological base. 1336 01:00:42,310 --> 01:00:44,130 If you get a large part of your economy 1337 01:00:44,130 --> 01:00:48,040 out in prospecting or involved in supporting a technology 1338 01:00:48,040 --> 01:00:50,830 advance, the whole system will do better.
1339 01:00:50,830 --> 01:00:55,690 So when we leveled out the graduation rate at the college 1340 01:00:55,690 --> 01:01:00,790 level in the mid-'70s, we paid a big price in terms of ongoing 1341 01:01:00,790 --> 01:01:03,845 technology productivity and technology advance, as well. 1342 01:01:03,845 --> 01:01:05,470 So that's kind of what he's suggesting. 1343 01:01:05,470 --> 01:01:08,290 It's an interesting-- he doesn't spell it out quite that way. 1344 01:01:08,290 --> 01:01:10,850 But it's an interesting idea. 1345 01:01:10,850 --> 01:01:13,900 So tackling some of these underlying social problems, 1346 01:01:13,900 --> 01:01:16,390 including inequality, i.e. bringing more people 1347 01:01:16,390 --> 01:01:22,090 into the economy, all of these are fixes, in his mind, 1348 01:01:22,090 --> 01:01:27,100 for getting around the fact that the IT revolution isn't 1349 01:01:27,100 --> 01:01:28,930 quite what we hoped for. 1350 01:01:28,930 --> 01:01:32,540 Now, what do I think of that? 1351 01:01:32,540 --> 01:01:34,240 You know, time will tell. 1352 01:01:34,240 --> 01:01:35,300 We're going to find out. 1353 01:01:35,300 --> 01:01:38,110 I think it may be we had the rapid rise, we're 1354 01:01:38,110 --> 01:01:40,630 on a more moderate growth pattern in the IT revolution, 1355 01:01:40,630 --> 01:01:43,240 as we talked about when we discussed innovation waves. 1356 01:01:43,240 --> 01:01:45,160 And we can only expect more moderate growth 1357 01:01:45,160 --> 01:01:46,840 as a result of that. 1358 01:01:46,840 --> 01:01:49,120 But we are waiting for the next big thing, 1359 01:01:49,120 --> 01:01:50,778 for the next big innovation. 1360 01:01:53,710 --> 01:01:58,390 So Summers' solution is to throw money into infrastructure-- 1361 01:01:58,390 --> 01:02:00,010 and, well, let's do roads-- 1362 01:02:00,010 --> 01:02:01,850 and that will help that community 1363 01:02:01,850 --> 01:02:04,030 that got displaced from manufacturing jobs and 1364 01:02:04,030 --> 01:02:06,490 isn't upskilling. 1365 01:02:06,490 --> 01:02:08,120 That could help them. 1366 01:02:08,120 --> 01:02:11,880 But the problem with Summers' solution, I would argue, 1367 01:02:11,880 --> 01:02:14,740 and the chapter argues, is that it doesn't do anything 1368 01:02:14,740 --> 01:02:18,670 to address productivity, which is the heart of the problem. 1369 01:02:18,670 --> 01:02:25,690 Getting that productivity number up, as we know, increases GDP. 1370 01:02:25,690 --> 01:02:28,030 It's a key causative factor behind the growth rate. 1371 01:02:28,030 --> 01:02:30,100 And let's get that growth rate up 1372 01:02:30,100 --> 01:02:32,590 and overall economic well-being will improve, 1373 01:02:32,590 --> 01:02:34,590 and you'll have more money to distribute, right? 1374 01:02:34,590 --> 01:02:36,730 Let's at least get that right. 1375 01:02:36,730 --> 01:02:38,980 But an infrastructure program that's 1376 01:02:38,980 --> 01:02:43,390 aimed at roads and sewers, we already have our road system. 1377 01:02:43,390 --> 01:02:45,730 We have 90,000 miles of interstate highway. 1378 01:02:45,730 --> 01:02:48,700 By repairing it, 1379 01:02:48,700 --> 01:02:52,000 we're not going to get the innovation wave boost 1380 01:02:52,000 --> 01:02:54,490 that the combination of building that interstate system 1381 01:02:54,490 --> 01:02:57,160 and internal combustion engines got us back 1382 01:02:57,160 --> 01:02:58,780 in the '40s and '50s. 1383 01:02:58,780 --> 01:03:00,700 We're just not going to get that out of it.
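A minimal sketch of why that productivity number is the heart of the problem, using the rough figures quoted earlier (about 1.12% economy-wide productivity growth, and the point that a percentage point of population growth adds roughly a percentage point to the growth rate); the decomposition is a standard back-of-the-envelope approximation, and the 0.7% population-growth figure is an assumption for illustration only:

    import math

    # Back-of-the-envelope growth accounting:
    # GDP growth is roughly labor-force (population) growth plus productivity growth.
    def gdp_growth(pop_growth, productivity_growth):
        return pop_growth + productivity_growth

    # Years for output per worker to double at a given productivity growth rate.
    def doubling_time(rate):
        return math.log(2) / math.log(1 + rate)

    # Assumed ~0.7% population growth, plus the ~1.12% productivity growth cited above.
    print(f"Implied GDP growth: {gdp_growth(0.007, 0.0112):.1%}")  # about 1.8%

    print(f"Doubling time at 1.12% productivity growth: {doubling_time(0.0112):.0f} years")  # ~62 years
    print(f"Doubling time at 3% productivity growth:    {doubling_time(0.03):.0f} years")    # ~23 years

At 1.12% productivity growth, output per worker doubles only about every 62 years; at a 3% rate it would double in about 23, which is the scale of what is at stake in getting that productivity number up.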
1384 01:03:00,700 --> 01:03:04,260 So Summers' prescription of just doing infrastructure 1385 01:03:04,260 --> 01:03:08,980 will help people who are underemployed, 1386 01:03:08,980 --> 01:03:12,160 but it's not going to wrestle with this underlying 1387 01:03:12,160 --> 01:03:16,930 productivity rate, which is really the problem at hand. 1388 01:03:16,930 --> 01:03:19,540 And the alternative here, the book argues, 1389 01:03:19,540 --> 01:03:21,505 is investing in innovation infrastructure. 1390 01:03:21,505 --> 01:03:23,740 In other words, having a larger view 1391 01:03:23,740 --> 01:03:25,710 of what that infrastructure ought to be 1392 01:03:25,710 --> 01:03:28,780 and investing in that. 1393 01:03:28,780 --> 01:03:29,680 All right. 1394 01:03:29,680 --> 01:03:32,010 That's this text. 1395 01:03:32,010 --> 01:03:33,940 And let's take it apart. 1396 01:03:33,940 --> 01:03:35,360 Luyao, it's all yours. 1397 01:03:38,440 --> 01:03:40,890 I'll go back to the hiking picture at the beginning. 1398 01:03:40,890 --> 01:03:42,140 RASHEED: Everybody's favorite. 1399 01:03:42,140 --> 01:03:44,370 MARTIN: So it's the last 16 minutes, pretty much. 1400 01:03:44,370 --> 01:03:47,852 So do you want to wrap up, and then go back to the discussion? 1401 01:03:47,852 --> 01:03:49,310 WILLIAM BONVILLIAN: Well, let's let 1402 01:03:49,310 --> 01:03:51,440 Luyao tell us some key observations 1403 01:03:51,440 --> 01:03:52,480 about this chapter. 1404 01:03:52,480 --> 01:03:55,450 I agree, we're heading towards the deadline, 1405 01:03:55,450 --> 01:03:57,880 and there's some things I wanted to say, too. 1406 01:03:57,880 --> 01:04:00,020 But go ahead. 1407 01:04:00,020 --> 01:04:02,150 AUDIENCE: Yeah, we realize there's basically 1408 01:04:02,150 --> 01:04:04,130 the competition of the technology 1409 01:04:04,130 --> 01:04:06,590 replacement and the possible solutions 1410 01:04:06,590 --> 01:04:10,460 that we have, and also all the policy factors that we 1411 01:04:10,460 --> 01:04:12,380 want to take into account. 1412 01:04:12,380 --> 01:04:15,950 And I think the technology replacement can actually 1413 01:04:15,950 --> 01:04:21,320 be an opportunity for economies having an aging population. 1414 01:04:21,320 --> 01:04:23,210 But at the same time, the United States 1415 01:04:23,210 --> 01:04:29,400 facing low investment rate and low savings 1416 01:04:29,400 --> 01:04:33,240 and low economic growth is probably 1417 01:04:33,240 --> 01:04:38,670 problematic for this substitution to work well. 1418 01:04:42,660 --> 01:04:53,137 So what are the suggestions that we have to just improve-- 1419 01:04:53,137 --> 01:04:54,220 maybe this is too general. 1420 01:05:00,260 --> 01:05:03,243 Yeah, so what are the-- 1421 01:05:03,243 --> 01:05:04,410 well, let's talk about this. 1422 01:05:04,410 --> 01:05:07,680 So in one of the suggestions it's 1423 01:05:07,680 --> 01:05:12,130 saying by reducing the number of working hours, 1424 01:05:12,130 --> 01:05:16,710 it can actually help to encourage consumption and help 1425 01:05:16,710 --> 01:05:17,940 to solve unemployment. 1426 01:05:17,940 --> 01:05:20,000 And how do you guys like this idea? 1427 01:05:20,000 --> 01:05:21,870 MAX: Of reducing hours? 1428 01:05:21,870 --> 01:05:24,840 AUDIENCE: Yeah, reducing the number of working hours. 1429 01:05:24,840 --> 01:05:25,530 WILLIAM BONVILLIAN: France has been 1430 01:05:25,530 --> 01:05:26,610 working on this for a while. 1431 01:05:26,610 --> 01:05:27,690 LILY: I would love that. 
1432 01:05:27,690 --> 01:05:30,510 But then I think you always have the people who-- 1433 01:05:30,510 --> 01:05:32,940 I mean, how do you enforce it and how do you regulate it? 1434 01:05:32,940 --> 01:05:35,580 Because you'll always have the people who will 1435 01:05:35,580 --> 01:05:37,857 want to work more hours to get ahead. 1436 01:05:37,857 --> 01:05:39,190 That's one of the reasons that-- 1437 01:05:39,190 --> 01:05:39,330 AUDIENCE: Yeah. 1438 01:05:39,330 --> 01:05:40,663 That's what Chinese do, I guess. 1439 01:05:40,663 --> 01:05:41,440 [LAUGHTER] 1440 01:05:41,440 --> 01:05:43,320 MAX: Yeah, you define when people 1441 01:05:43,320 --> 01:05:44,508 start getting paid overtime. 1442 01:05:44,508 --> 01:05:45,300 It's not that hard. 1443 01:05:45,300 --> 01:05:47,310 And then, make it so that overtime, 1444 01:05:47,310 --> 01:05:51,480 as it currently is, you have to get permission 1445 01:05:51,480 --> 01:05:53,950 from your supervisor in order to actually get those hours. 1446 01:05:53,950 --> 01:05:55,680 AUDIENCE: At most companies, that's already in place. 1447 01:05:55,680 --> 01:05:58,010 If you're going overtime, you need an excuse for it. 1448 01:05:58,010 --> 01:05:58,270 MAX: Exactly. 1449 01:05:58,270 --> 01:05:59,630 AUDIENCE: And people just do it under the table instead. 1450 01:05:59,630 --> 01:06:00,900 Because by billing overtime, you're 1451 01:06:00,900 --> 01:06:03,400 admitting you can't finish your work in the normal hours, so 1452 01:06:03,400 --> 01:06:04,190 [INAUDIBLE]. 1453 01:06:04,190 --> 01:06:05,730 LILY: Yeah. 1454 01:06:05,730 --> 01:06:09,360 So in my field, we're not paid hourly. 1455 01:06:09,360 --> 01:06:11,790 We're paid on-- the way that we are 1456 01:06:11,790 --> 01:06:16,650 measured is in the output of publications that we have. 1457 01:06:16,650 --> 01:06:18,780 So, yeah, I didn't want to work until 1 o'clock 1458 01:06:18,780 --> 01:06:21,330 in the morning three nights last week. 1459 01:06:21,330 --> 01:06:23,730 But I did, because I had a paper I had to get out. 1460 01:06:23,730 --> 01:06:27,600 And so I can't imagine, in my field, 1461 01:06:27,600 --> 01:06:30,533 an hour limit being in place. 1462 01:06:30,533 --> 01:06:32,700 MAX: Well, yeah, but that's different from a factory 1463 01:06:32,700 --> 01:06:35,893 job or some sort of-- 1464 01:06:35,893 --> 01:06:38,060 MARTIN: Yeah, I think cultural context is important. 1465 01:06:38,060 --> 01:06:41,350 Just like I don't think it's very-- 1466 01:06:41,350 --> 01:06:43,480 I don't think I went into the USC to do less. 1467 01:06:43,480 --> 01:06:43,760 WILLIAM BONVILLIAN: Right. 1468 01:06:43,760 --> 01:06:46,170 This is not a very American solution, I would say. 1469 01:06:46,170 --> 01:06:46,990 [LAUGHTER] 1470 01:06:46,990 --> 01:06:48,260 MARTIN: I would try to think about-- well, 1471 01:06:48,260 --> 01:06:48,950 maybe there's certain sectors-- 1472 01:06:48,950 --> 01:06:50,210 WILLIAM BONVILLIAN: Because we are driven workaholics. 1473 01:06:50,210 --> 01:06:51,520 MARTIN: Yeah. 1474 01:06:51,520 --> 01:06:53,772 It is very much the culture. 1475 01:06:53,772 --> 01:06:55,520 STEPH: Well, in the future of work, 1476 01:06:55,520 --> 01:06:58,520 one of the trends I've seen is people talking about who 1477 01:06:58,520 --> 01:07:00,110 is going to lose jobs. 
1478 01:07:00,110 --> 01:07:02,270 And they always say that the alternative 1479 01:07:02,270 --> 01:07:04,820 is that the people who are going to gain jobs are creatives 1480 01:07:04,820 --> 01:07:06,770 and that it's the creative economy that's 1481 01:07:06,770 --> 01:07:10,538 going to gain the most from automation in manufacturing. 1482 01:07:10,538 --> 01:07:12,830 So there's going to be really interesting opportunities 1483 01:07:12,830 --> 01:07:14,690 in, say, virtual reality, for people 1484 01:07:14,690 --> 01:07:16,820 who are interested in producing, say, entertainment 1485 01:07:16,820 --> 01:07:21,140 specifically for that technological environment. 1486 01:07:21,140 --> 01:07:24,420 AUDIENCE: I think Romer proposed that there will be more job 1487 01:07:24,420 --> 01:07:25,670 displacement in the middle class. 1488 01:07:25,670 --> 01:07:30,590 And we'll see a higher demand for high-pay, high-skilled jobs 1489 01:07:30,590 --> 01:07:33,540 and low-pay, low-skilled jobs in this, 1490 01:07:33,540 --> 01:07:37,560 and we'll see a reduced size of the middle class. 1491 01:07:37,560 --> 01:07:42,970 So in terms of the rising demand for the low-skill, low-pay 1492 01:07:42,970 --> 01:07:47,870 jobs, what would be the social problems associated 1493 01:07:47,870 --> 01:07:51,220 with that kind of future? 1494 01:07:51,220 --> 01:07:54,860 RASHEED: Yeah, I think you just highlighted a pretty important 1495 01:07:54,860 --> 01:07:57,830 point that we tried to get at, which is just this hollowing 1496 01:07:57,830 --> 01:08:00,830 out of the middle class. 1497 01:08:00,830 --> 01:08:04,790 But more importantly, the economic models 1498 01:08:04,790 --> 01:08:07,105 that we're going to have to use are not going to be-- 1499 01:08:07,105 --> 01:08:08,480 now, everyone's income levels are 1500 01:08:08,480 --> 01:08:11,720 distributed relatively evenly in accordance 1501 01:08:11,720 --> 01:08:12,830 with population numbers. 1502 01:08:12,830 --> 01:08:16,590 But you have to use models from different countries and stuff 1503 01:08:16,590 --> 01:08:17,090 like that. 1504 01:08:17,090 --> 01:08:20,990 It's going to be like an entirely different way to think 1505 01:08:20,990 --> 01:08:24,710 about the economy, because we've never really had to deal with 1506 01:08:24,710 --> 01:08:28,473 an economy where we're so-- now, we're like-- 1507 01:08:28,473 --> 01:08:30,890 we've never been bimodal before, having to deal with like, 1508 01:08:30,890 --> 01:08:32,689 we only have low-skill, low-pay jobs 1509 01:08:32,689 --> 01:08:34,819 or high-skill, high-pay jobs. 1510 01:08:34,819 --> 01:08:37,668 And on the [INAUDIBLE] the US sits on, 1511 01:08:37,668 --> 01:08:39,710 I don't think we've had to deal with that before. 1512 01:08:39,710 --> 01:08:40,403 And so-- 1513 01:08:40,403 --> 01:08:43,070 STEPH: I don't think a democracy has had to deal with it before. 1514 01:08:43,070 --> 01:08:45,320 AUDIENCE: I mean, would you not say that early America 1515 01:08:45,320 --> 01:08:46,290 was pretty bimodal? 1516 01:08:46,290 --> 01:08:48,140 You had plantation owners, and you 1517 01:08:48,140 --> 01:08:50,270 had the bankers in New York. 1518 01:08:50,270 --> 01:08:51,748 Everyone else was pretty poor. 1519 01:08:51,748 --> 01:08:52,790 WILLIAM BONVILLIAN: Yeah. 1520 01:08:52,790 --> 01:08:54,710 I think there were times in the later part 1521 01:08:54,710 --> 01:08:57,109 of the 19th century where we had these kinds of inequality 1522 01:08:57,109 --> 01:08:58,029 numbers, too.
1523 01:08:58,029 --> 01:08:59,720 AUDIENCE: Yeah, so I think it was a huge problem-- 1524 01:08:59,720 --> 01:09:00,710 WILLIAM BONVILLIAN: But we got out of them. 1525 01:09:00,710 --> 01:09:03,590 We got out of them, essentially by raising the education 1526 01:09:03,590 --> 01:09:05,439 level of the entire workforce. 1527 01:09:05,439 --> 01:09:08,000 STEPH: And increasing social labor protections. 1528 01:09:08,000 --> 01:09:09,875 WILLIAM BONVILLIAN: Yes, and we did that too. 1529 01:09:11,890 --> 01:09:14,115 MARTIN: Yeah, that is a fundamental assumption 1530 01:09:14,115 --> 01:09:16,240 that I think is pretty interesting-- the assumption 1531 01:09:16,240 --> 01:09:17,698 that there should be a middle class 1532 01:09:17,698 --> 01:09:19,840 or that it's natural for there to be. 1533 01:09:19,840 --> 01:09:22,680 Or whether that's always going to be the case. 1534 01:09:22,680 --> 01:09:24,779 Not to say it shouldn't be or it should be. 1535 01:09:24,779 --> 01:09:26,196 It's just interesting that we have 1536 01:09:26,196 --> 01:09:28,960 the assumption that the fundamentals have always shown 1537 01:09:28,960 --> 01:09:30,640 that that should be a thing. 1538 01:09:30,640 --> 01:09:32,182 WILLIAM BONVILLIAN: I think I'd argue 1539 01:09:32,182 --> 01:09:35,547 that that's pretty fundamental to the ethos of the country. 1540 01:09:35,547 --> 01:09:36,880 STEPH: The American model, yeah. 1541 01:09:36,880 --> 01:09:40,960 WILLIAM BONVILLIAN: Overall, immigration occurred here, 1542 01:09:40,960 --> 01:09:43,540 and social classes were significantly 1543 01:09:43,540 --> 01:09:47,140 reduced in the United States from the period of founding 1544 01:09:47,140 --> 01:09:49,479 until the latter part of the 19th century, 1545 01:09:49,479 --> 01:09:51,776 when industrialization took off. 1546 01:09:51,776 --> 01:09:53,859 So there was a pretty strong middle-class base. 1547 01:09:53,859 --> 01:09:56,860 And that talent base, that involved base, that citizen 1548 01:09:56,860 --> 01:09:59,530 base, is pretty key to the effect-- 1549 01:09:59,530 --> 01:10:02,800 the ability of democracy to function. 1550 01:10:02,800 --> 01:10:05,020 And what happens in a democracy when 1551 01:10:05,020 --> 01:10:07,240 you start to really split the society 1552 01:10:07,240 --> 01:10:09,962 and significant inequality occurs? 1553 01:10:09,962 --> 01:10:11,920 I'm worried about some of the signals 1554 01:10:11,920 --> 01:10:13,140 we're getting on that at the moment. 1555 01:10:13,140 --> 01:10:13,280 MARTIN: Yeah. 1556 01:10:13,280 --> 01:10:15,197 I think it'd be interesting to do a comparison 1557 01:10:15,197 --> 01:10:16,705 to another time in history-- 1558 01:10:16,705 --> 01:10:18,747 STEPH: Like feudalism? 1559 01:10:18,747 --> 01:10:20,080 MARTIN: Well, yeah, just to go-- 1560 01:10:20,080 --> 01:10:24,163 I mean, [INAUDIBLE] from there, because we definitely are 1561 01:10:24,163 --> 01:10:25,330 living in interesting times. 1562 01:10:28,760 --> 01:10:31,600 AUDIENCE: But intuitively, after studying economics 1563 01:10:31,600 --> 01:10:35,260 for several years, I think raising labor mobility is 1564 01:10:35,260 --> 01:10:38,950 probably one of the solutions. 1565 01:10:38,950 --> 01:10:44,590 So how do you guys feel about how effective this would be 1566 01:10:44,590 --> 01:10:46,730 and how feasible it would be? 1567 01:10:46,730 --> 01:10:48,310 MARTIN: Labor mobility? 1568 01:10:48,310 --> 01:10:50,170 AUDIENCE: Raising labor mobility.
1569 01:10:50,170 --> 01:10:51,210 MAX: Doesn't that increase inefficiency, though? 1570 01:10:51,210 --> 01:10:52,990 WILLIAM BONVILLIAN: Well, we've decreased labor mobility 1571 01:10:52,990 --> 01:10:54,407 in the United States significantly 1572 01:10:54,407 --> 01:10:55,580 in the last several decades. 1573 01:10:55,580 --> 01:10:57,432 RASHEED: Wait, define labor mobility-- 1574 01:10:57,432 --> 01:10:59,390 WILLIAM BONVILLIAN: The ability to move up in-- 1575 01:10:59,390 --> 01:11:01,390 AUDIENCE: Yeah, once they lose their employment, 1576 01:11:01,390 --> 01:11:03,430 they can move to find new employment 1577 01:11:03,430 --> 01:11:04,710 with a new set of skills. 1578 01:11:04,710 --> 01:11:05,380 RASHEED: Got it. 1579 01:11:05,380 --> 01:11:06,463 WILLIAM BONVILLIAN: Right. 1580 01:11:06,463 --> 01:11:08,950 But what happens in these failing industrial economies 1581 01:11:08,950 --> 01:11:10,690 is that people-- 1582 01:11:10,690 --> 01:11:13,840 homeownership is typically a family's key asset. 1583 01:11:13,840 --> 01:11:17,290 If the community is failing because of industrial decline, 1584 01:11:17,290 --> 01:11:18,520 that asset collapses. 1585 01:11:18,520 --> 01:11:21,910 It's much harder for people to get out at that point. 1586 01:11:21,910 --> 01:11:24,450 So there's been less labor mobility, 1587 01:11:24,450 --> 01:11:28,750 and interestingly, less job churn in the United States 1588 01:11:28,750 --> 01:11:29,740 in the last decade. 1589 01:11:29,740 --> 01:11:32,230 And then there's social mobility, 1590 01:11:32,230 --> 01:11:35,610 which arguably has been pretty key to this country. 1591 01:11:35,610 --> 01:11:37,785 In other words, your ability to improve yourself, 1592 01:11:37,785 --> 01:11:39,660 which has always been an operating assumption 1593 01:11:39,660 --> 01:11:40,570 in the United States. 1594 01:11:40,570 --> 01:11:43,630 You could always do better than your parents, right? 1595 01:11:43,630 --> 01:11:46,180 And that may be coming to an end, as well. 1596 01:11:46,180 --> 01:11:48,040 That may be significantly foreclosed. 1597 01:11:48,040 --> 01:11:49,915 MARTIN: They've shown that this generation is 1598 01:11:49,915 --> 01:11:52,110 worse off in real numbers than their parents were. 1599 01:11:52,110 --> 01:11:54,592 Or they're less likely to do as well as their parents did. 1600 01:11:54,592 --> 01:11:56,800 WILLIAM BONVILLIAN: Fortunately, you have more stuff. 1601 01:11:56,800 --> 01:11:57,970 That's the one really-- 1602 01:11:57,970 --> 01:11:59,520 MARTIN: Yeah, there's a saying where, 1603 01:11:59,520 --> 01:12:01,062 if you could choose to be alive today 1604 01:12:01,062 --> 01:12:02,910 and be poor, or be a billionaire-- 1605 01:12:02,910 --> 01:12:04,270 WILLIAM BONVILLIAN: We may be poor, but we got iPhones. 1606 01:12:04,270 --> 01:12:05,728 MARTIN: --like a multi-millionaire. 1607 01:12:05,728 --> 01:12:07,700 Yeah, it's like you're rich-poor, right? 1608 01:12:07,700 --> 01:12:08,650 It's like, yeah, I can't afford anything. 1609 01:12:08,650 --> 01:12:10,020 But I got my phone, I got-- 1610 01:12:10,020 --> 01:12:11,768 MAX: I got memes. 1611 01:12:11,768 --> 01:12:13,810 STEPH: Actually, interestingly, in terms of memes 1612 01:12:13,810 --> 01:12:16,420 representing your culture, I saw a meme last night 1613 01:12:16,420 --> 01:12:18,808 that said, why do you think I post so many selfies?
1614 01:12:18,808 --> 01:12:21,350 I don't have a car, I don't have a house to post pictures of, 1615 01:12:21,350 --> 01:12:23,450 so I'm just going to post photos of my face. 1616 01:12:23,450 --> 01:12:25,318 And I think that's really-- 1617 01:12:25,318 --> 01:12:26,110 MAX: That's so sad. 1618 01:12:26,110 --> 01:12:27,020 STEPH: No, I think that's representative 1619 01:12:27,020 --> 01:12:28,930 of our social opportunities, right? 1620 01:12:28,930 --> 01:12:29,515 AUDIENCE: And you know that millionaire 1621 01:12:29,515 --> 01:12:32,148 who was complaining that the reason millennials don't 1622 01:12:32,148 --> 01:12:33,940 have houses is that they're spending all 1623 01:12:33,940 --> 01:12:35,250 their money on avocado toast. 1624 01:12:35,250 --> 01:12:37,610 [LAUGHTER] 1625 01:12:38,300 --> 01:12:39,550 WILLIAM BONVILLIAN: All right. 1626 01:12:39,550 --> 01:12:41,050 So I'm going to take the opportunity 1627 01:12:41,050 --> 01:12:43,930 to do a quick wrap-up of just this class, 1628 01:12:43,930 --> 01:12:46,030 because we did the wrap-up of the overall class 1629 01:12:46,030 --> 01:12:47,320 at the outset. 1630 01:12:47,320 --> 01:12:49,030 But just very briefly here-- 1631 01:12:53,330 --> 01:13:00,200 Brynjolfsson and McAfee tell us that the advent of IT 1632 01:13:00,200 --> 01:13:05,126 is going to be significantly disruptive 1633 01:13:05,126 --> 01:13:09,890 in the foreseeable future, in the short to mid-term future. 1634 01:13:09,890 --> 01:13:12,140 And it's going to create significant technological job 1635 01:13:12,140 --> 01:13:12,950 displacement. 1636 01:13:12,950 --> 01:13:16,610 And David Autor, in his piece "Why 1637 01:13:16,610 --> 01:13:18,980 Do We Still Have So Many Jobs?", 1638 01:13:18,980 --> 01:13:21,980 responds with a long-standing set 1639 01:13:21,980 --> 01:13:24,140 of thoughtful economic arguments, 1640 01:13:24,140 --> 01:13:27,830 arguing that we can't just look at the displaced jobs. 1641 01:13:27,830 --> 01:13:30,830 We also have to look at the new complementary jobs that 1642 01:13:30,830 --> 01:13:34,790 are coming about and the net increase in new technologies 1643 01:13:34,790 --> 01:13:36,950 and production that will accompany 1644 01:13:36,950 --> 01:13:39,290 these technological changes, too. 1645 01:13:39,290 --> 01:13:44,660 They create a ground condition that 1646 01:13:44,660 --> 01:13:48,680 may make this shift more manageable. 1647 01:13:48,680 --> 01:13:52,790 And Autor also talks about how our economy-- 1648 01:13:52,790 --> 01:13:55,640 he looks at inequality and ties it 1649 01:13:55,640 --> 01:14:00,170 to education and educational attainment. 1650 01:14:00,170 --> 01:14:05,900 And he argues overall that this inequality problem 1651 01:14:05,900 --> 01:14:09,567 is tied to our education levels, the educational 1652 01:14:09,567 --> 01:14:10,650 attainment in our society. 1653 01:14:10,650 --> 01:14:13,550 It's going to require significant upskilling 1654 01:14:13,550 --> 01:14:15,440 of our population. 1655 01:14:15,440 --> 01:14:17,630 And then David Mindell takes us back 1656 01:14:17,630 --> 01:14:20,350 to the question, what's the future of work going to look like?
1657 01:14:20,350 --> 01:14:23,570 He takes a deep dive into what most people view as the most 1658 01:14:23,570 --> 01:14:25,820 threatening territory-- robotics-- 1659 01:14:25,820 --> 01:14:29,720 and argues that, yes, there will be technological job 1660 01:14:29,720 --> 01:14:32,390 displacement, but the relationship with robotics 1661 01:14:32,390 --> 01:14:34,610 is a complementary one, that there's 1662 01:14:34,610 --> 01:14:36,590 going to be a human-machine symbiosis. 1663 01:14:36,590 --> 01:14:39,380 It's going to be a more complicated picture than just 1664 01:14:39,380 --> 01:14:41,450 straight technological job displacement. 1665 01:14:41,450 --> 01:14:45,410 There will be new opportunities for people in this mix. 1666 01:14:45,410 --> 01:14:49,280 And then, reading from that chapter of the upcoming book 1667 01:14:49,280 --> 01:14:55,063 that I'm doing with Peter Singer, 1668 01:14:55,063 --> 01:14:56,480 looking through the literature, we 1669 01:14:56,480 --> 01:15:00,830 conclude that the jobless future is not upon us today, 1670 01:15:00,830 --> 01:15:04,040 and the low productivity rates and low investment rates 1671 01:15:04,040 --> 01:15:08,390 suggest that it's not evolving rapidly. 1672 01:15:08,390 --> 01:15:10,700 So we may have a little time here. 1673 01:15:10,700 --> 01:15:13,400 And then, meanwhile, we do have this deep problem 1674 01:15:13,400 --> 01:15:15,350 with a low growth rate, a low productivity 1675 01:15:15,350 --> 01:15:18,950 rate, and a low capital investment rate, which is creating secular 1676 01:15:18,950 --> 01:15:20,695 stagnation, in Summers' terms. 1677 01:15:20,695 --> 01:15:22,070 And that's something that we need 1678 01:15:22,070 --> 01:15:26,090 to get on, because that's the immediate future. 1679 01:15:26,090 --> 01:15:29,930 And then, overall, this general upskilling of the workforce 1680 01:15:29,930 --> 01:15:31,520 serves both purposes. 1681 01:15:31,520 --> 01:15:33,910 So let's get on with that.