The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high-quality educational resources for free. To make a donation or to view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

WILLIAM BONVILLIAN: So class 1, as you all will recall, was about direct innovation factors. And we looked at the growth economists, Solow and Romer in particular. They essentially gave us two direct innovation factors that you really can't account for in an innovation system unless they are inputs. So we can translate Solow as saying, you need to do R&D. And we can translate Romer as saying that it's the talent working in the system that's the critical input. So that gives us our first two innovation factors.

Class 2 brought us our third. And we began to consider innovation not just as factors, but as a system. And Liz Reynolds' discussion earlier this afternoon was a good reminder of a systems approach to innovation. So we read Richard Nelson, and he introduced the idea of looking at national innovation systems. He wasn't the first to use that term-- Chris Freeman at the University of Sussex was-- but he really explored the dimensions and taught us to look at the innovation institutions and the strengths and weaknesses of those institutions. And with that concept, you can begin to explore any innovation system.

Our third direct innovation factor, then, is really all about innovation organization: the strength of the actors, the connections between the actors, and then the gaps in that innovation system. You can analyze all of those and understand a lot about the strength of a particular innovation system, whether it's national, regional, or local. And Liz gave you a reading of her study of the state of Massachusetts, which is a good glimpse into those issues.
So classes 3 and 4 I won't recap, because Liz gave us a good reminder of those issues. But we studied manufacturing and how it evolved in two different periods-- both the 1980s competition period and the decline of US manufacturing in the first decade of this century, the early 2000s. And that was a good case study for us on how to take these ideas of an innovation system and look at a system, the manufacturing system.

Class 5 pulled back and got us into the framework again, looking at innovation at the institutional level. How do the R&D and prototype stages operate in innovation organizations, in R&D organizations? And then, how does the handoff occur in an innovation system, and what's the strength of those actors as well? So that got us into thinking about looking at innovation at the institutional level.

Class 6 was about the Valley of Death problem-- the gap that was in effect built into the US innovation system. Because we focused the federal role on research, we didn't really focus on the follow-on, later-stage development in subsequent stages, right? So it guaranteed a gap between our innovation actors, between the research world and the implementation world.

And we talked about innovation at the face-to-face level. So innovation doesn't just occur at the institutional level, it occurs amongst people. People innovate, not institutions. Institutions help organize an innovation system, but in the end it's people. And what are the rule sets that govern what innovation looks like at the face-to-face level? And we got deeply into great group theory.

Class 8 was about DARPA, which is a very different innovation model than the bulk of the US system organized predominantly around the end of the World War II time period. It's an organization that attempts to combine capabilities at the institutional level with the great-group-forming innovation at the face-to-face level, working on breakthrough technology through what DARPA calls a right/left model.
It wants to think about what it gets out of the end of the innovation system. It's not just curiosity-driven. So what do they want, and then how do they go back to the early stages and develop the breakthroughs to get there? So a very different organizational model.

Class 9 was a case study on NIH. And again it brought home to us how to look at an innovation system, in this case in the life science research world. And what are the institutional barriers and problems in that system?

Class 10 was around energy technology. We introduced the idea of the difficulty of innovating in a legacy economic sector-- an established, complex, existing legacy sector that builds up its own paradigm to protect its models, which are technological, economic, political, and social. And these sectors build barriers to disruptive threats from the outside, from new entrants that would attempt to disrupt that model. How do you begin to think about bringing innovation into these legacy sectors? That's a really important question, because most of the economy is legacy sectors. And if you're walling off 80% or more of your economy from innovation, then you're going to dramatically affect your ability to grow, your growth rate, and your societal well-being. So in effect that's the problem with legacy sectors, and that's the reason why we need to figure out how to innovate in them.

Class 11 was all about education. Again, another legacy sector. And we talked about a whole set of ways to think about science education, technology education, and then education for innovation itself. Could we ever attempt to think about that as a teaching approach?

And today will be all about the future of work, which I think is an issue that is going to affect all of you, and will kind of dominate a lot of your time over ensuing decades.
So we're going to try and sort this one out, and try and figure out what it means, by reading some of the leading thinkers in this area.

And then Liz was great by doing the whole-- innovation is like real estate. It's location, location, location. It really forms in these geographic regions, and you really need to understand it in its geographic region. We didn't have time for everything in this class, but that's certainly one of the classes I would have added if I had the time. So Liz walked us through some of those initial concepts in that field, which she does really important work in, by the way.

All right, you've got the Brynjolfsson and McAfee piece and you've got the first David Autor piece. So why don't I do those together, because they critique each other, right? And this is a big debate that's going on on the MIT campus, with different perspectives here, as we're going to see, about what's going to happen to work.

And so after the Great Recession, the economy went through a very slow period of putting people back to work. Now, we're back to 5% or lower unemployment, but we still have a large number of people that never went back into the workforce. A shockingly high number. So we're not past this problem.

So this is an early piece by Erik Brynjolfsson and Andrew McAfee.

AUDIENCE: Sorry. Brynjolfsson.

WILLIAM BONVILLIAN: Brynjolfsson. Forgive me, Tom.

AUDIENCE: Just in case--

WILLIAM BONVILLIAN: No, I appreciate it. I'm glad we're going to get that right.

AUDIENCE: Brynjolfsson.

WILLIAM BONVILLIAN: Brynjolfsson. All right. Good.

So Erik and Andrew-- how's that-- brought us a whole set of new thinking about this territory. And there's a debate about what their findings are. This is an early piece they produced. Their book on this topic is called The Second Machine Age.
So they looked at this problem of why the economy wasn't putting people back to work at a rate that was more acceptable. And they argued that the 2007-2008 recession wasn't a typical business cycle recession. In other words, you didn't lose your job and then get it back. You lost your job, period. Right? Your job did not come back after this recession in many sectors, particularly manufacturing, as we talked about.

So one explanation was, gee, it's just a normal business cycle. A second was that it's stagnation-- there's a long-term decline in certain sectors that you have to understand and look at. A third explanation, which they venture to set out, was that it wasn't really stagnation, and it wasn't too little innovation-- a signal of stagnation-- it was too much innovation for the economy to manage and handle.

And so they embarked, then, on kind of a history of this concern about the end of work, which they argue started in 1995 with a growing concern at that time about technological job displacement. And they look at the surrounding technologies that are emerging-- Google's driverless cars, Watson's Jeopardy win. Ray Kurzweil, in his book The Age of Spiritual Machines: When Computers Exceed Human Intelligence, began to portray some of these concerns. In other words, people were going to get displaced. Their work was going to be displaced by the advent of the IT technology revolution as it moves into artificial intelligence areas.

What is clear is that there has been exponential growth in the computing sector. And computers are thousands of times better than they were 30 years ago. But there are areas, Erik and Andrew point out, where computers are not good-- particularly the sort of combination of physical and knowledge tasks-- and they are not yet good at general problem solving, and they're not good at creative abilities. But there has been exponential progress. So there's still a significant distance here, they note.
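[As a rough aside on what "thousands of times better" implies, here is a minimal arithmetic sketch, not from the lecture, assuming performance doubles roughly every two years in the spirit of Moore's law:

    # Assumption (not from the lecture): performance doubles about every two years.
    doublings = 30 / 2            # roughly 15 doublings over 30 years
    improvement = 2 ** doublings  # 2^15 = 32,768
    print(f"~{improvement:,.0f}x improvement over 30 years")

That compounding is what turns steady incremental progress into an improvement factor in the tens of thousands.]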
But their concern is that it's on the way, and it's going to disrupt a lot of employment in the economy over time.

And then they argue that, given that exponential advance, there are two fundamental strategies in response. One is an organizational innovation strategy. The core idea here is that there's an opportunity for creative entrepreneurs to create new business models that combine the growing numbers of mid-skilled workers with ever-cheaper technologies to create new kinds of value. All right? That could be one way out of this unemployment and social disruption box. And they argue that digital technologies do create great opportunities for individuals to use their own kind of unique capabilities.

They ask, are we running out of innovation? And when businesses are based on bits instead of atoms, then each new product adds to the set of building blocks that are available to the next entrepreneur, instead of depleting the stock of ideas. So this truly is an exponential growth kind of territory-- there's no denying it, they argue. But it does create a new set of building blocks that are potentially available, which you can use to think about organizational innovation as a way around the problem.

The second argument they make is that we need to make significant investments in human capital as a way of dealing with this oncoming set of issues. So improving education and skill sets to get the most out of our kind of racing technology, they argue, is important. And they posit that there's a third Industrial Revolution, really around computing and information, which is based on information that, in effect, doesn't get used up.

So that's Erik and Andrew's picture. And it's a dark picture, right? Their concern is that a lot of people are going to get left behind. Let's frantically do organizational reforms, let's do other kinds of changes in the educational system. They argue for a set of tax changes.
They really propose in the end-- not in this piece, but in later pieces-- a negative income tax to kind of redress the inequality and wealth balance in the society. They're concerned that this is upon us, right?

And then David Autor, the MIT economist who's been doing remarkable work in a series of areas-- we read his work on manufacturing-- writes a responsive piece. Why Are There Still So Many Jobs? is his response. And he argues that Erik and Andrew are only taking a partial look at this phenomenon.

So let's do a little bit of historical material. Historically, economics has always found that technological advance creates net new jobs. And that's been going on since the Industrial Revolution. Every time you study one of these great waves of innovation, there are people affected, disrupted by the new innovation waves, right? Buggy whip manufacturers had a real problem with the advent of the automobile and the internal combustion engine, right? So you can think about a lot of displacement that has occurred historically, but net, the amount of employment overall was a net gain.

And David asks, is that era over? His basic answer is no, that's not over. And what he would argue, in a way, is that this is an assessment and counting problem. It's not that there is not going to be disruption. It's just that automation can also complement labor and can raise output in ways that lead to higher demand for labor. Right? So automation does indeed substitute for labor, but in this complementary way, automation can also lead to new kinds of output that lead to new kinds of employment. And you've got to understand and assess that-- the combination of these factors, the substitution as well as the complementarity-- in understanding what's going to happen ahead.
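[To make the substitution-versus-complementarity point concrete, here is a minimal numerical sketch, not from the lecture or from Autor's paper, assuming a simple Cobb-Douglas production function in which automation capital and labor on non-routine tasks are complements:

    # Assumed toy model: Y = A * K^a * L^(1-a), where K is automation capital
    # and L is labor supplying non-routine, complementary tasks.
    def marginal_product_of_labor(K, L, A=1.0, a=0.4):
        # dY/dL = (1 - a) * A * K^a * L^(-a); it rises as K grows,
        # because K and L are complements in this functional form.
        return (1 - a) * A * (K ** a) * (L ** -a)

    L = 100.0
    for K in (10.0, 100.0, 1000.0):   # cheaper automation -> more K deployed
        print(f"K = {K:6.0f}   MPL = {marginal_product_of_labor(K, L):.2f}")
    # MPL roughly 0.24 -> 0.60 -> 1.51: the value of the tasks workers uniquely
    # supply rises as automation expands, which is the complementarity effect.

The substitution side of the argument would correspond to K entering as a direct substitute for L, in which case cheaper machines push down the demand for that labor instead; the open question is which effect dominates in a given sector.]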
And he argues that commentators tend to overstate the extent of machine substitution for human labor and ignore the strong complementarities between automation and labor that, in the end, increase productivity, raise earnings, and augment demand for labor.

So there's one important thing that has not changed in the economy here, right? If we look at agriculture-- over 40% of our population in 1900 was employed in agriculture. Now, we employ less than 1%. Much greater output. So that's a story of intense productivity gains.

But agriculture is different than the production of goods-- non-agricultural goods. If you look at food consumption per human being in the United States, it's not very different now than it was in 1900. Even in America there's only so many calories we can consume. I mean, we're working on it. But even here there are limits.

That's not true in the production of goods. We seem to have an insatiable appetite for more goods. And occasionally a movement will hit society like, let's get rid of our stuff. But it never lasts very long.

So it's a different world. We can't make the agricultural analogy to non-agricultural goods production, right? They're different worlds. And part of the story that David Autor is attempting to illustrate here is that there is no limit on goods, all right? So that complementarity, that combination of people and machines, can lead to new kinds of technologies, new kinds of advances, new kinds of goods that fit evolving sets of needs.

And he provides an interesting example of complementarity. He notes that ATMs-- the cash machines in every bank that you know of, and all over the place-- were introduced in 1970. The assumption at the time was that, well, there's not going to be any more bank tellers. That job is over, right? And there were a lot of those.
But what ended up happening was that by 2010, there were actually somewhat more bank tellers, and their jobs had been repurposed-- not just counting up cash and handing it to customers. Their jobs got repurposed into a frankly considerably more sophisticated customer service kind of job. So an upgrading of the skills required for that job, but no loss in employment. In fact, a very modest gain-- despite the fact that everybody thought that sector was going to go. So he argues that that's not unique. This is going to occur in other sectors as well.

So the gains in productivity have not led to a shortfall of demand for goods and services. These keep growing, which is a big offsetting factor to the substitution of automation for work.

And he provides an interesting description of how you could look at the different economies in recent decades-- the 1940-to-1970 economy versus the current economy. His characterization of the economy that has been taking hold since 1980 through the current time is that it's increasingly made up of professional, technical, and managerial occupations, and those are growing pretty rapidly. Whereas skilled blue collar occupations are shrinking, and clerical and sales occupations-- which are the kind of vulnerable equivalent of production jobs in the information age-- are in decline.

There's a powerful piece in The Economist this week about the decline in retail sales operations overall in the United States, in light of the explosion of Amazon and online ordering and delivery systems. How are those changes going to take place? What does that look like? And which location-based stores are thriving and which ones aren't, right? It's a very interesting assessment. Steph?

AUDIENCE: I'm trying to figure out if this is a good time to ask it.
I'm curious about-- like, drone technology is still not at the point at which it can take over the entire delivery system for a company like Amazon, but it seems like the federal government is defunding the Postal Service. So what then happens to the increase in labor, or I guess, the increase in demand for these products, if there isn't an equal rise in delivery infrastructure? I guess that would be a consideration that I would hope The Economist article would address.

WILLIAM BONVILLIAN: Yeah, it doesn't deal with the Postal Service question, but we can kind of think it through, right? So what happened? I don't know about the number of letters you're writing longhand per day these days, but mine is not high. We've shifted over to online systems. So the bulk of what used to be mail is gone.

The Postal Service, however, missed the revolution, right? They missed the fact that online delivery of goods was going to explode. FedEx had a better model, and they captured that marketplace. So FedEx is thriving hand-in-glove with Amazon and other comparable delivery systems. The Postal Service missed that revolution by and large-- although they've been trying to catch up-- to their own detriment over time, right? That's kind of the story. And when you look in terms of net employment, we'll see how this sorts out. But again, this is an example of complementarity. The whole set of express delivery services-- Federal Express, DHL, and others-- is a complementary service that happened to complement very nicely the evolution of the distribution of goods originating through online services. So that helps us think about some of this. Martin?

AUDIENCE: I mean, [INAUDIBLE] is like, maybe that's an industry that is OK to be disrupted, because probably when it began it wasn't profitable. And so you needed the government to have it as infrastructure-- it was a tool for communication.
WILLIAM BONVILLIAN: Right.

AUDIENCE: And there's a huge societal gain from this, even though it wasn't profitable. And then with fewer letters, and because you have the internet-- and probably what happened is all these other companies optimized their logistics really well, because they're getting better and better and they have more money-- those companies became profitable, and it was probably way easier and way cheaper to send things their way. And now you have all this kind of stagnation in innovation inside the Postal Service, that sector. So it just wasn't viable for them to keep going, because it's just a huge waste.

WILLIAM BONVILLIAN: Right. They have a whole set of disadvantages. The government handed them the task of having to deliver mail and parcels to every single neighborhood in the United States, no matter how empty your neighborhood was, right? If you were out there in rural South Dakota, you can imagine that the economic base for a strong delivery system is probably not there. So there were a lot of disadvantages they faced going into this, but in the end, they missed that package delivery mechanism. But you make a good point, Martin.

Let me wrap up with David Autor. There is, in the end, an interplay between people and machines. Comparative advantage will allow computers to substitute for workers in performing routine, codifiable tasks. But it also amplifies the comparative advantage of workers in supplying problem solving skills, adaptability, and creativity-- the people-like capabilities. And maybe computers will get to these some decade, but that's not happening anytime real soon.

So readdressing the balance between those two is the key to managing the entry of these new IT-based technologies into the economy, David Autor would argue. The frontier of automation, he recognizes, is rapidly advancing. So the challenge of substituting machines remains intense.
But if you focus only on what's lost, you miss a really central economic mechanism: automation affects the demand for labor by raising the value of the tasks that workers uniquely supply, right?

So Erik and Andrew tell us about the challenge ahead from the IT revolution in substituting IT for work. David Autor begins to put us on a pathway to make an assessment of just what that's going to mean, what the balance between people and machines is going to be, and how we need to think about that balance, right? So that's the core issue in his article: why are there still so many jobs? His argument, in the end, is that in this territory there are going to be substantial numbers of jobs.

Beth, it's all yours.

AUDIENCE: OK. So there are a lot of really good questions this week, so I want to spend most of the time discussing those. Just a few points that I noted. So maybe a little differently from others, I found Erik and Andrew's piece more optimistic than I expected-- especially how it ends, kind of saying, we've been through these revolutions before, we'll get through this one. Which I found kind of weird, because they spent most of the beginning of the article talking about how this was fundamentally different given its exponential growth. So that was a little hard for me to reconcile. I think they're trying to end it a little more upbeat, but maybe if I read their whole book, I'll see that it's a little bit more nuanced than that.

And then, yeah, as Bill said, Professor Autor kind of portrays a different story, in that, yes, we need to be aware of these changes that are coming, but they aren't as bleak as we might think. And then he also does talk-- I think it's in this article-- about how additional technological advances, such as machine learning, could cut in a little bit on his conclusions about the skills that humans have that are unique from machines.
But starting first with the Erik and Andrew piece, Matt brought up a really interesting question that I think is worth discussing. So one of the things they talk about is that this new ecosystem will have a lot of opportunities for entrepreneurs. But Matt noted that, especially in Silicon Valley, we see that a lot of IT entrepreneurs are creating apps or other new advances that don't have a lot of value for economic growth-- that might just be solving first world problems, as they like to say. So what do you think the reality is with these entrepreneurs actually creating new companies that are going to be able to foster job growth?

AUDIENCE: Oh, yeah, I'll put a quick point on that. I would say that right now what's happened is that, because of the IT revolution, there were these sectors where it's pretty much about scaling and distribution. So things that don't matter, you can scale, but you make a lot of money off them. But we're entering a period where people are going to do things that matter. And now you have the capital markets, you have the infrastructure of how you grow companies quickly, how you onboard. So this whole infrastructure has already been built out, right? The way I would compare it is, when Amazon started, they built this infrastructure, and it was just to distribute books, right? And then later, they evolved to all these other platforms. So the benefit now is that you have these capital markets. So that, say, somebody does do a very important innovation-- like CRISPR-- you can get it passed on faster.

So right now, I would say you should think of Snapchat and Facebook as kind of the little to middle leagues. But it creates the environment such that if someone does do something significant with a real innovation and creates an industry, you will get maybe a trillion dollar company. And you'll have somebody who becomes very, very wealthy.

AUDIENCE: Would you-- I'm interested in your opinion.
Would you argue that those first world problem solvers are necessary steps to allow the big players to exist? Like, if the market was dominated first by the CRISPR--

AUDIENCE: I wouldn't say it's necessary, but I'd say that's how it is right now, and it's just been built on. There's also a common strategy where it seems really stupid until it's not. So it can also go back and forth.

If you look at Silicon Valley-- Silicon Valley took 57 years to become what it is today. It really started off with Stanford using their funding from World War II, and making a decision-- which was different than MIT's-- that, I don't want to keep the money inside the lab. You build something, you research it, and then you go out and make a company off of it. But it takes a long period of time. And that was hard tech. But it was just hard to scale.

Because what used to happen is-- like in the '90s-- if you wanted to make a computer science company, something like Facebook-- and there were actually 15 Facebooks before Facebook-- you'd have to buy servers, you'd have to configure them. And you'd probably spend about $100,000 to $750,000 to buy all your servers, right? And then you've got to get started on the idea. So if somebody wanted to do a Snapchat, they'd be like, OK, I paid a million dollars, now I have to build it. It's such a huge problem.

And so the issue, the reason that hard tech didn't scale before, is because you would ask for that money and you would give up about 30%. The founder would start the company, and then, because you don't have any management experience, they'd kick you out. So the founder didn't have much control. So now what's happened is, because of the funding environment, everybody has capital, but there are very few good ideas. And in fact, the mantra of VCs is that there are only really, like, five really great companies that we should invest in this year.
So there's a lot of money. And I would say that it's more like E! News, right? All these companies you hear of-- it's just scandal, just like gossip tabloids. There are a couple of good companies that will come out every year and create really substantial technologies, and those are the people you won't hear about today, and you won't hear about a year from now. But they'll be around 10 years, 20 years from now.

AUDIENCE: Yeah.

AUDIENCE: Do you think that capital system-- I mean, I imagine it would need to change a bit once you start having IT companies that are matching a lot of mid-skilled workers with the technology. So say you have a VC that's interested in startups right now in the software industry. They have, like, almost 100% profit margins; now you start hiring people and your profit margins are lower. Do you think the system that's being put in place by what we have now will be able to adapt for that?

AUDIENCE: Can you specify that question a little bit more and give an example?

WILLIAM BONVILLIAN: So I think you make an interesting point, Matt, and this takes us back to the reading that we did about The Engine and Cyclotron Road, and the fact that the venture capital system in the US is really focused on software companies at this point. And what Martin is telling us is that, early stage, the foundations are built, and they'll be able to move to hard technologies. We're not seeing that yet, frankly. Again, biotech is a separate world, and they've built a system there by which they can manage the scale-up with their current financial model. But I think there's-- you've given us a hopeful perspective of what could evolve.

AUDIENCE: I would say that an example of now going to the physical world from the IT world would be Uber and Airbnb, where it's starting to filter out, and then hard tech will be a whole different field.
WILLIAM BONVILLIAN: But the model is still job displacement, right? And that's how you get your return. So it's a really difficult model. And we don't necessarily have to go that way, as we talked about in the energy class. A country like Germany is organized around manufacturing-led innovation. There's no shortage of demand for goods worldwide-- if anything, it's growing as a lot of the developing world becomes emerging economies. And Germany is well positioned to take advantage of that. They've just organized their innovation system very differently.

But Martin, you make a good point. This may be staging that we could move hard technologies into. Biotech is a separate world, but CRISPR, in a way, will result in what we could call hard-- or maybe soft-- technologies that could in effect be a platform off an IT base.

How about another question, Beth?

AUDIENCE: Sure. So I think just yesterday there was an article in The Washington Post or The New York Times-- I forget-- talking about how one of the upsides to a lot of the lower paying jobs that college graduates are getting, such as counseling or teaching, is that they are harder to displace by automation. So they're talking about how so many investment banking jobs have already been replaced by automation, so you shouldn't feel so bad that you're taking a lower paying job, because it's a safer job. And so this reminded me-- this question from Chris, I think, gets to that. So how do we accelerate growth in these job areas when we're admitting that they're seemingly less desirable right now for a lot of people because they pay less? And if these are the jobs that maybe more people should be going into, how do we convince people to do that?

AUDIENCE: Well, [INAUDIBLE] there was an article [INAUDIBLE] Washington Post about how San Francisco is creating teacher housing.
Because they felt like they were getting a lot of stories about how teachers had to commute in for several hours a day into the greater Bay Area, and how some people were living in people's kitchens because they couldn't afford the rent. And so what San Francisco, the municipal government-- you know, again, the role of municipalities-- decided was that they needed to build housing specifically for teachers. This wasn't going to be affordable housing, this wasn't going to be low-income housing. This was going to be housing for teachers to live in so they can teach at the schools.

And I feel like perhaps there are going to have to be additional incentives in terms of infrastructure and housing to support this-- I guess to support that population of workers-- because it doesn't seem like it's politically viable to increase salaries, but it does seem like a reasonable proposition to say, so long as you're working for our schools, and so long as you are providing a public service that is going to provide a net benefit to society, then you should have access to housing and food, at a minimum.

AUDIENCE: Seems like-- remember that one article you showed me about unicorns, and how they're very masculine and immature?

AUDIENCE: That is not what it said.

[LAUGHTER]

AUDIENCE: Well, if you're going to talk about--

WILLIAM BONVILLIAN: That's an unrelated topic.

AUDIENCE: No, no. Because it has exit rounds and--

AUDIENCE: Yeah, OK.

AUDIENCE: --seed rounds.

AUDIENCE: Yeah, so I guess I can also talk about that. I would be happy to send it to all of you as well. It was an article written by two women on Medium who had done research on the unicorn economy-- that is, the creation of billion dollar companies-- and were thinking about how that's not sustainable in the long run.
773 00:35:28,073 --> 00:35:30,240 What we actually need to be focusing venture capital 774 00:35:30,240 --> 00:35:33,390 funds on are zebra companies, which 775 00:35:33,390 --> 00:35:35,378 are companies that care not only about-- 776 00:35:35,378 --> 00:35:37,920 they're sort of like the triple bottom line companies, right? 777 00:35:37,920 --> 00:35:42,000 There is going to be a benefit to the company economically, 778 00:35:42,000 --> 00:35:44,285 there's going to be a benefit to society economically, 779 00:35:44,285 --> 00:35:45,660 and there's going to be a benefit 780 00:35:45,660 --> 00:35:48,720 to society as a result of the product that they're creating. 781 00:35:48,720 --> 00:35:51,960 And so they introduced the zebra model, 782 00:35:51,960 --> 00:35:54,650 because zebras are sort of mutualistic animals 783 00:35:54,650 --> 00:35:56,580 that travel in a pack, as opposed 784 00:35:56,580 --> 00:35:58,510 to these unicorns that don't exist. 785 00:35:58,510 --> 00:35:59,010 Right? 786 00:35:59,010 --> 00:36:03,000 They're not real animals that sort of promote 787 00:36:03,000 --> 00:36:04,790 any sort of benefit to the public. 788 00:36:04,790 --> 00:36:06,720 And so I'd be happy to pass it along. 789 00:36:06,720 --> 00:36:08,220 AUDIENCE: I mean the article makes a really good point 790 00:36:08,220 --> 00:36:10,470 about how like these other companies that are unicorns 791 00:36:10,470 --> 00:36:12,930 are kind of like a bull in a china shop. 792 00:36:12,930 --> 00:36:15,592 They go and they ruin like a lot of societal structures, 793 00:36:15,592 --> 00:36:17,550 and they displace people, and they don't really 794 00:36:17,550 --> 00:36:20,215 consider it, or take any consideration. 795 00:36:20,215 --> 00:36:22,590 And how it kind of-- yeah, it was an interesting article. 796 00:36:22,590 --> 00:36:23,940 AUDIENCE: And the one point that it doesn't hit, 797 00:36:23,940 --> 00:36:25,620 I will underscore for all of you, 798 00:36:25,620 --> 00:36:30,000 is the way in which companies like Uber, 799 00:36:30,000 --> 00:36:32,250 and I think some of us know this, are essentially 800 00:36:32,250 --> 00:36:34,360 informalizing our economy, right? 801 00:36:34,360 --> 00:36:36,720 Like, one of the things that we treasure about the economy 802 00:36:36,720 --> 00:36:38,095 in the developed world is that it 803 00:36:38,095 --> 00:36:41,227 is a formal economy in which people 804 00:36:41,227 --> 00:36:42,810 have labor protections, in which there 805 00:36:42,810 --> 00:36:44,227 are trade unions, in which workers 806 00:36:44,227 --> 00:36:46,680 have rights and protections against their employers, 807 00:36:46,680 --> 00:36:47,550 and have that-- 808 00:36:47,550 --> 00:36:49,440 they can use their citizenship as a means 809 00:36:49,440 --> 00:36:51,570 of leveraging against discrimination 810 00:36:51,570 --> 00:36:54,600 that might occur to them in the labor economy, right? 811 00:36:54,600 --> 00:36:57,450 We, as wage laborers, should get protections. 812 00:36:57,450 --> 00:36:59,850 But the way that these companies are introducing models 813 00:36:59,850 --> 00:37:02,790 and perhaps, in Alibaba's case, the kind that they're 814 00:37:02,790 --> 00:37:05,190 bringing to the US, is that they're 815 00:37:05,190 --> 00:37:06,810 removing the labor protections. 
816 00:37:06,810 --> 00:37:10,110 So what happens if you expose citizens to-- 817 00:37:10,110 --> 00:37:13,020 or if you revert citizens from wage laborers 818 00:37:13,020 --> 00:37:17,220 with protections to informal laborers who sort of are 819 00:37:17,220 --> 00:37:18,770 itinerant in the economy. 820 00:37:18,770 --> 00:37:21,210 I mean, that's an incredible consideration. 821 00:37:21,210 --> 00:37:21,980 WILLIAM BONVILLIAN: Look, we went through this 822 00:37:21,980 --> 00:37:23,070 in the 19th century. 823 00:37:23,070 --> 00:37:25,260 I mean, with the advent of industrialization there 824 00:37:25,260 --> 00:37:27,210 were no worker protections. 825 00:37:27,210 --> 00:37:31,050 And massive problems with the workforce. 826 00:37:31,050 --> 00:37:32,280 We've all read Dickens. 827 00:37:32,280 --> 00:37:36,450 So we went through that phase, and will 828 00:37:36,450 --> 00:37:38,520 more formalized protection systems 829 00:37:38,520 --> 00:37:42,910 evolve over time, even for the unicorn world? 830 00:37:42,910 --> 00:37:44,150 Let's move to David Autor. 831 00:37:44,150 --> 00:37:45,270 AUDIENCE: Yeah. 832 00:37:45,270 --> 00:37:47,800 All right, so as we kind of mentioned, 833 00:37:47,800 --> 00:37:50,820 Autor seems to be slightly less concerned 834 00:37:50,820 --> 00:37:55,290 with the immediate impacts of automation. 835 00:37:55,290 --> 00:38:00,270 But Lily pointed out that there is some really startling job 836 00:38:00,270 --> 00:38:01,980 loss in some sectors. 837 00:38:01,980 --> 00:38:04,320 So even if some of these jobs are replaced, 838 00:38:04,320 --> 00:38:09,120 if we see an entire sector, such as the retail sector, collapse, 839 00:38:09,120 --> 00:38:11,490 is that something that the economy can bounce back from? 840 00:38:11,490 --> 00:38:13,728 What does that mean for social order? 841 00:38:13,728 --> 00:38:16,020 I mean, we've already seen that manufacturing 842 00:38:16,020 --> 00:38:20,430 downturns have had pretty dramatic effects on the social ways 843 00:38:20,430 --> 00:38:21,930 that we are constructing ourselves. 844 00:38:21,930 --> 00:38:24,840 So even if we can replace some of these jobs, 845 00:38:24,840 --> 00:38:28,530 what do you think that means for the rest of society? 846 00:38:30,792 --> 00:38:32,250 WILLIAM BONVILLIAN: Here's a trick. 847 00:38:32,250 --> 00:38:34,180 Ask Lily since she wrote the question. 848 00:38:34,180 --> 00:38:36,430 AUDIENCE: Well, I was just thinking about it because-- 849 00:38:36,430 --> 00:38:39,030 I don't know-- my husband was listing to me 850 00:38:39,030 --> 00:38:41,520 more than a dozen retail companies 851 00:38:41,520 --> 00:38:44,690 that are entering bankruptcy. 852 00:38:44,690 --> 00:38:48,990 I think JCPenney, Sears, et cetera. 853 00:38:48,990 --> 00:38:53,910 Retail stores that are huge and employ a lot of people. 854 00:38:53,910 --> 00:38:55,770 And we thought of them as, like, staples 855 00:38:55,770 --> 00:39:01,410 in our economy and our consumer needs. 856 00:39:01,410 --> 00:39:06,030 Also, there is talk of the mall industry going under. 857 00:39:06,030 --> 00:39:10,230 Just this like-- because people don't go to the mall 858 00:39:10,230 --> 00:39:12,293 like they used to all the time. 859 00:39:12,293 --> 00:39:13,710 So I'm just wondering if those are 860 00:39:13,710 --> 00:39:17,580 going to have almost a domino effect of well, 861 00:39:17,580 --> 00:39:20,972 a couple of retailers not that impactful, 862 00:39:20,972 --> 00:39:21,930 not that big of a deal. 
863 00:39:21,930 --> 00:39:24,513 But then if you start to think about the infrastructure that's 864 00:39:24,513 --> 00:39:27,990 built up around malls, even the parking lots for malls, 865 00:39:27,990 --> 00:39:30,655 like what's going to happen to those? 866 00:39:30,655 --> 00:39:32,280 What are we going to replace them with? 867 00:39:32,280 --> 00:39:35,056 I just think it might get a little desolate? 868 00:39:35,056 --> 00:39:40,455 AUDIENCE: [INAUDIBLE] I would look into vacancies in malls. 869 00:39:40,455 --> 00:39:42,330 The way I would do it was like, if everything 870 00:39:42,330 --> 00:39:44,330 was statistical analysis of the malls in the US, 871 00:39:44,330 --> 00:39:46,170 and if they have vacancies. 872 00:39:46,170 --> 00:39:48,705 Because what I'm pretty much thinking is probably 873 00:39:48,705 --> 00:39:52,103 would happen is like the JCPenney model was optimized 874 00:39:52,103 --> 00:39:53,520 for the '90s, and it didn't evolve 875 00:39:53,520 --> 00:39:56,270 into what the new consumer needs would turn more niche. 876 00:39:56,270 --> 00:39:58,640 And so they just lost their edge and are losing money. 877 00:39:58,640 --> 00:40:00,930 But there's probably new stores like Lush 878 00:40:00,930 --> 00:40:05,913 or like Kroger coming in and taking over those. 879 00:40:05,913 --> 00:40:07,830 That's probably what I think that the business 880 00:40:07,830 --> 00:40:09,240 models of the past have been disrupted, 881 00:40:09,240 --> 00:40:10,320 and now there's new people coming in. 882 00:40:10,320 --> 00:40:11,737 Because people still go to malls-- 883 00:40:11,737 --> 00:40:14,740 AUDIENCE: But what if Lush, say like, you know how people-- 884 00:40:14,740 --> 00:40:16,380 I actually don't shop at Lush, but I 885 00:40:16,380 --> 00:40:18,420 know that they make the products in stores. 886 00:40:18,420 --> 00:40:20,700 I mean, what if machines come into Lush, and then 887 00:40:20,700 --> 00:40:23,510 the products, I mean the laborers become irrelevant? 888 00:40:23,510 --> 00:40:24,540 AUDIENCE: [INAUDIBLE]. 889 00:40:24,540 --> 00:40:25,707 My little sister loves Lush. 890 00:40:26,723 --> 00:40:28,140 WILLIAM BONVILLIAN: I think you're 891 00:40:28,140 --> 00:40:30,057 making an important point there, Martin, which 892 00:40:30,057 --> 00:40:36,120 is that there may well be ways to adapt a face-to-face store 893 00:40:36,120 --> 00:40:38,290 model to these new realities. 894 00:40:38,290 --> 00:40:41,413 And there's a slew of companies that are actually working hard 895 00:40:41,413 --> 00:40:42,330 on doing exactly this. 896 00:40:42,330 --> 00:40:43,050 AUDIENCE: I mean-- 897 00:40:43,050 --> 00:40:44,550 WILLIAM BONVILLIAN: And then there's 898 00:40:44,550 --> 00:40:47,490 a whole set of companies that specialize 899 00:40:47,490 --> 00:40:50,980 in kind of last year's goods. 900 00:40:50,980 --> 00:40:54,990 Think TJ Maxx, right, that are thriving, right? 901 00:40:54,990 --> 00:40:56,720 And suppose you built your mall around-- 902 00:40:56,720 --> 00:40:57,120 AUDIENCE: H & M. 903 00:40:57,120 --> 00:40:57,870 WILLIAM BONVILLIAN: --a combination 904 00:40:57,870 --> 00:40:58,710 of those two models. 905 00:40:58,710 --> 00:41:01,510 Maybe that's a more interesting mall model. 906 00:41:01,510 --> 00:41:03,420 So again, I'm not sure. 907 00:41:03,420 --> 00:41:05,670 I think we've got to keep in mind just how 908 00:41:05,670 --> 00:41:10,050 fast these changes are upon us, and what the strategies are. 
909 00:41:10,050 --> 00:41:13,950 I think David Autor would argue that there's a new delivery 910 00:41:13,950 --> 00:41:16,740 model emerging here, right? 911 00:41:16,740 --> 00:41:20,310 A new kind of role for people that's just like we 912 00:41:20,310 --> 00:41:23,730 saw with the ATM machines, that's actually a significantly 913 00:41:23,730 --> 00:41:25,320 upgraded kind of occupation. 914 00:41:25,320 --> 00:41:28,350 And can we move on those kinds of steps, 915 00:41:28,350 --> 00:41:32,670 right, and then readjust some of the existing models for that? 916 00:41:32,670 --> 00:41:34,010 I mean, it's an open question. 917 00:41:34,010 --> 00:41:35,910 This piece in The Economist this week, 918 00:41:35,910 --> 00:41:38,190 which I recommended earlier, reiterates 919 00:41:38,190 --> 00:41:39,690 some of the things you were raising, 920 00:41:39,690 --> 00:41:42,470 Lily, which is that there's going 921 00:41:42,470 --> 00:41:44,640 to be a lot of grief in malls unless they 922 00:41:44,640 --> 00:41:47,783 get their models sorted out. 923 00:41:47,783 --> 00:41:49,200 And they note a number of examples 924 00:41:49,200 --> 00:41:51,755 of models of malls that are actually working to do that. 925 00:41:51,755 --> 00:41:53,880 AUDIENCE: I think there's two main things happening 926 00:41:53,880 --> 00:41:57,135 that are probably going to be disruptive to retailers. 927 00:41:57,135 --> 00:41:59,760 Because I've studied retail for a bit looking at a whole market 928 00:41:59,760 --> 00:42:00,570 analysis. 929 00:42:00,570 --> 00:42:02,720 And there's a reason Amazon's going into retail-- 930 00:42:02,720 --> 00:42:05,580 it's because all of e-commerce is only 30% of what people buy. 931 00:42:05,580 --> 00:42:07,922 People still buy like, and go to places. 932 00:42:07,922 --> 00:42:08,880 It's just a lot better. 933 00:42:08,880 --> 00:42:11,360 They get to see it, they get to feel it. 934 00:42:11,360 --> 00:42:12,840 And that's why they go to malls. 935 00:42:12,840 --> 00:42:14,340 Their disruption though, is that they're not 936 00:42:14,340 --> 00:42:15,440 going to really have workers. 937 00:42:15,440 --> 00:42:16,110 I don't know if you've seen how they're 938 00:42:16,110 --> 00:42:17,152 going to do their stores. 939 00:42:17,152 --> 00:42:18,160 AUDIENCE: [INAUDIBLE] 940 00:42:18,160 --> 00:42:19,410 AUDIENCE: The other-- that's one disruption where 941 00:42:19,410 --> 00:42:21,180 if you don't have employees. 942 00:42:21,180 --> 00:42:22,710 But issues like teenagers, right? 943 00:42:22,710 --> 00:42:23,432 Like, it's not-- 944 00:42:23,432 --> 00:42:25,140 I don't know if somebody expects to raise 945 00:42:25,140 --> 00:42:28,440 a family on like an H & M salary, or like a Zara salary. 946 00:42:28,440 --> 00:42:32,078 The second disruption would be 3D printers, and what 947 00:42:32,078 --> 00:42:34,620 they do when you do materials, and when you do like, fashion, 948 00:42:34,620 --> 00:42:35,420 even. 949 00:42:35,420 --> 00:42:36,780 Because if you go into a store-- 950 00:42:36,780 --> 00:42:39,470 because most of their cost is all inventory. 951 00:42:39,470 --> 00:42:41,095 And it leads to a lot of logistics. 952 00:42:41,095 --> 00:42:43,470 If I can make my store in such a way that you can come in 953 00:42:43,470 --> 00:42:45,992 and you'll get a tailored dress, that's probably better 954 00:42:45,992 --> 00:42:48,450 because it's very unique, and if you can make like a 3D printer 955 00:42:48,450 --> 00:42:50,460 for a dress, that'd be really cool. 
956 00:42:50,460 --> 00:42:52,560 And somebody can go in and get a perfect dress 957 00:42:52,560 --> 00:42:54,227 based on their style, based on a picture 958 00:42:54,227 --> 00:42:56,007 that they have on their like, Instagram. 959 00:42:56,007 --> 00:42:58,090 Then you don't have inventory and they can sell it 960 00:42:58,090 --> 00:42:59,965 at like a decent price, and it would probably 961 00:42:59,965 --> 00:43:01,980 be a relatively cheap dress, right? 962 00:43:01,980 --> 00:43:03,780 Or if you can go and get a tailor-made suit 963 00:43:03,780 --> 00:43:05,900 if you're a guy, and it's like perfect for you, 964 00:43:05,900 --> 00:43:08,310 and it costs you like, $100, $200, and it's in the fabric 965 00:43:08,310 --> 00:43:08,700 that you want. 966 00:43:08,700 --> 00:43:10,270 Especially with synthetic fabrics. 967 00:43:10,270 --> 00:43:12,478 I don't know if we talked about it in this class-- well, 968 00:43:12,478 --> 00:43:14,970 we talked about it in the industrial center. 969 00:43:14,970 --> 00:43:16,440 But those are the big disruptions. 970 00:43:16,440 --> 00:43:18,050 Or if you're like, instead of having-- 971 00:43:18,050 --> 00:43:20,280 being an auto store, instead of having all the inventory 972 00:43:20,280 --> 00:43:22,620 that costs a lot of money, and it gets old, and it gets rusty, 973 00:43:22,620 --> 00:43:24,745 you can build a part when somebody comes in exactly 974 00:43:24,745 --> 00:43:27,270 for their car because you have the data for that piece. 975 00:43:27,270 --> 00:43:29,120 That's the big disruption. 976 00:43:29,120 --> 00:43:31,240 And it will-- and it's like-- it'll 977 00:43:31,240 --> 00:43:33,460 be a quick toppling of a lot of core businesses, 978 00:43:33,460 --> 00:43:34,390 because all you have to do 979 00:43:34,390 --> 00:43:36,723 is look at their balance sheet, see where they spent all 980 00:43:36,723 --> 00:43:39,070 their money, and then you make your business model 981 00:43:39,070 --> 00:43:40,760 as a startup completely the opposite. 982 00:43:40,760 --> 00:43:42,670 And then they can't compete against you. 983 00:43:42,670 --> 00:43:44,587 WILLIAM BONVILLIAN: And against this backdrop, 984 00:43:44,587 --> 00:43:50,890 remember, too, that the entire movement of computer-like goods 985 00:43:50,890 --> 00:43:56,560 was towards these big box stores in distant low-cost suburbs 986 00:43:56,560 --> 00:43:57,910 with gigantic parking lots. 987 00:43:57,910 --> 00:44:00,540 That was the entire movement of that sector. 988 00:44:00,540 --> 00:44:03,760 Until Apple completely reinvented the model, 989 00:44:03,760 --> 00:44:08,320 and decided that, let's have fabulous face-to-face, highly 990 00:44:08,320 --> 00:44:14,170 personal, employee-heavy, great service, 991 00:44:14,170 --> 00:44:17,250 beautifully designed stores. 992 00:44:17,250 --> 00:44:19,180 The most valuable retail property 993 00:44:19,180 --> 00:44:21,440 on the planet is an Apple Store. 994 00:44:21,440 --> 00:44:23,703 There's nothing like it, right? 995 00:44:23,703 --> 00:44:25,870 They've managed to create more value per square foot 996 00:44:25,870 --> 00:44:30,160 than any real estate magnate ever dreamed of, right? 997 00:44:30,160 --> 00:44:33,190 Again, through a very different kind of model, 998 00:44:33,190 --> 00:44:36,280 and again, taking advantage of a new kind of way 999 00:44:36,280 --> 00:44:38,590 of looking at employees, and in effect, 1000 00:44:38,590 --> 00:44:41,960 upskilling that whole employee base. 
1001 00:44:41,960 --> 00:44:44,560 So these, David Autor would 1002 00:44:44,560 --> 00:44:50,020 argue, are ways of creating complementarity with these new, 1003 00:44:50,020 --> 00:44:52,430 evolving technologies. 1004 00:44:52,430 --> 00:44:54,310 How about one more Autor question then? 1005 00:44:54,310 --> 00:44:55,060 AUDIENCE: Perfect. 1006 00:44:55,060 --> 00:44:58,330 So the other kind of controversial, or 1007 00:44:58,330 --> 00:45:01,320 thought-provoking topic that's brought up in these articles 1008 00:45:01,320 --> 00:45:06,250 was the idea that if we reach a state where so many jobs are 1009 00:45:06,250 --> 00:45:08,560 eliminated and automation takes over, 1010 00:45:08,560 --> 00:45:10,660 we'll really be in a state of abundance 1011 00:45:10,660 --> 00:45:14,980 rather than deficiency. 1012 00:45:14,980 --> 00:45:16,972 So they bring up the idea that we'll 1013 00:45:16,972 --> 00:45:19,180 still have to figure out how to allocate these goods, 1014 00:45:19,180 --> 00:45:22,150 and allocate how much money people have even if there are 1015 00:45:22,150 --> 00:45:23,830 no jobs for working. 1016 00:45:23,830 --> 00:45:27,290 So I think that's a pretty crazy thing to think about right now. 1017 00:45:27,290 --> 00:45:29,842 But I'd be interested to hear what you 1018 00:45:29,842 --> 00:45:31,050 think could happen with that. 1019 00:45:31,050 --> 00:45:31,300 WILLIAM BONVILLIAN: Right. 1020 00:45:31,300 --> 00:45:33,550 In other words, if we're aiming at ever more efficient 1021 00:45:33,550 --> 00:45:36,710 production, we're going to create productivity gains, 1022 00:45:36,710 --> 00:45:39,490 which in turn generate new resources in a society 1023 00:45:39,490 --> 00:45:43,090 that you could use for additional societal well-being. 1024 00:45:43,090 --> 00:45:46,210 So that's the society-wide picture. 1025 00:45:46,210 --> 00:45:47,860 But then if you look at it in terms 1026 00:45:47,860 --> 00:45:51,070 of a particular line of goods, or a particular region, 1027 00:45:51,070 --> 00:45:54,700 there is more wealth potentially being created, right, 1028 00:45:54,700 --> 00:45:56,440 for any given line of goods. 1029 00:45:56,440 --> 00:45:58,210 And that indeed can be distributed, 1030 00:45:58,210 --> 00:46:01,750 and that can be a part of a new kind of employee, 1031 00:46:01,750 --> 00:46:03,670 a new kind of generation of employees, 1032 00:46:03,670 --> 00:46:05,170 and what their return is. 1033 00:46:05,170 --> 00:46:06,460 AUDIENCE: Mm-hmm. 1034 00:46:06,460 --> 00:46:10,010 So I guess, kind of what do you see as a hypothetical picture 1035 00:46:10,010 --> 00:46:11,038 for how that would work? 1036 00:46:11,038 --> 00:46:12,580 Is this something that the government 1037 00:46:12,580 --> 00:46:14,460 would have to kind of regulate? 1038 00:46:14,460 --> 00:46:16,140 Is this coming from the markets? 1039 00:46:16,140 --> 00:46:17,220 AUDIENCE: Are you talking about like universal income 1040 00:46:17,220 --> 00:46:17,965 [INAUDIBLE]? 1041 00:46:17,965 --> 00:46:18,590 AUDIENCE: Yeah. 1042 00:46:18,590 --> 00:46:19,760 AUDIENCE: OK. 1043 00:46:19,760 --> 00:46:22,260 Anybody have interest in that? 1044 00:46:22,260 --> 00:46:24,742 No, I'm not too socialist. 1045 00:46:24,742 --> 00:46:26,450 AUDIENCE: I was going to make an analogy. 1046 00:46:29,530 --> 00:46:31,532 I have heard this is a fact. 
1047 00:46:31,532 --> 00:46:32,990 I have not done research on it so I 1048 00:46:32,990 --> 00:46:34,940 don't know if it's speculation, but the way in which 1049 00:46:34,940 --> 00:46:36,470 the diamond industry operates, which 1050 00:46:36,470 --> 00:46:38,220 is that there is an abundance of diamonds, 1051 00:46:38,220 --> 00:46:40,070 but there is a control of the market, 1052 00:46:40,070 --> 00:46:43,190 and thus they raise the price on the diamond significantly. 1053 00:46:43,190 --> 00:46:46,390 And so the reason I bring this as an example in which 1054 00:46:46,390 --> 00:46:50,947 to draw a parallel is because universal basic income, 1055 00:46:50,947 --> 00:46:53,280 and the both of you, I think the place where both of you 1056 00:46:53,280 --> 00:46:55,460 are coming from, is an assumption that people 1057 00:46:55,460 --> 00:46:57,590 want to distribute those resources, right? 1058 00:46:57,590 --> 00:46:59,420 And that people want other-- that we 1059 00:46:59,420 --> 00:47:02,450 want to create those productivity gains 1060 00:47:02,450 --> 00:47:05,930 rather than decrease, or rather if we increase our productivity 1061 00:47:05,930 --> 00:47:08,270 gains, then we won't act upon them. 1062 00:47:08,270 --> 00:47:10,040 And I don't necessarily-- 1063 00:47:10,040 --> 00:47:12,050 I don't know if that's true. 1064 00:47:12,050 --> 00:47:14,237 And that's, I think my concern. 1065 00:47:14,237 --> 00:47:15,320 AUDIENCE: Can you clarify? 1066 00:47:15,320 --> 00:47:19,280 AUDIENCE: Like, just because we can make more, say, food. 1067 00:47:19,280 --> 00:47:22,850 Just because we will have the capacity to make more food 1068 00:47:22,850 --> 00:47:24,980 doesn't mean we will. 1069 00:47:24,980 --> 00:47:27,290 Just because we have the capacity to-- 1070 00:47:27,290 --> 00:47:29,930 AUDIENCE: Or doesn't mean that everyone will be, on a whole, 1071 00:47:29,930 --> 00:47:30,698 healthier. 1072 00:47:30,698 --> 00:47:31,365 AUDIENCE: Right. 1073 00:47:31,365 --> 00:47:32,115 AUDIENCE: Oh yeah. 1074 00:47:32,115 --> 00:47:34,340 AUDIENCE: Exactly. 1075 00:47:34,340 --> 00:47:37,820 Just because we will have access to more things, 1076 00:47:37,820 --> 00:47:42,770 and we will have more rapid access to these goods, 1077 00:47:42,770 --> 00:47:45,560 does not mean that it will be a-- 1078 00:47:45,560 --> 00:47:49,700 that people will have access to them, or that it will be like 1079 00:47:49,700 --> 00:47:51,620 you were saying a good thing overall. 1080 00:47:51,620 --> 00:47:54,320 AUDIENCE: I mean I think the idea of giving 1081 00:47:54,320 --> 00:47:56,420 people hope that if they lose their job 1082 00:47:56,420 --> 00:47:58,550 their whole life isn't going to really, really suck 1083 00:47:58,550 --> 00:48:02,375 and they'll be poor is important. 1084 00:48:02,375 --> 00:48:05,000 I don't know if universal basic income would be the best thing, 1085 00:48:05,000 --> 00:48:07,588 just because the incentives. 1086 00:48:07,588 --> 00:48:10,130 And I think that will be really a question of our generation. 1087 00:48:10,130 --> 00:48:12,980 Like, how do you make sure that these people end up 1088 00:48:12,980 --> 00:48:15,540 having a place in society, and being productive? 1089 00:48:15,540 --> 00:48:18,440 And not going into depression, or not spending their days just 1090 00:48:18,440 --> 00:48:20,150 like not doing anything. 
1091 00:48:20,150 --> 00:48:21,340 Like, how do you-- 1092 00:48:21,340 --> 00:48:24,380 I think a big thing that will have to happen, probably, 1093 00:48:24,380 --> 00:48:26,690 is that governments, or even private organizations, 1094 00:48:26,690 --> 00:48:29,720 because governments now might not be adequate at this, 1095 00:48:29,720 --> 00:48:31,220 will need to figure out key business 1096 00:48:31,220 --> 00:48:35,150 opportunities for micro entrepreneurs 1097 00:48:35,150 --> 00:48:37,760 that could give them sustainable income, 1098 00:48:37,760 --> 00:48:40,240 and give them access to lines of capital 1099 00:48:40,240 --> 00:48:45,140 so that they'd be able to generate wealth and create 1100 00:48:45,140 --> 00:48:47,360 businesses in areas that need these businesses. 1101 00:48:47,360 --> 00:48:49,220 Because I think a lot of the problem is in businesses. 1102 00:48:49,220 --> 00:48:50,320 Like, you start a business and you don't know 1103 00:48:50,320 --> 00:48:51,440 if you will succeed or not. 1104 00:48:51,440 --> 00:48:52,940 But if somebody could do an analysis 1105 00:48:52,940 --> 00:48:55,100 like they do for McDonald's, where it's like, OK, 1106 00:48:55,100 --> 00:48:56,490 we know in this area this will be successful. 1107 00:48:56,490 --> 00:48:57,160 We know there's a need. 1108 00:48:57,160 --> 00:48:59,320 The government probably knows where there's needs. 1109 00:48:59,320 --> 00:49:01,028 If you can do an analysis to figure out-- 1110 00:49:01,028 --> 00:49:02,870 data to figure out, OK, a company 1111 00:49:02,870 --> 00:49:04,160 that does this would be really useful here, 1112 00:49:04,160 --> 00:49:06,180 a company that does this would be really useful here. 1113 00:49:06,180 --> 00:49:07,472 These will be the requirements. 1114 00:49:07,472 --> 00:49:11,000 You can do a franchise model for people to start companies. 1115 00:49:11,000 --> 00:49:13,340 I'm not against like universal basic income. 1116 00:49:13,340 --> 00:49:15,523 I just worry about the incentives of not having 1117 00:49:15,523 --> 00:49:17,440 the ability-- or maybe you create an incentive 1118 00:49:17,440 --> 00:49:20,050 that you study and you become a teacher. 1119 00:49:20,050 --> 00:49:23,000 It's a thing that, no matter what you say-- 1120 00:49:23,000 --> 00:49:26,270 I can say like what it is, but it's more about how you do it. 1121 00:49:26,270 --> 00:49:27,980 Because you could incentivize and create 1122 00:49:27,980 --> 00:49:30,215 more teachers, but if you don't respect those teachers, 1123 00:49:30,215 --> 00:49:32,270 or if they don't value them, then it'll be very weird. 1124 00:49:32,270 --> 00:49:34,220 If you create it and people just use the money 1125 00:49:34,220 --> 00:49:36,137 and build these businesses, and they're really 1126 00:49:36,137 --> 00:49:39,270 going out and having fun, I don't know. 1127 00:49:39,270 --> 00:49:41,630 AUDIENCE: I think for me, universal basic income is one 1128 00:49:41,630 --> 00:49:43,922 of those things that, like, economists really talk about, 1129 00:49:43,922 --> 00:49:52,110 and, in a sense, there's like a lot of literature 1130 00:49:52,110 --> 00:49:55,290 and kind of journal articles kind of fielding this idea of like, 1131 00:49:55,290 --> 00:49:57,340 what if we don't. 
1132 00:49:57,340 --> 00:49:58,897 But in the US, I'm trying to see 1133 00:49:58,897 --> 00:50:01,230 if we have this sort of widening of this barbell effect, 1134 00:50:01,230 --> 00:50:03,622 where we had this erosion of this middle class, 1135 00:50:03,622 --> 00:50:04,830 and kind of like this large-- 1136 00:50:08,150 --> 00:50:11,400 like, basically the people getting rich 1137 00:50:11,400 --> 00:50:13,950 aren't increasing in number fast enough, 1138 00:50:13,950 --> 00:50:15,510 but like, there's a lot of people 1139 00:50:15,510 --> 00:50:17,070 kind of regressing this way. 1140 00:50:17,070 --> 00:50:19,830 And so I'm trying to see like, what are the conditions that 1141 00:50:19,830 --> 00:50:22,858 will allow for us to call for a universal basic income? 1142 00:50:22,858 --> 00:50:24,900 WILLIAM BONVILLIAN: Rashid, I think you're really 1143 00:50:24,900 --> 00:50:27,540 onto an important point here. 1144 00:50:27,540 --> 00:50:34,230 If you've got a declining middle class, their incentives 1145 00:50:34,230 --> 00:50:38,190 to, in effect, take money out of their own incomes 1146 00:50:38,190 --> 00:50:41,063 and redistribute them, that's going to be a problem. 1147 00:50:41,063 --> 00:50:43,230 If that's a threatened community, which increasingly 1148 00:50:43,230 --> 00:50:48,480 it is, then politically, that is a very difficult step to take. 1149 00:50:48,480 --> 00:50:51,120 So I would argue that we probably 1150 00:50:51,120 --> 00:50:53,340 need to look at education, which is a great way 1151 00:50:53,340 --> 00:50:56,820 to lead into our next reading. 1152 00:50:56,820 --> 00:51:00,750 So another terrific David Autor piece. 1153 00:51:00,750 --> 00:51:03,090 David, for my money, is doing the best work in economics 1154 00:51:03,090 --> 00:51:03,968 anywhere. 1155 00:51:06,960 --> 00:51:10,800 This is his-- this builds on the Goldin and Katz piece 1156 00:51:10,800 --> 00:51:12,510 that we talked about in the last class, 1157 00:51:12,510 --> 00:51:15,600 right, and kind of takes it another kind of step further. 1158 00:51:18,410 --> 00:51:22,710 And this piece is all about skills, 1159 00:51:22,710 --> 00:51:27,100 and education, and the rise of earnings inequality. 1160 00:51:27,100 --> 00:51:33,720 Now, there is a steep and persistent rise 1161 00:51:33,720 --> 00:51:35,430 of earnings inequality in the US labor 1162 00:51:35,430 --> 00:51:37,770 market that has been going on for more than a couple 1163 00:51:37,770 --> 00:51:39,840 of decades. 1164 00:51:39,840 --> 00:51:41,730 And it's not just true in the United States. 1165 00:51:41,730 --> 00:51:47,370 This is a phenomenon in the developed world in general. 1166 00:51:47,370 --> 00:51:48,942 But why look at inequality? 1167 00:51:51,520 --> 00:51:53,760 And what do we start to see? 1168 00:51:53,760 --> 00:51:56,940 First, the earnings premium for education 1169 00:51:56,940 --> 00:52:03,810 has risen very significantly, as we talked about last week. 1170 00:52:03,810 --> 00:52:07,940 So 2/3 of the overall rise of earnings dispersion between 1980 1171 00:52:07,940 --> 00:52:11,780 and 2005 is accounted for by this increased premium that's 1172 00:52:11,780 --> 00:52:14,300 associated with schooling in general, 1173 00:52:14,300 --> 00:52:17,390 and post-secondary education in particular. 1174 00:52:17,390 --> 00:52:20,640 So let me just jump to a slide of this. 
1175 00:52:20,640 --> 00:52:22,730 So this is a projection that comes out 1176 00:52:22,730 --> 00:52:24,710 of the quite respected Georgetown University 1177 00:52:24,710 --> 00:52:26,540 Center on Education and the Workforce, which 1178 00:52:26,540 --> 00:52:31,280 does really first-rate work on the workforce. 1179 00:52:31,280 --> 00:52:36,500 And looking at what projections were for 2024, 1180 00:52:36,500 --> 00:52:40,130 they found that in terms of job openings, 1181 00:52:40,130 --> 00:52:42,740 high school diploma and less than a high school diploma 1182 00:52:42,740 --> 00:52:45,320 were going to account for about 19.7 million job 1183 00:52:45,320 --> 00:52:47,960 openings in that time period. 1184 00:52:47,960 --> 00:52:51,140 Job openings for those with some college, 1185 00:52:51,140 --> 00:52:55,100 associate degrees, and college degrees and above-- 1186 00:52:55,100 --> 00:52:57,830 36.5 million jobs, right? 1187 00:52:57,830 --> 00:53:02,540 So this is a profound upskilling of the workforce 1188 00:53:02,540 --> 00:53:04,700 that's going on. 1189 00:53:04,700 --> 00:53:07,970 And look, by the way, the higher education system 1190 00:53:07,970 --> 00:53:11,190 isn't necessarily going to produce these numbers. 1191 00:53:11,190 --> 00:53:13,737 So what that means is that the people on this end 1192 00:53:13,737 --> 00:53:16,070 are going to be able to charge a premium because there's 1193 00:53:16,070 --> 00:53:18,560 going to be a shortage of supply relative to the demand for their skill sets. 1194 00:53:18,560 --> 00:53:19,730 That's what's going on. 1195 00:53:19,730 --> 00:53:21,170 That's what David Autor is telling 1196 00:53:21,170 --> 00:53:26,270 us has been going on now for several decades in the US. 1197 00:53:26,270 --> 00:53:30,020 It's exactly that phenomenon. 1198 00:53:30,020 --> 00:53:37,930 And the earnings gap, then, between college and high school 1199 00:53:37,930 --> 00:53:40,270 graduates has more than doubled in the United States 1200 00:53:40,270 --> 00:53:43,340 over the last three decades. 1201 00:53:43,340 --> 00:53:47,000 So lifetime earnings expectation with a college degree 1202 00:53:47,000 --> 00:53:50,540 is now very powerful compared to your lifetime earnings 1203 00:53:50,540 --> 00:53:55,400 expectation if you have a high school degree. 1204 00:53:55,400 --> 00:53:59,780 And the skill premium concept here 1205 00:53:59,780 --> 00:54:03,950 offers real insight into the evolution of inequality 1206 00:54:03,950 --> 00:54:04,700 in US society. 1207 00:54:04,700 --> 00:54:08,750 Now of course, this is all related to the employment 1208 00:54:08,750 --> 00:54:11,390 issues, and particularly the future employment 1209 00:54:11,390 --> 00:54:15,580 issues that we've just been talking about. 1210 00:54:15,580 --> 00:54:18,460 One more statistic that underscores this. 1211 00:54:18,460 --> 00:54:20,185 This is the fall in real earnings. 1212 00:54:22,790 --> 00:54:27,760 In a Hamilton Project study that was done for Brookings, 1213 00:54:27,760 --> 00:54:30,040 this is the percentage change in real earnings 1214 00:54:30,040 --> 00:54:38,370 from 1990 to 2013 for working-age men, aged 30 to 45. 1215 00:54:38,370 --> 00:54:41,490 No high school diploma: median income 1216 00:54:41,490 --> 00:54:44,880 went down 20 points in that time period. 1217 00:54:44,880 --> 00:54:50,570 High school diploma: down 13 points in that time period. 1218 00:54:50,570 --> 00:54:53,570 So we've got a profound societal issue. 
1219 00:54:53,570 --> 00:54:58,070 This is just a big signal of an inequality problem 1220 00:54:58,070 --> 00:55:01,340 that's a problem now and, because of the phenomenon 1221 00:55:01,340 --> 00:55:05,990 we've just been talking about in the two previous readings, 1222 00:55:05,990 --> 00:55:09,510 is potentially going to get significantly worse over time. 1223 00:55:09,510 --> 00:55:11,360 So this is a big societal dilemma 1224 00:55:11,360 --> 00:55:14,750 that you guys are going to have to figure out, right? 1225 00:55:14,750 --> 00:55:19,970 And it looks like Autor is telling us that a lot of this 1226 00:55:19,970 --> 00:55:21,980 has to do with the education system. 1227 00:55:21,980 --> 00:55:23,900 How are we going to alter the education 1228 00:55:23,900 --> 00:55:28,940 system to move a lot of people to higher levels of education? 1229 00:55:28,940 --> 00:55:30,980 How is that phenomenon going to work? 1230 00:55:30,980 --> 00:55:33,170 What are we going to do in terms of designing 1231 00:55:33,170 --> 00:55:37,140 new institutions and new mechanisms to do that? 1232 00:55:37,140 --> 00:55:40,310 So there's rising demand for educated labor 1233 00:55:40,310 --> 00:55:45,590 in advanced economies, including the US-- it's a profound rise. 1234 00:55:45,590 --> 00:55:47,540 And if the supply of educated workers 1235 00:55:47,540 --> 00:55:50,660 doesn't keep pace with the persistent outward shifts 1236 00:55:50,660 --> 00:55:54,140 in demand for the skills, then you just 1237 00:55:54,140 --> 00:55:58,580 multiply your inequality problem. 1238 00:55:58,580 --> 00:56:06,540 And it forces a public policy response around upskilling, 1239 00:56:06,540 --> 00:56:07,040 right? 1240 00:56:07,040 --> 00:56:10,190 That's a societal task we're going to have to accomplish. 1241 00:56:10,190 --> 00:56:13,730 Now, Rashid pointed out, I think very astutely, 1242 00:56:13,730 --> 00:56:16,760 that, with this barbell phenomenon, 1243 00:56:16,760 --> 00:56:19,970 which is another David Autor concept, right, 1244 00:56:19,970 --> 00:56:22,220 the society looks increasingly like a barbell. 1245 00:56:22,220 --> 00:56:26,360 You've got a very successful and growing upper middle class, 1246 00:56:26,360 --> 00:56:28,410 you've got the middle of the economy, 1247 00:56:28,410 --> 00:56:31,130 which is in decline-- 1248 00:56:31,130 --> 00:56:33,440 we just saw the median income data-- 1249 00:56:33,440 --> 00:56:38,690 and moving towards lower-end service delivery jobs, right? 1250 00:56:38,690 --> 00:56:43,880 Personal service delivery kinds of positions. 1251 00:56:43,880 --> 00:56:48,740 The politics of that make it problematic to deal with, 1252 00:56:48,740 --> 00:56:51,050 say, income distribution in the United States. 1253 00:56:51,050 --> 00:56:54,170 It's just not going to be politically simple. 1254 00:56:54,170 --> 00:56:55,910 If everybody was rising together, that's 1255 00:56:55,910 --> 00:56:57,670 an easier problem to tackle. 1256 00:56:57,670 --> 00:56:59,600 But they're not. 1257 00:56:59,600 --> 00:57:02,450 So education is probably the way in which we're 1258 00:57:02,450 --> 00:57:04,150 going to have to address this. 1259 00:57:04,150 --> 00:57:04,872 All right? 1260 00:57:04,872 --> 00:57:06,830 We're going to have to upskill that workforce. 1261 00:57:06,830 --> 00:57:10,260 And look, by doing that, as Martin pointed out, 1262 00:57:10,260 --> 00:57:12,050 you're addressing the underlying problem. 
1263 00:57:12,050 --> 00:57:14,750 You're helping people realize their talents 1264 00:57:14,750 --> 00:57:16,940 and their capabilities, right? 1265 00:57:16,940 --> 00:57:20,540 And that's, frankly, a better solution 1266 00:57:20,540 --> 00:57:22,880 to the problem than just throwing money at people, 1267 00:57:22,880 --> 00:57:24,020 I would argue, right? 1268 00:57:24,020 --> 00:57:25,730 It's probably not the only thing you do, 1269 00:57:25,730 --> 00:57:27,920 and it's not going to help everybody, 1270 00:57:27,920 --> 00:57:32,630 but can we find strategies to do this? 1271 00:57:32,630 --> 00:57:35,750 So I've been working on this, and I've been thinking about it 1272 00:57:35,750 --> 00:57:41,720 and talking to Sanjay Sarma, who runs the online education 1273 00:57:41,720 --> 00:57:46,820 effort at MIT, and MIT has developed these new platform 1274 00:57:46,820 --> 00:57:47,700 technologies. 1275 00:57:47,700 --> 00:57:51,200 There's MITx, and edX is another platform provider-- 1276 00:57:51,200 --> 00:57:53,840 pervasive online capability. 1277 00:57:53,840 --> 00:57:57,350 But on the other hand, it's hard to envision 1278 00:57:57,350 --> 00:57:59,480 a steelworker in southeastern Ohio 1279 00:57:59,480 --> 00:58:04,080 who's lost their job being willing or able to spend 1280 00:58:04,080 --> 00:58:06,620 a substantial amount of time in front of a computer 1281 00:58:06,620 --> 00:58:09,037 blue screen in their basement while their kids are running 1282 00:58:09,037 --> 00:58:10,100 around upstairs, right? 1283 00:58:10,100 --> 00:58:11,342 Just not terribly likely. 1284 00:58:11,342 --> 00:58:13,550 AUDIENCE: Or, if there was a case brought up recently 1285 00:58:13,550 --> 00:58:15,820 that somebody was worried about losing their job because they 1286 00:58:15,820 --> 00:58:17,860 would lose their insurance, and their kid might not be 1287 00:58:17,860 --> 00:58:19,040 able to get proper [INAUDIBLE]. 1288 00:58:19,040 --> 00:58:20,582 WILLIAM BONVILLIAN: Yes, the problems 1289 00:58:20,582 --> 00:58:22,960 multiply when you get into this box. 1290 00:58:22,960 --> 00:58:25,820 And we've had two recent-- recently two 1291 00:58:25,820 --> 00:58:28,130 very interesting books that start 1292 00:58:28,130 --> 00:58:32,580 to portray this problem for us. 1293 00:58:32,580 --> 00:58:38,570 And so Hillbilly Elegy is a very interesting story of one 1294 00:58:38,570 --> 00:58:43,430 of the big diasporas in US history out of the South, 1295 00:58:43,430 --> 00:58:47,570 out of the Appalachians, into industrial employment 1296 00:58:47,570 --> 00:58:49,070 in the Middle West. 1297 00:58:49,070 --> 00:58:51,020 And with the decline of US manufacturing, 1298 00:58:51,020 --> 00:58:53,450 that community really got nailed. 1299 00:58:53,450 --> 00:58:55,410 And what do you do with that community? 1300 00:58:55,410 --> 00:58:58,190 This book is a paean to that community, 1301 00:58:58,190 --> 00:59:00,980 and how to even think about the kind of problems 1302 00:59:00,980 --> 00:59:02,120 they've got into. 1303 00:59:02,120 --> 00:59:05,060 Opioids are one example cited in that book. 1304 00:59:05,060 --> 00:59:10,910 There's a new book just out by Amy Goldstein 1305 00:59:10,910 --> 00:59:16,550 about a town in Wisconsin. 1306 00:59:16,550 --> 00:59:18,800 Town of about 60,000 people. 1307 00:59:18,800 --> 00:59:21,785 The major employer was a General Motors plant 1308 00:59:21,785 --> 00:59:24,110 that employed 9,000 people. 1309 00:59:24,110 --> 00:59:26,690 It had survived the Rust Belt. 
It was a quality plant. 1310 00:59:26,690 --> 00:59:30,740 But when General Motors went bankrupt in 2008, 1311 00:59:30,740 --> 00:59:33,980 all those people lost their jobs, every single one of them. 1312 00:59:33,980 --> 00:59:36,470 And it put that community-- 1313 00:59:36,470 --> 00:59:38,840 tax revenues declined, housing values 1314 00:59:38,840 --> 00:59:41,690 declined, everything is going down, right? 1315 00:59:41,690 --> 00:59:46,590 Community services collapse, charitable contributions drop, 1316 00:59:46,590 --> 00:59:48,500 right? 1317 00:59:48,500 --> 00:59:51,410 It's painful to watch what happens in that community. 1318 00:59:51,410 --> 00:59:53,390 Finally, one good thing happens. 1319 00:59:53,390 --> 00:59:56,630 A dollar store opens a distribution plant. 1320 00:59:56,630 --> 00:59:58,460 The jobs pay way less than half of what 1321 00:59:58,460 --> 01:00:01,010 the General Motors jobs paid. 1322 01:00:01,010 --> 01:00:03,640 So that's the kind of problem that we're up against here. 1323 01:00:03,640 --> 01:00:05,510 And in light of all we've just been talking 1324 01:00:05,510 --> 01:00:07,910 about-- technological job displacement occurring 1325 01:00:07,910 --> 01:00:11,810 over time-- we've got a big upskilling job 1326 01:00:11,810 --> 01:00:15,320 to do here, I would argue, in our economy. 1327 01:00:15,320 --> 01:00:20,180 So online capabilities are one piece, but that's not 1328 01:00:20,180 --> 01:00:21,520 going to be the answer here. 1329 01:00:21,520 --> 01:00:23,270 It's probably going to be a blended model. 1330 01:00:23,270 --> 01:00:26,600 And can we think about how to enlist community colleges, 1331 01:00:26,600 --> 01:00:29,600 and how to join them with universities 1332 01:00:29,600 --> 01:00:30,680 developing curriculum? 1333 01:00:30,680 --> 01:00:32,750 And how do we know what the curriculum is? 1334 01:00:32,750 --> 01:00:36,200 What's the content of education in a society that's going 1335 01:00:36,200 --> 01:00:38,000 to change the nature of work? 1336 01:00:38,000 --> 01:00:39,560 How do we begin to understand that? 1337 01:00:39,560 --> 01:00:41,990 What's the changing nature of work that these technologies 1338 01:00:41,990 --> 01:00:43,290 are going to drive? 1339 01:00:43,290 --> 01:00:44,450 How do we educate for that? 1340 01:00:44,450 --> 01:00:46,940 So we've got a big task ahead. 1341 01:00:46,940 --> 01:00:49,050 But a mix of employers, community colleges, 1342 01:00:49,050 --> 01:00:51,560 universities, and labor organizations 1343 01:00:51,560 --> 01:00:54,140 is probably going to be the way we try to figure some of this 1344 01:00:54,140 --> 01:00:55,100 out. 1345 01:00:55,100 --> 01:00:58,850 So I'm just extrapolating from David Autor's point 1346 01:00:58,850 --> 01:01:01,070 about the underlying importance of education 1347 01:01:01,070 --> 01:01:03,867 to this inequality problem, and emphasizing 1348 01:01:03,867 --> 01:01:05,450 that education is probably going to be 1349 01:01:05,450 --> 01:01:07,580 the fix we're going to use. 1350 01:01:07,580 --> 01:01:09,380 But then, how are we going to do that, right? 1351 01:01:09,380 --> 01:01:11,420 So a big challenging problem. 1352 01:01:11,420 --> 01:01:13,867 So who's got this one? 1353 01:01:13,867 --> 01:01:15,950 Sanam, you want to lead us through some questions? 1354 01:01:15,950 --> 01:01:17,723 AUDIENCE: Sure. 1355 01:01:17,723 --> 01:01:18,890 Yeah, so a couple of points. 
1356 01:01:18,890 --> 01:01:20,630 I thought the focus on inequality 1357 01:01:20,630 --> 01:01:23,000 was really interesting here, especially 1358 01:01:23,000 --> 01:01:26,690 where he briefly touches on why inequality is actually 1359 01:01:26,690 --> 01:01:27,733 good for innovation. 1360 01:01:27,733 --> 01:01:29,150 And this is something that we talk 1361 01:01:29,150 --> 01:01:32,640 about a lot in econ classes, and how, of course, there 1362 01:01:32,640 --> 01:01:34,430 are greater incentives to be more 1363 01:01:34,430 --> 01:01:36,980 productive because the rewards and returns are better. 1364 01:01:36,980 --> 01:01:39,410 And then also, generally, countries 1365 01:01:39,410 --> 01:01:43,942 that have more unequal distributions of income 1366 01:01:43,942 --> 01:01:45,650 attract people from other countries 1367 01:01:45,650 --> 01:01:48,560 with more equal distributions of income to come to that country 1368 01:01:48,560 --> 01:01:49,100 and stay. 1369 01:01:49,100 --> 01:01:52,970 So in terms of attracting talent for our innovation system 1370 01:01:52,970 --> 01:01:55,500 from abroad, it's actually kind of a benefit. 1371 01:01:55,500 --> 01:01:58,150 But obviously there are a lot of problems 1372 01:01:58,150 --> 01:02:00,320 as well that he talks about. 1373 01:02:00,320 --> 01:02:04,940 Especially the intergenerational mobility problem. 1374 01:02:04,940 --> 01:02:09,890 So there's a decline in wages for people who 1375 01:02:09,890 --> 01:02:11,510 are not college educated. 1376 01:02:11,510 --> 01:02:14,570 And that tends to perpetuate within families and communities 1377 01:02:14,570 --> 01:02:15,570 as well. 1378 01:02:15,570 --> 01:02:19,260 So one of my big questions here, which 1379 01:02:19,260 --> 01:02:22,250 is just kind of a general question, 1380 01:02:22,250 --> 01:02:25,070 is how do we make sure that in discussions 1381 01:02:25,070 --> 01:02:27,620 about innovation and technological advancement, 1382 01:02:27,620 --> 01:02:30,790 we address these problems of systemic inequality that 1383 01:02:30,790 --> 01:02:32,840 still persist in the country? 1384 01:02:32,840 --> 01:02:36,380 And kind of tying onto that, to what extent 1385 01:02:36,380 --> 01:02:40,190 are the systems of privilege in educational and skill 1386 01:02:40,190 --> 01:02:42,465 advancement and attainment embedded in this framework 1387 01:02:42,465 --> 01:02:44,548 that we've been studying throughout this semester? 1388 01:02:44,548 --> 01:02:46,888 So I kind of want to get your thoughts on that. 1389 01:02:50,580 --> 01:02:51,830 AUDIENCE: Can you repeat that? 1390 01:02:51,830 --> 01:02:52,670 Just like the main question? 1391 01:02:52,670 --> 01:02:52,970 WILLIAM BONVILLIAN: Why don't you 1392 01:02:52,970 --> 01:02:54,260 break them into the two pieces? 1393 01:02:54,260 --> 01:02:55,135 AUDIENCE: Yeah, sure. 1394 01:02:55,135 --> 01:02:58,437 So my first question was how-- 1395 01:02:58,437 --> 01:03:00,020 in these discussions we've been having 1396 01:03:00,020 --> 01:03:03,110 about encouraging innovation and technological advancement, 1397 01:03:03,110 --> 01:03:08,330 how do we address problems of systemic inequality as well? 1398 01:03:08,330 --> 01:03:10,580 AUDIENCE: Just to narrow down your question, 1399 01:03:10,580 --> 01:03:14,510 would you mind defining for us what 1400 01:03:14,510 --> 01:03:17,090 systemic inequality is in your definition?
1401 01:03:17,090 --> 01:03:20,150 Because I have my own very many definitions, right, around 1402 01:03:20,150 --> 01:03:22,520 my field in political science. 1403 01:03:22,520 --> 01:03:24,230 And then, two, what systemic inequalities 1404 01:03:24,230 --> 01:03:25,070 do you want to talk about? 1405 01:03:25,070 --> 01:03:25,700 Is it race? 1406 01:03:25,700 --> 01:03:26,870 Is it income inequality? 1407 01:03:26,870 --> 01:03:27,930 Is it social class? 1408 01:03:27,930 --> 01:03:28,990 Is it other durable inequalities? 1409 01:03:28,990 --> 01:03:30,250 AUDIENCE: Educational disparities? 1410 01:03:30,250 --> 01:03:30,875 AUDIENCE: Yeah. 1411 01:03:30,875 --> 01:03:33,650 AUDIENCE: Yeah, so I'm going to go with what Autor's piece was 1412 01:03:33,650 --> 01:03:35,630 and talk about-- 1413 01:03:35,630 --> 01:03:37,340 so by systemic inequality he was talking 1414 01:03:37,340 --> 01:03:40,910 about how the educational attainment 1415 01:03:40,910 --> 01:03:43,190 gap tends to persist within families, 1416 01:03:43,190 --> 01:03:45,910 and then extrapolate that into like larger communities which 1417 01:03:45,910 --> 01:03:47,810 these families are in. 1418 01:03:47,810 --> 01:03:51,600 And then it just kind of gets perpetuated in that sense. 1419 01:03:51,600 --> 01:03:54,050 So I think, maybe specifically talk 1420 01:03:54,050 --> 01:03:57,710 about like when it comes to educational attainment, 1421 01:03:57,710 --> 01:04:01,160 and then later on how that translates into income 1422 01:04:01,160 --> 01:04:03,860 and into wages. 1423 01:04:03,860 --> 01:04:09,720 We can just talk about how that kind of factors in [INAUDIBLE]. 1424 01:04:09,720 --> 01:04:11,100 AUDIENCE: Bernie had it right. 1425 01:04:11,100 --> 01:04:12,720 Free education. 1426 01:04:12,720 --> 01:04:15,600 That's the crux. 1427 01:04:15,600 --> 01:04:17,683 AUDIENCE: I think there's an important point here 1428 01:04:17,683 --> 01:04:19,600 that you said about intergenerational mobility 1429 01:04:19,600 --> 01:04:21,810 and educational attainment. 1430 01:04:21,810 --> 01:04:23,640 And now that there's just this transition 1431 01:04:23,640 --> 01:04:26,640 in this additional premium on like having a bachelor's 1432 01:04:26,640 --> 01:04:31,650 and a post-secondary degree that didn't exist, 1433 01:04:31,650 --> 01:04:33,948 but like it's sort of too late to fix that once 1434 01:04:33,948 --> 01:04:36,240 you've already matriculated sort of into the workforce. 1435 01:04:36,240 --> 01:04:39,900 And now, I mean, I don't have my post-secondary or college 1436 01:04:39,900 --> 01:04:43,230 degree, but I'm 45 and I have all these other problems 1437 01:04:43,230 --> 01:04:44,190 to deal with. 1438 01:04:44,190 --> 01:04:47,320 And so I think just like maybe Bill 1439 01:04:47,320 --> 01:04:50,558 was starting to say, like there are some fixes with maybe 1440 01:04:50,558 --> 01:04:52,350 community college and kind of night classes 1441 01:04:52,350 --> 01:04:59,670 that are like sort of kind of segues 1442 01:04:59,670 --> 01:05:03,480 back into the system of like getting back 1443 01:05:03,480 --> 01:05:06,180 that educational premium that maybe you didn't have 1444 01:05:06,180 --> 01:05:07,590 the opportunity to previously. 1445 01:05:07,590 --> 01:05:10,560 And I think maybe that's a lot easier 1446 01:05:10,560 --> 01:05:13,650 than these giant kind of skill retraining 1447 01:05:13,650 --> 01:05:14,877 or repurposing maybe. 
1448 01:05:14,877 --> 01:05:16,460 I can imagine like a larger gym filled 1449 01:05:16,460 --> 01:05:19,040 with like computers that kind of people come in 1450 01:05:19,040 --> 01:05:21,420 and like take online classes of how to do 1451 01:05:21,420 --> 01:05:22,770 better or something like that. 1452 01:05:22,770 --> 01:05:25,775 But you really have to focus, I would say, on 1453 01:05:25,775 --> 01:05:27,420 like there's this large subset as we 1454 01:05:27,420 --> 01:05:31,140 saw in this last election of people who have shifted out 1455 01:05:31,140 --> 01:05:33,490 of jobs, particularly in manufacturing, 1456 01:05:33,490 --> 01:05:36,265 who are intergenerationally speaking on 1457 01:05:36,265 --> 01:05:39,000 like the I've been out of school or been out of college 1458 01:05:39,000 --> 01:05:40,230 for 20 plus years. 1459 01:05:40,230 --> 01:05:42,860 And like, is it too late for me to go back, or switch, or gain 1460 01:05:42,860 --> 01:05:45,177 these skills, these educational skills that I need? 1461 01:05:45,177 --> 01:05:46,260 WILLIAM BONVILLIAN: Right. 1462 01:05:46,260 --> 01:05:49,740 So Rashid, I think you make a very good point here. 1463 01:05:49,740 --> 01:05:53,970 We're going to need that steelworker 1464 01:05:53,970 --> 01:05:55,680 I was talking about before who lost 1465 01:05:55,680 --> 01:05:57,743 his job in southeastern Ohio. 1466 01:05:57,743 --> 01:05:59,910 He's not going to go back and get a four-year degree 1467 01:05:59,910 --> 01:06:00,660 in college. 1468 01:06:00,660 --> 01:06:01,770 It is not going to happen. 1469 01:06:01,770 --> 01:06:03,660 Whether Bernie wants to make it free or not, 1470 01:06:03,660 --> 01:06:05,910 Lily, it's not going to happen. 1471 01:06:05,910 --> 01:06:09,240 We need an entirely different set 1472 01:06:09,240 --> 01:06:11,760 of institutional models that's going 1473 01:06:11,760 --> 01:06:13,170 to work with this community. 1474 01:06:13,170 --> 01:06:18,030 And as we have this technological job 1475 01:06:18,030 --> 01:06:19,230 displacement-- 1476 01:06:19,230 --> 01:06:23,610 and I personally will talk more about this later, 1477 01:06:23,610 --> 01:06:25,560 I don't think this is going to be something 1478 01:06:25,560 --> 01:06:26,580 that happens tomorrow. 1479 01:06:26,580 --> 01:06:28,980 I think we've got time to work on this, all right? 1480 01:06:28,980 --> 01:06:32,010 I think it's going to be a more gradual process than some 1481 01:06:32,010 --> 01:06:36,240 of the technological dystopians have portrayed. 1482 01:06:36,240 --> 01:06:38,160 But we're going to have to use that time, 1483 01:06:38,160 --> 01:06:39,868 and I think we're going to have to create 1484 01:06:39,868 --> 01:06:43,200 a whole new set of institutional arrangements to begin to cope. 1485 01:06:43,200 --> 01:06:45,730 Otherwise, we're just leaving too many people behind. 1486 01:06:45,730 --> 01:06:49,087 And the price in our society and the price in our democracy-- 1487 01:06:49,087 --> 01:06:51,420 as we just learned from the last presidential election-- 1488 01:06:51,420 --> 01:06:53,760 is pretty high. 1489 01:06:53,760 --> 01:06:55,040 That's my case, right? 1490 01:06:55,040 --> 01:06:56,290 That's what I want to work on. 1491 01:06:58,820 --> 01:07:01,480 But other of you I know have thoughts on this. 1492 01:07:01,480 --> 01:07:06,260 AUDIENCE: [INAUDIBLE] 1493 01:07:06,260 --> 01:07:07,875 AUDIENCE: Three small thoughts. 
1494 01:07:07,875 --> 01:07:10,250 The first of which is that this conversation is making me 1495 01:07:10,250 --> 01:07:12,800 think of education as a sort of social security 1496 01:07:12,800 --> 01:07:15,090 for young people. 1497 01:07:15,090 --> 01:07:16,427 That's our social safety net. 1498 01:07:16,427 --> 01:07:18,135 AUDIENCE: Like it's our insurance policy? 1499 01:07:18,135 --> 01:07:18,740 AUDIENCE: Yeah, it's our insurance. 1500 01:07:18,740 --> 01:07:19,090 AUDIENCE: OK. 1501 01:07:19,090 --> 01:07:19,860 AUDIENCE: Yeah. 1502 01:07:19,860 --> 01:07:20,470 Yeah. 1503 01:07:20,470 --> 01:07:23,690 And I mean I-- 1504 01:07:23,690 --> 01:07:27,300 like I said earlier in the semester, I took a course-- 1505 01:07:27,300 --> 01:07:30,770 or I just finished yesterday, a course in computer science. 1506 01:07:30,770 --> 01:07:32,440 And it was so hard. 1507 01:07:32,440 --> 01:07:35,640 I mean it was really challenging and really positive 1508 01:07:35,640 --> 01:07:38,010 experience for me, personally. 1509 01:07:38,010 --> 01:07:40,438 But I also, as I was doing these assignments, 1510 01:07:40,438 --> 01:07:42,480 considered what it would be like to have a family 1511 01:07:42,480 --> 01:07:45,240 and to have obligations beyond myself, 1512 01:07:45,240 --> 01:07:47,880 and to have to be retrained in a vocation. 1513 01:07:47,880 --> 01:07:50,728 And that's almost insurmountable. 1514 01:07:50,728 --> 01:07:53,020 I can't imagine what it would be like for my father who 1515 01:07:53,020 --> 01:07:55,380 is a single income earner in the service industry, 1516 01:07:55,380 --> 01:07:57,450 to lose his job and to have to think 1517 01:07:57,450 --> 01:08:01,487 about the ways in which he can't help our family anymore, right? 1518 01:08:01,487 --> 01:08:02,820 I come from a low-income family. 1519 01:08:02,820 --> 01:08:05,610 That's a consideration I have to have. 1520 01:08:05,610 --> 01:08:10,080 And then secondly, about not so much inequality, 1521 01:08:10,080 --> 01:08:14,370 but I did research for a political science 1522 01:08:14,370 --> 01:08:17,779 course on global inequality on waste pickers in the world, 1523 01:08:17,779 --> 01:08:19,529 and the role that they play in the economy 1524 01:08:19,529 --> 01:08:22,229 specifically in solid waste and the recycling process 1525 01:08:22,229 --> 01:08:23,910 in the global supply chain. 1526 01:08:23,910 --> 01:08:25,920 And one of the things that came up a lot 1527 01:08:25,920 --> 01:08:29,100 was social exclusion as the mechanism 1528 01:08:29,100 --> 01:08:30,840 by which inequality happens. 1529 01:08:30,840 --> 01:08:32,340 And I feel like we need to talk more 1530 01:08:32,340 --> 01:08:35,970 about things in the terminology of social exclusion, 1531 01:08:35,970 --> 01:08:38,370 because then we understand how it happens 1532 01:08:38,370 --> 01:08:41,700 and who commits the action, and who 1533 01:08:41,700 --> 01:08:43,109 is the receiver of the action. 1534 01:08:43,109 --> 01:08:45,067 AUDIENCE: What do you mean by social exclusion? 1535 01:08:45,067 --> 01:08:47,850 AUDIENCE: So social exclusion is preventing individuals 1536 01:08:47,850 --> 01:08:51,810 from accessing a sector as a result of a durable inequality. 1537 01:08:51,810 --> 01:08:53,580 And so I'll break that down further. 1538 01:08:53,580 --> 01:08:56,189 It's essentially in the case of, say, waste pickers. 
1539 01:08:56,189 --> 01:08:58,770 The way in which they have been excluded in, say, Brazil, 1540 01:08:58,770 --> 01:09:01,750 from accessing education because of their race. 1541 01:09:01,750 --> 01:09:02,250 Right? 1542 01:09:02,250 --> 01:09:04,170 So that's a very complicated picture. 1543 01:09:04,170 --> 01:09:08,680 But what we can determine is that because families, 1544 01:09:08,680 --> 01:09:12,270 in sort of the middle and upper class in Brazil, 1545 01:09:12,270 --> 01:09:14,470 have always had access to education, 1546 01:09:14,470 --> 01:09:17,189 they want to sustain that standard of living 1547 01:09:17,189 --> 01:09:18,359 for their progeny. 1548 01:09:18,359 --> 01:09:20,520 And so then their progeny gets educated. 1549 01:09:20,520 --> 01:09:23,160 But at the same time, there's only so 1550 01:09:23,160 --> 01:09:25,402 many spots at the national universities, right? 1551 01:09:25,402 --> 01:09:26,819 And so those spots are never going 1552 01:09:26,819 --> 01:09:30,359 to go to the children of the low income parents. 1553 01:09:30,359 --> 01:09:32,910 They're always going to go to the wealthier families. 1554 01:09:32,910 --> 01:09:36,297 And if we think about things in a process of social exclusion 1555 01:09:36,297 --> 01:09:38,130 rather than inequality, which is the outcome 1556 01:09:38,130 --> 01:09:42,000 of social exclusion, I think we can start rectifying some-- 1557 01:09:42,000 --> 01:09:44,760 at least some of the dimensions by which 1558 01:09:44,760 --> 01:09:46,200 we oppress other people. 1559 01:09:46,200 --> 01:09:48,810 Because it's really easy to say, oh, there's inequalities, 1560 01:09:48,810 --> 01:09:50,560 and to identify all of them. 1561 01:09:50,560 --> 01:09:52,109 But I think it's much harder to think 1562 01:09:52,109 --> 01:09:54,090 about the ways in which we ourselves either 1563 01:09:54,090 --> 01:09:57,690 uphold those inequalities, perpetuate those inequalities, 1564 01:09:57,690 --> 01:10:00,460 or benefit from those inequalities, right? 1565 01:10:00,460 --> 01:10:02,100 And so we have to, I think that if we 1566 01:10:02,100 --> 01:10:05,130 talk about things in the language of exclusion, 1567 01:10:05,130 --> 01:10:08,100 there's almost a responsibility and accountability 1568 01:10:08,100 --> 01:10:12,396 to the effects that those exclusions have. 1569 01:10:12,396 --> 01:10:13,890 AUDIENCE: What's the third one? 1570 01:10:13,890 --> 01:10:16,410 AUDIENCE: And yeah I was just-- that was little-- 1571 01:10:16,410 --> 01:10:20,730 I've had a lot of thoughts about that for a long time. 1572 01:10:20,730 --> 01:10:24,310 But I think it's really important 1573 01:10:24,310 --> 01:10:26,310 that we are accountable for the ways in which we 1574 01:10:26,310 --> 01:10:27,720 benefit from inequalities. 1575 01:10:27,720 --> 01:10:29,430 And I'm not an exception. 1576 01:10:29,430 --> 01:10:30,548 AUDIENCE: Yeah. 1577 01:10:30,548 --> 01:10:32,340 I had a quick point, but if yours is quick. 1578 01:10:32,340 --> 01:10:34,850 I think to Steph's point just super quick. 1579 01:10:34,850 --> 01:10:36,630 I think there are a lot of studies 1580 01:10:36,630 --> 01:10:39,600 that say like, inequality in a sense 1581 01:10:39,600 --> 01:10:44,187 like we're OK with, like, it's OK that Martin makes 1582 01:10:44,187 --> 01:10:46,770 more money than I did this week, like, we're all OK with that. 
1583 01:10:46,770 --> 01:10:48,640 But I think in the rising inequality, 1584 01:10:48,640 --> 01:10:51,570 in this barbell effect, like, you have people-- 1585 01:10:51,570 --> 01:10:53,040 like this growing subset of people 1586 01:10:53,040 --> 01:10:54,960 who just aren't earning as much. 1587 01:10:54,960 --> 01:10:57,360 And there's sort of a lot of them. 1588 01:10:57,360 --> 01:11:00,330 And when there's not really this distribution of inequality, 1589 01:11:00,330 --> 01:11:04,650 then we get to say, we have more reason to say, 1590 01:11:04,650 --> 01:11:06,840 like, I'm not OK with it once I have less. 1591 01:11:06,840 --> 01:11:09,660 And then I think there's a difference between inequality 1592 01:11:09,660 --> 01:11:12,870 and unfairness, which I think Steph was trying to get at. 1593 01:11:12,870 --> 01:11:15,833 And it's when there's unfairness, in a sense like, 1594 01:11:15,833 --> 01:11:17,250 I didn't have the same opportunity 1595 01:11:17,250 --> 01:11:19,580 that Martin did to get to that point. 1596 01:11:19,580 --> 01:11:21,690 Like, I didn't have the same college education 1597 01:11:21,690 --> 01:11:25,470 that Martin did to get to that point, that's sort 1598 01:11:25,470 --> 01:11:26,710 of when these problems arise. 1599 01:11:26,710 --> 01:11:29,130 And I think once we have this rising-- 1600 01:11:29,130 --> 01:11:31,180 we have sort of a rising sentiment of inequality, 1601 01:11:31,180 --> 01:11:33,180 and I think it's going to be a lot easier for us 1602 01:11:33,180 --> 01:11:36,390 to say that these systems are sort of bringing things 1603 01:11:36,390 --> 01:11:37,928 that are kind of unfair on the whole. 1604 01:11:37,928 --> 01:11:40,470 And so the systems are unfair and causing these inequalities, 1605 01:11:40,470 --> 01:11:45,090 and now we can have, I'm going to say political pressure, 1606 01:11:45,090 --> 01:11:46,590 and not just kind of the social impetus 1607 01:11:46,590 --> 01:11:49,157 to really change things. 1608 01:11:49,157 --> 01:11:51,240 AUDIENCE: I'll make one quick point on this paper, 1609 01:11:51,240 --> 01:11:53,637 and then one on just inequity. 1610 01:11:53,637 --> 01:11:55,470 So what's interesting to me about this paper 1611 01:11:55,470 --> 01:11:57,930 is that there is another article that came out 1612 01:11:57,930 --> 01:12:00,070 about inequality in education. 1613 01:12:00,070 --> 01:12:02,160 It was in, like, one of the major publications, 1614 01:12:02,160 --> 01:12:04,050 like Time or Wall Street Journal. 1615 01:12:04,050 --> 01:12:06,550 And what they did is they looked at people who actually went 1616 01:12:06,550 --> 01:12:09,090 to college that were minorities or from low income 1617 01:12:09,090 --> 01:12:10,020 versus high income. 1618 01:12:10,020 --> 01:12:10,320 Yeah. 1619 01:12:10,320 --> 01:12:11,903 But what they did is they mapped where 1620 01:12:11,903 --> 01:12:13,260 they were like in the future. 1621 01:12:13,260 --> 01:12:15,780 And it turned out that it was pretty much the same, right? 1622 01:12:15,780 --> 01:12:17,558 So they don't go up the ladder. 1623 01:12:17,558 --> 01:12:19,350 And so what I question when I see something 1624 01:12:19,350 --> 01:12:22,820 like this is like, OK, well if you actually go and get 1625 01:12:22,820 --> 01:12:25,300 a college education are you better off financially? 1626 01:12:25,300 --> 01:12:27,720 And is that the metric of success? 1627 01:12:27,720 --> 01:12:30,090 You might get personal fulfillment, better lifestyle. 
1628 01:12:30,090 --> 01:12:31,890 But I think it's a flawed idea to say 1629 01:12:31,890 --> 01:12:33,390 if you go to college you will end up 1630 01:12:33,390 --> 01:12:34,910 making more money that way. 1631 01:12:34,910 --> 01:12:36,910 Because it might have been based on the paradigm 1632 01:12:36,910 --> 01:12:39,077 then in the '50s, yeah, you'd get a really good job, 1633 01:12:39,077 --> 01:12:41,867 and you'd have great benefits. 1634 01:12:41,867 --> 01:12:43,950 But to start to say that, OK, if you go to college 1635 01:12:43,950 --> 01:12:44,910 then you'll be set. 1636 01:12:44,910 --> 01:12:46,440 And then to say, OK, we just need to get everybody 1637 01:12:46,440 --> 01:12:47,400 through college. 1638 01:12:47,400 --> 01:12:50,170 That's a very big bubble mentality. 1639 01:12:50,170 --> 01:12:52,570 And I think we'll be paying very heavily for it. 1640 01:12:52,570 --> 01:12:55,777 I just recently saw an interview on like Goldman Sachs, 1641 01:12:55,777 --> 01:12:57,360 and how they saw the financial crisis. 1642 01:12:57,360 --> 01:12:59,027 And they explained it in terms of, well, 1643 01:12:59,027 --> 01:13:00,585 we built the road, right? 1644 01:13:00,585 --> 01:13:02,460 But when we were giving out these securities, 1645 01:13:02,460 --> 01:13:04,460 we didn't make the decision to put our money into them. 1646 01:13:04,460 --> 01:13:06,150 It was all these people were trying to-- 1647 01:13:06,150 --> 01:13:08,160 they thought that this was the factor to success 1648 01:13:08,160 --> 01:13:09,210 and kept buying them and buying them. 1649 01:13:09,210 --> 01:13:10,560 Every time they bought one they would crack the road, 1650 01:13:10,560 --> 01:13:11,852 crack the road, crack the road. 1651 01:13:11,852 --> 01:13:14,760 But we had the duty to give certain kinds of bonds, 1652 01:13:14,760 --> 01:13:17,880 and other people sold them again, which we didn't do. 1653 01:13:17,880 --> 01:13:21,143 And so ultimately we get-- 1654 01:13:21,143 --> 01:13:23,810 it's easy to point the finger at us because the road was broken. 1655 01:13:23,810 --> 01:13:25,602 But it's because of the actions of individuals 1656 01:13:25,602 --> 01:13:28,940 buying, and buying, and buying, going into grief. 1657 01:13:28,940 --> 01:13:31,380 And so I think that's an interesting point. 1658 01:13:31,380 --> 01:13:32,940 WILLIAM BONVILLIAN: So that is an example of a bubble 1659 01:13:32,940 --> 01:13:33,480 economy [INAUDIBLE]. 1660 01:13:33,480 --> 01:13:34,355 AUDIENCE: A bubble economy, yeah. 1661 01:13:34,355 --> 01:13:35,410 And also this point-- 1662 01:13:35,410 --> 01:13:37,160 and I think Stephanie's point-- 1663 01:13:37,160 --> 01:13:38,110 I forget which one-- 1664 01:13:38,110 --> 01:13:41,400 on, oh, the insurance policy was made by Peter Thiel 1665 01:13:41,400 --> 01:13:44,440 in 2010 about why-- 1666 01:13:44,440 --> 01:13:46,200 he questioned if you go to college 1667 01:13:46,200 --> 01:13:48,740 would you be better off? 1668 01:13:48,740 --> 01:13:51,270 And so that's not an inequality of education. 1669 01:13:51,270 --> 01:13:55,110 On the issue of inequity, what I find really interesting 1670 01:13:55,110 --> 01:13:58,290 is that we talk about equality, but we're 1671 01:13:58,290 --> 01:14:00,930 living in a relatively prosperous time. 1672 01:14:00,930 --> 01:14:03,343 Also, the US, it is a land of opportunity. 
1673 01:14:03,343 --> 01:14:05,760 And so I thought it was really interesting, or even ironic 1674 01:14:05,760 --> 01:14:07,560 during the 1% movement, that it was 1675 01:14:07,560 --> 01:14:11,370 like the 1% of the world getting angry at the 1%. 1676 01:14:11,370 --> 01:14:13,380 And it is this thing that we have 1677 01:14:13,380 --> 01:14:15,500 this kind of social contract that in the US 1678 01:14:15,500 --> 01:14:19,300 you will be able to advance and move forward. 1679 01:14:19,300 --> 01:14:21,040 But I think inequity is a big issue. 1680 01:14:21,040 --> 01:14:23,550 Like, how do you make it so that people can move up? 1681 01:14:23,550 --> 01:14:26,160 Warren Buffett talks about the lottery of life. 1682 01:14:26,160 --> 01:14:29,100 Say you were about to be born in five minutes. 1683 01:14:29,100 --> 01:14:31,490 If you had to design the world any which way, 1684 01:14:31,490 --> 01:14:32,360 and you don't know if you're going 1685 01:14:32,360 --> 01:14:33,710 to be born a woman, a man, if you're 1686 01:14:33,710 --> 01:14:35,460 going to be born disabled, if you're going 1687 01:14:35,460 --> 01:14:37,140 to be born mentally retarded. 1688 01:14:37,140 --> 01:14:37,330 AUDIENCE: With a mental disability. 1689 01:14:37,330 --> 01:14:39,150 AUDIENCE: Or if you're going to be born-- 1690 01:14:39,150 --> 01:14:40,060 you know what I mean. 1691 01:14:40,060 --> 01:14:42,060 If you would be born somewhere else in the world 1692 01:14:42,060 --> 01:14:43,200 where you don't have an opportunity 1693 01:14:43,200 --> 01:14:45,575 and you have to pick up scraps for the rest of your life. 1694 01:14:45,575 --> 01:14:47,840 What kind of world would you want to create, 1695 01:14:47,840 --> 01:14:50,168 and how do you go about building that world? 1696 01:14:50,168 --> 01:14:51,960 And I think that's a really interesting way 1697 01:14:51,960 --> 01:14:52,830 of looking at it. 1698 01:14:52,830 --> 01:14:55,550 Because really, I don't think equal outcome 1699 01:14:55,550 --> 01:14:56,670 will ever be a thing. 1700 01:14:56,670 --> 01:14:59,370 And to say that that will be a thing is a very flawed idea 1701 01:14:59,370 --> 01:15:02,940 and would just lead to self-anger, 1702 01:15:02,940 --> 01:15:04,860 and you will get jealousy of other people. 1703 01:15:04,860 --> 01:15:07,620 There are also things in life that you can't buy. 1704 01:15:07,620 --> 01:15:09,510 If you focus on building that, if you 1705 01:15:09,510 --> 01:15:12,360 build great relationships, if you're good to people, 1706 01:15:12,360 --> 01:15:13,710 they'll be good to you. 1707 01:15:13,710 --> 01:15:15,252 Because there are a lot of billionaires 1708 01:15:15,252 --> 01:15:16,770 that everybody hates. 1709 01:15:16,770 --> 01:15:19,710 You can look at the President, right? 1710 01:15:19,710 --> 01:15:22,013 And so I think that's a big issue. 1711 01:15:22,013 --> 01:15:23,180 And there's just been that-- 1712 01:15:23,180 --> 01:15:25,290 well, there's also the idea of a capitalist society 1713 01:15:25,290 --> 01:15:26,850 where we have to have these figureheads that 1714 01:15:26,850 --> 01:15:27,683 made a lot of money. 1715 01:15:27,683 --> 01:15:29,350 And we have to have the proper incentive 1716 01:15:29,350 --> 01:15:30,690 to want to become like them. 
1717 01:15:30,690 --> 01:15:31,800 And we have to push that-- 1718 01:15:31,800 --> 01:15:34,750 that's why the system is built for us to want to revere people 1719 01:15:34,750 --> 01:15:37,710 on the Forbes list, so that we can continue 1720 01:15:37,710 --> 01:15:39,270 growing capitalistically. 1721 01:15:39,270 --> 01:15:40,650 But I think definitely what will happen-- especially 1722 01:15:40,650 --> 01:15:42,108 my generation-- is people are going 1723 01:15:42,108 --> 01:15:44,490 to start to question capitalism, especially the issues 1724 01:15:44,490 --> 01:15:45,578 that Marx brought up. 1725 01:15:45,578 --> 01:15:47,370 So that we'll say, how do we build a better 1726 01:15:47,370 --> 01:15:52,090 capitalist system where we can be better humans? 1727 01:15:52,090 --> 01:15:55,162 Or there might be a lot more assholes, I don't know. 1728 01:15:55,162 --> 01:16:00,990 AUDIENCE: Also, to Martin's point that your college degree 1729 01:16:00,990 --> 01:16:03,180 may not be the right path, or it's 1730 01:16:03,180 --> 01:16:06,240 kind of hazardous to say that if we just give everyone a college 1731 01:16:06,240 --> 01:16:08,183 degree and kind of subsidize that, 1732 01:16:08,183 --> 01:16:10,350 they'll just make higher incomes and like everything 1733 01:16:10,350 --> 01:16:11,310 will be fine. 1734 01:16:11,310 --> 01:16:14,890 I think there's-- and I can't remember which slide it was. 1735 01:16:14,890 --> 01:16:18,565 Like, what percent-- like what degree level and like 1736 01:16:18,565 --> 01:16:19,690 how it affects your income. 1737 01:16:19,690 --> 01:16:20,930 AUDIENCE: The change in real earnings? 1738 01:16:20,930 --> 01:16:22,763 AUDIENCE: Yeah, the change in real earnings. 1739 01:16:24,112 --> 01:16:25,320 WILLIAM BONVILLIAN: This one? 1740 01:16:25,320 --> 01:16:27,750 AUDIENCE: This one. 1741 01:16:27,750 --> 01:16:29,030 AUDIENCE: Seems uneven. 1742 01:16:29,030 --> 01:16:30,150 AUDIENCE: It seems uneven. 1743 01:16:30,150 --> 01:16:35,520 But I think there's something to be said about kind 1744 01:16:35,520 --> 01:16:37,590 of generational mobility here. 1745 01:16:37,590 --> 01:16:41,070 So like those who had a bachelor's degree coming out 1746 01:16:41,070 --> 01:16:43,230 in maybe 1970 or a little bit earlier, 1747 01:16:43,230 --> 01:16:46,770 that income level was relative to the time. 1748 01:16:46,770 --> 01:16:48,930 Like, it allowed me to live comfortably, 1749 01:16:48,930 --> 01:16:51,153 whereas my bachelor's degree now, 1750 01:16:51,153 --> 01:16:53,070 and whatever bump in income level I get, like, 1751 01:16:53,070 --> 01:16:54,220 am I living comfortably? 1752 01:16:54,220 --> 01:16:57,000 Am I going to be able to subsist? 1753 01:16:57,000 --> 01:16:58,977 And that might be like an indicator 1754 01:16:58,977 --> 01:17:00,810 where it's like, yeah, it's unequal in here, 1755 01:17:00,810 --> 01:17:03,360 but like there's actually a bigger social problem here. 1756 01:17:03,360 --> 01:17:06,990 So even though I might be making more than I was in 1990, 1757 01:17:06,990 --> 01:17:11,160 like is my 1990 salary, at that level is it enough for me 1758 01:17:11,160 --> 01:17:12,030 to live in 1990? 1759 01:17:12,030 --> 01:17:14,060 And like, is my 2013 salary, is that enough? 1760 01:17:14,060 --> 01:17:15,727 AUDIENCE: These are real earnings, right? 1761 01:17:15,727 --> 01:17:18,100 So these are inflation adjusted. 1762 01:17:18,100 --> 01:17:18,600 Yeah. 1763 01:17:18,600 --> 01:17:19,225 AUDIENCE: Yeah. 
1764 01:17:19,225 --> 01:17:21,355 Inflation adjusted, but like cost of living. 1765 01:17:21,355 --> 01:17:22,730 AUDIENCE: There's also one factor 1766 01:17:22,730 --> 01:17:24,240 I think is really important. 1767 01:17:24,240 --> 01:17:26,550 And I think the recent election showcased-- 1768 01:17:26,550 --> 01:17:27,250 were you done with your point? 1769 01:17:27,250 --> 01:17:27,875 AUDIENCE: Yeah. 1770 01:17:27,875 --> 01:17:28,937 No, I'm done. 1771 01:17:28,937 --> 01:17:31,520 AUDIENCE: Which is, I think the real issue happening right now 1772 01:17:31,520 --> 01:17:34,640 is that we are in a stagnation in terms of actual real-term 1773 01:17:34,640 --> 01:17:37,220 prospects-- like real prosperity growth. 1774 01:17:37,220 --> 01:17:41,090 The numbers can say we grew, but everything costs a lot more. 1775 01:17:41,090 --> 01:17:44,900 The way the system is created is to show that, oh, 1776 01:17:44,900 --> 01:17:45,860 we're doing well. 1777 01:17:45,860 --> 01:17:47,690 I think like the poverty line, right, 1778 01:17:47,690 --> 01:17:49,708 is flawed the way we're calculating it now, 1779 01:17:49,708 --> 01:17:51,500 because it's not really showing what people 1780 01:17:51,500 --> 01:17:52,820 can buy with what they have. 1781 01:17:52,820 --> 01:17:55,070 And that's how we create it. 1782 01:17:55,070 --> 01:17:57,620 Because everything costs a lot more, but they've adjusted it 1783 01:17:57,620 --> 01:17:59,453 and they've fudged the numbers in such a way 1784 01:17:59,453 --> 01:18:01,310 that it seems to be kind of a big picture. 1785 01:18:01,310 --> 01:18:01,940 WILLIAM BONVILLIAN: So we're going to-- 1786 01:18:01,940 --> 01:18:03,440 Martin, we're going to get into this 1787 01:18:03,440 --> 01:18:05,232 when we get to the last reading and lay out 1788 01:18:05,232 --> 01:18:07,220 this whole issue of secular stagnation, 1789 01:18:07,220 --> 01:18:10,130 and what the implications are of that historically 1790 01:18:10,130 --> 01:18:13,430 low productivity rate we're at, and the historically low investment 1791 01:18:13,430 --> 01:18:15,260 rates that we're at. 1792 01:18:15,260 --> 01:18:16,610 And that may be-- 1793 01:18:16,610 --> 01:18:18,440 I think you're about to drive at this-- 1794 01:18:18,440 --> 01:18:23,280 that may be a really big, near-term problem. 1795 01:18:23,280 --> 01:18:25,820 Whereas the technological job displacement may be a little 1796 01:18:25,820 --> 01:18:26,665 further off. 1797 01:18:26,665 --> 01:18:27,290 AUDIENCE: Yeah. 1798 01:18:27,290 --> 01:18:27,930 AUDIENCE: Well, yeah, I was going 1799 01:18:27,930 --> 01:18:29,360 to finish with, when people don't see 1800 01:18:29,360 --> 01:18:31,040 like what is going to be the next step, 1801 01:18:31,040 --> 01:18:32,150 they kind of start to lose hope and they 1802 01:18:32,150 --> 01:18:34,400 start fighting over things that really are non-issues. 1803 01:18:34,400 --> 01:18:36,858 This happens a lot-- it's a common organizational thing. 1804 01:18:36,858 --> 01:18:38,240 When you lose your vision, people 1805 01:18:38,240 --> 01:18:39,615 start fighting-- really, like, 1806 01:18:39,615 --> 01:18:41,550 very intensely-- over things that 1807 01:18:41,550 --> 01:18:43,640 maybe don't matter as much, or wouldn't have been 1808 01:18:43,640 --> 01:18:45,483 an issue when they had vision. 1809 01:18:45,483 --> 01:18:47,150 And I think that's one of the reasons we 1810 01:18:47,150 --> 01:18:49,400 see such intense infighting right now. 
1811 01:18:49,400 --> 01:18:51,928 Because we're really starting to question 1812 01:18:51,928 --> 01:18:53,720 like what will-- well, not really question, 1813 01:18:53,720 --> 01:18:54,960 but we should just see America as it is, 1814 01:18:54,960 --> 01:18:56,602 and no one has said, this is what America 1815 01:18:56,602 --> 01:18:58,060 will be 50 years from now, and we'll 1816 01:18:58,060 --> 01:19:00,020 start working towards that. 1817 01:19:00,020 --> 01:19:03,140 And I think that's a big factor that I saw in the last four 1818 01:19:03,140 --> 01:19:04,262 years. 1819 01:19:04,262 --> 01:19:05,720 WILLIAM BONVILLIAN: Sanam, you want 1820 01:19:05,720 --> 01:19:09,290 to give us a closing point after all of our discussions on this? 1821 01:19:09,290 --> 01:19:11,330 AUDIENCE: There's a slight point, I think, 1822 01:19:11,330 --> 01:19:12,872 that we might be able to talk about later, 1823 01:19:12,872 --> 01:19:14,690 but like the GI Bill as like-- 1824 01:19:14,690 --> 01:19:17,330 I wanted to mention it as like the last big, kind 1825 01:19:17,330 --> 01:19:19,730 of governmentally-subsidized, like, real movement 1826 01:19:19,730 --> 01:19:24,050 towards kind of educating en masse a large subset 1827 01:19:24,050 --> 01:19:28,370 of the population for the sake of education 1828 01:19:28,370 --> 01:19:29,870 and all the other reasons. 1829 01:19:29,870 --> 01:19:32,670 But there's [INAUDIBLE] if I didn't say, 1830 01:19:32,670 --> 01:19:36,392 there's a whole subset of like social inequities that exist 1831 01:19:36,392 --> 01:19:38,725 within the GI Bill that we don't have to get into, but-- 1832 01:19:38,725 --> 01:19:40,267 WILLIAM BONVILLIAN: But you're right. 1833 01:19:40,267 --> 01:19:43,490 I mean, the two mass higher education 1834 01:19:43,490 --> 01:19:46,760 bills that the US did-- so the Land Grant College 1835 01:19:46,760 --> 01:19:50,750 Act that created, in effect, the institutional base, and the GI 1836 01:19:50,750 --> 01:19:54,560 Bill that took this huge population returning from World 1837 01:19:54,560 --> 01:19:57,230 War II and stuck them in college, 1838 01:19:57,230 --> 01:20:03,320 those created gigantic returns for the society. 1839 01:20:03,320 --> 01:20:06,710 So I understand your point, Martin, 1840 01:20:06,710 --> 01:20:09,680 which is that just getting a college degree 1841 01:20:09,680 --> 01:20:14,130 doesn't necessarily fit you for a life worth living. 1842 01:20:14,130 --> 01:20:15,780 I think we will all agree on that. 1843 01:20:15,780 --> 01:20:17,680 But it sure helps. 1844 01:20:17,680 --> 01:20:18,847 It sure moves you a step up. 1845 01:20:18,847 --> 01:20:20,305 AUDIENCE: Better than [INAUDIBLE]. 1846 01:20:20,305 --> 01:20:22,703 WILLIAM BONVILLIAN: And gets you closer in range. 1847 01:20:22,703 --> 01:20:24,870 Sanam, how about a closing thought from you on this, 1848 01:20:24,870 --> 01:20:26,187 and we'll take a short break. 1849 01:20:26,187 --> 01:20:28,520 AUDIENCE: I think it's a lot of good points, definitely. 1850 01:20:28,520 --> 01:20:31,730 And I think that when we're thinking about policy 1851 01:20:31,730 --> 01:20:34,190 and how to move forward, the discussion 1852 01:20:34,190 --> 01:20:38,150 about systems of privilege and exclusion 1853 01:20:38,150 --> 01:20:39,510 is really important as well. 
1854 01:20:39,510 --> 01:20:41,510 And also, to think about kind of like what we can 1855 01:20:41,510 --> 01:20:44,434 define as success, both individually and as a system, 1856 01:20:44,434 --> 01:20:48,760 is an interesting and problematic thing 1857 01:20:48,760 --> 01:20:51,580 that we should continue to talk about.