1 00:00:00,090 --> 00:00:02,430 The following content is provided under a Creative 2 00:00:02,430 --> 00:00:03,820 Commons license. 3 00:00:03,820 --> 00:00:06,030 Your support will help MIT OpenCourseWare 4 00:00:06,030 --> 00:00:10,120 continue to offer high quality educational resources for free. 5 00:00:10,120 --> 00:00:12,660 To make a donation or to view additional materials 6 00:00:12,660 --> 00:00:16,620 from hundreds of MIT courses, visit MIT OpenCourseWare 7 00:00:16,620 --> 00:00:17,992 at ocw.mit.edu. 8 00:00:20,698 --> 00:00:22,240 WILLIAM BONVILLIAN: All right, Steph. 9 00:00:22,240 --> 00:00:27,480 Let's go into Claudia Goldin and Lawrence Katz. 10 00:00:27,480 --> 00:00:30,060 STEPH: Obviously, Goldin and Katz 11 00:00:30,060 --> 00:00:37,690 focus on the big picture issue, which 12 00:00:37,690 --> 00:00:43,410 is what you have drawn for us in the big picture-- 13 00:00:43,410 --> 00:00:45,620 pun semi-intended. 14 00:00:45,620 --> 00:00:47,995 And a consideration that I had was 15 00:00:47,995 --> 00:00:50,620 on the research and development process, which is not something 16 00:00:50,620 --> 00:00:52,510 that they touched on here, which could maybe 17 00:00:52,510 --> 00:00:54,550 provide a short-term fix. 18 00:00:54,550 --> 00:00:57,010 And I'd love to hear your input. 19 00:00:57,010 --> 00:00:59,620 Is there, in the research and development process, 20 00:00:59,620 --> 00:01:03,040 people who openly talk about the ways in which their technology 21 00:01:03,040 --> 00:01:04,930 may cause social disruption? 22 00:01:04,930 --> 00:01:07,050 And if so, can you identify someone 23 00:01:07,050 --> 00:01:08,580 on a project that you've worked on 24 00:01:08,580 --> 00:01:10,000 or a firm that you know of that's 25 00:01:10,000 --> 00:01:13,510 concerned about the social or economic implications 26 00:01:13,510 --> 00:01:15,910 of the technologies that they are building? 27 00:01:15,910 --> 00:01:17,810 RASHID: Don't VCs do this all the time? 
28 00:01:17,810 --> 00:01:18,630 They decide-- 29 00:01:18,630 --> 00:01:19,630 MARTIN: No, they don't. 30 00:01:19,630 --> 00:01:20,630 [LAUGHTER] 31 00:01:20,630 --> 00:01:22,662 RASHID: But they might decide, this is my niche, 32 00:01:22,662 --> 00:01:24,120 this is what I haven't seen before, 33 00:01:24,120 --> 00:01:28,480 and these are the reasons why this particular technology 34 00:01:28,480 --> 00:01:30,723 would take advantage of this unseen market. 35 00:01:30,723 --> 00:01:32,140 And so I don't know if that's what 36 00:01:32,140 --> 00:01:33,490 you're trying to get at with-- 37 00:01:33,490 --> 00:01:33,990 MARTIN: No. 38 00:01:33,990 --> 00:01:36,207 I think her point was like, OK, if we do this, 39 00:01:36,207 --> 00:01:37,540 it will change society this way. 40 00:01:37,540 --> 00:01:39,500 Like Uber said, OK, if we do this, 41 00:01:39,500 --> 00:01:41,140 this technology will get rid of all these cab drivers, 42 00:01:41,140 --> 00:01:42,973 and there's going to be a social disruption. 43 00:01:42,973 --> 00:01:44,255 Is that your kind of angle? 44 00:01:44,255 --> 00:01:45,130 STEPH: Sort of, yeah. 45 00:01:45,130 --> 00:01:45,640 MARTIN: OK. 46 00:01:45,640 --> 00:01:45,940 Yeah. 47 00:01:45,940 --> 00:01:47,200 I think your angle was more like, 48 00:01:47,200 --> 00:01:48,658 oh, this is going to be a huge hit, 49 00:01:48,658 --> 00:01:51,870 and it's really going to change society, right? 50 00:01:51,870 --> 00:01:52,370 RASHID: Yes. 51 00:01:52,370 --> 00:01:53,190 MARTIN: OK, cool. 
52 00:01:53,190 --> 00:01:56,770 STEPH: Right, so I guess in my consideration of business 53 00:01:56,770 --> 00:02:00,850 and consumer markets, and in some of the research 54 00:02:00,850 --> 00:02:03,827 that I'm conducting for my paper on autonomous vehicles, one 55 00:02:03,827 --> 00:02:05,410 of the questions that I've been tasked 56 00:02:05,410 --> 00:02:07,510 to ask by anthropologists, to firms 57 00:02:07,510 --> 00:02:09,220 that I might be interviewing, or even 58 00:02:09,220 --> 00:02:12,640 as I'm going over literature base and public releases 59 00:02:12,640 --> 00:02:17,680 to the press, is essentially, who on the staff 60 00:02:17,680 --> 00:02:21,640 is in charge of answering questions or raising 61 00:02:21,640 --> 00:02:24,400 considerations about social disruption 62 00:02:24,400 --> 00:02:27,310 or economic disruption that your technology might cause? 63 00:02:27,310 --> 00:02:29,940 And if a firm can't answer that, then you have your answer-- 64 00:02:29,940 --> 00:02:30,690 that they're not-- 65 00:02:30,690 --> 00:02:33,065 that they don't consider that as a part of their research 66 00:02:33,065 --> 00:02:34,065 and development process. 67 00:02:34,065 --> 00:02:36,315 And I think it's possible that some people would argue 68 00:02:36,315 --> 00:02:39,090 that that's beyond the scope of what the firm should be doing, 69 00:02:39,090 --> 00:02:39,590 right? 70 00:02:39,590 --> 00:02:41,680 They should be focusing on the technology. 71 00:02:41,680 --> 00:02:44,477 They perhaps should not have an ethicist that they should-- 72 00:02:44,477 --> 00:02:46,310 or perhaps they should just have an ethicist 73 00:02:46,310 --> 00:02:48,018 that they consult with, but it should not 74 00:02:48,018 --> 00:02:50,947 be someone who's formally a part of their staff. 75 00:02:50,947 --> 00:02:52,780 So I'd love to hear your perspective on this 76 00:02:52,780 --> 00:02:54,220 as perhaps a short-term fix.
77 00:02:54,220 --> 00:02:58,210 As we sort of promote technological advances, 78 00:02:58,210 --> 00:03:00,970 should there be someone who's constantly raising a flag 79 00:03:00,970 --> 00:03:03,700 and saying, ha, perhaps this might not be a good idea 80 00:03:03,700 --> 00:03:10,265 to deploy in the time frame that we have set for ourselves? 81 00:03:10,265 --> 00:03:12,890 SPEAKER 1: So I'm working on the mobility of the future project 82 00:03:12,890 --> 00:03:13,940 as part of my thesis. 83 00:03:13,940 --> 00:03:17,010 And so a lot of it is with autonomous vehicles. 84 00:03:17,010 --> 00:03:18,500 And it does seem-- 85 00:03:18,500 --> 00:03:21,347 well, first of all, with that example specifically, 86 00:03:21,347 --> 00:03:23,180 a lot of the companies who are working on it 87 00:03:23,180 --> 00:03:25,020 are very secretive about what they're doing. 88 00:03:25,020 --> 00:03:27,560 So it's not very clear as to what's 89 00:03:27,560 --> 00:03:29,660 going on within the companies. 90 00:03:29,660 --> 00:03:33,510 But outside of them, at least with this example, 91 00:03:33,510 --> 00:03:37,130 there is a lot of work and awareness 92 00:03:37,130 --> 00:03:39,660 about potential disruptions. 93 00:03:39,660 --> 00:03:42,260 I question whether that's always been the case or kind 94 00:03:42,260 --> 00:03:44,960 of the recent disruptions have spurred people 95 00:03:44,960 --> 00:03:48,020 to think about this more. 96 00:03:48,020 --> 00:03:50,067 So there's a group at the MIT AgeLab, 97 00:03:50,067 --> 00:03:51,650 I don't know if you've talked to them. 98 00:03:51,650 --> 00:03:52,620 STEPH: The what lab? 99 00:03:52,620 --> 00:03:53,540 SPEAKER 1: AgeLab? 100 00:03:53,540 --> 00:03:53,840 Age. 101 00:03:53,840 --> 00:03:54,340 STEPH: Age? 102 00:03:54,340 --> 00:03:54,883 Like, A-G-E? 103 00:03:54,883 --> 00:03:55,550 SPEAKER 1: Yeah. 104 00:03:55,550 --> 00:03:56,652 MARTIN: What do they do? 
105 00:03:56,652 --> 00:03:58,610 SPEAKER 1: Well, they do a lot of stuff related 106 00:03:58,610 --> 00:04:00,560 to how people are aging. 107 00:04:00,560 --> 00:04:02,720 But they also have a big group that's 108 00:04:02,720 --> 00:04:06,800 working on the autonomous vehicles, 109 00:04:06,800 --> 00:04:10,280 less so on the technology side, but how it's affecting people. 110 00:04:10,280 --> 00:04:13,700 So they have a lot of experiments 111 00:04:13,700 --> 00:04:16,130 where they're letting people drive autonomous cars 112 00:04:16,130 --> 00:04:18,589 and recording them with their consent, 113 00:04:18,589 --> 00:04:21,079 and they're trying to get knowledge about how people are 114 00:04:21,079 --> 00:04:23,300 interacting with these technologies, what 115 00:04:23,300 --> 00:04:26,300 changes it could have, what negative changes it could have. 116 00:04:26,300 --> 00:04:29,125 So they showed us a video of this guy who was looking down 117 00:04:29,125 --> 00:04:30,500 at his phone for like two minutes 118 00:04:30,500 --> 00:04:34,140 while driving on the Mass Pike, which is terrifying. 119 00:04:34,140 --> 00:04:38,990 And so as we transition from a not quite autonomous vehicle 120 00:04:38,990 --> 00:04:41,240 to a fully one, he was talking about how 121 00:04:41,240 --> 00:04:43,490 he expects to see more accidents in the near future, 122 00:04:43,490 --> 00:04:44,850 until we get to the point. 123 00:04:44,850 --> 00:04:48,260 So I think people are thinking about how 124 00:04:48,260 --> 00:04:51,110 these technologies and their developments 125 00:04:51,110 --> 00:04:53,553 affect things other than just how we're 126 00:04:53,553 --> 00:04:54,720 getting from place to place. 127 00:04:54,720 --> 00:04:56,270 It's not such a simple story. 128 00:04:56,270 --> 00:04:58,392 STEPH: And would you mind elaborating? 
129 00:04:58,392 --> 00:05:00,100 The reason I bring up this point, and why 130 00:05:00,100 --> 00:05:01,642 I've been thinking about it so fully, 131 00:05:01,642 --> 00:05:05,390 is because Bill assessed that a very important component 132 00:05:05,390 --> 00:05:07,790 of driverless vehicles is the impact that it's 133 00:05:07,790 --> 00:05:10,310 going to have on the workforce, on truck drivers, 134 00:05:10,310 --> 00:05:13,665 and on delivery laborers. 135 00:05:13,665 --> 00:05:16,040 Do you think that those are considerations that have been 136 00:05:16,040 --> 00:05:19,280 raised vis-à-vis driverless cars? 137 00:05:19,280 --> 00:05:21,950 And do you think that there is maybe a role for someone 138 00:05:21,950 --> 00:05:26,240 to be highlighting that as an issue within each innovation 139 00:05:26,240 --> 00:05:27,493 team? 140 00:05:27,493 --> 00:05:28,160 SPEAKER 1: Yeah. 141 00:05:28,160 --> 00:05:31,215 I feel like that hasn't been as much of a focus in our work. 142 00:05:31,215 --> 00:05:32,840 People talk about it, but no one really 143 00:05:32,840 --> 00:05:34,800 talks about-- they're like, that's going to be a problem. 144 00:05:34,800 --> 00:05:35,370 [LAUGHTER] 145 00:05:35,370 --> 00:05:37,550 And that's the end of the discussion 146 00:05:37,550 --> 00:05:38,960 from what I've seen so far. 147 00:05:38,960 --> 00:05:42,800 But I do think that, especially after this election, 148 00:05:42,800 --> 00:05:47,492 people are talking more and more about disruptive technologies. 149 00:05:47,492 --> 00:05:49,200 I hadn't heard people talk about it much. 150 00:05:49,200 --> 00:05:50,742 But then I was listening to a podcast 151 00:05:50,742 --> 00:05:54,342 today where someone was like, I noticed in Logan the movie-- 152 00:05:54,342 --> 00:05:56,050 CHLOE: I was just about to bring that up. 153 00:05:56,050 --> 00:05:56,290 Oh my gosh.
154 00:05:56,290 --> 00:05:57,910 SPEAKER 2: Wait, if you spoil it, I swear-- 155 00:05:57,910 --> 00:05:59,120 SPEAKER 1: I haven't seen it. 156 00:05:59,120 --> 00:05:59,740 [LAUGHTER] 157 00:05:59,740 --> 00:06:01,220 SPEAKER 2: I haven't seen it. 158 00:06:01,220 --> 00:06:02,220 MARTIN: Cover your ears. 159 00:06:02,220 --> 00:06:02,986 STEPH: Me either. 160 00:06:02,986 --> 00:06:05,060 SPEAKER 1: In the movie Logan, apparently-- 161 00:06:05,060 --> 00:06:06,550 I haven't seen it, you might do a better job 162 00:06:06,550 --> 00:06:08,480 describing-- but there's a scene where there's 163 00:06:08,480 --> 00:06:09,868 all these trucks around. 164 00:06:09,868 --> 00:06:11,660 And none of them have cabs, because they're 165 00:06:11,660 --> 00:06:12,452 driving themselves. 166 00:06:12,452 --> 00:06:14,243 And there was a senator talking about this. 167 00:06:14,243 --> 00:06:16,350 And she's like, wow, this really could happen. 168 00:06:16,350 --> 00:06:18,770 And then what happens to all of these truck drivers? 169 00:06:18,770 --> 00:06:20,930 I never thought about it until I saw Logan. 170 00:06:20,930 --> 00:06:23,780 So I think that's interesting how it's permeating through-- 171 00:06:23,780 --> 00:06:26,510 WILLIAM BONVILLIAN: So I want to hold this discussion 172 00:06:26,510 --> 00:06:29,690 until next week's class, which is about the future of work. 173 00:06:29,690 --> 00:06:31,280 I mean, it's exactly like-- 174 00:06:31,280 --> 00:06:32,310 it's about this problem. 175 00:06:32,310 --> 00:06:34,772 So we've gotten it on the table. 176 00:06:34,772 --> 00:06:36,980 Everybody's going to have a chance to think about it. 177 00:06:36,980 --> 00:06:37,190 SPEAKER 1: Watch Logan. 178 00:06:37,190 --> 00:06:39,523 WILLIAM BONVILLIAN: And you can go back and watch Logan. 179 00:06:39,523 --> 00:06:41,180 You can recount your favorite stories. 
180 00:06:41,180 --> 00:06:43,310 But, Steph, do bring this up again 181 00:06:43,310 --> 00:06:45,620 when we go after these issues next week. 182 00:06:45,620 --> 00:06:48,680 But focus us more now on Katz and Goldin and Baumol, 183 00:06:48,680 --> 00:06:49,420 if you would. 184 00:06:49,420 --> 00:06:49,920 STEPH: OK. 185 00:06:49,920 --> 00:06:51,170 WILLIAM BONVILLIAN: Thank you. 186 00:06:51,170 --> 00:06:53,420 STEPH: I have been redirected. 187 00:06:53,420 --> 00:06:54,650 WILLIAM BONVILLIAN: I'm not trying to silence you, Steph. 188 00:06:54,650 --> 00:06:55,370 STEPH: No, no, absolutely not. 189 00:06:55,370 --> 00:06:58,280 WILLIAM BONVILLIAN: I promise you airtime in the next class. 190 00:06:58,280 --> 00:06:59,163 [LAUGHTER] 191 00:06:59,163 --> 00:07:00,080 STEPH: Well, I think-- 192 00:07:03,160 --> 00:07:07,430 I guess one of my critiques of my social science education-- 193 00:07:07,430 --> 00:07:11,170 and I think this is generally my critique of social science 194 00:07:11,170 --> 00:07:12,940 education generally-- 195 00:07:12,940 --> 00:07:16,670 is that there are benefits to talking about an issue, 196 00:07:16,670 --> 00:07:20,320 but there's only so much that public discourse can get you, 197 00:07:20,320 --> 00:07:22,220 especially in a round table setting, 198 00:07:22,220 --> 00:07:24,660 in an academic environment, where most of us 199 00:07:24,660 --> 00:07:27,520 start to feel like, for the most part, 200 00:07:27,520 --> 00:07:30,370 we are on the same side of an issue. 201 00:07:30,370 --> 00:07:32,797 So I struggle to come up with good questions 202 00:07:32,797 --> 00:07:34,880 and think about what might actually be meaningful, 203 00:07:34,880 --> 00:07:37,930 as we're having a conversation about an issue as large and as 204 00:07:37,930 --> 00:07:42,280 difficult to grapple with as inequality, especially 205 00:07:42,280 --> 00:07:46,390 as Katz and Goldin purported.
206 00:07:46,390 --> 00:07:52,390 But I think one of you talked about whether, 207 00:07:52,390 --> 00:07:55,780 relative to Baumol's analysis, America should shift 208 00:07:55,780 --> 00:08:01,030 towards a focus on primarily promoting breakthrough 209 00:08:01,030 --> 00:08:04,420 innovations, where we can still maintain competitive advantages 210 00:08:04,420 --> 00:08:06,190 in the research and development system 211 00:08:06,190 --> 00:08:07,840 rather than even concerning ourselves 212 00:08:07,840 --> 00:08:11,710 at all with the fixing of the education system 213 00:08:11,710 --> 00:08:14,500 or the fixing of the R&D pipeline. 214 00:08:14,500 --> 00:08:17,020 And this sort of hints at a question that Matt 215 00:08:17,020 --> 00:08:19,430 had asked earlier in the semester, where he said, 216 00:08:19,430 --> 00:08:24,160 is the onus on us, as a civil society, to educate everyone? 217 00:08:24,160 --> 00:08:25,570 Is that even advantageous to us? 218 00:08:25,570 --> 00:08:27,197 Do people even want to be educated? 219 00:08:27,197 --> 00:08:28,280 And what are the benefits? 220 00:08:28,280 --> 00:08:32,350 And making this resource available to all, 221 00:08:32,350 --> 00:08:34,720 when it seems like we're already failing 222 00:08:34,720 --> 00:08:37,419 spectacularly at this experiment that is 223 00:08:37,419 --> 00:08:40,059 the United States of America. 224 00:08:40,059 --> 00:08:41,559 MARTIN: I don't think we're failing. 225 00:08:41,559 --> 00:08:42,226 SPEAKER 2: Yeah. 226 00:08:42,226 --> 00:08:44,860 MARTIN: It's 300 years old, and that's relatively short. 227 00:08:44,860 --> 00:08:47,810 We're still figuring our stuff out, in terms of nations, 228 00:08:47,810 --> 00:08:48,310 right? 229 00:08:48,310 --> 00:08:50,770 SPEAKER 2: Yeah, I'd say we did pretty well. 230 00:08:50,770 --> 00:08:52,690 We're not great, not anymore. 231 00:08:52,690 --> 00:08:54,816 But-- so let's make it great again.
232 00:08:54,816 --> 00:08:55,316 [LAUGHS] 233 00:08:55,316 --> 00:08:56,730 MARTIN: I think it's kind of like bubble thinking, 234 00:08:56,730 --> 00:08:58,930 like people losing their minds in sensationalism. 235 00:08:58,930 --> 00:08:59,930 I think it's more like-- 236 00:08:59,930 --> 00:09:01,722 because their argument's pretty much like-- 237 00:09:01,722 --> 00:09:03,340 it's like saying, oh, yeah, we have 238 00:09:03,340 --> 00:09:05,470 a McDonald's, everything's set up to do McDonald's. 239 00:09:05,470 --> 00:09:06,360 But look at Chipotle. 240 00:09:06,360 --> 00:09:08,870 How do we make McDonald's, the burger, be Chipotle, right? 241 00:09:08,870 --> 00:09:10,930 It's like you really can't change one system 242 00:09:10,930 --> 00:09:12,160 to a whole new one. 243 00:09:12,160 --> 00:09:14,243 What you do is you have to make Chipotle, and then 244 00:09:14,243 --> 00:09:15,600 make Chipotle a poppin' chain. 245 00:09:15,600 --> 00:09:16,100 [LAUGHTER] 246 00:09:16,100 --> 00:09:17,808 You have to go and make the next system-- 247 00:09:17,808 --> 00:09:18,720 SPEAKER 2: Poppin'? 248 00:09:18,720 --> 00:09:19,135 MARTIN: Yeah. 249 00:09:19,135 --> 00:09:20,960 WILLIAM BONVILLIAN: I'm going to use this metaphor, Martin. 250 00:09:20,960 --> 00:09:21,820 It's good. 251 00:09:21,820 --> 00:09:22,950 MARTIN: I mean, because people are freaking out. 252 00:09:22,950 --> 00:09:25,270 It's like, this hamburger just can't be a burrito. 253 00:09:25,270 --> 00:09:27,250 And I'm like, why? 254 00:09:27,250 --> 00:09:28,630 Well, the whole infrastructure is 255 00:09:28,630 --> 00:09:30,195 made to make hamburgers, right? 
256 00:09:30,195 --> 00:09:32,487 STEPH: Well, see, and I think that at the heart of what 257 00:09:32,487 --> 00:09:34,990 you're saying, to sort of elucidate where I'm coming 258 00:09:34,990 --> 00:09:37,380 from in asking this question, there 259 00:09:37,380 --> 00:09:41,530 is an education professor who I grew close to during my time 260 00:09:41,530 --> 00:09:45,430 at Wellesley, who in his first year of his PhD program, 261 00:09:45,430 --> 00:09:49,090 his advisor posed to everyone, if you had the option, 262 00:09:49,090 --> 00:09:52,150 would you blow up the education system and start again 263 00:09:52,150 --> 00:09:55,240 or would you change it incrementally? 264 00:09:55,240 --> 00:09:59,450 And his answer was blow it up and start over. 265 00:09:59,450 --> 00:10:00,860 But we don't have that option. 266 00:10:00,860 --> 00:10:04,247 We don't have the privilege of bringing in the Chipotles 267 00:10:04,247 --> 00:10:05,830 and then seeing if they work, and then 268 00:10:05,830 --> 00:10:08,620 adapting all the McDonald's to suit that consumer market, 269 00:10:08,620 --> 00:10:10,000 right? 270 00:10:10,000 --> 00:10:12,415 We very much recognize that students are humans 271 00:10:12,415 --> 00:10:17,590 and that humans have a very different ethical consideration 272 00:10:17,590 --> 00:10:19,690 than do technologies or other kinds of research 273 00:10:19,690 --> 00:10:21,530 and development projects. 274 00:10:21,530 --> 00:10:25,600 So if we are to meet the kinds of goals 275 00:10:25,600 --> 00:10:28,330 that Goldin and Katz and Baumol and Freeman 276 00:10:28,330 --> 00:10:33,220 are advocating for in improving our education system, 277 00:10:33,220 --> 00:10:36,100 how do we deal with the fact that these are human lives 278 00:10:36,100 --> 00:10:39,578 with whom we are experimenting? 279 00:10:39,578 --> 00:10:40,995 MARTIN: I think it's a good point, 280 00:10:40,995 --> 00:10:42,330 but I'm not talking about that.
281 00:10:42,330 --> 00:10:43,830 What I'm trying to say is like, what 282 00:10:43,830 --> 00:10:45,038 are we defining as education? 283 00:10:45,038 --> 00:10:46,500 Are we defining education as I want 284 00:10:46,500 --> 00:10:49,350 people to know math and science really well to communicate it, 285 00:10:49,350 --> 00:10:51,350 to think innovatively in it? 286 00:10:51,350 --> 00:10:53,700 Or are we defining it as like, we're going to get out 287 00:10:53,700 --> 00:10:55,440 of a four-year college, right? 288 00:10:55,440 --> 00:10:56,970 So I think a lot of this argument is, 289 00:10:56,970 --> 00:10:58,890 the current infrastructure or the relationship 290 00:10:58,890 --> 00:11:01,130 for the last 100 years has been colleges. 291 00:11:01,130 --> 00:11:03,188 So we define education as graduating from a college. 292 00:11:03,188 --> 00:11:05,730 But if you make it as, I want to make the best, most educated 293 00:11:05,730 --> 00:11:08,970 person with all my tools right now, 294 00:11:08,970 --> 00:11:12,090 and being able to import them into society well, 295 00:11:12,090 --> 00:11:13,560 in terms of into jobs really well, 296 00:11:13,560 --> 00:11:15,510 how would we make that ideal system? 297 00:11:15,510 --> 00:11:16,780 You start making that system. 298 00:11:16,780 --> 00:11:19,860 And it's not going to change overnight, right? 299 00:11:19,860 --> 00:11:22,830 You don't grow up two years old and the next day you're-- 300 00:11:22,830 --> 00:11:25,340 I mean, there is the movie 13 Going on 30, 301 00:11:25,340 --> 00:11:28,012 but it's very rare, right? 302 00:11:28,012 --> 00:11:29,400 [LAUGHTER] 303 00:11:29,400 --> 00:11:31,950 So I think it's you start making the new system 304 00:11:31,950 --> 00:11:33,845 and you kind of start to pour over.
305 00:11:33,845 --> 00:11:35,220 There's a whole issue with trying 306 00:11:35,220 --> 00:11:37,350 to change a current system, because they're already 307 00:11:37,350 --> 00:11:38,200 playing-- 308 00:11:38,200 --> 00:11:40,735 well, this happens with later, mature markets. 309 00:11:40,735 --> 00:11:43,110 What happens is there's going to be three or four or five 310 00:11:43,110 --> 00:11:43,630 players, right? 311 00:11:43,630 --> 00:11:45,088 So when we talk about institutions, 312 00:11:45,088 --> 00:11:46,740 we have Harvard, MIT, Stanford. 313 00:11:46,740 --> 00:11:47,850 They're trying to fight against each other, 314 00:11:47,850 --> 00:11:48,930 because they don't want to-- 315 00:11:48,930 --> 00:11:50,972 it's hard to be different when we have to compete 316 00:11:50,972 --> 00:11:52,350 against someone like that. 317 00:11:52,350 --> 00:11:54,507 And so they have to play their games. 318 00:11:54,507 --> 00:11:57,090 The reason an innovator can come in is because they don't have 319 00:11:57,090 --> 00:11:59,730 to play that keeping-up-with-the-Joneses 320 00:11:59,730 --> 00:12:01,410 game. 321 00:12:01,410 --> 00:12:02,090 That's my point. 322 00:12:02,090 --> 00:12:03,460 SPEAKER 2: You're throwing out a lot of metaphors. 323 00:12:03,460 --> 00:12:03,960 [LAUGHTER] 324 00:12:03,960 --> 00:12:05,900 STEPH: Yeah, [INAUDIBLE], Chipotle-- 325 00:12:05,900 --> 00:12:07,070 [LAUGHTER] 326 00:12:07,070 --> 00:12:08,920 LILY: 13 Going on 30. 327 00:12:08,920 --> 00:12:10,920 MARTIN: So there's a whole book on communication 328 00:12:10,920 --> 00:12:12,030 where it's like, if you make-- 329 00:12:12,030 --> 00:12:13,680 I could explain it in a super technical way, 330 00:12:13,680 --> 00:12:14,410 but you're not going to get it. 331 00:12:14,410 --> 00:12:15,210 And I can say something like that, 332 00:12:15,210 --> 00:12:16,543 and you'll be like, OK, kind of.
333 00:12:16,543 --> 00:12:17,918 WILLIAM BONVILLIAN: Well, we're-- 334 00:12:17,918 --> 00:12:18,585 [LAUGHTER] 335 00:12:18,585 --> 00:12:20,752 MARTIN: The other thing is that you won't forget it, 336 00:12:20,752 --> 00:12:23,680 because it's so out there, that you're like, OK, OK-- 337 00:12:23,680 --> 00:12:24,730 SPEAKER 1: Chipotle. 338 00:12:24,730 --> 00:12:26,730 MARTIN: You guys are going to remember the three 339 00:12:26,730 --> 00:12:27,480 points, right? 340 00:12:27,480 --> 00:12:28,313 SPEAKER 2: Well, no. 341 00:12:28,313 --> 00:12:32,070 I remember that Chipotle versus McDonald's was discussed. 342 00:12:32,070 --> 00:12:32,990 I don't know-- 343 00:12:32,990 --> 00:12:33,690 MARTIN: OK. 344 00:12:33,690 --> 00:12:34,080 WILLIAM BONVILLIAN: Well, you know, 345 00:12:34,080 --> 00:12:35,580 let me say something here too, which 346 00:12:35,580 --> 00:12:38,970 is that we are going to dig into the textbook reading on legacy 347 00:12:38,970 --> 00:12:41,880 sectors, because there's no question, 348 00:12:41,880 --> 00:12:44,910 but education, and higher education in particular, 349 00:12:44,910 --> 00:12:46,560 are legacy systems. 350 00:12:46,560 --> 00:12:49,920 And the whole issue of how do you bring change 351 00:12:49,920 --> 00:12:52,690 into legacy systems is kind of an underlying theme here. 352 00:12:52,690 --> 00:12:55,218 So I'm glad we're right at the brink of trying 353 00:12:55,218 --> 00:12:56,010 to figure this out. 354 00:12:58,950 --> 00:13:07,770 I guess I would posit that the meaning of Freeman's reading 355 00:13:07,770 --> 00:13:11,970 for us, I think, was that there are real economic consequences 356 00:13:11,970 --> 00:13:15,840 to your leadership in the science and engineering talent 357 00:13:15,840 --> 00:13:16,680 base. 
358 00:13:16,680 --> 00:13:22,680 And that Katz and Goldin take us to kind of a next step 359 00:13:22,680 --> 00:13:28,590 and enable us to address a larger set of societal problems 360 00:13:28,590 --> 00:13:32,350 and tie it significantly to the education system. 361 00:13:32,350 --> 00:13:34,200 In other words, the growing problem 362 00:13:34,200 --> 00:13:36,090 of the polarization of our society 363 00:13:36,090 --> 00:13:38,370 and the growing income inequality 364 00:13:38,370 --> 00:13:43,130 in our society that will affect the quality of the democracy, 365 00:13:43,130 --> 00:13:49,830 and probably already is, that's very much tied to these curves 366 00:13:49,830 --> 00:13:54,480 here and what portion of the population 367 00:13:54,480 --> 00:13:57,360 we're moving onto that upward technology 368 00:13:57,360 --> 00:13:59,640 curve and what portion of the population 369 00:13:59,640 --> 00:14:00,660 we're leaving behind. 370 00:14:00,660 --> 00:14:06,040 That's the meaning of that crossover point in the 1970s. 371 00:14:06,040 --> 00:14:08,910 And what are the policies that would really 372 00:14:08,910 --> 00:14:12,090 address trying to get those lines back 373 00:14:12,090 --> 00:14:13,250 into parallel, right? 374 00:14:13,250 --> 00:14:15,270 Because that seems to be a pretty important 375 00:14:15,270 --> 00:14:16,980 societal problem. 376 00:14:16,980 --> 00:14:20,940 So Freeman identifies a set of economic issues 377 00:14:20,940 --> 00:14:23,460 that draw us to be concerned with the science 378 00:14:23,460 --> 00:14:25,650 and engineering talent base. 379 00:14:25,650 --> 00:14:28,740 And Katz and Goldin tell us, by the way, 380 00:14:28,740 --> 00:14:32,250 there are very deep societal well-being 381 00:14:32,250 --> 00:14:36,340 problems that are tied to the education system as well. 
382 00:14:36,340 --> 00:14:42,720 And then Baumol points to us and says, 383 00:14:42,720 --> 00:14:46,380 by the way, these folks are studying the higher education 384 00:14:46,380 --> 00:14:50,100 system, but that's not the only route here. 385 00:14:50,100 --> 00:14:51,990 We have to understand this other dimension 386 00:14:51,990 --> 00:14:53,730 if you want to introduce innovation 387 00:14:53,730 --> 00:14:57,030 into the overall economy. 388 00:14:57,030 --> 00:14:59,260 Education for incremental advance 389 00:14:59,260 --> 00:15:02,310 is probably not the only system we ought to be worried about. 390 00:15:02,310 --> 00:15:05,550 And that's kind of what Steph's round of questions 391 00:15:05,550 --> 00:15:09,060 here was driving at, I think, if I've interpreted you right, 392 00:15:09,060 --> 00:15:12,840 which is what is that more disruptive education 393 00:15:12,840 --> 00:15:15,810 system that we have to contemplate here 394 00:15:15,810 --> 00:15:18,570 if we want to educate not only for incremental advance, 395 00:15:18,570 --> 00:15:21,082 but for true innovation as well? 396 00:15:21,082 --> 00:15:23,170 Is that fair? 397 00:15:23,170 --> 00:15:24,670 Sort of. 398 00:15:24,670 --> 00:15:25,660 STEPH: Yeah. 399 00:15:25,660 --> 00:15:32,590 I think if I were to have one last maybe point or question, 400 00:15:32,590 --> 00:15:33,940 I would ask if any of you read-- 401 00:15:33,940 --> 00:15:35,565 WILLIAM BONVILLIAN: Yeah, why don't you 402 00:15:35,565 --> 00:15:37,142 make some key points in your mind 403 00:15:37,142 --> 00:15:38,350 about what these pieces mean. 404 00:15:38,350 --> 00:15:39,700 You just heard my version, but-- 405 00:15:39,700 --> 00:15:40,990 STEPH: Yeah. 406 00:15:40,990 --> 00:15:44,410 So I guess I'll start off with a straw poll. 407 00:15:44,410 --> 00:15:46,540 As you know, I favor these.
408 00:15:46,540 --> 00:15:49,360 How many of you have heard or read Pedagogy of the Oppressed 409 00:15:49,360 --> 00:15:50,460 by Paulo Freire? 410 00:15:53,350 --> 00:15:56,063 Do you know what pedagogy means? 411 00:15:56,063 --> 00:15:57,980 LILY: Maybe Martin can explain it to us with-- 412 00:15:57,980 --> 00:15:59,337 [LAUGHTER] 413 00:15:59,337 --> 00:16:01,795 WILLIAM BONVILLIAN: I'm going to rely on Lily for this one. 414 00:16:01,795 --> 00:16:02,295 STEPH: Yeah. 415 00:16:05,670 --> 00:16:08,692 So was there anyone who had read Pedagogy of the Oppressed? 416 00:16:08,692 --> 00:16:10,150 Have any of you ever taken a course 417 00:16:10,150 --> 00:16:13,818 in education, education policy? 418 00:16:13,818 --> 00:16:15,610 How many of you have very strong feelings-- 419 00:16:15,610 --> 00:16:16,600 WILLIAM BONVILLIAN: Except for this course, of course. 420 00:16:16,600 --> 00:16:17,850 STEPH: Except for this course. 421 00:16:17,850 --> 00:16:25,270 How many of you feel like you're stakeholders in a conversation 422 00:16:25,270 --> 00:16:28,570 about the change in national education policy or in state 423 00:16:28,570 --> 00:16:29,760 education policy? 424 00:16:29,760 --> 00:16:31,510 SPEAKER 2: OK, I think we found something. 425 00:16:31,510 --> 00:16:32,230 STEPH: OK. 426 00:16:32,230 --> 00:16:35,440 Who feels like they're a stakeholder in the education 427 00:16:35,440 --> 00:16:36,487 policy realm? 428 00:16:36,487 --> 00:16:37,820 SPEAKER 2: I mean, don't we all? 429 00:16:37,820 --> 00:16:39,370 MARTIN: A stake stakeholder or active stakeholder? 430 00:16:39,370 --> 00:16:40,787 STEPH: Like an active stakeholder, 431 00:16:40,787 --> 00:16:45,040 like you're willing to show up to Congress, to go to protests, 432 00:16:45,040 --> 00:16:49,900 to talk to legislators, blah, blah blah, blah, blah. 
433 00:16:49,900 --> 00:16:52,200 I mean, that kind of highlights, to me, 434 00:16:52,200 --> 00:16:54,410 the divide between the kinds of stuff 435 00:16:54,410 --> 00:16:57,730 that Goldin, Katz, Baumol, Freeman, et cetera, 436 00:16:57,730 --> 00:16:59,680 are talking about. 437 00:16:59,680 --> 00:17:02,650 And this is, again, hinting at my issues 438 00:17:02,650 --> 00:17:04,920 with the way in which we teach social sciences. 439 00:17:04,920 --> 00:17:07,579 So much of it is the Socratic method, which is essentially 440 00:17:07,579 --> 00:17:09,700 Bill explaining to us what it is that we read 441 00:17:09,700 --> 00:17:12,460 and then having us flesh it out and debate 442 00:17:12,460 --> 00:17:14,560 in a lively and analogical way. 443 00:17:18,252 --> 00:17:19,960 The Socratic method is to get us thinking 444 00:17:19,960 --> 00:17:22,180 about what this means truly in implementation 445 00:17:22,180 --> 00:17:24,493 and how to grapple with theory. 446 00:17:24,493 --> 00:17:26,619 RASHID: I think, yeah, as you start 447 00:17:26,619 --> 00:17:30,850 bringing that up, it might have been maybe instructive for us 448 00:17:30,850 --> 00:17:33,790 to have a couple of readings by a couple of states who 449 00:17:33,790 --> 00:17:36,160 are grappling with education issues 450 00:17:36,160 --> 00:17:39,550 and grappling with some issues, particularly those that 451 00:17:39,550 --> 00:17:41,680 are like, I am the lead administrator, 452 00:17:41,680 --> 00:17:45,760 I'm in charge of the education for maybe the city of Chicago 453 00:17:45,760 --> 00:17:48,520 versus the state of Illinois, and how they're actually 454 00:17:48,520 --> 00:17:52,565 going about grappling with these public institutions trying 455 00:17:52,565 --> 00:17:53,590 to change these things.
456 00:17:53,590 --> 00:17:54,910 Because it's really nice to kind of look 457 00:17:54,910 --> 00:17:56,452 at the macroscopic level and be like, 458 00:17:56,452 --> 00:17:58,910 OK, we're not graduating enough [INAUDIBLE] science 459 00:17:58,910 --> 00:17:59,650 and engineers. 460 00:17:59,650 --> 00:18:01,468 But in reality, education, as we said, 461 00:18:01,468 --> 00:18:03,760 because we have this really distributed model in the US 462 00:18:03,760 --> 00:18:06,370 compared to Sweden, it's like, I'm 463 00:18:06,370 --> 00:18:09,220 looking at this not only community college by community 464 00:18:09,220 --> 00:18:10,450 college, but state by state. 465 00:18:10,450 --> 00:18:16,330 And that's how folks sort of interact with these education 466 00:18:16,330 --> 00:18:17,090 issues. 467 00:18:17,090 --> 00:18:18,550 And I think maybe to Steph's point 468 00:18:18,550 --> 00:18:20,997 and what I'm trying to get from her is like, 469 00:18:20,997 --> 00:18:23,080 you have these active stakeholders that are really 470 00:18:23,080 --> 00:18:26,800 involved in their small pools and might 471 00:18:26,800 --> 00:18:30,528 have a little bit more of an effect or a lot more ideas 472 00:18:30,528 --> 00:18:32,320 on how to innovate within their small pools 473 00:18:32,320 --> 00:18:35,230 to get to this state where we want 474 00:18:35,230 --> 00:18:37,060 breakthrough education in addition 475 00:18:37,060 --> 00:18:38,770 to incremental advances. 476 00:18:38,770 --> 00:18:41,890 And so I think my recommendation would be, 477 00:18:41,890 --> 00:18:44,543 is there someone maybe in this Pedagogy of the Oppressed book 478 00:18:44,543 --> 00:18:46,210 who could offer an alternative opinion? 
479 00:18:46,210 --> 00:18:50,210 Are there folks who do education professionally 480 00:18:50,210 --> 00:18:51,960 and who do educational administration-- do 481 00:18:51,960 --> 00:18:54,043 they have these same thoughts and are they worried 482 00:18:54,043 --> 00:18:55,630 about these same things? 483 00:18:55,630 --> 00:18:57,700 And are they trying to look at these problems 484 00:18:57,700 --> 00:18:58,978 at the same level? 485 00:18:58,978 --> 00:19:00,520 STEPH: And are their opinions-- and I 486 00:19:00,520 --> 00:19:02,860 think this is a crucial point for me-- 487 00:19:02,860 --> 00:19:04,660 are their opinions respected? 488 00:19:04,660 --> 00:19:08,568 Because a PhD from the Stanford Graduate School of Education 489 00:19:08,568 --> 00:19:10,360 or the Harvard Graduate School of Education 490 00:19:10,360 --> 00:19:12,230 does not mean the same thing to most people 491 00:19:12,230 --> 00:19:15,670 as a PhD in materials science from those same institutions. 492 00:19:15,670 --> 00:19:17,620 And that concerns me, right? 493 00:19:17,620 --> 00:19:19,840 Because if it's the individuals-- 494 00:19:19,840 --> 00:19:23,230 there's also, I guess, an undercurrent in education, 495 00:19:23,230 --> 00:19:26,230 and we have mentioned it before in this class, of those who 496 00:19:26,230 --> 00:19:28,640 can, do; those who can't, teach. 497 00:19:28,640 --> 00:19:30,970 And it is that sort of point of disrespect 498 00:19:30,970 --> 00:19:33,370 that we got at at the beginning of class 499 00:19:33,370 --> 00:19:37,840 that I think merits far more consideration by this group 500 00:19:37,840 --> 00:19:41,830 than do perhaps the contents of the economics readings 501 00:19:41,830 --> 00:19:42,558 by Romer. 
502 00:19:42,558 --> 00:19:44,350 Because if we, ourselves, are not convinced 503 00:19:44,350 --> 00:19:46,240 that teaching is a worthwhile profession 504 00:19:46,240 --> 00:19:47,890 and that the education of young people 505 00:19:47,890 --> 00:19:53,590 is a worthwhile activity, then who can be convinced of that 506 00:19:53,590 --> 00:19:56,170 if we are involved and invested in the promotion 507 00:19:56,170 --> 00:19:58,930 of more equitable education of science and technology? 508 00:19:58,930 --> 00:20:00,820 Because we know that that's the key 509 00:20:00,820 --> 00:20:05,350 to driving economic growth, and also prosperity 510 00:20:05,350 --> 00:20:07,570 and decreasing inequality. 511 00:20:07,570 --> 00:20:09,160 And so that's why to me, when Bill 512 00:20:09,160 --> 00:20:12,130 says that Goldin and Katz are talking about what 513 00:20:12,130 --> 00:20:15,280 are the policies that would address us getting back 514 00:20:15,280 --> 00:20:19,030 to a more equitable economic system, to me 515 00:20:19,030 --> 00:20:21,580 it's not about policies, it's about culture. 516 00:20:21,580 --> 00:20:23,590 It's about our perceptions of other people 517 00:20:23,590 --> 00:20:25,590 and what they are doing with their lives 518 00:20:25,590 --> 00:20:29,490 and about our distancing of ourselves as, 519 00:20:29,490 --> 00:20:32,160 but that's not my issue, that's not my problem, right? 520 00:20:32,160 --> 00:20:35,850 So I think it's unfair and a little bit intellectually 521 00:20:35,850 --> 00:20:40,560 irresponsible of us to claim that distance while also 522 00:20:40,560 --> 00:20:41,970 critiquing it. 523 00:20:41,970 --> 00:20:44,880 So I would posit to all of you that, 524 00:20:44,880 --> 00:20:48,450 if you have time in the next maybe two or three years 525 00:20:48,450 --> 00:20:51,060 to pick up this book, Pedagogy of the Oppressed by Freire-- 526 00:20:51,060 --> 00:20:52,530 MARTIN: What's the book about? 
527 00:20:52,530 --> 00:20:54,750 STEPH: It's about the education system in Brazil 528 00:20:54,750 --> 00:20:59,040 and about the ways in which students have been disregarded 529 00:20:59,040 --> 00:21:03,360 by the system and conceived of as passive recipients 530 00:21:03,360 --> 00:21:06,830 of knowledge rather than as active learners. 531 00:21:06,830 --> 00:21:09,860 And the metaphor that he utilizes 532 00:21:09,860 --> 00:21:13,155 is that they're receptacles, and that we dump in knowledge, 533 00:21:13,155 --> 00:21:15,030 and then we expect them to regurgitate it out 534 00:21:15,030 --> 00:21:16,940 and to be able to apply it in the field. 535 00:21:16,940 --> 00:21:18,940 Whereas his understanding of pedagogy, 536 00:21:18,940 --> 00:21:21,440 which is the study of teaching and the study of learning, 537 00:21:21,440 --> 00:21:23,815 is that individuals have to grapple with something. 538 00:21:23,815 --> 00:21:25,440 And that's sort of at the heart of what 539 00:21:25,440 --> 00:21:29,670 Lily will be talking about in the MITx reading, 540 00:21:29,670 --> 00:21:32,340 when they start talking about Seymour Papert, who 541 00:21:32,340 --> 00:21:35,970 was a brilliant professor and researcher at MIT who created 542 00:21:35,970 --> 00:21:41,730 the field of constructionism, which is the study of something 543 00:21:41,730 --> 00:21:43,260 by doing the something. 544 00:21:43,260 --> 00:21:46,170 And that is actually being very widely adopted 545 00:21:46,170 --> 00:21:49,590 in Southeast Asia in countries like Singapore and Thailand, 546 00:21:49,590 --> 00:21:52,740 which are experiencing momentous economic advances 547 00:21:52,740 --> 00:21:55,240 in the research and development sector. 548 00:21:55,240 --> 00:21:58,440 So I think very much pedagogy and considerations 549 00:21:58,440 --> 00:22:00,420 of culture become really relevant 550 00:22:00,420 --> 00:22:02,897 when we talk about education. 
551 00:22:02,897 --> 00:22:04,980 LILY: Well, I think we need to get back to dollars 552 00:22:04,980 --> 00:22:05,522 here, though. 553 00:22:05,522 --> 00:22:10,350 Because if you're head of household, single parent, 554 00:22:10,350 --> 00:22:12,455 or single-income family-- 555 00:22:12,455 --> 00:22:15,660 who in here thinks that you can have a family on a K 556 00:22:15,660 --> 00:22:17,910 through 12 teacher's salary? 557 00:22:17,910 --> 00:22:18,450 You can't. 558 00:22:18,450 --> 00:22:20,070 And even if you're-- 559 00:22:20,070 --> 00:22:23,790 yes, the average K through 12 teacher salary is higher 560 00:22:23,790 --> 00:22:28,620 in, say, Los Angeles than it is in central Illinois or rural 561 00:22:28,620 --> 00:22:31,380 parts of the US, so they compensate for living expenses, 562 00:22:31,380 --> 00:22:35,240 but you still cannot be a single parent head of household on a K 563 00:22:35,240 --> 00:22:36,080 through 12 salary. 564 00:22:36,080 --> 00:22:40,622 So sure, you can think that K through 12 education 565 00:22:40,622 --> 00:22:43,080 needs to be reformed, you can be passionate about teaching. 566 00:22:43,080 --> 00:22:46,690 But if you are financially responsible for a family unit, 567 00:22:46,690 --> 00:22:48,900 you can't be a teacher in K through 12. 568 00:22:48,900 --> 00:22:51,420 It doesn't matter how much you care about it. 569 00:22:51,420 --> 00:22:53,712 MARTIN: There's a part from the reading that was like-- 570 00:22:53,712 --> 00:22:56,278 I think they needed 50 hours to get $1,000 or something 571 00:22:56,278 --> 00:22:57,333 like that [INAUDIBLE]. 572 00:22:57,333 --> 00:22:59,750 WILLIAM BONVILLIAN: So I'll tell you what I'm going to do. 573 00:22:59,750 --> 00:23:02,430 I'm going to push us, so we can get out of here 574 00:23:02,430 --> 00:23:04,880 at a reasonable time. 
575 00:23:04,880 --> 00:23:09,410 I'm going to do the last reading from our textbook, 576 00:23:09,410 --> 00:23:14,140 and then come back to the MITx reading. 577 00:23:14,140 --> 00:23:15,890 MARTIN: Yeah, we need more graphics, Bill, 578 00:23:15,890 --> 00:23:17,570 because all these have been so great. 579 00:23:17,570 --> 00:23:20,870 WILLIAM BONVILLIAN: I figured I'd put myself up here. 580 00:23:20,870 --> 00:23:23,790 MARTIN: Did you get Michael [INAUDIBLE] to do it? 581 00:23:23,790 --> 00:23:24,890 WILLIAM BONVILLIAN: I had to really struggle with this, 582 00:23:24,890 --> 00:23:26,006 Martin, I can tell you. 583 00:23:26,006 --> 00:23:28,400 [LAUGHTER] 584 00:23:28,400 --> 00:23:35,330 So, look, here's what's different. 585 00:23:35,330 --> 00:23:38,990 So new technology innovations are 586 00:23:38,990 --> 00:23:42,550 very slow to enter legacy sectors, right? 587 00:23:42,550 --> 00:23:45,740 Remember discussing energy? 588 00:23:45,740 --> 00:23:47,630 We invented fire-- 589 00:23:47,630 --> 00:23:49,810 I don't know how many millions of years that took. 590 00:23:49,810 --> 00:23:51,380 Then, we burned trees. 591 00:23:51,380 --> 00:23:52,915 We did that for millennia. 592 00:23:52,915 --> 00:23:55,040 RASHID: I think the word invented was hastily used. 593 00:23:55,040 --> 00:23:56,518 [LAUGHTER] 594 00:23:56,518 --> 00:23:57,310 MARTIN: Discovered. 595 00:23:57,310 --> 00:23:59,360 [INTERPOSING VOICES] 596 00:23:59,360 --> 00:24:02,470 WILLIAM BONVILLIAN: Shall we say lightning struck, right? 597 00:24:02,470 --> 00:24:05,210 And we moved to coal, right? 598 00:24:05,210 --> 00:24:08,330 Moved to coal, and then we moved to oil. 599 00:24:08,330 --> 00:24:10,820 And that's about as far as we've gotten, essentially, 600 00:24:10,820 --> 00:24:11,570 since that time. 601 00:24:11,570 --> 00:24:12,860 SPEAKER 2: I mean, the rate of adoption 602 00:24:12,860 --> 00:24:13,880 has increased exponentially. 
603 00:24:13,880 --> 00:24:14,963 So I think we're doing OK. 604 00:24:14,963 --> 00:24:16,880 WILLIAM BONVILLIAN: Yes, compared to wood. 605 00:24:16,880 --> 00:24:18,835 MARTIN: The population, too, has gone up. 606 00:24:18,835 --> 00:24:20,210 WILLIAM BONVILLIAN: So similarly, 607 00:24:20,210 --> 00:24:22,970 another legacy sector, higher education. 608 00:24:22,970 --> 00:24:27,432 What was the last really big reform in education? 609 00:24:27,432 --> 00:24:28,140 So we went from-- 610 00:24:28,140 --> 00:24:29,810 STEPH: Allowing women to participate. 611 00:24:29,810 --> 00:24:30,495 WILLIAM BONVILLIAN: Well, that was an advance. 612 00:24:30,495 --> 00:24:31,703 There's no question about it. 613 00:24:31,703 --> 00:24:37,460 But we had the Platonic academy, and then 614 00:24:37,460 --> 00:24:41,990 a momentous breakthrough around 1500 years later-- 615 00:24:41,990 --> 00:24:43,940 we came up with the book. 616 00:24:43,940 --> 00:24:46,340 That was clearly a great breakthrough. 617 00:24:46,340 --> 00:24:47,990 From a technological advance 618 00:24:47,990 --> 00:24:53,780 standpoint, that's about it, until we finally figured out 619 00:24:53,780 --> 00:24:55,560 how to use this kind of stuff. 620 00:24:55,560 --> 00:24:59,300 And we're starting to figure out how to bring that IT 621 00:24:59,300 --> 00:25:00,630 revolution into the classroom. 622 00:25:00,630 --> 00:25:05,030 So that presents-- if the issue is bringing change 623 00:25:05,030 --> 00:25:07,880 into this system, technology, obviously, 624 00:25:07,880 --> 00:25:09,980 can be a change driver. 625 00:25:09,980 --> 00:25:12,620 And that's an interesting story. 626 00:25:12,620 --> 00:25:15,800 That's not the only change and reform story, 627 00:25:15,800 --> 00:25:18,240 as you just pointed out, Steph. 628 00:25:18,240 --> 00:25:21,150 There are other important changes that have to be considered. 
629 00:25:21,150 --> 00:25:25,790 But really, within the last five, six years, 630 00:25:25,790 --> 00:25:32,090 we've actually seen the entry of IT-type tools 631 00:25:32,090 --> 00:25:38,480 into the classroom setting with potentially momentous change. 632 00:25:38,480 --> 00:25:41,960 So there's an open question here about whether MOOCs 633 00:25:41,960 --> 00:25:45,200 and online education are going to be a disruptive innovation 634 00:25:45,200 --> 00:25:47,660 and whether they will disrupt higher education 635 00:25:47,660 --> 00:25:49,640 and substitute-- 636 00:25:49,640 --> 00:25:51,420 in effect, create a new model. 637 00:25:51,420 --> 00:25:57,390 Is this a Chipotle replaces McDonald's kind of approach? 638 00:25:57,390 --> 00:26:00,770 See, I told you I'd still hear your metaphor, Martin. 639 00:26:00,770 --> 00:26:04,790 And how will higher education respond? 640 00:26:04,790 --> 00:26:07,250 Will it respond with a blended model? 641 00:26:07,250 --> 00:26:11,580 Or will it attempt to keep its existing model? 642 00:26:11,580 --> 00:26:13,820 So we'll talk about that for a minute. 643 00:26:13,820 --> 00:26:20,272 As we've seen, universities present a deep problem 644 00:26:20,272 --> 00:26:21,980 for this kind of disruption, because they 645 00:26:21,980 --> 00:26:23,060 are legacy sectors. 646 00:26:23,060 --> 00:26:25,880 And like all legacy sectors, they 647 00:26:25,880 --> 00:26:30,590 tend to resist disruptive change. 648 00:26:30,590 --> 00:26:34,250 Institutions of higher education conduct 649 00:26:34,250 --> 00:26:36,410 almost no R&D on education itself-- 650 00:26:36,410 --> 00:26:37,640 almost none. 651 00:26:37,640 --> 00:26:38,930 It's astonishing. 652 00:26:38,930 --> 00:26:40,692 The services sector, generally, is not 653 00:26:40,692 --> 00:26:42,650 particularly good at this, but higher education 654 00:26:42,650 --> 00:26:45,380 may lead the pack. 
655 00:26:45,380 --> 00:26:47,420 There are very perverse pricing issues 656 00:26:47,420 --> 00:26:50,330 in higher education, as all of you 657 00:26:50,330 --> 00:26:53,060 paying tuition at the moment understand. 658 00:26:53,060 --> 00:26:56,120 It's a very decentralized model. 659 00:26:56,120 --> 00:27:01,310 It's very hard to spread change and spread new ideas, 660 00:27:01,310 --> 00:27:03,710 to transition new ideas, because this 661 00:27:03,710 --> 00:27:06,690 is a very decentralized system. 662 00:27:06,690 --> 00:27:09,920 In legacy sector terms, there's a collective action problem. 663 00:27:09,920 --> 00:27:12,710 How do you get collective action across literally thousands 664 00:27:12,710 --> 00:27:13,400 of institutions? 665 00:27:13,400 --> 00:27:15,692 And I'm just focused on higher education at this point. 666 00:27:15,692 --> 00:27:18,950 There's a whole additional story for online 667 00:27:18,950 --> 00:27:22,650 and the K through 12 sector. 668 00:27:22,650 --> 00:27:23,330 What's happened? 669 00:27:23,330 --> 00:27:28,140 You all are familiar with edX, the entry of a nonprofit model 670 00:27:28,140 --> 00:27:28,640 here. 671 00:27:28,640 --> 00:27:31,670 And frankly, I have to give tremendous credit 672 00:27:31,670 --> 00:27:34,250 to Rafael Reif, because I never thought universities 673 00:27:34,250 --> 00:27:37,735 themselves would originate potentially disruptive changes 674 00:27:37,735 --> 00:27:38,360 for themselves. 675 00:27:38,360 --> 00:27:40,670 That kind of violates my rule set. 676 00:27:40,670 --> 00:27:45,590 But Rafael basically, personally, 677 00:27:45,590 --> 00:27:49,250 drove a fair amount of this. 678 00:27:49,250 --> 00:27:55,940 And his background-- he was a poor kid growing up 679 00:27:55,940 --> 00:28:01,040 in Venezuela and from a family that had not sent children 680 00:28:01,040 --> 00:28:02,000 to college. 
681 00:28:02,000 --> 00:28:07,850 And he managed to get to go to the technical college 682 00:28:07,850 --> 00:28:09,592 in Venezuela. 683 00:28:09,592 --> 00:28:11,050 It was a real breakthrough for him. 684 00:28:11,050 --> 00:28:16,610 And he ends up at MIT through a series of miracles 685 00:28:16,610 --> 00:28:20,450 and becomes president. 686 00:28:20,450 --> 00:28:24,200 And he thinks all the time about that community 687 00:28:24,200 --> 00:28:26,120 that he left behind. 688 00:28:26,120 --> 00:28:30,020 I know from knowing him and conversations with him 689 00:28:30,020 --> 00:28:33,140 just how he saw this online education 690 00:28:33,140 --> 00:28:39,200 model as a remarkable new tool to get 691 00:28:39,200 --> 00:28:43,400 education and educational platforms everywhere. 692 00:28:43,400 --> 00:28:48,140 I mean, it is a remarkable new entrant possibility 693 00:28:48,140 --> 00:28:50,030 in education. 694 00:28:50,030 --> 00:28:52,100 And he did this, frankly. 695 00:28:52,100 --> 00:28:53,690 He drove the creation of edX. 696 00:28:53,690 --> 00:28:55,400 And there were corresponding examples 697 00:28:55,400 --> 00:28:59,400 coming out of Stanford, Udacity and Coursera, 698 00:28:59,400 --> 00:29:00,770 in a comparable period. 699 00:29:00,770 --> 00:29:02,750 But he wanted to do a nonprofit model, 700 00:29:02,750 --> 00:29:08,360 because he thought it would be better-geared to this mission 701 00:29:08,360 --> 00:29:10,640 of significantly improving education everywhere, 702 00:29:10,640 --> 00:29:12,900 that a new tool set would be at hand. 703 00:29:12,900 --> 00:29:15,890 Suppose kids like him growing up on sandlot baseball 704 00:29:15,890 --> 00:29:23,990 lots in Venezuela could have had access to MIT courses, right? 705 00:29:23,990 --> 00:29:26,060 What would that be like? 706 00:29:26,060 --> 00:29:30,290 So that was his story, and that's the story 707 00:29:30,290 --> 00:29:33,380 behind the creation of edX. 
708 00:29:33,380 --> 00:29:37,520 And the nonprofit model enables certain kinds of things. 709 00:29:37,520 --> 00:29:39,800 A for-profit model enables other things. 710 00:29:39,800 --> 00:29:42,290 But one of the interesting features of edX 711 00:29:42,290 --> 00:29:44,450 is that it is an open-source model. 712 00:29:44,450 --> 00:29:46,970 It's an open-source technology platform. 713 00:29:46,970 --> 00:29:49,580 And people contribute and constantly build up 714 00:29:49,580 --> 00:29:52,600 the quality of the technology behind the model. 715 00:29:52,600 --> 00:29:56,270 It's harder to do that in a for-profit kind of approach. 716 00:29:56,270 --> 00:29:59,120 It's still finding its way towards a business model, 717 00:29:59,120 --> 00:30:00,380 and we'll talk about it. 718 00:30:00,380 --> 00:30:03,830 The other major MOOC providers are listed here. 719 00:30:03,830 --> 00:30:06,050 You're familiar with them. 720 00:30:06,050 --> 00:30:09,560 They're on for-profit models. 721 00:30:09,560 --> 00:30:12,920 The political world took a look at these MOOCs 722 00:30:12,920 --> 00:30:17,260 and had different ideological reactions. 723 00:30:17,260 --> 00:30:19,310 On the right, to some extent, they 724 00:30:19,310 --> 00:30:22,340 saw these free online educational courses 725 00:30:22,340 --> 00:30:24,890 as an opportunity to get rid of these pesky left-wing 726 00:30:24,890 --> 00:30:29,450 universities that are constantly training the wrong people. 727 00:30:29,450 --> 00:30:32,630 On the left, there was a sense that, oh, we can finally 728 00:30:32,630 --> 00:30:36,740 get rid of tuition, have the Bernie Sanders free tuition 729 00:30:36,740 --> 00:30:40,130 dream, because online education doesn't cost anything, 730 00:30:40,130 --> 00:30:43,520 and we'll just do that. 731 00:30:43,520 --> 00:30:46,730 Obviously, that's a little bit of magical thinking here 732 00:30:46,730 --> 00:30:48,440 on both sides. 
733 00:30:48,440 --> 00:30:50,030 But states have begun passing laws, 734 00:30:50,030 --> 00:30:52,940 doing things like requiring $10,000 BAs 735 00:30:52,940 --> 00:30:57,540 and comparable kinds of issues. 736 00:30:57,540 --> 00:30:59,990 But the deep question here is, what's 737 00:30:59,990 --> 00:31:02,510 going to happen to the residential campus? 738 00:31:02,510 --> 00:31:05,930 Is online just going to displace this whole thing? 739 00:31:05,930 --> 00:31:07,820 And if not, why not? 740 00:31:07,820 --> 00:31:14,810 So online, I think as all of you understand, 741 00:31:14,810 --> 00:31:17,410 can do some things really well. 742 00:31:17,410 --> 00:31:19,640 It's a whole new tool set for visualization, 743 00:31:19,640 --> 00:31:21,830 for representation. 744 00:31:21,830 --> 00:31:24,050 It offers incredible opportunities 745 00:31:24,050 --> 00:31:30,260 for reinforcement and assessment that a lecture can't do. 746 00:31:30,260 --> 00:31:34,700 You can use feedback loops and repetition 747 00:31:34,700 --> 00:31:40,760 to do continuous assessment, capabilities 748 00:31:40,760 --> 00:31:45,140 that your standard college and university lecture class can 749 00:31:45,140 --> 00:31:47,190 never replicate, 750 00:31:47,190 --> 00:31:49,020 and they're pretty important. 751 00:31:49,020 --> 00:31:53,180 So online is going to have some features that are better 752 00:31:53,180 --> 00:31:56,570 than what lectures can do. 753 00:31:56,570 --> 00:32:01,670 And then, potentially, if you move the lecture online, 754 00:32:01,670 --> 00:32:03,170 then you can free up your classroom 755 00:32:03,170 --> 00:32:06,860 for much more interchange and interpersonal kinds 756 00:32:06,860 --> 00:32:11,750 of communication and more of the learning 757 00:32:11,750 --> 00:32:15,440 by doing opportunities that Steph was suggesting. 758 00:32:15,440 --> 00:32:22,940 So maybe it frees up what the classroom could become. 
759 00:32:22,940 --> 00:32:27,500 Now, vital educational elements remain face-to-face. 760 00:32:27,500 --> 00:32:32,360 At least at the moment, online can't replicate these. 761 00:32:32,360 --> 00:32:36,410 So oral expression, oral presentation, 762 00:32:36,410 --> 00:32:41,000 advocacy skills, the way in which 763 00:32:41,000 --> 00:32:42,740 you have to organize your expertise 764 00:32:42,740 --> 00:32:47,360 to be able to speak about a subject area-- 765 00:32:47,360 --> 00:32:48,728 face-to-face can do this. 766 00:32:48,728 --> 00:32:51,020 It's really hard to do that effectively, at this point, 767 00:32:51,020 --> 00:32:53,290 with the technology online. 768 00:32:53,290 --> 00:32:59,260 Written analysis-- so far, online writing evaluation 769 00:32:59,260 --> 00:33:02,410 leaves a lot to be desired. 770 00:33:02,410 --> 00:33:05,050 Research-- look, in the end, you really 771 00:33:05,050 --> 00:33:08,860 do research on lab benches with colleagues. 772 00:33:08,860 --> 00:33:10,970 It's still quite face-to-face. 773 00:33:10,970 --> 00:33:14,530 And you can replicate some of that through simulation 774 00:33:14,530 --> 00:33:17,260 and modeling, but a lot of it you can't. 775 00:33:17,260 --> 00:33:23,350 So there's a lot that probably needs to stay 776 00:33:23,350 --> 00:33:25,930 on the face-to-face side. 777 00:33:25,930 --> 00:33:29,350 So these kinds of skills. 778 00:33:29,350 --> 00:33:33,850 Learning requires a lot of human scaffolding for discourse, 779 00:33:33,850 --> 00:33:36,250 argumentation, mentoring, for making the case, 780 00:33:36,250 --> 00:33:39,970 for research, for making the conceptual leap. 
781 00:33:39,970 --> 00:33:44,800 And a lot of what occurs in the classroom, the socialization 782 00:33:44,800 --> 00:33:48,750 process of the classroom drives a lot of learning-- 783 00:33:48,750 --> 00:33:51,940 the kind of back and forth, the kind of competition, 784 00:33:51,940 --> 00:33:53,710 the kind of connections between people, 785 00:33:53,710 --> 00:33:57,460 friendships that build, the kind of community feeling as a class 786 00:33:57,460 --> 00:33:58,630 comes together. 787 00:33:58,630 --> 00:34:02,585 Those are very powerful learning aids. 788 00:34:02,585 --> 00:34:04,210 These are very powerful learning tools. 789 00:34:04,210 --> 00:34:06,910 And obviously, you can't get that stuff 790 00:34:06,910 --> 00:34:10,360 sitting in a basement in front of a blue screen. 791 00:34:10,360 --> 00:34:13,620 So the real question here is, how do you 792 00:34:13,620 --> 00:34:15,889 optimize the two sides? 793 00:34:15,889 --> 00:34:19,150 How do we let online do what it does best? 794 00:34:19,150 --> 00:34:22,600 And how do we let the face-to-face piece thrive 795 00:34:22,600 --> 00:34:24,699 and do more? 796 00:34:24,699 --> 00:34:27,639 That's really the opportunity that we now 797 00:34:27,639 --> 00:34:31,540 have, is to completely restructure education 798 00:34:31,540 --> 00:34:34,480 with a whole new tool set that, in turn, will 799 00:34:34,480 --> 00:34:37,000 enable that face-to-face education to kind of rise 800 00:34:37,000 --> 00:34:38,080 to a new level. 801 00:34:38,080 --> 00:34:39,010 That's the dream. 802 00:34:39,010 --> 00:34:40,150 That's the hope. 803 00:34:40,150 --> 00:34:43,000 That's the reform this potentially enables. 804 00:34:43,000 --> 00:34:45,520 So it's a human-machine symbiosis, remember? 805 00:34:45,520 --> 00:34:47,590 It's JCR Licklider. 806 00:34:47,590 --> 00:34:51,040 Let the computer do what it's good at, and let people 807 00:34:51,040 --> 00:34:54,010 do what they're good at, and you have a symbiosis of the two. 
808 00:34:54,010 --> 00:34:56,770 That's what we could get in education now. 809 00:34:56,770 --> 00:34:59,120 That's the opportunity space. 810 00:34:59,120 --> 00:35:02,170 And that's a blended learning model. 811 00:35:02,170 --> 00:35:04,100 Now, the technology is going to change, 812 00:35:04,100 --> 00:35:08,302 and it's going to get better at some of these things. 813 00:35:08,302 --> 00:35:13,240 But that's an ongoing and extended process. 814 00:35:13,240 --> 00:35:15,550 What happens to the university? 815 00:35:15,550 --> 00:35:18,100 For a long time, newspaper journalists 816 00:35:18,100 --> 00:35:21,627 were writing articles, oh, ha, ha, 817 00:35:21,627 --> 00:35:23,210 what's going to happen to universities 818 00:35:23,210 --> 00:35:25,600 is what happened to us as journalists. 819 00:35:25,600 --> 00:35:28,720 All our newspapers went out of business essentially, right? 820 00:35:28,720 --> 00:35:29,350 Ha, ha. 821 00:35:29,350 --> 00:35:31,000 It's going to happen to you too, see? 822 00:35:31,000 --> 00:35:33,220 This online revolution is going to drive you under. 823 00:35:35,900 --> 00:35:38,990 It's problematic if we do that, because frankly, 824 00:35:38,990 --> 00:35:41,727 universities create the course content. 825 00:35:41,727 --> 00:35:43,810 They are fairly important institutions, after all, 826 00:35:43,810 --> 00:35:45,560 on the content side. 827 00:35:45,560 --> 00:35:48,890 They're research engines as well as teaching centers, 828 00:35:48,890 --> 00:35:51,560 so we kind of need that research in the American system 829 00:35:51,560 --> 00:35:54,260 and many other countries. 830 00:35:54,260 --> 00:35:56,965 We blend the research model and the teaching model, 831 00:35:56,965 --> 00:35:58,340 so that there's a lot of learning 832 00:35:58,340 --> 00:36:01,830 by doing, more so in the graduate education phase, 833 00:36:01,830 --> 00:36:04,800 but there's a lot of learning by doing in that system. 
834 00:36:04,800 --> 00:36:09,830 So you need these institutions for that stuff. 835 00:36:09,830 --> 00:36:15,620 And if you use the model to drive the university model 836 00:36:15,620 --> 00:36:19,700 under, you've got these underlying really deep kind 837 00:36:19,700 --> 00:36:21,640 of problems. 838 00:36:21,640 --> 00:36:23,390 So there's a question about whether or not 839 00:36:23,390 --> 00:36:24,395 universities will adapt. 840 00:36:29,035 --> 00:36:31,160 And part of that story is bringing learning science 841 00:36:31,160 --> 00:36:32,660 to online education, and we're going 842 00:36:32,660 --> 00:36:34,070 to get to that in just a minute. 843 00:36:36,980 --> 00:36:39,260 The online revolution does create 844 00:36:39,260 --> 00:36:40,360 incredible opportunities. 845 00:36:40,360 --> 00:36:42,830 We talked earlier about the possibility 846 00:36:42,830 --> 00:36:49,280 of worldwide availability of high-quality education. 847 00:36:49,280 --> 00:36:52,847 It's not a blended model, but it's pretty good. 848 00:36:52,847 --> 00:36:53,930 It's pretty good material. 849 00:36:53,930 --> 00:36:56,300 You can learn a lot from it. 850 00:36:56,300 --> 00:36:59,780 That's a staggering new opportunity for education 851 00:36:59,780 --> 00:37:00,300 worldwide. 852 00:37:00,300 --> 00:37:05,000 And that's precisely why Rafael Reif pushed this effort. 853 00:37:05,000 --> 00:37:07,270 That was the vision that he saw, frankly. 854 00:37:07,270 --> 00:37:07,770 Martin? 855 00:37:07,770 --> 00:37:08,320 MARTIN: OK. 856 00:37:08,320 --> 00:37:10,418 So the US probably won't do a DARPA for education. 857 00:37:10,418 --> 00:37:12,460 But I was thinking a lot, especially when talking 858 00:37:12,460 --> 00:37:14,545 about this, about JCR Licklider-- 859 00:37:14,545 --> 00:37:15,670 WILLIAM BONVILLIAN: Mm-hmm. 860 00:37:15,670 --> 00:37:16,580 MARTIN: Is that right? 861 00:37:16,580 --> 00:37:17,130 WILLIAM BONVILLIAN: Yeah. 
862 00:37:17,130 --> 00:37:18,797 MARTIN: I always want to say Licklicker. 863 00:37:18,797 --> 00:37:19,550 [LAUGHTER] 864 00:37:19,550 --> 00:37:22,130 So what if they did a little special institute 865 00:37:22,130 --> 00:37:24,350 that just focuses exactly on what the vision is? 866 00:37:24,350 --> 00:37:26,270 Because JCR focused a lot on the vision 867 00:37:26,270 --> 00:37:28,850 of what ARPANET would be, so the vision 868 00:37:28,850 --> 00:37:30,695 in an ideal education world. 869 00:37:30,695 --> 00:37:32,570 Which I know they have that as an initiative, 870 00:37:32,570 --> 00:37:35,390 but have that DARPA intensity with project managers 871 00:37:35,390 --> 00:37:37,250 and moving initiatives forward. 872 00:37:37,250 --> 00:37:38,430 WILLIAM BONVILLIAN: I mean, it's an interesting idea. 873 00:37:38,430 --> 00:37:40,130 And there was discussion of creating 874 00:37:40,130 --> 00:37:42,388 a DARPA within the Department of Education. 875 00:37:42,388 --> 00:37:44,930 I have to say, I was skeptical about whether that could work, 876 00:37:44,930 --> 00:37:48,140 because there's no real tradition of R&D 877 00:37:48,140 --> 00:37:49,790 in the Department of Education. 878 00:37:49,790 --> 00:37:52,310 It's really quite limited. 879 00:37:52,310 --> 00:37:54,290 And would they understand the model enough? 880 00:37:54,290 --> 00:37:57,620 Interestingly, today-- or yesterday-- 881 00:37:57,620 --> 00:38:01,130 DARPA issued a broad agency announcement for, 882 00:38:01,130 --> 00:38:05,210 you guessed it, developing a learning machine 883 00:38:05,210 --> 00:38:07,160 for lifelong education. 884 00:38:07,160 --> 00:38:08,540 That's the challenge, right? 885 00:38:08,540 --> 00:38:11,850 So DARPA is now in this territory 886 00:38:11,850 --> 00:38:13,100 in some very interesting ways. 887 00:38:13,100 --> 00:38:15,590 And actually, the military is in this territory. 
888 00:38:15,590 --> 00:38:19,610 The military has done more than any other organization by far 889 00:38:19,610 --> 00:38:24,290 to use computer gaming as a training tool, right? 890 00:38:24,290 --> 00:38:27,560 And computer gaming, as you all understand probably a lot 891 00:38:27,560 --> 00:38:33,850 better than I do, operates on an education kind of model. 892 00:38:33,850 --> 00:38:36,840 In other words, learning that game is a very complex process 893 00:38:36,840 --> 00:38:38,340 and is acquired over time. 894 00:38:38,340 --> 00:38:42,180 And a good game gets that flow right 895 00:38:42,180 --> 00:38:45,360 of helping introduce the new elements of the game to you 896 00:38:45,360 --> 00:38:48,360 and the learning that the game requires in manageable steps, 897 00:38:48,360 --> 00:38:50,370 and then reinforcing that and reiterating 898 00:38:50,370 --> 00:38:53,615 or putting you back into the feedback loops so that you get it, 899 00:38:53,615 --> 00:38:55,740 and then driving you to the next stage of the game. 900 00:38:55,740 --> 00:38:58,433 That's a very interesting potential educational tool. 901 00:38:58,433 --> 00:39:00,600 Probably some of the best work in the country that's 902 00:39:00,600 --> 00:39:02,640 being done on that is being done by Eric Klopfer 903 00:39:02,640 --> 00:39:06,060 here at MIT, who's developed a whole new set of education 904 00:39:06,060 --> 00:39:07,843 games for K through 12. 905 00:39:07,843 --> 00:39:10,260 I'm surprised that industry hasn't moved in on the sector. 906 00:39:10,260 --> 00:39:12,010 I think it's gradually starting to happen. 907 00:39:12,010 --> 00:39:15,300 But that's an interesting part of this story, 908 00:39:15,300 --> 00:39:19,170 because MOOCs are not the only story here. 
909 00:39:19,170 --> 00:39:21,870 MOOCs came about because a whole bunch of computer nerds 910 00:39:21,870 --> 00:39:26,560 at places like MIT and Stanford saw, wait a minute, 911 00:39:26,560 --> 00:39:29,050 we got broadband around the world. 912 00:39:29,050 --> 00:39:30,780 We have a whole new delivery mechanism. 913 00:39:30,780 --> 00:39:32,550 We've got a whole new tool set. 914 00:39:32,550 --> 00:39:34,860 Let's do something. 915 00:39:34,860 --> 00:39:36,920 And they rushed to fill that void. 916 00:39:36,920 --> 00:39:42,320 And edX was created by folks out of our computer science 917 00:39:42,320 --> 00:39:43,560 department. 918 00:39:43,560 --> 00:39:44,060 Rashid? 919 00:39:44,060 --> 00:39:44,560 Go ahead. 920 00:39:44,560 --> 00:39:45,733 RASHID: Yeah. 921 00:39:45,733 --> 00:39:46,900 So it might be instructive-- 922 00:39:46,900 --> 00:39:48,900 I'm going to go from past to present. 923 00:39:48,900 --> 00:39:51,497 But it might be instructive to take a look at, 924 00:39:51,497 --> 00:39:53,580 when we figured out the printing press was a thing 925 00:39:53,580 --> 00:39:55,530 and we could mass print books, who 926 00:39:55,530 --> 00:39:58,480 was the one who decided how do we change maybe 927 00:39:58,480 --> 00:40:01,200 from a Socratic teacher-student-apprentice 928 00:40:01,200 --> 00:40:05,580 relationship to teacher and then large lecture hall? 929 00:40:05,580 --> 00:40:07,890 What were the driving initiatives? 930 00:40:07,890 --> 00:40:10,140 Or who started with these large lecture 931 00:40:10,140 --> 00:40:15,060 halls and these same maybe Licklider-based vision 932 00:40:15,060 --> 00:40:17,940 for how is education going to look once I have a textbook? 933 00:40:17,940 --> 00:40:19,530 And so now, how is-- 934 00:40:19,530 --> 00:40:21,900 you need the same people to start taking a look at, 935 00:40:21,900 --> 00:40:25,800 how is education going to look in this blended learning model? 
936 00:40:25,800 --> 00:40:27,300 And I don't know-- 937 00:40:27,300 --> 00:40:30,180 even if MIT or Stanford and all these people 938 00:40:30,180 --> 00:40:33,380 have any sort of idea of what that vision 939 00:40:33,380 --> 00:40:35,115 for that blended learning model is. 940 00:40:35,115 --> 00:40:36,490 WILLIAM BONVILLIAN: Well, I think 941 00:40:36,490 --> 00:40:39,180 we're starting to look at it in a pretty serious way here 942 00:40:39,180 --> 00:40:43,200 and a number of other places, as well. 943 00:40:43,200 --> 00:40:46,200 Good intellectual work is ongoing. 944 00:40:46,200 --> 00:40:49,683 Let me close off with a couple of points about this. 945 00:40:49,683 --> 00:40:51,600 And let's be sure to hold on to your question, 946 00:40:51,600 --> 00:40:53,725 because I think it'll be particularly relevant when 947 00:40:53,725 --> 00:40:56,010 we talk about the learning science piece too, Rashid. 948 00:40:58,710 --> 00:41:00,750 One thing that we're learning about these MOOCs 949 00:41:00,750 --> 00:41:06,340 is, how many people want to go and sit 950 00:41:06,340 --> 00:41:08,860 in front of blue screens in their basement 951 00:41:08,860 --> 00:41:12,552 and take a one-off course? 952 00:41:12,552 --> 00:41:13,480 MARTIN: Bill Gates. 953 00:41:13,480 --> 00:41:15,022 WILLIAM BONVILLIAN: Maybe Bill Gates. 954 00:41:15,022 --> 00:41:16,360 [LAUGHTER] 955 00:41:16,360 --> 00:41:18,460 I have to confess that I probably 956 00:41:18,460 --> 00:41:22,390 came to college because I wanted the degree, right? 957 00:41:22,390 --> 00:41:25,270 That's probably why I took all those courses in the end. 958 00:41:25,270 --> 00:41:27,130 I wanted something that would translate 959 00:41:27,130 --> 00:41:29,170 into economic gain for me. 960 00:41:29,170 --> 00:41:30,850 That's obviously a profound motivator. 
961 00:41:30,850 --> 00:41:34,360 So platforms that are organized simply 962 00:41:34,360 --> 00:41:40,930 around offering one-off courses without a coherent framework, 963 00:41:40,930 --> 00:41:43,360 without a resulting credential, that's 964 00:41:43,360 --> 00:41:45,040 not a very good economic model. 965 00:41:45,040 --> 00:41:48,970 So the MOOC offerers have been very much moving 966 00:41:48,970 --> 00:41:51,430 in the direction of trying to find 967 00:41:51,430 --> 00:41:54,970 some kind of certificate that will certify that you've 968 00:41:54,970 --> 00:41:56,510 accomplished something. 969 00:41:56,510 --> 00:42:01,908 So a lot of people will be just interested in content. 970 00:42:01,908 --> 00:42:03,450 And I've certainly taken MOOC courses 971 00:42:03,450 --> 00:42:05,230 just because I was intrigued with the content. 972 00:42:05,230 --> 00:42:06,688 But a lot of people are going to be 973 00:42:06,688 --> 00:42:10,360 motivated by that credential. 974 00:42:10,360 --> 00:42:15,130 And MOOCs are moving in the direction 975 00:42:15,130 --> 00:42:19,570 of offering credentials for completing things. 976 00:42:19,570 --> 00:42:23,748 I mean, Georgia Tech and Udacity have a master's degree 977 00:42:23,748 --> 00:42:25,540 in computer science now, which is available 978 00:42:25,540 --> 00:42:27,980 online at a remarkably reasonable price-- far 979 00:42:27,980 --> 00:42:34,840 below on-campus tuition levels. 980 00:42:34,840 --> 00:42:37,360 MIT is, in a very innovative way, 981 00:42:37,360 --> 00:42:40,360 doing these mini master's courses. 982 00:42:40,360 --> 00:42:43,870 I think we did one in an area MIT is famous for, 983 00:42:43,870 --> 00:42:45,250 which is supply chain management. 984 00:42:45,250 --> 00:42:46,625 And there's several more to come. 985 00:42:46,625 --> 00:42:50,080 The D-Lab has just picked one up, or the world development 986 00:42:50,080 --> 00:42:51,070 community is doing one. 
987 00:42:51,070 --> 00:42:52,990 There's going to be more at MIT. 988 00:42:52,990 --> 00:42:56,140 That's a very interesting model. 989 00:42:56,140 --> 00:42:58,930 You get a, quote, "micro master's certificate" 990 00:42:58,930 --> 00:43:00,820 for completing the course. 991 00:43:00,820 --> 00:43:04,750 And you pay, because there's got to be assessment, 992 00:43:04,750 --> 00:43:05,405 and that costs. 993 00:43:05,405 --> 00:43:07,030 Somebody has got to do that assessment, 994 00:43:07,030 --> 00:43:09,130 so that's going to be a cost. 995 00:43:09,130 --> 00:43:12,220 So you pay a modest amount for that assessment, 996 00:43:12,220 --> 00:43:14,950 and you take a year-long micro master's course. 997 00:43:14,950 --> 00:43:17,660 And then MIT, if you've done really well-- 998 00:43:17,660 --> 00:43:22,000 MIT will accept you for a semester on campus. 999 00:43:22,000 --> 00:43:24,277 Now, they can only offer that to a modest number 1000 00:43:24,277 --> 00:43:26,860 of the thousands of people that are getting the micro master's 1001 00:43:26,860 --> 00:43:27,360 degree. 1002 00:43:27,360 --> 00:43:29,470 But interestingly, other universities 1003 00:43:29,470 --> 00:43:34,780 now, once you complete the MIT micro master's piece, 1004 00:43:34,780 --> 00:43:39,270 they are enabling you to come on campus 1005 00:43:39,270 --> 00:43:41,170 and get a real master's in the course 1006 00:43:41,170 --> 00:43:43,340 of an additional semester of work. 1007 00:43:43,340 --> 00:43:46,390 So that's a very interesting model. 1008 00:43:46,390 --> 00:43:49,420 That's obviously much more of a blended model. 1009 00:43:49,420 --> 00:43:51,610 You get a lot of content and information, 1010 00:43:51,610 --> 00:43:53,980 which online is pretty good at on the online side. 
1011 00:43:53,980 --> 00:43:57,250 And then you really come in for an intensive classroom 1012 00:43:57,250 --> 00:44:01,270 experience, which is much more the way in which they designed 1013 00:44:01,270 --> 00:44:02,590 the micro master's here. 1014 00:44:02,590 --> 00:44:04,048 When you do the real master's, it's 1015 00:44:04,048 --> 00:44:05,990 much more learning by doing. 1016 00:44:05,990 --> 00:44:06,490 Steph? 1017 00:44:06,490 --> 00:44:07,450 STEPH: Well, I just wanted to note 1018 00:44:07,450 --> 00:44:10,260 that a similar model exists at the Harvard Extension School, 1019 00:44:10,260 --> 00:44:12,460 except it's obviously in person. 1020 00:44:12,460 --> 00:44:14,940 The extension school is perceived as a back door 1021 00:44:14,940 --> 00:44:16,840 into Harvard in the same way that you 1022 00:44:16,840 --> 00:44:19,930 can take one-off courses at MIT if you talk to a professor 1023 00:44:19,930 --> 00:44:21,060 and they approve you. 1024 00:44:21,060 --> 00:44:22,840 So there's lots of opportunities, 1025 00:44:22,840 --> 00:44:26,980 I think, to take a look at Harvard's extension school 1026 00:44:26,980 --> 00:44:29,272 model to think about how blended learning might happen. 1027 00:44:29,272 --> 00:44:30,355 WILLIAM BONVILLIAN: Right. 1028 00:44:30,355 --> 00:44:31,900 And MIT used to have an extension 1029 00:44:31,900 --> 00:44:35,590 school in its early days, called the Lowell Institute, which 1030 00:44:35,590 --> 00:44:37,180 is essentially an adult education 1031 00:44:37,180 --> 00:44:41,680 program that was available for citizens in Boston-- 1032 00:44:41,680 --> 00:44:45,910 very innovative and interesting, on its old historic campus, 1033 00:44:45,910 --> 00:44:48,100 right across the river in Back Bay. 1034 00:44:48,100 --> 00:44:49,720 Somehow that got lost in the process. 1035 00:44:49,720 --> 00:44:54,280 But in a way, MIT has just opened a whole new school. 1036 00:44:54,280 --> 00:44:56,080 That's MITx. 
1037 00:44:56,080 --> 00:44:57,018 And it's a massive-- 1038 00:44:57,018 --> 00:44:58,060 MARTIN: It's very global. 1039 00:44:58,060 --> 00:45:00,970 WILLIAM BONVILLIAN: --worldwide global extension school. 1040 00:45:00,970 --> 00:45:02,560 So there's a whole new school here. 1041 00:45:02,560 --> 00:45:04,870 We just haven't fully recognized it yet. 1042 00:45:04,870 --> 00:45:09,160 And, look, lifelong learning may be the best application. 1043 00:45:09,160 --> 00:45:14,380 Because in lifelong learning, in theory at least, 1044 00:45:14,380 --> 00:45:20,140 you've already learned those explanatory skills, 1045 00:45:20,140 --> 00:45:22,570 those oral presentation skills. 1046 00:45:22,570 --> 00:45:25,370 You've already learned those writing skills. 1047 00:45:25,370 --> 00:45:29,380 You've already got a lot of those fundamental pieces, which 1048 00:45:29,380 --> 00:45:31,715 are largely a part of the undergraduate education scene, 1049 00:45:31,715 --> 00:45:33,340 but also part of the high school scene. 1050 00:45:33,340 --> 00:45:35,890 In theory, you've got some of those down. 1051 00:45:35,890 --> 00:45:41,140 And then what you're in for, after doing work 1052 00:45:41,140 --> 00:45:45,620 in a relevant area, you're in to enhance your career or skill 1053 00:45:45,620 --> 00:45:48,310 set in a lifelong learning setting. 1054 00:45:48,310 --> 00:45:51,190 The average age of a student at a community college is 29. 1055 00:45:51,190 --> 00:45:53,370 In other words, they're already in the workforce. 1056 00:45:53,370 --> 00:45:56,740 40% of what community colleges now offer 1057 00:45:56,740 --> 00:46:00,330 are certificates in quasi-professional areas 1058 00:46:00,330 --> 00:46:02,530 that, in effect, certify to employers that you've 1059 00:46:02,530 --> 00:46:04,920 got an additional skill set. 
1060 00:46:04,920 --> 00:46:08,790 That's very interesting, that lifelong learning opportunity 1061 00:46:08,790 --> 00:46:11,160 to upgrade your skill set and qualify you 1062 00:46:11,160 --> 00:46:14,160 for a new set of career areas. 1063 00:46:14,160 --> 00:46:15,690 That's a really important feature. 1064 00:46:15,690 --> 00:46:17,070 That's a really important problem 1065 00:46:17,070 --> 00:46:22,830 for those people who got left behind through this failure 1066 00:46:22,830 --> 00:46:26,970 of our higher education system to graduate enough people. 1067 00:46:26,970 --> 00:46:29,520 That's a really interesting, promising option 1068 00:46:29,520 --> 00:46:33,400 for what online can help us with. 1069 00:46:33,400 --> 00:46:37,560 And lifelong learning may be the best tool 1070 00:46:37,560 --> 00:46:41,580 we've got that MOOCs and online education can apply to. 1071 00:46:41,580 --> 00:46:46,110 So those are some features of what 1072 00:46:46,110 --> 00:46:49,170 online education may offer us. 1073 00:46:49,170 --> 00:46:51,450 The economic model, we're still feeling our way. 1074 00:46:51,450 --> 00:46:52,630 What's the business model? 1075 00:46:52,630 --> 00:46:54,960 How are these cost centers going to justify themselves? 1076 00:46:54,960 --> 00:46:58,530 We're not there yet, but it looks like this certificate, 1077 00:46:58,530 --> 00:47:02,490 micro master's kind of stuff is probably a good way to do this. 1078 00:47:02,490 --> 00:47:07,560 Lifelong learning may be another very good way to do it. 1079 00:47:07,560 --> 00:47:11,160 Will universities be willing? 1080 00:47:11,160 --> 00:47:13,590 In other words, we're sort of at a situation 1081 00:47:13,590 --> 00:47:16,200 now where universities offer a bunch of MOOCs 1082 00:47:16,200 --> 00:47:18,480 over on one side, purely online. 1083 00:47:18,480 --> 00:47:22,170 And then they're running their same old university 1084 00:47:22,170 --> 00:47:24,630 with lecture classes. 
1085 00:47:24,630 --> 00:47:25,920 And nothing really changes. 1086 00:47:25,920 --> 00:47:29,070 Because as we've discussed, the blended model 1087 00:47:29,070 --> 00:47:32,400 is what's key to the reform, because that will drive 1088 00:47:32,400 --> 00:47:34,650 a new kind of classroom, the kind of cultural change 1089 00:47:34,650 --> 00:47:36,330 you're talking about, Steph, right? 1090 00:47:36,330 --> 00:47:37,950 It could be a driver for this. 1091 00:47:37,950 --> 00:47:40,500 So unless universities are willing to really think hard 1092 00:47:40,500 --> 00:47:44,100 about the blended model, then we won't get the revolution 1093 00:47:44,100 --> 00:47:47,220 that we really need, a kind of transformation of what goes on 1094 00:47:47,220 --> 00:47:49,580 in the classroom, and then full utilization 1095 00:47:49,580 --> 00:47:53,640 of this whole new tool set that gives us essentially 1096 00:47:53,640 --> 00:47:56,190 information and content access. 1097 00:47:56,190 --> 00:47:59,360 So we're still going to need-- and the term 1098 00:47:59,360 --> 00:48:02,610 I think you were looking for, Steph, in your argument, 1099 00:48:02,610 --> 00:48:04,740 and it's a term that we use in the legacy sector 1100 00:48:04,740 --> 00:48:07,750 analysis-- who are the change agents going to be? 1101 00:48:07,750 --> 00:48:09,660 How are we going to drive change here? 1102 00:48:09,660 --> 00:48:11,950 And then that gets us into our next reading. 1103 00:48:11,950 --> 00:48:13,830 MARTIN: Well, but you'd also change the university business 1104 00:48:13,830 --> 00:48:14,070 model. 1105 00:48:14,070 --> 00:48:15,840 Because I know when I was on the executive board 1106 00:48:15,840 --> 00:48:17,257 for my fraternity, we were looking 1107 00:48:17,257 --> 00:48:19,170 at a lot of internal reports for MIT. 1108 00:48:19,170 --> 00:48:22,330 And they make a lot more money off of grad students 1109 00:48:22,330 --> 00:48:23,550 than undergrads usually. 
1110 00:48:23,550 --> 00:48:26,430 So potentially becoming more of-- because MIT 1111 00:48:26,430 --> 00:48:28,702 is pretty much just a research institution. 1112 00:48:28,702 --> 00:48:30,910 WILLIAM BONVILLIAN: Well, MIT provides huge subsidies 1113 00:48:30,910 --> 00:48:33,580 to graduate students in a massive system of fellowships. 1114 00:48:33,580 --> 00:48:34,830 So they're not paying tuition. 1115 00:48:34,830 --> 00:48:36,288 MARTIN: But they end up doing a lot 1116 00:48:36,288 --> 00:48:39,210 of prolific work that MIT is known for and research studies. 1117 00:48:39,210 --> 00:48:40,560 WILLIAM BONVILLIAN: In terms of the research side, yes. 1118 00:48:40,560 --> 00:48:42,393 MARTIN: What is the output of the institute? 1119 00:48:42,393 --> 00:48:45,660 So ideally-- or the way I've seen it usually is-- 1120 00:48:45,660 --> 00:48:47,550 WILLIAM BONVILLIAN: Yes, research output, 1121 00:48:47,550 --> 00:48:48,840 graduate students are core. 1122 00:48:48,840 --> 00:48:51,276 But tuition, believe me, undergraduates are key. 1123 00:48:51,276 --> 00:48:53,200 MARTIN: But I feel like a better business model would probably 1124 00:48:53,200 --> 00:48:54,980 be getting government funding for certain research 1125 00:48:54,980 --> 00:48:56,022 that has to happen for the government 1126 00:48:56,022 --> 00:48:57,230 or something like that. 1127 00:48:57,230 --> 00:48:58,980 I mean, you get a lot more money that way. 1128 00:48:58,980 --> 00:49:00,063 WILLIAM BONVILLIAN: Right. 1129 00:49:01,870 --> 00:49:02,370 We'll see. 1130 00:49:02,370 --> 00:49:03,995 I'm not yet clear who is going to drive 1131 00:49:03,995 --> 00:49:10,070 this revolution of the blended learning education model. 1132 00:49:10,070 --> 00:49:11,380 MIT is now exploring it. 1133 00:49:11,380 --> 00:49:14,190 I mean, there are over 90 online classes. 
1134 00:49:14,190 --> 00:49:17,160 And over 90% of MIT students-- you all can tell me this is 1135 00:49:17,160 --> 00:49:19,320 true-- 1136 00:49:19,320 --> 00:49:23,880 according to my friend Sanjay Sarma, VP for online learning, 1137 00:49:23,880 --> 00:49:26,160 Sanjay says 90% of undergraduate students 1138 00:49:26,160 --> 00:49:29,040 now have a blended learning experience 1139 00:49:29,040 --> 00:49:30,995 in their undergraduate careers. 1140 00:49:30,995 --> 00:49:33,880 Does that sound right to you all? 1141 00:49:33,880 --> 00:49:36,378 OK, I'm reassured. 1142 00:49:36,378 --> 00:49:37,920 Let me try to lay out, and then we'll 1143 00:49:37,920 --> 00:49:42,930 have 20 minutes plus for discussion. 1144 00:49:42,930 --> 00:49:47,910 The authors of this 2016 report on learning science 1145 00:49:47,910 --> 00:49:53,030 were Sanjay Sarma, Karen Willcox at AeroAstro, 1146 00:49:53,030 --> 00:49:54,930 Eric Klopfer, who I mentioned before, 1147 00:49:54,930 --> 00:49:58,507 and Philip Lippel in my office in Washington. 1148 00:50:03,380 --> 00:50:07,740 The new online platforms, as I said a little earlier, 1149 00:50:07,740 --> 00:50:10,410 came out of computer science departments, 1150 00:50:10,410 --> 00:50:14,367 because they saw these incredible broadband access 1151 00:50:14,367 --> 00:50:14,950 opportunities. 1152 00:50:14,950 --> 00:50:16,242 And believe me, they were real. 1153 00:50:16,242 --> 00:50:18,840 I was in Egypt about four years ago teaching 1154 00:50:18,840 --> 00:50:21,720 at the American University of Cairo for about 10 days 1155 00:50:21,720 --> 00:50:27,870 or participating in programs there lecturing. 1156 00:50:27,870 --> 00:50:34,720 And I realized that the availability of these things 1157 00:50:34,720 --> 00:50:36,430 is a complete worldwide phenomenon. 1158 00:50:36,430 --> 00:50:42,380 And I asked my Egyptian friends, how many people in your country 1159 00:50:42,380 --> 00:50:47,638 do you think have access to an iPhone? 
1160 00:50:47,638 --> 00:50:49,430 They thought about it, and they said, well, 1161 00:50:49,430 --> 00:50:51,763 you know, between friends and neighbors and connections, 1162 00:50:51,763 --> 00:50:56,690 we probably have well over 80% of our population with access 1163 00:50:56,690 --> 00:50:57,500 to iPhones. 1164 00:50:57,500 --> 00:51:03,110 So in other words, this is a pretty poor developing country. 1165 00:51:03,110 --> 00:51:05,480 That's an astonishing access system. 1166 00:51:05,480 --> 00:51:07,110 That's an amazing access system. 1167 00:51:07,110 --> 00:51:09,490 There's a whole new delivery vehicle out there. 1168 00:51:09,490 --> 00:51:11,240 And the computer science departments, 1169 00:51:11,240 --> 00:51:13,107 logically, saw that first. 1170 00:51:13,107 --> 00:51:14,690 So they realized there was a new tool, 1171 00:51:14,690 --> 00:51:20,540 and they raced to create content to do that, largely just 1172 00:51:20,540 --> 00:51:22,760 videoing classrooms, right? 1173 00:51:25,340 --> 00:51:31,790 And what this group felt was, look, 1174 00:51:31,790 --> 00:51:37,600 we better figure out how to optimize the online education 1175 00:51:37,600 --> 00:51:39,440 pieces that we're now doing. 1176 00:51:39,440 --> 00:51:43,840 How do we understand what's been happening in cognitive science 1177 00:51:43,840 --> 00:51:46,090 and learning science, which comes out of the education 1178 00:51:46,090 --> 00:51:49,600 side, and the neuroscience, and take advantage 1179 00:51:49,600 --> 00:51:52,540 of what they've been learning about learning in the last 15 1180 00:51:52,540 --> 00:51:57,820 years and try and embed those capabilities 1181 00:51:57,820 --> 00:52:04,450 for better learning into our online courses? 1182 00:52:04,450 --> 00:52:10,030 It was a great aim, because the MOOC development community 1183 00:52:10,030 --> 00:52:13,150 hadn't frankly, seriously, in an organized way, 1184 00:52:13,150 --> 00:52:14,200 thought about this. 
1185 00:52:14,200 --> 00:52:17,290 So this report made a real contribution. 1186 00:52:17,290 --> 00:52:20,690 And there's now a whole research community. 1187 00:52:20,690 --> 00:52:26,870 And I attended their annual meeting last month at MIT. 1188 00:52:26,870 --> 00:52:29,200 And it was a community from all over the world. 1189 00:52:29,200 --> 00:52:32,750 Universities from everywhere were present. 1190 00:52:32,750 --> 00:52:34,070 It was fascinating. 1191 00:52:34,070 --> 00:52:37,570 So there were four key recommendations. 1192 00:52:37,570 --> 00:52:40,630 Integrate learning science from education 1193 00:52:40,630 --> 00:52:43,540 with cognitive science and neuroscience. 1194 00:52:43,540 --> 00:52:45,130 These are disciplinary communities 1195 00:52:45,130 --> 00:52:48,980 that never talk to each other, that are in no communication. 1196 00:52:48,980 --> 00:52:52,065 They're coming up with significant territories, 1197 00:52:52,065 --> 00:52:54,190 and there could be a lot of benefit from crossover. 1198 00:52:54,190 --> 00:52:56,680 So that was one of the points. 1199 00:52:56,680 --> 00:52:59,710 Optimally structured online courses and modules 1200 00:52:59,710 --> 00:53:03,310 can be an important facilitator for higher education, emphasis 1201 00:53:03,310 --> 00:53:05,830 on optimally structured. 1202 00:53:05,830 --> 00:53:08,283 So there are all these phenomena. 1203 00:53:08,283 --> 00:53:09,700 And this report looked at a number 1204 00:53:09,700 --> 00:53:14,500 of the phenomena that affect learning, like mind wandering, 1205 00:53:14,500 --> 00:53:17,650 segmenting learning into more bite-sized, understandable, 1206 00:53:17,650 --> 00:53:20,020 manageable pieces. 1207 00:53:20,020 --> 00:53:21,640 Retrieval learning-- in other words, 1208 00:53:21,640 --> 00:53:25,210 what's the right mix between study and test-- 1209 00:53:25,210 --> 00:53:27,280 the reinforcement mechanism. 1210 00:53:27,280 --> 00:53:28,780 How do you do spaced retrieval? 
1211 00:53:28,780 --> 00:53:30,460 In other words, recovery of information 1212 00:53:30,460 --> 00:53:31,940 over an extended period of time. 1213 00:53:31,940 --> 00:53:34,000 How do you optimize that? 1214 00:53:34,000 --> 00:53:35,470 What's the role of curiosity? 1215 00:53:35,470 --> 00:53:38,380 Can you use that as a driver in the learning space? 1216 00:53:38,380 --> 00:53:40,840 So they began to look at all of these phenomena, which 1217 00:53:40,840 --> 00:53:43,198 have actually been studied in a number of these fields, 1218 00:53:43,198 --> 00:53:44,740 and attempted to incorporate them in. 1219 00:53:44,740 --> 00:53:48,220 I'm not going to try to wander through each one 1220 00:53:48,220 --> 00:53:49,840 of these elements. 1221 00:53:49,840 --> 00:53:54,220 But here's some of the literature. 1222 00:53:54,220 --> 00:53:58,750 And the MRI scans show you mind wandering. 1223 00:53:58,750 --> 00:54:01,030 I mean, different parts of the brain 1224 00:54:01,030 --> 00:54:05,100 are zipping around the different territories. 1225 00:54:05,100 --> 00:54:08,620 And it's a phenomenon that needs to be accounted for. 1226 00:54:12,550 --> 00:54:14,620 It's a very natural phenomenon. 1227 00:54:14,620 --> 00:54:18,580 Mind wandering is-- it's Darwinian. 1228 00:54:18,580 --> 00:54:21,550 It's very important for our minds to wander. 1229 00:54:21,550 --> 00:54:24,797 If we're able to completely stay on topic, 1230 00:54:24,797 --> 00:54:26,380 the saber-toothed tiger would probably 1231 00:54:26,380 --> 00:54:29,140 leap through our thatched cottage and finish us off. 1232 00:54:29,140 --> 00:54:30,700 We've got to be keeping an eye out 1233 00:54:30,700 --> 00:54:32,620 for the saber-toothed tiger. 1234 00:54:32,620 --> 00:54:33,940 So this is very Darwinian. 1235 00:54:33,940 --> 00:54:37,270 It's an important skill to have mind wandering and the ability 1236 00:54:37,270 --> 00:54:39,670 to move from topic to topic and area to area. 
1237 00:54:39,670 --> 00:54:42,310 It's an important part of the creativity of the brain. 1238 00:54:42,310 --> 00:54:44,950 But it's been viewed, historically, 1239 00:54:44,950 --> 00:54:47,710 as an enemy of learning. 1240 00:54:47,710 --> 00:54:49,810 Teachers want students on the task 1241 00:54:49,810 --> 00:54:55,030 for that full time of the classroom experience 1242 00:54:55,030 --> 00:54:57,170 and are worried when they get distracted. 1243 00:54:57,170 --> 00:54:59,470 So how do you do this? 1244 00:54:59,470 --> 00:55:01,630 One approach is to segment learning 1245 00:55:01,630 --> 00:55:03,430 into bite-sized pieces. 1246 00:55:03,430 --> 00:55:06,400 So we attempt to do a bit of that in this class 1247 00:55:06,400 --> 00:55:13,760 with 10, 15 minutes of talking head, i.e. me, and then turning 1248 00:55:13,760 --> 00:55:16,280 the discussion over to you all. 1249 00:55:16,280 --> 00:55:19,040 So that you've got varying things occurring, 1250 00:55:19,040 --> 00:55:21,770 and you never have much more than 10 or 15 minutes. 1251 00:55:21,770 --> 00:55:23,630 And it probably should be 8 to 10, 1252 00:55:23,630 --> 00:55:29,330 frankly, on a single presenter. 1253 00:55:29,330 --> 00:55:31,730 And then you move to attempt to absorb 1254 00:55:31,730 --> 00:55:35,090 the material in the discussion and have 1255 00:55:35,090 --> 00:55:38,328 folks who were presenters of that material 1256 00:55:38,328 --> 00:55:39,620 develop the content themselves. 1257 00:55:39,620 --> 00:55:40,910 SPEAKER 2: What is that graph? 1258 00:55:40,910 --> 00:55:41,910 WILLIAM BONVILLIAN: Hmm? 1259 00:55:41,910 --> 00:55:43,433 SPEAKER 2: What is that graph? 1260 00:55:43,433 --> 00:55:45,350 WILLIAM BONVILLIAN: You know, it doesn't quite 1261 00:55:45,350 --> 00:55:48,530 match exactly what I was attempting to show to you. 
1262 00:55:48,530 --> 00:55:53,510 But it does attempt to show a tutorial versus a lecture model 1263 00:55:53,510 --> 00:55:57,560 and what the median time frame is in any given limit. 1264 00:55:57,560 --> 00:56:01,130 So in a tutorial, one-on-one setting, 1265 00:56:01,130 --> 00:56:04,280 you can see that there's more focus on topic over an extended 1266 00:56:04,280 --> 00:56:05,050 period of time. 1267 00:56:05,050 --> 00:56:11,480 Whereas your median point starts to switch in a lecture model 1268 00:56:11,480 --> 00:56:12,920 and gets shorter. 1269 00:56:12,920 --> 00:56:15,710 So frankly, tutorials are a better way 1270 00:56:15,710 --> 00:56:17,200 of learning than a lecture. 1271 00:56:17,200 --> 00:56:20,570 But it's a fairly unaffordable model. 1272 00:56:20,570 --> 00:56:24,300 Another lesson-- retrieval learning. 1273 00:56:24,300 --> 00:56:26,150 The study-study-- in other words, 1274 00:56:26,150 --> 00:56:29,180 you study and then you study, right? 1275 00:56:29,180 --> 00:56:33,860 You study and you study it again versus you study and then take 1276 00:56:33,860 --> 00:56:35,990 a test about it. 1277 00:56:35,990 --> 00:56:40,010 And over what period of time, which works best? 1278 00:56:40,010 --> 00:56:43,860 So if I was imposing this graph, we'd have a test every week, 1279 00:56:43,860 --> 00:56:44,600 just so you know. 1280 00:56:44,600 --> 00:56:45,590 I'm sparing you. 1281 00:56:45,590 --> 00:56:47,930 But that would be the optimal model. 1282 00:56:47,930 --> 00:56:53,030 This is one-week retention-- study-study, 42% retention, 1283 00:56:53,030 --> 00:56:54,980 versus 56% for study-test. 1284 00:56:54,980 --> 00:56:59,240 In other words, being forced to regurgitate and focus 1285 00:56:59,240 --> 00:57:02,930 on the material definitely does serve a learning purpose. 1286 00:57:02,930 --> 00:57:06,380 Curiosity does make a difference-- in other words, 1287 00:57:06,380 --> 00:57:10,060 if you can bring curiosity into the game. 
1288 00:57:10,060 --> 00:57:12,920 And there's something famous in education called an Ebbinghaus 1289 00:57:12,920 --> 00:57:15,570 Curve, the forgetting curve. 1290 00:57:15,570 --> 00:57:18,350 In other words, people forget. 1291 00:57:18,350 --> 00:57:22,220 After about a week, it's history. 1292 00:57:22,220 --> 00:57:25,370 And maybe after a month you remember some vague outline. 1293 00:57:25,370 --> 00:57:29,630 So there's a profound forgetting curve in the human mind. 1294 00:57:29,630 --> 00:57:32,690 And can you do spaced retrieval to try and change 1295 00:57:32,690 --> 00:57:34,880 that curve to retain it longer? 1296 00:57:34,880 --> 00:57:37,700 These are all fundamental issues in learning 1297 00:57:37,700 --> 00:57:41,090 that this report attempted to start to grapple with, 1298 00:57:41,090 --> 00:57:41,810 as you saw. 1299 00:57:48,620 --> 00:57:50,570 Recommendations three and four were 1300 00:57:50,570 --> 00:57:54,790 to support the profession of learning engineer. 1301 00:57:54,790 --> 00:57:58,903 So MIT got into this situation where, first of all, 1302 00:57:58,903 --> 00:58:01,070 there's a lot of faculty resistance on every faculty 1303 00:58:01,070 --> 00:58:04,210 of these online courses, because the faculty is thinking, 1304 00:58:04,210 --> 00:58:06,440 this stuff's going to put me out of business, 1305 00:58:06,440 --> 00:58:07,978 they're going to take away my job. 1306 00:58:07,978 --> 00:58:09,895 So there's always a lot of anger, frustration, 1307 00:58:09,895 --> 00:58:13,850 and resentment when the online course arrives. 1308 00:58:13,850 --> 00:58:18,320 So MIT very cleverly attempted to get 1309 00:58:18,320 --> 00:58:25,160 some of its top-noted teachers to take on online courses. 1310 00:58:25,160 --> 00:58:28,160 In other words, if the most respected of your peers 1311 00:58:28,160 --> 00:58:30,620 are doing these online courses, how can you complain? 1312 00:58:30,620 --> 00:58:33,410 That was a model here. 
1313 00:58:33,410 --> 00:58:36,740 And they were able to do it on the sales point 1314 00:58:36,740 --> 00:58:41,270 to the faculty member that, hey, how many people have you 1315 00:58:41,270 --> 00:58:42,980 taught in your lifetime at MIT? 1316 00:58:42,980 --> 00:58:44,780 A few thousand? 1317 00:58:44,780 --> 00:58:46,760 Your first course is going to have 30,000, 1318 00:58:46,760 --> 00:58:49,100 so you're going to be famous. 1319 00:58:49,100 --> 00:58:50,540 That was the sales pitch. 1320 00:58:50,540 --> 00:58:53,810 That's an attractive pitch. 1321 00:58:53,810 --> 00:59:00,500 So MIT got its senior faculty and most respected faculty 1322 00:59:00,500 --> 00:59:05,000 teaching the early suite of a lot of these courses. 1323 00:59:05,000 --> 00:59:09,230 And that was interesting. 1324 00:59:09,230 --> 00:59:12,890 And then they had this experience, right? 1325 00:59:12,890 --> 00:59:15,350 They gave their best lectures, and then 1326 00:59:15,350 --> 00:59:16,850 they built in all this assessment 1327 00:59:16,850 --> 00:59:18,920 and the 10-minute rule and all this kind of stuff 1328 00:59:18,920 --> 00:59:21,890 into the class, and they had to completely redo their lectures 1329 00:59:21,890 --> 00:59:26,060 to fit all the requirements that Sanjay Sarma and the edX team 1330 00:59:26,060 --> 00:59:26,990 were forcing on them. 1331 00:59:26,990 --> 00:59:29,330 So they had to rewrite all their lectures 1332 00:59:29,330 --> 00:59:31,850 to fit the new formatting. 1333 00:59:31,850 --> 00:59:36,320 And then they gave their course and they put it online, 1334 00:59:36,320 --> 00:59:38,480 which, of course, every student can take. 1335 00:59:38,480 --> 00:59:40,520 So then they go back to the regular semester, 1336 00:59:40,520 --> 00:59:45,140 they're giving their lecture class, no one shows up. 1337 00:59:45,140 --> 00:59:47,450 They've all taken the class online already. 1338 00:59:47,450 --> 00:59:49,730 Why are they going to show up? 
1339 00:59:49,730 --> 00:59:52,040 So then the faculty member realized, 1340 00:59:52,040 --> 00:59:54,470 I'm going to get zero attendance here 1341 00:59:54,470 --> 00:59:57,920 unless I develop a completely different kind of content. 1342 00:59:57,920 --> 00:59:59,810 So then the faculty members started 1343 00:59:59,810 --> 01:00:03,470 to, in interesting kind of ways, rethink 1344 01:00:03,470 --> 01:00:06,770 what the content was going to be in the actual face-to-face 1345 01:00:06,770 --> 01:00:08,300 classroom. 1346 01:00:08,300 --> 01:00:10,170 That's not easy for a faculty member who 1347 01:00:10,170 --> 01:00:12,110 has never looked at learning science problems 1348 01:00:12,110 --> 01:00:14,390 or studied MOOCs and studied the technology. 1349 01:00:14,390 --> 01:00:18,460 So MIT began equipping them with graduate students 1350 01:00:18,460 --> 01:00:21,050 who were really interested in this stuff, who 1351 01:00:21,050 --> 01:00:23,330 wanted to use it in their own teaching experiences 1352 01:00:23,330 --> 01:00:26,420 and decided to dig in, realizing they would be better teachers 1353 01:00:26,420 --> 01:00:29,303 and potentially more saleable if they 1354 01:00:29,303 --> 01:00:30,470 had this kind of background. 1355 01:00:30,470 --> 01:00:33,950 So these what MIT called "learning engineers" 1356 01:00:33,950 --> 01:00:36,080 were assigned to the faculty member. 1357 01:00:36,080 --> 01:00:39,280 Now, the faculty member was willing to accept 1358 01:00:39,280 --> 01:00:42,320 the graduate student, if the graduate student 1359 01:00:42,320 --> 01:00:43,555 was studying in their field. 1360 01:00:43,555 --> 01:00:45,680 In other words, they respected the graduate student 1361 01:00:45,680 --> 01:00:50,125 for having mastered the professor's own territory. 
1362 01:00:50,125 --> 01:00:51,500 They weren't interested in having 1363 01:00:51,500 --> 01:00:54,020 graduate students, know-it-alls, trying 1364 01:00:54,020 --> 01:00:56,150 to dictate to them from a totally different field. 1365 01:00:56,150 --> 01:00:59,180 But they were prepared to accept the graduate student, 1366 01:00:59,180 --> 01:01:01,490 if the graduate student had already shown mastery 1367 01:01:01,490 --> 01:01:02,723 of their own field. 1368 01:01:02,723 --> 01:01:03,890 That mastery was respected. 1369 01:01:03,890 --> 01:01:07,160 So that was another piece in this learning engineer 1370 01:01:07,160 --> 01:01:09,080 experience at MIT. 1371 01:01:09,080 --> 01:01:11,780 And it's worked, so we created this whole community 1372 01:01:11,780 --> 01:01:12,710 of graduate students. 1373 01:01:12,710 --> 01:01:14,918 When we talk about how to bring about the revolution, 1374 01:01:14,918 --> 01:01:17,570 a whole bunch of graduate students now at MIT 1375 01:01:17,570 --> 01:01:19,940 are getting experience as learning engineers, 1376 01:01:19,940 --> 01:01:23,450 helping these ossified faculty members, 1377 01:01:23,450 --> 01:01:28,190 like me, learn how to do this fancy MOOC, 1378 01:01:28,190 --> 01:01:30,140 online, blended learning, change around 1379 01:01:30,140 --> 01:01:32,030 your face-to-face classroom, optimized 1380 01:01:32,030 --> 01:01:33,230 learning experience stuff. 1381 01:01:33,230 --> 01:01:36,080 And that's, you know, these are all change models. 1382 01:01:36,080 --> 01:01:38,630 These are all potential ways in which 1383 01:01:38,630 --> 01:01:41,180 change agents can operate. 1384 01:01:41,180 --> 01:01:45,050 But there's a very deep question about, 1385 01:01:45,050 --> 01:01:48,470 how do we do a change model within a higher education 1386 01:01:48,470 --> 01:01:49,252 legacy sector? 1387 01:01:49,252 --> 01:01:50,960 Because it's a very decentralized system. 
1388 01:01:50,960 --> 01:01:53,180 And MIT's learning lessons, and other universities 1389 01:01:53,180 --> 01:01:54,890 that are doing this are learning lessons. 1390 01:01:54,890 --> 01:01:56,690 But how do we exchange these lessons 1391 01:01:56,690 --> 01:01:59,297 and get these lessons adopted through the community? 1392 01:01:59,297 --> 01:02:00,380 That's a tough assignment. 1393 01:02:00,380 --> 01:02:04,113 That's where all of Steph's points about implementation 1394 01:02:04,113 --> 01:02:06,780 come in: how do we get change agents willing to step up to the plate 1395 01:02:06,780 --> 01:02:08,690 and drive this stuff? 1396 01:02:08,690 --> 01:02:11,450 STEPH: And get them respected, as you've just noted. 1397 01:02:11,450 --> 01:02:12,770 WILLIAM BONVILLIAN: Yes. 1398 01:02:12,770 --> 01:02:13,910 All right. 1399 01:02:13,910 --> 01:02:17,540 We've got 20 minutes for Q&A. Lily, it's you. 1400 01:02:17,540 --> 01:02:18,820 LILY: That's me. 1401 01:02:18,820 --> 01:02:21,807 I will start-- which one would you like me to start with? 1402 01:02:21,807 --> 01:02:23,682 WILLIAM BONVILLIAN: Whichever one you prefer. 1403 01:02:23,682 --> 01:02:25,570 LILY: Hmm, that's tough. 1404 01:02:25,570 --> 01:02:26,416 OK. 1405 01:02:26,416 --> 01:02:30,800 Let's start-- we were ending with the MITx online learning, 1406 01:02:30,800 --> 01:02:35,722 so let's go to that paper and discuss it. 1407 01:02:35,722 --> 01:02:36,680 WILLIAM BONVILLIAN: OK. 1408 01:02:36,680 --> 01:02:39,290 LILY: A couple of people had questions, including myself, 1409 01:02:39,290 --> 01:02:45,050 as to what subject areas the online or blended learning 1410 01:02:45,050 --> 01:02:46,210 could be-- 1411 01:02:46,210 --> 01:02:50,020 or what subject areas could readily adapt those practices, 1412 01:02:50,020 --> 01:02:53,500 and if it's just impossible to really implement 1413 01:02:53,500 --> 01:02:58,040 this sort of style in some subject areas. 
1414 01:02:58,040 --> 01:03:01,270 So, for example, I would not have 1415 01:03:01,270 --> 01:03:06,622 wanted to take this course if it was an online course. 1416 01:03:06,622 --> 01:03:08,080 I don't have a three-hour attention 1417 01:03:08,080 --> 01:03:09,760 span sitting at my computer. 1418 01:03:09,760 --> 01:03:11,770 And I wouldn't have-- 1419 01:03:11,770 --> 01:03:15,070 sure, you could have a discussion group, 1420 01:03:15,070 --> 01:03:17,470 and everyone tunes in on their own devices, 1421 01:03:17,470 --> 01:03:20,670 and so you could have this pseudo-interactive discussion. 1422 01:03:20,670 --> 01:03:24,010 But there is just something about sitting around a seminar 1423 01:03:24,010 --> 01:03:27,940 table and interacting with other humans, for me. 1424 01:03:27,940 --> 01:03:31,450 So I wanted to hear what you all thought about the applicability 1425 01:03:31,450 --> 01:03:36,810 of the MITx in certain fields. 1426 01:03:36,810 --> 01:03:38,550 SPEAKER 1: Yeah, I've always questioned 1427 01:03:38,550 --> 01:03:41,220 how either it could be improved or whether it 1428 01:03:41,220 --> 01:03:45,120 is suitable for a lot of engineering classes, 1429 01:03:45,120 --> 01:03:48,390 just from my experience both as a student, 1430 01:03:48,390 --> 01:03:50,400 and then as a TA grading stuff. 1431 01:03:53,100 --> 01:03:54,690 So many of the questions are not just 1432 01:03:54,690 --> 01:03:56,710 about getting the right answer. 1433 01:03:56,710 --> 01:03:58,380 It's about making sure you understand 1434 01:03:58,380 --> 01:03:59,860 how to approach the problem. 1435 01:03:59,860 --> 01:04:02,250 Did you just make one little mistake along the way 1436 01:04:02,250 --> 01:04:03,750 and that gives you the wrong answer? 1437 01:04:03,750 --> 01:04:05,910 A lot of times there's not even numbers involved. 
1438 01:04:05,910 --> 01:04:09,960 And so I know they did solid-state chemistry 1439 01:04:09,960 --> 01:04:13,280 as one of the classes that was first really involved with edX. 1440 01:04:13,280 --> 01:04:15,780 And I know the students would get really frustrated with it, 1441 01:04:15,780 --> 01:04:18,450 because they were like, oh, I rounded this number wrong, 1442 01:04:18,450 --> 01:04:21,930 and that's why my answer's off or all these little things 1443 01:04:21,930 --> 01:04:26,113 that edX just wasn't sensitive enough to pick up on. 1444 01:04:26,113 --> 01:04:28,530 And so I don't know if that's a question of the technology 1445 01:04:28,530 --> 01:04:32,220 improving, such that we have better machine learning 1446 01:04:32,220 --> 01:04:34,770 artificial intelligence to tell what students are doing, 1447 01:04:34,770 --> 01:04:37,630 or whether that's just a limitation that will 1448 01:04:37,630 --> 01:04:41,130 be present with this field. 1449 01:04:41,130 --> 01:04:41,630 CHLOE: Yeah. 
1450 01:04:41,630 --> 01:04:44,390 I think building onto that and from our earlier discussion 1451 01:04:44,390 --> 01:04:47,660 about whether teachers and professors still 1452 01:04:47,660 --> 01:04:51,230 have a role in much more 1453 01:04:51,230 --> 01:04:52,970 online or automated education: 1454 01:04:52,970 --> 01:04:56,090 one of the important features of being physically 1455 01:04:56,090 --> 01:04:57,800 in a classroom or interacting with a TA 1456 01:04:57,800 --> 01:05:00,258 like yourself, or with other students, or with a professor, 1457 01:05:00,258 --> 01:05:03,740 is that their most important-- even most 1458 01:05:03,740 --> 01:05:06,740 valuable-- skill, as opposed to knowing 1459 01:05:06,740 --> 01:05:12,680 their material inside and out, is having the teaching 1460 01:05:12,680 --> 01:05:17,090 ability to identify what your problem is and understand 1461 01:05:17,090 --> 01:05:21,020 the student's psychology and understand the learning 1462 01:05:21,020 --> 01:05:22,100 process. 1463 01:05:22,100 --> 01:05:24,600 They don't just point at your answer and say, this is wrong 1464 01:05:24,600 --> 01:05:27,080 because you did this; they understand why you're wrong 1465 01:05:27,080 --> 01:05:29,900 and then redirect you and course-correct you. 1466 01:05:29,900 --> 01:05:33,470 So I think that will always be an important element of a truly 1467 01:05:33,470 --> 01:05:35,780 well-rounded education, and you only 1468 01:05:35,780 --> 01:05:39,140 get that from interacting with other people. 1469 01:05:39,140 --> 01:05:42,220 KEVIN: I do think that, where technology stands now, 1470 01:05:42,220 --> 01:05:46,530 the face-to-face component is critical in a lot of learning, 1471 01:05:46,530 --> 01:05:48,230 especially in a class like this, right? 
1472 01:05:48,230 --> 01:05:52,160 But what's to say 5, 10, 20 years from now 1473 01:05:52,160 --> 01:05:54,478 we can't all just slap a pair of VR goggles on, 1474 01:05:54,478 --> 01:05:56,270 and then we're sitting in a simulated room, 1475 01:05:56,270 --> 01:05:58,562 and then Bill doesn't need to fly in for every lecture. 1476 01:05:58,562 --> 01:05:59,140 [LAUGHTER] 1477 01:05:59,140 --> 01:06:01,550 You get that same experience-- 1478 01:06:01,550 --> 01:06:03,282 WILLIAM BONVILLIAN: Great idea, Kevin. 1479 01:06:03,282 --> 01:06:04,800 --how we market it. 1480 01:06:04,800 --> 01:06:07,310 And you get the same experience of sitting around, 1481 01:06:07,310 --> 01:06:11,090 but you don't have to leave your couch or wherever you are. 1482 01:06:11,090 --> 01:06:14,130 CHLOE: Yeah, I think that's allowable and achievable. 1483 01:06:14,130 --> 01:06:15,298 But I think-- 1484 01:06:15,298 --> 01:06:16,340 KEVIN: As it stands now-- 1485 01:06:16,340 --> 01:06:17,960 CHLOE: Yeah. 1486 01:06:17,960 --> 01:06:20,120 We don't have to physically be in the same room, 1487 01:06:20,120 --> 01:06:24,170 but there's still a role for a human educator 1488 01:06:24,170 --> 01:06:28,340 and for human students to interact with each other. 1489 01:06:28,340 --> 01:06:33,560 WILLIAM BONVILLIAN: So far, I think, Chloe and Kevin, so far, 1490 01:06:33,560 --> 01:06:38,960 much of our communication is not the words that we're mouthing. 1491 01:06:38,960 --> 01:06:40,550 It's the expressions we use. 1492 01:06:40,550 --> 01:06:42,480 It's eye contact. 1493 01:06:42,480 --> 01:06:45,410 It's hand expressions. 1494 01:06:45,410 --> 01:06:47,840 There's just a whole range of stuff 1495 01:06:47,840 --> 01:06:49,910 that accompanies what we're actually saying, 1496 01:06:49,910 --> 01:06:52,820 that we use as part of our communication systems. 
1497 01:06:52,820 --> 01:06:56,990 And so far, the technology has not 1498 01:06:56,990 --> 01:07:00,290 been precise enough to enable us to capture 1499 01:07:00,290 --> 01:07:04,760 that incredible depth that face-to-face allows. 1500 01:07:04,760 --> 01:07:07,190 And to some extent, it's frustrating. 1501 01:07:07,190 --> 01:07:14,060 In other words, because that mix of non-spoken communication 1502 01:07:14,060 --> 01:07:16,940 skills doesn't get fully picked up, 1503 01:07:16,940 --> 01:07:20,860 people are frustrated by the experience. 1504 01:07:20,860 --> 01:07:23,930 This is a notorious problem in conference calls, 1505 01:07:23,930 --> 01:07:26,600 even in video conference calls, right? 1506 01:07:26,600 --> 01:07:28,700 People don't quite see what the other person 1507 01:07:28,700 --> 01:07:30,710 is driving at or trying to communicate, 1508 01:07:30,710 --> 01:07:33,320 because this raft of other kinds of communication 1509 01:07:33,320 --> 01:07:34,370 is not picked up. 1510 01:07:34,370 --> 01:07:37,943 Now, that's not to say that this won't get a lot better. 1511 01:07:37,943 --> 01:07:39,860 And we've obviously moved to a whole new level 1512 01:07:39,860 --> 01:07:44,360 of high definition and a lot of different machinery. 1513 01:07:44,360 --> 01:07:45,920 So it may well get better. 1514 01:07:45,920 --> 01:07:48,110 At the moment, it's not good enough yet 1515 01:07:48,110 --> 01:07:52,530 to substitute for being next to you. 1516 01:07:52,530 --> 01:07:54,450 MARTIN: But I'd add an addendum, though. 1517 01:07:54,450 --> 01:07:55,825 Because we're kind of seeing how, 1518 01:07:55,825 --> 01:07:58,090 oh, it's not how the original is. 1519 01:07:58,090 --> 01:07:59,300 But form follows function. 1520 01:07:59,300 --> 01:08:00,800 And because it has a different form, 1521 01:08:00,800 --> 01:08:03,260 you have other functions that we wouldn't have here. 
1522 01:08:03,260 --> 01:08:05,060 So if it's all online, then other people 1523 01:08:05,060 --> 01:08:07,130 can be telling me what they got wrong 1524 01:08:07,130 --> 01:08:08,670 or I can get recommendations. 1525 01:08:08,670 --> 01:08:11,193 But the thing is the system hasn't been perfected as well. 1526 01:08:11,193 --> 01:08:12,860 Or like you said, you don't want to stay 1527 01:08:12,860 --> 01:08:14,030 in a three-hour lecture. 1528 01:08:14,030 --> 01:08:16,925 I usually watch videos at 3x if they're educational, 1529 01:08:16,925 --> 01:08:19,385 and I'll just go through the content, go back and forth. 1530 01:08:19,385 --> 01:08:20,510 That's why I talk about it. 1531 01:08:20,510 --> 01:08:22,420 Because it was like, yo, yo, OK. 1532 01:08:22,420 --> 01:08:24,229 But I'd get a lot more done, and I'd just 1533 01:08:24,229 --> 01:08:25,939 focus on the stuff I want to do. 1534 01:08:25,939 --> 01:08:28,910 So there's new functions you can do. 1535 01:08:28,910 --> 01:08:30,620 Another really important aspect that I 1536 01:08:30,620 --> 01:08:32,600 thought of from an organizational standpoint 1537 01:08:32,600 --> 01:08:33,975 is, in this class, certain people 1538 01:08:33,975 --> 01:08:36,345 are going to talk up more. 1539 01:08:36,345 --> 01:08:37,970 And other people that might be more shy 1540 01:08:37,970 --> 01:08:39,689 might not want to say something. 1541 01:08:39,689 --> 01:08:42,350 So that's why you got people that online are talking heads, 1542 01:08:42,350 --> 01:08:44,689 but physically won't say a word. 1543 01:08:44,689 --> 01:08:45,890 So that adds new dynamics. 1544 01:08:45,890 --> 01:08:47,348 And depending on the personalities, 1545 01:08:47,348 --> 01:08:52,323 other people can excel, right? 1546 01:08:52,323 --> 01:08:54,740 WILLIAM BONVILLIAN: That's why I try to make some of you-- 1547 01:08:54,740 --> 01:08:56,500 all of you-- be discussion leaders a lot 1548 01:08:56,500 --> 01:08:57,870 in the course of this semester. 
1549 01:08:57,870 --> 01:09:01,885 MARTIN: I mean, the issue also is racism, equity, sexism. 1550 01:09:01,885 --> 01:09:04,010 Like if you don't know the gender or the background 1551 01:09:04,010 --> 01:09:06,860 of a person, you only judge them based on the ideas. 1552 01:09:06,860 --> 01:09:08,235 Or the content of their character 1553 01:09:08,235 --> 01:09:10,542 isn't a layer that you get. 1554 01:09:10,542 --> 01:09:13,751 CHLOE: I think-- sorry, were you still expanding? 1555 01:09:13,751 --> 01:09:15,126 MARTIN: Well, I was going to make 1556 01:09:15,126 --> 01:09:17,840 a point that that allows other people to lead 1557 01:09:17,840 --> 01:09:20,490 or excel in this setting. 1558 01:09:20,490 --> 01:09:20,990 CHLOE: Yeah. 1559 01:09:20,990 --> 01:09:22,790 I mean, there's two sides to every coin. 1560 01:09:22,790 --> 01:09:25,460 So I agree, the anonymity advantage 1561 01:09:25,460 --> 01:09:27,380 could definitely be a huge plus. 1562 01:09:27,380 --> 01:09:30,170 But, I mean, when you have people trolling your classes, 1563 01:09:30,170 --> 01:09:31,367 that's just as quickly-- 1564 01:09:31,367 --> 01:09:32,200 [INTERPOSING VOICES] 1565 01:09:32,200 --> 01:09:34,540 MARTIN: So the thing is, the post is like Piazza, right? 1566 01:09:34,540 --> 01:09:36,140 Your posts can be anonymous, but the instructors 1567 01:09:36,140 --> 01:09:37,140 know who you are. 1568 01:09:37,140 --> 01:09:37,819 And I think-- 1569 01:09:37,819 --> 01:09:40,450 [LAUGHTER] 1570 01:09:40,450 --> 01:09:42,520 So they can just say, oh, please, 1571 01:09:42,520 --> 01:09:43,770 or they'll just kick you out. 1572 01:09:43,770 --> 01:09:43,880 CHLOE: True. 1573 01:09:43,880 --> 01:09:45,899 MARTIN: And that's very like an 80-20 Pareto principle 1574 01:09:45,899 --> 01:09:46,520 where-- 1575 01:09:46,520 --> 01:09:48,279 I mean, I don't think everyone's going to just start trolling 1576 01:09:48,279 --> 01:09:49,370 unless it becomes a thing. 
1577 01:09:49,370 --> 01:09:51,300 But it's an interesting dynamic. 1578 01:09:51,300 --> 01:09:52,062 LILY: Yeah, Matt? 1579 01:09:52,062 --> 01:09:54,020 MATT: I think even if we got to the point where 1580 01:09:54,020 --> 01:09:55,470 the technology-- well, [INAUDIBLE].. 1581 01:09:55,470 --> 01:09:56,845 Even if we got to the point where 1582 01:09:56,845 --> 01:09:59,480 the technology was perfect and we could simulate 1583 01:09:59,480 --> 01:10:01,790 being in this class together, I think 1584 01:10:01,790 --> 01:10:04,790 there's still a lot of value in the fact that on my way home, 1585 01:10:04,790 --> 01:10:08,270 I'll pass by the MechE lounge, and I'll talk to someone 1586 01:10:08,270 --> 01:10:11,090 about their next startup idea. 1587 01:10:11,090 --> 01:10:14,480 Just like when we talked about decentralizing manufacturing, 1588 01:10:14,480 --> 01:10:17,330 there's a lot of know-how and innovation 1589 01:10:17,330 --> 01:10:19,280 capability embodied in just having 1590 01:10:19,280 --> 01:10:21,942 things together physically. 1591 01:10:21,942 --> 01:10:23,900 But also with the whole blended learning model, 1592 01:10:23,900 --> 01:10:25,490 I don't think anyone's really trying 1593 01:10:25,490 --> 01:10:29,170 to say that we're going to throw away face-to-face interaction 1594 01:10:29,170 --> 01:10:30,590 or anything like that. 1595 01:10:30,590 --> 01:10:33,950 And then back to your original point about, 1596 01:10:33,950 --> 01:10:38,250 what does online learning work really well for? 1597 01:10:38,250 --> 01:10:42,405 I think right now my experience has been it works pretty well, 1598 01:10:42,405 --> 01:10:46,840 or it's always better for quantitative kind of classes. 1599 01:10:46,840 --> 01:10:49,290 But I've been working over the last semester 1600 01:10:49,290 --> 01:10:51,830 and setting up an undergraduate law class here. 
1601 01:10:51,830 --> 01:10:57,260 And one of the initiatives is putting on a lot of the content 1602 01:10:57,260 --> 01:10:58,160 online. 1603 01:10:58,160 --> 01:11:03,290 And what we found that the MOOC format allows us to do 1604 01:11:03,290 --> 01:11:06,050 is give maybe some of the instructional video 1605 01:11:06,050 --> 01:11:08,530 in bite-sized pieces online beforehand, 1606 01:11:08,530 --> 01:11:10,910 and it opens a lot of in-class time 1607 01:11:10,910 --> 01:11:16,220 to actually do a case discussion and opens up new opportunities 1608 01:11:16,220 --> 01:11:17,910 there. 1609 01:11:17,910 --> 01:11:19,070 LILY: OK. 1610 01:11:19,070 --> 01:11:21,170 STEPH: Oh, I was actually going to add to yours. 1611 01:11:21,170 --> 01:11:25,010 I participated in a UROP in STS for a year 1612 01:11:25,010 --> 01:11:27,390 with a professor named Louis Bucciarelli, who started off 1613 01:11:27,390 --> 01:11:29,617 in mechanical engineering, has a PhD also 1614 01:11:29,617 --> 01:11:31,200 in science, technology, and society, I 1615 01:11:31,200 --> 01:11:33,508 think at the same time. 1616 01:11:33,508 --> 01:11:36,050 And he taught for a long time, I think in AeroAstro, as well. 1617 01:11:36,050 --> 01:11:40,460 So he's a very storied professor emeritus now. 1618 01:11:40,460 --> 01:11:41,910 And he's starting a program called 1619 01:11:41,910 --> 01:11:43,730 Liberal Studies in Engineering. 
1620 01:11:43,730 --> 01:11:45,500 And one of the challenges that we 1621 01:11:45,500 --> 01:11:48,230 had when producing modules for the program 1622 01:11:48,230 --> 01:11:52,625 is that although it did have the quantitative components 1623 01:11:52,625 --> 01:11:55,280 that you're saying that I think are pretty easy to communicate, 1624 01:11:55,280 --> 01:12:00,050 there was a lot of nuances behind the materials 1625 01:12:00,050 --> 01:12:03,620 that we had selected that made it really complicated for us 1626 01:12:03,620 --> 01:12:10,340 to convey both the content and to then ask questions about it. 1627 01:12:10,340 --> 01:12:13,820 And I think one of the examples that most stood out to me 1628 01:12:13,820 --> 01:12:17,000 was when we were doing a module on wells, 1629 01:12:17,000 --> 01:12:23,700 and specifically implementing wells in the developing world 1630 01:12:23,700 --> 01:12:25,685 in the community of Tanzania. 1631 01:12:25,685 --> 01:12:27,060 There is a drawing of a well that 1632 01:12:27,060 --> 01:12:30,340 had been built I think in the 16th or 17th century 1633 01:12:30,340 --> 01:12:33,660 that he had scanned and put into the module. 1634 01:12:33,660 --> 01:12:34,940 But it was not-- 1635 01:12:34,940 --> 01:12:39,360 the mechanism by which the water was pulled up from the ground 1636 01:12:39,360 --> 01:12:42,000 was not very clear, and the piston just 1637 01:12:42,000 --> 01:12:44,370 was not very well-drawn. 1638 01:12:44,370 --> 01:12:48,780 And that prevented a lot of-- 1639 01:12:48,780 --> 01:12:51,053 I was the one who was writing a lot of the material, 1640 01:12:51,053 --> 01:12:53,220 and it was then sent to me, and then I had to do it. 1641 01:12:53,220 --> 01:12:54,678 And then I had to give him feedback 1642 01:12:54,678 --> 01:12:55,890 on my process of doing it. 1643 01:12:55,890 --> 01:12:57,390 And it prevented a lot of my ability 1644 01:12:57,390 --> 01:12:58,590 to actually do the problems. 
1645 01:12:58,590 --> 01:13:00,840 So I think that's one of the really interesting things 1646 01:13:00,840 --> 01:13:03,060 about MOOCs as well, and especially multidisciplinary 1647 01:13:03,060 --> 01:13:06,480 MOOCs, that there are so many nuances behind the material, 1648 01:13:06,480 --> 01:13:09,780 that even if the content seems fairly straightforward, if you 1649 01:13:09,780 --> 01:13:11,813 have no way to interact-- or rather, 1650 01:13:11,813 --> 01:13:13,230 if you don't have an immediate way 1651 01:13:13,230 --> 01:13:16,110 to interact with the instructor, it 1652 01:13:16,110 --> 01:13:18,090 does add an extra layer of complication 1653 01:13:18,090 --> 01:13:20,160 in your ability to solve something. 1654 01:13:20,160 --> 01:13:23,910 And I think that that's precisely why we benefit 1655 01:13:23,910 --> 01:13:26,550 from the blended model, because then you can come to class 1656 01:13:26,550 --> 01:13:28,290 and say, hey, professor, you uploaded 1657 01:13:28,290 --> 01:13:32,160 this image in the problem set, and I have no idea how 1658 01:13:32,160 --> 01:13:33,480 the piston actually operates. 1659 01:13:33,480 --> 01:13:35,355 Could you explain this to me more thoroughly? 1660 01:13:35,355 --> 01:13:37,583 Or could you redraw it and reupload it? 1661 01:13:37,583 --> 01:13:40,000 WILLIAM BONVILLIAN: So part of the motivation, by the way, 1662 01:13:40,000 --> 01:13:42,720 for filming this class is for us to think about whether or not 1663 01:13:42,720 --> 01:13:46,440 we take more of the lecture segments online 1664 01:13:46,440 --> 01:13:49,230 and then have even more of an organized discussion 1665 01:13:49,230 --> 01:13:52,660 focus in that face-to-face classroom. 1666 01:13:52,660 --> 01:13:56,340 So part of the reason for the filming 1667 01:13:56,340 --> 01:13:58,540 here is to create an online course. 
1668 01:13:58,540 --> 01:14:01,230 But it's also possible to use this for more 1669 01:14:01,230 --> 01:14:03,840 of a blended model here. 1670 01:14:03,840 --> 01:14:05,880 LILY: So I wanted to bring something up 1671 01:14:05,880 --> 01:14:07,830 that Bill mentioned earlier, which I think 1672 01:14:07,830 --> 01:14:13,910 will also transition us into the Bonvillian and Weiss reading. 1673 01:14:13,910 --> 01:14:19,200 Bill mentioned graduate programs online. 1674 01:14:19,200 --> 01:14:23,250 So you complete a year-long certificate sort of program, 1675 01:14:23,250 --> 01:14:26,180 and then can complete the remainder of your graduate work 1676 01:14:26,180 --> 01:14:27,980 on site. 1677 01:14:27,980 --> 01:14:31,500 I think that's a really interesting model and idea, 1678 01:14:31,500 --> 01:14:35,540 so I could see that working well for things like-- 1679 01:14:35,540 --> 01:14:36,973 well, first of all-- 1680 01:14:36,973 --> 01:14:38,640 I could see that working well for things 1681 01:14:38,640 --> 01:14:41,610 like maybe business, engineering, computer 1682 01:14:41,610 --> 01:14:43,110 science, et cetera. 1683 01:14:43,110 --> 01:14:47,040 And those graduate programs are typically quite expensive 1684 01:14:47,040 --> 01:14:48,660 and can be two to three years. 1685 01:14:48,660 --> 01:14:54,360 And unlike a lot of PhD programs or science programs, 1686 01:14:54,360 --> 01:14:56,220 it's not paid for. 1687 01:14:56,220 --> 01:15:00,970 So we're talking maybe $50,000 a year for two to three years. 1688 01:15:00,970 --> 01:15:03,630 Which brings me to something that 1689 01:15:03,630 --> 01:15:07,680 came up in Bill's reading, which is student loans and increasing 1690 01:15:07,680 --> 01:15:08,760 levels of student loans. 1691 01:15:08,760 --> 01:15:11,040 And I was doing a little bit of research. 1692 01:15:11,040 --> 01:15:14,850 Does anyone have any idea what the current outstanding student 1693 01:15:14,850 --> 01:15:15,855 loan debt is in the US? 
1694 01:15:15,855 --> 01:15:17,230 SPEAKER 2: Isn't there trillions? 1695 01:15:17,230 --> 01:15:18,563 MARTIN: Easily in the trillions. 1696 01:15:18,563 --> 01:15:20,010 LILY: Yeah, $1.5 trillion. 1697 01:15:20,010 --> 01:15:22,920 [LAUGHS] And they're not collateralized 1698 01:15:22,920 --> 01:15:25,170 and the default rates are increasing. 1699 01:15:25,170 --> 01:15:26,880 SPEAKER 2: Wait, they are? 1700 01:15:26,880 --> 01:15:29,570 Last I heard, I thought they were down. 1701 01:15:29,570 --> 01:15:34,343 LILY: Not according to the 2017 statistics. 1702 01:15:34,343 --> 01:15:35,010 SPEAKER 2: Crap. 1703 01:15:35,010 --> 01:15:36,650 What I heard was in 2016, so-- 1704 01:15:36,650 --> 01:15:38,141 [LAUGHTER] 1705 01:15:40,630 --> 01:15:43,280 LILY: So with that in mind, could 1706 01:15:43,280 --> 01:15:49,250 a master's program that blends online learning 1707 01:15:49,250 --> 01:15:56,690 with a shortened on-site component improve the situation 1708 01:15:56,690 --> 01:15:59,630 with student loan debt, while still 1709 01:15:59,630 --> 01:16:03,793 qualifying the person with a graduate degree? 1710 01:16:03,793 --> 01:16:05,210 SPEAKER 1: I would just be curious 1711 01:16:05,210 --> 01:16:07,168 what the breakdown is between different fields. 1712 01:16:07,168 --> 01:16:10,970 Because I know for MBAs and law schools, so much of it 1713 01:16:10,970 --> 01:16:13,100 is building your network while you're there. 1714 01:16:13,100 --> 01:16:15,590 So people are willing to take on hundreds of thousands 1715 01:16:15,590 --> 01:16:19,970 of dollars in loans just to have connections to the Harvard 1716 01:16:19,970 --> 01:16:21,910 community or whatever. 1717 01:16:21,910 --> 01:16:26,270 So if that is only a small section of the total loans that 1718 01:16:26,270 --> 01:16:28,530 are outstanding, then I think this could be impactful. 
1719 01:16:28,530 --> 01:16:30,170 But if that's outweighing it-- 1720 01:16:30,170 --> 01:16:31,520 WILLIAM BONVILLIAN: I'm not worried about that class 1721 01:16:31,520 --> 01:16:32,420 repaying the loans. 1722 01:16:32,420 --> 01:16:33,020 They're OK. 1723 01:16:33,020 --> 01:16:33,540 [LAUGHTER] 1724 01:16:33,540 --> 01:16:35,642 SPEAKER 1: But that would still be considered 1725 01:16:35,642 --> 01:16:36,850 part of the statistic, right? 1726 01:16:36,850 --> 01:16:41,533 Because the first year out, they might owe $300,000 in loans. 1727 01:16:41,533 --> 01:16:42,950 And while they are paying it back, 1728 01:16:42,950 --> 01:16:44,908 I think they would still be willing to take out 1729 01:16:44,908 --> 01:16:48,110 $300,000 of loans regardless of new technologies. 1730 01:16:48,110 --> 01:16:51,110 STEPH: This also assumes the goodness of the programs that 1731 01:16:51,110 --> 01:16:53,750 are putting out these courses. 1732 01:16:53,750 --> 01:16:56,750 There's a lot of, I think, predatory-- 1733 01:16:56,750 --> 01:16:58,528 what is the name of the one that just 1734 01:16:58,528 --> 01:17:01,070 got shut down that was really bad, the University of Phoenix? 1735 01:17:01,070 --> 01:17:02,360 SPEAKER 1: It's called The University. 1736 01:17:02,360 --> 01:17:03,290 LILY: That's the one. 1737 01:17:03,290 --> 01:17:03,790 [LAUGHS] 1738 01:17:03,790 --> 01:17:05,215 [INTERPOSING VOICES] 1739 01:17:05,215 --> 01:17:07,090 STEPH: Yeah, it sort of presumes the goodness 1740 01:17:07,090 --> 01:17:08,470 of a lot of these organizations. 1741 01:17:08,470 --> 01:17:09,500 MARTIN: Well, yeah, we're assuming 1742 01:17:09,500 --> 01:17:10,990 they're all elite ones, where it's like, 1743 01:17:10,990 --> 01:17:12,500 we're sure you're going to get something [INAUDIBLE].. 1744 01:17:12,500 --> 01:17:13,470 STEPH: Yeah, exactly. 
1745 01:17:13,470 --> 01:17:15,530 And it also presumes that they're useful, 1746 01:17:15,530 --> 01:17:20,210 like there's an instrumentality or a utility coefficient 1747 01:17:20,210 --> 01:17:23,120 that you gain from having participated in these courses. 1748 01:17:23,120 --> 01:17:25,135 And I don't know if the return on investment 1749 01:17:25,135 --> 01:17:27,260 is going to be good for someone who doesn't already 1750 01:17:27,260 --> 01:17:30,860 have a foundational degree in something. 1751 01:17:30,860 --> 01:17:33,260 It assumes that the job market is 1752 01:17:33,260 --> 01:17:36,710 willing to accept the people who are 1753 01:17:36,710 --> 01:17:38,100 graduating with this knowledge. 1754 01:17:38,100 --> 01:17:42,128 And I don't know if that's true right now. 1755 01:17:42,128 --> 01:17:43,670 LILY: There's a really good breakdown 1756 01:17:43,670 --> 01:17:47,210 on four-year universities, by the way, not just 1757 01:17:47,210 --> 01:17:48,380 graduate programs. 1758 01:17:48,380 --> 01:17:49,370 So, yeah. 1759 01:17:49,370 --> 01:17:50,870 WILLIAM BONVILLIAN: So, Lily, do you 1760 01:17:50,870 --> 01:17:54,740 want to make a closing set of points about these two 1761 01:17:54,740 --> 01:17:57,440 readings? 1762 01:17:57,440 --> 01:17:58,250 LILY: Yeah. 1763 01:17:58,250 --> 01:18:00,980 Pay off your student loans and don't default on them or else 1764 01:18:00,980 --> 01:18:03,540 our entire economy can collapse. 1765 01:18:03,540 --> 01:18:04,590 SPEAKER 2: Again? 1766 01:18:04,590 --> 01:18:05,090 LILY: Again. 1767 01:18:05,090 --> 01:18:07,090 Well, these are the kinds of numbers-- you know, 1768 01:18:07,090 --> 01:18:09,740 there's a critical default number, just as with 1769 01:18:09,740 --> 01:18:10,850 the housing market. 
1770 01:18:10,850 --> 01:18:12,530 SPEAKER 2: All right, so we need to all buy shorts-- 1771 01:18:12,530 --> 01:18:12,970 [LAUGHTER] 1772 01:18:12,970 --> 01:18:15,157 LILY: I don't know, how do you short something that's 1773 01:18:15,157 --> 01:18:15,795 not collateralized? 1774 01:18:15,795 --> 01:18:17,060 MARTIN: There's a whole thing about-- yeah, this being 1775 01:18:17,060 --> 01:18:18,950 a bubble, the education bubble. 1776 01:18:18,950 --> 01:18:22,160 I read a big thing about Peter Thiel 1777 01:18:22,160 --> 01:18:24,210 since we brought him up in the first class. 1778 01:18:24,210 --> 01:18:26,000 RASHID: I think my favorite example 1779 01:18:26,000 --> 01:18:29,210 is there's federally subsidized student loans. 1780 01:18:29,210 --> 01:18:32,210 And if you get clever, you can use your federally subsidized 1781 01:18:32,210 --> 01:18:34,760 student loans to invest properly, and then 1782 01:18:34,760 --> 01:18:36,860 flip those, because those are inherently safer. 1783 01:18:36,860 --> 01:18:39,093 So even if you don't want to-- 1784 01:18:39,093 --> 01:18:40,760 I should say, there are a lot of options 1785 01:18:40,760 --> 01:18:42,447 to pay back your student loans. 1786 01:18:42,447 --> 01:18:44,030 WILLIAM BONVILLIAN: So, Lily, bring us 1787 01:18:44,030 --> 01:18:47,150 to a couple of key conclusions about the two readings. 1788 01:18:47,150 --> 01:18:50,520 LILY: Conclusions, let's see. 1789 01:18:50,520 --> 01:18:54,440 Let me go to my notes. 1790 01:18:54,440 --> 01:18:56,600 I think, in general, the readings 1791 01:18:56,600 --> 01:18:59,840 led us to believe that online learning, or at least 1792 01:18:59,840 --> 01:19:07,190 blended learning, is going to increase in usage 1793 01:19:07,190 --> 01:19:11,360 or popularity, whether the institutions, the universities, 1794 01:19:11,360 --> 01:19:13,790 like it or not. 1795 01:19:13,790 --> 01:19:17,430 And I don't know. 
1796 01:19:17,430 --> 01:19:19,858 I think I covered everything else that I had thought of. 1797 01:19:19,858 --> 01:19:20,900 WILLIAM BONVILLIAN: Good. 1798 01:19:20,900 --> 01:19:23,750 Let me do just a quick wrap up. 1799 01:19:23,750 --> 01:19:27,140 Freeman taught us about the talent base 1800 01:19:27,140 --> 01:19:30,500 and how it's going to affect the innovation system 1801 01:19:30,500 --> 01:19:33,020 and made us aware of the fact that the S&T talent 1802 01:19:33,020 --> 01:19:36,710 base is going to be a pretty key component of US 1803 01:19:36,710 --> 01:19:39,000 overall comparative advantage, as other people move 1804 01:19:39,000 --> 01:19:41,590 to copy the model. 1805 01:19:41,590 --> 01:19:46,010 Romer's core point was that government policy is focused 1806 01:19:46,010 --> 01:19:52,160 really on capital supply and really on the demand 1807 01:19:52,160 --> 01:19:54,740 side of the equation, right? 1808 01:19:54,740 --> 01:19:57,170 And that the talent supply system really 1809 01:19:57,170 --> 01:19:59,480 was not a particularly significant federal government 1810 01:19:59,480 --> 01:20:01,040 focus. 1811 01:20:01,040 --> 01:20:04,192 It probably needs to be, because that's a very important factor 1812 01:20:04,192 --> 01:20:05,900 in innovation, as we've discussed earlier 1813 01:20:05,900 --> 01:20:08,720 with his prospector theory. 1814 01:20:08,720 --> 01:20:13,280 And then he drove us to look at higher education institutions 1815 01:20:13,280 --> 01:20:15,920 and how they don't get the economic signaling that 1816 01:20:15,920 --> 01:20:18,710 would lead them to increase the supply. 1817 01:20:18,710 --> 01:20:22,010 And he helped us think about what some of the barriers were 1818 01:20:22,010 --> 01:20:23,630 and how you could change that economic 1819 01:20:23,630 --> 01:20:26,600 signaling to change the way in which the higher education 1820 01:20:26,600 --> 01:20:28,900 system dealt with the supply side question. 
1821 01:20:28,900 --> 01:20:33,260 Katz and Goldin taught us about the tie 1822 01:20:33,260 --> 01:20:36,200 between the ever-increasing technology 1823 01:20:36,200 --> 01:20:37,577 requirements of the economy. 1824 01:20:37,577 --> 01:20:40,160 Since the Industrial Revolution, we've been on a rising curve, 1825 01:20:40,160 --> 01:20:42,110 and it may be accelerating. 1826 01:20:42,110 --> 01:20:43,880 And then they demonstrated for us 1827 01:20:43,880 --> 01:20:47,780 how important it is to keep the education curve, the talent 1828 01:20:47,780 --> 01:20:51,440 base, ahead of that curve, playing off of that curve. 1829 01:20:51,440 --> 01:20:55,640 If you let them cross, like we did in the 1970s, then 1830 01:20:55,640 --> 01:20:57,590 you start to drive towards pretty serious 1831 01:20:57,590 --> 01:20:59,570 economic inequality problems, because you're 1832 01:20:59,570 --> 01:21:02,990 leaving a large part of your population behind, 1833 01:21:02,990 --> 01:21:07,040 unable to get back on that rising parallel 1834 01:21:07,040 --> 01:21:08,930 and stay up with the technology curve 1835 01:21:08,930 --> 01:21:11,780 and earn the corresponding incomes. 1836 01:21:11,780 --> 01:21:15,170 Baumol alerted us to the fact that education 1837 01:21:15,170 --> 01:21:20,120 for invention and innovation, by the way, 1838 01:21:20,120 --> 01:21:23,840 looks different than standard education systems in science 1839 01:21:23,840 --> 01:21:27,110 and technology today, which are more geared, 1840 01:21:27,110 --> 01:21:31,430 historically, towards incremental advances. 1841 01:21:31,430 --> 01:21:34,910 MIT's Online Education Report got 1842 01:21:34,910 --> 01:21:36,410 us thinking about learning science, 1843 01:21:36,410 --> 01:21:39,800 how you could apply learning science to really optimize 1844 01:21:39,800 --> 01:21:43,730 both the online model and the blended learning model. 
1845 01:21:43,730 --> 01:21:46,310 And then the reading from the textbook 1846 01:21:46,310 --> 01:21:49,040 showed us a set of the challenges 1847 01:21:49,040 --> 01:21:52,340 for online education, how it's a potentially disruptive tool 1848 01:21:52,340 --> 01:21:54,855 in a legacy sector education system, 1849 01:21:54,855 --> 01:21:57,230 and got us thinking about, who are the change agents that 1850 01:21:57,230 --> 01:22:00,950 might really drive the optimal model, which is really probably 1851 01:22:00,950 --> 01:22:03,400 a blended learning model.