For most of my college career, I was taught in non-interactive lectures (from 15 to 300 students at a time). Sometimes the lectures were even recorded or simulcast to remote locations. The lecturer often had no idea who I was. In graduate school, I got a good look at the "sausage factory" of how many core computer science lectures are really developed. Unless it's an area of research and publication for the professor, they do a minimal amount of work on their lecture notes and exams. In some cases, the same professor taught the same subject year after year. Sometimes this resulted in progressively better notes, but sometimes not. In other cases, the class rotated among professors, who often took each other's lecture notes and used them with varying degrees of revision. Only when a course was a true passion of the professor was the lecture much more than what could be gleaned from a couple of textbooks on the topic.
Beyond the lectures, most of the labs and assignments were handled by grad student teaching assistants. I also got to see many of my friends go through the TA machine in grad school. The rare lucky ones got to TA a course related to their research, but most TAs were assigned to courses where they only knew as much as they'd learned in their own undergrad experience in that course. A few TAs would scramble to get up to speed. The smart ones would do the bare minimum for the course and focus on their research.
Then there are the textbooks. I look at the dead tree debris on my bookshelves from my nearly dozen years of higher education, and very few of the books contain anything beyond what's on Wikipedia today.
I can look back at a topic like "operating systems" and cringe. This was an undergraduate course that rotated amongst professors. By the time I took it, I'd already used a dozen operating systems including VMS, NeXTStep and Dynix, played with Minix code, read Bach and Tanenbaum, etc. The PhD student assigned to teach my class had an undergrad degree in linguistics. I believe he'd never even logged into a unix system, much less taken an operating systems course himself. He even dropped out late in the semester and disappeared. I damn near asked for a refund, but I took my grade and moved on.
There's no reason that a few passionate people on the web couldn't top that.
I feel the complete opposite about my courses in mechanical engineering. The teachers really worked at it, and I still use the textbooks to this day. It wasn't all about the lectures themselves but about how they taught us to think, which has become invaluable. For me, college wasn't about the courses but the curriculum, which was really the value-added experience.
Something like computer science may be learned strictly from reading, but this isn't universal. Gates's argument is course-specific.
I like to daydream about a scenario like yours, but by and large it's an exception to the rule, and one that is all the more frustrating in light of high education costs.
It really depends on where you go and what you do. At most big public schools, I suspect the "TAs do everything / uninterested faculty" pattern is the norm, chiefly because the faculty are rewarded for research, not teaching. If you go to liberal arts colleges (I went to Clark University: http://www.clarku.edu) or engineering schools like Harvey Mudd, Worcester Polytechnic Institute, etc., then undergrads are the focus, professors are rewarded for teaching, and you're more likely to have a better, more in-depth experience.
Yes, I should have made that clear. Almost all of my academic career was at large research schools. A number of my fellow grad students went on to smaller schools instead, where they are fairly happy teaching more and researching/publishing less. Without very many (or any) grad students to be TAs, they end up covering all the bases for their classes. In smaller departments, they tend to own all their classes year after year rather than rotating.
The classes are also smaller, which helps tremendously. At one point, I actually took a differential equations class at a junior college. We had 5 students. It was a great setting in an unexpected place.
My operating systems class devoted a whole lecture to how OSes deal with tape drives. Meanwhile, I don't think we covered networking. This was three years ago, and the course material was essentially unchanged from the 1970s.
When it comes to computers, the internet has been better than most undergraduate programs for a while now. And it's only getting better.
We did piles on networking, dcom, everything in the OSI stack actually. (I went to a Canadian university, UNB.)
While I agree undergrad uni isn't going to be on the cutting edge, I think that learning the basics, and the ideas behind why decisions are/were made is extremely valuable.
In short, you can't move forward without learning where you've been.
I've always thought computer science could be taught in the opposite order. Instead of starting with "Pascal/Java 101" and finishing with CPU design / architecture, what if you started at the hardware and worked up to high level languages? That's actually more of the course I followed, but only by chance.
How does the saying go... Computers are related to computer science in the same way telescopes are related to astronomy. It's not necessary to understand the inner mechanics of the machine to do the science.
Computer Science is closer to a branch of Math; its relation to physical computers is a historical accident. I believe you are talking about a branch of Engineering. I followed more or less that path, but I had programming in the first semester - probably because we were supposed to be able to use the machines we were being trained to design... I'm not a computer scientist and I seldom think like one, but I am an engineer - and I think like one all the time.
It's really interesting to see the differences between Computer Science departments due to where they came from. The two big places I've seen them grow out of are Electrical Engineering and Math, but they've also come out of things like some weird informatics or library science departments. I'm sure there are some other odd departments out there with interesting histories.
The problem with this is that not everyone will be able to handle machine-level things in the beginning. Even in the best universities, students will be intimidated if their first programming class is in assembly. They barely get by on Lisp.
The key is starting where people are interested. If they are interested in web, start with HTML/JS.
If they want to make blinky lights, start with low-level hardware. Once they know how to do something and are still interested, they will move on to other areas. That's the crux of why the web will deliver better education: people will follow their own path.
Education is only meaningful and relevant if it enables a person to do what they want to do and could not before. Arguably, the issue is that the educational system is trending towards becoming more standardized, whereas society is trending towards becoming more individual and specialized.
The OSI stack? That's even less relevant than sequential access algorithms! The IETF crowd had won that battle decisively before the first OSI committees even met, much less after they shat their gibberingly baroque standards documents into the world.
Except Mr Gates said "the best education", and what you describe is far from the best. I don't doubt that the web can give an experience better than many bad to mediocre university education experiences. However, I'm very skeptical that the Web can compete with the best. The interaction between teacher and students necessary for that would be hard to simulate (or stimulate) without placing everybody in the same room.
Agreed. One of the best aspects of sites like HN and Progit are that over the years I've been able to figure out which CS/Software Engineering books are the really good ones. From SICP to Date's database books, if you take the time to do that you can give yourself a foundation of knowledge as good or better than probably 90% of the schools out there.
In my college experience, I did the most learning when working my way through the problem sets with my fellow students. It would have been an awful lot harder without that (and a lot less fun). One would have to be fairly motivated to succeed alone in a room watching a video over the internet.
Well this is true if you want to be a software developer. You can get very skilled and competent just learning online. For everything else I don't know... I sure would get a little uncomfortable if my surgeon told me he skipped college and learned everything from tutorials before he gave me my anesthesia.
Your anesthesiologist could have probably done 75% of his coursework online with no loss of quality. Most of his classes involved sitting in giant lecture halls listening to a professor read more or less verbatim from a powerpoint or earlier equivalent.
Well this is true if you want to be a software developer.
Or a mathematician, director, physicist, screen writer, journalist, accountant, statistician, historian, designer, botanist, cook, mechanic, etc.
Now that I think of it, most professions already have online communities for discussion and group learning of skills, for anyone self-motivated enough to study. What's most blatantly missing is the accreditation: it's not easy to cheat the accreditation system at a University (mostly because the experience of academic culture is what gives the accreditation validity), but it's a lot easier to "cheat" online.
Whoa whoa whoa, WHAT? Historian, cook, screen writer, journalist, director? On what planet can directorial skills be conveyed via a lecture video clip? Did CalArts, which graduated Tim Burton, John Lasseter, Pete Docter, Joe Ranft, and countless other masters of animation do so simply by showing them what amounted to the same stock lectures with some automated method of remote feedback? Or was it instead significantly in part a communal learning experience that involved peer criticism and one on one tutelage from professors who were in the room with them as they drew and developed their craft?
I'm sorry, but believe it or not, not all professional skills can be remotely acquired without a kind of human transference of intangible, personable skills. Education in higher academia is about much more than the acquisition of particular facts--it's a culture, practice, and way of life. Critical ideas are meant to be discussed in person, ruminated on, and debated in real, highly dynamic discussions. Real life work environments, especially in software companies, are ones in which critical, GROUP brainstorming dynamics are particularly important, because oftentimes the answer doesn't lie in the answer key in the back of the book, but rather as a result of a group process of rumination, guess, check, and revision.
Samuel, as an art school guy I think arts education is going to go online even before traditional Liberal Arts training, primarily because art, design, and filmmaking are areas where credentials are nearly useless and the portfolio of work is paramount.
In terms of formal skills, sites like Vectips, Smashing Magazine, et al. do a good job of conveying how to work with tools.
I do agree that the sense of fraternity that grows up in a creative environment is important, but the web is flattening that out. Look at a site like Threadless.com, where there is a dedicated critique community. I think filmmakers coming up now, sharing videos via YouTube and collaborating remotely, will replace a lot of the community that is typically found on campus.
More broadly, we may be looking at the end of a historical blip. For hundreds of years, skills were transferred via apprenticeship, on the job training, etc. A number of demographic and political trends changed this from the 1950's til now, but maybe we are seeing a reversion to the mean?
Oops, I think we're talking past each other, or maybe I was projecting my own ideas on the article too much. I agree with you completely - actually, I can't think of any profession where video lectures alone would get you a real understanding of the material (maybe accounting?). Even when I studied abstract math, I learned a lot more in study groups and office hours than I did during lecture.
My point isn't that a good set of lecture videos will be world-changing (we've had broadcast video for a while now). My point is that learning all of these subjects can have a strong web component to augment the process (and "web" doesn't necessarily mean video). Maybe when I put it that way, it sounds almost obvious. A stronger statement: whatever form web-augmented learning takes in the next century will improve learning so drastically we'll barely recognize it.
Ah, perhaps we are talking past each other. I completely agree about the web being integral to the learning experience: from printing out course readings to working online exercises. I suppose where I'm coming from is speaking out against the idea that higher education is simply about learning facts and rote methods. To me it's ALSO about interaction and the group learning/creating dynamic, and that can't be parceled out through the web, and to that end I fundamentally disagree with Mr. Gates. But he's a world famous autodidact; I can see where he gets this impression.
@mcgraw5 I guess I'm coming from the angle that most of my humanities courses in undergrad took place in seminar settings, and the discussions that we had were sublime, in part because we could share moments of intense intellectual discussion. I simply can't see that happening well with the same group of 11 + a professor over video chat.
I'm not sure the normal group discussions are better than the threaded discussions like the ones we have here.
And creating small group face to face dynamics could be achievable using group video chat/telepresence.
And:
"""
A recent 93-page report on online education, conducted by SRI International for the Department of Education, has a starchy academic title, but a most intriguing conclusion: “On average, students in online learning conditions performed better than those receiving face-to-face instruction.”
""" http://bits.blogs.nytimes.com/2009/08/19/study-finds-that-on...
And luckily there is no software at all in the machines that monitor your vital signs, decide how much medicine or radiation you need, route your ambulance etc.
What you should be worried about is practical experience, not accreditation.
The reason why you feel comfortable with a self taught software developer is because you can write and run software on your own.
Let's say you got shot. Who would you want to operate on you, someone who went to medical school but probably didn't see too many gunshot wounds, or someone from a field hospital in Iraq who operated on hundreds of cases but does not have a medical degree?
Maybe in your world (and mine). But in poorer countries, many people will prefer a surgeon who took the course online to no surgeon at all. Mlearning and mhealth will be huge in developing countries.
I sure would get a little uncomfortable if my $x told me he skipped college and learned everything from tutorials before $y
$x = "software developer"
$y = "we made him head of this huge project without even evaluating his skills"
I don't think it's so unreasonable for a surgeon to learn the traditional body of knowledge online, learn his way around an operating table IRL, take a test, and then go do his residency like every other doctor.
I think there just isn't enough good information out there for a surgeon. It's already hard to find good information on some of the more obscure topics of software development, like Windows driver programming, where you won't go far without books and/or learning inside a company with knowledge about it. Now jump into a completely different profession and suddenly the web isn't all that big. How many useful, high-quality surgery sites do you think are out there? Maybe 2-3?
On the other hand, learning iPhone programming, which may be all you need to get a job as a developer, is significantly easier than learning almost any other profession online, just given how much information is available on the topic and how well it's presented.
Right. But you're talking about 'as-is.' Someone could easily come along and change all of that with an amazing way to instruct people online. All of a sudden we have people that know a lot about medical topics or windows drivers instructing classes online.
Even if there were plenty of information, and the best of it, online, it is very different to read something and to learn something. With programming it's easy because you can just start coding and test for yourself and experiment and all that. You can't do all that if you want to be a surgeon. You can't just have a dead body sent to your house.
It is a bit like a pharmacist or a chemist who has never mixed two compounds together but knows all about them, yet does not know for example what colour they are, or what do they actually look like, or what the actual reaction from mixing them looks like.
I think another comment here made a good point. It is not just about the learning. It is also about teaching you how to think, and experience of course, practical experience. For example, I might be comfortable talking to a lawyer who got all his knowledge from the internet, but only for reference, to ask him, let's say, what the law is; I wouldn't hire him to argue a case for me. You can read all the books and lectures, but that is no substitute for the practical, graded experience of applying the law to facts, of finding your way where there are conflicting points and maybe the law is not settled, and, perhaps most importantly, of learning how to think like a lawyer: which argument would do, which evidence would do.
Even if all of these things can be replaced through virtual simulation, why would you want to sit in front of a computer for, say, three years on your own rather than share the experience with others, make friends, have the opportunity to be part of interesting student groups, and expand yourself in many more areas than just the core subject?
I suppose it might be good for someone who might be thinking of a career change, if all those things can be replaced, but for an 18 year old, it would be very disadvantageous.
In any event, as things currently are, Gates is very much dreaming. People have access to textbooks, yet textbooks have hardly replaced universities, even though lectures, which are only a small part of undergrad courses, pretty much teach what is in the textbook.
I imagine with VR getting better and better, a surgeon could eventually train for new procedures much the same way pilots get trained in simulators to handle different scenarios.
There are already some companies working on medical simulators, and some medical schools use them. But some simulators need special manipulators to be effective, so it's hard to deliver them over the internet.
And obviously, using simulators can offer great results:
"""
The residents had displayed similar psychomotor skills, but the two groups performed very differently in the O.R. Those who had trained on the simulator completed the operation, on average, twenty-nine per cent faster than those who had not. Moreover, the residents who had received standard surgical training were nine times more likely to hesitate during the operation, five times more likely to injure the gallbladder or burn surrounding tissues, and six times more likely to make other errors. Additional research has shown that simulator training significantly enhances performance of hernia repairs, nasal sinus procedures, and bronchoscopic examinations of the lungs.
"""
http://www.newyorker.com/archive/2005/05/02/050502fa_fact?cu...
There are some procedures, such as laparoscopic procedures, that the surgeon performs "remotely" - I mean, in close proximity to the patient, but not actually holding the tools. Are there difficulties that would prevent a surgeon from performing these procedures from someplace else?
A few loosely related thoughts as to why I'm fond of the University system (in response to the negative tone of some of the comments here and at techcrunch):
- It's the only socially acceptable way to spend years simply learning as a full time job.
- The argument that "lectures will soon be online" only makes sense if you believe that physically being present in lectures is the way that people gain knowledge in school.
- University admissions maintain a level of intelligence in incoming classes that I don't think an online community ever could - so that the students who interact at a university are interacting in a productive way.
- Physical colocation is just a better way of interacting with people than communicating online is.
- "Communicating online" is in its infancy. The way people communicate online now (facebook, twitter) is wholly different than 10 years ago, and will be wholly different 10 years from now.
- I agree that University admissions provide a useful filter, but one may emerge online, as well. Already, the people you meet on HN are not the people you'll meet on 4chan. I wonder what an HN clone would be like if membership were applied for and kept only as long as you maintained a certain comment point average.
- Agreed. A great deal of what Universities provide is a cultural immersion into a social class. This is hard, if not impossible, to replicate online. That said, it has nothing to do with learning. People can still learn online and be indoctrinated into one social class or another in other ways.
- Social norms follow progress, not vice versa. Whatever Universities will be like in 1000 years won't be socially acceptable when it first appears.
The way people communicate online now (facebook, twitter) is wholly different than 10 years ago, and will be wholly different 10 years from now.
Different, yes. Better? Ten years ago, the conversations I had on mailing lists had a lot more potential than you can fit in 140 characters. The trend isn't very promising.
I was about to say the same thing. The step from usenet and mailing lists to facebook and twitter has been a huge step backwards in terms of quality discussion and meaningful communication.
These points have merit, yet I know of at least one pioneering effort that was successful: The "Higher Cracking University" or +HCU for short. It ceased functioning a long while back, but at that time it didn't just meet, but far surpassed the expectations of online learning described in these comments.
Here you had a community that
- spent years simply learning
- shared their knowledge (for free!) in essays, email, and forum discussions
- maintained a level of intelligence in incoming classes by use of strainers (reversing assignments such as http://www.woodmann.com/fravia/strain99.htm)
- never needed to meet in person to contribute these great works
It can be done, but perhaps the question is, "why haven't we seen more of these in the wild?"
Those are excellent. I love how he approaches everything in terms of the normal person who might find some concepts tricky and need a closer explanation. I've gone to lectures and then watched one of these on the same content, and he really explains it so much better.
The thing lecturers tend to do is assume everyone in the class has been through the same set of classes before the current one, and that many things should be completely obvious and not need explanation, when often they do.
I have never found the fidelity of online learning to be _anywhere_ in the ball park of in-institution learning.
Especially in the areas of humanities and so forth, but also in Engineering (I hold a BSCEng, and am currently a Masters student) labs, conversation, community learning are all essential.
I also can't help but think of the Socratic method, and how at least I personally tend to learn in that manner. Simply being presented with information as fact doesn't really do it for me. Being guided, and assisted at coming to the truth on my own, always leaves me with a much deeper connection to the knowledge I've acquired. And I feel that requires a certain measure of realtime interaction with someone who is an expert in the field. Simply watching their lectures, while informative, does not seem to be enough.
While this may change in the future, given that online learning is in its infancy, I think 5 years is a bit short term. Especially considering the pace at which the established institution of education moves.
I was learning Scheme from old SICP lectures online.
Best lecture EVER.
Why would professors need to give the same lectures every year? It makes more sense to make new lectures only when they have something new or better to say.
You can't ask questions and start discussions during pre-recorded lectures. The important part of a lecture isn't just the words being spoken by the lecturer, but also the impromptu discussions they lead to.
Then you also need to tape the questions and answers and assemble a video FAQ accompanying the lectures. Or let other people support the lectures and answer upcoming questions.
I think that it's a colossal waste of time and resources to make great lecturers repeat the same lecture over and over and over again to a small, limited number of students, instead of letting them record a larger number of one-time lectures. My favourite example is Salman Khan. If he had to repeat a few chosen topics over and over, he would never have had the chance to create a body of work as large and comprehensive as his academy.
He said about that: "With so little effort on my own part, I can empower an unlimited amount of people for all time. I can't imagine a better use of my time."
By the way, in the recorded SICP lectures, there aren't actually _that_ many interesting questions asked. If Abelson and Sussman thought your way and preferred live interaction just in order to be able to answer more questions, thousands of people during the last 25 years would not have been able to experience them _at all_. It is not fair to reduce great lecturers to mere question answerers; IMO the accompanying Q&A sessions always account for a very small part of the value of a lecture, i.e. they just fill an occasional gap.
Don't get me wrong, recorded lectures are great. I love the SICP lectures and I'm really happy the web has given me the opportunity to see them. However, if I were given the choice, I'd far prefer to have had the chance to be there live and discuss the material with the lecturers, and if the lecturers at my university had said "I'm not going to give any lectures, since I recorded them a few years ago" I'd feel very ripped off. At the end of the day, as good as recorded lectures are, they'll never replace "the best" teaching experiences.
He said that you'll be able to find the best *lectures* in the world, which will be better than any single University. With initiatives like MIT OpenCourseWare and iTunes U, this isn't that hard to imagine.
I think most HN readers wouldn't doubt the power of a motivated, passionate individual when it comes to learning.
To me, the takeaway point that is most interesting is Gates' desire to break down the barriers to higher education and the transfer of knowledge. He also recognizes the declining value of a Bachelor's Degree.
Educational Institutions will always be vital to the education of "tomorrow's leaders," but the current paradigm is flawed and I like where Gates seems to be going with this.
I'm curious as to how Gates compares America to Asia and says that Asia is currently beating the West (the example he gives is that Asia has smaller, shorter textbooks and yet scores higher on global academic tests).
Whereas in Asia people are scrambling to emulate the Western education system. They see the American education system as better at fostering originality and creativity. And there's a bias towards rote in Eastern education systems (at the moment).
Could it be that this is a grass-is-greener on the other side phenomenon?
Nope. It's a case of finding something to support your point and bending it a little bit.
I was thinking, though, about the point he made: maybe textbooks in America are thicker because America has more scientific knowledge than Asia, maybe the books get updated more often, and maybe they are more in depth. Basically, I do not think you can just take one variable, compare it to another, and proclaim that's the reason.
I have a master's degree from both a US university and one in Europe. It always struck me how the American books were consistently thicker than the ones I got in Europe. The content was basically the same, just much more condensed writing in the European books. The American books I could read almost like a novel, whereas the European ones I had to study carefully page by page, sometimes reading a page twice to understand what it meant.
Just a matter of style really; I wouldn't read too much into it. Perhaps the fact that in the US you have to pay for your own textbooks is a driver too.
"He believes that no matter how you came about your knowledge, you should get credit for it. Whether it’s an MIT degree or if you got everything you know from lectures on the web, there needs to be a way to highlight that."
So, is he planning to try to develop an organization to provide that certification (because that's one of the major functions of colleges, though how important it is varies by field) or is he just saying that someone should? I'm pretty sure that he could manage it if he wanted to, even though it could be a pretty huge challenge [I can think of a few ways to get the core of the process (knowledge verification) to work, but none of them are quite perfect, so I'm sure that the main problem would be pretty easy to solve].
The established brick and mortar higher ed industry is not going to have their livelihood taken away without a fight. More fundamentally, very few individuals aged 18 - 25 have the self-discipline to complete the equivalent of an undergraduate education on-line, with no class schedules, no set exam times, nothing to enforce any routine of learning. Some do no doubt, but not anything near the number that are able to do it now within the structure of a traditional university or college.
You mentioned "the equivalent of an undergrad education". This "equivalent" can have different levels (associate, bachelors, with honors), number of majors, average grade, testing criteria, testing jurisdiction, duration, how recent, etc. When the brick and mortar higher ed industry dies, perhaps the concept of achieving some recognized education "equivalency" will die with it. Everything will be ongoing learning, never attaining.
While it's not perfect, a college degree is usually a decent indicator that a person has at least heard of the stuff you want. A new way to quantify level or progression of a person would have to replace it.
There is no reason why the current Universities can't provide this for online courses. Online courses do not mean that there won't be any Profs and TAs. But you just won't be going to their office on preset hours and sit in a classroom on specific hours.
Undergrad education in certain subjects really doesn't change from year to year. You could build a course and make small adjustments year to year. What you would provide as a service, in my view, is the human help.
What I hope won't happen is that Universities will turn into for-profit course-giving institutions. We need the R&D work done at Universities. I'd be surprised if the government would want to continue funding Universities as they do now if for-profit websites could do it cheaper. Do the current for-profit schools like the University of Phoenix do any research?
I agree with most of what he says except for US books being too long. I noticed it when I spent a year in the US as an exchange student. The books really are longer than in Germany, but I actually consider it to be a good thing. Sometimes explaining something in 3 sentences makes it easier, and therefore faster, to understand than saying it in 1 sentence which you have to read over and over again to get.
Well, tastes differ, but I tend to get bored pretty quickly and can't stand verbose texts. In my mind they have to be briefly worded and exact, and that's pretty hard to come by. Uwe Schöning's "Theoretische Informatik - kurz gefasst" (similar to: Gems of Theoretical Computer Science) is an example of what I have in mind. It's hard to find something like that; most books are a pain to read (in my opinion).
Education and accreditation are two different things. I think for some things, just knowing it is valuable and accreditation is not so important. I have thought a lot about how to effectively convey information. I think the web is a good means to do some of that. I don't know that the university function of accreditation will so readily go away. I think that's probably unrealistic.
You're right, and accreditation is key for value perceived by the job market, and accredited colleges will try to increase online enrollment thanks to the inevitable collapse of the student loan bubble.
Physical enrollment in college now is way overpriced, and the market will soon correct that overpricing leaving colleges competing directly to enroll many more students with less cost overhead per student, which will lead to a vast proliferation of accredited online degree offerings.
the thing is, accreditation, in and of itself, is pretty goddamn difficult. Schools do a horrible job of accrediting software engineers. Generally speaking, accreditation doesn't really start working well until job roles stop changing so fast.
I homeschooled my sons. My sons are "twice exceptional" -- gifted with learning (and other) disabilities. I have always told them that their best hope of making their lives work will be to found their own company. They can very effectively get things done but I don't think they can work for other people. When you found your own company, you don't need accreditation. Either the product sells or it doesn't.
Similarly, I spent a lot of time very sick. I made a lot of dietary and lifestyle changes and I am healthier now than I have ever been in my life. I am not a doctor but I am also not impressed with the current healthcare model followed in the US. I think I have a cheaper and more effective alternative to offer. Study after study after study shows that diet and lifestyle are huge factors in all the major (deadly) illnesses, like heart disease, diabetes, cancer. But doctors mostly don't push for dietary and lifestyle changes. They are mostly in the business of selling pills and surgeries. A big part of why they don't is that people aren't very receptive. Many people just want an easy answer. And it's quite hard to make money off of promoting lifestyle changes.
So to me, the big challenges are how to more effectively convey information and how to monetize it. I am still trying to decide if I should largely abandon my current websites and start over in a new space. I don't yet know the answer to that. But I do know that I have narrowed it down to "education is the answer" and I do know that I am very good at some things that other people seem to not be good at. I know this from years spent on homeschooling forums where for some particularly difficult things, I often was the only person who had much to say.
So I find this article very relevant at the moment and encouraging. I don't know if I have much to really contribute to the conversation that will be of interest to others or that they will value. Perhaps I just need to talk about it. For my own goals, accreditation is unimportant. The piece that matters is whether or not the information is conveyed adequately, which my current websites are not doing. I think I will be doing something educational on the web in the future. So I am kind of excited to see this piece.
speaking as someone who has experience working for other people /and/ starting companies, I would argue that a lack of social skills is limiting in both.
In both cases, if you are primarily a technical person, yea, lacking social skills is a problem that will hold you back, but it's not the end of the world, or your career. It's certainly not as bad as, say, being of average intelligence, or lacking curiosity, or being lazy.
Running your own company gives you a lot more flexibility; if you know yourself well, you can use that flexibility to build a situation where your strengths help and your weaknesses don't matter so much...
But really, the computer industry was designed for mildly-autistic nerds who don't play well with others. by the standards of such things, I'm almost a friendly extrovert.
speaking as someone who has experience working for other people /and/ starting companies, I would argue that a lack of social skills is limiting in both.
That isn't my primary concern. I have a form of cystic fibrosis, which has a life expectancy in the mid-thirties. My 23 year old son has the same diagnosis. He has not been on antibiotics in over 12 years and has taken no medication for about 3 1/2 years. This was achieved by carefully arranging our lives, something you cannot do in a job. He also has other issues, like eyesight issues. I currently work for a large company and have been stuck in the same entry level job for nearly four years. Conditions at work are a problem for me due to my health and this means my performance at work is not that great and it shows. If my son has adequate control over his working conditions, he can be very productive. If he doesn't, he can't be. Most employers are not going to accommodate his needs. And I went through the intake process with a program for helping handicapped individuals get a job. Fortunately, I got a job on my own before they provided any of their so-called "assistance" or I likely would be in worse straits.
I am very good at explaining social things to my sons and helping them cope in that regard. They can do that piece of it. But they still need a high degree of control over their lives, something a regular job will not offer.
The social skills you need as a one-person company are a much narrower set than what's needed when employed at a big, politically minded company.
When you are a consultant you have only one boss: the guy who is able to sign the check. When you are employed, there can be multiple official and non-official bosses that you have to satisfy, plus all of the personal/social drama.
eh, the social drama is different, and it depends on the industry. For instance, I can hold down a SysAdmin job just fine, as a contractor or not, but I lack the social skills to rent out people who work for me as sysadmins, even when they are obviously good people and I'm charging below market rates. in some cases, I've given up and suggested the people for direct hire... and they got hired, at rates higher than I would have charged, even including my cut.
I think either way, there is a tradeoff. If you get shit done, people will put up with a lot of social problems, in both situations. Like the guy who wrote an HN story about how I didn't answer his support email. He refused a refund and wanted to remain a customer... the thing was, he wanted better support, but when the rubber hit the road and it was time to cut a cheque, he chose cheap over good support.
You can set up your company so that social skills don't matter as much, but that limits what markets you can go after. You can also choose your career while working for others, and get similar tolerance for social oddity.
(of course, the OP's problem was not so much social skills as flexibility, and being self-employed almost always is more flexible than working for other people.)
>But doctors mostly don't push for dietary and lifestyle changes. They are mostly in the business of selling pills and surgeries.
really? my doctor pretty regularly admonishes me to eat better and work out more. I think the problem is me... lifestyle changes are /hard/
I mean, maybe they need better tools for making those admonishments more effective, but they certainly recommend diet and lifestyle changes before breaking out the pills and scalpels.
Not when you have something really deadly, like cystic fibrosis. Most people with CF are on several thousand dollars a month worth of "maintenance drugs" -- ie when they "aren't sick" -- and take more when they have an exacerbation. They account for something like half of all pediatric lung transplants and a third of all adult lung transplants (edit: Those are probably US specific figures). Meanwhile, I am dismissed as crazy for saying that diet and lifestyle can make a really big difference.
The current state of the art advice for CF patients regarding diet is "high calorie, high fat, high salt". The medical establishment routinely pushes unhealthy foods at people with this condition because such foods have a high fat count and, therefore, a high calorie count. Healing my body has shrunk my appetite. I no longer need the extremely high calorie diet that is the norm for my condition. I get treated by the CF community like I made up my diagnosis and was never actually sick. The idea that you can have CF and get well is wholly alien and utterly rejected by most members of the CF community, as well as the medical establishment. Most research for treatment is for drug therapies, not diet and lifestyle. The idea is that since it is so deadly, the treatments need to be equally scary (much like chemo for cancer patients). I have found that is completely wrong and is, in fact, part of the problem.
Short version: Good quality sea salt makes a huge difference. Good quality fats make a huge difference. Addressing pH balance does miraculous things (people with CF tend to be way too acidic, and this likely is part of what causes the "normal progression of CF", because high acidity within cells causes proteins to misfold, and a misfolded protein is the crux of the matter in CF).
I can't remember if it was Stanford or Berkeley, but one of them was going to open up the ability to earn a full degree online. I'm not sure when, but I heard that transition has a lot of students at the university in an uproar.
One step at a time and the elite education will be everywhere, though it'll still cost an arm-and-a-leg.
I am huge on web-based education. I can't imagine what my skill-set would look like if I had access to today's web when I was growing up. I would love to build a website that brings educators together, from anywhere, to instruct groups of 10-15 people on specific topics-of-choice -- for free.
Why hasn't online learning trickled down to the lower levels? Public schools could really benefit from having a big catalog of online course material to offer students. I realize there's a need and benefit to instructor led courses but our entire education policy in the US seems to focus on the lowest common denominator. Why not let the best and brightest push forward with self paced learning? Give them more choices. Let them start focusing on their unique interests. This would possibly give teachers more time to help the kids who are falling behind. I would have done much better in school with self paced learning.
It wasn't until we had something like Youtube and widespread broadband that it became viable for even the poorest schools. That was only 2005. Look at how quickly stuff like Khan Academy and other things sprung up after Youtube launched.
That's only 5 years. Schools have to be conservative in things like this because the things they teach kids will serve as the foundation of their lives. Give it another 5 or 10, and I think we'll see an online component as a standard part of public schooling.
There are a couple of things I can get from attending uni that I can't get from learning online.
- Focus - Sitting at the computer, it is very easy to be distracted while trying to pick up a challenging new concept. I find this even more true of things I don't derive any immediate benefit from; it is easier to concentrate on learning a library I'm going to use in a freelance project than, say, an obscure statistical model for a class. Sitting in the lecture gives me the chance to clear my head of most distractions and focus on the content.
- Mindset - Heading into uni usually gets me in the right mindset for learning and completing tasks. I don't get that sitting at home.
I would also add the guidance of some lecturers/supervisors, but this point really depends on where you are. I can't say from my own experience that this couldn't have been replaced online, although there are moments when having such experienced teachers can help.
Edit- I would also add that these issues could be solved by getting together at a co-working space with some like minded others. I guess a hybrid model like that could have some merit, the educational institute acts as a network to connect like minded people for online study. Maybe they could also have a staff member that has a hands off role and just answers questions over email for a much reduced fee over the traditional system.
Because internet education should be enough for anyone...
Now, seriously, Gates has a very weak track record of predicting the future of technology. That's why Microsoft was never known to be a first mover into any new market. It's really hard to predict the future - I was watching 2001 with my 14-year-old son last week and couldn't help but laugh that in an enormous space station in what seemed like high orbit you would have AT&T phone booths.
"Mobile devices will be able to send and receive messages, but it will be expensive and unusual to use them to receive an individual video stream"
On ordering flowers online: "You'll be able to watch the florist arrange the bouquet, change your mind if you want, and replace wilting roses with fresh anemones."
"I believe that we’ll not only be using the keyboard and the mouse to interact, but during that time we will have perfected speech recognition and speech output well enough that those will become a standard part of the interface." (this is from mid 90's, IIRC)
"I believe OS/2 is destined to be the most important operating system, and possibly program, of all time," which is right in a sense that NT was once OS/2 3.0
"Spam will be a thing of the past in two years' time,"
"We will never make a 32-bit operating system," (that was when he introduced the MSX in Japan)
The low end of education will surely improve with the internet. Kids with mobile phones and computers can quickly google stuff and browse Wikipedia to get deeper information beyond what their textbooks offer. It will augment, but it won't replace, physical schools anytime soon.
It's not that hard to predict the future of technology if you follow the relevant fields of science/engineering. There are usually a set of core problems that everyone in the field is aware of and has been working on for decades. What is difficult, is predicting the timing of mass market viability of those technologies. (So in this respect it's like how many people had identified a housing bubble in 2004, but anyone who'd bet heavily on its imminent collapse then would've lost out.)
Microsoft in particular actually have a track record of entering a market prematurely with an unpolished entry (often because the hardware resources/architecture/maturity of the underlying academic field aren't yet there to support a polished entry), failing, shelving the idea, and then being caught flatfooted N years later when someone else enters with something more polished.
There seems to be a consensus on HN that it's very important for startup founders to live in the same location. Face-to-face time is considered so important that VCs say they will not even consider investing in distributed startups. The same theme is strong when describing the hassles with outsourcing.
Yet a lot of people completely ignore this for education. Suddenly there is no value in spending time with classmates/teachers. It's all very strange to me.
I think Gates is right, and this is something which I've also ranted about in the past. The cost of university education has been increasing, there are an increasing number of people wanting university education and that level of education is increasingly expected in the workforce. There is also a significant fraction of the population who cannot afford university education (don't have wealthy parents) or do not want to take out gigantic loans which may take them decades to repay. University education is no longer a ticket to a high paying job, and that has been the case for quite some time, so large education loans can't necessarily be justified on a financial basis. Also this year many applicants have been turned away by universities due to budget cuts which means that they have fewer places, and that trend could go on for a few more years at least.
The solution isn't new. The Open University has been around for a long time, and something like OU, augmented with all the media content that the internet can deliver, would seem to be the logical way to deliver higher education at lower cost.
I do not see anyone replacing an Oxford place with OU, or any university place with OU. I think, though I might be very much wrong, that it caters to niches rather than the core of the university market, which is 18 to 21 year olds.
The difficulty with OU is that it is not easy to motivate yourself. At uni you know you are there to study - kind of - whereas with OU or any other virtual replacement it's a bit like a project which can easily drift, because really nothing has changed in your life. You'll just get a bunch of stuff in the post and some online logging.
Also, there seem to be many generalisations in your comment, and almost wishful thinking. Sort of saying: things are bad, so this replacement is the alternative, so let's do it, rather than judging the replacement on its merits.
Yeah, that might be because there have been record numbers of applications to university this year and some 100,000 students are not going to get a place at any university, so many probably rightly thought getting a distance degree while working might be a good alternative.
But we would need much more than an increase in percentage to make any judgement: what are they studying, how many get to graduate by age group, how many enrol in each age group...
"Five years from now on the web for free you’ll be able to find the best lectures in the world"
This is already true--for just one example, see academicearth.org. I suppose Bill has to make it a prediction for the future for the press to be able to digest it.
I often tell people how absurd modern education is given the existence of video lectures (and the printing press, for that matter). They usually tell me that video lectures aren't a replacement for live lectures. When I ask if they've actually watched any video lectures, they usually haven't.
Best advantage of video lectures: I can download them with the netvideohunter firefox extension and watch them 1.5x speed in VLC, which actually improves my comprehension because I'm less bored.
Free textbooks are probably even better. I'm too busy with my side project to study, but here are two that look AWESOME:
Best class I ever took was last year, the professor had been doing chalk and board since 1962, he was amazing, and it was because I was there. Maybe after his generation is gone we won't lose anything from communicating entirely via web, but to get the full breadth of knowledge from the lingering analogites among us, you gotta roll out of bed.
I think the real value of the web is to give students free access to the nitty-gritty details of potential careers. These details would not serve as a replacement for education (I feel that's impossible, since hands-on lab experience and access to expensive equipment, etc. are so important to competency). Rather, they would help to ensure that students are embarking on a path that they're sure they want.
Some people spend a couple of years and thousands of dollars before discovering that their major isn't really what they thought it would be. And by then, they're just unhappy; they have invested too much to turn back, and don't really have passion anymore. I strongly suspect that a lot of people who do crappy work simply got into this kind of trap; if they'd had the chance to find out more about other jobs in detail, they may have made a different choice of major, and been much happier and more productive somewhere else.
Most students today realise that reading lecture notes on the course website is often an entirely adequate substitute for actually attending the lecture.
But that misses the point. None of the knowledge or experiences I gained at university that have value to me today were obtained in a lecture theatre; nor could they be readily substituted for by a website.
That's great. Now please set up an organisation, whether profit or non-profit that will examine the content of the most common degree courses that teach something where there is a "core" to the field regardless of prior qualifications and certify that knowledge.
Personally, I think it would be best, as much as possible, to disconnect the teaching and the certification. Both are hard problems by themselves; when you put them together, you end up with what we have now.
I would give a robot doctor to those who consider education to be simply a matter of facts. You should take the time to consider that such a "doctor" would not be as good as a real one.
So when a "robot doctor" is acceptable to you, then I will take your view of education seriously.
Amen. I'm taking a class in Abstract Algebra in the fall. I already have a general overview of the topic and have been working some practice problems from online resources. It's amazing what's available these days if you want to teach yourself.
On the other hand, I just finished taking a couple of courses online and I have to say that I really missed the hands-on and interpersonal (study groups, discussions, etc.) aspects of a real classroom environment.
The context is clearly different, but the contrast seemed interesting.
An excerpt:
"The reasoning behind this decision was economic. The government reckoned that every million new graduates would raise China’s gdp by 2%. Not only did university graduates make skilled employees and attract higher incomes, but education itself was an engine of economic growth."
I agree with the title, but this seems short-sighted:
Five years from now on the web for free you’ll be able to find the best lectures in the world
Lectures might be the optimal way of learning given the technology of the last millennium, but I don't think they'll survive even a century in this one. Computers are interactive interfaces, and "The Web" is an interactive medium. I have a feeling that whatever replaces lectures is going to take advantage of that.
There are non-interactive digital media too such as podcasts... it seems like nearly every scholarly lecture I try to listen to, though, has the first 20-30 minutes taken up by (multiple!) fawning introductions or a reading of the class syllabus. I'm getting annoyed just thinking about it.
As a concrete example, the Apple-provided WWDC 2010 sessions on iOS 4.0 and OS X are some of the best educational materials I've ever seen.
Well recorded, tested material, and, this is key, absolutely knowledgeable presenters: the people who built those systems. I doubt any university, at any price, could provide lecture materials as broad as these. Add in the fact that I can watch these at my own pace on an iPad, literally anywhere, and scrub back and forwards, and there's no excuse for missing anything.
I think back to my sausage-factory university classes, and it was pretty much all second-rate stuff, with a few notable exceptions (a guest lecture from SPJ on graph reduction in '87 was probably the high point). It's sad to think I spent four years of my life doing a bachelors, when I could probably cover the material about 50% faster now, and retain it significantly better.
In short, Gates is late. It'll be way faster than 5 years. Hell, the new SICP lectures are already available I believe, and I'm sure geology, physics, maths, stats will all follow.
University and high school are a way to employ people who study and obey rules. If you dry up this fountain and these jobs get outsourced, fewer people will have an incentive to study. A society without universities would be a very disrupted society. Power is university-supported.
It would be nice if there was more substance to the article. It would be fascinating to know the reasons Gates has formed this opinion - the specific trends he's seeing and some actionable road-map to make this happen.
Eh, it seems to me that if he knows how to do well /without/ a formal education, he may be in a position where he knows more than most people about what parts of 'education' are important, and which parts are not.
Well, at least in his field of expertise. Different fields depend on education in different ways.
Really, what he was best at, as I gather, is summed up by this:
After reading the January 1975 issue of Popular Electronics that demonstrated the Altair 8800, Gates contacted Micro Instrumentation and Telemetry Systems (MITS), the creators of the new microcomputer, to inform them that he and others were working on a BASIC interpreter for the platform. In reality, Gates and Allen did not have an Altair and had not written code for it; they merely wanted to gauge MITS's interest. MITS president Ed Roberts agreed to meet them for a demo, and over the course of a few weeks they developed an Altair emulator that ran on a minicomputer, and then the BASIC interpreter. The demonstration, held at MITS's offices in Albuquerque, was a success and resulted in a deal with MITS to distribute the interpreter as Altair BASIC.
right. my point was that gates is clearly /very good/ at something. From what I read, he's no slouch as a computer guy either; I mean, how often are projects like that delivered on time? Sure, he made a dishonest claim about what he had, but when called on it, he delivered.
My point, though, is that he was to some degree lucky and gained success by writing one program. Sure, that might not have been easy back then, and it might have required some intelligence, but it hardly makes him qualified to know much about education.
Being qualified would mean applying the scientific method to how people become successful.
Just because I might be intelligent doesn't mean I can tell someone else how to become intelligent. A scientist might be able to, however, if he does a lot of studies and is able to derive rules from them.
Besides, he is not serially successful, as in building one company and moving on to the next, or successful across business, politics, history, medicine, and physics. He just wrote one program. As I say, sure, it took some wits to make the deal and intelligence to write the code, but that in no way makes him qualified to know much about education, which is such an extensive topic.
And yes, he was a dropout, hence his genius idea of accrediting any knowledge however it was gained. If someone has some knowledge, there are other ways to prove it to others than through accreditation. We have, I think, enough pieces of paper that mislead people as to our ability.
if one needs formal education to have an opinion on education, why are you talking to me? I didn't go to school either.
My point is just that school can't be the only qualifier for knowing about education- unless the point of education is to make people good at school.
>And yes he was a drop out hence his genius idea of accrediting any knowledge however it was gained. If someone has some knowledge there are other ways to prove it to others than through accreditation. We have I think enough pieces of paper which misleads people as to our ability.
I think most people agree that most certifications are garbage. However, most degrees aren't any better, when it comes to accrediting that a person has a particular skill. It's a hard problem that isn't being solved well by anyone right now.
The thing is, a college degree doesn't say anything about the ability of a person to do a particular job, and people in industry recognize this.
But the idea that you could have a certifying agency say that Joe is a level-5 Unix engineer is very attractive to companies... a good certification system would make hiring people way easier.
The problem is that certification is one of those things that's easy to do badly and extremely difficult to do well.
Talk about doing Lean Development and MVP back in 1975. Pitch a product idea. If there's no interest, move on. If there is, go build it, then sell it. Profit!
when will this "gates was a dropout so he don't know nothin' about skool" meme die? the man went to prestigious private schools and Harvard and was well on his way to graduating when he chose to drop out. it wasn't like he was a poor kid who had no opportunity to ever know what it was like to be in a proper skool
He did drop out of Harvard, no? Every dropout has an excuse in the end, I'm not going into that. I just don't understand what makes people so upset about this point.
i think it might also be because people often conflate two types of drop-outs:
1.) those who dropped out of school due to lack of financial or social resources, or due to personal emotional problems
2.) those who dropped out of school because they had an incredible opportunity to capitalize on their natural talents and take advantage of their favorable surroundings
many successful 'drop-out entrepreneurs' fall in the latter category ... they came from middle-class or upper-middle-class homes, their parents had advanced degrees, they often attended private high schools and prestigious colleges, etc. etc. etc.
i'm afraid that if the meme of "dropping out is cool, just look at zuckerberg, gates, brin/page, etc." keeps propagating, it will give kids who aren't as advantaged (those in the former category) misguided expectations for what awaits them when they drop 'outside of the system'
I assume it wasn't your intention, but your comment implied that he dropped out because he was academically weak, which is why I presume people are upset.