
Does anybody know if Notch is self taught? I thought he had a CS Degree from somewhere?

I fluctuate between thinking I am a fairly competent developer and thinking I am possibly the worst programmer there is. I'm not sure which attitude is better; hopefully the truth is somewhere in the middle.

I think the issue with reading some of the discussion on HN is that you get people talking in detail about things like functional programming languages, systems with huge scalability, hardcore math problems, and the finer points of memory management in the Linux kernel, and it starts to feel obvious that you should understand all this stuff.

I have been trying to do more book reading to improve. Of course, the issue is that whenever you read any book recommendation thread on HN there are always at least 30 or so recommendations of pretty thick books, and there's no chance I'd have time to read them all.

There is also a difference between having deep knowledge of the tools and libraries you are using right now and having a deeper understanding of theory: for example, learning git vs. learning graph theory.



Does anybody know if Notch is self taught? I thought he had a CS Degree from somewhere?

I can't say with any degree of authority, but based on what I've read of his code, I'd guess he's self-taught.

My bigoted preconception is that when it comes to abstraction in code, people with CS degrees err on the side of doing too much, creating extremely elaborate object frameworks that close down options as much as they help reuse code. Self-taught folks, on the other hand, err on the side of doing too little, relying on cut and paste, util packages, and promiscuous sharing of data.

When it comes to algorithm design, people with CS degrees tend to pull in libraries of esoteric things, sometimes overengineered for the purpose at hand. Self-taught folks tend to write something from scratch that almost, but not quite, does the job right.

Based on what I've read from decompiled Minecraft, I'd guess Notch is self-taught. The abstraction is just barely enough to get the job done, and the algorithms are decidedly homebrew. That's neither praise nor criticism, just a comment on style.


I think that's because university teaches abstractions and pretty much nothing else. A project in the average CS class will be presented as a way to teach a design pattern rather than as a way to teach how to solve a problem.

Also, the marking scheme will tend to favour a broken solution that attempts an elegant abstraction over a more basic design that is well tested and works.

There is a certain danger in being educated: for example, a self-educated programmer might have a problem, just implement an O(n^2) solution, and move on, whereas a college-educated programmer might spend excessive time trying to work out a way to do it in O(log n).
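To make the trade-off concrete, here's a toy illustration (not from the thread, and the complexity classes here are O(n^2) vs. O(n) rather than O(log n), but the point is the same): the nested-loop version is the kind of thing you write and move on from, while the faster version takes a moment's more thought.

```java
import java.util.HashSet;
import java.util.Set;

// Toy example: checking an array for duplicates two ways.
public class DupCheck {
    // The "just ship it" version: O(n^2) nested loops.
    static boolean hasDupNaive(int[] a) {
        for (int i = 0; i < a.length; i++)
            for (int j = i + 1; j < a.length; j++)
                if (a[i] == a[j]) return true;
        return false;
    }

    // The asymptotically better version: O(n) with a hash set.
    static boolean hasDupFast(int[] a) {
        Set<Integer> seen = new HashSet<>();
        for (int x : a)
            if (!seen.add(x)) return true;  // add() returns false on repeats
        return false;
    }

    public static void main(String[] args) {
        int[] sample = {3, 1, 4, 1, 5};
        System.out.println(hasDupNaive(sample)); // true
        System.out.println(hasDupFast(sample));  // true
    }
}
```

For small inputs the naive version is perfectly fine, which is arguably the self-taught programmer's point.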


> I think that's because university teaches abstractions and pretty much nothing else.

Maybe my CS degree was unique (I don't think it was), but at the 400 level we had specialization choices including security, graphics, web development, databases, operating systems, embedded, software engineering, etc.

> A project that is done in the average CS class will be presented as a way to teach a design pattern rather than teaching how to solve a problem.

Design patterns != algorithms. My professors actually spent precious little time teaching me how to program; most of that was self-taught. What they taught me was how to solve problems, and that education has been very useful.

If your CS professors are spending most of their time teaching you design patterns, you should ask for a refund. The only CS class that taught me design patterns was my 400-level software engineering course.


That's not quite what I meant; most CS classes cover many things other than design patterns.

The problem is more that in an undergrad degree you are unlikely to need to do any large-scale (by industry standards) project. So the way they teach you good design is largely forced, and the most academic way to do that is through a "design patterns" type class, where the project you are building is small enough not to require design patterns in the first place.

This then teaches people that you should always look for patterns to apply and ways to abstract things, since you will be artificially marked up at college for building abstractions, and so you will look for them in every problem you have.

On the other hand, someone who is taught more by experience will start by writing awful code, like putting their whole program into one or two functions. They will then hit pain points because of this, recognize certain recurring problems, and either come up with their own solutions or read the Gang of Four book. This teaches them design patterns the natural way.


> On the other hand, someone who is taught more by experience will start by writing awful code, like putting their whole program into one or two functions. They will then hit pain points because of this, recognize certain recurring problems, and either come up with their own solutions or read the Gang of Four book. This teaches them design patterns the natural way.

This is exactly how I taught myself. The transition from unorganized to organized (patterns) happened pretty naturally for me. For me, it was learn how to organize, or quit. Nowadays, patterns are everywhere (and probably always have been, but I didn't get it until I got it :)


+1, reminds me of the J2EE Pet Shop. Beautiful abstractions; slow as a dog.


> There is a certain danger in being educated: for example, a self-educated programmer might have a problem, just implement an O(n^2) solution, and move on, whereas a college-educated programmer might spend excessive time trying to work out a way to do it in O(log n).

The CS graduate will have covered big O classifications but if they immediately focus on this form of optimization it's probably because they have spent years reading blogs that tell them this is the nature of tests at places like Google.

If every CS grad automatically thought this way by virtue of their education there would be no reason to test for it in interviews simply because a CS degree is often a minimum requirement for the kind of job where you'll be asked these questions.

It's only a priority if you make it one and those self-taught guys can probably self-teach themselves big O.


Your assumption that self-taught programmers (i.e., ones who did not get a "formal" education) do not understand fundamentals is utter bullshit. You could reverse your statements and they would both still work. I've worked with people who have a master's in CS, and in general it matters about 1% of the time.

A programmer today has a wealth of information they can pull from that does not require a single ounce of formal education. It takes dedication to the craft not bucket loads of money.


I don't think I made that assumption in that post at all.

My point was more that a formal CS education is likely to give you a different perspective on programming vs being self taught.

I would imagine most self-taught programmers focus on results-oriented learning. When I first learned to program, before doing any formal CS, my approach was "I want to do X; what is the minimum set of stuff I need to learn in order to do that well enough?" After learning more formal CS and being forced to consider things like abstraction and efficiency for their own sake, I now focus more on them in every program I write.

Not suggesting that you can't be completely self-taught and learn everything you could from an academic education (you can), but you are less likely to spend a month learning a bunch of design patterns and algorithms unless they directly apply to something you need to do right now.

You are more likely to just start hacking away at something and then think "oh, this code is a mess, how can I fix that?" rather than reading the entire Gang of Four book to start off with.


I guess it's obvious, but I am (mainly) a self-taught programmer, and I think that you really don't know what you are talking about.

When I was around seven years old, we had a Tandy Color Computer 2. I wore out the book that came with that, playing with BASIC. We also had a Vic 20, a TI-99a and an Ohio Scientific, and eventually an IBM AT compatible. For years, I spent a lot of time playing with short BASIC programs.

Eventually, maybe around seventh grade, I got a book called Turbo Pascal Disk Tutor (or something like that). I loved that book and I spent many months studying the book and doing the exercises. I was very serious about learning object-oriented programming. Over the next couple of years I experimented with a simple wireframe 3D CAD-like program. I became very familiar with abstraction, polymorphism and other object-oriented concepts before I entered the 9th grade.

Anyway, I'm not going to list every single program I ever wrote or design pattern or programming language or concept I taught myself, but the point is, I did read books and learn a lot of things that are actually apparently missing from many undergrad and even graduate CS-like programs. A guy at Stanford just recently came out with a Rails course partly about software engineering, which apparently is practically revolutionary. There is more contemporary software engineering baked into Rails than what probably more than half of CS or even SE graduates in the last five or ten years ever saw in their courses.

And ever since I dropped out of college (only took like two CS-related courses while I was there), I have been extremely motivated to learn as much about CS and software engineering as I can, mainly because of attitudes like yours.


> My bigoted preconception [...]

Well, I would have called you on this, but I guess I don't have to.

> The abstraction is just barely enough to get the job done [...]

Or, in other words, the perfect amount...


>> The abstraction is just barely enough to get the job done [...]

> Or, in other words, the perfect amount...

Well no, because the job doesn't end when it's "done". Minecraft is the perfect example:

In the earliest versions of the game, blocks were all basically homogeneous cubes of some material, so they didn't need to be oriented. Later, blocks were added that did need to be rotated in various ways, e.g. torches and stairs. But each of these blocks had its own private system for choosing, storing, and rendering its orientation. These systems were often similar, but not identical. At this point, roughly half the blocks in the game are orientable in some way, and there is still no generic orientation system. Such a system would have avoided massive amounts of redundant code, prevented many bugs, made the user experience more consistent, and made various 3rd-party tools much easier to develop.
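As a rough sketch of what such a generic system might look like (this is purely a hypothetical illustration, not Minecraft's actual code or class names): one shared interface replaces each block's private orientation scheme, so generic code can handle any orientable block uniformly.

```java
// Hypothetical sketch of a shared orientation system.
public class OrientationDemo {
    enum Facing { NORTH, SOUTH, EAST, WEST, UP, DOWN }

    // One shared contract instead of per-block private schemes.
    interface Orientable {
        Facing getFacing();
        void setFacing(Facing f);
    }

    static class Torch implements Orientable {
        private Facing facing = Facing.UP;
        public Facing getFacing() { return facing; }
        public void setFacing(Facing f) { facing = f; }
    }

    static class Stairs implements Orientable {
        private Facing facing = Facing.NORTH;
        public Facing getFacing() { return facing; }
        public void setFacing(Facing f) { facing = f; }
    }

    // Generic code (renderers, editors, 3rd-party tools) can now treat
    // every orientable block the same way.
    static void rotate(Orientable block, Facing f) { block.setFacing(f); }

    public static void main(String[] args) {
        Torch t = new Torch();
        rotate(t, Facing.EAST);
        System.out.println(t.getFacing()); // EAST
    }
}
```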

"You ain't gonna need it" is a cop-out. You are going to need some things. The trick is anticipating which things, and it will definitely pay off if you can guess correctly.


Hmm, from what I can tell, Minecraft is fairly successful, despite it being a "perfect example" of not having enough abstraction.

Of course, you fail to really acknowledge the risks of premature abstraction. Sure, if you could see the future, and know what patterns could be usefully factored out into abstractions, it would be good to start with those abstractions. But what happens if you incorrectly predict that an abstraction will be needed? You create a bunch of unnecessary framework code that is harder to understand, likely less efficient, and worst of all, you wasted time writing code that you didn't need.

YAGNI is not a cop-out. The best way to create abstractions is from concrete examples. Write something once. Then, once you actually find yourself writing it twice, abstract it out. That guarantees that you don't waste time on things that you don't use. It also generally leads to better abstractions, because you have concrete use-cases to work from.
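A minimal sketch of that workflow, with invented names: imagine the same email check first appeared inline at two call sites; once the duplication is concrete, it gets factored out into one place.

```java
// Sketch of the "write it twice, then abstract" workflow.
public class RuleOfTwo {
    // First pass: two call sites grew near-identical validation inline:
    //   saveUser:   if (email == null || !email.contains("@")) return false;
    //   inviteUser: if (email == null || !email.contains("@")) return false;

    // Second pass: the duplication is now concrete, so factor it out.
    static boolean isValidEmail(String email) {
        return email != null && email.contains("@");
    }

    static boolean saveUser(String email)   { return isValidEmail(email); }
    static boolean inviteUser(String email) { return isValidEmail(email); }

    public static void main(String[] args) {
        System.out.println(saveUser("a@b.com"));  // true
        System.out.println(inviteUser("nope"));   // false
    }
}
```

The abstraction here is trivial, but the point is the order of events: the helper was extracted from two real uses, not designed in advance.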

Anyway, back to the Minecraft story, who are you to say that the game would be better if Notch followed the premature abstraction strategy? Isn't it possible, perhaps, that he would have wasted enough time implementing ivory towers of abstraction that he might have left out the features that actually made the game fun?


> But what happens if you incorrectly predict that an abstraction will be needed?

Then you made a mistake and hopefully learned something. I didn't say architecting software was easy or without risk, just that you can't avoid doing it by following simplistic rules.

> The best way to create abstractions is from concrete examples. Write something once. Then, once you actually find yourself writing it twice, abstract it out.

It's nice when things go that way, but it's not the general case. Often, by the time there is a concrete use for abstraction, the damage is done. For example, adding network multiplayer to a game that has been architected for single player is a nightmare of hacks and duplicated code (Notch has explicitly lamented about that one).

> Anyway, back to the Minecraft story, who are you to say that the game would be better if Notch followed the premature abstraction strategy?

Notch himself seems to be saying as much in that blog post. But that aside, I'm a fellow game developer who has spent dozens of hours reading and modifying the Minecraft code. Even after a round-trip of obfuscation, it tells the story of its creation quite vividly, and the theme of the story is "hack around it!" Though I still have tremendous admiration for the game and its developers.


I think the point was that the source code doesn't make the game better or worse.

A nightmare of hacks is fine as long as the product is great.

IMO writing such code is sometimes even better, especially for a solo developer. If you aim to write beautiful code, it might eventually outweigh everything else, while giving a false impression that you're doing the right thing. Your goal is the product, not code. I'd argue that you can't focus on both (it's called ‘focus’ for a reason).


Of course source code makes the game better or worse. The game is made of code. The code makes the game what it is.

If you write good code, your product will work better and be done sooner. This is the definition of good code. If you write bad code, your product may be overbudget, buggy, inadequate, and so on. This is the definition of bad code.

There is no dichotomy between the product and the code. To suggest that you can make better software by neglecting the code is absurd.


> The code makes the game what it is.

No arguing with that. Same as the building material makes a house what it is. The question is, does success depend on the material used? You can build a great house in the middle of the desert.

But that's an analogy. For a more real-world example, imagine two startups:

- Startup 1: bad programmer, good QA.
- Startup 2: good programmer, bad QA.

Where would you invest your money?

> If you write bad code, your product may be overbudget, buggy, inadequate, and so on.

Here I disagree. Overbudget? It depends on the product's success. Buggy? If you have good QA, it's not buggy. Inadequate? You can write the cleanest code and your product still might not work as users want it to.

When we say ‘great product’, do we mean that it has nice clean code, or it's something else? How many great products have bad code?

> There is no dichotomy between the product and the code.

As long as you are ‘just a’ developer, and there are other people focusing on product and its functional quality. In that case you receive specific tasks with deadlines, and yes, you should focus on writing good code.

Not so if you're a solo developer.

1) No one will focus on the product, except you.
2) You most likely would be heavily biased towards writing good code. (Because you're a developer, you're supposed to write good code, right?)

You need to force yourself to focus on the product, to avoid becoming Startup 2 from the above example. Intentionally writing bad code is one way to do that. In that case you at least can be Startup 1: you'll be forced to pay more attention to functional quality (as opposed to structural), so you'll be the good QA.

You can argue that one can focus on both. My opinion is that it's too risky. You need to have priorities set as clear as possible.

> To suggest that you can make better software by neglecting the code is absurd.

Yes, it sounds really controversial (especially to a programmer). I'm far from satisfied with that statement. What would be a better way to be a good QA while being a great programmer?


Multiplayer mode can be extremely hard and can take any fun out of being an indie game developer. I'm still not sure it's the right abstraction to work on from day one.


> Then you made a mistake and hopefully learned something.

Perhaps you learned that prematurely abstracting things is a waste of time?


Regarding the construction of abstractions from day one:

http://en.wikipedia.org/wiki/Opportunity_cost


"""Well no, because the job doesn't end when it's "done"."""

By definition, it does.

"""In the earliest versions of the game, blocks were all basically homogeneous cubes of some material, so they didn't need to be oriented. Later, blocks were added that did need to be rotated in various ways (...)"""

So you are suggesting that they should have set up a system to allow that from the beginning.

Have you stopped to think about how adding things like that could have delayed the initial release?

Also, have you considered that if the initial release had not been successful in the marketplace, all that extra work would have been in vain?


Just build what you need at the time, and make it flexible enough so that it can be refactored to something else later.


>Just build what you need at the time, and make it flexible enough so that it can be refactored to something else later.

But this is simply framework-level abstractions.


Well, "flexible enough so that it can be refactored to something else later" != framework-level abstractions.

It could be as simple as: just don't make an untangleable mess out of it.


> Or, in other words, the perfect amount...

Philosophically, I agree. I view premature abstraction in the same light as premature optimization. I believe both abstraction and optimization are incarnations of your understanding of the problem. You want them in important places, not necessarily everywhere, and hence you want them late enough in the engineering process that you understand which places are important.

That said, within the comfort zone, there are high and low levels of abstraction. For example, in a project the size of Minecraft, I would expect CS major code to contain an ObjectFactoryFacadeCollection or two. Minecraft has nothing of the sort. It sticks almost exclusively to the Mob::HostileMob::Zombie inheritance we all grew up with. This is not bad or good[1], it's simply a reflection of the low-abstraction style of attacking problems that I associate with self-taught programmers.
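For reference, a minimal sketch of that "textbook" inheritance style (illustrative only; these are not Minecraft's actual classes or fields):

```java
// The kind of low-abstraction class hierarchy "we all grew up with".
public class MobDemo {
    static abstract class Mob {
        int health = 20;
        abstract String name();
    }

    static abstract class HostileMob extends Mob {
        boolean attacksPlayer() { return true; }
    }

    static class Zombie extends HostileMob {
        String name() { return "zombie"; }
    }

    public static void main(String[] args) {
        Zombie z = new Zombie();
        System.out.println(z.name() + " hostile=" + z.attacksPlayer());
    }
}
```

No factories, no facades: just a shallow inheritance tree that directly models the domain.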

On the other hand, its Magic Number to Constant ratio is downright scandalous... ;) Though it's possible that some of that is an artifact of the compilation/decompilation process.

[1] I'm lying, in this case it's a good thing.
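For what it's worth, the magic-number point takes only a couple of lines to show (the value 64 here is invented for illustration, not taken from Minecraft):

```java
// Magic numbers vs. named constants: same behavior, different readability.
public class Constants {
    // Magic-number style: what is 64 supposed to mean?
    static boolean canStackMagic(int count) { return count < 64; }

    // Named-constant style: self-documenting, and changeable in one place.
    static final int MAX_STACK_SIZE = 64;
    static boolean canStack(int count) { return count < MAX_STACK_SIZE; }

    public static void main(String[] args) {
        System.out.println(canStackMagic(63) == canStack(63)); // true
    }
}
```

Note that a decompiler can only ever show the first style, since `static final` primitives are inlined at compile time, which is part of why the ratio is hard to judge from decompiled code.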


Have you ever looked at a class named ObjectFactoryFacadeCollection and thought to yourself, "oh boy, this part will be fun to read?"

On one hand there's the complexity of the problem you're solving. On the other hand, there's incidental complexity. The ObjectFactoryFacadeCollection class squarely falls into the incidental complexity category. In other words, the moment you are writing a class of that sort, you have stopped working on solving the problem you set out to solve -- you're solving a problem that was invented by your tools, design, or limits of your understanding.

Rich Hickey gave an extremely good talk about trying to avoid this kind of incidental complexity: http://www.infoq.com/presentations/Simple-Made-Easy .


This isn't necessarily true. Sometimes you do in fact need these types of abstractions; that's why they've been made into patterns. The trick is not to use one before it's necessary. The mere existence of such a class doesn't imply overengineered code.


Yes, of course you do sometimes need these types of abstractions, but you seem to have missed my point: they are a factor of incidental complexity. To restate, they are not at all inherent to the problem you are trying to solve. They are inherent to the tools with which you are solving the problem.

For instance, if your problem is calculating the trajectory of a projectile, a solution certainly exists that does not involve anything at all like an ObjectFactoryFacadeCollection. However, certain solutions involving unnecessarily complex abstractions could conceivably require one. This is incidental complexity. On the other hand, all solutions will require some information about the projectile's velocity, gravity, and so forth. This is complexity that is inherent to the problem itself.
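A sketch of that point: the whole problem fits in one function whose parameters are exactly the physics of the situation (assuming constant gravity and no drag), with nothing resembling an ObjectFactoryFacadeCollection in sight.

```java
// A solution that stays at the level of the problem itself:
// velocity, gravity, time. All complexity here is inherent.
public class Trajectory {
    // Position after t seconds, given initial velocity (vx, vy) and gravity g.
    static double[] positionAt(double vx, double vy, double g, double t) {
        return new double[] { vx * t, vy * t - 0.5 * g * t * t };
    }

    public static void main(String[] args) {
        double[] p = positionAt(10.0, 20.0, 9.81, 2.0);
        System.out.printf("x=%.2f y=%.2f%n", p[0], p[1]);
    }
}
```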


> Philosophically, I agree. I view premature abstraction in the same light as premature optimization.

It's usually easier to optimize later, since optimizations are often just a matter of taking sections of code independently and making them quicker. There is the whole 90/10 rule (or whatever it's called) that says it's better to highly optimize a few sections of bottleneck code rather than the whole thing.

Trying to retrofit an abstraction to a piece of code is almost always a horrible experience fraught with mess and compromise.


> Trying to retrofit an abstraction to a piece of code is almost always a horrible experience fraught with mess and compromise.

Yes, and unless the problem is trivial or your experience in the domain is such that your foresight borders on the clairvoyant, this is guaranteed to happen. No matter how much (or little) design you do up front.

The key is to recognize the right time to stop and refactor, so as to keep the pain that comes with learning the problem space to a minimum.


Premature abstractions can have similar issues: unless you have more than 2 cases you don't necessarily know what your abstraction should look like. As the cases pile up you find yourself increasingly shoehorning implementations into abstractions that don't quite abstract correctly.


> Trying to retrofit an abstraction to a piece of code is almost always a horrible experience fraught with mess and compromise.

It is amazing to me that our experiences are so different: I have found the exact opposite of this statement to be true. The only way that I've ever come up with a good abstraction is by starting with something concrete (preferably two or more instances) and factoring out the commonality. Retrofitting a piece of code to an abstraction that was designed in a vacuum tends to be an exercise in frustration, due to the abstraction being shortsighted and insufficiently suited to the problem space.


A well-abstracted program can be easier to optimize because modules are more loosely connected and their internal implementations can be replaced/optimized without breaking the rest of the code.
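A small sketch of that idea (names invented): callers depend only on a tiny interface, so a linear-scan implementation can later be swapped for a hashed one without touching any of them.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Loose coupling: the interface hides the implementation, so the
// internals can be optimized without breaking the rest of the code.
public class SwapImpl {
    interface Membership {
        void add(int x);
        boolean contains(int x);
    }

    // Original implementation: linear scan, O(n) lookups.
    static class ListMembership implements Membership {
        private final List<Integer> items = new ArrayList<>();
        public void add(int x) { items.add(x); }
        public boolean contains(int x) { return items.contains(x); }
    }

    // Drop-in optimization: hashing, O(1) average lookups.
    static class SetMembership implements Membership {
        private final Set<Integer> items = new HashSet<>();
        public void add(int x) { items.add(x); }
        public boolean contains(int x) { return items.contains(x); }
    }

    // Caller code never changes when the implementation is swapped.
    static boolean demo(Membership m) {
        m.add(42);
        return m.contains(42);
    }

    public static void main(String[] args) {
        System.out.println(demo(new ListMembership())); // true
        System.out.println(demo(new SetMembership()));  // true
    }
}
```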


ObjectFactoryFacadeCollection eh?

I know a lot of people who'd code like this, saying that it makes their code more testable etc., but they couldn't show me any test code.


I disagree. Abstraction is fundamental to programming. The more abstractions, the better. I'm not talking about design patterns here, but abstractions that hold explanatory power in your problem space. They necessarily increase code comprehension, reduce potential bugs, etc.

Abstractions reduce potential bugs by reducing the 'interaction-space' of a particular entity in your code. Think about a program that has 100 variables all in one function. That is on the order of 100^2 potential pairwise interactions between entities in your code. When you make a change, you have to reason about all of those interactions to be sure you're not introducing a new bug.

Abstractions greatly reduce this space. If instead you have 10 objects that each contain 10 variables, within each object you have on the order of 10^2 interactions to reason about, and in the main function that ties the objects together another 10^2. That is an order of magnitude easier than the original problem.

The more (natural) abstractions, the better your code.


> The more abstractions, the better.

No. A hundred times no. If you have ever had to make sense of a complex program that was over-engineered with unnecessarily complex abstractions, you cannot possibly think that this is true.

> Think about a program that has 100 variables all in one function [...]

This isn't an example of code that is not abstract enough, it's an example of a basic failure to understand the principles of writing a program meant to be read by other humans. Sure, breaking that code up into understandable chunks is a form of abstraction, but it's not exactly the kind of abstraction that the grandparent was talking about. She was talking about "extremely elaborate object frameworks." I maintain that elaborate object frameworks are a bad thing, unless they are "barely enough to get the job done." Anything beyond that adds unnecessary complexity.

This beautiful and poignant quote sums things up much better than I ever could:

“Perfection is achieved not when there is nothing left to add, but when there is nothing left to take away” – Antoine de Saint-Exupery


He may have been referring to framework-level abstractions, but I was specifically referring to natural abstractions for the problem space. Blanket statements against abstractions in code miss the point--abstractions are fundamental to programming. "Elaborate object frameworks" could mean a few different things. If you're referring to design-pattern-type structures, then I somewhat agree that the fewer the better. But in the general case it is not true that perfection means not being able to take anything more away. Not when humans are the ones writing and maintaining the code.


Abstractions reflect a person's understanding of the problem space. Iterative game development is exploratory. There is a trade-off between redundant code and ease of local modification without worrying about global impact.


> > The abstraction is just barely enough to get the job done

> Or, in other words, the perfect amount...

... until your first hire


In my experience as a person who has been hired before, I can say that I would absolutely prefer well-factored, but rather concrete code over towering abstractions. Sure, if something is a general feature, it should be abstract. There's nothing more frustrating, though, than wading through layers of abstractions, only to find that hidden behind them is a singular concrete implementation.


You shouldn't optimize a personal project for your first hire. Do the simplest thing that works, and then abstract/iterate.


> Based on what I've read from decompiled Minecraft, I'd guess Notch is self-taught. The abstraction is just barely enough to get the job done

How can you tell from decompiled, optimized code? Not saying you can't, just interested. I haven't read much decompiled code.


The compiler can't really change things like the class hierarchy, and a lot of the more interesting optimizations are done at runtime.


Right. If you're coming from a C background, you'd be amazed how much information from the source is preserved in a Java .class file. To my knowledge, C++ turns into a binary blob, and a lot of magic can happen along the way. But Java turns into a slightly more machine-readable version of Java. A lot of the structure in the code is actually directly meaningful to the JVM. All the details of an object -- its variable names and types, its methods and their signatures, the class hierarchy, even sometimes the line number in the original source code -- are still there. The preservation is so complete that it has been said all Java programs are essentially open source (the sort of thing people would normally say about, say, JavaScript).

Now, last time I played with Minecraft, it had been run through an obfuscator as well, so some of that doesn't apply. In particular, the variable and method names have been reduced to gibberish, and I don't know what sort of monkey business might have occurred around inlining constants.

But macro structure doesn't change. In Java, given the way .class files work, it really can't change. And that's where a lot of the abstraction in a project lives. The class hierarchy is still there. Use of interfaces is still there. How you organize data, how you manufacture objects, which function calls which function, it's all still there. Even little things, like whether your functions return objects, enumerations, or magic numbers, is unchanged.

Even in decompiled code, even right at the start, you can blur your eyes, and at a glance you'll see either a lot of little functions or a few big ones, a lot of little classes or a few big ones, a lot of inheritance relationships through abstract classes or a few simple ones.


I'd kinda like this kind of evaluation on the code I write for work.


I once spent about nine months creating a largish program, and then moved on to another job. I spent about a day educating the fellow who would be carrying on the maintenance. I barely knew him.

About a year later, we met again. I still didn't know him, but he sure felt that he knew me. I remember he greeted me with, "I love your sense of humor", and "If you really want to know someone, you should work on their code for a year." No doubt he knew me as a programmer much better than I know myself.

Talk to someone who's maintained your code. :)


Code reviews are a great way to get feedback on your work. Like open-source, knowing your code will be seen by co-workers is a strong personal incentive to produce better work.

If your work does not do reviews now, you can still ask a co-worker to look over your code changes. Many people would be flattered that you would ask for their programming wisdom. ;)


Can you point to resources/books where I can read more about design as it is taught in CS degrees? I'm curious about their problem-solving strategies compared to mine.


>The abstraction is just barely enough to get the job done, and the algorithms are decidedly homebrew.

Wow... actually your comment is EXTREMELY critical, and it's very dishonest to claim at the end that it's not a criticism.

I'm just curious, what software have you written, and how many millions of users does it have?


"Does anybody know if Notch is self taught? I thought he had a CS Degree from somewhere?"

The two are not mutually exclusive.

I was talking with a friend recently who's working on her Ph.D, observing that everyone who thought I was so smart in college is now better educated, officially, than me. She was like "You're basically self-taught, but with a piece of paper that says you were willing to stick around in college for 4 years. Even in college you were self-taught."

One of my teammates at work recently said I should become a professor. I was like "Don't I need a Ph.D for that?" All my other teammates were like "Naw, man, visiting lecturer!"

If you're actually self-taught, you can treat formal education options as a menu that you might or might not choose to sample, depending on your goals at the moment. You don't have to define yourself as one or the other.


This is interesting. When I studied CS at university, I'd say there were two different types of people who got high marks.

There were people who were self-taught: either before they started CS, or, once they learned some programming at university, they identified other areas outside the course that interested them where they could apply their new programming skills.

These people typically got jobs in the software industry after graduation and became software developers.

The other group of people were people who were just generally high achievers and learned enough programming to pass the course with good marks but nothing much else. They got equally high marks because they were good at passing exams.

Most of them either retrained for a career in finance, went into academia, or got management gigs at tech consultancy companies.

I can't think of anybody I know who is a working programmer who is not self taught to at least some degree.


> They got equally high marks because they were good at passing exams.

I don't understand how this makes sense, unless being "good at passing exams" means "cheating". Can anybody explain? I hear this said so often, and I usually chalk it up to the speaker rationalizing his own poor scores.

My CS exams were always hard, and the only way to "get good" at passing them was to learn the material.


Different types of learning. I got very high grades in all my courses during college, but a number of them were due to the fact that I am really good at cramming and figuring out what to study and what to ignore. That's what it means to be good at passing exams.

Most of my CS courses I actually spent the time to fully understand and internalize the material. That was the material I truly learned.


But when you write an exam, clues to your mental state are all throughout the material you write on the exam, and that mental state includes everything you know. Someone who's "good at passing exams" can extract clues to your mental state from the wording of the questions and figure out what kind of answer you want. On multiple-choice exams, they only need to come up with 2 bits of information about which answer to choose, and the proposed answers themselves give additional clues to the exam-writer's state of mind. 2 bits doesn't feel like mastery; it feels like an educated guess.

It depends on the subject area, of course, and the skill of the exam writer. But it's very common to be able to pass multiple-choice exams without knowing anything at all about the material.

Knowing the material, of course, helps you come up with the right answer --- but it also helps you a lot with "reading" the exam writer. And you don't necessarily need to know a whole lot about the material to get an advantage that way.

It's also often possible to get acceptable marks on exams by parroting rather than deep comprehension.

I think my test-taking skills were usually worth one to two letter grades' worth on exams when I was in school. I could usually get a D or C on exams where I should have gotten an F, and an A on exams where I should have gotten a B or a C. A little while back, I got 97% correct on the ai-class final exam without having learned more than half of the material. (In that case, though, I think the test also failed to cover most of the material.)

I think non-multiple-choice math exams are probably the hardest to "fake out" this way.

There are other people whose test-taking "skills" actually have a negative effect on their scores. First, they study the material in their bedroom or at the kitchen table, rather than the classroom, unnecessarily impairing their recall when the exam comes. Then, they show up to the exam exhausted and sleep-deprived from cramming all night, damaging their ability to think creatively or tolerate stress, and then they have an extreme stress response from the test-taking situation, further handicapping their ability to think. It's easy to imagine that someone like that could fail a test I'd get an A on, with the same level of knowledge.


"Someone who's "good at passing exams" can extract clues to your mental state from the wording of the questions and figure out what kind of answer you want."

I actually tried an experiment on this when I was in high school. I took the AP Comparative Government without ever having taken the course, or really having any sort of academic exposure to it (hey, it was free with the purchase of the AP US Government, and I was taking the day off from school anyway for the latter test). My only knowledge consisted of what I read in the newspapers, plus half an hour with a test prep booklet at breakfast that morning, plus whatever I could glean from the test questions themselves.

I scored a 3 on it. Not a great score, but passing. Pretty good, actually, considering that the test involved writing 4 essays on a subject I knew nothing about. So I figure perhaps 50% of the outcome of a test is knowing the material and the other 50% is test-taking skills.

Ironically, though, I think that the skill of extracting subtle clues to the mental state of the people around you, and the answers they expect, is far more valuable than any subject matter you learn in school. It's absolutely essential if you work in an organization, so you can understand who the decision makers are, what their priorities are, and what will really impress them without them having to tell you anything. It's absolutely essential if you manage people, so that you understand why they're working for you and what will motivate them to do their best work. And it's absolutely essential if you choose to strike out on your own and be an entrepreneur, because that's how you tell what customers want. They're generally indifferent to you and often have no clue what they actually want; there's no way you'll get them to tell you.


In my experience, some people are better at forcing themselves to be interested in things.


Interesting... if true, what this could mean is that virtually no-one who is taught programming at school ends up enjoying it. You have to be one of the ones who seeks it out before you are required to learn it, otherwise you're not the kind of person who would tolerate it as a job?


I don't think that's really true. I knew someone in college who had never touched a compiler before her sophomore-year intro CS course. She ended up graduating with high honors, went to work at MIT Lincoln Lab for a few years, and is now doing a Ph.D in CS.

I know several other people here at Google that went into college studying things completely different from CS, took a couple courses in it, and discovered they loved it.

I think the real determiner is what happens after college, after the academic support net is taken away. The people who go on to become really great programmers use it as a springboard to start seeking out information on their own. You can always tell who these are once you hire them, because they'll ask you several questions about the system to orient themselves, get some code & documentation pointers from you, and then go about their merry business learning everything they can, including tons of stuff you didn't tell them (and often, didn't know yourself). The mediocre ones learn just enough to accomplish the task at hand and then ask you again as soon as they have to do something new.


I have similar issues. For most of my adult life I've had so many interests that it is just impossible to keep up with all of them to the level I want. Even within computing as a field I am overloaded. I read a lot of stuff on HN and elsewhere that I find interesting, and I agree that it makes me feel like I should understand it all. This is the first time I've come to that realization.

My current solution to the problem of overwhelming interests is starting grad school in CS; by going through a program, I am forced to focus on certain things by outside forces, thus saving my sanity.


Your other option is to become a consultant, if you can stand it. This is meant both seriously and as a joke.


I already am one, kind of.. I work for these guys: www.mitre.org

My day job is broad enough that I get to sate most of my professional desires in one way or another. Now I have grad school to explore some topics in depth and expand my education. It's working out very well so far.


This is my solution as well; I hope to start in a few months. Really looking forward to it.


> Does anybody know if Notch is self taught? I thought he had a CS Degree from somewhere?

I have a CS degree, and I'm 99% self-taught.


That's an interesting dichotomy. I have a CS degree from a well-respected school, but as far as programming goes, I would say I'm at least 90% self-taught. I guess I assumed that all programmers are essentially self-taught, regardless of educational background.


There is more knowledge than you can ever learn. Try to focus your learning on what helps you solve your clients' problems. You'll become an expert in only a thing or two, and you'll be able to talk about it like the people here on HN.

Also, the smart comment writer is not representative of the average HN visitor/population. There are always people who are experts in their niches.


This is true. However, at some point I want to move my career in a different direction, and most of what I learn on the job is industry-specific, and the industry itself is not something that hugely interests me (although I have not decided for sure what I want to do). So I am trying to learn as much fundamental stuff as possible.

The issue is deciding how much you should know about something before you confidently put it on your resume.

For example, I would say that I am an OK Java programmer, but I avoid the enterprise frameworks commonly used in many companies, so there are many parts of the language I am not familiar with simply because I have never had cause to use them. For instance, I do almost all of my persistence through a database and an ORM, so I almost never use Java's concurrency locking features in the wild; I do all my locking in the DB.
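To make that distinction concrete, here's a minimal sketch (class, method, and table names are all hypothetical, not from the comment above) of the application-side locking the commenter rarely touches, with the rough database-side equivalent shown as a comment:

```java
import java.util.concurrent.locks.ReentrantLock;

public class LockSketch {
    private final ReentrantLock lock = new ReentrantLock();
    private long balance = 0;

    // Application-side locking: the lock lives in the JVM, so it only
    // serializes threads within this one process.
    public void deposit(long amount) {
        lock.lock();
        try {
            balance += amount;
        } finally {
            lock.unlock();
        }
    }

    public long balance() {
        return balance;
    }

    // The DB-side alternative the commenter describes would instead let the
    // database serialize writers with pessimistic row locking, e.g.:
    //   SELECT balance FROM accounts WHERE id = ? FOR UPDATE;
    //   UPDATE accounts SET balance = balance + ? WHERE id = ?;
    // which also works across multiple application processes.

    public static void main(String[] args) throws InterruptedException {
        LockSketch acct = new LockSketch();
        Thread t1 = new Thread(() -> { for (int i = 0; i < 1000; i++) acct.deposit(1); });
        Thread t2 = new Thread(() -> { for (int i = 0; i < 1000; i++) acct.deposit(1); });
        t1.start(); t2.start();
        t1.join(); t2.join();
        System.out.println(acct.balance());
    }
}
```

The trade-off is visibility: a `ReentrantLock` protects only one JVM, while `SELECT ... FOR UPDATE` protects the row no matter how many app servers are writing to it, which is why pushing locking into the DB is common in ORM-heavy codebases.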

The same goes for functional programming: there is no reason to learn it for my job, but I get the feeling it will become more important as time goes on, so I should know something about it.

There is a lot of stuff on HN with people saying that everyone should have implemented a toy compiler at some point, and if you haven't then you're not a serious programmer; or perhaps it is a toy OS, etc.


HN is a good place to go if you begin to believe that you are truly awesome at coding and know it all. It brings you back to earth with a thud.


