> The grief factor of learning to code is on a different scale to every other major. One missing semicolon will take your whole tower down, and you realise this in the first day of practical exercises.
I would say that this basically screams "badly taught programming". A missing semicolon does not take down your work; it just needs to be put back in, and the IDE should help you find it.
"Even if you are of the opinion that CS is math, and coding doesn't come into it, you will hit a coding wall early on."
I mean it seriously: programming is easier than math. I say that as someone who always liked math.
"And that's just what you consider code. If you put in the wrong compiler flags, it breaks. If you can't link because something needed rebuilding, it breaks. Want to avoid it? Learn how make works. Huge number of options, which you won't understand, because you're a novice."
This is not something a novice should be dealing with.
"Oh and learn git, too. And Linux. Just so you can hand in the homework."
As I said: badly taught programming that takes in people who know nothing and then proceeds as if they had already learned this material in the past.
> I would say that this basically screams "badly taught programming". A missing semicolon does not take down your work; it just needs to be put back in, and the IDE should help you find it.
The point is that the incidental complexity is there, and that it's something that has to be dealt with, to even begin to get to the fundamental part of the problem.
That you have to be using the right IDE (and learn how to use it, and learn to recognize its attempts at helping you with the problem) does not refute the point; it's a perfect illustration of it.
CS isn't harder. CS has high incidental complexity, the crap you have to deal with just to be able to work on the problem within a real world environment.
And yet I had zero problem learning Logo when I was like 10 years old. Neither did _most_ of the other 10-year-olds in the class. They didn't all take to it as much as I did, but the majority of the class was not stopped by the frustration of syntax errors.
If you can teach programming to 10-year-olds without it being too frustrating... on the other hand, I suppose you could argue that younger people will have an _easier_ time, for the same reasons (human) language acquisition is easier for children. I dunno.
I'm not sure how it relates, but what I think was _really_ an immense aid to people of any age learning programming in the 80s was that computers and software were _simpler_, meaning that you could very quickly approach the level (at least at first glance) of 'real' software with what today seems like 'toy' software. The text adventure games I started writing as a child weren't actually _fun_ for anyone to play but me, but they _looked_ like a real text adventure game that other people played, they had the same form. Or the ascii-graphic blackjack game. There was gratification with a pretty low bar.
I have no idea how I'd become a programmer today. It's an entirely different path. People ask me for advice, and in my head, I'm like, well, first be a child in the 80s with an Apple IIe, then start writing for the web when all you need to know is HTML and you write it in Perl cgi-bin... I have no useful advice.
Yeah, I tried to teach myself programming in high school (mid-2000s) because I wanted to try to get into building games. I was overwhelmed by the complexity of even setting up the tools. I figured I wasn't cut out for it. Then in college I decided to take an introductory programming class that involved writing simple C programs, and I was hooked. The prof did a good job of minimizing the accidental complexities of programming.
That is the thing: you should not be dealing with that when you are seeing a for loop for the first time. You should not be anywhere near compiler flags. The IDE is something you should start with; text editor + compiler is the harder way. The first class should start with opening an IDE and writing something really simple.
Like someone mentioned Logo or the old BASICs - if kids start in that environment they pick it up quite easily. They pick up the concepts (variables, the fact that it does not matter how they are named, conditions, for loops, functions) and there is instant gratification and little complexity. Just playing those games with a little robot and arrows that "program" it, or a simple scripting language in some game, already makes a big difference.
Then you can move on to a compiler and Java or C - because the basics are out of the way and it is time to deal with what you call incidental complexity. Basically, one concept at a time. It is a bit like teaching quadratic equations to people who were never taught linear ones, nor much arithmetic.
Logo is great and all, but we're not talking about beginning programming for kids, we're talking about CS degrees.
At some point between Logo and bachelor's degree, you run face-first into that complexity, which is the "wall" the commenter atop this thread was referring to.
> Logo is great and all, but we're not talking about beginning programming for kids, we're talking about CS degrees.
I'd be willing to bet you could create a CS course using nothing but Logo - and it would probably be better for students than what currently exists.
In his book "Mindstorms", Papert discusses the faulty idea that Logo is only for children, and that it isn't a language that can be used to develop complex software.
The fact is, it can. In fact, Logo is pretty close to Lisp in its functionality.
The most complex piece of Logo I've ever seen was a few decades ago in an issue of the Rainbow Magazine (for the TRS-80 Color Computer); it was essentially a game of Monopoly, with full graphics (and probably sound too); I'm not sure if it did any kind of file i/o - but I know Logo supports all of that and more (especially current versions of Logo).
Seriously - if you think Logo is only for kids, you're missing out on a very fascinating language.
> Logo is pretty close to Lisp in its functionality.
Is it? Quote from the Logo Manual, 1974:
All complete LOGO statements are imperatives, so that an operation cannot stand by itself. If you type SUM 17 26, Logo responds with the error message YOU DON'T SAY WHAT TO DO WITH 43. In contrast PRINT SUM 17 26 is a complete statement. The computer will print 43.
Yes, but many students are basically at the Logo level at that point. They struggle with the basics. Then there is a huge sudden jump to compiler flags, which you call the "wall". But that is exactly what I call bad teaching. There is no reason for that to be the wall; all you need is to not teach them everything at the same time in lesson one. This is combined with a culture which tends to attribute easily learnable skills to "natural talent" - so students conclude "I am not that talented" and go somewhere where they think they are.
I mean, you can teach concepts one at a time and still be demanding and require a certain speed. I am not saying it should be easy. You should, however, separate those concepts from each other and teach them one by one. Otherwise you are just relying on students having "picked up" the material randomly, which is not an expectation other majors have (like physics or math or biology).
In early college I once spent 10 hours in the computer lab with a broken program over a missing semicolon. The compiler kept telling me the problem was something completely different, in an entirely different part of the code, so I kept my focus there, but in reality it was just a missing semicolon much higher up. I got so frustrated by that (and a couple of other struggles, along with some extremely shitty and unhelpful professors) that I figured I must be in the wrong degree. I ended up taking English classes for a semester before dropping out entirely, and went to work in a warehouse for two years, not doing any programming whatsoever.
Eventually some people got me back into Flash development (I had dabbled before and had a couple sort of popular things on Newgrounds, and they were asking me for help). I then made a couple of games that were super popular and realized I probably wasn't in the wrong field after all, and eventually went back to college and finished my degree. It was easier the second time around, and I had much better teachers.
Now I've been in the industry for about 10 years and have worked on all sorts of different types and sizes of programs and learned and used dozens of languages and technologies.
That was back in the 90s when I had that semicolon problem, and I'm pretty sure compilers have improved since then, because nowadays I usually see compilers catch missing semicolons pretty much exactly where the problem happens. Also, there was no Stack Overflow back then, most forums weren't terribly helpful either, and programming blogs were a lot less common and comprehensive.
I remember specifically that I had difficulty wrapping my head around the concept of linked lists, and there seemed to be nowhere to find that information besides my professor, a real jerk who said, during his office hours, the specific time he's supposed to be available to help students, that if I didn't understand it during his lecture he couldn't help me, and two old books in the school library that each had a few paragraphs about it.
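For what it's worth, the concept that was so hard to find material on fits in a few lines once someone shows it to you. Here's a minimal sketch in C; the names (`cell`, `push`, `length`) are mine, not from any particular textbook:

```c
#include <stdlib.h>

/* A singly linked list: each cell holds a value and a pointer to the
 * next cell; NULL marks the end of the chain. */
struct cell {
    int value;
    struct cell *next;
};

/* Prepend a value; returns the new head (or NULL if allocation fails). */
struct cell *push(struct cell *head, int value) {
    struct cell *c = malloc(sizeof *c);
    if (c == NULL)
        return NULL;
    c->value = value;
    c->next = head;  /* the old head becomes the second cell */
    return c;
}

/* Walk the chain, counting cells until we hit the NULL terminator. */
size_t length(const struct cell *head) {
    size_t n = 0;
    for (; head != NULL; head = head->next)
        n++;
    return n;
}
```

That's the whole idea those library books spent a few paragraphs on: a chain of pointers you follow until you hit NULL.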
> That was back in the 90s when I had that semicolon problem, and I'm pretty sure compilers have improved since then, because nowadays I usually see compilers catch missing semicolons pretty much exactly where the problem happens.
Yeah - that's been replaced by C++ compilers b0rking hard on errors in variable templates and such (and leaving you just as confused as to where the problem really is).
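For anyone who never hit this particular wall: the classic shape of the problem can be reproduced in a few lines of C. This is a hedged sketch of my own, not from any comment above; delete the marked semicolon and many compilers blame the declaration below the struct rather than the struct itself.

```c
#include <stddef.h>

/* Delete the semicolon after this closing brace and many compilers
 * report the error at the *next* declaration (something like "two or
 * more data types in declaration specifiers"), not here where the
 * mistake actually is. */
struct node {
    int value;
    struct node *next;
};  /* <- the load-bearing semicolon */

/* With the semicolon missing above, this innocent function is the
 * line the compiler complains about. */
int first_value(const struct node *n) {
    return n != NULL ? n->value : -1;
}
```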
I took a short course (1 credit hour) in Java when it was new enough that there were no IDEs. IMO, limiting courses to established languages is kind of a waste.
I also did Java as my first serious programming language with no IDE... though this was in the early 2000s, in AP CS in high school. But it was good to get to know the basics through the most basic form of it: javac/java on the CLI, and a plain basic text editor.
On the bleeding edge you're going to have to deal with things like semicolon placement at compile time yourself. I am not saying writing code on a test with pencil and paper should be the normal approach, just that using an IDE 100% of the time is hardly necessary.
I'd agree that starting with a plain old text editor is probably the best approach. Remove as much of the magic as possible when you're first getting started, and then add it in once they understand the basics.
I learned Java in high school as my first language, and I struggled to get the IDE properly set up - it definitely slowed down my understanding.
Given the gender skew amongst high-school programmers, this is almost equivalent to saying "women should not be starting a CS degree". It also further entrenches the advantages of people who come from affluent backgrounds. Given the success of some short coding bootcamps, it isn't inconceivable that novices could be brought up to speed within the first semester.
We don't expect law students to have been amateur lawyers in high school; it's a bonus if they have been in the debate society or have taken a personal interest in the subject, but it's not necessary. I don't see why CS should be a special case, aside from a misplaced attitude of exclusionary elitism.
> It also further entrenches the advantages of people who come from affluent backgrounds.
If you can afford a video game console, you can afford a raspberry pi (or equivalent.)
> We don't expect law students to have been amateur lawyers in high school
No, but we expect math majors to have done some math before they get to college.
I don't disagree with you that most people can be brought up to speed in less than a year, but it's also not unreasonable to expect first year CS students to have had some programming experience. Computers don't cost thousands of dollars anymore.
Math majors are expected to know high school math. You can know exactly that and succeed. If you wish to get ahead of the competition, you know exactly what to do and exactly what is measured. CS does not have such clarity of expectations. I knew the basics because of journals my father bought. Since I knew the basics, a teacher showed me some competitions - from then on I just went with the flow. Had my father not bought those journals, or had my teacher not shown me those competitions, I would not have studied CS, or would have had a much harder time.
If you are a boy, a video game console is often bought for you whether you really want it or not. A Raspberry Pi is likely not, and you don't know it exists. And even if you know it exists, you need a computer to upload code to it, and then you could have just used that computer to code in the first place. I know that it is easy to learn to code when you know where to look, but the biggest hurdle is that many kids don't know where to look. Instead, they are told there is something magical and hard about it.
I remember being told that it is hard, or that I can't compete because I am (presumably) just learning while other kids "already know a lot". It was BS, but that is where things stand for many kids.
Seriously, I knew straight-A students who were more hardworking than me and had good grades in math (meaning no dummies memorizing stuff) who were under the impression that I have some special brain because I can program. All the myths around CS tend to reinforce such nonsense.
> A Raspberry Pi is likely not, and you don't know it exists.
If a kid can't type "how do computers work" in google, they probably won't become a programmer. Are we really wringing our hands over that?
> All the myths around CS tend to reinforce such nonsense.
There's another bit of nonsense around CS that a lot of people here believe - that there are all these secretly good programmers out there who just don't know they're programmers because no one ever told them how to start, and that they might not even know what programming is because it's so difficult to get started.
I find it nearly impossible to believe that there are all these smart people out there, surrounded by technology, with hacking and computers in every TV show, with the President telling them to "learn how to code", Bill Gates' name dropped by rappers, Google in everyones' pocket - and no one is typing "how does my smartphone/videogame/internet work" and reading the results?
Like, who is still walking around like, "I got good grades in school and knew people could become doctors, but I didn't know people could make computers go! Why didn't anyone tell me?!"
Are there other problems keeping people from breaking into the industry? Absolutely. Is "not knowing where to look" one of them? There's no way that it can be, not anymore.
You're also missing the social aspect, particularly for girls. Depending on your personality, friendship groups, culture and family, it's often not socially acceptable to spend your free time as a teenager hacking away at stuff. You forget that those of us who are happy to spend those years in our bedrooms hacking away are weird. If you're a geeky boy you can get away with it, and some folk may even encourage it. If you're a girl, not so much.
The idea that we block off CS as a subject for only those who've taught themselves some programming is absurd and elitist. While we may expect Math students to know Math before Uni, it so happens that 99% of high schools teach the level of Math we expect them to know. In a lot of countries it's compulsory. Same goes with most of the sciences, we expect the level that schools will teach. Unfortunately most schools don't teach programming, and those that do often don't do it well.
> If a kid can't type "how do computers work" in google, they probably won't become a programmer. Are we really wringing our hands over that?
If a kid can't type "how does the human body work" into Google, they probably won't become a biologist. Somehow, despite being constantly surrounded by all those bodies and dogs and grass, they did not learn biology. Except that they come to college and do learn it. But realistically, "how do computers work" is such a broad question that it has nothing to do with anything; it is literally irrelevant to anything practical.
But that is literally the culture I am talking about - the conviction that if you don't already know the stuff and weren't interested in the past, you are not talented, full stop, don't even try. Math teachers and chemistry teachers don't assume that; if you are curious now, they are happy to have you learning now.
> that there are all these secretly good programmers out there who just don't know they're programmers because no one ever told them how to start,
They are not secretly good programmers; that takes more than just aptitude. But yeah, a student with a good memory and basic math aptitude has the genetics for programming. There is nothing special about us. The hardest part for a beginner is figuring out what makes sense to learn and what doesn't.
> and that they might not even know what programming is because it's so difficult to get started.
Well, it is difficult when you don't know what to do and get told you probably don't have the aptitude the first time you struggle with something. It is easy if you learn from simple concepts up, either because you ran into the right game, or into a good teacher or book.
Exactly like any other learnable skill - math, chemistry, law, physics and so on.
> "with hacking and computers in every TV show"
That has nothing to do with reality. I see sword fights in many movies too.
> "and no one is typing "how does my smartphone/videogame/internet work" and reading the results?"
What does that question have to do with building software? Like real-world software, with real-world homework that suddenly, out of nowhere, expects you to have skills that were not taught previously.
What does that question have to do with algorithms? Programming languages? Whether a student would be more attracted to the problem-solving side, the building-things side, the theory side, or the software-engineering side, your "how does a videogame work" is largely irrelevant. Plus, the answer will be a high-level overview of memory, the CPU, and such.
> Are there other problems keeping people from breaking into the industry? Absolutely. Is "not knowing where to look" one of them? There's no way that it can be, not anymore.
You did not suggest a single practical place for a beginner to learn. Only a few very general questions that do not necessarily lead to programming; most of them would lead to electronics at best. Such good resources do exist, but none of your suggestions lead to them.
"Breaking into the industry" is far away from "starting with programming". But then again, that is part of the nonsense around this. I know people who found a job with very little knowledge at a small company just because they looked confident and the hiring manager was inexperienced. A few of them even became good programmers, although they created a huge mess on the way there and had to rely on politics a lot. But somehow, an honest student with good results in pretty much everything else is assumed unable to learn the same, because he is less good at pretending.
> What does that question have to do with building software? Like real-world software, with real-world homework that suddenly, out of nowhere, expects you to have skills that were not taught previously.
You're way beyond the scope of the original discussion.
Someone suggested people would drop out of CS due to the grief factor of - for example - breakage due to missing semicolons.
Another person then suggests that maybe it's a good idea that CS majors have some experience programming before they enter the program. Note: some experience - no one said you need to have built a "real-world" team project before going into a CS program.
Then all hell breaks loose because, apparently, getting online and reading a couple tutorials and screwing around in a browser-based programming sandbox (there are tons), is something only wealthy people with college educated parents can do.
> Another person then suggests that maybe it's a good idea that CS majors have some experience programming before they enter the program.
Why not offer two tracks? People who have never programmed before have to take a 6 month intro to computers course, that people who know how to program can skip. The people who skip the course can then either use the extra time to take other courses or graduate earlier.
Honestly, I think the only prerequisite for a CS major is an interest in it. I never had any programming classes in high school, and I switched my major 3 times before landing on CS. I had never programmed before in my life, but it was interesting to me, and I found out I had an affinity for it. I'm just saying that this claim that you need to have some background is not true.
A video game console has a known and proven entertainment value. The whole family can use it. It's a reason for people to come over to your house. It's a status symbol.
Beyond that, I know a lot of poor kids who don't have video game consoles, and there literally isn't $50 in the budget for something that may or may not produce any value (not to mention the monitor, the peripherals, and taking up vast amounts of time on the family computer -- if one exists -- to figure out how to use it).
I have extended family living in rural Utah who don't even have a computer -- they use their phones for the Internet. You can debate back and forth about the cost of a phone vs. the cost of a cheap desktop, but people need phones to function in society, for better or worse. Try figuring out what to do with a bottom-tier smartphone, a limited data plan, and a Raspberry Pi that just arrived in the mail.
I volunteer at the Boston Museum of Science, working in the Tech Studio/engineering department, often showing off the latest "engineering toys" -- Little Bits, Raspberry Pi, Scratch on an iPad that controls a Lego robot... Rich kid parents ask "Oh, where can I get this?" It's not a big deal for them to drop a couple hundred bucks on flavor-of-the-week programming toys. Poor kid parents are often interested and enjoy playing around with it at the Museum, but never ask about getting one themselves. It doesn't even occur to them.
Beyond THAT, there are a lot of majors that don't cost a lot of money. You can use the argument "X doesn't cost much to learn -- why haven't you done X before on your own time?" to apply to anyone.
You can always point to someone who can't afford it. That doesn't make me wrong.
There are now computers that cost as little as taking the family out to dinner at McDonalds. Probably a hundred million more people (yes, however, not every single person, ok) have the means to learn programming compared to when I was a kid.
Sounds like a great opportunity for local libraries to become more relevant (and/or for a nonprofit org to supply them). I know many local libraries supply at least a few computers for internet browsing or whatnot. Why not a pi or two, or pocketCHIPs to loan out?
> If you can afford a video game console, you can afford a raspberry pi (or equivalent.)
Right. This type of democratization is EXACTLY what allowed me to climb the social ladder out of the rural midwest into a Top 15 university and then Silicon Valley. As a teenager, all I needed to teach myself coding and advanced math was free time and an Internet connection.
>Given the gender skew amongst high-school programmers, this is almost equivalent to saying "women should not be starting a CS degree"
What makes you bring up gender over race or class? There are more middle-class Asian and White women in the field than lower-income African-American and Hispanic men.
The following sentence was about household income. I'm not entirely sure that race is a meaningful factor after correcting for socioeconomic status. Gender is demonstrably the biggest skew in CS majors.
> We don't expect law students to have been amateur lawyers in high school; it's a bonus if they have been in the debate society or have taken a personal interest in the subject, but it's not necessary. I don't see why CS should be a special case...
Agreed, and when it comes to CS, programming is the easiest part. People who fail to see this are probably the ones who only know programming and no CS.
If you've applied for a CS degree and not done any programming, though, why (outside of financial incentive) would a university want you? Surely it shows you have no interest in the field, and that has to correlate strongly with failure. Going from nothing to writing bubble sorts in something like Python shouldn't take more than a couple of weekends for someone of the caliber to study CS at university.
Say you apply to study marine biology but have never been to the seaside then sure, you might do fine, but really I'd be wondering why you weren't taking a gap year and doing some self-directed study at a beach (hey, that sounds fun!).
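To put a number on "a couple of weekends": the bubble sort in question is about a dozen lines. A sketch in C, since that's the language most of this thread argues about (a Python version would be shorter still):

```c
#include <stddef.h>

/* Bubble sort: repeatedly sweep the array, swapping adjacent
 * out-of-order pairs. Each sweep floats the largest remaining
 * element to the end, so the sorted suffix grows by one per pass. */
void bubble_sort(int *a, size_t n) {
    for (size_t end = n; end > 1; end--) {
        for (size_t i = 0; i + 1 < end; i++) {
            if (a[i] > a[i + 1]) {
                int tmp = a[i];
                a[i] = a[i + 1];
                a[i + 1] = tmp;
            }
        }
    }
}
```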
Maybe because in the USA, you don't "apply for a degree" at the undergraduate level? You apply to be at the university full stop, or at most an honors program.
So, you do modules or somesuch; presumably they have pre-requisites that work in largely the same way? Like for non high-school subjects "read chapters 1-4 of 'intro to $subject' and have a grade C or above on $exam"?
How do you manage expectations for courses/modules/whatever from both sides (teacher:student) without such things.
You simply declare a major/concentration and register for classes in the appropriate departments. If you get acceptable grades in those classes you are allowed to continue.
Isn't "declaring a major" almost synonymous with applying for a degree, presumably the declared major in this case would be "computer science", so requiring some programming knowledge for those declaring that major would be possible?
In theory, can one easily go to a US university based on relatively unrelated abilities - let's say history and cooking? - and leave with a degree in Electronic Engineering? If there are no subject-based pre-reqs, why does one need to declare a major, or is that done later?
My UK degree was modular so I could have done something quite similar, but I was assessed for entry based on abilities pertinent to a nominated degree (not the one I ended up with as it happens). Modules still had [relatively loose, unassessed] pre-reqs such as reading particular books or understanding specific concepts.
By the time you have declared a major, you will have taken the prerequisite courses already. Some programming knowledge is required, but that can be acquired while at university, not before.
>Given the gender skew amongst high-school programmers, this is almost equivalent to saying "women should not be starting a CS degree".
Somewhat related to your point, letting novice programmers get CS degrees doesn't fix the problem, because there is still a performance gap, which results in a hiring gap. If there is a problem with demographics having unequal experiences, then while colleges can be tweaked to keep the inequality from growing, it is far better to fix it where it comes from.
It's far simpler to say than do, but our society needs to find why the gender skew in high school programmers happens and stop it.
Poppycock. I studied CS at UVA in the late-90s and had no previous programming experience. I was an experienced computer user, which wasn't a given in the 90s, but I hadn't done any programming. There wasn't anything in the curriculum that required previous experience.
Linux wasn't required for entry-level courses. At the time, they were taught using C++ and a Borland IDE on Windows.
Similarly, source control wasn't introduced immediately. I'm pretty sure CS101 assignments were turned in on paper, in addition to email.
And the department offered many options for lab time, office hours, and tutoring. I worked as an undergraduate TA for 2.5 years, which basically meant holding lab hours 3-4 times/week for 2-3 hours at a time, in addition to grading assignments and exams.
I also started a CS program in '95, with practically no previous programming experience. (10 PRINT "HENRIK IS THE BEST", 20 GOTO 10 doesn't count)
One thing our university did, though, was that the introductory course was in Scheme, which evened the playing field immensely, because most of the kids who could program were self-taught in Pascal or C, and were pretty stumped when confronted with a functional programming language.
But my university's approach didn't help the abysmal graduation numbers either; there were so many classmates who dropped out during the first year. A bunch dropped out because the programme was an engineering programme, and they failed the math parts but did well on the CS parts. Most of those switched to other programmes that were more pure CS and were successful.
But there were a lot of students who just lacked that elusive thing that makes a person a good programmer. There have been a lot of studies, and a lot of previous discussions on HN of those studies, but the jury is still out; we have no idea how to effectively screen people for programming ability. The only thing we can do is toss them into an education and see if it sticks. The original article asks:
> Isn’t it reasonable to expect that people with an aptitude for math, physics, and engineering will also have an aptitude for CS?
And the answer is a resounding no.
Funnily enough, this whole thing ties into the problems of recruiting good programmers, another HN staple topic. We can't tell if someone will be a good programmer before an education, and we can't even quickly tell if someone is a good programmer after an education, or even after years of working in the industry! If there was a quick way of identifying good programmers, we wouldn't be in this mess.
Agreed. Apart from understanding logic, complex math is such a small part of development. I would say programming requires the overlap of knack for math and a pure creative interest like painting. That's much, much rarer.
During the dot boom, there were plenty of CS grads, but after the dot bomb and the market dried up, people lost interest. Also, the pay and incentives aren't nearly as good as they were during the dot boom.
Add off shoring, constantly pushing down wages, and the fact that most US businesses make programming miserable, that's probably most of the big picture.
> A bunch dropped out because the programme was an engineering programme, and they failed the math parts
There are solutions to this problem as well.
JMU and GMU (high quality, but not top-tier, state schools) both offer pure CS degrees, but also a spectrum of multidisciplinary programs in "IT". One of my current summer interns is wrapping up a degree in info security, with a heavy dose of programming. The other intern is a straight-up CS major. Both appear to have the programming chops to join the workforce as typical application developers.
FWIW I started my CS degree in 2001. My first CS class assumed I knew how to navigate a *nix system (I forget which one). I was also expected to already know at least one programming or scripting language of my choice that could be used on the school's servers. That included but was not necessarily limited to: Perl, Java, C/C++. Oh, and I also needed to know Matlab.
I'd say that's a massive flaw in the curriculum. They've effectively written off the overwhelming majority of new college students. Outside of a high-school level CS course, where would one even encounter a *nix system (yes, I'm ignoring the dedicated geeks who spin up their own system for fun)?
Their goal was probably not to maximize the number of CS graduates. FWIW my university had a well respected 2nd tier CS program. By that I mean just short of MIT/Georgia Tech/etc.
My university, around the same time, offered a 2-week, 0-credit intro to Unix course that everybody who hadn't worked with *nix systems before was expected to take. All courses then assumed that everybody knew everything taught in that course.
I was there the same time you were, and I thought I came in knowing a lot, but I was dead wrong. I had to learn a ton of things from the ground up like everyone else.
God forbid someone goes to college to learn something.
Of course, in my experience coming in to a CS degree as a techie didn't actually teach me how to make a living writing code. Most of the useful stuff I had to learn on my own time. The connections I made at school helped, but the value of the actual classes was low.
So perhaps a CS degree isn't intended to teach anything to anyone.
To the degree that colleges have their curricula set up so that you need programming experience to enroll in the CS degree, that's on them. It's not true of essentially any other STEM major, or indeed of most degrees. (There are a few exceptions. Lack of fluency in English would make it hard to get through many top humanities degrees. And many music, etc., degrees would be tough or impossible without prior exposure.)
However, in general, the idea that you need to already be a programmer pre-college to major in CS is somewhere between wrong-headed and toxic.
I think the founders of what now is CS would be lumped in with the very, very bright but still novice programmers. Most of the field has nothing to do with actual hardware anyway.
In my day, at my alma mater, non-novice programmers had a reputation for dropping out because they were bored by the introductory material in the first year of the program.