Please don't attempt to operationalize this data by saying "I know PHP. If I learn Ruby, then I can get a $5,000 raise!" I talk to more engineers than is healthy, and many of them think that languages are the primary differentiator in salary (or some other metric of interest in their career). They're very mistaken. Far more salient are negotiating skill, location, what company you work for, where you're assigned in that company (profit center or cost center, favored project or division or not, and so on), and ability to use computer code as leverage to accomplish things that capitalism cares about.
Also, the dynamic range among e.g. Ruby programmers of my acquaintance goes from something like $2k a month to $2k a day, not primarily based on skill with Ruby. (If I were comfortable describing the individuals at the high end some people might say "That's cheating! They're getting compensated for non-Ruby things which happen to be expressed in Ruby programs." I call that winning rather than cheating but to each their own.)
ability to use computer code as leverage to accomplish things that capitalism cares about
For the people that ultimately pay the bills, this is the only thing that matters. They don't give a shit what technology you know or what a nice guy you are. They just want it done. To them, the ability to understand their domain, communicate effectively, and get things done is far more important than how you do it. I regularly hear from people who say, "I need x fixed/working/solved. We use y. If you don't know y, then figure it out."
Languages and technologies are only important to CTOs and recruiters. The point when you get beyond them and start talking to their bosses is when your career will really take off, not when you learn a "better language".
If that was the only thing that mattered, how do some consultants make 3x others, with exactly the same performance or worse?
While the people who pay the bills think that's the only thing that matters, your ability to confidently market yourself as providing perceived value plays a big part in your payment or salary negotiations, regardless of the actual value you provide.
Unfortunately I, like many others in our field, suck at this. Strangely, I've always found it easier to market the work I do than to learn to market my own value. I can't even write a short bio without feeling like I'm stepping out of line.
In my talks with programmers, a lot of the disconnect is the perception that there's "legitimate work" on one hand and "marketing" on the other. While one may understand the importance of marketing at an intellectual level, the technical mind may still have an aversion to actually doing it.
My suggestion: replace the word "market" with "communicate" in your thinking. So:
"your ability to confidently communicate your value plays a big part in your payment or salary negotiations"
So it's not marketing, it's kind of like writing docs for end users. Still not coding, but a little further from "marketing". A small change, but it may give some mileage.
I feel like the term "marketing" is often used by not-so-great communicators to fill a gap of understanding of how value is communicated by those who are doing good business as a result.
I think the same goes for companies. E.g. a user-friendly phone, tablet, or computer, or maybe just a very usable website, is not more expensive because of 'marketing', but because it does a better job communicating with the end user, thus creating a better perception of value for the customer.
The biggest personal shift for me came when I realised that marketing is not about me, my experience and the marketing copy on my website. It's about giving things of value to other people. Maximise the value you give to others and the number of people you give it to, and the "marketing" takes care of itself.
Languages and technologies are only important to CTOs and recruiters.
Even at the CTO level, languages and technologies are more of a strategic business decision than a raw technical decision. Questions like:
* can we hire enough programmers of quality in this language?
* is there sufficient support available for these technologies?
* is the technology sufficiently mature?
Selection of a core technology can't be purely a "wow, cool" decision at that level, although it may factor into what kind of programmers you are likely to get (e.g. see pg's old article "The Python Paradox" http://www.paulgraham.com/pypar.html)
This is key. If you think you'll want to or have to hire an 800-person staff, then you're going to lose developer quality and will be better served by picking languages which are already common on the market.
[...] people that ultimately pay the bills [...] don't give a shit what technology you know
Of course they do. Imagine trying to convince the CTO of a 20-strong company to switch a project with 1M+ LoC, written over the course of a couple of years, to another programming language better suited to the business task at hand.
I'm not sure I'd agree. By and large employers exist in language silos and will only hire devs with experience in their specific language, which means that there are distinct markets for different languages. That effect is amplified by the fact that entire sectors focus on specific languages.
So for example Java in London tends to be pretty well paid because all the investment banks hire for it heavily, which has also pushed up Java salaries in non-banking roles.
The best PHP dev with great negotiating skill probably still makes less than an average Java dev at a bank.
There are two factors that determine how much money you can make in a job such as programming:
- How much money are you making for your employer
- What costs does your employer incur to allow you to make that money for him
If there's a large and growing gap between the two, you can expect a raise, even if the company as a whole isn't doing too well. If there's a small or shrinking gap, you'll be looking for another job soon. If the gap is negative, you're probably the owner.
The only thing I'd add to this is that there exist organisations in the world that can afford to hire a team of developers (or, as it were, execute long-term contracts with a consultancy at an aggregated $10k+ day rate) indefinitely with little or no function that couldn't be achieved with Wordpress 1.0, including the security vulnerabilities.
In the worst case, I... I mean my made-up friend Bob... has worked on teams that seemed to have no other purpose than to execute the new idea that week, with little or no connection to any actual business value.
Bob has seen this happen too many times for it to be a random occurrence. According to Bob, such engagements/teams are typically funded by the taxpayer or upper management at companies that have more cash than God. Then I told Bob to shut up, because he's in breach of all the NDAs and sometimes people read things on Hacker News.
Very good point. It's not the language, it's how you market what you're doing with it.
What would be interesting in this survey is slicing it by industry and years of experience. Do Python programmers in Financial Services make more or less than C++ if they both have 3 years of experience? Which languages show tapering of salary after a certain amount of experience?
You were claiming the hiring market was on fire, that being fizzbuzz-capable was about enough to get meetings with hiring directors, and that your clients were begging you to find them an engineer.
During that time I was spending a great deal of time applying to jobs, going to tech meetups and interviewing, and had very little success despite a track record of successfully taking on challenges in the business world and demonstrable junior-level programming skills. I asked you to back up your claims and put me in touch with any of these clients dying for an engineer, as you might remember.
Now—about a year later—I've achieved the goals I had at that time. I'm working in a purely technical role building a large site that many, many people use. My compensation is at least in the top decile of both programmers in the US and residents of the city where I live. And one thing that was key to me getting a flood of offers from both large companies and start-ups a few months ago was developing domain-expertise in JavaScript and becoming proficient in several frameworks. Putting up projects on github and contributing to others played a role, too.
There is a high rate of unemployment in the US right now. As a result, many, many people are trying to break into the software world. For new graduates, there's a bit of an open door via internships but it's not at all easy in the general case. I've seen many talented people struggle for months to break in and then suddenly get a flood of offers. There's clearly a good degree of herd behavior involved.
Without the specific framework skills that I had, companies would have had a very long list of applicants to consider before getting to me. And both understanding the frameworks and doing well in the technical interviews required some fairly deep understanding of the language. Yes, the company you work for and the team you're working on have a great deal to do with your compensation. But both of those things are heavily influenced by how skilled you are in what technologies, especially for typical engineering roles. Knowing the right stack can open doors for networking around the front-door, too!
>They're getting compensated for non-Ruby things which happen to be expressed in Ruby programs." I call that winning rather than cheating but to each their own.
I wouldn't call that cheating but I would call it a poor anecdote to generalize from. Some people with a legal background translate contracts and earn a great deal of money for it, but that fact doesn't speak much to the market for translators as a whole or show there's no appreciable difference in career trajectories for translators of various languages.
I am very very impressed by your posts over a year ago, and by your recent achievements - as someone who hustled his way from redundant to twice my original salary in a similar time I sympathise with the effort.
I would say that perhaps both of you are missing one important component - specialisation. Giving off the right signals for a particular niche enables you to credibly sell yourself in that niche (be it email-lifecycle products or AngularJS coding).
I read your post from a year ago - you mentioned three interviews, Ruby, ObjC and JS. I would guess that today you would not dream of going to the first two - simply because your strengths lie in JS.
You have specialised - and as such become more valuable. But you could have specialised in many things other than language and still gotten good jobs. A specialist commands more money, due to scarcity and perceived increase in the probability of success.
However I still think patio11 is also right - of two equally qualified specialists, one that is better at marketing and promotion and negotiation is going to do drastically better.
But no matter how good Ramit Sethi is on negotiation, if he goes for a C++ job he is not going to beat out a specialist even if the specialist just says yes to the first offer.
(Little assumption there about Ramit's C++ skillz)
If your argument here is that knowing Javascript earns you more money than knowing no programming language in depth at all, I don't think anyone disagrees with you.
You could however have picked Ruby, PHP, Python, Haskell, Java, C(++), Objective-C (holy crap, Objective-C), Scala, Clojure etc to deep-dive on and would have been just as employable.
To convince me otherwise, you'd have to make the case that all of my professional connections are conspiring to deceive me or that I'm experiencing hallucinations on a near-daily basis and need professional help.
Yes and no. The particular job offers I got all relied on JavaScript, but of course there are a lot of great positions that require other languages. I do think that JavaScript was the optimal choice. There is such a small number of people who have experience with the MEAN stack compared to LAMP / Java / RoR that it makes it quite a bit easier for an outsider to break in through unconventional means.
Objective C would have been very strong as well. Many companies of all sizes, including my current one, use it. It would probably have taken a successful app on the app store or a stronger resume in order to get the kind of interest I did from top tier companies. Ruby would have led to openings, but probably wouldn't have done much for me in my interviews with Google. Java is used all over the place but from what I've seen, few junior devs without a degree break into the better work. Haskell would have been absolutely horrible--I'd have been competing with freakishly smart people for a small number of low-paying jobs!
The beauty of JavaScript is:
1) It's in the front end of every web stack, so nearly everyone needs it.
2) The MEAN stack (which includes Node and Angular) is a fast rising platform with few people competent in it. Even knowing "old" frameworks like Backbone can lead to a lot of opportunities.
3) In general, there is a bit less conservative of a culture among JavaScript programmers. Java/C/Python shops are more likely to care about pedigree and formal credentials.
veering offtopic, but how the hell do you remember a conversation you had with a guy two years ago on a website? Can you give me some tips on getting my mind to that steel-trap state?
Exercise! It does wonders and can protect you from a lot of the fallout from stress, obesity, aging and all kinds of other things.
http://bit.ly/12aKL6p
And salad. I don't have a link for it but I think salad is huge.
There are tons of other things to do but I don't generally do them. I eat about 4000kcal a day, often go out drinking with my friends, don't sleep enough, and I'm often a workaholic. But I exercise and I eat vegetables.
What are the distributions of the salaries within each language? Mean values alone are completely useless; give us some information about how you found the salaries out, what their standard deviations are, etc.
I don't want to imply maliciousness here, so I'm going to call this an example of unknowingly bad statistics (which, sadly, seems to be the default in most places). Bonus points for the bar (!) chart that starts at 90,000 and looks like ActionScript is twice as profitable as the rest of the languages.
He found the salaries out using rapleaf... which didn't know the salaries but was making an educated guess. (You can try out their api if you register an account, FWIW it was way off for the people I've tried it on.)
With data like that, I'd imagine the only sort of data you could trust would be the mean, since I'm guessing it's much harder to be wrong about relative income levels between large groups than it is to be right about actual income numbers.
The only sort of data you can trust is the data itself. Providing the mean of data is providing a summary value a person thinks is useful to convey the shape of the data, and the mean in particular is unsuitable at conveying shape on its own.
What the author tries to get at is a precise quantity, namely the average salary for each language. However, we cannot measure that quantity directly, so taking samples (from a good source) is as good as it gets.
The easiest assumption about that data is that it follows a normal distribution, which does have a mean, but also a standard deviation; both of these are required to talk about any meaning of the result. (If the normal distribution is not a good fit things may get more complicated.)
What I'm saying is that providing just means is not a ballpark way of doing statistics; it is wrong. The statistical ballpark is the normal distribution, and maybe a value ensuring that it is at least somewhat appropriate to use that one.
I was trying to say something along the lines of "once you've played with rapleaf, you won't care what the stdev is", because rapleaf's data isn't good.
The standard deviation here isn't going to be the standard deviation of programmer salaries -- it will be the standard deviation of Rapleaf's estimates. Which, as I previously mentioned, are horrible, and not particularly interesting.
I agree with the author that Haskell is estimated at a lower rate because it's academic. I think the reason we see higher salaries for older languages here is that rapleaf must be using age as an input (which makes sense in some fields).
> The only sort of data you can trust is the data itself.
Agree. Again, here there is no data. It's gibbs sampled noise from shitty models.
> Providing the mean of data is providing a summary value a person thinks is useful to convey the shape of the data, and the mean in particular is unsuitable at conveying shape on its own.
I don't think anyone here is confused about the question of the mean showing the shape of data.
> The easiest assumption about that data is that it follows a normal distribution
Salary most certainly does not follow a normal distribution. I don't understand how you came to write what you wrote without knowing that. It's a classic example of the exponential distribution.
(To be fair, salary estimates from rapleaf do seem to follow a normal distribution :p maybe you can guess why? [hint, it's a phrase I used earlier that starts with "gibbs sampled noise from" and ends with "shitty models"])
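The mean-versus-median gap for skewed data is easy to demonstrate. A stdlib-only sketch (made-up numbers, not real salary data) comparing an exponential sample against a symmetric normal one with the same mean:

```python
import random
import statistics

# Illustrative sketch: for a right-skewed distribution like the exponential,
# the mean sits well above the median, so reporting the mean alone
# overstates the "typical" value. Numbers here are made up.
random.seed(42)

# Exponential "salaries" with a mean of ~$80,000
skewed = [random.expovariate(1 / 80_000) for _ in range(100_000)]

# Symmetric (normal) sample with the same mean, for comparison
symmetric = [random.gauss(80_000, 10_000) for _ in range(100_000)]

print(statistics.mean(skewed), statistics.median(skewed))        # mean well above median
print(statistics.mean(symmetric), statistics.median(symmetric))  # mean ~ median
```

For the exponential distribution the median is mean × ln 2, roughly 69% of the mean, which is why a mean-only summary of skewed salary data reads high.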
So forgive the statistical ignorance, but how would one display mean and std dev in useful graphical form? (Asking seriously, as it would be good to know which setting to press in gnuplot.)
Not quite the same, but box-and-whisker plots (https://en.wikipedia.org/wiki/Box_plot) are good for visualising distributions. They show the median, minimum, maximum and the first and third quartiles.
Since they show the minimum and maximum we can judge the overall spread, the median tells us...well...the median, and the first and third quartiles let us judge how closely the data is clustered around it.
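The five-number summary behind a box plot can be computed with the standard library alone. A sketch with made-up numbers (in thousands):

```python
import statistics

# Hypothetical salary sample in $k (made-up numbers)
data = [52, 58, 61, 64, 70, 75, 82, 90, 110]

# statistics.quantiles with n=4 returns the three quartile cut points;
# the middle one is the median.
q1, q2, q3 = statistics.quantiles(data, n=4)

summary = {
    "min": min(data),
    "q1": q1,
    "median": q2,
    "q3": q3,
    "max": max(data),
}
print(summary)
```

These five numbers are exactly what the box (q1 to q3, with a line at the median) and whiskers (min to max, in the simplest variant) of a box plot draw.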
These histograms are lacking in that they don't actually have the mean line or std devs on the plot itself; my general point is that if the goal is to get across a distribution, a histogram is the best way to do it, perhaps aided by markers for mean, median, std dev, etc., as the case merits.
They're typically used for confidence intervals or standard error, but you can theoretically use them with any measure of variability as long as you're clear.
Without even looking at potential issues with the source data, it is also the case that most programmers of any calibre work across multiple languages. Also, languages which are 'older' and reaching obscurity (i.e. fallen out of favour) are going to be known by older programmers who, if they are still in their career, probably command higher salaries only as a factor of age and experience. Finally, salary before tax is not very representative of income or social circumstance, and people in our industry often opt for time flexibility or remote work in preference to higher salaries on purpose (for example, after having children). All in all, not useful.
> Bonus points for the bar (!) chart that starts at 90,000 and looks like ActionScript is twice as profitable as the rest of the languages.
When I make charts like this, I always make the value axis start at zero. I find it confusing to look at a chart and not immediately know the percentage difference without using a calculator. With the axis at 0 you can immediately tell the proportional differences.
It's like looking at the intra-day variations of a stock's price, and thinking "Whoa! What happened here?!". Then you zoom out and it's a completely straight line.
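The distortion from a truncated axis is easy to quantify. A tiny sketch with made-up salary figures:

```python
# Sketch of how a truncated axis exaggerates differences (made-up numbers).
# Two bars at $100k and $110k, drawn on an axis starting at $90k.
axis_start = 90_000
a, b = 100_000, 110_000

actual_ratio = b / a                                  # b is only 10% higher than a
apparent_ratio = (b - axis_start) / (a - axis_start)  # but its bar is drawn twice as long

print(actual_ratio, apparent_ratio)
```

A 10% difference rendered as a 2x difference in bar length is exactly the ActionScript effect in the chart under discussion.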
The bar chart is straight from Google docs, which just auto-scaled however it saw fit. I tried to present the numbers first to be clear about the absolute range.
ActionScript has died out in many places, but still has heavy use - drumroll - in the Federal Government sector. I was working in a shop recently with a guy who ONLY knew ActionScript/Flex... pathetic, but his salary sure wasn't.
Those incomes look like fairly typical mid-level individual salaries (in the DC area, at least). I would expect the household numbers to be higher than listed to account for those that have working spouses.
Yeah, those are definitely all possible. I would have preferred to use personal income but it's not available via the API, and I figured that using household income was better than nothing.
Without knowing where the users are based, this data alone is pretty meaningless. Where I'm from in South West England pretty much all the programming jobs are Java (hence why I left), and you are lucky to get £40k. Spain and Portugal have even lower salaries. I'm guessing this is US based, but even then, I'm sure there is a lot of variation. Working in London as a Java developer in finance you could easily get over £100k.
The only downside is that the prices of imported goods (most notably electronics) and software are usually higher than in the US for example. Services are much cheaper of course.
I'll second this. I've yet to ever see any PHP job short of architect paying that amount here (most are 70k or less), same with Java and Perl. I'd like to know where this was collected from... I need to move there.
I feel the opposite. I can't get anyone that knows puppet for under $110k in the Boston area. All these salaries seem low for Boston, and probably the bay area as well.
Most commercial Haskell jobs are in finance -- where you don't get to contribute code to github. That leaves the PhD students and open source folks, shifting the data sideways.
I can kinda understand why XSLT is one of the most expensive (on an intuitive level - as quchen said, it's not clear how the data was obtained for this post). Certain parts of it are a complete brainfuck. I worked on a small open-source project of mine once, where I decided to use XSLT to translate XML to HTML. I didn't know much XSLT, so I started reading docs and eventually managed to accomplish my goal. However, after struggling with it for a while, I realized it sometimes makes no sense at all. Or, to be more fair, it takes a rather different logic to understand it than I usually apply to programming languages... because, well, it's not a programming language.
I only wonder, where is XSLT usually used? Can anyone outline the most typical types of projects and environments?
I worked with XSLT for a few years a while ago; it's functional, not procedural, and that's the brainfuck you need to get your head around. It was also missing what seemed like basic functionality, like mutable variables.
We used to automatically convert SQL queries into XML using the AS column naming (like SELECT p.Id, p.Name, k.Id [kids\kid\@Id], k.Name [kids\kid\Name] FROM Person p JOIN Kids k ON k.FatherId = p.Id).
Then we could transform server-side to send down the HTML as well as do client-side updates in javascript by sending XML and then running the XSLT client-side to do partial updates (you can specify which part of the XSLT you want to run).
Worked pretty well, as at the time most browsers ran XSLT much faster than they ran Javascript.
I even got a pretty funky pivot table working in it that could handle 10,000s of rows when in javascript it would die after 1,000 (we're talking IE6/7 era here).
There's actually a Daily WTF article where someone posted Skechers.com's use of XSLT as a WTF, and a lot of people chimed in in the comments that they did this and it was actually really good and not a WTF at all:
Note the featured comment was a response from the lead dev!
As for what situations in industry, our use was pretty uncommon as you can see from the WTF, but it's often used to transform XML in one format to another so they can get different systems actually talking to each other.
I just had an exam about XSLT, XML Schema, etc... this morning. My take-away is this. Pick any scripting language (python, ruby, php...) + templating engine and you'll avoid the World Of Pain that is XSLT.
XSLT is kind of 'declarative', but it has ifs, switches, ... They probably didn't start out with the plan to implement a programming language with xml tags, but they did. It's horrible.
Edit: XML Schema is horrible as well. This is how you define an element with string content, and one attribute (which is VERY basic):
XSLT is purely declarative, not just kind of. Those ifs and switches are not imperative constructs, they're evaluated functionally. xsl:if conditionally applies a template, which kind of looks imperative and control-flowy, but is really functional all the way through.
<xsl:if test="x">template</xsl:if>
is basically this in Lisp:
(if (eval x) (apply-template template) nil)
And <xsl:for-each> is equivalent to map in Lisp. It's a functional projection, not an imperative loop, even if the syntax looks like that of an imperative language.
Ahh yes. I tried writing a schema for my document (ocd here, I like things to be valid), but then I realized that the cost/benefit ratio of this process would be completely unreasonable.
If I had to do it today, I'd probably use DTD to validate the XML tag hierarchy and then use a programming language (with XPath) to validate the attribute and tag contents.
XML Schema tries to validate both the tag hierarchy and the contents, which requires data types and complicated definitions. So much overkill...
DTD and XPath are actually very nice and simple. This is the DTD equivalent of my XML Schema example above:
I use XSLT for https://emailprivacytester.com/ - The HTML part of the email that it sends is constructed from a small amount of dynamically generated XML, and a static XSLT file:
The cool thing is, I initially wrote the application in Perl. I then re-wrote it in NodeJS, but was able to re-use the same stylesheet as it's language independent.
The brainfuck is that it's purely declarative. It represents the transformation of one tree structure to another with no notion of sequential imperatives or associated concepts like variables and mutable state. This has some very distinct advantages: the execution can be very fast and parallelizable and predictable in space and time usage and secure in presenting nearly no attack surface for injections and overflows and such.
This sounds great in theory. But the problem is real world business requirements don't work that way. Real specs are never a straightforward declarative transformation. They always include bells and whistles and rhinestones that don't fit into XSLT. Loop over this set and suppress records where this field matches the previous record (state!), or include pagination, or call somebody's web service in the middle of it to display today's stock price, or grab the user's preferences for colors and time zones. (All real examples from a job I had with XSLT once upon a time.) XSLT doesn't do any of that and isn't supposed to. So you add another layer of data munging in whatever produces the XML before the XSLT operates, or hack it in Javascript afterwards, and either way now you have two problems.
Maybe XSLT skews high for income because it takes huge sums of money to get programmers to even touch the thing.
I had to struggle with XSLT to transform XML into HTML a while back. Since it's not intended to be a real programming language, I had trouble making it do the equivalent of something basic like iterating through a collection (recursion, ugh!). I can easily see how it would be a highly-paid skill, because the lack of debugging tools (again, it is not a real programming language) made me want to curse MSFT and everyone else who were pushing XSLT at the time (way back) as the Second Coming of content publishing. If you ever want a headache, try reading the XSLT spec.
Visual Studio (tested version 2010; Express probably won't do it, though) has a pretty good XSLT debugger, which made writing XSLT stylesheets a lot easier when I wrote some a few years ago.
>However, after struggling with it for a while, I realized it sometimes makes no sense at all. Or, to be more fair, it takes a rather different logic to understand it than I usually apply to programming languages... because, well, it's not a programming language.
IIRC, it actually is Turing-complete (or nearly so), and it works in the functional programming paradigm. Have done a large-ish project with it circa 2002-2004.
>I only wonder, where is XSLT usually used? Can anyone outline the most typical types of projects and environments?
Well, it's used a lot in the enterprise, where there is also a lot of XML.
You use it a lot in the structured documentation business (think hardware manufacturers, military, pharmaceutical, ...), where you publish XML documentation into several output formats like HTML, EPUB, PDF, whatever.
Because XSLT is used to put a visualization of the data together with an XML-formatted document - very convenient for B2B. You may see content in a browser, and then extract info from the same page without XSLT.
At my workplace all of the legacy projects use XSLT as the templating language for our Cocoon (Java Framework) web applications.
Complete and utter clusterfuck, let me tell you...
"Rapleaf aggregates consumer data from data providers, cleanses it and maps it to emails, and ultimately makes it accesible through our easy to use Append portal and API. We partner with dozens of large and small data companies to aggregate data that we ultimately anonymize and tie to email addresses. We source it from only legitimate data bureaus who adhere to the highest consumer privacy standards -- sources that give consumers appropriate notice and choice about sharing their information and have opted in to make their data accesible."
What? Are they claiming to know the salary of all the github users used in this study? Were the users filtered down to those who had income data available? Because if so, that's a really weird (bad) selection technique. If not, what is being done with users whose income data is not available? Also where the hell is rapleaf getting their data?? This whole thing confuses the heck out of me, someone please help.
Rapleaf's API provides income given an email address, when available. I ran the list of emails from the git commit log across that API, which only gives me data points for the users who have income data available.
Is there really any way to correct for the fact that not all the data is going to be available? It certainly doesn't exclude a sampling bias, which I did mention in the post, possibly not aggressively enough.
The actionscript programmers may be getting paid more (mostly because the Flex enterprise hangover is still lingering), but in my experience the actual number of actionscript jobs available is sharply falling. I have been doing Flash/Flex dev for about 8 years now, and while my evidence is certainly anecdotal, I have gone from having tons of contract offers swamping my inbox to one once every few months. Make no mistake, Flash and Flex IS dying. Javascript meanwhile is maturing remarkably, and it will soon be in a position where enterprise will seriously start considering it as a viable platform.
As a former actionscript programmer, this makes me cringe a little. My assumption is that actionscript is a new COBOL - tons of businesses rely on it, but now that it's looking like a dead end, the only way to fill those positions is to crank up the salary to compensate for the perceived dead space on a resume.
IMHO experience in it translates well to Javascript, but I miss some of the features - especially the type system and the token API (which was much more consistently implemented than the various JS deferred systems).
Bare salaries don't tell you a whole lot. I suspect that most of the programmers in this dataset live in places like the bay area, NYC, and Boston. The cost of living in those areas is much higher than in much of the world.
> Why are you on HN then?
Why is this relevant? Is there a reason you'd expect higher-paid programmers to not be interested in relevant news?
I live in Uruguay, and program in VB6, VB.NET, and occasionally in Forte4GL, C# or PHP, but I'd make the same programming in Java or C# (it's an average local salary for a developer).
The way to make more money, as patio11 said, is not to switch languages, but the "ability to use computer code as leverage to accomplish things that capitalism cares about".
Ouch, is that a full-time job? Is that 14K euros or dollars?
I barely manage on 24K for me and my girlfriend (made some bad decisions though).
I've heard rent and cost of living is cheaper in Poland than in Uruguay - we're still in a housing bubble here, and it's close to the most expensive country on earth for some stuff (example: cars).
Why is Uruguay's cost of living so high? Just curious! I always hear how far money goes in South America in general, so what is unique about Uruguay in that sense?
Cost of living in South America is only low if you have a salary in USD or another strong currency. Locals in most South American countries face economies with high inflation rates and salaries that don't catch up fast enough.
We still have a housing bubble, that accounts for the ridiculous rent - just across the river from Montevideo, in Buenos Aires, rent is 50% cheaper.
The economy is pretty strong at the moment, so the exchange rate for us is favorable, that means cheap imports, but relatively expensive local stuff (especially bought in USD).
However, there are stiff trade tariffs, so we can't import cheap foodstuffs, and local ones are pretty expensive - mostly because labor costs went through the roof, and we aren't that automated yet.
The left-wing government imposes taxes on everything - left-wing voters would say that most were left over from other governments, but they were added upon and expanded.
An employee is extremely expensive due to all the added social security stuff - close to 100% over the salary the worker receives, and very hard to fire. While the take-home salary is lower, a restaurant employee is probably more expensive in Uruguay than in the U.S. (and more expensive to replace)
Same for other stuff, taxi drivers, etc.. are more expensive.
Gasoline is more expensive than in Europe - basically twice as much as in the U.S. - and utilities are not subsidized as much as in other Latin American countries, and are very inefficient, so they're a lot more expensive than in the U.S. or Europe.
Clothing is taxed absurdly; there's now a 200-dollar allowance for imports, which are basically 3 to 4 times cheaper.
We also have some ridiculous bureaucracies in place that impose a huge burden on buying and selling property (housing, cars), creating huge market inefficiencies.
The differences with, say, Argentina, are that they're subsidizing heavily - while cratering their economy, since they impose ridiculous tax burdens on exports (Soy, etc..) and have protectionist barriers which are hurting capital expenditures.
Brazil is almost as expensive as Uruguay. Other countries such as Chile or Peru are a lot cheaper, because they took very different approaches (with their pros and cons, of course), and other countries have an "informal" economy where they don't care that much about nominal taxes and the legal system.
This is why many companies are getting out of the tech hubs. Costs are lower for the company and money goes further for the devs. Take Research Triangle Park in Raleigh, NC. A ton of companies have dev centers there (Cisco, EMC, NetApp, HSBC, Fidelity, Deutsche Bank, Verizon, IBM, John Deere, ConstantContact, iContact, SAS...).
As a dev your money buys 50% more and yet devs are in such demand that unemployment is under 3% in the sector. Because of this you can easily beat all the salary benchmarks in this article and have a four bedroom house too.
Perhaps my mistake was working for academia (support staff, not academic) but I make a bare fraction of that. (and for comparable work to what I've done elsewhere in industry.)
I would say location weighs heavily on average income as a programmer. For example, the average dev salary in the Midwest might be more like $50,000, with anything approaching $100,000 being very high end.
This. Salaries are based upon more than just skills and languages.
It seems like this article is biased towards California or New York salaries. I once jokingly told a SF based recruiter that $55k might not seem too much, but those aren't California dollars, which are worth less than Pittsburgh dollars. If I wanted to make six figures right now, I'd probably move.
Given the sample size here I don't think this means anything.
I've seen real-world programming jobs in the U.S.A. everywhere from $25,000 up to $250,000. Everything here is closely clustered around $100,000, and when you consider the sample size I'm not sure any of these numbers mean anything.
This is absolutely useless. Emacs Lisp? Really? XSLT? What? CSS? Woah. I love that the internet has made it easy for anyone to share their opinion; I hate when noise is presented as useful information.
After just reading the headline the first thought I had was "the comments are going to seriously question the statistical basis used" I was not disappointed.
The article lists household income, which makes it just about useless. Household income will be higher if you are married. People who are married are more likely to choose a more stable job working for a large corporation. Such jobs require languages more like Java, and less like Haskell. Such people might also have less spare time to play around with new languages due to family engagements, or a dozen other differences, all masked by this chart.
±10% is almost nothing... OTOH, maybe I could do XSLT, if I just took some nausea pills...
PS: I actually think XSLT is a brilliant application of some cool ideas - putting them to work in (e.g.) grammar compatibility. It's just that, like XSD, it uses XML itself as the syntax (e.g. having to write i &lt; 10, pls shoot me) and lacks helpful conventions (e.g. the empty stylesheet should be the identity transform, which you can then tweak).
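For reference, the identity transform mentioned above has to be spelled out explicitly rather than being the default - it copies every node and attribute through unchanged, and you then add templates to override just the nodes you want to change:

```xml
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Copy everything as-is; override specific nodes with more templates. -->
  <xsl:template match="@*|node()">
    <xsl:copy>
      <xsl:apply-templates select="@*|node()"/>
    </xsl:copy>
  </xsl:template>
</xsl:stylesheet>
```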
This is something that many well-paid "XSLT developers" probably figured out early on: develop your XSLT using ANYTHING except XSLT. ...and then they go on to create the "actual" programs to vomit the crapwads of excessive markup, and mystify those who made their own unfortunate beds and chose to consume this... this... XSLT.
Remember as patio11 says the most important and valuable skill is not making money for your employer but being able to enable them (and eventually other companies, as your own entrepreneurial soldier of fortune) to make more money.
Being a very versatile web developer, I never understood why (good) PHP programming gets so little credit in the profession, whereas to me it appears to be much more demanding to write a decent piece of software using PHP than with Ruby, Python, or even ASP/.NET. Those languages have so many easy tools, great debugging and all that, with the natural counterpart of offering less performance. PHP strikes me more as the C language of web development: easy to learn, but harder to apply decently on a day-to-day basis. So I wonder if the companies out there just value other languages more because they tend to mistake the "simplicity" of a language for the quality of the developer using it. To me, making a good website using Rails is child's play compared to making one using PHP, framework or not...
Supply and demand and perception is pretty much what it boils down to. There are a lot of PHP apps, and a lot of terribly written PHP apps. Add that companies hiring for such positions have a plethora of cost-effective (read: dirt cheap) options and plenty of developers who will work for such wages, and you have a language that, while popular, is not well respected - and those looking for more money and a more versatile skill set seek greener pastures.
It is because of Adobe AIR, which is a cross-platform runtime. Contrary to popular public opinion, Flash and Flex are actually a stunningly beautiful platform to code in. It's a very, very mature platform, with an excellent community and amazing libraries. (As a matter of fact, the creator of AngularJS wrote a blog post citing some of Flex's features as his inspiration.)
It's just that Adobe keeps screwing up their runtimes. Flash and AIR both have performance issues, which is why they are starting to get left behind now.
For enterprise software, it's cheaper and more optimal to have a single codebase that works across platforms and in return take a slight hit on performance.
I agree flash/flex is a great platform to develop on. From the language, to the IDE, to the API. It's like a strongly typed javascript with XAML done right. You can create a great looking app that shares the same code base between the web, iPhone and Android.
I've done some ActionScript freelancing back in the days. The demand for ActionScript programmers is actually rising, because everyone moved away from the Flash platform, but someone must maintain the legacy code and do the jobs Flash is best suited for.
1. Web and mobile games. I see Adobe AIR getting popular as a cross-platform game dev library, plus Starling, Feathers, etc...
2. Flex dashboards and reports in banks (e.g. Morgan Stanley).
Lolwut? Are these for two-person households where each person is in the same language, or only for people working at quant funds? About 40% of my social circle is made up of programmers and no one makes close to these amounts. 100% anecdotal, but I think we are all being shafted ;)
Some languages heavily used in finance (and not much else) haven't been mentioned, and would likely fare pretty well. I'm thinking of K, and of OCaml. OCaml in particular is used at Jane Street, which is known for paying extremely well.
You should control for geographical differences in pay and language distribution. California likely has more Ruby programmers-- the higher cost of living and higher salaries there may explain the entire difference between Ruby and PHP.
That's because Cobol programmers probably make an order of magnitude greater income than on this chart. The only people who still write Cobol from my understanding are people who continuously update banking mainframe software. Government mainframes, etc. Usually these days you are apprenticed by someone with years of experience because of how difficult it is to obtain old documentation etc.
Yeah, that's why I mentioned it. Back in college, right before the year 2000, we had a special Cobol course they added just to get people some Y2K jobs. I recall one guy who still does Cobol most of his time, working on old software like traffic light systems (!)
I thought Puppet was a collection of Ruby scripts, not a programming language in the same sense as the others listed.
To put it another way, can you "program" in Puppet without first having Ruby installed?
On a side note, I am always wary of the statistics Github gives for each repository where they estimate the percentage of code from different languages used (e.g., 90% Python, 10% C, etc.). In your opinion, how accurate do you think these are?
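For what it's worth, GitHub computes those percentages with its Linguist library, which (roughly speaking) attributes bytes of code to languages and reports each language's share of the total. A toy sketch of that idea in Python - the extension map and function name here are hypothetical, and far cruder than Linguist's real heuristics, which also handle ambiguous extensions and skip vendored code:

```python
import os
from collections import Counter

# Hypothetical, tiny extension->language map for illustration only.
EXT_LANG = {".py": "Python", ".c": "C", ".js": "JavaScript"}

def language_breakdown(root):
    """Return {language: percent of total code bytes} for files under root."""
    bytes_per_lang = Counter()
    for dirpath, _, files in os.walk(root):
        for name in files:
            lang = EXT_LANG.get(os.path.splitext(name)[1])
            if lang:
                bytes_per_lang[lang] += os.path.getsize(os.path.join(dirpath, name))
    total = sum(bytes_per_lang.values()) or 1  # avoid division by zero
    return {lang: 100 * n / total for lang, n in bytes_per_lang.items()}
```

Because the unit is bytes, a repo with one huge generated JavaScript file can show up as "90% JavaScript" even if all the real work is in Python - which is one reason to be wary of those numbers.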
Puppet has its own DSL. It's easy to learn, but it is difficult to use well. As a Puppet 'programmer' I can generally see the difference between an experienced user and an inexperienced one.
This all said, to extract a lot of power from Puppet you also have to script in Ruby, since adding functions not included in the Puppet core DSL requires writing your own Ruby.
Do you need Ruby installed to run Puppet programs?
The answer if I'm not mistaken is yes.
Personally I would be afraid to use Puppet as a substitute for my own configuration scripts. With the latter, I'm required to know how things work at the shell level (and I use a small POSIX-like shell, not bash). But is this true with Puppet? I note it currently has some open security issues. But hey, if you are paid a competitive salary to write Puppet scripts, then there's little reason to learn more than how to do that.
I imagine Puppet probably saves you heaps of time.
Indeed. Puppet was created in Ruby and thus requires it to function.
Puppet is generally meant to be a replacement for configuration scripts. Essentially your Puppet scripts define a 'state' that you want the server to be in, thus if you want a user account called 'bob' you'd do something like this:
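A minimal sketch of such a resource (attribute values illustrative):

```puppet
user { 'bob':
  ensure     => present,   # create the account if it doesn't exist
  managehome => true,      # also create /home/bob
  shell      => '/bin/bash',
}
```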
From this Puppet would do all the 'difficult' bits for you.
You mention that if you get paid to write Puppet scripts you wouldn't need to know how things work at the shell level. That is extremely far from the truth. You need a strong background and experience with shell commands, configuration layouts, shell/CLI methodology, etc. Without those you would quickly be lost both in terms of managing and dealing with Puppet and scripting in it.
Knowing Puppet is no replacement for shell knowledge, unless the extent of what you need to do with Puppet is launching your Vagrant instances. Honestly, there are things which would require extensive Ruby-based functions to build out in Puppet, and sometimes you really just need to fall back to using exec {} statements in Puppet (meaning you're running shell commands from within Puppet). You can see some examples of this in the zookeeper module I built: https://github.com/justicel/puppet-zookeeper
Feel free to message me with Puppet questions and really, if you're just using Shell scripts I highly, highly recommend moving toward configuration-management (even if it's not Puppet), unless you only have one or two servers to manage.
Perhaps my time has been well spent mastering the shell at the expense of not learning Ruby and Puppet.
I have a need for keeping things small (too small for Ruby) so if I do need to use scripting languages other than sh for configuration, I'm planning to use Lua.
Here's a question for you since you mentioned Vagrant: I'm interested in hosting that allows me to run my preferred bootloader that reliably boots my DomU kernels instead of the PyGrub hack that AWS uses. I can deploy my systems as Xen kernels, as filesystem images that have a Dom0 that can run my DomU kernels, whatever. I can also run limited purpose kernels in userspace so I am not necessarily restricted to Xen as a means of "virtualization".
I want the AWS convenience of creating and launching an instance remotely rather than having to visit a datacenter. But I do not want anything to do with grub.
All my configuration is in the DomU kernel's embedded filesystem. When I change configs I simply edit the kernel source and recompile. My kernels recompile very quickly. I don't use kernels provided by third parties.
I'll probably have to build my own "cloud" to get things the way I want. Big itch; relentless scratching. Easy? No. But so often that's how useful software comes into existence: because someone personally finds the status quo inadequate.
It looks like the most data points are found in the middle of the field. The extremes at both ends are underrepresented (except Java). That is a common finding, because the probability of deviation from a “true norm” would be highest in the smallest sample.
The big question is: how significantly do these data points deviate from the null hypothesis of "all programming languages average the same pay"?
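One way to check that null hypothesis without distributional assumptions is a permutation test on two language groups; a minimal sketch (function name and usage are illustrative - the article's raw data isn't available):

```python
import random

def permutation_test(group_a, group_b, n_iter=10_000, seed=0):
    """Two-sided permutation test on the difference of group means.

    Returns the fraction of random relabelings whose mean difference
    is at least as extreme as the observed one (an empirical p-value).
    """
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)  # random relabeling under the null hypothesis
        diff = abs(sum(pooled[:n_a]) / n_a
                   - sum(pooled[n_a:]) / (len(pooled) - n_a))
        if diff >= observed:
            extreme += 1
    return extreme / n_iter
```

With only a handful of income data points per language, a test like this would very likely fail to reject the null for most of the pairs in the chart.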
As much as I'm loving that ColdFusion (my primary language of choice) is included on this chart and ranked in the top 10, I would really like to see how the job market compares to the other languages on the chart.
On a side note: really? XSLT is second on the list? I remember using it to create a code generator years ago. Probably one of the poorer decisions I've made in my life :P
http://en.wikipedia.org/wiki/Simpson's_paradox is quite likely ;) There's many other variables apart from the programming language itself that affect the salary. They can be only indirectly correlated with the language.
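A tiny numeric illustration of how that paradox could play out here (all numbers invented): PHP out-earns Ruby within each city, yet Ruby "wins" the aggregate because Ruby programmers cluster in the high-salary city.

```python
# Invented salary data: (city, language) -> list of salaries.
salaries = {
    ("SF", "Ruby"):     [120_000] * 90,  # most Ruby devs are in SF
    ("SF", "PHP"):      [125_000] * 10,  # PHP pays MORE within SF
    ("Austin", "Ruby"): [70_000] * 10,
    ("Austin", "PHP"):  [75_000] * 90,   # PHP pays MORE within Austin too
}

def mean(xs):
    return sum(xs) / len(xs)

def avg_by_language(data):
    """Aggregate salaries across cities, grouped only by language."""
    by_lang = {}
    for (_, lang), vals in data.items():
        by_lang.setdefault(lang, []).extend(vals)
    return {lang: mean(vals) for lang, vals in by_lang.items()}
```

Here `avg_by_language(salaries)` gives Ruby $115,000 vs PHP $80,000 - the language-only grouping reverses the within-city ordering, which is exactly the confound a chart like the article's cannot rule out.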
I think in reality the highest paid languages are C, Objective-C and C++, with Java a close second (though it very much depends on what is being done).
And if you're looking to be at the top of your game, one single piece of advice: master C. Once you've mastered it, you can pick and choose whatever else you want.
I second this. A mate of mine has five years in C and has been able to walk into any shop he wants, regardless of the language they use, and get a gig - a well-paying one to boot.
On the other end of the scale, I have almost six years in PHP and it's worth as much as a broken Kinder egg for anything outside the LAMP-stack world (and in some cases within that world as well).
Actually the only really interesting thing about that data is how close the lowest and highest values are, certainly they must be within the margin of error (something I noticed he failed to talk about). So it seems the real conclusion to draw is that average income is largely uncorrelated with language used.
Averaging is a very lossy method to use. You could be looking at a histogram shaped like a satellite dish and the average value you come to is one few are close to. There are a lot of extremes in tech salaries from what I've seen hiring and contracting over the years.
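A quick illustration of how lossy the average is (numbers invented): with a bimodal salary distribution, the mean lands where almost nobody actually is.

```python
# Invented bimodal sample: half the devs earn near $40k, half near $160k.
salaries = [40_000] * 50 + [160_000] * 50

avg = sum(salaries) / len(salaries)  # lands between the two modes
near_avg = [s for s in salaries if abs(s - avg) < 20_000]
```

The mean comes out to $100,000 even though not a single person in the sample earns within $20k of it - so "average salary for language X" can describe nobody who actually writes X.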
Surely who you're working for and what you're working on has a much much bigger effect on income than what programming language you happen to use. People using python at a top hedge fund will surely earn more than people using python at a low end web shop.
This chart looks like a GROUP BY clause used in a not very meaningful way. As others mentioned above, there are a lot of parameters here - the company you work for, location, project nature and such - which would be important to point out.
It's based on household income, so is it reasonable to infer that Haskell programmers may be likely to be the sole income while ActionScript programmers have working spouses? That could make sense.
I think CSS here covers general "front-end web development" along with HTML, and there is simply big demand for such work, especially with all the new features of CSS3 and HTML5 - everyone wants a modern 3.0 website now :)
Well this sucks: in India, even knowing Java, Ruby, CoffeeScript and Scala, I get a measly $35,000, which after taxes goes down to $25,000. Yeah, I know, low cost of living etc. etc.
Perhaps the difference is in the demographics, not the language. People using 'older' languages may be older, further along in their careers. That would explain everything.
If you're getting paid to program Haskell chances are you are in either academia or finance. I think it's safe to assume that a) Haskell programmers in finance earn more than most PHP programmers and b) that they're not allowed to upload their latest projects to github.
Without significance testing, are those results significant? It looks like random data to my eye but I've not got the source data for any proper analysis.