> It’s obvious. Every high-tech company has people called “Product Managers” (PMs) whose job it is to work with customers and management and engineers to define what products should do. No PM in history has ever said “This seems to be working pretty well, let’s leave it the way it is.” Because that’s not bold. That’s not visionary. That doesn’t get you promoted.
This mentality isn't unique to product managers. Mobile developers and front-end developers increasingly don't want to work on technologies that are perceived as outdated, or even to work on someone else's code. Everyone wants to list new technologies and greenfield projects on their resumes.
This was one of the biggest hiring challenges at my most recent large project: iOS developers wanted to rewrite everything in Swift. Android developers wanted to rewrite everything in Kotlin. Our web app was already written in React, but the web developers constantly wanted to refactor it to use the latest state management trend. When Hooks were introduced in React, we spent far too long arguing with developers who wanted to refactor all of the working code to use hooks instead.
The bigger problem is that nobody wanted to work on someone else's codebase. Everyone wanted to work on greenfield projects and technologies that would look good on their resume going into the next job. Some of the most talented developers we hired were obviously only interested in using our company as a stepping stone to FAANG jobs (which didn't actually pay much more, but they were more prestigious on a resume so they wanted to switch).
Blaming product managers is an easy out and will likely match popular opinion among people who aren't product managers, but the problem extends to development teams as well.
I'm an Android developer who despises Kotlin and all the other trendy stuff. I often get laughed at for my views. However, my apps are ridiculously slim and fast, so there's that. Oh and I don't have a job.
It really feels to me as if developers these days care much more about the development process itself, and how their code looks, than about the end product of their work that the users see. Same for designers — they want their UIs to look "clean" and be enjoyed like works of art, disregarding obvious, glaring UX issues that arise when any person at all tries to use the thing.
I would wholeheartedly agree with that. Developers don't talk about enabling people or making lives better anymore, they don't talk about helping people work more efficiently or use their computer more effectively. Developers treat their users like cattle now. They addict them and herd them and farm their eyeballs for ad-revenue. They don't program to make the world a better place, they program because it's a fun way to make entirely too much money at their favorite hobby: reinventing and overcomplicating the wheel.
To be fair, the user profile also changed a lot over the years. What used to be a space of programmers building for power users is now all about minimalism and pretty, dysfunctional interfaces. I can see why one would become jaded and separate themselves from users.
Another problem that doesn't seem to be talked about much is that, way too often, the stakeholders and the developers have nothing in common with the end user.
As I see it, it's plain and simple: companies want to make all the money in the world at any cost. This "growth" is how they measure their success among each other. But caring for their users and treating them as adults with agency doesn't earn as much as manipulating them into doing something they never intended. And this manipulation drives the ongoing dumbing down of everything. Back in the day, if you were annoyed by something in a piece of software, you went looking for a setting to make it your way, and more often than not, you found and changed it. No more.
Apple in particular is a good example here. They used to position themselves as a company that makes great tools that empower people. But now they're putting their own agenda, and, conveniently, their bottom line, before that. That's how we end up with endless redesigns of things that worked perfectly fine for everyone (e.g. Safari).
I'm sick of this bullshit. I always push back against managers or leads who're trying to get in a feature that users won't like, or will find confusing. And I'm not the only one. Not every programmer works for the surveillance-industrial complex.
That's what I did at VK too. At some point though my opinion just stopped being respected because by then developers were demoted to mere translators from human to computer language.
Runtime efficiency is just one consideration to be balanced against others. Maybe you would have an easier time getting a job if you would accept the new stuff and be willing to compromise on runtime efficiency so your team could crank out more features and more quickly meet its business goals. Note: the issue isn't your individual productivity; being willing to go along with the pack has its own advantages.
I don't want to work at a place that has "business goals". I don't want to work on a proprietary product either. I just want to use my skills to make the world a better place, and moving money around ain't that. I don't want to participate in the rat race or help it in any shape or form.
Having business goals isn't necessarily a bad thing. It's possible to develop a good product that improves the lives of its users, while compromising on some things so the business can sell the product before it runs out of money. The money from those sales can then be used to fund more worthwhile work. It may be hard to find a position at a company that lives up to that ideal, but it's possible. I believe my cofounder and I are trying to do that with our bootstrapped company (but we're not currently hiring). You have to be willing to compromise on some ideals though; I struggle with that myself.
I do indeed want one, but I'm too picky. And it's not for the money, but rather mostly for socialization (I'm lonely af). Maybe I'd work at a startup, but there aren't many of those that would want an open-source app, and there are even fewer that do something legitimately useful instead of inventing a problem and then heroically solving it.
I worked at VKontakte (the Russian social network) and was the only Android developer for several years. It was amazing when Pavel Durov was still the CEO. I was living the dream. It all went downhill from there when it was acquired by the Mail.ru Group, and I had to quit because it started feeling like an abusive relationship. My app being closed-source didn't help either.
Then there was Telegram. Same Pavel, mostly the same team, but the tasks were much more ambitious, and the product itself much more complex and internally intertwined (every single update touches the chat screen, which in the Android app is more than a megabyte of pure Java). I wrote the VoIP library from scratch. In hindsight, this certainly isn't a one-person endeavor, but I didn't know any better. So, yeah, eventually, and very unexpectedly, I was fired by that same person I had been a huge fan of for so many years. The reasoning was that "our calls are shit". Apparently I was supposed to write code that didn't contain bugs and just "test it myself". Oh, all those foolish other companies having QA, right?
I have no idea where to go now. I want my 2011 back. I want to be passionate about something again.
I would suggest the crypto industry, specifically DeFi. Reasoning:
* It's (mostly) all open-source.
* It's self-directed. Don't like a team? There are a dozen others scrambling for developers that you can join without bureaucratic nonsense. Want to work alone? The tools are mostly mature enough that you can release products as a one-man team.
* You seem indifferent or even averse to money, given its role in tricking people into mindless jobs. DeFi is about the roots of what money is: the voluntary transfer of value between people, which is something I think everybody can agree is fundamentally good.
To be honest, I don't really believe in crypto. All the blockchain stuff is mostly solutions looking for problems. And, after all these years, cryptocurrencies still haven't replaced regular money. I still can't buy a coffee with bitcoin. There were numerous efforts to create such a cryptocurrency, but they've all failed — because governments aren't letting go of the control they have over everything, and they have the physical force to enforce that. Currently, all I see is people using cryptocurrencies for trading or treating them as an investment. Sometimes they buy drugs with them. And that's really it.
Though, yes, if I wanted easy money, I could go work at such a company. Or I could as well go to a FAANG-ish megacorp and earn craploads by working an hour a day.
There is a point where "a visionary founder with technical ability and a clear view of the UI they want" no longer aligns with what we can provide. It can turn into a soul-destroying farce, just like a growing company admitting as many middle managers as there are designers/engineers, simply because The Visionary woke up one day and decided that "this UI we have is shit". Combine that with the Russian approach of "find the closest person responsible and punish them in the hardest way possible" and you get the setup you have described.
Be very careful when putting your trust in visionaries.
To counter your point, it's not just that product managers are easy to blame; it's that it's literally their job to change stuff.
Developers want to change stuff and do greenfield because it's fun, but that's a whole different beast to a career track whose job it is to change things. This goes beyond just poor incentives ["Googler gets a raise because shiny new thing"], into "upper management have created a construct whose purpose it is to disappoint existing customers who like what's already there".
There may be an argument that you'll draw in more people with shiny stuff, than you'll lose when you break old stuff. I only have anecdotes and personal experience that that's untrue [digg? reddit? metro?], and no examples of cases showing that argument to be right.
Also, personally I get my jollies working on other people's old code. I work with several FORTRAN codebases that pre-date my birth, and I enjoy it.
> To counter your point, it's not just that product managers are easy to blame; it's that it's literally their job to change stuff.
Sure, but companies aren't being blindsided by product managers who sneak into their org charts and start driving change unbeknownst to company leadership. They're hired to do these things and their performance is based on how well they do them, just like anyone else.
This might be a good example of the struggle of middle management: You get blamed from all sides for just trying to do the job you're assigned.
Developers get 1-2 years at a job before they fall behind in compensation.
So my resume always needs to be fresh. I can’t be working with older technology because when it comes time to jump again, I won’t have the skills to do so.
Asking an iOS developer to work on Objective-C is asking them to sacrifice their future prospects.
Devs have a lot of the same incentives product managers do.
That may contribute, but we saw the same pattern in devs who had been there for 5-10 years.
I also see the same pattern in "Ask HN" posts and many comments here from people venting about how they've started so many side projects but can never finish them. Starting new projects is fun. Shipping things can be hard and requires some tolerance for boredom.
A lot of people get into the field because they like to program and play with new programming things. Regardless of jobs or compensation, there's a strong pull toward doing new and different things. There's also a strong aversion to working on someone else's project.
Shipping products is boring when the products themselves are boring. That’s why people want to work at small companies that have much more at stake in each release and pay close attention to what they build, how it reaches the market, and how it evolves afterward.
To throw another log onto this rant bonfire: a few decades ago, software was seen as harder and more expensive, and while there were some frivolous and/or stupid apps, big enough development projects were usually bound to some purpose.
Our industry in its current form is more open to building meaningless things and trashing them after a while. I knew devs who specialized in home page redesigns, and they viewed them as facelifts to show that the client company is still alive. It has a business purpose, but in the grand scheme of things the only person who really enjoyed the project was the guy using it to learn Vue.js.
Or a team porting an app from Objective-C to React Native, not because they think it’s shiny but because hiring is hard/expensive and/or the business wants one-size-fits-all apps for iOS and Android.
> ”Asking an iOS developer to work on Objective-C is asking them to sacrifice their future prospects.”
The FAANGs have tons of Objective-C code. The largest and most popular iOS apps in the world are actively developed in Obj-C and C++.
It’s absurd to me that mobile developers actively undermine their ability to get those extremely well-paid jobs by limiting themselves to one language.
> Asking an iOS developer to work on Objective-C is asking them to sacrifice their future prospects.
The impact a language has on your career prospects is U-shaped. Knowing new languages is good for your career. Stagnating on an 'old' language is bad (Swift is only 7 years old...). But eventually knowing the old language puts you back in demand again, to port and support old codebases.
>Asking an iOS developer to work on Objective-C is asking them to sacrifice their future prospects.
This just sounds insane to me as a non-programmer. Literally nothing in my job requires me to be up to date on technologies less than 10 years old to be considered for a position.
From where I sit as a dev, it's less "I want shiny new tech on my resume" and more "we need to refactor / simplify everything for ease of maintainability".
It's a hard problem. Sometimes it's unnecessary but sometimes the tech debt is real. It's not uncommon to have one blessed path for doing most things but hey, here's this edge case that doesn't fit our mental model and is a pain to test and bloats our bundle and we keep getting paged for it and the engineers who built it left the company and didn't leave any documentation or tests, so let's just kill that feature.
And now users are pissed and don't understand, because from their perspective, they think the cost of leaving the feature alone is zero. But the truth is the cost of keeping it is quite high and there just isn't enough usage to justify engineering effort to reduce ongoing maintenance costs.
It's much worse in ecosystems that are changing the core frameworks at a rapid pace (hi, React!). If you don't do the same thing Facebook does internally, and continually refactor your entire codebase onto the latest paradigms, you quickly end up unable to adopt new features and 3rd party libraries when you actually need to do so.
If you look instead at a mature ecosystem, like, say, C++ or Java, you might reasonably maintain a codebase for a decade without needing to adopt new language features.
Fair point. I would also add: blame the A/B-test culture that has crept in everywhere. I am pretty confident that the current layout of the Spotify UI is not the result of a design team but the outcome of hundreds of back-to-back A/B tests (some of them coming out positive, to the benefit of the career of the ideator).
Resume driven development strikes again! Everyone just wants to use the latest shiny thing then switch jobs and leave the pile of burning mess to the next sucker.
Some of the best software I've used are mostly developed by solo developers or small teams (Sublime, YNAB4) because they actually care about their product instead of the underlying tech.
> Some of the best software I've used are mostly developed by solo developers or small teams (Sublime, YNAB4) because they actually care about their product instead of the underlying tech.
But YNAB is also a great example of apps-getting-worse — you're explicitly praising the 2012 version over the 2015/evergreen version.
(In YNAB's case, they had business reasons to make the app worse (or thought they did), but I bet that's true of many of the examples in TFA)
I can't really blame them for their new webapp. They've built a great product that doesn't really need any further maintenance. It'd be pretty weird after they finish YNAB4 to just say, "Ok we're done, let's disband the company".
Blaming the engineers for wanting to follow trends is really passing the buck. Blame the industry (hiring managers, CTOs, investors, etc.) for refusing to hire people who don’t have the exact skill set they’re looking for. To be competitive (aka get paid well) in today’s marketplace, you need a skill set that aligns with whatever trends are most common.
I’m fortunate that the company I was hired at doesn’t care that I haven’t worked with Kotlin or React. But I was passed over by many companies because I didn’t have React experience; can’t help it if my last few jobs were all Angular.
Many engineers are focusing on doing what will help them get the most compensation. And following industry trends is part of that.
Wanting to be competitive (aka getting paid well) is the root of the problem. The top jobs are ridiculous in every field. Because there is so much competition for them, competitiveness becomes orthogonal to doing your job well.
In most fields, there are also second-tier jobs. You get to enjoy a nice middle-class lifestyle with a good work-life balance in a stable long-term job. The incentives are there to do your job well, because you have to face the consequences of your choices. I know plenty of software engineers in jobs like that. When the company eventually goes bad, they have no trouble finding new similar jobs, because good experienced software engineers are scarce.
When is a product complete? One day it must be, unless the builders are hopeless.
Surely there must be a point where the product functions as intended and the greatest number of users like it as it is. Instead we carry on. Everyone has to keep proving their 'value', right? Always changing, always growing. Everything eventually gets iterated to death. Any piece of software or design, no matter how much you might like it, will eventually be replaced. It's as true of software as it is of car shapes or curtain patterns.
Just imagine being the CEO of Evernote (mentioned elsewhere in the thread) announcing to the world that they are feature complete. It's a finished product and they are laying off everyone except customer support and marketing. They would be considered insane despite that decision delivering the best profits and customer experience.
> Android developers wanted to rewrite everything in Kotlin.
Probably because Google says Kotlin is the way. You can't use Compose, Google's reboot of the View system, from Java. Why did Google create Compose? To eliminate a decade of cruft and problems, for one. So congrats... now your developers are stuck with it.
I've seen devs come and go. They start on a project, rewrite whatever they can get their hands on into the "way it should've been" and then fuck off after a year. The devs that stay have to figure out the maintenance story on their own because screw comments and documentation for this new bright and shiny way of doing things.
I don't know if I'm the only one that sees the pattern in our company but God forbid you say anything to bruise the new hire's ego.
This is a fair criticism, but except for the inevitable bugs/regressions introduced by the rewrite, the devs aren't directly affecting the UX (making it worse). More importantly, they're not deliberately sabotaging functionality -- the article's Economist app example -- in their quest for the current hotness.
This is why "it's the PMs" is a refrain -- because the changes they make are very tangible in the end product.
> Mobile developers and front-end developers increasingly don't want to work on technologies that are perceived as outdated, or even to work on someone else's code
Backend folks do this too btw. I’ve seen adoption of Docker, Kubernetes, microservice architectures, Rust, etc. driven more by a desire to use the latest trends than the best tool for the job.
Absolutely true. Though when I was looking for a job 5 years ago, I had "stale" classic skills and no one wanted to hire me because employers were looking for new skills, cloud, etc. Now I make sure my work uses new tech rather than what is best.
I think you're talking about a few different concerns and not all should be ignored. Resume driven development is something that should be avoided, but there are good reasons to update your tech stack even if the old one is working fine. To your point, I've been to tech workshops for new tech where everyone was laughing about their current jobs' quagmire from previous new tech. All I could think about is not wanting to use either in production for what they were doing. However, I was also at a job pulling teeth trying to convert from SVN to git. A lot of people argued changing was fashionable, but branching and merging were frustrating (therefore not heavily used), tools we wanted to integrate with less often supported SVN, and a lot of the automated processes in place would have been much more lightweight after transitioning.
While I don't think appealing to new hires should drive your tech stack, telling them we still used SVN was something you kind of said under your breath with a bit of shame. As for "nobody wanted to work on someone else's codebase", that's true, but I think it needs some give and take. If you're making someone responsible for something, you need to give them some latitude to run it their own way. I'm going to grumble less about dealing with my own mistakes than someone else's, but rewriting everything every time someone quits isn't an option either.
> Mobile developers and front-end developers increasingly don't want to work on technologies that are perceived as outdated, or even to work on someone else's code. Everyone wants to list new technologies and greenfield projects on their resumes.
Resume driven development is a bad anti-pattern and leads to software nightmares. Avoid it.
> Everyone wanted to work on greenfield projects and technologies that would look good on their resume going into the next job.
Greenfield = easy. Fixing a 10 year old architectural mistake = hard.
> Blaming product managers is an easy out
Resume driven development and product managers who make changes for the sake of their own resume are two sides of the same coin. It's all putting good for yourself over good for your customer and good for your employer.
> It's all putting good for yourself over good for your customer and good for your employer.
Wouldn't this not happen if the employers just paid more? From what I understand, people do resume-driven development because they want to work at a place that will pay more at some point.
Fair point. There was a recent discussion about an article asking why companies don't pay their developers more to stay [1].
If I was maintaining some decades-old legacy code and knew that my employer will most likely still be around and pay me a competitive salary for the rest of my life, I wouldn't mind just mastering COBOL.
But most companies go bust or lay off developers at some point, so keeping yourself employable by working with the latest tech stack is unfortunately quite a necessity.
That's what I've heard too, that most COBOL jobs don't pay that much and have been outsourced.
> If I was maintaining some decades-old legacy code and knew that my employer will most likely still be around and pay me a competitive salary for the rest of my life, I wouldn't mind just mastering COBOL.
That is how I feel too. In fact, I'd prefer working on something that really lasts over the latest fad. But the differences in salary make this a difficult choice.
To play devil's advocate on the rewrite stuff: that's because new tools are usually significantly better on one or more dimensions and usually not much worse on the others.
That doesn't mean you should actually do it, but as someone who's had more than enough time to get over new-and-shiny syndrome, there's a lot to be gained by migrating as often as needed (i.e. when the productivity gain from the new outweighs the migration/support cost in a reasonable timeframe).
Programming tools, techniques, and languages are still evolving quickly, and even in the same ecosystem I don't use many of the tools and frameworks I did 5 years ago.
For those who are tired of this culture, you might want to take a look at working on projects targeting poorer countries. There, backward compatibility with older technologies is what is prized, and newer technologies hinder adoption rates.
> The bigger problem is that nobody wanted to work on someone else's codebase. Everyone wanted to work on greenfield projects and technologies that would look good on their resume going into the next job. Some of the most talented developers we hired were obviously only interested in using our company as a stepping stone to FAANG jobs (which didn't actually pay much more, but they were more prestigious on a resume so they wanted to switch).
This isn’t actually the developers’ fault, either. I think we need to blame the tech industry as a whole for this.
Agree. I appreciate this suffers from a No True Scotsman fallacy, but the framing from the OP completely misses the point of what a Product Manager _should_ be doing. There’s a near-infinite list of things customers want: missing features, bug fixes, minor UI improvements, whole new products. If people are making change for change’s sake, it’s usually a symptom of not doing the work to actually identify what would be impactful and valuable. It’s reaching for what feels easy instead of doing the hard work.
The only way to combat this mentality is for developers and product managers to feel more ownership of the long-term success of their projects. When your compensation is roughly similar no matter how successful your work is, you're going to optimize for yourself over the best interests of the company.
What is in it for developers to work on Objective-C and other older technologies?
You could or not, it's a tradeoff. Not sure what your criteria for "objective" is. In iOS-land you gotta keep up with the changes every year or your app will show its age. You could continue to do this in ObjC year after year but it's increasingly tough to hire ObjC devs, and Swift offers the possibility of a gradual transition rather than a full on rewrite. Not to mention the quality of life improvements.
I am just writing some hardware test app using C++ and wxWidgets.
But in general, yes, you are completely right. It's just virtue signalling between developers. You can't just write something; it has to be Instagram-friendly. It needs to be popular and have hundreds of stars on GitHub.
Yes: some companies have employees who don't quit as soon as a new JavaScript framework is launched. Do what they do. Yes, large companies operate by different terms than startups, but I assume (big guess here) there are startups who aren't just angling for acquisition or unicorn-or-bust layoff pits. Note that they may not say they're going in those directions, but they will definitely manage you in those terms.
[In fact, I'd be curious how many <20-person companies with 80hr/wk soldiers aren't hustling for acquisition or aren't running out of money]
For GP, it seems obvious that they're interviewing the wrong people. It could be their phone screening, it could be their ad, it could be their industry sector. For instance, don't advertise for a Front-end Developer, advertise for a [thing you're actually hiring for] Developer, and screen resumes on this basis regardless of what you put in the ad (which hopefully is accurate as to job responsibilities and necessary skills).