
Intel dropped the ball, and it was the biggest bullet I ever dodged. Even 20 years ago, I felt there was something wrong about the internal culture there, so I turned down the post-internship job offer.

There's a bunch of teams there with three-letter acronyms whose origins have been totally forgotten. Like, nobody knows what LTQ or ASR stands for, or what purpose they have. When you're an intern, you tend to think that the higher-ups know what they're doing, but if you ask for an explanation, you will soon conclude that they don't know either.

People were not working hard enough. At the time Intel's dominance was supreme. They should have been picking up on niche ideas like GPUs and mobile chips, it would have been cheap and adjacent to what they had. Instead, all I heard at the meetings was laughing at the little guys who are now all bigger than Intel. Even my friend in the VC division couldn't get the bosses to see what was happening. People would spend their whole day just having coffee with random colleagues, and making a couple of slides. It's nice to relax sometimes, but when I was there it was way too much of that. There was just way too much fat in the business.

I still have friends there who stayed on. They tell me not to come, and are now wondering how to do the first job search of their professional lives. A couple have moved very recently.

It's very odd that the guy who was famous for saying what upper management should do (set culture) ended up building a culture that has completely failed.



> People would spend their whole day just having coffee with random colleagues, and making a couple of slides. It's nice to relax sometimes, but when I was there it was way too much of that.

I knew a lot of people who got jobs like this after college. I was so very jealous at the time. I was working in a company that was nice, but also wasn’t afraid to tell people when they weren’t meeting expectations. Some of my friends were at companies where they weren’t expected to “ramp up” for the first year. One person I know read “The Four Hour Work Week” and talked his company into letting him work remote, then started traveling the world. He would brag that his entire job was “telling the engineers what to do” and it took him an hour a day because he did it all through email in one sitting.

Years pass, economies evolve, and now it’s harder to get a job. Companies start looking for dead weight and discover people doing jobs that barely contribute, if at all.

A tech company near me looked at their VPN logs (required to interact with their internal services and do any dev work) and discovered a lot of engineers who were only connecting a couple times per month.
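
(If anyone's curious how easily that kind of audit falls out of the logs, here's a rough sketch; the file name, columns, and three-day cutoff are all made up for illustration, not what that company actually ran:)

    import csv
    from collections import defaultdict
    from datetime import datetime

    # Count the distinct days each user connected to the VPN.
    # "vpn_log.csv", its columns, and the cutoff below are hypothetical.
    days_seen = defaultdict(set)
    with open("vpn_log.csv") as f:
        for row in csv.DictReader(f):  # assumed columns: user, timestamp
            day = datetime.fromisoformat(row["timestamp"]).date()
            days_seen[row["user"]].add(day)

    # Flag anyone who connected on only a couple of days in the whole month.
    for user, days in sorted(days_seen.items(), key=lambda kv: len(kv[1])):
        if len(days) <= 3:
            print(f"{user}: connected on {len(days)} day(s)")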

By then it’s hard to turn it around. It’s not easy to take people who have become so comfortable not working that the entire idea of urgency is a foreign concept. Ask for some task that should only take an hour or two and they’ll say they’ll have it by early next week. Any request turns into a series of meetings, which have to be scheduled with all participants, which means they can’t start discussing it until Bob is back from vacation next week, so they might have an idea of what’s required by end of month.

At some point you can’t turn it around without making big changes to the people involved. There’s too much accumulated inertia and habit. You have to reorg at minimum and bring in new management, while also making it clear to everyone that their performance is now actually being noticed. It’s hard.

With Intel, I’ve also heard from some ex-employees who left because pay was lagging. Companies with low expectations can feel like they’re getting away with low pay because many people will keep an easy job despite the low pay. It masks the problem, for a while.


> Years pass, economies evolve, and now it’s harder to get a job. Companies start looking for dead weight and discover people doing jobs that barely contribute, if at all.

It sounds like you blame their own lack of effort for losing their jobs. Like, if they had worked harder, it wouldn't be them on the line.

But the reality is, they did not let the corporations take advantage of them. They turned the tables, had a good work-life balance, and got paid for it. Yes, maybe it cost them their job. But at the same time, they had one for years, and for many of them it would have come at a point where they were ready for a change anyway.

Ultimately, happiness is a personal measure: what fulfills you is your own choice, and the way the people you talk about worked may not be your preference. But it does not sound like they made a poor choice.

I worked my ass off for 20 years. I'm an expert in the field I work in, but after being skipped for raises three years running I said fuck it and put my personal life in front of everything else. I wake up when I want, start work when I want, and work way less than I should. I still don't get a raise, but all my peers and my manager keep telling me what a great job I do. Now I'm slacking hard, but why should I feel bad when hard work isn't valued? That my boss and peers are happy is a positive thing, but I wouldn't be too concerned if they were less so.


>> But it does not sound like they made a poor choice.

I think the thing that's not obvious to young people is that choices that seem good at any given time may turn out to be poor choices further down the line. The guy who traveled the world while working one hour a day telling engineers what to do over email probably had a great young adulthood. It sounds like he paid for it later, though, by getting laid off and having difficulty finding another job.

This doesn't mean that those who worked their asses off didn't get screwed over, but on average they probably did better professionally - and by proxy, financially.


The seductive trap in doing that is what you choose to invest your saved time in; it works a lot like financial debt.

It’s one thing if someone is iron willed enough to make productive use of their new free time.

It’s different if they use it to play video games and sleep.

Most people, if left to their own devices, will do the latter.

We can say what we want about a hard, challenging job, but it forces us to work and learn. Thus, at the end of it, we have the benefit of that working and learning.

The better question is not “How little work can I get away with doing?” but rather “What will I have at the end of this work?”


I'm not sure when I can play video games and sleep...


Why not find a job where hard work is valued, or start your own company?


One of my background voices is constantly saying "How will you describe your current project in a job interview?". Maybe I add a little spin, maybe I highlight certain aspects, but I always have an idea of how I'd introduce it and how I would answer detailed STAR-style questions about my contributions and results.

I think some people with 'cushy' jobs don't take on this same mentality, perhaps overestimating the security of their current job. “telling the engineers what to do” is not a good starting point and the answers to follow-up questions had better be pretty detailed and convincing.


Agreed. I had a uni roommate who played video games during his internship because he had nothing to do. Struggled to find a job and I think changed careers.

I also interviewed someone from my year; we were both a year out of school, same major, roughly the same job title, at our first post-undergrad jobs. I was thrown into the deep end and learning a lot. He was buying software licenses. I commend him for sticking it out for a bit, but also for realizing it was a bad fit.


I have a few people in my network like this too. They would tease me for working too many hours and having a high savings rate while they traveled around the world and spent their bonuses and vests. They struggled to find a job paying $140k/yr while I'm earning almost 4x that.


Can't take it with you. (money)


>>It's very odd that the guy who was famous for saying what upper management should do (set culture) ended up building a culture that has completely failed

Is it? Everywhere I've worked, upper management talks big about the culture, but their talking points are rarely applied to the company.

Like when Facebook says something like "we value your privacy"


Facebook does value your privacy, because it's a commodity they can sell for a high price.

Sort of like that Twilight Zone episode. The aliens come and convince us they are here to serve man. "Here, if you don't believe us look at our book called 'To Serve Man.'"

Finally one of the humans translates it and discovers it's a cookbook.

https://en.m.wikipedia.org/wiki/To_Serve_Man_(The_Twilight_Z...


FB invades your privacy so that they sell access to you. Your privacy is important to them, so that they can continue to be the intermediary.


Hmm, time to put my life story and my product and lifestyle preferences in an easy-to-read format behind a small paywall; there's gold in them there hills!


And they have internal modeling to put an exact dollar value on each person's privacy.


> Everywhere I've worked, upper management talks big about the culture, but their talking points are rarely applied to the company.

Or worse, where I am, the talking points about the culture they want ONLY apply to the company and not to themselves. (In-office requirements, how the office is laid out, etc.)


I spent ~2 years at Intel 20 years ago. My experience was kind of similar, although I didn't see the "People would spend their whole day just having coffee with random colleagues" aspect.

I left because I was working on a machine learning project that was a "solution in search of a problem," and I spent too much time working alone. I was very early in my career and felt like I just wasn't learning enough from my peers.

Overall, I felt like Intel was a positive experience. I do think their biggest problem was that they had too many lifers and didn't have enough "healthy turnover." Almost everyone there started at the beginning of their career, and thus everyone who was mid-late career didn't understand what the rest of the industry was doing.


> At the time Intel's dominance was supreme

They are the poster child for "we have a monopoly so we don't have to innovate or even maintain competence". Mind you, how much worse must things be at AMD that they're not winning the x64 war? Eventually the "PC" market is going to get run over by ARM like everything else. Especially now there's a Windows on ARM with proper backwards compatibility.

(although something is very odd with drivers on Windows-ARM, if anyone knows the full story on how to get .inf based 'drivers' working it would be genuinely helpful)


This is a very Apple-centric point of view.

Windows on ARM is still largely ignored, everyone on the consumer level is more than happy with current Intel/AMD offerings.

Every single attempt to sell Windows ARM systems has been more or less a flop, including the recent CoPilot+ PCs.

The Windows developer community also largely ignores Windows on ARM, unless there is actual business value in supporting yet another ISA during development, CI/CD pipelines, and QA.

Only Apple gets to play the vertical integration game with its our-way-or-go-away attitude, and they are the survivors of home computer vertical integration only because they got lucky when the banks were already knocking on the door.


> This is a very Apple-centric point of view.

Which also isn't great for Apple. I mean, they're lagging Microsoft now. We've all felt this coming, right? The M series was great, but it's hard to think of much innovation after Jobs. I mean... things got smaller/thinner? That's so exciting... now can we fix the very basic apps I use every day that have almost trivially fixable bugs?

In a way, Pantheon feels weirdly accurate. People not actually knowing what to do. Just riding on momentum and looking for the easiest problem to solve (thinner, and extracting more money from those making your product better) because the concern is next quarter, not next year, not the next 5 years. What's the point of having "fuck you" money if you never say "fuck you"?


Those of us who were around when Apple was on the edge of bankruptcy can remember a similar approach, where some products, like the Newton, were great, but the wind of the early days wasn't as strong.

They have plenty of money to burn, but unless they make their systems more affordable to people who don't earn tier-1-country salaries, they will eventually become the iPhone/iPad company.

There is no longer Apple hardware for servers, and the way the Mac Pro has been dealt with makes it clear that the professional desktop workstation is also not a market they care about any longer, if the only PCIe slots on the Studio are for audio cards.

So it doesn't matter how great the M chips are if they don't have products in their portfolio that people care about buying, when servers run Windows/Linux/BSD and consumer hardware is mostly Windows (70% worldwide market share).


I did say "eventually", because I'm at the very start of "yet another ISA during development, CI/CD pipelines, and QA" work.


Which means there is some business value in selling software for Windows on ARM devices, enough to budget such efforts.


Can you present data from sales numbers and developers showing this? I'd need to see ARM vs. x86 sales figures and how many applications currently ship for one or both.


Walk into any random shopping mall computer shop and observe how many people care about buying Windows ARM, or are asking about them.

Those are my numbers.

If you prefer something more official,

https://www.pcworld.com/article/2816617/microsofts-copilot-g...


> Walk into any random shopping mall computer shop and observe how many people care about buying Windows ARM, or are asking about them.

Those still exist?


Yes, all over the place in Europe.

Media Market, Carrefour, Publico, Worten, Cool Blue, FNAC,...


> everyone on the consumer level is more than happy with current Intel/AMD offerings

They shouldn't be. Apple's chips changed the game so much that it was a no-brainer for me to choose them when I bought a new laptop - PCs just couldn't compete with that compute and battery life. Anyone with a decent enough budget is not even considering Windows.

I don't think any power user will be happy with Intel/AMD any more.


Gaming desktop rigs prove otherwise.

As for laptops, maybe when there is an ARM machine able to compete with Razer laptops for e-sports.

Snapdragon chips ain't it.


> Anyone with a decent enough budget is not even considering Windows.

Uhm, no. This is so totally dependent on your use case. I use my home box MOSTLY for gaming; it's just better on Windows. I also want a box I can upgrade. I never need to carry it with me.

Apple isn't even in the consideration space for me for that.

For work I don't have a choice, but the M[1-4] machines are _good enough_ for that; the battery life is nice, but I'm not mobile that often. I don't use its screen, mouse, or keyboard, so don't care there. The OS is close enough to familiar unixen that I like it fine, but WSL2 on Windows or native linux would be MORE THAN FINE, and closer to my deployment environment so would be better at way less cost, but our IT dept. doesn't want to support it so I get what I get.


> I use my home box MOSTLY for gaming; it's just better on Windows.

Don't you mean on x86?

Windows on ARM is no more suitable for running legacy x86 games at full performance than anyone else's OS on an ARM chip.


I think that was implied, proving the point that many don't even think about Windows ARM.


It sure seems like AMD is winning the x64 war?

https://www.alltechnerd.com/amd-captures-17-more-cpu-market-...


Maybe a semantic argument, but I'd say they are "on their way to winning", but once they have a higher market share than Intel, "they are winning".

> Despite Intel still holding the lead with 56.3% of systems validated through CPU-Z, AMD is closing in, now claiming 43.7% of the market.


I do hope Intel comes back. While I prefer AMD currently (due to no AVX-512 on Intel consumer CPUs and because I'm not interested in having things scheduled on efficiency cores in my desktop PC), if there's no competition anymore, who knows what AMD might become...


I'm a gamer, and for Intel to make a comeback for me, they need something that competes with AMD's X3D chips, which absolutely dominate all the gaming benchmarks.

The Core Ultra CPUs are an absolute joke for gaming, often being beaten even by the 14th gen CPUs. The Core Ultras had a major performance regression in L3 cache speed which destroyed gaming performance.

Games love large cache sizes. The Ryzen 9700X has the same number of cores, but a slower clock than a 9800X3D, yet the 9800X3D comes out on top purely because it has 96 MB of L3 cache compared to 32 MB. If Intel had put out an i9-14900K with 96 MB of L3 cache, it'd probably come out on top.


> they're not winning the x64 war?

That's probably the strongest mis-statement I've heard this week. At least, it seems AMD have been the x86-64 leaders for several years now.

Why are you thinking AMD aren't winning?


By what metric? AMD is making headway fast, but Intel still controls 76% of the server market.[1]

https://www.tomshardware.com/pc-components/cpus/amd-records-...


By precisely what metric are AMD the leading x86-64 vendor? Intel still outsells them.


EPYC?

I don't think I even know anyone (including in businesses) who buys Intel for anything anymore, and it's been that way for a few years.


That's why you want to look at data rather than the people you know: https://www.servethehome.com/amd-epyc-9005-turin-turns-trans...

I think Epyc will get there within a few years, it has great momentum and Intel's response has been pretty weak.


Stock price delta over the past five years is a good metric since the market is efficient and traders themselves use all available information.


Heh!


Poe's Law in action. I can't tell if this is serious or satire.

EDIT: Though I'm leaning towards satire.


Imagine being this naive in 2025. The market isn't rational.


> Especially now there's a Windows on ARM with proper backwards compatibility.

I wouldn't say that Microsoft's Prism x86 to ARM translation layer has been anywhere near as successful as Rosetta was at running the majority of legacy software on day one.

Prism is improving, but the Windows on ARM reviews are still bringing up x86 software that didn't work at all.


I think Intel's been losing for a while now. I would rather have RISC-V over ARM, though.


I was pretty pleased to see a news article on the x86/x86-64 emulator for RISC-V, felix86, recently, where they've had success running Steam and playing relatively complicated 3D games using a PCIe graphics card.

Dealing with MCUs for projects, RISC-V Espressif chips and boards are no-brainers now; I buy big bags of ESP32 boards from Seeed. I get some free ARM boards at work, which are neat - I always love playing with MCUs - but they're relatively power-hungry and expensive without a lot to show for it. I'm either using a ~$6 ESP32 board or a ~$1 ATTiny in a DIP package for home/fun projects. ESP32s are starting to show up in consumer electronics I find, too, along with the relatively pared-down ESP8266s which I'm not as fond of, though I can still flash them easily over USB-TTL at least, so whatever.

In the SBC space, ARM is competing with x86. RISC-V exists but only really for enthusiasts. RISC-V may start making inroads here soon. I picked up some Radxa Rock 2F boards (using ARM-based Rockchip SoCs) for ~$12 shipped a few months ago; they run Debian and have been fantastic for projects (though the cheap 1GB variant is now ~impossible to source). It's difficult to imagine it being worth getting involved in this nightmarishly competitive space, though obviously some still do. Most seem to try to find some obscure niche to justify a high markup.

In many workloads, it's more the GPU that matters. I need an MMU, a PCIe slot, and driver support. Most of us don't really need these outlandishly complex and CPU-centric $100+ ATX motherboards, or even CPU/RAM sockets/slots; just solder it on. -Like, how often do people even upgrade the CPU on a motherboard anymore? I'm more liable to throw the whole thing out because it doesn't have any 10PB/s 240GW USB9 quantum ports, so cut materials, decrease surface area, lower cost, and make it disposable.


Same. I started down the semiconductor path about 15 years ago (physics student, loved my EE classes, loved nanofab class, wanted more) and got warned away by an astonishing number of independent postdocs, interns, and seemingly successful industry contacts who all agreed on one point: the pay was such utter dogshit that one should consider it a passion career like art or music. Some of them saw consulting for the Chinese as their "big ticket light at the end of the tunnel" -- but that got shut down soon enough. I changed directions before getting first hand experience but the R&D job listings tended to support this view. "They pretend to pay us, we pretend to work" seemed to be in the advanced stages at that point.

At least in R&D, from the angle I saw it. Clearly, being stingy wasn't a universal problem: heavy buybacks, ludicrous M&A (in foresight and hindsight), and that $180k average salary in the article sounds completely divorced from the snapshot impression that I got. I don't know what gives, was R&D "salary optimized" to a degree that other parts of the business weren't? Did the numbers change at some point but the culture was already rotten and cynical? Or did I see noise and mistake it for signal? Dunno.

In another world I'd love to have been part of the fight to make 10nm work (or whatever needed doing) rather than working on something that doesn't fully use my skills or in my private opinion contribute as much to humanity, but my employer pays me and respects my time and doesn't steer their business into every iceberg in the ocean, and in the end those things are more important.


The problem with R&D as a career (in a "large organised company/institute" context at least, where you don't own and control the results) is the limitless supply of impoverished students and postdocs willing to work for exposure.


Right, but cheaping out on the foundation of your building has consequences -- which have come home to roost orders of magnitude in excess of the money saved.

In R&D management, this is an extremely well-known problem with an extremely well-known solution: use the oversupply to be selective rather than cheap. The fact that they chose to be cheap rather than selective is managerial incompetence of the highest order. They had one job, and they blew it. "Selective" doesn't even mean that the rating system has to be perfect or even good, it just has to equilibrate supply and demand without shredding morale. Even a lottery would suffice for this purpose.


Interesting, is this specifically on the fab side? I work at Arm and the engineers working on design and validation are paid quite well. Not quite as well as NVidia but most non-juniors are well above the $180k mark quoted above for Intel.


I firmly believe Intel went from decline (which you can reverse) to terminal (which, in a business sense, can be reversed, but usually isn't) as soon as AMD was able to successfully erode some of their server business, and ARM is now doing the same.

The mountain of money for Intel has always been in server chips, as those are their high-margin parts. While they make a lot of money on consumer laptops and desktops, it's nowhere near the amount they have traditionally made on their server-oriented chips.

I don't think Intel is likely to come out of this state without something extremely radical happening, and every time they try to do something that could be radical, it never has enough time to gestate and work; it always ends up abandoned.


Can't vouch for this site, but it has a breakdown of CPU market share by device type, and the graph for servers is shocking:

https://www.cpubenchmark.net/market_share.html


Is this mainly because of the delays with the newest Xeon?


I remember early on in the clone wars hearing that Intel was strong-arming motherboard manufacturers with delayed chipset availability if you also happened to make compatible kit for competitors. At the time I was funding my computer purchases with a paper route, so it was always going to be AMD (cost/performance), but the scuttlebutt made buying AMD seem like the rebel/moral move.


You only need to look at the total employee counts of AMD + TSMC compared with Intel's: either AMD and TSMC are extremely efficient, or Intel is bloated.

Monopoly and bureaucracy: that is basically what government is. It is kind of sad to read that Intel was like that even in 2005.


> Like, nobody knows what LTQ or ASR stands for,

Arithmetic Shift Right? (I kid, of course, but seeing a team name that _might_ correspond to an assembly instruction, in a post about Intel, amused me.)
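
(For anyone who never touched assembly: an arithmetic shift right copies the sign bit in from the left, so it keeps dividing signed numbers by two correctly. A toy Python illustration, nothing to do with whatever that Intel team actually did:)

    x = -8
    print(x >> 1)           # -4: Python's >> on ints behaves like an arithmetic shift
    print((x & 0xFF) >> 1)  # 124: masking to 8 bits first makes it a logical shift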


I can attest to this. 16 or so years ago I walked into one of their offices to talk Intel GPUs and shaders with folks. They knew enough to build hardware, but their software stack was 10% of what a GeForce 4 MX could do.

I didn’t end up taking the job.

I never really knew what happened to that division.



