
Intel's 2010 $7.6B purchase of McAfee was a sign that Intel doesn't know what it's doing. In the CEO's words: "The future of chips is security on the chip." I was like: no, no it's not! I wanted them to get into mobile and GPUs at the time. Nvidia's market cap was about $9B then. I know it would have been a larger pill to swallow, and they likely would have had to bid a bit more than $9B, but I thought it was possible for Intel at the time.


> The future of chips is security on the chip. I was like: no, no it's not!

Putting aside whether the statement is true or not, buying McAfee under the guise of the kind of security meant when talking about silicon is... weird, to say the least.


McAfee makes their money from people being required to run it for certification. Imagine government/healthcare/banking/etc. customers being obliged to use only Intel chips because they'll fail their audits (which mandate on-chip antivirus) otherwise. I hate it, but I can see the business sense in trying.


Still, $7.6B is ludicrous money for a "try", especially when everyone in the room should have known how shaky the fundamentals were for such a pitch.


I'm not sure McAfee is the go-to for this requirement any longer. Maybe. But definitely, across the four enterprises I've worked at, they all migrated away from McAfee.


There's definitely a lot that can be critiqued about that period.

Famously they divested their ARM-based mobile processor division just before smartphones took off.

The new CEO, as the article mentions, seems to have a lot more of a clue. We just hope he hasn't arrived too late.


> a lot that can be critiqued about that period.
Like the time they appointed will.i.am?

https://youtu.be/gnZ9cYXczQU


>Famously they divested their ARM-based mobile processor division just before smartphones took off.

Wasn't that AMD (perhaps also Intel)? Qualcomm Adreno GPUs are ATi Radeon IP, hence the anagram.


Intel sold their XScale family of processors to Marvell in 2006.

I remember it very well, as back then I was working at a university porting Linux to an Intel XScale development platform we had recently gotten.

After I completed the effort, Android was released as a public beta, and I dared to port it to that development board too as a side project. I thought back then that Intel was making a big mistake by missing that opportunity. But Intel were firm believers in the x86 architecture, especially in their Atom cores.

Those little Intel PXA chips were actually very capable. Back then I had my own Sharp Zaurus PDA running a full Linux system on an Intel ARM chip, and I loved it. Great performance and great battery life.


Intel divested their StrongARM/XScale product line.


Yes, just before the iPhone came out and with Apple newly fully engaged as a major Intel CPU customer (for x86 Macs) for the first time ever.

Kind of like Decca Records turning down The Beatles.


It's really sort of been downhill since they decided to play the clock-speed numbers game over all else with the Pentium 4. Even the Core i7/i9 lines that were good for a long time have gone absolutely crazy lately with heat and power consumption.


That's overly reductionist. Conroe topped out at around 3 GHz, compared to its predecessor Presler achieving 3.6 GHz.

I think Netburst mostly came from a misguided place where Intel thought that clock frequency was in fact the holy grail (and would scale far beyond what actually ended up happening), and that all the IPC issues such as costly mispredicts could be solved by e.g. improving branch prediction.
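
To put that clock-versus-IPC trade in plain numbers (a toy model, perf ≈ IPC × clock, using the Conroe/Presler clocks quoted above; the breakeven figure is arithmetic, not a measurement):

  # Toy model: performance ~ IPC * clock. Clock speeds are the ones
  # quoted above; everything else is illustrative, not measured data.
  presler_ghz, conroe_ghz = 3.6, 3.0

  # For Conroe to beat Presler despite the lower clock, its IPC had
  # to exceed Presler's by more than the clock deficit:
  breakeven_ipc_ratio = presler_ghz / conroe_ghz
  print(f"Conroe needed > {breakeven_ipc_ratio:.2f}x Presler's IPC to come out ahead")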


It is exactly that short-sighted MHz-over-all-else attitude I'm referring to as a fatal mistake.


Intel's market reality is that (perceived) speed sells chips.

It's embarrassing when they go to market and there's no way to say it's faster than the other guy. Currently, they need to pump 400W through the chip to get the clock high enough.

But perf at 200W or even 100W isn't that far below perf at 400W. If you limit power to something like 50W, the compute efficiency is good.
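
A back-of-envelope sketch of why (my own rough model, not Intel's actual curves): dynamic power goes roughly as V^2 * f, and voltage scales roughly with frequency, so performance falls off only as the cube root of the power limit:

  # Rough cube-root model: P ~ C * V^2 * f and V ~ f imply P ~ f^3,
  # so perf ~ P^(1/3). The numbers are illustrative only.
  def relative_perf(power_w: float, ref_power_w: float = 400.0) -> float:
      """Performance relative to the 400W configuration."""
      return (power_w / ref_power_w) ** (1.0 / 3.0)

  for watts in (400, 200, 100, 50):
      print(f"{watts:3d}W -> ~{relative_perf(watts):.0%} of peak perf")
  # 400W -> 100%, 200W -> ~79%, 100W -> ~63%, 50W -> ~50%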

Contrast that with Apple: they don't have to compete in the same way, and they don't let their chips run hot. There's no way to get the extra 1% of perf if you need it.


Oh, I'm quite well aware. I traded a space heater of an i9/3090 tower for an M1 Studio.

The difference in performance for 95% of what I do is zero. I even run some (non-AAA) Windows games via CrossOver, and that's driving a 1440p 165Hz display. All while it sits there consuming no more than about 35W (well, plus a bit for all my USB SSDs, etc.), and I've never seen the thermals much past 60C, even running natively accelerated LLMs or highly multithreaded chess engines and the like. It usually sits at about 40C at idle.

It's exactly what almost-40-year-old me wants out of a computer. It's quiet, cool, and reliable - but at the same time I'm very picky about input devices, so a bring-your-own-peripherals desktop machine with a ton of USB ports is non-negotiable.


I remember when they did random stuff like the whole IoT push (frankly, their offerings made no sense to humble me... Microsoft had a better IoT story than Intel). They did drone crap... gave a kick-ass keynote at CES, I recall... it also made little sense. Finally, the whole FPGA thing makes little sense. So much value being destroyed :(


The Altera (FPGA) acquisition could have made sense, but they never really followed through and now it's being spun off again.


There were some technical issues with the follow-through that they didn't foresee. CPUs need to closely manage their power usage to extract maximum computing power, and an on-die FPGA means leaving a big chunk of static power on the table in case the fabric needs it. That mostly killed the idea of putting an FPGA on the die.

Regarding other plans, QPI and UPI for cache-coherent FPGAs were pretty infeasible at the sluggish clock speeds the logic fabric needs to run at. CXL doesn't need a close connection between the two chips (or the companies), and just uses the PCIe lanes.

FPGA programming has always been very hard, too, so the dream of FPGAs everywhere is just not happening.


That was not the point of the Altera acquisition. The point was to fill Intel's fabs, but the fab fiasco left Altera/Intel-FPGA without a product to sell (Stratix 10, on 10nm, was delayed for years because of it). Meanwhile, Xilinx was racing ahead on TSMC's ever-shrinking process.


I remember when they bought a smart glasses company, then refunded every buyer the full retail price. There hasn't been an Intel acquisition that has worked out in some 20 years now, it seems. Just utterly unserious people.


> There hasn’t been an Intel acquisition that has worked out in some 20 years now it seems.

Maybe Habana Labs?

I can't really tell if it's working out for Intel, but I do hear them mentioned now and then.


Isn't that true for virtually EVERY big tech merger? Like, which ones have actually worked?


Google built its core advertising ecosystem on acquisitions (Applied Semantics, DoubleClick, AdMob, etc.) and extended it into the mobile space by buying Android.


Youtube was also an acquisition.

Haven't heard much about successful Google acquisitions lately though.


Facebook bought Instagram and WhatsApp and they were both home runs. Zuckerberg knows how to buy companies.


Facebook bought their eventual competitors by making an offer they couldn't refuse. Zuck knows Metcalfe's law.
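
(For the unfamiliar, Metcalfe's law says a network's value grows roughly with the square of its user count, so merging two networks is worth more than the sum of the parts. A toy illustration, with made-up numbers:)

  # Metcalfe's law, toy version: value ~ n^2 for n users.
  # User counts are arbitrary illustrations, not real figures.
  def metcalfe_value(users: int) -> int:
      return users ** 2

  big, small = 1_000, 100
  separate = metcalfe_value(big) + metcalfe_value(small)
  combined = metcalfe_value(big + small)
  print(f"Combined network is worth {combined / separate:.2f}x the two apart")
  # ~1.20x: absorbing even a small rival's users adds outsized value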


Instagram and WhatsApp are still popular with consumers though. Meta didn’t kill them, if anything they’ve grown significantly.


That’s a different type of acquisition, right? Buying your competition. If nothing else you’ve wiped out a competitor.


Even that sometimes flops (HP/Compaq)


Mostly true, but there are exceptions:

Apple does really well on its rare acquisitions, but they aren't very public, as the companies get successfully absorbed. PA Semi, Intrinsity, and more I can't remember.

ATi and Xilinx have by all accounts worked out really well for AMD.


The iPod


Android and PA Semi have worked out pretty well...


There are the occasional good ones, like Instagram.

But I guess that's the problem - I had to provide an example.


Broadcom is a good example of successful mergers.


Nvidia Mellanox


I was a process engineer there in the early 2000s; they did crazy random shit then too! They had an 'internet TV' PC that was designed to play MP4s in 2001.


He was right but for the wrong reasons.

Had Intel figured out hyperthreading security and avoided all the various exploits that later showed up …


Then they would have worse-performing chips and the market wouldn't care about the security benefits. Cloud providers may grumble, but they aren't the most important market anyway.


Has there ever been an exploit in the wild for Rowhammer or whatever the other vulnerabilities were?


Intel pivoting to GPUs was a smart move but they just lacked the tribal knowledge needed to successfully ship a competitive GPU offering. We got Arc instead.


Isn't Arc actually pretty okay?


They mostly work now, and they are decent options at the low end (what used to be the mid-range: $200), where there is shockingly little competition nowadays.

However, they underperform greatly compared to competitors' cards with similar die areas and memory bus widths. For example, the Arc A770 is 406mm^2 on TSMC N6 with a 256-bit bus, and it performs similarly to the RX 6650 XT, which is 237mm^2 on TSMC N7 with a 128-bit bus. They're probably losing a lot of money on these cards.
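
Putting rough numbers on that, using the die sizes quoted above and treating the two cards' performance as roughly equal (a back-of-envelope comparison, ignoring the process-node difference):

  # Back-of-envelope area efficiency from the figures above.
  a770_mm2, rx6650xt_mm2 = 406, 237

  print(f"A770 uses {a770_mm2 / rx6650xt_mm2:.2f}x the die area")         # ~1.71x
  print(f"Perf per mm^2: ~{rx6650xt_mm2 / a770_mm2:.0%} of the 6650 XT")  # ~58%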


It's getting better, and the drivers are improving all the time. I personally liked the Arc for the hardware AV1 encoding. Quick Sync (I use qsvencc) is actually pretty decent for a hardware encoder. It won't ever beat software encoding, but the speed is hard to ignore. I don't have any experience using it for streaming, but it seems pretty popular there too. Nvidia has NVENC, and reviews say it's good as well, but I've never used it.
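
For anyone who wants to try hardware AV1 on Arc without qsvencc, here's roughly what it looks like driving ffmpeg's Quick Sync AV1 encoder from Python (a sketch; it assumes a reasonably recent ffmpeg built with QSV/oneVPL support and an Arc GPU present, and the filenames are placeholders):

  # Sketch: hardware AV1 encode via ffmpeg's Quick Sync (QSV) encoder.
  # Assumes an ffmpeg build with QSV support and an Arc GPU present;
  # "input.mp4" / "output.mkv" are placeholder filenames.
  import subprocess

  subprocess.run(
      [
          "ffmpeg",
          "-i", "input.mp4",   # source file (placeholder)
          "-c:v", "av1_qsv",   # hardware AV1 encode on Quick Sync
          "-b:v", "6M",        # target bitrate; tune to taste
          "-c:a", "copy",      # pass audio through untouched
          "output.mkv",
      ],
      check=True,
  )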


This. If you follow GamersNexus, there are stories every month about just how much the Arc drivers have improved. If this rate continues and the next-gen hardware (Battlemage) actually ships, then Intel might be a serious contender for the midrange. I really hope Intel sticks with it this time as we all know it takes monumental effort to enter the discrete GPU market.


When it works, perhaps


Arc seems aimed more at where GPUs will be in another 2-6 years, where Arc's second or third iteration might be more competitive: Vulkan- and future-focused, fast enough that translation layers for old <= DX11 / OpenGL titles are worth it.

If you're hoping for an Nvidia competitor, the margins in that market may be higher per unit, but there's already a 1-ton gorilla there, and AMD can't seem to compete either. Rather, Arc makes sense as an in-house GPU unit to pair with Intel's existing silicon (CPUs), and as low/mid-range dGPUs to compete where Nvidia has left the market and where AMD has a lot of lunch to undercut.


One unfortunate note on Nvidia data center GPUs: to fully utilize features such as vGPU and Multi-Instance GPU, there is an ongoing licensing fee for the drivers.

I applaud Intel for providing fully capable drivers at no additional cost. Combined with better availability for purchase, they are competing in the VDI space.

https://www.intel.com/content/www/us/en/products/docs/discre...


MBAs eating the world one acquisition at a time.


He was a process engineer


The CEO at the time of the McAfee acquisition was Paul Otellini -- an MBA: https://en.wikipedia.org/wiki/Paul_Otellini.


Intel has an amazing track record with acquisitions -- almost none of them work out. Even for the tiny fraction of actually good companies they acquired, the Intel culture is one of really toxic politics, and it's very hard for acquired people to succeed.

I wish Pat well, and I think he might be the only one who could save the company, if it's not already too late.

Source: worked with many ex-Intel people.

POSTSCRIPT: I have seen from the inside (not at Intel) how a politically motivated acquisition failed utterly spectacularly due to that same kind of internal power struggle. I think there are some deeply flawed incentives in corporate America.


Not gonna lie, I had a professor who retired from Intel as a director or something like that. Worst professor I had the entire time. We couldn't have class for a month because he 'hurt his back,' then half of us saw him playing a round of golf two days later.


It's never too late to go back to school.


McAfee was Renee James's idea; she was 'two in a box' (Intel-speak for sharing a management spot) with Brian Krzanich.


Intel should have bought Nvidia.

And acqui-hired Jensen as CEO.


I've heard the reason AMD bought ATI instead of Nvidia is that Jensen wanted to be CEO of the combined company for the deal to go through. I actually think AMD would be better off if they had taken that deal.

Prior to the ATI acquisition, Nvidia had actually been the motherboard chipset manufacturer of choice for AMD CPUs for a number of years.


ISTR there was sort of a love-hate relationship with a lot of nVidia chipsets.

nVidia always had the trump card of saying "if you want SLI, you have to buy our chipset." But conversely, a lot of the options weren't great. VIA tended to alternate between decent and incompetent chipsets, SiS was mostly limited to the bottom of the market, and ATI's chipsets were very rare.


AMD is doing fantastically and its CEO is great. It would be a big letdown if they had bought Nvidia, as we'd have a single well-run company instead of two.


Would that make them Ntel or Invidia?


Invidia is sort of how nvidia gets pronounced anyway, so I’d go with that one. Ntel sounds like they make telecommunications equipment in 1993.


It would have had the same fate as the NVIDIA ARM deal.


Unlikely, with AMD owning ATI. The reason Nvidia was blocked from buying ARM was the many, many third parties building chips off ARM IP. Nvidia would have become their direct competitor overnight, with little indication it would treat third parties fairly. Regulators were rightly concerned it would kill off third-party chips. Not to mention the collective lobbying might of all the vendors building ARM chips.

There were and are exactly zero third parties licensing nvidia IP to build competing GPU products.


one example would be the semicustom deal with mediatek

https://corp.mediatek.com/news-events/press-releases/mediate...

like it's of course dependent on what "build competing products" means, but assuming you mean semicustom (like AMD sells to Sony or Samsung) then nvidia isn't as intractably opposed as you're implying.

regulators can be dumb fanboys/lack vision too, and nvidia very obviously was not spending $40b just to turn around and burn down the ecosystem. Being kingmaker on a valuable IP is far more valuable than selling some more tegras for a couple years. People get silly when nvidia is involved and make silly assertions, and most of the stories have become overwrought and passed into mythology. Bumpgate is… something that happened to AMD on that generation of gpus too, for instance. People baked their 7850s to reflow the solder back then too - did AMD write you a check for their defective gpu?

https://m.youtube.com/watch?v=iLWkNPTyg2k


Maybe. However, the GPU market was not considered so incredibly valuable at the time (particularly by, e.g., politicians in the US, Europe, or China). Today it's a critical national security matter, and Nvidia is sitting on the most lucrative semiconductor business in history. Back then it was overwhelmingly a gaming segment.



