Yeah John Gruber has really become a shill over the last few years. His article reads like an Apple keynote transcript: everything is magical and amazing, a marvel, mind-bogglingly great. No wonder they throw him so many exclusives ;)
Yes M1 is great. But it's a closed system (no booting Linux or Windows at this time), and even under macOS you have even less access to the system's internals than before. Also, Apple performs many tricks to get such long battery life, such as putting apps to sleep when you're not looking at them (App Nap). I find this extremely irritating when working with my Mac, but I can't turn it off completely even though I have a mini and thus battery life is not a factor at all. My work MacBook spends its entire life docked with the screen closed too.
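(For what it's worth, the opt-out mechanism is in the app developer's hands rather than the user's: an app can declare itself busy via the documented NSProcessInfo activities API so the system won't nap it mid-work. A rough Swift sketch, just to show the shape of the API:)

    import Foundation

    // Rough sketch of the per-app escape hatch: declare an "activity" so the
    // system won't nap this app while real work is in flight, then end it.
    let token = ProcessInfo.processInfo.beginActivity(
        options: [.userInitiated, .latencyCritical],
        reason: "Long-running work that should not be throttled"
    )

    // ... do the work that must not be napped ...

    ProcessInfo.processInfo.endActivity(token)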
M1 is a solid platform for sure, better for laptops than Intel/AMD's offerings, I agree. But I've had enough of Apple's (and Gruber's) hypey marketing. It's not magic, just good engineering. I stopped watching their keynotes for the same reason.
> John Gruber has really become a shill over the last few years.
I think it's important to read Gruber and understand he's an Apple person. Not a shill so much as someone whose world is all Apple. Sure, he checks out an Android phone every couple of years, but for the most part he doesn't care what's going on outside the Apple ecosystem. And that's fine. As a reader of Gruber, I read for his takes about Apple.
That's fair enough; I have occasionally read his blog over the years (mainly linked from other sites such as MacRumors). That's also how I saw how many scoops he's been given by them (because if they were actual leaks, I'm sure he would have been cut off by now, knowing Apple's secrecy). But I don't remember him waxing as lyrical about Apple as he does here.
I used to be an Apple fan, and the Mac used to be my daily driver (even though I've always been an "every OS at the same time" guy). But my enthusiasm has waned a lot lately due to the increasing lock-in and the 'iPadification' of the Mac, and especially due to the stronger ties with Apple services: as an "every OS" guy this obviously doesn't work for me. Basically, it is getting decidedly harder to use Apple stuff well if you don't want to do things exactly "the Apple way". The Mac didn't use to be like that. It was more like a general unixy OS that wasn't so opinionated.
So, as my opinion has changed, it is possible his admiration for Apple is irking me more than before :) Sorry if I offended anyone by calling him a shill. But he adopts Apple's flowery language filled with superlatives, as if their products are some kind of magic inconceivable to their competitors. That rubs me the wrong way, as I do consider the world outside Apple, which is not bad at all.
What lockdown is grating to you? I grew up on BSD multi-user systems and Apple things still look the same to me: crusty BSD with ancient coreutils in the terminal, nice rounded corners in the GUI.
What is the iPad-ification? Just the code signing stuff? I guess you can't write to /System anymore these days? And there's something about app notarization, but it doesn't really affect any of my uses.
Almost all GNU userland tools and libraries work on BSD, and you can install them fairly easily with brew. Unless you are doing Linux kernel development, there is little you'll miss out on.
I'm typing this on FreeBSD. If the most interesting part of a Mac to you is the BSD layer ... Well, I doubt that. People like Mac for the UI and Cocoa stuff, plus hardware and integration thereof.
The way you've substituted "BSD" for macOS really doesn't make sense. Yes, there is BSD code in macOS, quite a bit of it in fact. But they're not the same thing.
You are right that people that buy a Mac don’t get it for the FreeBSD kernel. However, most people that complain about not being able to run Linux on a M1 can be just as productive with the alternatives on macOS.
For 99% of other things that run in a terminal, you can get them to work on FreeBSD/macOS. Also, for Python folks, there is the Anaconda distribution, which is agnostic to the underlying OS. Ditto for Node or Ruby development. GUI Emacs and vi also have decent ports to macOS.
That said, I guess if you are doing some custom server side development that really needs Linux, you will have a hard time with M1 - virtualization and emulation don’t mix and your code might not be portable enough.
Again as a frequent FreeBSD user, "FreeBSD/macOS" is not really a thing. There is historical shared code but they are different beasts. Sometimes there is substantial porting involved to get things from one to the other and they are not really the same entity. Additionally, the BSD pieces of macOS are frequently out of date and a hodgepodge of components.
Personally, I have very frequently had the experience that Unixy stuff that "just works" on *BSD or Linux does not work on Mac straight away, or breaks with a new OS release, etc. It takes effort for the homebrew folks or the macports folks or whatever to keep stuff working.
Mostly agree, and upvoted; however, I think it's important to acknowledge (in a thread about M1 Macs) that full brew support will be a large undertaking for M1.
It seems to be going well so far. I would say it's around 70% of the way there[0].
What's left is a few major runtimes (OpenJDK; Go is waiting for 1.16 to be released). Many of the packages that are not properly supported yet require Go [4], Rust [2], Mono [3], or Java to support arm64.
It's impressive how much work Apple has done behind the scenes to get M1 support into many open source packages, [1] for example.
Note: You can’t copy passwords stored in your Local Items or iCloud Keychain. To transfer these keychain items to another computer, set up iCloud Keychain on the other computer using your iCloud user name (normally your Apple ID) and password.
You can manually copy keychains other than Local Items or iCloud Keychains to another Mac using the steps below.
Ah, good find! It appears that the Big Sur documentation notes this right up top, whereas for the earlier OS versions it's relegated to a note at the bottom.
Either way, I'm not sure this counts as a warning! It rather reminds me of:
As you are probably aware, plans for the development of the outlying regions of the galaxy involve the building of a hyperspace express route through your star system. And your planet is one of those scheduled for demolition.
(Shouts of terror emit around the globe.)
There's no point acting all surprised about it; the plans and demolition orders have been on display at your local planning department in Alpha Centauri for fifty of your earth years. If you can't be bothered to take an interest in local affairs, that's your own lookout.
Most people setting up their keychain without iCloud will never see this, and there's not even an explicit mention here of the issue with backups (though one could sort of infer that).
Agree. He is an Apple guy. He delights when Apple does cool stuff. Sometimes he takes Apple’s side on debatable subjects. But he has no issue calling Apple out when he thinks they are doing wrong. It’s unfair to call him a shill.
Live by the sword, die by the sword. Fully-committed shills are not shills.
Edit to clarify: I meant that a shill with skin in the game is less of a shill. Of course a shill can be "fully committed" to shilling, but if they are fully committed to the thing which they are promoting, then they are not a shill in my eyes.
> Yeah John Gruber has really become a shill over the last few years
I remember reading sentiments to that effect since 2004. I doubt he's getting paid directly, so he could only be a shill in a very loose sense of the word. He does get paid in review units and access, though.
I think he's maybe overly enthusiastic about the good stuff, and skews toward the positive when matters are questionable but he has called out Apple's corporate malfeasance and pointed out technical shortcomings in the past. He's biased but that's a one man blog for a specific audience.
Personally I do think that that very access to inside information makes him at least partially an Apple-controlled outlet. He's released some serious scoops about upcoming products, and his verification of a rumour is basically taken as confirmation in the Mac rumours scene.
It's of course possible he has some kind of access that Apple don't control (like an employee leaking information), but given their extreme focus on secrecy, I'm sure that this would have long been shut down if it wasn't fully authorised.
That kind of access does not come without ties, IMO. Though I agree I was coming on too strong when I said shill.
> he has called out Apple's corporate malfeasance and pointed out technical shortcomings in the past.
He does this all the time. In his last podcast or two he has complained about Apple's keyboards, hot laptops, short charger cords, Siri and the Touch Bar. I'm sure there are others but I haven't gone back.
FYI, they have to ship the review units back. Of course there is a financial aspect to the fact that he gets the unit first, can publish the review, and gets the eyeballs, but it's just something to know about Apple: they don't give anything away.
It doesn't matter. Sure, that might mean that an Apple system is not for you.
But the crux of the matter is: Apple did something truly revolutionary. How is the rest of the market going to respond? Specifically, how are Intel, AMD and Microsoft going to respond? Microsoft, at least, has Windows running on ARM, but it doesn't have Rosetta. Intel and AMD have nothing remotely comparable.
This is hyperbole. Apple has released a CPU that has very good perf/watt compared to its current competition. That's it. This isn't revolutionary; it's just another incremental step in CPU improvements.
Can we please give Apple praise for competing well in the hardware market without fully drinking the Kool-Aid and parroting their marketing?
Sounds revolutionary to me. The new M1 is faster and MUCH lower power than an i9-based system that would ramp up its fans at even the most minor of workloads, like, say, upgrading iTerm2 from version n to version n+1.
Suddenly $700 Mac minis can do crazy things like edit 4K video, and the M1 laptops can get away with being plugged into wall power much less often than the i9 systems.
Within 12 months there are likely to be systems with twice as many cores, bringing unheard-of levels of performance to reasonable price points.
Keep in mind Intel hasn't shipped something significantly faster than the previous generation in 5 years.
I would like the people downvoting you to explain why they think it's truly revolutionary. I just went back to reread the Tom's Hardware article on the chip, thinking I must have missed something, and I'm not seeing it.
Full credit to Apple for incrementally improving their A-series processors over the last decade until they're good enough for laptops. But I'm not seeing that this makes anything fundamentally different. Just compare the A14 with the M1, for example: https://www.cpu-monkey.com/en/compare_cpu-apple_m1-1804-vs-a...
>I would like the people downvoting you to explain why they think it's truly revolutionary.
For one, it's the first PC chip with this kind of thermals-and-power combo.
Second, it brings many new design decisions, from unified memory to the SoC design, various coprocessors, etc., that have not been available in the form of laptop/desktop PCs. The last of this kind in the mass market would be something like the Amiga, which is still a very different design.
Third, it's hella fast at many different things, both given its power draw and irrespective of it.
Fourth, it suddenly makes ARM for PC computing mainstream, after decades where it was relegated to the mobile and hobbyist/specialist (Raspberry Pi, etc.) space.
If that's not a revolutionary change, I don't know what you expected. A CPU made from flour with alien technology that's 1000 times faster and uses quantum bits?
Like the iPhone and iPod, it's a revolutionary entrant from the consumer point of view in terms of user experience. And like the iPhone and iPod, from a technical point of view, it's merely a very competent iteration down a natural evolutionary pathway, in a preexisting product type.
And like the iPhone and iPod it has very real innovations (in engineering) that people always underplay, because "No wireless, less space than a Nomad, lame!", or "we had smartphones before, I had some Ericsson/Nokia/MS gizmo, so what's revolutionary about the iPhone?".
You're right. At the end of the day, they're technically evolutionary, not revolutionary, which was the central contention of this subthread. And there's nothing wrong with that. Technical evolution can lead to consumer revolutions.
I disagree. The iPhone was revolutionary in that it made smartphones and phone apps accessible to the mass market, unleashing a wave of change in how people live, work, and play.
M1 Macs will be used basically like the previous Macs, but somewhat faster and somewhat longer without plugging in. So I don't think this qualifies as a revolutionary change.
I think the M1 Macs are a precursor to a truly revolutionary (for consumers) Apple ARM computer. Whether that will be realized with the M1X/M2 next year or further on remains to be seen.
These all strike me as evolutionary changes. People will be doing about the same things as before. They will do them a bit faster, and not have to plug in as often.
I expect revolutionary changes to overturn an existing order. E.g., the Mac was revolutionary. Smartphones were revolutionary. This is a solid and impressive evolutionary change. Kudos to them, of course. But when actual revolutions happen, I would like to have a word handy to acknowledge that.
Smartphones can be just as easily trivialized into being something which lets us "do the same things as before, but faster and with fewer plugs".
I think it is wrong to say there is a clear difference between these kind of changes. We won't really know how "revolutionary" M1 was until we see the landscape of the market in the future, but I think the simple fact of dropping x86 alone is enough to enable some major new ways of looking at desktop computing.
I think that's incorrect. As a user of a pre-iPhone smartphone, I don't think Apple should get credit for inventing the smartphone, but I do think the iPhone was revolutionary. They didn't just popularize it; they created a true consumer-grade device via relentless user-focused polish. It's the same deal with the original Mac, which was also revolutionary.
But here, there's nothing radically different about M1 Macs that will open up vast new markets or notably change the daily lives of a purchaser.
I'm glad we both agree that revolution is not the right word. I think precision in language is something that is generally valued at HN, especially when the lack of precision is marketing hype.
Exactly how is it much better? It's not the first chip with a 15-25W TDP. When compared to other chips with such a TDP, it's incrementally faster in single core and it trades blows in multicore. It does this at a process node ahead.
For very light loads, it does have a slightly better power performance than existing low power PC processors, but not by that much. A Ryzen LPP core is somewhere between the big and the small cores of the M1 in power performance in such cases.
It's not the first PC chip to be an SOC, or to have unified memory, or to have many coprocessors. Literally all of those were done simultaneously, in chips of comparable performance, at the same power.
We've also known for a decade that instruction sets don't matter much. All modern x86 processors use a different internal instruction set (micro-ops) and translate from x86. Switching them to ARM would be a matter of adding a few extensions to expose all the functionality and changing the instruction translation stage a bit.
>Exactly how is it much better? It's not the first chip with a 15-25W TDP. When compared to other chips with such a TDP, it's incrementally faster in single core and it trades blows in multicore. It does this at a process node ahead.
Being a process node ahead is already "much better", and IS a feat. Where are the process node ahead chips from the competitors?
On top of that, it's Apple's own SoC design, it has its own GPU design and co-processors, tons of work into making speed optimizations all around (from the buses to memory handling), and came with a new OS release, a port to the new architecture, a bridge layer for iOS apps, and a perfectly capable x86 translation layer.
Rosetta 2 alone would be difficult to pull off for most of the industry (Microsoft failed at the same, and their ARM laptops are completely subpar to the M1 machines on many levels as well, despite costing the same or more).
>It's not the first PC chip to be an SOC, or to have unified memory, or to have many coprocessors.
Which mainstream PC chip in actual laptops/desktops people use has unified memory and doesn't use sockets/DIMMS?
>We've also known for a decade that instruction sets don't matter much. All modern x86 processors use a different internal instruction set (micro-ops) and translate from x86.
That's a moot point, since they still carry all the x86 baggage in microcode, plus tons of bad decisions that led to things like Spectre. ARM doesn't.
In terms of GPU cores it seems evident that Nvidia is clearly number one—disregarding any thermal limits—and AMD is clearly number two. It seems Apple is a solid contender for third place now. I don't think Intel or the smaller embedded players (Mali, PowerVR etc) have anything quite so good.
Dunno, do you want an RTX 3070 for $499 or an AMD 6800 for $580? The benchmarks come out at about the same per $ (AMD wins by 10-15%). Sure, you could get the 3080 ... or the 6800 XT. Pick your price point and AMD and Nvidia will have something close. Sure, AMD has nothing to compete at the $1500 price point, but the reviews I've seen claim that the $1500 Nvidia is just a big waste, not close to justifying the price.
The PS5, Xbox Series X/S, and Apple have been preferring AMD lately. Even Tesla seems excited by the RDNA2 cards.
>Being a process node ahead is already "much better", and IS a feat. Where are the process node ahead chips from the competitors?
Not if you still don't beat them cleanly. The end result is what matters. Also, being able to pay more than other people isn't a technical feat.
>On top of that, it's Apple's own SoC design, it has its own GPU design and co-processors, tons of work into making speed optimizations all around (from the buses to memory handling), and came with a new OS release, a port to the new architecture, a bridge layer for iOS apps, and a perfectly capable x86 translation layer.
Half of that is software, which I didn't mention, and the rest is par for the course. The buses and interconnects in a Zen chip are far superior, which you notice as better I/O.
>Which mainstream PC chip in actual laptops/desktops people use has unified memory and doesn't use sockets/DIMMS?
You didn't mention sockets/DIMMs before, but literally any AMD APU has unified memory, and they use memory DIMMs. Seems like the best of both worlds to me. Just as good memory performance, and it's upgradeable.
>That's a moot point, since they still carry all the x86 baggage in microcode, plus tons of bad decisions that led to things like Spectre. ARM doesn't.
AMD is not affected by Spectre. ARM also uses speculative execution.
> they still carry all the x86 baggage in microcode, plus tons of bad decisions that led to things like Spectre. ARM doesn't
> AMD is not affected by Spectre
This is not true. AMD, ARM, Intel, and IBM are all affected by Spectre (Bounds Check Bypass) and that particular variant has not been fixed in hardware, and short of disabling speculative execution or radical changes to ISA or microarchitecture, is likely to never be fixed purely in hardware.
AMD's new CPUs, post-FX, are not affected by the majority of Spectre variants. The ones that do affect them are fixed in software with essentially no performance penalty. Less than even ARM CPUs.
> Exactly how is it much better? It's not the first chip with a 15-25W TDP. When compared to other chips with such a TDP, it's incrementally faster in single core and it trades blows in multicore. It does this at a process node ahead.
Ok. Where can I buy a laptop with similar performance, battery life and cooling then?
Same arguments came out with the first iPhone. “It’s just a big screen! Other phones have a big screen!”
Similar performance and cooling? Buy a Ryzen 4800U laptop.
Similar battery life? A good part of it is down to the software, sadly.
For the trade-off of 8 instead of 12 hours of battery life, you'll have a repairable and open computer. Seems competitive. You also get upgradeable RAM, a GPU that is actually useful because it has drivers for a useful API, and much more I/O.
M1 is somewhat slower in multicore, and has a fifth of the I/O performance. They are comparable.
Importantly, you're comparing the M1 in a Mac Mini, to the 4800U in a laptop. Comparing the M1 in the Air, for example, the 4800U is quite a bit faster in multicore and the single core performance advantage shrinks.
As for 2 and 3, you're comparing to AMD, not to Intel. AMD CPUs have significantly better power efficiency than Intel.
If you want to compare battery life without taking into account software, compare power consumption. The Zen chip uses slightly more power than the M1 on a laptop.
No I'm not. I'm specifically considering two chips.
Those Macs have both much better software for power management, and much more battery. You can't compare chip power consumption by comparing total endurance of the system.
When you do compare the two, they're within 30% of one another.
Read the AnandTech article again. 15-25 W is the power consumption of the whole Mac mini measured at the wall plug, i.e. it includes the power supply, SSD, a GPU comparable to an RX 560, external buses, etc. If you want an x86 chip that beats its performance by a significant margin, you have to pay more for just the chip than for the whole Mac mini.
I would like people arguing the opposite to define "revolutionary".
The combination of the performance increase with the power and thermal efficiency in the first release for a laptop is the sort of improvement we haven't seen in a very long time.
> I'm not seeing that this makes anything fundamentally different
I guess we'll see. $1000 buys a lot more laptop utility than it did a month ago, and the abrupt availability of a laptop with higher performance and far better battery life than anything else on the market has certainly gotten a lot of people's attention.
I'm not buying one[1], but I'm really curious to see the M2.
[1] I'm really torn; the atrophy of the Unix side of the OS and the increasing lockdown have frustrated me a lot. But the option of just running macOS as a hypervisor that happens to run Photoshop may be just fine for me... I'll decide when the current machine dies.
Sure. To me evolutionary generally means quantitative improvements in expected dimensions. Revolutionary requires overturning some existing order. E.g., the American Revolution replaced the existing government.
I'd call the original Mac revolutionary, in that it totally changed how people interacted with computers. Laptops were revolutionary. Smartphones were revolutionary.
A new generation of laptop that runs somewhat longer without plugging in is not revolutionary. People will be doing the same things.
All-day, on-battery, real-world heavy usage without significant compromises. In fact, not just without significant compromises, but about as fast as a laptop can go.
For me, not constantly thinking about the battery is a big deal. I had the same thing happen with my phone: a couple of years ago its battery got large enough to last a full two days, so I could assume it would work all day.
Same deal here. I can go (in a !2020 world) to a customer, go from meeting to meeting, etc., and it will just work.
edit:
Also, it feels like we're transitioning from a world in which laptops are mostly / often used while being plugged in, to a world more like a phone. The default mode of operation is disconnected from a power source.
I agree that's an excellent step forward, but again, I'm not seeing it as a revolution. All-day batteries were always available in the Android world, so I don't think there was any revolution there. Similarly, there are plenty of laptops with all-day batteries. That list includes, but is not topped by, both M1 and previous Macs: https://www.laptopmag.com/articles/all-day-strong-longest-la...
Ah. I found my macs to have all day battery as long as I didn't actually really use it. So it could kind of happen, but only if I paid a lot of attention. I totally recognize this may not make a diff for everyone, but for me, it's moving to assuming the battery lasts all day w/o me managing it. :shrug:
> "Microsoft, at least, has Windows running on ARM, but it doesn't have Rosetta"
I thought that Microsoft's emulation efforts were hampered by Intel making threats, e.g. "Intel recently made an unprecedented public challenge to Microsoft and Qualcomm that basically told the latter two companies: if you ship an x86 instruction set architecture (ISA) emulator, we’re coming after you.", from https://www.forbes.com/sites/tiriasresearch/2017/06/16/intel...
If so, the blame for that really lies at Intel's doorstep as well.
Intel still has patents on things like some of the newer vector instructions - Apple AFAIK will not translate those, and will not report them as available.
Microsoft has a performance disadvantage in that the memory consistency model of ARM is much weaker than x86's strong ordering or x86_64's total store ordering. Apple's chips (at least back to the A12Z) support enabling TSO per thread on the performance cores, so the translated code does not need as much fencing.
Did Intel similarly threaten Apple? If not, why not? Even if they didn't know for sure exactly what Apple was up to with the M1, they surely could have speculated, as so many others did.
Rosetta 2 is running on an Apple chip which they've designed to have an x86-style strong memory model mode. Microsoft has to run on off-the-shelf ARMv8 chips.
What is revolutionary about it? We still haven't seen Zen in the same node as the M1. Zen still scores higher in absolute terms (albeit at higher power).
It's evolutionary. The power efficiency is great, and the core design is great (wouldn't expect less from those who were Intel's and Qualcomm's most talented staff), but calling it revolutionary is an overstatement.
Using an M1 Air, coming from a 16" Pro, here's the diff:
- Cost: $1400 vs $4000 (nearly 1/3)
- Speed: the Air feels significantly faster, and as more apps move to native, it will be even more so
- Battery: At least double, if not 3x
- Size: Less than 1/2 the weight
The revolutionary thing is doing all of those at once. Nearly tripling battery life while making a huge leap in performance, at 1/3 the cost and half the weight is simply incredible.
Comparing it to previous-gen Apple devices doesn't really tell the whole story. You can get a machine from a different manufacturer with similar hardware for much, much less than $4000, and it won't have totally broken cooling[0].
Bear in mind that I'm not arguing that the m1 macbooks aren't a really compelling option. I'm just saying this is evolutionary, not revolutionary.
(Making other laptop/desktop manufacturers and OS devs wake up to non x86 architectures could be revolutionary for competition and pricing though, we'll see.)
For years we've been evolving. I've owned ~6 laptops over the last 10 years, all Intel Macs. Each year you get about a 20% performance increase for the same "costs" (weight, battery, price).
This year you get about a 100% performance increase and the costs are all halved. Ok, so... we'll call it a "huge" evolution. All I know is I've been waiting since the Air came out for this laptop to exist: fast enough to handle all my work, with decent battery life. It's been almost a decade of waiting, and not only did it finally come, but it is somehow much faster than even the biggest of the last generation, and super cheap.
If you made a graph of (perf * battery-life) / (price * weight), you'd see a nice evolutionary curve over the last decade, and then this generation you'd see a huge jump - way more than double. You're ignoring the fact that this is a 10W CPU doing more than 28W parts do, fanless! And the diff in subjective performance isn't small; it feels bigger than any previous generation jump by quite a lot...
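Just to make the shape of that jump concrete, here's a rough back-of-the-envelope sketch in Swift. The numbers are not measurements; they're only the approximate ratios claimed up-thread (roughly 2x perf, 2-3x battery, ~1/3 the price, ~1/2 the weight):

    // "Bang for the buck": higher perf and battery are better, lower price and
    // weight are better, so the latter two go in the denominator.
    func value(perf: Double, battery: Double, price: Double, weight: Double) -> Double {
        (perf * battery) / (price * weight)
    }

    // Ratios relative to the 16" Intel MacBook Pro (baseline = 1.0); all
    // hypothetical, taken from the rough claims above, not from benchmarks.
    let pro16 = value(perf: 1.0, battery: 1.0, price: 1.0, weight: 1.0)
    let m1Air = value(perf: 2.0, battery: 2.5, price: 0.35, weight: 0.5)

    print(m1Air / pro16)  // ~28.6 on this metric, vs the ~1.2x/year the old curve gave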
AMD laptops are also crushing this generation of Intel laptops. AnandTech's comparison shows that M1 vs AMD is mostly a power/performance trade-off at the moment.
Not that the M1 isn't exciting especially in the ultrabook segment.
Why are you using Apple’s discontinued 15” model as a benchmark here?
The 16” MacBook Pro most certainly does not have those same cooling issues.
But that’s the thing: Apple had to make their flagship performance laptop bigger including a larger battery than its original design in 2016 because Intel was never able to hit their roadmap goals.
That doesn’t sound like progress to me, it kind of sounds backwards.
In other words, Apple couldn’t give you a reason to buy their performance laptop four years later without making it physically larger. Because if they didn’t do that their customers would only have minimal incremental improvement to compel them to consider a newer system after four years.
In contrast, a four-year-old iPhone is more than twice as slow as the current model from a raw performance perspective.
So yeah, I could buy a much cheaper PC laptop with proper thermals and cooling for much cheaper than a 16” MacBook Pro to get that fantastic performance. Maybe it’d even be a lot better than the M1 Macs.
But I can’t buy a Windows laptop, for any price, that has no fans at all and manages performance and battery life of the MacBook Air.
Apple's throwing a CPU into their cheapest laptop, one marketed for basic usage (students, home users), that benches faster than basically all but AMD's most performant chip - a chip that would probably get you pretty terrible thermals and battery life in a 16" MacBook Pro.
A lot of evolutionary products were, in retrospect, revolutionary. Was the guts of an iPod or Walkman or iPhone or IBM PC or Honda Civic all that different from conventionally available parts? No, consumer electronics almost never have anything particularly exotic going on.
But it’s the whole package that can be revolutionary. The Civic was a dirt cheap car with excellent fuel economy and reliability. Is it all that different from other cars? No, not really, but the entire Japanese auto industry was a disruptive product in the 1980s.
The rumor mill is already telling us that the 16” MacBook Pro may ship with a 12 core (8 big, 4 little) “M1X” or “M2” chip. If the tech media landscape is head over heels for the M1, this new chip is going to make serious waves, because if you double the performance cores in the M1 it’s basically going to be untouchable by anything that can remotely be considered a laptop.
If Apple doesn’t touch the form factor of the 16” MacBook Pro it’s going to have similar results: it’s going to run quieter, cooler, and longer.
In short, it's just my own opinion that calling this merely an evolutionary improvement rather than a revolutionary one is something of an understatement.
> Yes M1 is great. But it's a closed system
...
> Apple did something truly revolutionary
While I don't deny they deserve a lot of credit for this, I think it's important to keep in mind that in many cases Apple does this by exploiting its lack of openness. ie: these things are not unlinked.
So where some people see Apple innovating and achieving breakthroughs, I also see them exploiting and leeching off an ecosystem in a toxic way. It's like a chemical plant that dumps its waste into the local river - they get a free ride (by not complying with interoperability/openness), which is fine if they are a small part of the ecosystem. But if every company was like that we would be living in technology hell. There would be no internet, no ecosystem of reliable affordable server infrastructure, there wouldn't be a single phone you could write an app for if it wasn't in the corporate interest of a mammoth entity, etc etc.
Microsoft does have their own Rosetta, which has been able to run x86 apps for a long time and will be able to run x64 apps within the next few months in preview builds.
Microsoft’s solution is emulation, not binary translation. This is a much slower approach than Rosetta. Hopefully this gives MS the kick in the pants it needs to step up their game though.
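For anyone unfamiliar with the distinction, here's a toy sketch in Swift. It's purely illustrative and has nothing to do with how either Microsoft's layer or Rosetta 2 is actually built; the point is just that an emulator re-decodes every guest instruction each time it runs, while a translator pays the decode cost once and then reuses the resulting host code:

    // A toy guest "ISA": just enough to show the difference in approach.
    enum GuestOp {
        case addConst(Int)   // acc += n
        case mulConst(Int)   // acc *= n
    }

    let guestProgram: [GuestOp] = [.addConst(3), .mulConst(4), .addConst(1)]

    // Emulation: decode and dispatch every guest instruction on every run.
    func emulate(_ program: [GuestOp], acc start: Int) -> Int {
        var acc = start
        for op in program {
            switch op {
            case .addConst(let n): acc += n
            case .mulConst(let n): acc *= n
            }
        }
        return acc
    }

    // Translation: pay the decode cost once, producing "host code" (here, a
    // chain of closures) that runs with no per-instruction decoding afterwards.
    func translate(_ program: [GuestOp]) -> (Int) -> Int {
        let steps: [(Int) -> Int] = program.map { op in
            switch op {
            case .addConst(let n): return { $0 + n }
            case .mulConst(let n): return { $0 * n }
            }
        }
        return { start in steps.reduce(start) { $1($0) } }
    }

    let translated = translate(guestProgram)   // one-time cost
    print(emulate(guestProgram, acc: 2))       // 21, decoded again on every call
    print(translated(2))                       // 21, decoding already paid for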
Their response was probably more along the lines of previous "advertising campaigns" [1] (i.e., payola) to get the big PC manufacturers to avoid using AMD for their high-margin items.
Depends on the price point. Largely Ryzen is winning at the moment but Intel has some compelling options at the low end. E.g. the 10100f looks pretty good if you're not wanting to spend much.
Of course AMD haven't released the 5000-series options at the low end yet, so you can only choose between it and a Ryzen 3000-series CPU.
Tiger Lake laptop processors have been released and they are competitive with Ryzen 4000 mobile processors. All these comparisons of the M1 to 10th-gen Intel CPUs are quite unfair when the 11th-gen 10nm chips are on the market now and have performance characteristics much closer to the M1's.
Disclosure: I just chose a Ryzen 4500 budget machine, and the 1135G7 is comparable.
The M1 as a computer is too locked down for me, and it seems like for you. But the M1 as a demonstration of the power of modern ARM chips is absolutely unmatched. I have seen a lot of people use the lockdown argument to justify a disregard of ARM overall without realizing that no, the open version of this technology will be incredible.
> the M1 as a demonstration of the power of modern ARM chips is absolutely unmatched
I feel like it's more a demonstration of Apple's competency in vertical integration, cultivating talent, reducing the impact of politics on product/technical quality, and ensuring productivity within and between technical teams - aspects where Intel has abjectly failed and hasn't found any solutions.
Intel has tried repeatedly to push into new areas, but they've failed every time. Meanwhile, Apple has somehow figured out how to push into new market segments successfully in a crazy consistent way and the M1 is a clear indication of how healthy and functional they are as a business. Like, when was their last real failure?
Apple is absolutely leading on this front, starting in the phone space. That being said, I don't think their lead over other vendors is especially large. Apple has taken ARM seriously and built performance chips for their mobile devices that, if not for thermal constraints, would handily compete with x86 systems. The same is true of a Snapdragon. Apple's ARM, and certainly the M1, is better than the competition, but that's because no one else has looked for performance from ARM. The chip designers haven't really been pushed yet. Now that the M1 has demonstrated what can happen when the technology is freed from the phone, we'll see people catch up.
I believe Apple has a significant advantage over Qualcomm in designing CPU cores. They acquired the PA Semi team and have been continuously designing great performance cores compared to ARM's Cortex designs (especially for single-threaded work). Qualcomm no longer develops its own CPU cores since the ARMv8 transition.
Oh, I'm not against ARM at all. In fact I have many systems running it :) Raspberry Pis, but also a Pinebook. I'm also getting an M1 mini from work.
I just don't like the almost-religious overhyping of everything Apple. They're doing a great job. They're not perfect.
For example: Moorhead makes a good point IMO. Yes he's talking about software issues. But in the case of Apple, you can't separate those two. The OS doesn't work on other hardware and (in this case) the hardware doesn't work with other OSes. This works to their (huge) advantage in some cases, being able to optimise each component perfectly. But it can be a drawback too like it is here.
> For example: Moorhead makes a good point IMO. Yes he's talking about software issues. But in the case of Apple, you can't separate those two. The OS doesn't work on other hardware and (in this case) the hardware doesn't work with other OSes. This works to their (huge) advantage in some cases, being able to optimise each component perfectly. But it can be a drawback too like it is here.
Moorhead didn't make that point.
Moorhead made the point that a small amount of software that's mostly rarely used on the Mac platform isn't working perfectly. And even the mass-market software he pointed at, like Adobe's, is already in beta, with native versions about to be released.
> I just don't like the almost-religious overhyping of everything Apple. They're doing a great job. They're not perfect.
I'm in the same boat as you and the other OP up-thread. I'm amazed by what the M1 has accomplished, but less so because of what it represents for itself (and for Apple) and more because of what it shows ARM is capable of.
As an architecture, ARM is impressing the Hell out of me as of late. Sure, it's not touching Intel and AMD in terms of raw performance, but once you start looking at the performance-per-watt, the story is absolutely different. x86's power budget is looking awfully bloated by comparison.
What happens if NVIDIA's acquisition of ARM goes through? Do they start producing their own CPUs?
In your quest to avoid almost-religious overhyping of everything Apple, I'd contend that you're presenting the inverse: almost-religious anti-hyping of everything Apple. We can criticise them for their seemingly blinkered focus on casual consumers and end-users at the expense of developers and technologists without finding ways to rationalise away anything good Apple does.
Attributing the positive aspects of the M1 to the ARM architecture is contemporary revisionism. If the instruction set were the key differentiator, the Microsoft Surface RT would have been a runaway success and Qualcomm would have taken over the notebook CPU market by now. Little of what makes the M1 so impressive has anything to do with the ARM instruction set. The impressive parts are things like the entirely in-house GPU and the mild hardware acceleration of Rosetta 2.
It's also worth noting that Apple has been deeply involved in ARM since the late 1980s. They were a co-founder of Advanced RISC Machines Ltd. along with Acorn and VLSI. Apple were deeply involved in development of the ARM6 architecture and the Apple Newton was one of the first major commercial applications. So even if you limit your amazement to the ARM architecture, a portion of that credit goes to Apple anyway.
> In your quest to avoid almost-religious overhyping of everything Apple, I'd contend that you're presenting the inverse
I must point out that I believe you misinterpreted my intent. I'm largely indifferent to Apple. I'm not entirely sure if you'd find that more offensive or less since I'm interpreting this response as one of annoyance (I'd appreciate a clarification to this end).
I don't use Apple products. And you're absolutely right, the M1 is an impressive feat.
Understood. Intent is often difficult to robustly convey without being neurotically verbose.
I promise my response was not borne of annoyance. I just got the impression that you wanted to focus credit on the one thing about the M1 SoC which could plausibly be attributed to anyone other than Apple. That seemed a stretch, is all.
Unfortunately, I know what you mean. I have a tendency toward verbosity that makes it feel that every time I write a post I'm crafting a treatise that might make the great statesmen of old blush (with the exception that I'm far too stupid to make it as interesting or entertaining).
And yes, intent is difficult to convey through text even if you're afforded a great deal of time and effort.
> you wanted to focus credit on the one thing about the M1 SoC
No, not at all. The ML cores and other, err, accessory CPUs they've added are incredibly fascinating. Mostly I was thinking of the ARM portion of the hardware.
Truthfully, I confess that part of my excitement lending itself to ARM is because I genuinely want to see more vendors follow Apple's footsteps and either a) craft their own designs or b) build out ARM-based platforms that will lead toward greater competition in the desktop/laptop markets (or both?). Fortunately, Apple's tendency toward setting market trends gives me some hope that we'll see noteworthy contributions that embarrass x86 and upend the idiotic convention that I think M1 has finally broken. Namely that you can have a chip that works on an incredibly tiny power budget or you can have performance. Apple's proven we can have both, and I think that's a good step forward. It excites me that we may actually see viable competition this decade, and if we do, I'll absolutely credit Apple with upending conventional wisdom and proving that our fixation on x86 for "performance" was myopic and unnecessary.
Maybe that explains my reasoning a bit more. At the expense of being unnecessarily verbose (though you may appreciate that). :)
Also, I owe you an apology. I wrote my reply out of some frustration last night. Regrettably I allowed myself to get sucked into a "discussion" with someone I didn't know was an antivaxxer until I realized their refusal to listen to an explanation of microbiology wasn't born out of lack of interest but rather suspicion. When I saw your post, I genuinely believe I unfairly took some of that frustration out on you. I'm truly sorry about that.
The explanation is appreciated. We’re all human and passions drive us in weird, occasionally unpredictable ways. I’m undoubtedly as guilty of it as anyone.
> We’re all human and passions drive us in weird, occasionally unpredictable ways.
Very true. It's just our nature, and we're all fallible beings in some way or another. Lord knows I'm at the top of the fallibility list.
> I’m undoubtedly as guilty of it as anyone.
I am as well; more than most, in fact.
> Also, fuck those anti-vaxxers.
Agreed!
I didn't realize that was their opinion at the time, so I shared a bunch of medical literature in the hopes of supporting my argument. What I didn't realize was that any supporting evidence would be dismissed as clearly "paid for" by, uh, $LARGE_FACELESS_ENTITY.
While I'm sure that's true in some cases, the mRNA COVID vaccines are fascinating because the principle has been around since 1989. I'm hoping it works, not specifically because of SARS-CoV-2, but because it shows some possibility as an anti-cancer vaccine. We just need to prove the delivery mechanism is functional.
I really want it to work so that we can manage or eliminate other diseases as well. I think that's why I was so frustrated/disappointed.
> I just don't like the almost-religious overhyping of everything Apple. They're doing a great job. They're not perfect.
That may be so, but again that doesn't really apply to the M1's performance as hardware. If the article was about Apple as an investment, or Apple as the designer of the next computer for me, I'd agree that taking into account their software would make sense. But the whole point of the article is that equating Apple's being Apple with the M1 being a badly designed chip is wrong.
The M1 is really great, but I don't like the presentation style of "n% faster than the top-selling laptop!" (What model, workload, situation, etc.?). Please just show benchmark numbers and models.
Completely agree with regards the closed platform.
I switched to Linux for my development machine a few years ago. And using something other than i3 (or any other tiling window manager) seems like a step down. I like instant switching between workspaces, splits, no animations.
And I'll be looking for a good laptop next year, when everything returns to normal and I might have to travel. And I'll put Linux on it.
And agree with regards to aggressive marketing, in my case it makes the brand a bit toxic for my taste.
I live a double life: Apple for laptops and Linux for desktops (and I'm mostly a Desktop user who manages a lot of Linux servers).
Both platforms have their pluses and minuses. I like to carry ideas between them for my development and environments. Found this extremely fruitful for me over the years.
We ordered a bunch of M1 Airs for portability and power efficiency. Will see how it goes.
I personally love both Linux and Apple for laptops but, for the out of the box experience and reliability, Apple is more polished due to integration. My EliteBook has no problems with Linux so far but, when a flawlessly running driver degrades after an update (cough, intel e1000e, cough), it leaves a bad taste in the mouth. OTOH, I'm firstly a Linux guy who prefers and writes GPL software.
Yep. Macs look like a comfortable unixy environment. Preinstalled zsh, a fast terminal emulator (iTerm), GNU utils. I'm sure I could make it work as a dev machine.
I have to admit that the M1 MacBook Air battery life is compelling. A long time ago I was dreaming of a perfect laptop that would be very light and have 20+ hours of battery life. This comes close. Even though I have never used a laptop for more than 5 hours on battery. And that happens a couple of times a year.
Thank you. I’m really excited about it. Using a different CPU architecture is always exciting. :)
Using a Mac is like driving an Audi S8: comfortable, smooth, powerful. However, it's not your hand-tuned Impreza which can read your mind while going above 200 km/h.
Even an Intel MacBook Pro can go a long time on battery (I have a personal mid-2014, configured all-out). I've developed some system-abusing scientific code on it, and it really delivers.
The trick for me is that I can make a Mac a dev machine without modifying it with Homebrew et al. I install a Linux VM in VMware, give it a static IP, and install all the required tools, servers and services on it. On the macOS side, I use Eclipse as my IDE and clang/LLVM for compiling (since I also aim for code which behaves the same on both GCC and LLVM).
I develop on macOS and interface with the Linux VM via the network if it's absolutely necessary. The Linux VM also hosts some heavy tooling like LaTeX which cannot be installed/updated on macOS very cleanly (I know MacTeX. It doesn't play nice with newer macOS, due to Apple's locks on the OS).
Then everything becomes transparent for me. Pull -> Develop -> Push in either direction. Eclipse already syncs itself via Oomph and the cloud. The code is portable, the environment is the same. It also ensures code compatibility, allows me to see compiler effects, and lets me run tests on many platforms.
As aforementioned, it also inspires me to write better software. My code carries Apple's sensible defaults and it-just-works mentality along with Linux's flexibility. This approach allowed me to create a series of utilities for a project in limbo. These small no-setup utilities saved a lot of labor and ultimately saved the project. So yes, Apple is a walled garden and it's not optimal, but they do a lot of things right. We can selectively carry those to more open platforms to make them better. Similarly, open platforms' flexibility can be carried to some macOS applications so the environment better accommodates power users out of the box.
Homebrew and other tools are nice, but I don't like to shoehorn stuff which doesn't fit natively into an ecosystem.
> I know MacTeX. It doesn't play nice with newer macOS, due to Apple's locks on the OS
I've been using LaTeX from /opt/local with zero integration issues, even on Big Sur. What are the issues you're alluding to?
Otherwise, I agree with your take in general, except that I do everything you do in a Linux VM directly in macOS. I develop and run my codes on a Linux workstation and an iMac, and going back and forth improves the codes a lot. Performance and usability issues are spotted much earlier.
I much prefer MacPorts' approach to Homebrew's. It is nicely self-contained in /opt/local, does not break when a system library is updated, and the file system layout is much saner. Just update your environment in your zshrc and you get all the benefits whilst not changing anything for all the stuff that you don't run in a terminal.
In the past, the MacTeX team had a problem with a particular OS release (when SIP was released and enabled), and I was in the middle of my Ph.D. At that time I had no time to wait for the problems to resolve. I got VMware, installed Linux, tuned my LaTeX environment and never looked back.
I'm sure that all the problems are resolved by now, but my workflow is mature and everything is working flawlessly. Considering I'm going to need Linux anyway, I made no effort to move my LaTeX workflow back to macOS.
Since the code I'm developing is going to be used in a lot of places, I'd rather develop it in two distinct environments and run tests on each. Also, I like to experiment with different development tools in different environments. Experimenting with and experiencing each environment broadens my horizons. Also, it's more enjoyable IMHO.
Didn't play with MacPorts, TBH. I don't think I'm going to use it, but I will take a look at it.
Another thing is, I don't customize/change my terminals much. When you manage 1000+ servers with a team, customizing each terminal to your liking is not feasible, so I can work pretty fast with stock bash or anything. I'm old school and don't like flashy console setups anyway. :D
> In the past, the MacTeX team had a problem with a particular OS release (when SIP was released and enabled), and I was in the middle of my Ph.D. At that time I had no time to wait for the problems to resolve. I got VMware, installed Linux, tuned my LaTeX environment and never looked back.
I feel for you; the middle of writing up a PhD is about the worst time to have this type of technical issues. I remember being afraid of any update back then.
> Didn't play with MacPorts, TBH. I don't think I'm going to use it, but I will take a look at it.
To me, MacPorts is the closest to a sane package manager like on FreeBSD or most Linuxes. There is practically no learning curve if you've already used one. But yeah, it's not flashy or cool like Homebrew, and it wants to use the system software as rarely as possible, so it will reinstall zlib etc. I think the trade-off is acceptable because then everything is more stable and predictable, as the exact libraries used are known and tested and won't be broken by an update.
> Another thing is, I don't customize/change my terminals much. When you manage 1000+ servers with a team, customizing each terminal to your liking is not feasible, so I can work pretty fast with stock bash or anything. I'm old school and don't like flashy console setups anyway. :D
Yeah. In terms of looks, I have simple settings to see at a glance if I am on my local computer, a workstation over the LAN, or a cluster somewhere else (I like clean terminals). I have one git repo with settings files and zsh modules though. So I spent quite some time fine-tuning everything, and all the computers I use behave the same, whether they run macOS or any Linux distro (or even Cygwin, actually).
> when a flawlessly running driver degrades after an update (cough, intel e1000e, cough), it leaves a bad taste in the mouth.
Hah! I see I'm not alone. For me it was the iwlwifi drivers. In one kernel revision, they worked fine; in another, they'd randomly panic and bring down the interface. Downgrading the firmware packages didn't do anything. Kernel upgrades have worked as of late.
Though, it may be hardware-related in my case. I've had nothing but trouble with that particular card. I was going to replace it, but it seems that Lenovo really likes Loctite on their M.2 screws and it won't budge. Tempted to use a soldering iron to melt the thread locker...
You're now making me question whether this is a problem with the wifi card or the driver. I admit I'm still convinced it's probably the card (I have an earlier revision in a ThinkPad that works fine), but now I'm not so sure!
Curious: Are you talking about that "little" issue where the i915 module was randomly causing a panic? I ran into someone who had some really perplexing issues that a kernel update later resolved, and their dmesg was strongly hinting (to me) that it was linked to the i915 drivers. That, and a bug report I managed to dig up.
And I also agree. I'm looking at a Ryzen build, too. I have a couple of Intel NICs in my file server though and they're better quality IMO than the Realtek cards.
Actually, it can be anything. My mother owns an HP Spectre X2 convertible with an Intel WiFi card. Only the driver bundled with Windows works. Any newer version via Windows Update or Intel breaks the card after a standby-wake cycle. Both updated drivers support the card on paper, but either the driver has a problem or the way the hardware is designed on that particular computer messes something up. A BIOS upgrade didn't fix anything either.
> Are you talking about that "little" issue where the i915 module was randomly causing a panic?
No, I started to see some lines in the display like a faulty GPU draws. Sometimes a line, sometimes small corruptions were visible, but it always fixed itself after a minute or two. The laptop in question is extremely low power, so nothing gets hot.
Newer kernel updates fixed these issues too.
Intel's mid-range and higher-end NICs have proper offloading and processors, so they neither tax the system much nor fail to reach their advertised speeds. They're real cards for real loads, so they're more reliable AFAICS. A Realtek card works reasonably well for most of the light loads but load it up with sustained high traffic and it cannot saturate the interface as it should.
> Any newer version via Windows Update or Intel breaks the card after a standby-wake cycle.
hahaha that's insane.
I've seen some weird things come out of HP machines in the past, so somehow none of the above would surprise me.
> No, I started to see some lines in the display like a faulty GPU draws.
Ah, interesting. There was a driver bug in the i915 module that could potentially cause a panic on certain hardware. I never experienced it myself, but I ran into someone who had a similar issue. I don't think their problem was tied to the i915, but it was also resolved by a kernel update.
> A Realtek card works reasonably well for most of the light loads but
Yeah, and the Realtek chipsets are much more cheaply designed. Although, what puzzles me is that most of the Intel chipsets aren't that much more expensive (depends on vendor, of course, but you can easily find off-brand dual-port cards for ~$40USD with an Intel chip). It really makes it pointless to invest in Realtek cards that are both cheap and may or may not work at all.
I remember having an issue with the onboard one in my desktop where it would randomly disconnect after negotiating 1Gbps. If I forced it into 100Mbps it would be fine. But, as with the earlier discussion(s), that was also resolved with a kernel update. I used an Intel card for a few years because of that reason, so I have no idea when it was eventually fixed.
Moreover, the out-of-the-box driver is a stock Intel WiFi driver. It's not vendor-specific. Phew.
> I remember having an issue with the onboard one in my desktop where it would randomly disconnect after negotiating 1Gbps.
That's how the new driver broke one of the NICs. Other ones were just failing to detect a carrier at all. The bug you're referring to is [0], to which I also added some feedback.
> The bug you're referring to is [0], to which I also added some feedback.
Possibly not, unless I'm misunderstanding. The bug was seen in a Realtek card in 2011-ish.
I'm suspicious that might've colored my opinion of Realtek early on and why I still buy Intel cards to this day. Well, depending on the motherboard, of course.
Oh, then Intel actually caught-up with Realtek and broke their cards the same way for some time then. :)
Realtek's 8139's first revisions were not able to perform well in most cases (~30 Mbit max speed, unreliable). 8169 is much much better. Their new wireless cards also work very well if the driver is good, but they have so many sub-models with very different feature sets that you need to get the exact chip for your needs.
I use an 8111 (PCIe, gigabit) at the office as a desktop-to-laptop bridge and it works fairly well; no stability or speed problems so far.
> Oh, then Intel actually caught-up with Realtek and broke their cards the same way for some time then. :)
Hadn't thought about it that way, but that's hilarious. Realtek's such a trend-setter.
> 8169 is much much better.
Just checked, and apparently that was the card in the machine I had issues with. I think part of the problem is that it was new at the time. So, I can't really fault the drivers per se. I never had any issues with it after they were fixed.
I have a MacBook and used to have a Hackintosh for my desktop. But this ARM move made me move away from that solution on the desktop.
I tried using Windows with WSL2 first, but there were just too many workarounds I had to do for my workflow.
I then looked into the most popular Linux distributions and chose Pop!_OS. Suffice it to say, I am really satisfied with it. It's probably the Linux distribution that feels the closest to macOS of any I have used.
I am interested in the M1 Macs but my 16" MBP is still fine. I will look into the ARM Macs next year to see what they offer.
The question is how far he can get without Apple's documentation. On the CPU side, probably pretty far as they use the standard ARM ABI. On the GPU side? This is where things will get really hairy. Same with the machine learning cores.
I'm not planning to get a MacBook. My main contender would be an XPS with a new Ryzen CPU on 5nm, if they have one next year, or again a ThinkPad with a 5nm Ryzen. I would not accept anything less.
But thinking about it, it might be that the mobile AMD CPUs are a year behind the desktop ones. So next year the best Ryzen could still be 7nm Zen 3. In that case Apple will have a two-year transistor lead. But we'll see.
Only wishful thinking from my side. As mentioned in the post, I hope that the M1's performance and Apple's aggressive marketing will push premium PC laptop manufacturers into adopting the best-performing x86 CPUs. And I also hope that AMD will seize the opportunity to get a good chunk of the mobile market.
Of course, next year's laptop models are probably being finished right now, and it could be that Intel has still found a way to bribe premium manufacturers away from including AMD in their premium models.
In my opinion they'll lose market share if that happens. But there are most likely smart people with a lot more information than me making these decisions.
> Yeah John Gruber has really become a shill over the last few years.
> [...]
> Yes M1 is great. But it's a closed system.
You can like Gruber or not, and that is your choice. I agree with him more often than not, but I certainly don't always agree.
However, while the "closed system" thing is a valid criticism of Apple, I strongly disagree that its omission from Gruber's article is a valid criticism of his writing.
Gruber writes for a specific and informed audience.
He is not writing articles for people who've never heard of Apple, or people who may not be aware that Apple runs a fairly closed ecosystem. He assumes that you know all of that.
Whatever you think about his writing as it exists today, imagine if he had to preface every single blog post with background information like that? Should the first 1,000 words of every post include a brief history of Apple and a discussion of open vs. closed ecosystems, and where Apple falls on that spectrum?
He knows that stuff. He knows that we know that stuff.
What's the last article (or speech, or email, whatever) you wrote? Surely, when deciding what to write and what not to write, you considered your audience and their level of context and domain knowledge, as well as the constraints of your medium?
I've read pretty much all of Gruber's blog posts for years. I'm very much an Apple person myself. I wouldn't call him a shill, but I do have to agree that he's become lazy about some of his opinions. He often bashes Facebook (which certainly deserves criticism) without ever using it. Many of his Apple reviews are very long-winded love letters to Apple about how great their products are. Every once in a while he will call out blatant Apple mistakes, but they have to be pretty bad.
>John Gruber has really become a shill over the last few years
Only a grain of truth to that. He has always been known, as far as I am aware, merely as an Apple fan who blogs about it. There may be some "shilling", to the extent that he probably has Apple stock, but it is also done out of conviction.