
Macs that can run it are quite a bit more expensive than a 3090. GPUs can also do finetuning and run other models with larger batch sizes, which Macs would struggle with. Also, for the models that fit on both, an Nvidia card runs them much faster.


It’s really not that expensive to get an M4 Pro with 64GB. But you really want a Max or Ultra, and you aren’t going to be able to do much with images; you are limited on the GPU side.

3090s aren’t sold anymore, so you have to scrounge for them on the used market, and 4090s are a couple grand? The local LLM scene is still pricey and confusing.


>>not that expensive to get an M4 pro with 64GB

A starting price of $3900 does not sound 'not that expensive' to me.


An M4 Pro with 64GB of RAM starts at $3900? It doesn't take much searching to see that's BS:

https://www.apple.com/shop/buy-mac/mac-mini/apple-m4-pro-chi...

An M4 Pro Mac mini upgraded to 64GB is $1999. You'll probably want the extra GPU cores, though, bringing you up to $2199. But you probably want the Max (not offered for the mini) or Ultra to do serious LLM work. An M4 Max in a 14-inch MBP with 64GB of RAM will cost $3900, but that's a whole step up from the M4 Pro. I'm waiting to see what they do with the Mac Studio refresh (I don't need a Max in a portable, and I'm betting I can get an M4 Ultra with 64GB of RAM for $4000 or so).


> M4 Pro with 64GB of ram starts at $3900?

No, but the OP was talking about an M4 MacBook, not an M4 Pro mini: "I can easily run 70b on my macbook."

Like you said, a 64GB 14" MacBook Pro does start at $3,900, because they lock 64GB of RAM behind the top-spec M4 Max. There is no M4 Pro MBP with 64GB of RAM, only the M4 Max.

> 36GB available with M4 Max with 32‑core GPU.

> 48GB available with M4 Pro or M4 Max with 40‑core GPU.

> 64GB or 128GB available with M4 Max with 40‑core GPU.

https://www.apple.com/shop/buy-mac/macbook-pro/14-inch-space...


No, you can get an M4 Max MacBook Pro for $3900. You guys keep quoting the M4 Pro as the M4 Max; they aren’t the same chip. The M4 Pro MBP starts at $1999. Again, M4 Pro != M4 Max.

The page you linked to said as much.


The question is about 64GB laptop-shaped machines.

That page won't let me configure a machine with 64GB until I pick the "Apple M4 Max chip with 16‑core CPU, 40‑core GPU, 16‑core Neural Engine" option.

And a machine with that option selected, and with 64GB, starts at $3,899.00.

Remember: This all started with "what do you mean? I can easily run 70b on my macbook. Fits easily."
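For what it's worth, the "fits easily" claim comes down to quantization arithmetic. Here's a rough sketch; the bytes-per-weight values are approximate averages for common llama.cpp quantization formats and the headroom figure is an assumption, not an exact measurement:

```python
# Back-of-envelope memory check for a 70B-parameter model.
# Bytes-per-weight values are rough averages for common quant formats
# (assumptions, not exact GGUF file sizes).
PARAMS = 70e9
BYTES_PER_WEIGHT = {"FP16": 2.0, "Q8_0": 1.06, "Q4_K_M": 0.57}
HEADROOM_GB = 8  # KV cache, OS, and apps sharing unified memory (assumption)

for quant, bpw in BYTES_PER_WEIGHT.items():
    weights_gb = PARAMS * bpw / 1e9
    fits = weights_gb + HEADROOM_GB < 64
    print(f"{quant}: ~{weights_gb:.0f} GB weights, fits in 64 GB: {fits}")
```

At 4-bit the weights land around 40 GB, so a 64 GB machine runs it with room to spare; FP16 (~140 GB) does not fit.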


This is tiring:

Mac mini M4 Pro 64 GB - $1999

MBP M4 Pro 48 GB - $2799 (64 GB not an option)

MBP M4 Max 64 GB - $3899 (only the top-line M4 Max supports 64 GB)

Obviously, an M4 Max is not an M4 Pro, and an M4 Pro is much, much cheaper than an M4 Max. The original comment said the M4 Pro started at $3899, which is obviously uninformed or malicious. Calling it an “M4 pro M4 max” is just lunacy.

The original comment really was just about the M4 Pro with 64 GB, and the Mac mini does actually deliver that configuration for $1999. Switching to a laptop-only discussion came later and was just an instance of moving the goalposts.


You're also going to want more SSD space, because working with these model files on a 512 GB drive is going to become a problem quickly. If you get the 2TB option, that's another $600.


Thunderbolt 5 mostly takes care of the storage problem.


You are right. I've been looking at MacBooks with the M4.


You would need to get an M4 Max (which is probably the right choice for real LLM work). I don't think mobile GPUs do very well on the PC side, so if you want portability, Apple might be your best bet.


I'm sorry, but it's cheaper than a first-class flight from SFO to London, say.


You can say that $2,000 for an M4 Pro is supposedly “not that expensive”, but for about $1,000 I can buy a gaming laptop with a 3090 and use system RAM to help run larger models. The RAM may be slower, but the higher TFLOPS of the 3090 help greatly. It’s even better if we compare against the Max, which as you said is more suited for LLM work, since one can then upgrade to a 4090 and still come out ahead.

The local LLM scene is definitely pricey and confusing, but making it more confusing by recommending the wrong hardware for the task doesn’t help much.


I often see MBPs with 48-64GB and 1TB for under 3,500 CHF, including the M4 (thanks to Black Friday week).

Meanwhile, 4090s are close to 2,000 CHF.

I have no doubt where the actual value is.


A 3090 often goes for $800-900 on the used market in the US. Two of these would be $1800, and you get a much more versatile machine. The downside is also obvious, though: your two 3090s can draw up to 800 W, and there's no chance you can carry the machine around with you. Overall, it's not that obvious where the actual value is; it depends a lot on what you want to use it for.
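Whether a quantized 70B actually fits on the pair is the same kind of back-of-envelope arithmetic. A minimal sketch, where the per-GPU overhead and KV-cache figures are rough assumptions:

```python
# Sketch: fitting a ~4-bit 70B model across two 24 GB 3090s.
# Overhead and KV-cache figures are rough assumptions.
NUM_GPUS = 2
VRAM_PER_GPU_GB = 24
OVERHEAD_PER_GPU_GB = 2   # CUDA context, activations (assumption)
WEIGHTS_GB = 40           # ~70B at 4-bit quantization
KV_CACHE_GB = 3           # grows with context length (assumption)

usable = NUM_GPUS * (VRAM_PER_GPU_GB - OVERHEAD_PER_GPU_GB)
needed = WEIGHTS_GB + KV_CACHE_GB
print(f"usable: {usable} GB, needed: {needed} GB, fits: {needed <= usable}")
```

It's tight: about 44 GB usable against ~43 GB needed, which is why people running 70B on dual 3090s tend to stick to 4-bit quants and modest context windows.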


What do you mean by "versatile"? I enjoy using my MacBook for literally everything I do, and it can run Llama 70B with ease. Seems extremely versatile, since I will have a MacBook anyway.

What would I do with a giant tower chugging a microwave's worth of energy? I'd rather just use AWS for training; that is way more price-efficient.


It's also much more price-efficient to buy a 3090 gaming laptop. It can do everything a MacBook can, but better, and at a third of the cost. You don't need to put all of the money you save by buying non-Apple products into GPUs, after all.


Alright but being Swiss you can probably afford to buy both and still have most of your monthly pay left over lmao.


Honestly, not that expensive. Not sure what the problem is. It's like one first-class flight from SFO to London. You know people buy those?

And what would I do with a 4090? Buy a tower, put it in, and SSH into it from my Mac to run some model? It still wouldn't be enough to finetune it. Much more price-efficient would be to just rent some 100s in the cloud for training.


I missed the point where I wrote something about price. If it is expensive for you, just get AWS and train your models there, or a 3090.

I've got a MacBook anyway, and it also happily runs Llama 70B.



