> Power consumption is the defining characteristic of AI.

I'm not so convinced. I've been playing with running Ollama + llama3.1 8B on my 2023 M2 MacBook Air with 24GB of RAM, and I don't notice much difference in battery life with or without it. I'm not querying it continually with a shell script in a loop or anything like that, but neither am I shy about throwing all kinds of prompts at it. My laptop keeps chugging away on battery.
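For anyone who wants to reproduce this kind of casual testing: Ollama exposes a local HTTP API (by default on port 11434), so you can script one-off prompts against the same model. A minimal sketch, assuming `ollama serve` is running and `llama3.1:8b` has been pulled; the `__main__` part only builds the request payload so it runs without a server:

```python
import json
import urllib.request

def query_ollama(prompt, model="llama3.1:8b",
                 url="http://localhost:11434/api/generate"):
    """Send one non-streaming prompt to a local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt,
                          "stream": False}).encode()
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Build and show the request body without sending it, so this
    # sketch runs even with no Ollama server on the machine.
    demo = {"model": "llama3.1:8b", "prompt": "Why is the sky blue?",
            "stream": False}
    print(json.dumps(demo))
```

With a server running, `query_ollama("Why is the sky blue?")` returns the model's full reply as a string.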

Training AI may be ferociously resource-intensive, but in my experience querying those models isn't especially demanding. I'd expect a model that Apple had tailored specifically to run well on its own hardware to be relatively "light".


