Hacker News

Because the ChatGPT API (and analogous competitors) is cheap enough that it's both faster and more cost-effective to just use it instead of hosting your own model, with maybe some shenanigans to handle its shortcomings without increasing cost much, if at all. And that was before gpt-3.5-turbo-0613, which dropped the price further and is about 2-3x faster.

There are startups that do finetuning on your own data, but with zero guidance on how to preprocess that data and absurd costs (both upfront training and GPUs for serving inference), it's extremely difficult to justify from a customer's business perspective compared to just using an API.
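The cost argument above can be sketched with back-of-envelope arithmetic: a pay-per-token API scales with usage, while self-hosted inference bills for GPU hours whether or not they're busy. All the numbers below (request volume, token counts, per-token price, GPU rate) are illustrative assumptions, not quoted prices.

```python
# Back-of-envelope comparison: hosted API vs. self-hosted inference.
# Every number here is an assumption for illustration, not a real quote.

def api_monthly_cost(requests_per_month, tokens_per_request, price_per_1k_tokens):
    """Pay-per-token API: cost scales linearly with actual usage."""
    return requests_per_month * tokens_per_request / 1000 * price_per_1k_tokens

def self_hosted_monthly_cost(gpu_hourly_rate, num_gpus, hours_per_month=730):
    """Self-hosted serving: GPUs are billed around the clock, busy or idle."""
    return gpu_hourly_rate * num_gpus * hours_per_month

# Assumed workload: 100k requests/month at ~1k tokens each, $0.002 per 1k tokens.
api = api_monthly_cost(100_000, 1_000, 0.002)

# Assumed serving setup: a single $2/hour GPU instance running continuously.
hosted = self_hosted_monthly_cost(2.0, 1)

print(f"API: ${api:,.0f}/mo vs. self-hosted: ${hosted:,.0f}/mo")
```

Under these assumptions the API comes out at roughly $200/month against ~$1,460/month for one always-on GPU, before counting the finetuning and ops costs the comment mentions; the crossover only favors self-hosting at much higher, sustained utilization.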




