
Ironically, their oversensitive NSFW image detector in their API is what caused me to stop using it and run the model locally instead. I was using it to render animations of hundreds of frames, but when every 20th to 30th image comes out blurred it ruins the whole animation, and it would double the cost or more to re-render those frames with a different seed, hoping not to trigger the overzealous blurring.
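
A minimal sketch of that re-render loop, to make the cost problem concrete. The render_frame() and was_censored() helpers are hypothetical stand-ins for whatever API call and filter check you actually use; the point is that every censored frame burns another full-price generation with a fresh seed:

    import random

    MAX_ATTEMPTS = 3

    def render_frame(prompt: str, frame: int, seed: int) -> dict:
        """Hypothetical placeholder for the paid text-to-image API call."""
        raise NotImplementedError

    def was_censored(result: dict) -> bool:
        """Hypothetical placeholder: e.g. check a content-filter flag in
        the API response, or measure blurriness of the returned image."""
        raise NotImplementedError

    def render_clean_frame(prompt: str, frame: int) -> dict:
        # Each retry is a full-price generation with a new random seed,
        # so even a 1-in-20 false-positive rate gets expensive fast.
        for _ in range(MAX_ATTEMPTS):
            result = render_frame(prompt, frame, seed=random.randrange(2**32))
            if not was_censored(result):
                return result
        raise RuntimeError(f"frame {frame} was censored {MAX_ATTEMPTS} times")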

I don’t mind that they don’t want to let you generate NSFW images, but their detector is hopelessly broken: it once censored a cube. Yes, a cube...



Unfortunately their financial and reputational incentives are firmly aligned with preventing false negatives at the cost of a lot of false positives.


Unfortunately I don't want to pay for hundreds if not thousands of images I have to throw away because the detector decided some random innocent element was offensive and blurred the entire image.

Here is the red cube it censored because my innocent eyes wouldn't be able to handle it: https://archerx.com/censoredcube.png

All this overzealous safety filtering achieves is driving developers to on-demand GPU hosts that let them run their own models, which also opens up a lot more freedom. I wanted to use the Stability AI API as my main source for Stable Diffusion, but they make it really, really hard, especially if you want to use it as part of your business.
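
For what it's worth, running the model yourself sidesteps the filter entirely. A minimal sketch using Hugging Face's diffusers library, assuming a CUDA GPU with enough VRAM; the model ID and parameters are just illustrative defaults, not anything SAI-specific:

    import torch
    from diffusers import StableDiffusionPipeline

    # Load a Stable Diffusion checkpoint locally; passing safety_checker=None
    # skips the bundled NSFW filter (diffusers prints a warning, and any
    # filtering of the output becomes your responsibility).
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",
        torch_dtype=torch.float16,
        safety_checker=None,
    )
    pipe = pipe.to("cuda")

    # Fixed seed per frame, so re-renders are reproducible.
    generator = torch.Generator("cuda").manual_seed(42)
    image = pipe("a red cube on a white background",
                 num_inference_steps=30, generator=generator).images[0]
    image.save("frame_0001.png")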


I agree that given the status quo, it's a no-brainer to host your own model rather than use their SaaS – and likely one of the main reasons SAI doesn't seem to be on a very stable (heh) footing financially. To put it mildly.


Everyone always talks about Platonic Solids but never Romantic Solids. /s



