
edit: I conflated ImageNet with the art exhibitors; it is the former who are culling the images as a result of public reaction, not the latter.

This is a really bizarre project. I had seen some really offensive race-based labels, but I thought revealing the ugliness of the system was part of the point of this project?

But besides that, the results just seemed completely scattershot; I half expect the artists/exhibitors to reveal that x% of the results were randomized. Last week, I tried it myself after seeing another Asian user display results that were entirely Asian slurs (e.g. gook, slant-eyed). I uploaded my own very Asian-looking photo and got "prophetess", along with very vague labels, such as "person" and "individual".

Maybe the exhibitors cleaned the data/results by the time I tried it, but I used it just a few hours after seeing the other Asian user's results, so I'm doubtful that her tweet/complaints were enough on their own to change up the dataset that same day.

> I thought revealing the ugliness of the system was part of the point of this project?

It 100% is. From the link to the artist's website:

"Things get strange: A photograph of a woman smiling in a bikini is labeled a “slattern, slut, slovenly woman, trollop.” A young man drinking beer is categorized as an “alcoholic, alky, dipsomaniac, boozer, lush, soaker, souse.” A child wearing sunglasses is classified as a “failure, loser, non-starter, unsuccessful person.” You’re looking at the “person” category in a dataset called ImageNet, one of the most widely used training sets for machine learning."


I was just about to edit and correct myself; it is ImageNet that has decided to delete the offensive images, after the reaction to the exhibitors' work. It's too bad the exhibitors didn't make their own mirror/cache of the dataset. Judging from some tweets I saw, I think this project really helped people understand how much of current artificial intelligence is human-driven. It's not a sentient computer deeming you to be a "slant-eye", it's a bunch of random Internet users. (not that this makes you feel better about the world, but at least the hate's coming from an expected source)



