
Ope! Sorry, I missed it!


I am, in general, pretty anti-Elon, so I don't want to be seen as taking _his_ side here, and I am definitely anti-CSAM, so let's shift slightly to derivative IP generation.

Where does the line fall between provider responsibility when providing a tool that can produce protected work, and personal responsibility for causing it to generate that work?

It feels somewhat more clear-cut when you say to an AI, "Draw me an image of Mickey Mouse", but why is that different from photocopying a picture of Mickey Mouse, or using Photoshop to draw a picture of Mickey Mouse? Photocopiers will block copying a dollar bill in many cases - should they also block photos of Mickey Mouse? Should they have received firmware updates when Steamboat Willie fell into the public domain, such that they are now allowed to photocopy that specific instance of Mickey Mouse, but no other?

This is a slippery slope: the idea that the tool should be held responsible for creating "bad" things, rather than the person using it.

Maybe CSAM is so heinous as to be a special case here. I wouldn't argue against it specifically. But I do worry that it shifts the burden of responsibility onto the AI or the model or the service or whatever, rather than the person.

Another thing to think about is whether it would be materially different if the person didn't use Grok, but instead used a model on their own machine. Would the model still be responsible, or would the person be responsible?


> Where does the line fall between provider responsibility when providing a tool that can produce protected work, and personal responsibility for causing it to generate that work?

There's one more line at issue here, and that's the posting of the infringing work. A neutral tool that can generate policy-violating material has an ambiguous status, but if a user takes the tool's output and posts it to Twitter themselves, that's pretty clearly the user's problem.

But here, it seems like the Grok outputs are directly and publicly posted by X itself. The user may have intended that outcome, but the user might not have. From the article:

>> In a comment on the DogeDesigner thread, a computer programmer pointed out that X users may inadvertently generate inappropriate images—back in August, for example, Grok generated nudes of Taylor Swift without being asked. Those users can’t even delete problematic images from the Grok account to prevent them from spreading, the programmer noted.

Overall, I think it's fair to argue that ownership follows the posting account. Even if Grok's output is entirely "user-generated content," when X publishes that content under its own banner, it must take ownership of the policy and legal implications.


This is also legally problematic: many jurisdictions now have specific laws about the synthesis of CSAM or the modification of people's likenesses.

So exactly who is considered the originator is a pretty legally relevant question, particularly if Grok is just off doing whatever and then posting it based on your input.

"The persistent AI bot we made treated that as a user instruction and followed it" is a heck of a chain of causality in court, but you also fairly obviously don't want to allow people to launder intent through AI (which is very much what X is trying to do here).


Maybe I'm being too simplistic/idealistic here - but if I had a company that controlled an LLM product, I wouldn't even think twice about banning CSAM outputs.

You can have all the free speech in the world, but not when it comes to vulnerable and innocent children.

I don't know how we got to the point where we can build things with no guardrails and just expect the user to use it legally? I think there should be responsibility on builders/platform owners to build in guardrails against things that are explicitly illegal and morally repugnant.


>I wouldn't even think twice about banning CSAM outputs.

Same, honestly. And you'll probably catch a whole lot of actual legitimate usage in that net, but it's worth it.

But you'll also miss some. You'll always miss some, even with the best guard rails. But 99% is better than 0%, I agree.

> ... and just expect the user to use it legally?

I don't think it's entirely the responsibility of the builder/supplier/service to ensure this, honestly. I don't think it can be. You can sell hammers, but you can't guarantee that a hammer won't be used to hurt people. You can put spray cans behind cages and require purchasers to be 18 years old, but you can't stop an adult from committing vandalism. The person has to be held responsible at a certain point.


I bet most hammers (non-regulated), spray cans (lightly regulated), and guns (heavily regulated) that are sold are used for their intended purposes. You also don't see these tools' manufacturers promoting or excusing their unintended usage.

There's also a difference between a tool manufacturer (hardware or software) and a service provider: once the tool is in the user's hands, it's outside of the manufacturer's control.

In this case, a malicious user isn't downloading Grok's model and running it on their own GPU. They're using a service provided by X, and I'm of the opinion that a service provider starts to be responsible once malicious usage of their product becomes significant.


None of these excuses are sufficient for allowing a product which you created to be used to generate CSAM on a platform you control.

Pornography is regulated. CSAM is illegal. Hosting it on your platform and refusing to remove it is complicity and encouragement.


> I don't know how we got to the point where we can build things with no guardrails and just expect the user to use it legally?

Historically tools have been uncensored, yet also incredibly difficult and time-consuming to get good results with.

Why spend loads of effort producing fake celebrity porn using Photoshop or Blender or whatever when there's limitless free non-celebrity porn online? So Photoshop and Blender didn't need any built-in censorship.

But with GenAI, the quantitative difference in ease of use results in a qualitative difference in outcome. Things that didn't get done when they took 6 months of practice plus 1 hour per image are getting done now that they take zero practice and 20 seconds per image.


> Where does the line fall between provider responsibility when providing a tool that can produce protected work, and personal responsibility for causing it to generate that work?

If you operate the tool, you are responsible. Doubly so in a commercial setting. If there are issues like copyright and CSAM, they are your responsibility to resolve.

If Elon wanted to share out an executable for Grok and the user ran it on their own machine, then he could reasonably sidestep blame (like how Photoshop works). But he runs Grok on his own servers, and is therefore morally culpable for everything it does.

Your servers are a direct extension of yourself. They are only capable of doing exactly what you tell them to do. You owe a duty of care to not tell them to do heinous shit.


It's simpler to regulate the source of it than the users. The scale at which genAI can do this stuff is much, much different from photocopying plus Photoshop; scale and degree matter.

> scale and degree matter

I agree, but I don't know where that line is.

So, back in the 90s and 2000s, you could get The Gimp image editor, and you could use the equivalent of Word Art to take a word or phrase and make it look cool, with effects like lava or glowing stone, or whatever. The Gimp used ImageMagick to do this, and it legit looked cool at the time.

If you weren't good at The Gimp, which required a lot of knowledge, you could generate a cool website logo by going to a web server that someone built, giving them a word or phrase, and then selecting the pre-built options that did the same thing - you were somewhat limited in customization, but on the backend, it was using ImageMagick just like The Gimp was.

If someone used The Gimp or ImageMagick to make copyrighted material, nobody would blame the authors of The Gimp, right? They were very nonspecific tools created for a broad purpose: making images. Just because some bozo used them to create a protected image of Mickey Mouse doesn't mean that the software authors should be held accountable.

But if someone made the equivalent of one of those websites, and the website said, "click here to generate a random picture of Mickey Mouse", then it feels like the person running the website should at least be held partially responsible, right? Here is a thing that was created for the specific purpose of breaking the law upon request. But what is the culpability of the person initiating the request?

Anyway, the scale of AI is staggering, and I agree with you. I think common decency dictates that the product's actions should be limited, where possible, to fall within the ethics of the organization providing the service, but the responsibility for making the tool do heinous things should be borne by the person giving the order.


I think, yes, CSAM and other harmful outputs are a different and more heinous problem. I also think the responsibility is different between someone using a model locally and someone promoting Grok on Twitter.

Posting a tweet asking Grok to transform a picture of a real child into CSAM is no different, in my mind, from asking a human artist on Twitter to do the same. So in the case of one person asking another person to perform this transformation, who is responsible?

I would argue that it’s split between the two, with slightly more falling on the artist. The artist has a duty to refuse the request and report the other person to the relevant authorities. If the artist accepted the request and then posted the resulting image, Twitter would then need to step in and take action against both users.


Maybe companies shouldn't release tools to generate CSAM, and shouldn't promote those tools when they know they produce CSAM.

Sorry, you're not convincing me. X chose to release a tool for making CSAM. They didn't have to do that. They are complicit.


A pen is also a tool for making CSAM.

Truly, civilization was a mistake. Retvrn to monke.


A pen is not a hosted service for generating CSAM, and if you were hosting a service where you drew CSAM with a pen for money, you'd be arrested.

"You'd be arrested" is such a beautiful argument. Truly an unimpeachable moral ground.

The fact that they basically removed the ability to ask "soft" questions without a definite answer made it very frustrating. There's no definitive answer to a question about best practices, yet you can't ask people to share their experiences or recommendations.

They actually added some new question categories a while ago [1]

"Troubleshooting / Debugging" is meant for the traditional questions, "Tooling recommendation", "Best practices", and "General advice / Other" are meant for the soft sort of questions.

I have no clue what the engagement is on these sort of categories, though. It feels like a fix for a problem that started years ago, and by this point, I don't really know if there's much hope in bringing back the community they've worked so hard to scare away. It's pretty telling just how much the people that are left hate this new feature.

[1] https://meta.stackoverflow.com/questions/435293/opinion-base...


Oh, that's good that they added them. I stopped being active on the sites a long time ago, so I missed that.

>Lp(a) levels are almost purely genetically determined and so elevated Lp(a) is essentially due to a poor roll of the genetic dice... For simplicity, we will devote little further attention to either of these secondary risk factors

As someone who rolled poorly on those genetic dice, I would like to complain. But also, disregarding a factor that impacts 20%[1] of the population seems disingenuous.

[1] - https://familyheart.org/family-sharing-tools/high-lpa-family...


Lepodisiran has reduced Lp(a) by 94% in clinical trials. After it gets approved, IMO the author should rewrite this section.

As far as I can tell from the narrative, Venezuela was basically serving as a puppet state for China, and if that's true, I would probably give that as the primary reason, but who knows. Maybe it's because Venezuela did poorly in the FIFA World Cup qualifier and this was an action dictated by his recent peace prize award.

On a slightly more serious note, the charges against Maduro were actually filed in 2020: https://www.justice.gov/archives/opa/pr/nicol-s-maduro-moros...


Good call on the China angle. Maduro is a first grade asshole. But this is just one bully taking out another on a pretext. I'm thinking more along the lines of a gang war over territory than a goal of lifting up Venezuela to the point that they will be freely able to deal with whoever (or nobody) when it comes to their natural resources.

Gonna have to write more speeding tickets to pay for these, I guess


Yeah, this is definitely an n-dimensional modeling space. Left/right is only one axis, and as far as I can tell, not even the most definitive in terms of determining whether I agree with someone's perspective. I think that would be the authoritarian/anti-authoritarian axis. Lots of left-wing people are still authoritarians, but between those two axes, you can get a _decent_ feeling of where someone is with regard to national politics.


Agreed, I immediately thought of vehicles, but this is cool too.


Ever since I got involved with Espressif's ESP32/ESP8266 chips, I haven't even thought about Arduinos, except to download the UI, and with the right VSCode extensions you don't even need to do that.

I do keep meaning to try this though - https://platformio.org/


The last time I used Arduino was probably the late 2000s. As a kid/teenager/student, their prices always felt too high to me, so I moved to "compatibles" or "clones" for a while.

Once ESP8266 and ESP32 came along (with a detour thanks to Raspberry Pi arriving in the 2010s), there was really no need or desire to use Arduino anymore, and like you, I forgot about them.

Maybe they have a place in education, and maybe in industrial applications, but outside of that, I wouldn't even consider Arduino anymore.

PlatformIO by the way is excellent, and I've used it for all ESP8266/ESP32 development in VSCode for some time now, though increasingly I just use ESPHome, as my desire to program microcontrollers at a low level wanes and my desire to simply achieve the task quickly grows.


I also really appreciate how routine ESPHome has made building sensor networks in my home. Really just incredibly useful software.


As a hacker and tinkerer, I hate ESPHome. Yes, it's super cool to have a turnkey "I want these sensors, give me firmware" workflow, but all of the code is hidden away and you can't easily modify or add to it.

If I want to run custom logic, I have to bundle a custom component into the ESPHome build. Not bad, I guess, but I still don't like the lack of control.


You can definitely override existing components and create custom C++ components in ESPHome without too much hassle. It also allows lambdas in the YAML itself.
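
For example, a filter lambda on a sensor entry is just a line of C++ embedded in the YAML. A minimal sketch, assuming a DHT11 on GPIO4 (the pin, model, and names here are made up):

    sensor:
      - platform: dht
        pin: GPIO4
        model: DHT11
        temperature:
          name: "Closet Temperature"
          filters:
            # plain C++ runs here: convert Celsius to Fahrenheit
            - lambda: return x * 1.8 + 32.0;
        humidity:
          name: "Closet Humidity"
        update_interval: 60s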


My custom logic usually runs as something in Home Assistant; I haven't had a need to add stuff to the ESPHome config yet (although I could imagine many situations where it's necessary).


Yeah, for sure. My more... uhh... creative toys are definitely not ESPHome, but it's super cool that I can solder a few wires to a DHT11, throw it in a closet with the weakest USB power plug I own, and end up with a great sensor.


Arduino’s sweet spot was always in education and learning.

I think most people should graduate into PlatformIO or vendor tools if they’ve used Arduino to learn basics.

I think it’s great that the Arduino ecosystem became so large and capable, but it had the side effect of leading many beginners into thinking that Arduino was synonymous with microcontroller.

I even took a contract once updating a company’s internal tooling because their first contractor tried to do it all with Arduino. The same scenario happens with Raspberry Pi in the world of Linux embedded systems development.


Are there good ESP32-based starter kits with manuals that a kid can learn from? I was looking for an Arduino-like kit as a Christmas gift, and it seems that Arduino kits are unbeatable. The starter kit is available in 10 languages and comes with a project booklet. All ESP32-based products seem better suited to more advanced users, with a steeper learning curve.


No, not like there are with Arduino. Arduino is definitely the best I know of for learning embedded programming, and it has its place there, but I stopped trying to do real work with it.


I use a combination of Adafruit Ampy to copy files, esptool to reflash, picocom for the REPL, and VSCode. Some of those might be redundant, and it did take a bit to figure out the syntax. But once that’s figured out, it’s cake, and it's all saved in my CLI history. I just Ctrl+R and bring it back.
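
Roughly the saved incantations I mean, as a sketch; the port, flash offset, and firmware file are placeholders for an ESP32 running MicroPython, so yours will differ:

    # reflash the board
    esptool.py --port /dev/ttyUSB0 erase_flash
    esptool.py --port /dev/ttyUSB0 write_flash 0x1000 firmware.bin

    # copy code over
    ampy --port /dev/ttyUSB0 put main.py

    # open the REPL
    picocom -b 115200 /dev/ttyUSB0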


For ESP32? You just install the VS Code extension and it does all the toolchain setup and you can flash with a button.


I don’t think that was available when I was starting and now I am set in my ways lol


I use the command-line tool arduino-cli (with a plain Makefile) to compile and upload the code (obviously usable with any editor). It also has a --verbose mode to show exactly what is getting executed. But I've heard a lot about PlatformIO, so I'm wondering what its benefits are (besides the integration in VSCode; as an Emacs user, VSCode doesn't work for me).
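
For reference, a sketch of the kind of commands my Makefile wraps; the board FQBN, sketch name, and port here are just examples:

    # compile and upload with arduino-cli (ESP32 Dev Module as an example)
    arduino-cli compile --verbose --fqbn esp32:esp32:esp32 MySketch
    arduino-cli upload --verbose -p /dev/ttyUSB0 --fqbn esp32:esp32:esp32 MySketch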


I think PlatformIO's selling point is multiple target boards via its config. That, and you can use an actual editor instead of the Arduino "IDE", although I'm not a fan of VSCode anymore either.

I also think they have some testing features built in, though I never delved too deep.


> That, and you can use an actual editor instead of the Arduino "IDE"

Note that the Arduino IDE has a setting to edit files using any random program instead.


What do you think of PlatformIO on these chips compared to the official ESP-IDF?

Personally, I am not convinced we should (continue) conflating the IDE with the build+flash tools; the former should be associated with the programming language and developer preferences, and the latter with the MCU being programmed.


It's not an either/or. You can use PlatformIO with ESP-IDF or Arduino for ESP chips.

PIO + ESP-IDF is the only way I write ESP firmware.


I switched to PlatformIO in VSCode (and on the command line) a few months ago after using the Arduino IDE for over a decade.

Can’t recommend it enough. Faster startup. Repeatable builds. The ability to save your image and then flash it on many devices. Build-time parameters. It also allows access to some functionality that is not possible with the Arduino build process, due to how Arduino compiles and processes sketches.
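
As a rough sketch of what build-time parameters and multiple build environments look like in a platformio.ini (the env names and flag here are made up):

    [env:esp32dev]
    platform = espressif32
    board = esp32dev
    framework = arduino
    build_flags = -D SENSOR_PIN=4

    [env:nodemcuv2]
    platform = espressif8266
    board = nodemcuv2
    framework = arduino

    ; build and flash one target: pio run -e esp32dev -t upload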


Yeah. There is no reason to even touch the Arduino IDE anymore now that PlatformIO is so good.


As a bit of personal advice from a former blogger who had a million+ visitors per year: it's not about anything except your readers and the community you build. AI might have facts on everything, but AI content will not build a community or enrich the lives of the people who engage with the site.

When everything else is a computer, be a human.


Interested to know: Why did you stop?

