Camera Companies Fight AI-Generated Images With ‘Verify’ Watermark Tech
Great, DRM on my personal photos. Next they’re going to charge a subscription to view my own goddamn vacation pictures
Fuck this timeline. I want to get off Mr. Bones’ Wild Ride.
It’s not DRM. It’s like EXIF metadata. You can strip it anytime you want, and it will often get stripped in practice even if you don’t want it to (e.g. screenshots, naive conversions, metadata-stripping upload services, etc.). It’s entirely optional and does not require any new software to view the images, only to view and manipulate the metadata.
On its own, it doesn’t tell you much about the image except that specific people/organizations edited and approved it, with specific tools. If you don’t trust those people/orgs, then you shouldn’t trust the photo. I mean, even if it says it came straight from a Nikon camera…so what? People can lie with cameras, too.
I wrote a bit more about this in another thread at https://lemmy.sdf.org/comment/5812616 about a month ago, if you’re interested. I haven’t really played with it yet, but there’s open-source software out there you can try.
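To make that concrete, here’s a toy sketch of the idea. Everything in it is hypothetical: the real thing (C2PA-style content credentials) uses asymmetric signatures and certificate chains, but a stdlib HMAC stands in here, and the function names and key are made up. The point is that the signature binds the metadata to the pixels, so verification fails if either changes, while stripping the whole manifest remains trivial:

```python
import hashlib
import hmac
import json

# Stand-in signing key; a real camera/tool would hold an asymmetric private key.
SIGNING_KEY = b"hypothetical-camera-key"

def sign_manifest(image_bytes: bytes, metadata: dict) -> dict:
    """Bind metadata (who edited it, with what tool) to the image bytes."""
    payload = hashlib.sha256(image_bytes).hexdigest() + json.dumps(metadata, sort_keys=True)
    sig = hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"metadata": metadata, "signature": sig}

def verify_manifest(image_bytes: bytes, manifest: dict) -> bool:
    """True only if neither the image nor its metadata was altered."""
    payload = hashlib.sha256(image_bytes).hexdigest() + json.dumps(manifest["metadata"], sort_keys=True)
    expected = hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, manifest["signature"])

image = b"\xff\xd8 pretend jpeg bytes"
manifest = sign_manifest(image, {"org": "Example News", "tool": "ExampleEditor 1.0"})
print(verify_manifest(image, manifest))            # True
print(verify_manifest(image + b"edit", manifest))  # False
# A screenshot is just new pixels with no manifest: nothing to verify, no DRM.
```

Note that nothing in the sketch restricts viewing the image itself; the signature only vouches for who touched it, which is the whole claim.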
You could implement it like that, but I’m not convinced that’s the way this will go. The only way this will have mass adoption, I’m afraid, is if the tech giants can fleece us one way or another.
I think the only risk is if this somehow becomes legally mandated. I just don’t see how that’s possible.
Adoption has a clear path because professional photographers, journalists, and publishers have motivation to prove the authenticity of their images.
For consumers…meh. I don’t enable GPS on my phone camera, and I wouldn’t enable this either. I don’t need to prove anything.
Profitability, not user interest, is the deciding factor. We’ll see how it plays out. I don’t think this one will last.
I guess this is better than nothing, but what happens if you take a photo of a generated photo? There are setups where it’s impossible to tell the result is a photo of a photo, and then the camera will digitally sign the fake photo as real.
Consoles (Xbox, Nintendo, PlayStation) are all hacked eventually. All that will happen is someone will hack a camera to sign any image sent to it.
I think this tech (signed pictures) is just going to make the problem worse. Once a camera is hacked, fake images get “signed” as authentic… We’d be in the same spot we’re in now, except with fake verified pictures.
And consoles are a walled garden; here you would have to build a resilient trust network across all camera manufacturers, and if any private key leaks, the system is compromised.
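That failure mode is easy to sketch. Assuming a leaked signing key (the key, names, and HMAC scheme below are all hypothetical stand-ins for a real camera’s asymmetric key), an attacker can sign arbitrary bytes exactly as the camera would:

```python
import hashlib
import hmac

# Hypothetical: a signing key extracted from a hacked camera. The consequence
# is the same whether the real scheme is symmetric or asymmetric.
LEAKED_KEY = b"extracted-from-hacked-camera"

def camera_sign(image_bytes: bytes) -> str:
    """Stand-in for the camera's signing routine."""
    return hmac.new(LEAKED_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify(image_bytes: bytes, sig: str) -> bool:
    """Any verifier trusting this key now accepts the forgery."""
    return hmac.compare_digest(camera_sign(image_bytes), sig)

ai_generated = b"totally fake image bytes"
forged_signature = camera_sign(ai_generated)
print(verify(ai_generated, forged_signature))  # True: a "verified" fake
```

One leaked key is enough; revocation only helps if verifiers actually check revocation lists.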
Sony will ban any PSN account they find using a modded console.
And I’ll be sure to let them know that I use windower add-ons and DAT mods when playing FF11. Maybe they’ll ban my PS2/PlayOnline from any future updates?
Facebook will not ban users spreading fake signed pictures.
It’s literally worse than nothing
It’s not just a sig on the image, but on metadata as well. Harder to fake time + place if they implement it thoroughly. (I.e., they would have to make it only trust GPS and verify against an internal clock, I suppose, and not allow updating time and location manually.)
…including the date and time a photo was taken as well as its location and the photographer…
That raises the bar on faking, but it doesn’t rule it out; it just lends more credibility to those who can fake it.
Not including GPS and time makes this worse, but including it makes it useless, because you can’t ever verify a photo sent across social media, since the EXIF tags will be stripped.
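The stripping problem is easy to illustrate. In this toy sketch (the dict layout and function are made up for illustration), a verifier can only report on credentials that are still attached, so a stripped real photo and a never-signed AI image produce the identical result:

```python
def verify(photo: dict) -> str:
    """A verifier can only say something about credentials still attached."""
    if "manifest" not in photo:
        # Stripped uploads are indistinguishable from never-signed images.
        return "no credentials"
    # (signature check against the attached manifest would go here)
    return "credentials present"

real_photo = {"pixels": b"\xff\xd8 real bytes", "manifest": {"signature": "abc123"}}
uploaded = {"pixels": real_photo["pixels"]}  # social media strips the metadata
ai_image = {"pixels": b"generated bytes"}

print(verify(uploaded))  # no credentials
print(verify(ai_image))  # no credentials -- identical verdict
```

So the scheme can prove presence of credentials, but absence proves nothing, which is exactly the social-media case.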
Add a depth sensor?
Pissing in the wind.
Sooo how long until there’s a plugin in a1111?
deleted by creator