Comments
Love how he said “Psychopaths” but the text read “Tech Elites” at 1:34
Honestly… by the end of the video I was expecting him to say that he was AI too
The last point is the most notable to me. I don’t know how evidence will be able to hold up in court any more since video can be so easily manufactured. What will evidence have to look like to prove someone is guilty when virtually everything but hard physical evidence can be manufactured? This technology is advancing far faster than we’re preparing for it and very little of that is in the true public interest.
The last point he made is terrifying. I just remember how there was recently a court case where a man murdered someone, and the family of the victim made an AI video of the murdered person in which they “spoke their piece” to the judge and to the murderer. It sounded like a typical “In another life, we could have ended up friends” type of thing.
It wasn’t even a convincing AI video. They just used a portrait of the man and animated his mouth to the words, but with his voice.
The family said they made it and that this is what the deceased would have said if he were still alive.
The judge got emotional over the AI video and ended up sentencing the murderer to more time than he was initially going to give.
This is what will kill off 99% of the internet. If you can’t trust what you see, then what’s the point in seeing anything, other than, of course, pure movie-type entertainment?
Let’s just get off the internet and trust the grass underneath our feet.
Fuck you for cropping out their channel name and/or not crediting them. @pearlmania500 please credit people’s content in the future, it’s what good humans do.
Thanks I hate it
1984. Who controls the past controls the future. Who controls the present controls the past.
I’ve literally lost trust in everything; I’m always questioning whether something is AI.
Waiting for JP makeup line to drop
We’re really doomed
Plot twist. Jake’s video is real, this ginger dude is AI.
Laws need to be implemented quickly around AI-generated content. Videos need watermarks that confirm when and where they were generated. Even if the watermark gets cropped out, the data built into the video needs to still be there so it can be unpicked if necessary. It can be built into the AI models themselves so it’s all verifiable. It also needs to be enforced that AI content cannot include real-life people who have not consented to the video beforehand. Seriously, this stuff can be used in horribly nefarious ways, and it’s only going to get worse.
This technology is only going to create suffering, and if it’s not stopped or heavily regulated, it’s going to be too late.
And honestly, I feel it’s already too late.
Apart from spreading false information, the rich and powerful can now also just get away with things by claiming the evidence against them is AI.
*Internet reality is dead
Welp. Good thing I’m a Luddite and already going back to watching DVDs and reading old books. Come back to VHS with us, kids; it’s a bloody nightmare to tamper with that, and everyone who knew how is dead.
Too bad people have thrown most of that tech in the landfill because we really need it back right now.
Off to see how much a cabin in the woods costs and to call A&E. Oh right, the woods will probably burn down because a 20-year-old wanted to play chicken with some fireworks for a TikTok challenge.
Anyone got a barge I can live on? I’d say a yacht I can steal, but orcas are all over those. And by the time AI is done there’ll be no water to worry about. At least we’ll see the bottom of the ocean and what’s down there before we’re all executed by firing squad.
So fun. Thanks, technology, you’re keeping us alive longer, but a lot of us clearly shouldn’t be.
HoW dO I kNoW tHis gUY iS noT AI, huh? HUH?
This is terrifying.
Another terrifying thing about the bodycam footage especially is that cops will now be able to say “yeah, but that’s fake” to absolutely real evidence of excessive force.
At least my love of DS9 is real.
Just remember the tech shitheads who helped Trump get elected had him put a 10-year pause on any restrictions on AI usage.
I’m sure none of them have bad intentions in doing this
The internet as I knew it died when AI came out. I’ve been downloading everything I can before it’s too late.
Time to get off the Internet
My theory has always been that AI like this will be used simply to make everyone question everything, which, you’d think at first, is a great way of thinking. But sadly it will be used nefariously. So whenever evil-doers say or do deplorable things, people will think “fake news.” And whenever scientists, experts, or government officials try to explain something factual or historical that goes against the agenda of these evil-doers, that will also be questioned.
Facts no longer matter, reality no longer matters. Kids growing up now will be drowning in apathy, with no critical thinking, and that is the whole plan.
The last point: forget the BIG things. Think about how this can be abused for all the smaller claims, or even just the social-implications shit.
So maybe we come up with ways to help counter this for murder, but how easy is it to make up small crimes that never go into the weeds of “forensics,” like simple shoplifting?
Or what about a beef with a neighbor? I can make a video of you stealing my packages if I want and share it all over social media.
Imagine someone at work really doesn’t like you, so they make a video of you hitting on some kid, making you out to be a pedo, and share it around work.
The “small, but life-damaging” things that will never get a “scientific review” can so easily ruin anyone.
Yeah, pretty much everyone who was paying attention realized this was coming about twenty years ago.
every day my decision to not procreate is validated anew
But Jake Paul is in fact gay
Well it’s not like I didn’t know
But that still depresses me
I get Reddit is doing their best to just pretend like AI doesn’t exist as it gets more and more advanced, but people don’t realize just how fast this is advancing. Do you remember that cursed AI video of Will Smith eating spaghetti? That was the absolute best that AI generation was capable of, and that was just TWO YEARS AGO. This is how far it has advanced in just two years, and this is what can be produced with simple one-sentence prompts.
Also it’s making people say that real videos are just AI slop, and that cynicism is going to destroy society.
It makes me wonder if governments have had advanced AI for a while now and we peasants are only just getting it. Like, how many years ahead of us are they technologically?
Honestly, it’s already at the point where I can get tricked if I don’t know the subject or person being impersonated well enough; it’s so real-looking. Scams in the future targeting our parents, and even ourselves, are going to be spooky.
the last part is fucking sad cus that shit will happen
All AI videos should come with a mandatory watermark; like, make a law that enforces that.
Someone gave a reason for allowing the public to play with AI: when it gets too interesting/dangerous, they can take the Internet away.
I cannot upvote this enough
more importantly, what a wonderful DS9 picture ♥
Getting framed or blackmailed with AI footage is definitely on the menu now.
Bring back analog cameras and VHS for the super important stuff. High-security areas and whatnot.
I bet it’s much more difficult to tamper with footage on magnetic tape than it is to make an AI video on your phone.
I don’t get how anyone could be braindead enough to want this technology to be publicly available. How do you put in all this work advancing the tech and not stop to think that maybe this DOESN’T benefit society at all? We’ve known this was coming for years… the consequences are not new discoveries. We’re just going to keep expanding this AI bubble until it bursts and our society goes with it.
AI is a curse. I wish it hadn’t been invented.
This is actually pretty scary because a few months ago you could easily tell whether something was AI or not.
We will still be able to trust archival footage, just not on YouTube or similar platforms. We will need national libraries or museums to curate and make available verified footage from their archives.
That was the point all along. The “tech elites” don’t want you *just* not believing fake news, they want you not believing *anything anymore ever again*
As someone who grew up appreciating stop-motion special effects in movies, then was blown away by Jurassic Park, let me tell you: THIS SHIT SHOULD SCARE THE HELL OUT OF US.
It’s fucking crazy that it no longer takes a team of the most talented and brilliant artists in the world to create an effect that imitates reality or makes outright fantasy believable, and that it’s no longer done just as entertainment for a consenting, voluntary audience. Deepfakes and AI now outright qualify as simulacra.
And not only are we stuck with a judicial and law-enforcement system that is in no way prepared for it, but Republicans banned any legislation from being passed to regulate AI for the next 10 *YEARS*.
I’m scared shitless, y’all.
I kept waiting for him to say, “And I made this video with AI too!” 😬
maybe this video is AI? 🤔😂