noname223

Archangel
Aug 18, 2020
5,701
Today I read a newspaper article about deepfakes. Many apps let people create deepfake porn with their technology. Some people use them to make porn featuring women who have never given consent. They make huge amounts of money with it, and at the same time they ruin many women's lives.
I think the risk of being used for deepfakes is much higher for women. Unless you are a famous man, the risk of being used is not that high. Apparently around 90% of deepfakes are porn featuring women.

There is another risk with deepfakes: their political use to interfere in elections.
Because the technology keeps getting better, the risks keep getting higher. Apparently even technology noobs can create realistic deepfakes with these apps. That is dangerous.

Do you think this technology needs stricter regulation? In many countries the punishment for creating deepfake porn seems mild because the usual laws don't cover the issue. Do you think this could become a huge problem in the future? Will there be a way to restrict it, or will we all have to live with the fact that anyone could make deepfakes of all kinds with our faces or bodies? Should there be stricter laws?
 
Julgran

Enlightened
Dec 15, 2021
1,427
Today I read a newspaper article about deepfakes. Many apps let people create deepfake porn with their technology. Some people use them to make porn featuring women who have never given consent. They make huge amounts of money with it, and at the same time they ruin many women's lives.
I think the risk of being used for deepfakes is much higher for women. Unless you are a famous man, the risk of being used is not that high. Apparently around 90% of deepfakes are porn featuring women.

There is another risk with deepfakes: their political use to interfere in elections.
Because the technology keeps getting better, the risks keep getting higher. Apparently even technology noobs can create realistic deepfakes with these apps. That is dangerous.

Do you think this technology needs stricter regulation? In many countries the punishment for creating deepfake porn seems mild because the usual laws don't cover the issue. Do you think this could become a huge problem in the future? Will there be a way to restrict it, or will we all have to live with the fact that anyone could make deepfakes of all kinds with our faces or bodies? Should there be stricter laws?

Those sentences make my mind go haywire:

"Some people use them to make porn featuring women who have never given consent."

"There is another risk with deepfakes: their political use to interfere in elections."

Imagine Biden, Trump or Kamala... ha ha! :smiling:


On a serious note, though, deepfakes will probably become so convincing, and also expand into other territories, such as voice and writing, that they will be indistinguishable from their real counterparts. I'm not sure that laws can do much to counteract these political deepfakes, but there could be strict punishments for creating and using them for political purposes, which could act as a deterrent against abuse.


This Tom Cruise deepfake definitely fooled me:

 
Alcoholic Teletubby

Rip in piss
Jan 10, 2022
426
They're paranoia brought to life. No damn privacy anywhere.
 
seewell

Member
Oct 16, 2022
23
Do you think this technology needs stricter regulation? In many countries the punishment for creating deepfake porn seems mild because the usual laws don't cover the issue. Do you think this could become a huge problem in the future? Will there be a way to restrict it, or will we all have to live with the fact that anyone could make deepfakes of all kinds with our faces or bodies? Should there be stricter laws?
No law can stop code, especially when it is easy to use. Anyone could have their picture taken at any moment if they are regularly in public. How do you combat someone maliciously using pictures to make deepfakes? I don't think there are many options. However, one can use public-key cryptography to keep "fakes" from being accepted as real, for instance with politicians and public figures: genuine footage gets signed with the subject's private key, and anyone can check the signature against the published public key. That technology has existed for a very long time and will probably become more commonplace in everyday life in the near future.
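The public-key idea above can be sketched concretely: the subject signs their genuine footage with a private key, and anyone holding the published public key can verify a clip. A minimal illustration using the third-party Python `cryptography` package; the short byte strings here are stand-ins for real video data:

```python
# Sketch of the signing idea: a public figure signs authentic footage,
# and viewers verify clips against the published public key.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # kept secret by the speaker
public_key = private_key.public_key()       # published for everyone

genuine_clip = b"bytes of the authentic video"
signature = private_key.sign(genuine_clip)  # distributed alongside the clip

# A viewer checks a clip against the published key:
def is_authentic(clip: bytes, sig: bytes) -> bool:
    try:
        public_key.verify(sig, clip)  # raises InvalidSignature on mismatch
        return True
    except InvalidSignature:
        return False

print(is_authentic(genuine_clip, signature))                # True
print(is_authentic(b"bytes of a deepfaked video", signature))  # False
```

This only proves a clip came from the keyholder unmodified; it cannot flag a deepfake that was never signed in the first place, which is why it helps most for figures whose genuine statements are expected to carry signatures.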
 
narval

Enlightened
Jan 22, 2020
1,188
Potentially very dangerous, especially for anything law-related. It's entirely possible that someone could pre-make incriminating videos or pictures, throw out some bait and... bingo! You've been incriminated.

The worst thing: as @seewell said, this can't easily be stopped without making the solution worse than the problem (i.e., censoring and regulating the IT industry and the whole world wide web to an insane degree). And even that wouldn't fix the problem: deepfakes would just become harder to make, not disappear. Anyone who anticipated this and already downloaded the tools and the media files to feed the AI on their own PC wouldn't be affected at all.

I hope that IT tools will be able to detect (or at least flag), from metadata or from the audiovisual file itself, this kind of modification.

Plan B (worst case): legal reforms, so that audiovisual media is no longer admissible as evidence.
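A limited version of the detection hope above already exists today: if a publisher announces a cryptographic digest of the original file, anyone can check whether a copy has been altered, since any edit changes the digest. A minimal stdlib sketch, with short byte strings standing in for real video files:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Hex digest of a file's bytes; changing even one bit alters it."""
    return hashlib.sha256(data).hexdigest()

# The publisher announces the digest of the authentic file.
original = b"bytes of the authentic video file"
published_digest = sha256_of(original)

# Anyone can compare a copy they received against the published digest.
copy = b"bytes of the authentic video file"
tampered = b"bytes of a doctored video file"

print(sha256_of(copy) == published_digest)      # True: byte-identical copy
print(sha256_of(tampered) == published_digest)  # False: file was modified
```

Note the limitation: this detects tampering with a known original, but says nothing about a wholly fabricated clip that never had a published digest.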
 
kiuya

Tired
Nov 16, 2021
92
Like others have said, it's a big upcoming issue. It's scary to think that anyone can grab your photos from Instagram, train an AI on them long enough and, badaboom, there's a porn video of you. Honestly I hope deepfakes will turn into just a fun gimmick, like the Tom Cruise deepfake, but people seem to prefer wanking to girls who haven't consented.
 
makethepainstop

Visionary
Sep 16, 2022
2,032
Deepfakes are dangerous as hell; with a good one it's possible to show a nation's leader saying and doing things that could start a nuclear conflict.
 
Angst Filled Fuck Up

Visionary
Sep 9, 2018
2,985
Potentially very dangerous, but then people used to say this about Photoshop too. There is a deepfake of my face superimposed onto Chris Hemsworth's Thor body floating around, but I've just decided to go with it.
 
akirat9

エクトリアン
Sep 23, 2022
386
Never showed me face. Never will = safe 100%
 
seewell

Member
Oct 16, 2022
23
Never showed me face. Never will = safe 100%
There is still the possibility of manipulating images that are taken of you. There has been some research on face paint/makeup and IR-blasting glasses that mess with camera sensors. I imagine some of that may start catching on if more and more big-brother-style cameras are deployed in cities. That work is mostly about facial recognition, but it could probably apply here too.

Additionally, there has been some adversarial testing of current AI systems, particularly autonomous driving. Researchers have found ways to make signs with specific patterns/icons on them that trick an AI into thinking the sign is a person or a tree or whatever. Something like that could possibly be deployed as a tactic against this too.
 
GrumpyFrog

Exhausted
Aug 23, 2020
1,913
I don't think this technology is yet at the level of development and accessibility where we need to make the laws stricter. As it grows and becomes more common, we will come up with ways to confirm that a video is genuine, and people will also become less trusting, the same way that when Photoshop became commonplace it didn't take long for people to wrap their minds around the idea of pictures being photoshopped. Society will catch up to this technology the same way it caught up to everything else that came along in recent years. I don't think there is real cause for alarm.
 
akirat9

エクトリアン
Sep 23, 2022
386
There is still the possibility of manipulating images that are taken of you. There has been some research on face paint/makeup and IR-blasting glasses that mess with camera sensors. I imagine some of that may start catching on if more and more big-brother-style cameras are deployed in cities. That work is mostly about facial recognition, but it could probably apply here too.

Additionally, there has been some adversarial testing of current AI systems, particularly autonomous driving. Researchers have found ways to make signs with specific patterns/icons on them that trick an AI into thinking the sign is a person or a tree or whatever. Something like that could possibly be deployed as a tactic against this too.
No one has taken a photo of me. That is a fact.
 
littlelungs

Wizard
Oct 21, 2018
646
If used with malicious intent then, yeah, they can be super fucking dangerous... just like pretty much anything else. The world is an absolute shit-show, and there will always be people who abuse, contort and manipulate technologies and systems that are, or have the potential to be, beneficial in their own right. The first time I saw a deepfake, I'm not gonna lie, I was freaked the fuck out... but ultimately nothing terrifies me more than humanity, what "we're" capable of, and the future of this world because of humanity, if that makes even a lick of sense.

As it grows and becomes more common, we will come up with ways to confirm that a video is genuine, and people will also become less trusting, the same way that when Photoshop became commonplace it didn't take long for people to wrap their minds around the idea of pictures being photoshopped. Society will catch up to this technology the same way it caught up to everything else that came along in recent years.

I hope you're right.
 
Disappointered

Enlightened
Sep 21, 2020
1,283
People have already been set up with fake Facebook posts by a certain legitimized ethnic mafia.
 
BruhXDDDDD

Student
Feb 18, 2022
166
As of right now, I've only really seen people use them to make nonconsensual celebrity porn. But considering how sexualized celebrities are to begin with, I can't bring myself to care very much; on that front I'm more concerned about revenge porn and the like, and mostly about it being used to blackmail or frame people. Eventually the technology will likely be good enough that most people won't take any video at face value, because it could have been faked.
 
  • Like
Reactions: Disappointered
