The innovations that AI makes possible can seem downright miraculous. But as a woman on Reddit recently learned, the technology comes with a disturbing underbelly.

That dark side includes dangerous violations of privacy, a problem that is not only far more widespread than most of us realize but also one against which the law provides virtually no protection.

A woman discovered her husband had been using AI to generate ‘deepfake porn’ photos of her friends.

In her Reddit post, the woman detailed how she and her husband have been “going through a really rough patch lately.” In an effort to salvage their marriage, they’ve been digging into tough conversations, including those about “poor decisions we’ve made within the relationship.”

During one of those talks, her husband revealed he’d engaged in behavior that many people would consider not only infidelity but also a disturbing breach of several women’s privacy: creating AI porn images from photos of her friends.

Her husband revealed that he has been using AI-generated nude photos for sexual pleasure.

“Last night, he told me he has gotten swimsuit pictures of my best friends as well as some girls [with whom I went to] high school, off of social media and used an AI website to make them naked,” she writes.

Her husband has been doing this for years, and he has done far more than just look at the images. “He said he has jacked off a few times to these images over the past few years,” she writes.
