US Lawmakers Say AI Deepfakes 'Have the Potential To Disrupt Every Facet of Our Society' (theverge.com)
Yesterday, several lawmakers sent a letter to the Director of National Intelligence, Dan Coats, asking him to assess the threat posed to national security by deepfakes -- a new type of AI-assisted video editing that creates realistic results with minimal effort. The Verge reports: The letter says "hyper-realistic digital forgeries" showing "convincing depictions of individuals doing or saying things they never did" could be used for blackmail and misinformation. "As deep fake technology becomes more advanced and more accessible, it could pose a threat to United States public discourse and national security," say the letter's signatories, House representatives Adam Schiff (D-CA), Stephanie Murphy (D-FL), and Carlos Curbelo (R-FL). The trio want the intelligence community to produce a report that includes descriptions of when "confirmed or suspected" deepfakes have been produced by foreign individuals (there are no current examples of this), and to suggest potential countermeasures. In a press statement, Curbelo said: "Deep fakes have the potential to disrupt every facet of our society and trigger dangerous international and domestic consequences [...] As with any threat, our Intelligence Community must be prepared to combat deep fakes, be vigilant against them, and stand ready to protect our nation and the American people."
They're lying (They're politicians) (Score:4, Interesting)
They're obviously lying. Politicians are gonna love the proliferation of deepfakes. That way, the next time they say something stupid in an interview, they can say the clip is a deepfake. They're just starting to blame them as an upcoming problem now so they can start using it as an excuse ASAP.
Re: (Score:2)
Same as it ever was (Score:5, Informative)
Photos used to be considered "strong evidence"; then Photoshop etc. came along to make doctoring cheap and common, and people stopped trusting photos. The same will happen to audio and video once people see enough fudged examples.
Re: (Score:3)
Agreed. The problem is that with photos, video and voice all untrustworthy, what is left? How can someone come to an independent conclusion about anything?
Politicians will just deny saying things that were caught on video....oh wait....
It really is a serious issue.
We figured out stuff before video existed ... (Score:3)
Agreed. The problem is that with photos, video and voice all untrustworthy, what is left? How can someone come to an independent conclusion about anything?
The same way we did before the proliferation of movie/video cameras and audio recorders. These are only very recent inventions.
Re: (Score:2)
Re: (Score:3)
Eye witness testimony? Great. It is a good thing we never had a problem with false accusers and poor perception.
Right because ...
(1) We had nothing other than eye witness testimony, never that stuff called evidence.
(2) And video eliminated false accusers and poor perception, because actions are never subject to interpretation and the camera/mic catches all. It's not as if anything happened before the video started or after it ended, or out of frame, or beyond the mic's range, filling in unknown gaps.
Videos get faked all the time for political purposes, and people are largely gullible and believe them. Few have t
Re: (Score:2)
It's far worse than that. How, exactly, do eye witnesses tend to share their experiences with people? Through videos. Which can be faked.
So now not only do we have the problem of video fakes of politicians and politicians claiming real video is fake, we have the problem of eye witnesses being faked on video, and video of eye witnesses being called fakes.
On the flip side, maybe print journalism will have a resurgence, as we're headed back to the reputation of an individual who talked to eye witnesses being t
Re:We figured out stuff before video existed ... (Score:4, Interesting)
We used to have sources that the majority of people believed in. Professors. Clergy, civic leaders. Not saying the trust was deserved, but at least people mostly agreed on facts. (even if they were wrong)
Re: (Score:2)
We used to have sources that the majority of people believed in. Professors. Clergy, civic leaders. Not saying the trust was deserved, but at least people mostly agreed on facts. (even if they were wrong)
The world is full of evidence beyond video. There is no need to resort to leaders, clergy and professors who are offering opinions rather than evidence based conclusions.
Re: (Score:2)
In a way we're back to the problem that's plagued philosophers since time immemorial. Can we trust what we perceive? Do we go back to Hume and trust nothing, or back off a little and say: well, we aren't entirely sure, but we do have this fantastic cognitive ability to sort through it and hopefully come to the right conclusions, while accepting we kinda see what we wanna see.
No good answers here, other than this really isn't as new as we think. We've always had to contend with untruths.
The difference is, for m
Re: (Score:3)
Agreed. The problem is that with photos, video and voice all untrustable, what is left?
For the most part, photos, videos and voice are only untrustworthy to the common person. To an expert, identifying a doctored item is usually very straightforward.
Re: (Score:3)
Sure, but in a typical court case, the defense and the prosecution will both put forth 'experts', one assuring everyone a video is fake, the other assuring everyone that video is real, and the jury (who have absolutely no clue how to tell which expert is reliable) will have to go off various internal biases.
Investigators, prosecutors, judges - none of these people have the knowledge to distinguish real videos from fake videos, and they also don't have the knowledge to distinguish real experts from bullsh-tt
Re: (Score:2)
Well that's fundamentally the way ANY court process works about anything. The goal of the jury is to decide what information to trust. There's nothing deepfake about this, it's the way it's been done since the 1800s.
Re: (Score:2)
Re: (Score:2)
The jury must decide what to trust, sure, but shouldn't be deciding whether a scientific process is valid, and allowing in junk science is a major problem.
They aren't deciding whether a scientific process is valid; they are deciding whether to trust the explanation offered by experts, trust in the people. I agree bad science shouldn't be put in front of the jury, but that's an idealistic view of a court system whose amazing airtight defence for forcing out the truth is putting your hand on a book and pinky swearing.
But I think we're fundamentally in agreement anyway. This isn't about deep fakes doctoring videos. It's about a process that has flaws and has had
The expert should explain it, not proclaim it (Score:2)
> And the jury (who have absolutely no clue how to tell which expert is reliable)
The best experts will explain to the jury how they can look at the evidence and draw conclusions, rather than just proclaiming a conclusion.
As one example, a blood spatter expert can demonstrate, either live or on video, that slinging a wet sponge at high speed leaves a pattern of very small droplets, while carrying it or slinging it slowly lets much larger drops form. The jury can then see for themselves whether the crime
Re: (Score:2)
For "best" I think you should substitute "most convincing". A lot of the traditionally accepted forensic tests have never been properly validated. Some which were recently tested failed attempts to validate. (IIRC one that failed was correlating scars on a bullet and the barrel it was fired through...and that one *ought* to have worked.) But this doesn't mean that the technique that has failed validation won't be used in some other case, or that cases where it was used as "convincing evidence" will now
Re: (Score:2)
Given the poor quality, and the lack of provenance or ability to verify a chain of custody, of most video and audio evidence, it's not certain at all that more sophisticated doctoring will be apparent. An analyst with good tools can often deduce a great deal about the recording tools and the details of the original recording. But it takes time and effort that is often not of interest to a prosecutor.
Re: (Score:2)
There's no need to verify chain of custody if doctoring becomes apparent through simple statistical techniques. I think you're overestimating the time and effort required. Often a simple JPEG error level analysis can determine what was doctored in an image, to say nothing of the fact that most outright lies (meaning composite images where someone has been doctored out or doctored in) are easy to identify.
For the common case we are well and truly in the realm of good enough. Now if you're prosecuting e
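The error-level analysis mentioned above can be illustrated without a real JPEG codec. In this toy sketch, rounding to a quantization step stands in for lossy compression (an assumption, not how JPEG actually encodes): pixels that have already been through the "codec" barely change when re-compressed, while freshly pasted pixels change a lot.

```python
def lossy(block, q=16):
    # crude stand-in for JPEG quantization: snap values to multiples of q
    return [q * round(v / q) for v in block]

# "original" image row, saved once through the lossy codec
original = lossy(list(range(128)))

# a forger pastes in fresh, never-compressed pixel values
tampered = original[:]
tampered[40:48] = [200, 201, 202, 203, 204, 205, 206, 207]

def error_levels(img, q=16):
    # re-compress and measure how much each pixel moves;
    # already-compressed regions move little, fresh edits move more
    resaved = lossy(img, q)
    return [abs(a - b) for a, b in zip(img, resaved)]

levels = error_levels(tampered)
suspicious = [i for i, e in enumerate(levels) if e > 0]
print(suspicious)  # only indices inside the pasted region stand out
```

Real ELA works on 8x8 DCT blocks and re-saves through an actual JPEG encoder, but the principle is the same: inconsistent compression history betrays the splice.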
Re: (Score:2)
Running those "simple statistical techniques" is technologically feasible. Is it feasible fiscally? How much does it cost to run this analysis? Is this going to become more difficult and complex to detect as the doctoring tools improve?
Prosecutors are _not_ happy to have to spend more time and effort validating evidence. Neither are defense attorneys. It costs time and effort.
Re: (Score:2)
Running those "simple statistical techniques" is technologically feasible. Is it feasible fiscally? How much does it cost to run this analysis?
Well, for a basic first pass you can upload the picture to any online ELA website. For courts, they may want to do something more trustworthy, like downloading free code from GitHub, or perhaps using one of the several open-source forensic tools which are available. The same technique used for images works for video too, though I'm not sure as many open-source programs are around for it at this point.
The point is, it's not as hard as you make out, and the level of effort / expense is proportional to the crim
Re: (Score:2)
Most of the cost isn't the code, it's the expert that runs the code, interprets the results for lay people, and testifies that the code actually performs a valid analysis.
Re: (Score:2)
Reading the book The Truth Machine right now, which touches on this issue. The solution mentioned in the book is digital certification and notarization of images and video. Security camera footage, for example, is all digitally signed (by whom, I don't know). Any modification of the footage breaks the signature and is thus inadmissible in a court, or easily detectable. Trusted services that can notarize images and video make a lot of money in this universe. The mechanics of how this all would happen is no
Re: (Score:2)
Back to analog for forensic evidence? Those are harder to fake.
If there is video footage of something, it may not stand as proof on its own. Other bits of evidence from multiple sources and in different forms will have to align with what the video shows to form a consistent view.
Re: (Score:2)
I think the difference with photos is that only limited information can be gleaned from a photo. The information which might be gleaned from a video is far, far more detailed. In particular, if they could effectively fake a voice, then the potential implications are pretty tremendous.
Personally, I suspect we're a long way off from reasonably-convincing fakes. And even when we do have such fakes, it's going to be far more difficult to make a fake which withstands careful scrutiny.
Re: (Score:2)
What's funny is that most photos are more detailed and higher resolution than most videos. Video captures motion, but usually has lower (often much lower) pixel density, worse color quality, and often suffers from poor image stabilization.
Re: (Score:2)
What's funny is that most photos are more detailed and higher resolution than most videos. Video captures motion, but usually has lower (often much lower) pixel density, worse color quality, and often suffers from poor image stabilization.
In spite of that, you can often make out details on video that don't show up well in a photograph, like flat reflective surfaces which can be difficult to identify in stills. Ideally, you'd have both of them. Also, blur.
Re: (Score:2)
Is that really true, though? Yes, Photoshop has made photo doctoring easy, but the digital age makes it easy enough to detect and debunk. AFAIK, it is extremely difficult to doctor a photo well enough that simple forensic analysis cannot determine that it was manipulated, and even w
Re: (Score:2)
Agreed. If nothing else, we'll train an AI to spot the fakes. I mean, apparently AI can do anything these days, right?
Re: (Score:3)
If nothing else, we'll train an AI to spot the fakes
Yes, that is exactly how it works. Train one DNN to generate fakes, and train another to detect the fakes. Set them up as adversarial networks [wikipedia.org].
The generating network, and the detecting network continue to improve as each fights to defeat the other. Eventually the generator creates an image or video so good that it passes as "real". That is a "deep fake".
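The adversarial dynamic described above can be caricatured in a few lines. This is emphatically not a real GAN (no neural networks, no gradients, every name here invented for illustration): the "generator" is a single number nudged toward whatever a "discriminator" scores as more real, and the discriminator's notion of "real" is just a running mean of genuine samples. It only shows the co-adaptation loop, not the technique at scale.

```python
import random

random.seed(0)

# "real" data the generator tries to imitate
REAL_MEAN = 5.0
def real_sample():
    return random.gauss(REAL_MEAN, 0.5)

# Discriminator: scores how "real" a value looks, using its
# running estimate of the real data's mean (higher = more real).
disc_mean = 0.0
def discriminator(x):
    return -abs(x - disc_mean)

# Generator: a single parameter; starts far from the real data.
gen_mean = -5.0

for step in range(2000):
    # discriminator "trains" on a real sample (running average)
    disc_mean += 0.05 * (real_sample() - disc_mean)
    # generator tries a nudge in each direction and keeps whichever
    # output the discriminator currently scores as more "real"
    up, down = gen_mean + 0.01, gen_mean - 0.01
    gen_mean = up if discriminator(up) > discriminator(down) else down

print(gen_mean)  # ends up near the real mean
```

In an actual GAN both sides are deep networks trained by backpropagation, and the discriminator is explicitly trained to reject the generator's output, which is what forces the generator toward samples that pass as real.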
Re: (Score:2)
This still doesn't make sense. You cannot reliably determine the best 'faked images', and there's a reason for that - the only way we can know a well-faked image is fake is if there are things that *semantically* don't make sense in an image - e.g. say it's supposedly a photo of a ball, but the lighting doesn't make sense for a ball as there's a shading anomaly on the surface of the ball - the human can say, OK, that's supposed to be round but it's not, that must be fake - but if the AI 'fake detector' does
Re: (Score:2)
"Agreed. If nothing else, we'll train an AI to spot the fakes. I mean, apparently AI can do anything these days, right?"
So you'll train the AI to spot fakes based on what, real-world training data for which you can't reliably determine they're fakes? Or you'll create your own fakes as training data? This doesn't make much sense; since it's impossible for a human to distinguish a very well-done fake from a legitimate image, you're basically just going to train your AI to at best determine badly-done fakes -
Re: (Score:2)
I think you missed the sarcasm implied in my post. But given recent AI hype, I could see how that might be taken seriously. That being said...
So you'll train the AI to spot fakes based on what, real-world training data for which you can't reliably determine they're fakes?
Yes. It's not like you can't find a bunch of training material which you know are fakes to train your algorithm on. I can't tell most visual effects in movies are fake, but I know they are. Same principle.
Also, note that it's *humans* that can't reliably tell if they're fakes. The whole point is that computers have much deeper powers of analysis than humans do -
Re: (Score:2)
"Yes, Photoshop has made photo doctoring easy, but the digital age makes it easy enough to detect and debunk"
Considering how many people I've seen boldly declaring something 'obviously fake' on the basis of dumb crap like JPEG artifacts or H.264/H.265 encoding artifacts, I'm very wary of anyone who thinks it's "easy" to detect and debunk fakes. Actually, I've been doctoring photos in Photoshop for over 10 years, and the bottom line is that while badly-done fakes are easy to spot, most well-done f
Re: Same as it ever was (Score:2)
Re: (Score:2)
True, but then the analysis falls to believing experts arguing over whether or not there was tampering
Re: (Score:2)
That's not how embedded encrypted hashes work.
It's not a problem to tag videos or any kind of a digital recording with invisible but readable watermarks based on the content of the video.
Hell... Kids today love blockchains.
Put a blockchain in it and tag it all the way from the lens to the eyeball. OK... CCD/CMOS to LCD.
The problem is backwards compatibility and the cost of upgrades.
There's still video out there that is captured with outdated analog devices, particularly of the security kind.
And that's not e
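Setting the blockchain flourish aside, the "readable watermark based on the content" idea reduces to keyed hashing: the device tags each frame with a MAC that any edit invalidates. A minimal stdlib sketch, assuming a hypothetical per-device secret key; note this is not a true embedded watermark, since the tag rides alongside the frame rather than inside the pixels.

```python
import hashlib
import hmac

CAMERA_KEY = b"device-secret"  # hypothetical per-device key, kept in the camera

def tag_frame(frame_bytes):
    # keyed hash of the frame content; only the key holder can produce it
    return hmac.new(CAMERA_KEY, frame_bytes, hashlib.sha256).hexdigest()

def verify_frame(frame_bytes, tag):
    # constant-time comparison to avoid timing side channels
    return hmac.compare_digest(tag_frame(frame_bytes), tag)

frame = b"\x10\x20\x30 raw sensor data"
tag = tag_frame(frame)

assert verify_frame(frame, tag)                    # untouched footage checks out
assert not verify_frame(b"edited" + frame, tag)    # any edit breaks the tag
```

The catch, as the parent notes, is key management and legacy hardware: anyone who can extract the key from the device can re-tag doctored footage.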
Re: (Score:2)
You understand that. I mostly understand that. The problem is explaining it to someone who doesn't have any technical education. Maybe if society gains experience with the reliability of the system, but that will take a long time, and a few high-profile exceptions will kill that belief for the whole process.
Re: (Score:2)
Pretty damn simple.
Re: (Score:2)
Of course, that just shifts the trust to the signature. Anyone can create a fake, sign it with a random key, and claim it to be a signed video. The problem becomes one of trusting both the validity of the key and making sure the key is not available to someone with an interest in producing a fake video.
So if the police present a signed dashcam video showing that the citizen shot first, we still have to ask if the police had access to the signing key. If yes, then the signature means nothing.
Just to further
Re: (Score:2)
So if the police present a signed dashcam video showing that the citizen shot first, we still have to ask if the police had access to the signing key. If yes, then the signature means nothing.
That is not how asymmetric encryption works.
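To make the parent's point concrete: with asymmetric signatures, producing a valid signature requires the private key, while checking one requires only the public key, so mere access to the verification side lets nobody forge. A textbook-RSA toy sketch (message, primes and key sizes invented for illustration; wildly insecure, with no padding scheme, where real systems use 2048-bit-plus keys and RSA-PSS):

```python
import hashlib

# toy primes; real keys are vastly larger
p, q = 10007, 10009
n = p * q
e = 65537                             # public exponent (shareable)
d = pow(e, -1, (p - 1) * (q - 1))     # private exponent (kept secret)

def digest(msg):
    # hash the message, reduced into the signing group for this toy modulus
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def sign(msg):
    # requires the PRIVATE key: only the camera/owner can do this
    return pow(digest(msg), d, n)

def verify(msg, sig):
    # requires only the PUBLIC key: anyone, including a court, can check
    return pow(sig, e, n) == digest(msg)

video = b"dashcam footage, frame 0001"
sig = sign(video)

assert verify(video, sig)                       # authentic footage passes
assert not verify(b"doctored " + video, sig)    # any edit breaks the signature
```

The grandparent's worry still stands in one respect, though: if the signer (e.g. the police department) controls the private key, a signature only proves the key holder vouched for the footage, not that the footage is true.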
Re:Docoring photos is as old as photos themselves (Score:5, Insightful)
In fact, the entire legal system uses nothing but scientifically invalid "proof"... Our legal system doesn't even remotely adhere to that.
The threshold for legal conviction in the US (and a few other countries) is "beyond a reasonable doubt" and not "absolute proof" for good reason. If you demand 100% proof, you would almost never convict anyone, and that wouldn't serve justice either. The legal system has to carefully balance the ability to obtain a conviction when warranted versus protection against false accusations. Real life tends to be a bit messier than a peer-reviewed scientific paper, and rarely deals in absolutes.
Re: (Score:2)
That's the official standard, all right. But it doesn't have much to do with how verdicts are often reached. Do you really find it difficult to believe that a prosecutor intentionally fudged some evidence and withheld other evidence? It happens all the time, and has frequently been proven, admittedly often years after the conviction.
Re: (Score:2)
We have already got a legal system that has for (at least) decades trusted the honesty of the prosecutor handling the evidence, so this is a very small increment in reasonable distrust.
Possibly I should have said centuries, but I'm not sure how long the current prosecutor/defense attorney system has been in place, and analogs have been in place back to the Classical Greeks. (I.e., the Romans had an analog, but I'm not sure how the Greeks managed things, or whether they all did it the same way.)
In other words (Score:2)
OH NOES (Score:4, Insightful)
This has nothing to do with surveillance (Score:2)
Heinlein's "Fair Witness" (Score:5, Interesting)
Robert A. Heinlein imagined this problem, and in his novel Stranger in a Strange Land [wikipedia.org] he described a new profession: "Fair Witness"
A Fair Witness is a person who is trained to observe and remember without jumping to any conclusions. A Fair Witness should be able to describe in court what he/she saw, and only that. As an example, a Fair Witness would say something like "I observed a house, and the side I saw appeared white" rather than "I observed a white house." It's possible that other sides of the house, not seen by the Fair Witness, could be a different color; and it's possible the house was repainted after the Fair Witness saw it... the Fair Witness keeps such things in mind.
Surprisingly, Wikipedia doesn't seem to think that the idea of "Fair Witness" is notable. I Google searched for a reference, and I found a reference that claims to be quoting Wikipedia, but I can't find it on Wikipedia now.
http://dlkphotography.com/fair-witness/stranger-in-a-strange-land [dlkphotography.com]
I found the "Fair Witness" idea to be one of the most interesting things in the book, and I have long wondered if we would one day see that profession in real life.
That's hammered into private investigators (Score:3)
That's something that was said over and over again in my private investigator and security officer training.
You DON'T tell the client "yes I caught your spouse cheating on you with his ex". You write notes as its happening if possible saying "I observed a white sedan park near 124 Oak Street". It's possible that the car isn't the subject's car, it only looks similar. It's possible that the suspected companion doesn't currently live at that address. It's possible that the subject went there to meet with his
Re: (Score:2, Insightful)
In my "Intro to Psychology" class we learned about experiments that were performed that involved staging an event in front of a room full of people, including people who were trained as observers. Then, quizzing them on the details.
Everyone performed badly, including the trained observers. Especially prominent were racial biases, skewing their memory of who did what.
The notion of "fair witness" is too high a bar for humans to hit. We cannot avoid jumping to conclusions, infusing our biases, etc. And our
Re: (Score:2)
Reminds me of this witness memory demo on youtube [youtu.be]. IMHO, the problem of synthesized eyewitness videos will, to paraphrase JBS Haldane or Arthur Eddington, eventually lead to trouble that will not be just worse than we imagine, but worse than we can imagine.
Re: (Score:2)
staging an event in front of a room full of people, including people who were trained as observers. Then, quizzing them on the details.
Everyone performed badly, including the trained observers.
I once participated in an exercise along those lines. Our group was shown a video in which a man with a revolver robbed a bank or something and fired the revolver. The group was asked how many shots the man fired.
Some of the group thought he fired 7 times. From his revolver.[1]
I actually got the count perfect, but
Re: (Score:2)
Search for the experiment about the man in the gorilla suit, or about the stabbing murder with a banana. Average recollection isn't more reliable. People tend to see what they expect to see.
Don't trust any one source (Score:3)
Simple solution: don't trust any one source. Even if they're supposedly impeccable. Look for corroboration from multiple independent sources (and make sure they're really independent and not all getting their information from the same source). For instance if you have a video of someone checking into a hotel with a compromising companion, look for corroboration from the hotel's records, hotel staff who should have interacted with them, and the person's credit-card records. This is what we used to do before people got lazy and started believing everything they were told without question.
Re: (Score:2)
You're solving the wrong problem. People in power want to throw poor people in jail with video evidence. They also want to be able to afford expensive lawyers to plant doubt that a video shows the objective truth.
When politics gets involved with tech, it is all about self preservation and persecution.
Re: (Score:2)
Simple solution: don't trust any one source. Even if they're supposedly impeccable. Look for corroboration from multiple independent sources (and make sure they're really independent and not all getting their information from the same source). For instance if you have a video of someone checking into a hotel with a compromising companion, look for corroboration from the hotel's records, hotel staff who should have interacted with them, and the person's credit-card records. This is what we used to do before people got lazy and started believing everything they were told without question.
Indeed.
Deuteronomy 17:6 [biblegateway.com]
On the evidence of two witnesses or of three witnesses the one who is to die shall be put to death; a person shall not be put to death on the evidence of one witness.
US Lawmakers (Score:5, Insightful)
are simply afraid of the competition.
The US ( and everyone else ) has been altering both modern and historical facts to suit their own agendas since the very beginning. :|
I'm curious why the sudden concern
Pot. . . meet kettle.
Lawmakers don't have many facets (Score:2)
Lawmakers don't have many facets, maybe as few as two: graft and porn. I can see how they might project that onto society in general.
What you can do and can't do (Score:4, Interesting)
What could be possible:
* Cryptographic signatures on raw data leaving the camera, or on the default recording app as the videos/photos are taken
Probably won't help that much, but might help to identify unedited footage.
* Give images/videos timestamps signed by a third party immediately as they are taken
This can prove that a piece of information existed no later than that time.
None of these can thwart recording a video screen playing back pre-edited video.
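The third-party timestamp bullet can be sketched as a service that signs (hash, time) pairs without ever seeing the footage itself. In this sketch a shared-secret HMAC stands in for the authority's signature, and the key and token format are invented for illustration; a real timestamping authority (RFC 3161 style) would use an asymmetric signature instead.

```python
import hashlib
import hmac
import json
import time

TSA_KEY = b"timestamp-authority-secret"  # hypothetical trusted third party's key

def timestamp(media_bytes, now=None):
    # The authority sees only the hash, never the footage.
    token = {
        "sha256": hashlib.sha256(media_bytes).hexdigest(),
        "time": now if now is not None else int(time.time()),
    }
    payload = json.dumps(token, sort_keys=True).encode()
    token["mac"] = hmac.new(TSA_KEY, payload, hashlib.sha256).hexdigest()
    return token

def check(media_bytes, token):
    payload = json.dumps(
        {"sha256": token["sha256"], "time": token["time"]}, sort_keys=True
    ).encode()
    ok_mac = hmac.compare_digest(
        token["mac"], hmac.new(TSA_KEY, payload, hashlib.sha256).hexdigest()
    )
    ok_hash = hashlib.sha256(media_bytes).hexdigest() == token["sha256"]
    return ok_mac and ok_hash

clip = b"raw sensor frames"
tok = timestamp(clip, now=1530000000)

assert check(clip, tok)                        # clip existed by tok["time"]
assert not check(b"re-edited " + clip, tok)    # edits break the binding
```

As the parent concedes, none of this stops someone from pointing a signed camera at a screen playing back pre-edited video; it only anchors when a given sequence of bytes came into existence.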
Re: (Score:2)
Trusted third parties are expensive and usually unworkable. Cameras signing things will not provide any security at all, as they are easily hacked.
We will just have to learn to deal with this. There really is no other way.
Re:What you can do and can't do (Score:4, Insightful)
Trusted third parties are expensive and usually unworkable. Cameras signing things will not provide any security at all, as they are easily hacked.
So there will need to be some kind of tamper detection built into cameras. An append-only store which keeps track of changes to the internal flash, for example. In order to prove the validity of a video you'll have to produce both the video and the camera. This should be useful at least in keeping the police from tampering with body-cam footage.
Re: (Score:2)
That has all been tried before. The only hardware in existence that offers reasonable protection and could provide the functionality you describe is a full HSM (Hardware Security Module), at something like $50,000 and upwards. All other approaches have been hacked successfully, and I have a nagging suspicion that HSMs haven't been only because they are too expensive for hackers to get their hands on, as you would probably have to burn 3-4 for a successful hack. They are also not sold to the general public.
Re: (Score:2)
Trusted third parties are expensive and usually unworkable. Cameras signing things will not provide any security at all, as they are easily hacked.
I don't think you understand what he's talking about. We already have certificate authorities working just fine, some of them for free (e.g. Let's Encrypt). His notion isn't much different than what we already have with SSL/TLS, and hacking the camera wouldn't actually get you much of anything.
Hacking the camera wouldn't allow you to change the file, since doing so would invalidate the cryptographic signature on the file. Hacking the camera wouldn't destroy copies of the file that had already been exported.
Re: (Score:2)
Ah yes, all the naivety of those that have no clue what they are talking about and think security is easy. Have a look into the relevant literature and then come back. Here is a starting point: the public X.509 certificate system is badly broken, and you can even get EV certificates for a fake or wrong identity with minimal effort.
Re: (Score:2)
You have yet to actually suggest what a harmful attack would even accomplish, let alone the means to accomplish it. While I touched on some of the means by which we could do this stuff, the overarching point I was making was that the concern you raised doesn’t allow for a bad actor to actually do anything harmful enough to warrant your claims that it’s unworkable and gets us nothing in terms of security.
Discussing flaws in a hypothetical device is pointless when the flaws don’t even expos
Re: (Score:2)
Bla, bla, bla. Seriously?
Re: (Score:2)
Compress your videos with MPEG or similar.
If someone overlays something on your video, even if it's a "deepfake", you can still, as is done with pictures, check for differences in the compression artifacts.
Pictures don't lie. (Score:3)
And then came along tape, and audio editing, and auto-tune and computerized voice editing. And Hatsune Miku [wikipedia.org], who not only doesn't exist, her VOICE doesn't even exist: She was created by taking vocal samples [which] all contain a single Japanese phonic that, when strung together, creates full lyrics and phrases. Video [youtu.be]. The people with glowing green sticks, though, are real.
And now movies have placed people's heads on others' bodies, never mind body doubles. The trick is that it's becoming better, cheaper and more widespread to create. (And I *SWEAR* that people are more gullible nowadays than they used to be. Or maybe it's because things just move so much faster.) So we're back to a century or so ago: just because there's audio/video of it doesn't mean it HAS to be true.
No worries though, since you're innocent until proven guilty, which has worked so well with MeToo and everything else in the last few years. We'll all just have to have a 2-way shoulder mounted camera that does a real-time blockchain video feed to verify where we are all of the time and that it's really is US in the video.
Now if only blockchain would run at Visa-level transaction speeds instead of like a slow 8mm movie camera. Oh, and that's 7 Bitcoin transactions per second system-wide, not just per camera. The limit for Litecoin is 56 TPS and the limit for Bitcoin is 7; Visa: 24,000 TPS (Link [howmuch.net]). And heaven forbid your mounted camera loses its WiFi/cell connection or runs out of power.
Re: (Score:2)
Way back when, when we first had cameras, that was the saying. Because it was really hard to make a convincing "wrong" photo.
This is not true. In Germany, the cinema is very popular. I've seen all of Ned Nederlander's films... including when he was known as "Little Neddy Knickers." It was this man who inspired me to learn the art of the quick draw. I looked up to this man. I studied his every move. It was my dream to be as fast as Ned Nederlander. I practiced every day for hours and hours. He was a god to me. Then I found out about movie tricks. Trick photography. I was crushed.
They are just pushing for more power (Score:5, Interesting)
And what gives them power? Laws criminalizing things. There is no way in this universe they can stop this. But they can profit from it and they are certainly trying to.
when politicians panic... (Score:5, Insightful)
There is only one solution (Score:2)
Ones who don't try to weasel out of situations with denials, false claims (I was not told about .... ), redefinition of meaning or refusing to answer questions.
And if they can be seen not to
30 years late (Score:2)
I have a habit of writing documents describing technology that I would implement myself given the time, budget and drive. Examples of this would be RAID systems capable of online live healing and automated scaling with almost zero risk for loss due to transactional
easy! (Score:2)
Just make it illegal! That always fixes every problem!
Seeing Isn't Believing (Score:2)
Anything electronic can be faked, and so we need to start encouraging people to UNDERSTAND that "Seeing Isn't Believing". Just because you saw it on a screen doesn't mean it's true.
It's all phony.
No they don't. (Score:5, Insightful)
People will currently believe absolutely anything provided you get the narrative right and appeal to their emotions. There's no need to even doctor videos anymore. You just have to tell them.
Re:No they don't. (Score:5, Insightful)
Our mainstream media didn't flush anything. As said, people will believe absolutely anything. One of those things they are told to believe is that the mainstream media has no credibility, and they are told this by the highest authority of one of the world's most powerful nations.
Re: (Score:2)
Having a partisan agenda does not affect the credibility of the media. There's a big difference between selective reporting and actively making shit up.
And again, the media didn't flush anything. There has been political bias in the mainstream media since the days of kids selling newspapers for a nickel on the streetside.
But it is quite fascinating that you lump "mainstream media" and then point to supporting only one political party. I take it you have some bias of your own, where supporters of the "other"
Re: (Score:2)
So you picked one news outlet and posted a lot of examples of them not actually lying about anything?
I'm not sure what point you were trying to prove, but you've successfully proved mine.
Re: (Score:2)
Still gunning for one media outlet. *golfclap*
Re: (Score:2)
Which election are you thinking about, and which media? There isn't one in which all the media were on the same side, and there isn't one in which the most corrupt politician didn't get a lot of support. This is certainly true all the way back to Jackson, and possibly further.
The media used to be less powerful, because the politicians used to directly talk to people. This gave us high-toned political campaigns featuring competing slogans like:
"Ma, ma, where's my pa?
Gone to the White House, ha ha ha!"
Solution (Score:3)
Re: (Score:2)
They should be glad (Score:2)
When the next video comes out, showing them saying Nazi/racist/whatever stuff, they can just claim:
Deepfake news!
Nothing will change (Score:2)
They can already get away with blatant lying and fool 51% of voters. We are seeing just how extreme one side can go with it already.
Profound effects on everybody's lives are already happening. It's so fast that the majority are aware of the shift, but as it becomes the new normal it'll be hardly any different over time from the gradual slide we've had since WW2, when weaponized psychology became widespread.
It'll probably take another horrific shameful war before "Lying Press" becomes a taboo phrase in this country.
Re: (Score:2)
Re: (Score:2)
Trump was NOT a middle finger. Aside from the immature, childish behavior of an FU vote (which is fitting, given Trump's a man-child), a middle finger is a huge understatement. Voting for Trump in protest is like a teenage girl running away out of spite with the first man who hits on her and ending up gang raped and thrown in some ditch. Yeah, that sure showed them...
Or the cattle/sheep escaping from their farmer for using and looking down on them; escaping to the butcher who reminds them of a pig so he must
Re: (Score:2)
Truth isn't truth? (Score:2)
When anything and everything, such as photos, videos, audio, facts, etc., become as muddied and confused as a psychotic break, then does "Truth" still matter? When "Truth" becomes unknowable, then it becomes useless. You can only react to what you perceive. Like for a person having a psychotic break, that often means reacting in a chaotic or harmful way.
I can't imagine what this will do to society. Perhaps society will react to things in way
Re: (Score:2)
The Running Man (Score:2)
Re: (Score:2)
Re: (Score:2)
Re:TRUMP (Score:5, Insightful)
This Congress? Nothing except verbally distance themselves from him (if they even do that) while lining their pockets with his policies. The next Congress though -- that's a whole different matter.
Re: (Score:2)
Re: (Score:2)
What's really weird about people is that with all the things Trump has done, so many people are fixated on whether he's into golden showers. That's really, really, trivial. Contributing to a huge death toll in Puerto Rico is nearly ignored.
The only significant thing about the pee tape is that, if it's true, it's another thing Putin has as a hold over Trump. But my guess is that his real hold is that Trump's a physical (at least) coward, and Putin threatened to have something physical done to him if he did
Re: (Score:2)
What magic method is the AI supposed to use to determine that a particular video is fake? If it's "the eyes aren't properly blinking" the faker can then learn to blink the eyes. Etc.
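The blink heuristic the parent mentions is a real (and, as noted, easily defeated) detection idea: real people blink every few seconds, while early deepfake generators, trained mostly on open-eyed photos, rarely produced blinks. A minimal sketch, assuming you already have a per-frame eye-openness score (e.g. an eye aspect ratio from some landmark detector; the thresholds and the synthetic clips below are made up for illustration):

```python
def count_blinks(openness, closed_thresh=0.2):
    """Count transitions into a 'closed-eyes' state."""
    blinks = 0
    was_closed = False
    for value in openness:
        is_closed = value < closed_thresh
        if is_closed and not was_closed:
            blinks += 1
        was_closed = is_closed
    return blinks

def looks_fake(openness, fps=30, min_blinks_per_minute=4):
    """Flag clips whose blink rate is implausibly low for a human."""
    minutes = len(openness) / fps / 60
    if minutes == 0:
        return False
    return count_blinks(openness) / minutes < min_blinks_per_minute

# Synthetic demo: a "real" clip blinks every 3 seconds, a "fake" never does.
real_clip = ([0.35] * 85 + [0.1] * 5) * 20   # 60 s at 30 fps, 20 blinks
fake_clip = [0.35] * 1800                    # 60 s, eyes always open

print(looks_fake(real_clip))  # False: ~20 blinks/min
print(looks_fake(fake_clip))  # True: 0 blinks/min
```

Which illustrates the parent's point exactly: the moment this check is published, a generator can be trained (or post-processed) to insert plausible blinks, and the heuristic is dead.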
Re: (Score:2)
Re: (Score:2)
"Deep fakes" can reputedly be created a lot more cheaply and consistently than can a hand edited video stream.
Calling the process that does it AI is technically correct, but currently it's far from a general AI and doesn't understand the objective context of the pixels it is editing. That will come eventually, however. So worrying about future developments is proper.
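For context on why it's cheap and consistent: the classic deepfake pipeline is just an autoencoder with one shared encoder and a separate decoder per identity. A toy shape-only sketch, where the linear "networks," sizes, and weights are all made up for illustration (real implementations use convolutional autoencoders trained on thousands of face crops):

```python
import numpy as np

rng = np.random.default_rng(0)
FACE, LATENT = 64 * 64, 128          # flattened face crop, latent code size

encoder   = rng.normal(size=(FACE, LATENT)) * 0.01  # shared by both identities
decoder_a = rng.normal(size=(LATENT, FACE)) * 0.01  # trained only on person A
decoder_b = rng.normal(size=(LATENT, FACE)) * 0.01  # trained only on person B

def swap_face(face_a):
    """Encode A's face, but reconstruct it through B's decoder.

    After training, the shared encoder captures pose and expression
    while each decoder renders one specific identity, so this yields
    B's face wearing A's expression."""
    latent = face_a @ encoder
    return latent @ decoder_b

frame = rng.normal(size=FACE)   # stand-in for a flattened face crop
faked = swap_face(frame)
print(faked.shape)              # (4096,)
```

Once the two decoders are trained, swapping is a per-frame forward pass with no human in the loop, which is why it undercuts hand editing on both cost and consistency, and also why the parent is right that nothing in it "understands" the pixels.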