Security Cameras + AI = Dawn of Non-Stop Robot Surveillance (aclu.org)
AmiMoJo shared this post from one of the ACLU's senior technology policy analysts about what happens when security cameras get AI upgrades:
[I]magine that all that video were being watched -- that millions of security guards were monitoring them all 24/7. Imagine this army is made up of guards who don't need to be paid, who never get bored, who never sleep, who never miss a detail, and who have total recall for everything they've seen. Such an army of watchers could scrutinize every person they see for signs of "suspicious" behavior. With unlimited time and attention, they could also record details about all of the people they see -- their clothing, their expressions and emotions, their body language, the people they are with and how they relate to them, and their every activity and motion...
The guards won't be human, of course -- they'll be AI agents.
Today we're publishing a report on a $3.2 billion industry building a technology known as "video analytics," which is starting to augment surveillance cameras around the world and has the potential to turn them into just that kind of nightmarish army of unblinking watchers.... Many or most of these technologies will be somewhere between unreliable and utterly bogus. Based on experience, however, that often won't stop them from being deployed -- and from hurting innocent people...
We are still in the early days of a revolution in computer vision, and we don't know how AI will progress, but we need to keep in mind that progress in artificial intelligence may end up being extremely rapid. We could, in the not-so-distant future, end up living under armies of computerized watchers with intelligence at or near human levels. These AI watchers, if unchecked, are likely to proliferate in American life until they number in the billions, representing an extension of corporate and bureaucratic power into the tendrils of our lives, watching over each of us and constantly shaping our behavior... Policymakers must contend with this technology's enormous power. They should prohibit its use for mass surveillance, narrow its deployments, and create rules to minimize abuse.
They argue that the threat is just starting to emerge. "It is as if a great surveillance machine has been growing up around us, but largely dumb and inert -- and is now, in a meaningful sense, 'waking up.'"
And now that it's waking up ... (Score:3)
It's observing all these people and saying, "Wait a microsecond ... I could do *their* jobs, no problem."
Re: (Score:1)
It probably could, but then what would be the point of living for us hoomans? Don't even need machines deciding we're no longer needed, it'll be ourselves that decide there's no point in hanging around and taking that to its obvious conclusion.
A little thought tells me we can go several ways with this. We could add ever more cameras and "AI" and whatnot. It could always be there, producing red flags all over, if we're lucky there'll be too many for an affordable state apparatus to deal with.
Or, you know, yo
Re:And now that it's waking up ... (Score:4)
You are falsely attributing human purposes and motives to that net. Probably in a humorous vein, but with a serious undertone.
You're wrong. The ones saying "this could do *their* jobs" (often incorrectly) are management, and occasionally bean-counters (not real accountants). This is not a general AI, and it isn't actually a big step towards general AI. This is a classification system...unless, of course, the AI itself decides what the categories are. Even then, probably. (Regression and factor analysis are great for recognizing clumps of data without assigning meanings to them.)
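The parenthetical point is easy to demonstrate: clustering finds clumps in data without attaching any meaning to them. A minimal k-means sketch in Python (all data invented for illustration; the cluster indices are arbitrary labels, not categories anyone chose):

```python
# Minimal k-means: groups points into clusters whose "labels" are just
# arbitrary indices -- the algorithm finds clumps without understanding them.
import random

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # assign each point to its nearest center
        groups = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k),
                      key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))
            groups[idx].append(p)
        # recompute each center as the mean of its group
        for i, g in enumerate(groups):
            if g:
                centers[i] = tuple(sum(c) / len(g) for c in zip(*g))
    return centers, groups

# two obvious clumps of 2-D points
data = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),
        (5.0, 5.1), (5.2, 5.0), (5.1, 5.2)]
centers, groups = kmeans(data, 2)
```

With two well-separated clumps, the algorithm reliably splits the six points 3/3, but nothing in it knows (or cares) what either clump represents.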
Welcome to 1984 (Score:3)
Re: (Score:2)
Frankly, this is worse than George Orwell's worst nightmare.
But it is the opposite! (Score:5, Insightful)
You're always on stage with an audience.
That's the great thing about AI + cameras. There is no audience, there is no-one real watching. Just a system that can recognize conditions.
We need to move people away from watching cameras, toward impersonal recognition of the conditions that warrant an alarm.
Do you not find it vastly more creepy that London has teams of real humans constantly monitoring cameras all around the city, than if they had AI rigged up to inform operators of possible problem situations they should look at? No more peering at women walking down the street through the camera. No more watching people snog in what they thought was a private corner with no-one around.
Applying AI to the job of monitoring cameras that already exist is moving away from the world of 1984, not towards it, because you can't have judgmental overseers send police out to harass you just because your skin is the wrong color or the clothes you wear annoy them.
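The "recognize conditions, not people" idea above can be sketched as an event filter: no human sees anything unless a watched condition fires. Here `detect_conditions` is a hypothetical stand-in for a real model, and the frame/event names are invented for illustration:

```python
# Sketch: humans are only notified when a condition of interest fires;
# the raw feed is never watched by anyone.

def detect_conditions(frame):
    # hypothetical stand-in for a trained model's per-frame output
    return frame.get("events", set())

def monitor(frames, alert_on):
    alerts = []
    for i, frame in enumerate(frames):
        hits = detect_conditions(frame) & alert_on
        if hits:
            # only now does a human operator see anything at all
            alerts.append((i, sorted(hits)))
    return alerts

feed = [{"events": set()},
        {"events": {"fall_detected"}},
        {"events": {"loitering"}}]
print(monitor(feed, alert_on={"fall_detected", "weapon_visible"}))
# -> [(1, ['fall_detected'])]
```

Note that the "loitering" frame never surfaces: what operators can see is fixed by the `alert_on` policy, which is exactly the design question the thread is arguing about.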
Re: (Score:3)
That might be true if the AI was itself used with benevolent intent by the people who own and control it. Consider that if one spoke against a totalitarian government, they could have the AI dig through your entire life to paint you and your relatives in a very negative light before they took corrective actions for your behavior.
And that is just one scenario. There ar
Re: (Score:2)
Okay, stop describing modern day China, just stop it....one needs re-education camps to make all that work...uh-oh...I guess the Uighurs are lab rats for Xi Jinping's Workers' Paradise.
Re: (Score:2)
Why do you need re-education camps? It's been working well in America with no re-education camps. Just lock up a good chunk of the undesirables, give them a bad social credit rating to make sure they can't improve themselves, rinse and repeat.
As a bonus it gives talking points for the next election, gotta lock up those undesirables.
Re: (Score:3, Interesting)
Two problems.
1. Cheap AI will encourage more and more cameras to be installed because now there is almost no cost to monitoring them. In a few years the basic PVR units that record camera footage will include some image recognition at no additional cost.
2. With AI and cheap storage nothing will ever be forgotten, and it will all be easy to find. The scope for abuse is breathtaking - everything you ever did in public will become searchable, and available to use against you for years, maybe indefinitely.
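Point 2 is worth making concrete: once detections are logged, retrieval is trivial. A toy in-memory index (all identifiers invented for illustration) shows why "everything you ever did in public" becomes a one-line query:

```python
# Sketch: once sightings are logged, every past appearance of a person
# is instantly searchable -- nothing is ever forgotten.
from collections import defaultdict

class SightingIndex:
    def __init__(self):
        self._by_person = defaultdict(list)

    def record(self, person_id, camera_id, timestamp):
        self._by_person[person_id].append((timestamp, camera_id))

    def history(self, person_id):
        # every recorded appearance, oldest first
        return sorted(self._by_person[person_id])

idx = SightingIndex()
idx.record("person-42", "cam-03", 1000)
idx.record("person-42", "cam-17", 2500)
idx.record("person-7", "cam-03", 1100)
print(idx.history("person-42"))  # [(1000, 'cam-03'), (2500, 'cam-17')]
```

The cost of the lookup doesn't grow with how long ago the sighting happened, which is what makes indefinite retention qualitatively different from human memory.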
Re: (Score:2)
You're ignoring the possibility of planting "deep fakes" in the data stream. They wouldn't even need to be very high quality.
Re: (Score:2)
That's true. I'm sure they will start to add crypto signatures to video, but they will be faked too.
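The signature idea can be sketched as a hash chain over frames. HMAC stands in here for brevity (a real scheme would presumably use asymmetric signatures so verifiers never hold the signing key); the key and frame bytes are invented for illustration:

```python
# Sketch: each frame's hash is chained to the previous frame and signed,
# so a spliced-in "deep fake" frame breaks verification from that point on.
import hashlib
import hmac

KEY = b"camera-secret-key"  # assumed per-camera signing key

def sign_stream(frames):
    prev = b"\x00" * 32
    signed = []
    for frame in frames:
        digest = hashlib.sha256(prev + frame).digest()
        tag = hmac.new(KEY, digest, hashlib.sha256).hexdigest()
        signed.append((frame, tag))
        prev = digest
    return signed

def verify_stream(signed):
    prev = b"\x00" * 32
    for frame, tag in signed:
        digest = hashlib.sha256(prev + frame).digest()
        expected = hmac.new(KEY, digest, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(tag, expected):
            return False
        prev = digest
    return True

stream = sign_stream([b"frame1", b"frame2", b"frame3"])
print(verify_stream(stream))   # True
tampered = list(stream)
tampered[1] = (b"deepfake", tampered[1][1])
print(verify_stream(tampered))  # False
```

As the parent notes, this only shifts the problem: anyone who controls the signing key can sign fakes just as easily as real footage.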
Person of Interest (Score:5, Informative)
Go watch the TV show Person of Interest. This is literally their main plot point, and showcasing how dangerous it is. They bring up all sorts of ethical boundaries with this technology.... OH, AND LOTS OF GUNS AND EXPLOSIONS, so it has that going for it, too!
Maybe this is a good thing (Score:2)
The scary bit is really more when or if all of this monitoring is shared more widely.
But the thing is, it does not have to be. Now is the time to support local cameras + AI - but be vigilant around who gets access to that video.
One scenario I thought of the other day - cameras + AI to detect when someone heavily armed entered school grounds or a building, and provide alarms and updates in real time to all staff. How amazing would that be if you had ten minutes to barricade a room instead of seconds, or to
Re: (Score:2)
Umm, no. It doesn't. It just sets up the structures of our Federal government. You're probably thinking of the Declaration of Independence.
You are totally correct (Score:2)
You're probably thinking of the Declaration of Independence.
Totally correct, thanks for pointing that out. Got caught up in the heat of the writing.
Also, I said "guarantee," but the Declaration of Independence merely claims they are rights and does not guarantee anything - more high-level goals. :-)
Re: (Score:2)
That's precisely what all those GOP senators with a hand down their pants
And yet in the era of MeToo, it is many Democratic celebrities and politicians being brought up for past misdeeds with women and men. Is Kevin Spacey a Republican? Nope. Tim Allen is still working.
Re: (Score:1)
Given a choice between using a technology to improve the lives of the whole populace, and using it instead to crush their necks under an iron heel, NO government has EVER chosen the former unless held at swordpoint - and even then only until the people's grip on said blade loosened a little.
All of the good things AI and permanent surveillance could potentially be worth will only ever happen accidentally or as an *unwanted* side-effect by those who implement it running contrary to their expectations - until
Re: (Score:2)
You are too absolutist. There actually have been governments that worked towards increasing the welfare of the citizenry in ways that did not increase their power. Not many, admittedly, and that's not the way to bet.
That's because you're a pedophile... (Score:2, Troll)
There was a more limited form of this I saw in a recent video, I think on Twitter - they showed a bunch of school kids entering a school in China using facial recognition to enter gates. I was supposed to be alarmed by this, but again why?
...who gets off on watching schoolkids.
You probably have a scoring system for preferential rape too.
Knowing you, it's probably based heavily on race, religion and personal political leanings of the children and/or their parents.
Not to mention your hate of other people's privacy and how much you love the idea of training children to accept the surveillance state as the normal and default position.
https://techcrunch.com/2019/05... [techcrunch.com]
The database also contained a subject's approximate age as well as an "attractive" score, according to the database fields.
But the capabilities of the system have a darker side, particularly given the complicated politics of China.
The system also uses its facial recognition systems to detect ethnicities and labels them - such as "汉族" for Han Chinese, the main ethnic group of China - and also "维族" - or Uyghur Muslims, an ethnic minority under persecution by Beijing.
Where ethnicities can help police identify suspects in an area even if they don't have a name to match, the data can be used for abuse.
The Chinese government has detained more than a million Uyghurs in internment camps in the past year, according to a United Nations human rights committee.
It's part of a massive crackdown by Beijing on the ethnic minority group.
Just this week, details emerged of an app used by police to track Uyghur Muslims.
There is no data that can be gathered that can't be abused.
Automating the proce
Very odd video habits you have (Score:1)
I find it very interesting that the only videos you watch are ones you can get off to, since you cannot understand the concept of wider use...
I find that much more disturbing than AI + cameras.
I personally watch videos for information instead of arousal. Sad that you miss out on so much of life around the world! You should try it sometime, zip up those pants and open up YouTube! You will find the broader world more interesting than broads alone.
There is no data that
Re: (Score:3)
As I was saying before some pathetic pedo snowflake tried to downmod me out of fear that mentioning pedophiles on Slashdot will expose them to their moms...
There was a more limited form of this I saw in a recent video, I think on Twitter - they showed a bunch of school kids entering a school in China using facial recognition to enter gates. I was supposed to be alarmed by this, but again why?
...who gets off on watching schoolkids.
You probably have a scoring system for preferential rape too.
Knowing you, it's probably based heavily on race, religion and personal political leanings of the children and/or their parents.
Not to mention your hate of other people's privacy and how much you love the idea of training children to accept the surveillanc
Re:Maybe this is a good thing (Score:5, Insightful)
People like you are the reason why we cannot have a lasting free society. You are incapable of rational risk-analysis and comparison. You take a fantasy of something rarely happening and make it the most important aspect but ignore the problem that things will not happen as in your fantasy and that millions or billions of other negative effects per one enactment of your fantasy will be part of the package.
You describe yourself (Score:2)
You are incapable of rational risk-analysis and comparison. You take a fantasy of something rarely happening
The exact same thing applies to your arguments, out of millions and billions of people how often is surveillance abused?
I want to eliminate those few times by taking humans out of the equation and having set goals for what monitoring video is for.
Meanwhile YOU want to block AI, but can do nothing to stop the tide of cameras, which will have more and more people abusing the privilege of being able to see
Re: (Score:3)
Have a look at human history. And then come again. The problem is not that the people doing the surveillance abuse it. That is a small side-issue.
Re: (Score:2)
It would be better to engineer a society and a school system so that you don't get school shootings on a monthly basis. School shootings are mostly a US phenomenon - for multiple reasons. The key part is to avoid the reasons why the assailants want to shoot people in the first place. If you manage that, it would probably result in a happier, more productive school for everyone overall.
It won't work anyway. Shooters are often people who belong to the school in the first place, and who do
Re: (Score:2)
Guns and ammo made readily available courtesy the NRA, turning American society into a society of gun-nuts one child at a time.
Re: (Score:2)
Guns and ammo made readily available courtesy the NRA, turning American society into a society of gun-nuts one child at a time.
Guns and ammo have always been readily available in the US, and predate the existence of the NRA (who I'm no fan of). The problem of kids acting out this way, is a much more recent activity though.
Re: (Score:3)
One scenario I thought of the other day - cameras + AI to detect when someone heavily armed entered school grounds or a building, and provide alarms and updates in real time to all staff.
This is an excellent example of the fallacy Schneier references here [ted.com] (and in many other places too). You're designing security based on feelings, not reality; since relatively rare events, like school shootings, create a strong emotional response, their risk is highly overestimated. This makes you accept permanent and ubiquitous surveillance as a trade off for stopping armed people entering schools.
And this is exactly how this bad trade-off is sold to you. Here's a quote from Schneier's take on the ACLU rep [schneier.com]
Free the cameras! (Score:2)
The solution to avoiding Big Brother is to free the cameras - ALL of them - from their corporate and governmental overlords. If the cameras are accessible to anyone and everyone and not controlled and abused by and for the benefit of a tiny minority, then Big Brother never happens. Instead, we get Universal Ubiquitous Situational Awareness: much like the intimate population of a small village, we get citizens who behave just a little bit better and kinder, knowing that everyone could be watching their beh
Re: (Score:2)
If the cameras are accessible to anyone and everyone and not controlled and abused by and for the benefit of a tiny minority
Read Earth [amazon.com] by David Brin for a take of a world along these lines.
Re: (Score:3)
How would you stop them from being abused by corporate interests trying to use the camera to track you for commercial purposes?
I have seen many cases of publicly available databases of citizens being used and abused by corporations for various advertising and profiling purposes already.
Re: (Score:2)
No, we get Universal Ubiquitous Situational Awareness for that KGB thug running Russia and the pompous ass running China, and an Allah-send to every Muslim fanatic who ever wanted to stage something really, really big in the U.S.
So... (Score:2)
Security Cameras + AI = Dawn of Non-Stop Robot Surveillance
Is it time to start wearing face masks on a daily basis?
You can, but you don't have to! (Score:2)
Is it time to start wearing face masks on a daily basis?
People already do in winter so knock yourself out.
Myself I enjoy the sun on my face and the wind in my hair, I walk the world without fear to weigh me down!
Re: (Score:2)
Is it time to start wearing face masks on a daily basis?
The cops will just shoot anybody who does (Might have been a terrorist!) to curb that pretty fast.
Re: (Score:3)
It is already illegal here in Sweden to wear one in public, only because wearers can't be identified on camera footage.
I wouldn't be surprised if other countries had similar laws.
Re:So... (Score:4, Informative)
The Chinese currently demonstrating in Hong Kong do it when they can be recognized in a public demonstration. They also watch their own use of cell phones and credit cards so as not to reveal their beliefs. And this is from the Chinese screwups running China. Imagine what efficient American companies and government could implement.
Energy requirements? (Score:2)
Imagine the energy requirements if there was a security camera at each street corner, each with a PC with a top-of-the-line graphics card doing the facial recognition.
There have been reports of cryptocurrency "mining" using significant amounts of energy. But crypto-"mining" is still done by relatively few enthusiasts, not introduced by a government at a massive scale.
I'm afraid that widespread "AI" would surpass "mining"'s energy needs by a wide margin.
And increased energy use for little gain is not what th
Not much (Score:2)
Imagine the energy requirements if there was a security camera at each street corner, each with a PC with a top-of-the-line graphics card doing the facial recognition.
You are way behind the times, an iPhone from even a few years ago can easily handle the recognition involved.
Also we are not just talking about facial recognition, which may be harder, but a variety of other tasks (like: has a car accident occurred in view of this camera?).
criminals have issues with surveillance (Score:1)
this way my neighbor would have been convicted a long time ago.
Contend it or wield it? (Score:2)
Policymakers must contend with this technology's enormous power.
Why would they want to contend with it? They want to wield it!
They should prohibit its use for mass surveillance, narrow its deployments, and create rules to minimize abuse.
And that is just a few inverted suggestions for its useful applications that make them excited.
I see (Score:2)
My Groucho Marx Nose with 'stache and glasses will be a hit then.
Big Brother is watching (Score:2)
and now he's an AI!
Seriously, get a thicker tinfoil hat.
I'd much rather have an AI watching my actions than some sleazy underpaid person that might abuse his privilege. AIs are still far from Skynet so they're as dumb as can be and will only do what they're trained for.
Now if you're habitually engaging in borderline behavior then perhaps you should rethink that. It might be a real cop that sees you and acts on what he thinks he's seeing, which might involve a gun or similar.
Re: (Score:2)
Now if you're habitually engaging in borderline behavior then perhaps you should rethink that.
What? Like protesting the government?
Freedom and ignorance (Score:2)
Is that all freedom is? Are we only free when surrounded by strangers? Or where we cannot be seen?
Is that a functional model for the concept of freedom?
Imagine? (Score:2)
Imagine all of the false positives.
Re: (Score:2)
Well, you're right that the AI's won't be in control. But they will be controlled by those who lust for control, and will hand them an enticing tool with which to achieve it.