AI is only as biased as the data you train it with.
Computers cannot be racist if you don't program/train them to be racist.
The problem is some people think that facts are racist and will cry bloody murder if cops do something like focusing patrols on poor black neighborhoods with high crime rates.
It amazes me that some law-abiding citizens will complain about police trying to make their neighborhood safer.
by Anonymous Coward on Wednesday April 07, 2021 @06:58AM (#61246128):
"Facts". If higher policing leads to a higher crime rate (because more people are... checked) the numbers become a fact. And if (conveniently) the neighborhood is predominantly black you have data that indicates that black people are indeed significantly more criminal than other demographics. The other way around we could drop the crime rate to 0 if we stop policing aforementioned neighborhoods. So these numbers don't mean anything really. It's a well known issue if you are really familiar with the topic and not just trying to sound smart. Law abiding citizens don't complain about police trying to make their neighborhood safer. That's BS. They complain about the usage of excessive force by badly trained cops and the inconsistent treatment of different demographics.
What does any of this bullshit even mean? The training in question is training of AI, not cops, and there wasn't even any AI to train. And who are these "witch hunters", and how does your racist injection of one example change the argument?
Since the comment you responded to clearly needs to be explained to you: it addresses the basic truism that data used to train an AI can itself be tainted with bias, resulting in a biased AI model. The OP is providing an example of that, and you're an
If higher policing leads to a higher crime rate (because more people are... checked) the numbers become a fact.
This only applies to crimes where police have to catch someone in the act. Crimes that leave evidence, like murder, property damage, arson, or kidnapping, can be reliably tracked. Dropping the police presence to zero doesn't make those crimes disappear: you're still gonna have dead bodies rolling into the morgue and fire marshals finding arson.
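That distinction between observation-dependent and evidence-leaving crimes can be added to the same kind of toy simulation. Again, all numbers are invented, and the `leaves_evidence` flag is a hypothetical simplification, not real criminology:

```python
import random

random.seed(1)

TRUE_RATE = 0.05  # invented underlying offence rate


def recorded(population, patrol_intensity, leaves_evidence):
    """Count recorded offences of one crime type.

    Evidence-leaving crimes (bodies, burned buildings) are counted
    regardless of patrols; observation-dependent crimes only enter
    the statistics if a patrol notices them.
    """
    count = 0
    for _ in range(population):
        if random.random() < TRUE_RATE:
            if leaves_evidence or random.random() < patrol_intensity:
                count += 1
    return count


# Dropping patrols to zero hides observation-dependent crime entirely...
no_patrol_observed = recorded(100_000, 0.0, leaves_evidence=False)  # always 0
# ...but evidence-leaving crime still shows up in the statistics.
no_patrol_evidence = recorded(100_000, 0.0, leaves_evidence=True)   # roughly 5,000

print(no_patrol_observed, no_patrol_evidence)
```

So the "stop policing and the rate drops to 0" argument only holds for the first kind of statistic, not the second.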