Crime

Predictive Policing Software Terrible At Predicting Crimes (wired.com) 62

An anonymous reader quotes a report from Wired: Crime predictions generated for the police department in Plainfield, New Jersey, rarely lined up with reported crimes, an analysis by The Markup has found, adding new context to the debate over the efficacy of crime prediction software. Geolitica, known as PredPol until a 2021 rebrand, produces software that ingests data from crime incident reports and produces daily predictions on where and when crimes are most likely to occur. We examined 23,631 predictions generated by Geolitica between February 25 and December 18, 2018, for the Plainfield Police Department (PD). Each prediction we analyzed from the company's algorithm indicated that one type of crime was likely to occur in a location not patrolled by Plainfield PD. In the end, the success rate was less than half a percent. Fewer than 100 of the predictions lined up with a crime in the predicted category that was also later reported to police. Diving deeper, we looked at predictions specifically for robberies or aggravated assaults that were likely to occur in Plainfield and found a similarly low success rate: 0.6 percent. The pattern was even worse when we looked at burglary predictions, which had a success rate of 0.1 percent.
  • They just need to create new data to be able to jail their targets!
    • Re: (Score:2, Insightful)

      by sfcat ( 872532 )
      Uh, everyone... perhaps look at the source of this "information". A few things about predictions: when scientists make predictions, they have some kind of confidence interval. The majority of the time when computers make a prediction, the same is true. It depends on the algorithms you use, but usually there are statistical distributions being computed. If you set those confidence intervals to a very small number, then you drop your prediction rate to zero. When you set those confidence intervals very
      • by znrt ( 2424692 )

        the outlet might look wonky, but the alleged numbers from the police itself are pretty eloquent:

        the success rate was less than half a percent. ... specifically for robberies or aggravated assaults that were likely to occur in Plainfield and found a similarly low success rate: 0.6 percent. The pattern was even worse when we looked at burglary predictions, which had a success rate of 0.1 percent.

        less than 1% success rate has hardly anything to do with "confidence intervals" being too high, or "people not understanding ai". a drunk monkey would be 50% more accurate than that. try again.

        • by piojo ( 995934 )

          a drunk monkey would be 50% more accurate than that.

          If I asked you to pick some houses in Tokyo that were likely to be robbed tonight, I bet your success rate would be not small but zero. Guess we'd better choose the drunken monkey instead.

          (Or perhaps you are just being obtuse about the statistics)

        • the outlet might look wonky, but the alleged numbers from the police itself are pretty eloquent:

          the success rate was less than half a percent. ... specifically for robberies or aggravated assaults that were likely to occur in Plainfield and found a similarly low success rate: 0.6 percent. The pattern was even worse when we looked at burglary predictions, which had a success rate of 0.1 percent.

          less than 1% success rate has hardly anything to do with "confidence intervals" being too high, or "people not understanding ai". a drunk monkey would be 50% more accurate than that. try again.

          What is the definition of success? A significant parameter in that definition is the granularity of the prediction over space and time. Since both space and time are continuous variables, the expected correct prediction rate for an exact space and time is always zero, so both space and time are necessarily granularized. The granularity of space and time predictions is extremely important. Is the space prediction granularity one meter, one house, one block, or one precinct? Is the time prediction granul

      • by XXongo ( 3986865 )

        Uh, everyone...perhaps look at the source of this "information". So a few things about predictions, when scientists make predictions they have some kind of confidence interval.

        But these weren't predictions by scientists. They were predictions by a company out to make money from police departments.

  • by WereCatf ( 1263464 ) on Tuesday October 03, 2023 @11:54PM (#63898545)
    Who could've predicted this?!
    • by backslashdot ( 95548 ) on Wednesday October 04, 2023 @12:26AM (#63898575)

      Well, not the AI.

    • by vlad30 ( 44644 )
      The difficulty would be to predict exact events.

      However you could predict something is being planned if you had information, for example, that a "safe cracker" is meeting with a "driver" and a guy considered "hired muscle", and this information would come from metadata of phone calls, not necessarily from the content of the calls, which may be encrypted. Or through some good police work.

      • The difficulty would be to predict exact events.

        There is no reason to expect software to predict individual events when dealing with humans rather than billiard balls.

        In addition to all the other problems with predicting human behavior, there is the Heisenberg effect: observing changes the behavior. If the software predicts a burglary at 123 Main St., and the cops stake it out, then the burglar is gonna see the cops sitting in front of the house in their van and go rob a different house instead. So that's gonna be marked down as a failed prediction.

        • As it should, because the prediction was completely worthless - a house still got robbed, so nothing was gained.

      • by vadim_t ( 324782 )

        The real world isn't GTA, though. "Safe cracker" isn't something a person can be tagged with in a police database. If they know somebody is a safe cracker, why aren't they in prison?

        No, it's more likely to be heuristic about it. Like a strong guy, a welder, and somebody with a fast car regularly meeting and seeming to have too much money to spend. Which of course is far less reliable and results in hassling a lot of people who just happen to have a skill or interest that's deemed suspicious.

        • If they know somebody is a safe cracker, why aren't they in prison?

          The cops tend to know quite well who's a professional criminal. Actually proving they took part in a particular hit, though, takes a lot of work, and the limited tuits the police have are de/reprioritized for political reasons to a huge degree -- which doesn't align with what society, and your safety in particular, would need.

          • If they know somebody is a safe cracker, why aren't they in prison?

            The cops tend to know quite well who's a professional criminal.

            Nope.

            First, most criminals are amateurs. Second, no, cops don't know who the professional criminals are. You watch too many Hollywood movies, where the police know the gangs but can't put them in jail until an unconventional cop who ignores civil rights and gets information by beating informers until they squeal solves the case and kills the bad guy in a shoot-out.

            • by dstwins ( 167742 )
              Actually, according to most police... the vast majority of crimes are committed by repeat offenders (meaning they were formerly convicted of, or at least arrested for, a crime). So yes, they do know who the "seniors" are in the game of cops and robbers.

              But "knowing" and KNOWING (ie: one is an unofficial... the other is substantial arrestable proof) are two different things.

              Also something that is different these days as well.. just like how companies have gone through a period of consolidation (bigger companies
      • I don't think that's what these pieces of software are actually trying to do. The concept SEEMS to be "a% economic circumstances, b% social circumstances, c% physical circumstances means d% chance of crimes in this area", i.e. predicting the best places to put cops to most likely catch crimes. Essentially an automated criminologist.

        The problem is , well other than the fact they seem to fail at even that (Turns out the difference between an AI criminologist and a human one is the human one lives in the neighbo

  • There's statistics papers showing crimes have an exponential geographical spike, a handful of streets in cities see the most crimes by far. Piss easy to just point there and say "crime". But you're selling to the police, who needs to be effective?
    • by Arethan ( 223197 )

      But you're not actually selling to the police. You're selling to their funding managers, which are mayors/etc, ie not actually police.
      It's really easy to snow people with fancy datasets and charts when they don't know wtf they're looking at or what they should be looking for.

      • I believe that generally the funding managers don't micromanage how funds are spent - so you *are* selling to the police.

        • by Arethan ( 223197 )

          Sometimes this is true, but in my (brief) experience within public service systems it more often isn't.
          Many times major funding comes with strings attached - ie, must be used for a specific purpose - for example, this is how you end up with a school library upgrade of dozens of new computers when the school (and even the district) has no idea how they'll ever use those systems for any purpose outside of casual minesweeper during free periods. Within the police spending context, this easily turns into 'must

    • Re:Which is weird (Score:5, Insightful)

      by Sique ( 173459 ) on Wednesday October 04, 2023 @02:05AM (#63898661) Homepage
      That's quite different from predicting a single event. Yes, you can predict the expected amount of crime, broken down to crime types within a large enough time frame and a large enough area. But as soon as you are not summing up, but looking for a single event, you are lost.

      Zipf's law, for instance, will tell you the distribution of the letters in a given text extremely well, even for comparatively short passages of text, but it does not allow you to predict the 100th character in a text with any certainty (only that with about 12.6 percent probability it is an e, but that's as good as it gets). Even in the worst areas, crimes are rare random events, and in most cases a chain of events does not lead to a crime.
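
      The gap between knowing the aggregate distribution and predicting a single character is easy to demonstrate. The sample text below is made up for illustration; any moderately long stretch of English prose behaves the same way:

      ```python
      from collections import Counter

      # Hypothetical sample text -- any moderately long English passage will do.
      text = ("Crime prediction software promises to tell police where offences "
              "are most likely to be reported, yet aggregate letter frequencies "
              "tell us nothing about the next character in a sentence.")

      letters = [c for c in text.lower() if c.isalpha()]
      freq = Counter(letters)

      # The aggregate distribution is stable: 'e' dominates, as in nearly
      # all English text.
      best_letter, best_count = freq.most_common(1)[0]
      print(best_letter, best_count / len(letters))

      # But the best possible single-character guess for "what is the Nth
      # letter?" is just that modal letter, capping accuracy near its
      # frequency no matter how precisely you know the distribution.
      ```

      The same asymmetry holds for crime: the distribution over a city is stable, while any single address on a single night is close to noise.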

    • There's statistics papers showing crimes have an exponential geographical spike,

      I can't even figure out what the phrase "exponential geographical spike" means.

      The function Crime[location] is Crime = e^(location) ?

  • It's just there to establish cause for over policing the neighborhoods of undesirables. Then you can arrest them and take away voting rights.
    • by haruchai ( 17472 )

      It's just there to establish cause for over policing the neighborhoods of undesirables. Then you can arrest them and take away voting rights.

      And make them into slaves, which is permitted by the 13th Amendment

    • precrime = long cryo jail time with no trial

    • Re: (Score:2, Insightful)

      It's just there to establish cause for over policing the neighborhoods of undesirables. Then you can arrest them and take away voting rights.

      Er, you mean like arresting the leading opposition candidate?

      • by HiThere ( 15173 )

        Yes, that has to be included. And after all, sometimes the guy arrested actually is guilty...but that still affects the voting.

        The question in such cases is always "should it be allowed?", and at minimum this should require that the law actually be followed. Sometimes that's not sufficient, but that should be a minimum.

  • by Retired Chemist ( 5039029 ) on Wednesday October 04, 2023 @01:06AM (#63898605)
    I think the idea is to predict when and where crimes will occur in areas that are not heavily patrolled. Since a lot of crime, particularly in lower-crime areas, is random, this is difficult to do. The odds are that any experienced police officer could do a better job than the software.
    • But... experienced police officers are EXPENSIVE!

      • by HBI ( 10338492 )

        You couldn't pay me enough to be a Plainfield, NJ cop. I mean it's better than NYPD but only by a little. Having lived there (ok, North Plainfield...) it's not a great place to live.

        Anyone old and experienced on the force probably retired three years ago, if they were smart.

    • Re:The idea (Score:5, Interesting)

      by Baron_Yam ( 643147 ) on Wednesday October 04, 2023 @06:55AM (#63898937)

      I've worked on this stuff. Mostly it was used correctly - not for prediction of future events, but for analysis of past ones to help identify prime suspects.

      There should be no question that a decent analyst with a properly populated data set is going to be better at pulling out non-obvious correlations than an unaided human. The database has no trouble taking criminal records, repeat offender parole enforcement, officer contact reports, and recent occurrences and telling you that Jane's probably hooking up with John, who put her in touch with Joe, and now Jane and Joe seem to be working together even though no officer has directly observed this yet.

      Since most criminals are habitual ones, 'known to the police', this is a very good place to start when looking for suspects.

      In terms of predictive abilities... simple heat maps contrasting recent criminal activity with police visibility (the cars and even individual cops are all constantly GPS-tracked and logged now, I've built the systems that do it for multiple agencies) can show you trends. Then you blend in time of day, day of week, day of month, and zoning info and it's not terribly surprising that the system can tell you which areas are likely to see an increase in crime. It won't tell you that 12 Main Street is going to be burgled tonight at 3:12 AM, but it'll tell you that it's prime crime time, cops haven't been visibly patrolling recently, and there have been occurrences in the area that are probably going to repeat soon because the apparent opportunity is looking pretty good.

      It's still a good percentage 'voodoo' that maybe a good officer should be able to handle without a crime analyst and a database to point it out to them - but when you're talking a 24/7 service in a decent-sized city there is a lot of opportunity for these things to be missed as cops move around and rotate through shifts.
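
      The heat-map blending described here can be sketched in a few lines. Everything below (the cell names, the decay half-life, the 0.5 patrol weight) is made-up illustration, not any agency's actual scoring:

      ```python
      from collections import defaultdict

      # Hypothetical incident and patrol records: (grid cell, hours ago).
      incidents = [("A1", 5), ("A1", 30), ("B2", 2), ("C3", 80)]
      patrol_pings = [("B2", 1), ("B2", 4), ("C3", 10)]

      def decay(hours_ago, half_life=24.0):
          """Weight recent events more heavily (exponential decay)."""
          return 0.5 ** (hours_ago / half_life)

      score = defaultdict(float)
      for cell, h in incidents:
          score[cell] += decay(h)          # recent crime raises the score
      for cell, h in patrol_pings:
          score[cell] -= 0.5 * decay(h)    # visible patrols lower it

      # Highest-scoring cells are "recent crime, no recent patrols" --
      # candidates for the next shift's attention.
      ranked = sorted(score.items(), key=lambda kv: -kv[1])
      print(ranked)
      ```

      Blending in time of day, day of week, and zoning would just add more weighted terms to the same score; the point is trend-spotting over cells, not predicting a single burglary.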

      • or are police going after criminals they know because it's an easy arrest? Recidivism rates are up there, but they're not approaching 100%, which is what I'd expect if "most criminals are known to police".

        What you're doing is more like "John committed a crime, so Jane is probably a criminal because Jane knows John".

        That's just poverty. It's got a name: "Broken Windows Policing". And its history is not great.

        So does the software work? Yes and no. Is Jane a criminal? Well, she's got a baggie of p
    • It has been shown to not work, in theory and now in practice

      What is worse: if you send police to an area they don't normally patrol, and don't know much about, except that crime has been predicted there, they might deter criminals - or not ... and because they are there they WILL find crimes, which in the past has been taken to show that the prediction was "correct" ...

  • Predictive Policing Software Terrible At Predicting Crimes

    That's what happens when you cheap out and use Postcogs instead of Precogs [fandom.com] ...

  • by quenda ( 644621 ) on Wednesday October 04, 2023 @01:52AM (#63898635)

    Sounds like a case of not putting any useful data into the system.
    And probably no real AI to analyse it anyway. Geolitica sounds like a bunch of buzzwords slapped together to win a lucrative police contract.

    Maybe if a real AI (glorified predictive pattern recognition system) had access to the cell-phone network tracking and social media traffic, it might be able to recognise patterns and do better than a cop with a map of last year's crime locations and a calendar. I'm sure the NSA has such tech.

    But this story is set in the US, where there is so little trust in authorities that something like this could be difficult to set up, to say the least. I don't think people would be happy with even anonymised data used that way, so it is flying blind.

  • They should simply fire the fools that spend money on prediction software like this. No one can predict the future the way they were hoping this would work. If someone came up with an algorithm that could actually predict the future like this, the Army would grab it so fast you would only hear rumours about its existence. There's an old saying: a fool and his money are soon parted.
    • "They should simply fire the fools that spend money on prediction software like this."

      My friend, we're talking about law enforcement. Just like when a cop gets caught beating the crap out of an innocent citizen, taxpayers get stuck with the bill, while the cop gets a few weeks of paid vacation, or a new job at the next town down the road. There is zero accountability when law enforcement wastes money on yet another tool that doesn't work at all, or costs a hundred times what it's worth.

    • by PPH ( 736903 )

      Army would grab it so fast you would only hear rumours about its existence

      Douglas Lenat [wikipedia.org] recently passed away. Known for, among other things, developing a type of AI system called Eurisko [wikipedia.org]. The details of his subsequent work may have been classified. But if you are working in the biz, other examples of DoD interest in stuff that actually works aren't that scarce.

  • by martin-boundary ( 547041 ) on Wednesday October 04, 2023 @03:52AM (#63898753)
    For any binary predictor (e.g., is there going to be a crime at this address?), if the accuracy of predictions is less than 50%, always use the opposite prediction; it has accuracy >50%.
    • by Sique ( 173459 )
      I'll predict that no crime will happen at this place in the next 24 hours. I'll be right about 99% of the time even in the worst neighborhoods, as for a single household, having more than three crimes reported per year is quite rare.

      But what do you do with this prediction?
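
      Both points - the flip trick for a sub-50% predictor and the near-perfect accuracy of always predicting "no crime" - fall out of a tiny simulation. All the rates below are invented for illustration, not Plainfield data:

      ```python
      import random

      random.seed(0)

      # Simulate a rare event: a crime occurs at a given address on ~1% of days.
      days = 100_000
      crime = [random.random() < 0.01 for _ in range(days)]

      # Predictor A: always say "no crime". Accuracy is ~99%, but it never
      # flags a single crime, so it is useless for directing patrols.
      always_no = [False] * days
      acc_no = sum(p == c for p, c in zip(always_no, crime)) / days

      # Predictor B: wrong ~60% of the time. Inverting it yields ~60%
      # accuracy -- which is why a sub-50% binary predictor can always be
      # flipped, and why raw accuracy says little about usefulness.
      bad = [not c if random.random() < 0.6 else c for c in crime]
      flipped = [not p for p in bad]
      acc_bad = sum(p == c for p, c in zip(bad, crime)) / days
      acc_flipped = sum(p == c for p, c in zip(flipped, crime)) / days

      print(f"'no crime' baseline accuracy: {acc_no:.1%}")       # ~99%
      print(f"bad predictor accuracy:       {acc_bad:.1%}")      # ~40%
      print(f"flipped predictor accuracy:   {acc_flipped:.1%}")  # ~60%
      ```

      With a base rate this low, accuracy is the wrong yardstick entirely; what matters is precision on the rare positive class, which is exactly where the reported numbers collapse.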

  • Anyone even partially competent at writing prediction software gets lured away to work on the stock market, where even "only somewhat useful" prediction is still a high enough bar to attract interest, because with non-linear returns it can afford to be wildly wrong on minor things repeatedly, leading to sustainable losses, but correct on one hugely valuable thing and turn a profit overall.

    In this context, no. Prediction software fucks up even once and you can expect a lawsuit against the police department.

    • by PPH ( 736903 )

      Prediction software fucks up even once and you can expect a lawsuit against the police department

      Prediction software is used to allocate resources to the locations more likely to experience crime. How are you going to sue for a mis-allocation? And why wouldn't you see lawsuits because some fat-assed shift sergeant sent more officers to the neighborhoods where the POC live?

  • Predictive policing has been a buzzword for a while and widely scoffed at by criminologists, so this isn't any surprise to anyone in the field. There are a whole slew of companies marketing unproven or poorly-implemented technology to police departments, and they sometimes fall for it. It's all AI models now, of course.
  • Now we just need three children who can look into the future and give us the name of the culprit, so we can find him and imprison him before he commits a crime. Nice. And don't forget retina scans everywhere and self-driving cars.
  • by laughingskeptic ( 1004414 ) on Wednesday October 04, 2023 @12:15PM (#63899641)
    You are clearly always going to look like an idiot trying to predict crime at a location on a given day. The more granular you make your location the worse your predictions get.
    Let's take Houston for example:
    10062 sq miles

    435 murders per year
    Probability per sq mi: 4% PER YEAR
    Probability per sq mi: 0.01% PER DAY

    26,223 Violent Crimes per year
    Probability per sq mi: 260% PER YEAR
    Probability per sq mi: 0.7% PER DAY
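
    The arithmetic above is a simple division and checks out. A sketch, taking the area and crime counts from the parent comment as given (not independently verified):

    ```python
    # Back-of-envelope check of the per-square-mile rates quoted above.
    # Figures for Houston are taken from the parent comment, unverified.
    AREA_SQ_MI = 10_062
    MURDERS_PER_YEAR = 435
    VIOLENT_CRIMES_PER_YEAR = 26_223

    def rate_per_sq_mi(events_per_year, per_day=False):
        """Expected events per square mile, per year or per day."""
        rate = events_per_year / AREA_SQ_MI
        return rate / 365 if per_day else rate

    print(f"Murders:  {rate_per_sq_mi(MURDERS_PER_YEAR):.1%} per sq mi per year")
    print(f"Murders:  {rate_per_sq_mi(MURDERS_PER_YEAR, per_day=True):.2%} per sq mi per day")
    print(f"Violent:  {rate_per_sq_mi(VIOLENT_CRIMES_PER_YEAR):.0%} per sq mi per year")
    print(f"Violent:  {rate_per_sq_mi(VIOLENT_CRIMES_PER_YEAR, per_day=True):.1%} per sq mi per day")
    ```

    Shrinking the cell from a square mile to a block divides these rates further by the number of blocks per square mile, which is why finer-grained predictions are almost guaranteed to "miss".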
  • They correlated with crimes that are reported to police. I used to live around there. Most of the population of Plainfield is not going to report property crime to the police, so the "study" doesn't really tell us much. Of course, the software is probably bogus, too, but that's not proven by this "study."

  • ...Minority Report?

  • Predictions might be worthless now, but if the system is trained more and more, it will improve over time. A tool like this is great, as it can probably also give reasons why, like the layout of the area, which in turn can lead to changes that make the area safer.
  • AI encounters human intelligence. Criminals avoid being caught where the AI expects them to be.
