London Underground Is Testing Real-Time AI Surveillance Tools To Spot Crime (wired.com)

Thousands of people using the London Underground had their movements, behavior, and body language watched by AI surveillance software designed to see if they were committing crimes or were in unsafe situations, new documents obtained by WIRED reveal. From the report: The machine-learning software was combined with live CCTV footage to try to detect aggressive behavior and guns or knives being brandished, as well as looking for people falling onto Tube tracks or dodging fares. From October 2022 until the end of September 2023, Transport for London (TfL), which operates the city's Tube and bus network, tested 11 algorithms to monitor people passing through Willesden Green Tube station, in the northwest of the city. The proof-of-concept trial is the first time the transport body has combined AI and live video footage to generate alerts that are sent to frontline staff. More than 44,000 alerts were issued during the test, with 19,000 being delivered to station staff in real time.

Documents sent to WIRED in response to a Freedom of Information Act request detail how TfL used a wide range of computer vision algorithms to track people's behavior while they were at the station. It is the first time the full details of the trial have been reported, and it follows TfL saying, in December, that it will expand its use of AI to detect fare dodging to more stations across the British capital. In the trial at Willesden Green -- a station that had 25,000 visitors per day before the Covid-19 pandemic -- the AI system was set up to detect potential safety incidents to allow staff to help people in need, but it also targeted criminal and antisocial behavior. Three documents provided to WIRED detail how AI models were used to detect wheelchairs, prams, vaping, people accessing unauthorized areas, or putting themselves in danger by getting close to the edge of the train platforms.
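To make the setup concrete: a real-time pipeline of this kind typically pulls frames from a CCTV stream, runs computer vision models on each frame, and forwards matching detections to staff as alerts. TfL has not published implementation details, so the sketch below is purely illustrative; the detector callable, the alert endpoint URL, and the label names are all hypothetical stand-ins.

```python
# Minimal sketch of a real-time CCTV alert pipeline -- an illustration,
# NOT TfL's system. Assumes OpenCV for frame capture, a hypothetical
# `detector` callable, and a hypothetical internal alert endpoint.
import time

import cv2        # pip install opencv-python
import requests   # pip install requests

ALERT_URL = "https://ops.example.internal/alerts"   # hypothetical
WATCHED = {"weapon", "person_on_track", "fare_evasion", "aggression"}

def monitor(stream_url: str, detector) -> None:
    cap = cv2.VideoCapture(stream_url)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # `detector(frame)` is assumed to return (label, confidence) pairs.
        for label, confidence in detector(frame):
            if label in WATCHED and confidence >= 0.8:
                # Forward the event to frontline staff in real time.
                requests.post(ALERT_URL, json={
                    "label": label,
                    "confidence": confidence,
                    "timestamp": time.time(),
                }, timeout=5)
    cap.release()
```

A production system would presumably run a bank of models (TfL's trial used 11 algorithms) and deduplicate and route alerts per station, but the basic data flow is the same.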

  • Pretty sure (Score:2, Informative)

    by Anonymous Coward

    This would be illegal in the EU. Oh well, another Brexit win!

  • "Testing" means: we are willfully violating human rights, but if we call it a test, we won't be held accountable.
    • Re: (Score:3, Insightful)

      Which human rights do you think they are violating? The Tube is a public space and they already have cameras everywhere.
      • Re:Newspeak (Score:4, Informative)

        by Errol backfiring ( 1280012 ) on Friday February 09, 2024 @12:31PM (#64227606) Journal
        That would be Article 12 of the Universal Declaration of Human Rights. Being in a public space does not mean people have the right to stalk you, not even "with a computer".
        • Re:Newspeak (Score:4, Insightful)

          by Tyr07 ( 8900565 ) on Friday February 09, 2024 @01:17PM (#64227760)

          If that were interpreted in a way that meant a computer or AI couldn't do it, then all forms of marketing would be illegal, and marketers would be prosecuted for all the data they collect to 'stalk' everything you do.

          It's clearly not classified as stalking if it's not a person and it simply observes data in a public space. If I sit at a subway every day, regardless of whether you go there or not, and I see you there, that's so far from stalking that people like you who try to confuse the two are the ones we need to watch out for.

        • by hey! ( 33014 )

          The distinction between "observation" and "stalking" is that stalking involves following. You could absolutely achieve the legitimate stated goals of the system without stalking anyone.

          But you could *also* use such a system for purposes that violate people's rights without stalking them. For example, what if you include protesting as "antisocial behavior"?

          You can even pursue the legitimate goals of the system in a way that violates people's rights. For example, you could -- quite accidentally -- train the

        • Being in a public space does not mean people have the right to stalk you, not even "with a computer".

          They already have the cameras installed for use by human operators. If anything, connecting this to an AI system will result in less "stalking": while a human operator might use the camera system to virtually stalk someone they regard as attractive, an AI system is much less likely to do so. Plus, while an AI system can have biases in whom it flags, it is generally a lot easier to identify those biases and retrain the system to correct them than it is to do the same for a human operator.
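          On the "easier to identify" point: if the alert log records an audited group label for each event (itself a significant assumption, and a policy decision), a per-group flag-rate comparison takes only a few lines. A toy sketch with hypothetical record fields:

```python
# Toy per-group flag-rate audit -- assumes an alert log where each record
# carries an audited group label, which is itself a significant assumption.
from collections import Counter

def flag_rates(records: list[dict]) -> dict[str, float]:
    """records look like {"group": "A", "flagged": True}."""
    seen, flagged = Counter(), Counter()
    for r in records:
        seen[r["group"]] += 1
        flagged[r["group"]] += int(r["flagged"])
    return {group: flagged[group] / seen[group] for group in seen}

log = [{"group": "A", "flagged": True},  {"group": "A", "flagged": False},
       {"group": "B", "flagged": True},  {"group": "B", "flagged": True}]
print(flag_rates(log))   # {'A': 0.5, 'B': 1.0} -> B flagged twice as often
```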

      • Yeah. There are more cameras in the UK than people. And now we want to use predictive AI to "look for crime." Well, Orwell and Huxley *were* British. What's our excuse in the US? We seem to like the surveillance state too. And the last time I rode TfL trains and buses I was less than impressed with their service, too.
  • I'm conflicted... (Score:5, Insightful)

    by WCLPeter ( 202497 ) on Friday February 09, 2024 @12:25PM (#64227586) Homepage

    On one hand, having friends who'd worked in security, I know how difficult it can be to stare at a bank of monitors watching for specific behaviours for hours on end. If I were working subway security, I'd really appreciate an automated system that tracks a person's location and tells me if they're on the tracks when the train isn't safely parked at the station (a rule like the sketch below).

    On the other hand, the image quality of your bog-standard CCTV camera is pretty shit, and I worry about a bunch of false positives. Something looks like a guy wielding a manchette in the blocky pixelated mess, so you send a squad to stop them, only to find it's just an old guy with a walking stick doing tricks for his grandson. Now you're back to a person staring at the bank of monitors, but with the newly added annoyance of endless pings from the system saying "Potential crime in progress!", making you miss the "real" crime while you're dealing with that.
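    For what it's worth, the person-on-the-tracks rule mentioned above is trivial to state once the hard perception problems (reliably detecting people and trains in that blocky, pixelated mess) are solved upstream; the two boolean inputs below are hypothetical outputs of those detectors:

```python
# Toy version of the track-intrusion rule described above. Assumes
# hypothetical upstream vision models that report two booleans per frame.
from dataclasses import dataclass

@dataclass
class PlatformState:
    person_in_track_zone: bool   # person detector + track-area zone mask
    train_berthed: bool          # train-presence detector at the platform

def should_alert(state: PlatformState) -> bool:
    """Alert staff when someone is on the tracks and no train is
    safely stopped at the platform."""
    return state.person_in_track_zone and not state.train_berthed

# Person detected on the tracks, no train berthed: raise the alert.
assert should_alert(PlatformState(person_in_track_zone=True,
                                  train_berthed=False))
```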

    • Police arresting a guy holding a manchette is a pretty funny picture. "Sir, put down the turkey leg!"

    • Re:I'm conflicted... (Score:4, Interesting)

      by cusco ( 717999 ) <brian.bixby@gmail . c om> on Friday February 09, 2024 @12:54PM (#64227680)

      A few years ago there was an article on Slashdot saying that the British police had credited their access to over two million cameras with preventing "dozens" of crimes. Not thousands, not hundreds, not even scores, just a couple dozen.

      You're right, the walls of video screens shown in the movies don't really exist in actual security installations (my career for 17 years), for the simple reason that they're useless. IIRC one person can monitor 8-12 screens for 10-20 minutes before their brain is mush and they need a break of a quarter hour or more. One test had a guy in a sasquatch suit dance in front of one of the cameras on a "video wall" and no one caught it (imitating the more famous "invisible gorilla" selective-attention experiment). Most casinos and other high-security installations have 3-6 screens per operator and provide frequent breaks.

      And yeah, most security cameras are low resolution and low frame rate to save on storage and network bandwidth.

    • To be clear, this isn't really anything "new" in a field that is already well established at doing things like this. Professional CCTV software systems for public spaces have been integrating features like these for years: the systems can track specific people as they move around and handle dwell times, fall detection, dropped items, loitering timers (see the sketch below), and traffic and perimeter detection and alerts.

      Before the buzzword "AI" was everywhere, they would have just called this "intelligent surveillance."
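      A loitering timer is a representative example of how thin these analytics layers are once a tracker is in place. A minimal sketch, assuming a hypothetical upstream tracker that assigns a stable integer ID to each visible person:

```python
# Toy loitering timer. Assumes a hypothetical upstream tracker that
# assigns a stable integer ID to each person visible in a frame.
import time

LOITER_SECONDS = 120.0               # threshold; configurable in practice
first_seen: dict[int, float] = {}    # person_id -> first sighting time

def update(visible_ids: set[int], now: float | None = None) -> set[int]:
    """Feed one frame's IDs; return IDs loitering past the threshold."""
    now = time.time() if now is None else now
    for pid in visible_ids:             # start the clock for new arrivals
        first_seen.setdefault(pid, now)
    for pid in list(first_seen):        # forget people who have left
        if pid not in visible_ids:
            del first_seen[pid]
    return {pid for pid, t0 in first_seen.items()
            if now - t0 >= LOITER_SECONDS}

assert update({7}, now=0.0) == set()    # person 7 just arrived
assert update({7}, now=150.0) == {7}    # still here 150s later -> flag
```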

    >> so you send a squad to stop them, only to find it's just an old guy

      Anything that gets flagged by the AI will be reviewed by a human before action is taken.

      >> the newly added annoyance of endless pings from the system saying "Potential crime in progress!"

      I work with image recognition, and you would be surprised at how well it can work even with low-res video.

  • Up until the first random innocent is killed to "prevent crime". You know it will happen when police run to a location assuming something wrong is going on.

    And it may be worse than swatting.

    • by cusco ( 717999 )

      This is being done in England rather than the US, so that's probably not a worry for a while. Do that today in Minneapolis and, yeah, the first fatality would be tomorrow.

      • Sure, it would happen quicker in the US. But England is no stranger to police murdering people.

      • Except, of course, for https://www.theguardian.com/uk... [theguardian.com].
        • by cusco ( 717999 )

          That's from 2005.

          https://mappingpoliceviolence.... [mappingpoliceviolence.us]

          Police have killed 89 people so far in 2024. Police killed at least 1,246 people in 2023.

          • As with everything, it depends on the context. If it was a wrongful killing, that's one thing; if a perp is shooting at you, then they'll probably get offed. And I was referring to the UK, not the US. The entire thread was about the UK, specifically England, specifically the London Underground. What happens in the US, shockingly, is not relevant to what happens in the UK or anywhere else.
  • The learning mechanism will likely pick up on certain ethnic groups who statistically have a higher crime rate, scoring them higher for the same actions.

    (I'm not claiming some ethnic groups are inherently violent. It's more that those who feel or are disenfranchised are more likely to commit crimes.)

  • by TwistedGreen ( 80055 ) on Friday February 09, 2024 @12:36PM (#64227624)

    Wow, England just keeps getting closer and closer to INGSOC.

    Let's put up a picture of Putin for our Two Minutes Hate. If you don't like it, make sure you stay out of view of the telescreens!

  • ... a defunct punk rock band have to do with security?

  • Human figure detection has already filtered down to the lowest end of consumer security cameras. I can buy a $50 Wyze camera that does an excellent job of it. You can wear a hoodie and a face mask, but you can't hide your body.

    The next step is right on the horizon - networked neighborhood cameras that are constantly monitored by AIs, watching for suspicious behavior, especially late at night.

    Within a decade it will be standard for "trusted" neighborhood security systems to send real-time alerts to the loc
