Privacy

Amazon Transcribe Can Now Automatically Redact Personally Identifiable Data (venturebeat.com) 10

Amazon is adding a new privacy-focused feature to its business transcription service, one that automatically redacts personally identifiable information (PII), such as names, social security numbers, and credit card credentials. From a report: Amazon Transcribe is part of Amazon's AWS cloud unit and was launched in general availability in 2018. An automatic speech recognition (ASR) service, Transcribe enables enterprise customers to convert speech into text, which can help make audio content searchable from a database, for example. Contact centers can also use the tool to mine call data for insights and sentiment analysis. However, privacy issues have cast a spotlight on how technology companies store and manage consumers' data.
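The summary doesn't show how the option is switched on. As a rough sketch of how a customer might request redaction through boto3 (assuming the ContentRedaction parameter as documented around the feature's launch; the job name and S3 URI below are placeholders):

```python
# Minimal sketch: start an Amazon Transcribe job with PII redaction enabled.
# Assumes boto3 is configured with AWS credentials; the job name and media
# location are placeholders, not values from the article.
import boto3

transcribe = boto3.client("transcribe")

transcribe.start_transcription_job(
    TranscriptionJobName="support-call-0001",           # placeholder job name
    Media={"MediaFileUri": "s3://my-bucket/call.wav"},   # placeholder audio file
    MediaFormat="wav",
    LanguageCode="en-US",
    ContentRedaction={
        "RedactionType": "PII",         # redact personally identifiable information
        "RedactionOutput": "redacted",  # "redacted_and_unredacted" keeps both copies
    },
)
```

Per the launch documentation, the redacted transcript replaces flagged spans with [PII] tags, and requesting "redacted_and_unredacted" returns both versions of the transcript.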
This discussion has been archived. No new comments can be posted.

  • I wish they'd support OCRing handwritten letters and such as well.
    • I wish they'd support OCRing handwritten letters and such as well.

      There is probably a phone app for that already, depending on how bad or old the handwriting is.

      I'm guessing printed characters are not too bad, but cursive may be more difficult...

    • Good luck with my scrawl (esp. since I'm writing less and less and typing more).

      My handwriting is almost an encryption mechanism in itself :-)

  • Great /s (Score:5, Interesting)

    by Meneth ( 872868 ) on Friday February 28, 2020 @05:02AM (#59776874)
    The same function can be used to extract personally identifiable information. If you had trouble getting use out of a few million minutes of illicit voice recordings, despair no more!
  • Or it could be used to not redact for certain viewers and redact for others. Do we really know what the rules and possibilities are?

    Just my 2 cents ;)
  • If I create an app that records what my users say, then run it through Amazon and store it forever, is Amazon legally liable if I end up with PII in my records (even though my EULA said I am not capturing any PII)?

    • What does the contract say? You are the one who agreed to the damn thing, you should know what it says. If you don't, then hopefully it says whatever is most likely to put you in prison or have you executed, since that is what you deserve for asking here rather than simply reading the contract!

      From my perspective as the party whose data (voice recording) you stole and sent to Amazon for processing, YOU should be liable for what appears in YOUR records, and Amazon should be liable for what appears in theirs.

    • by cusco ( 717999 )

      Not unless they guaranteed 100% removal of all PII, and I can't believe they'd be that dumb. It's a tool, like a wrench. Skil isn't liable if you use that wrench to bash someone's head in rather than to tighten down a bolt.

  • by OolimPhon ( 1120895 ) on Friday February 28, 2020 @07:23AM (#59777048)

    One problem with this kind of automatic filter will be false positives.

    We already have situations like this: I'm looking through search results for an answer to a coding problem, and parts of an included log are redacted - parts, say, from an ssh session where a string has an embedded '@' that the system thinks is an email address when it is really an ssh invocation, or similar (see the sketch after this thread).

    How do we know that this new system will have enough training to be able to detect when you actually need the info in the file?

    • One problem with this kind of automatic filter will be false positives.

      I'd say a bigger problem is false negatives, i.e., the algorithm thinks it is not PII but it really is.
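To make the commenter's '@' example concrete, here is a hypothetical, deliberately naive redactor; the regex and sample log line are invented for illustration and are not Amazon's actual matching logic:

```python
import re

# Hypothetical, deliberately naive PII matcher: anything shaped like
# user@host is treated as an "email address" and redacted.
EMAIL_LIKE = re.compile(r"\b[\w.+-]+@[\w.-]+\b")

log_line = "ssh deploy@build-server-01 'tail -n 50 /var/log/app.log'"
print(EMAIL_LIKE.sub("[REDACTED]", log_line))
# -> ssh [REDACTED] 'tail -n 50 /var/log/app.log'
# The ssh target is not an email address, but a pattern this crude cannot tell,
# and the part of the log the reader actually needed is gone.
```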

"May your future be limited only by your dreams." -- Christa McAuliffe

Working...