This Chatbot Catches Pedophiles

from the technology-on-the-prowl dept

We've written about kids teaching FBI agents how to chat online like teens to help catch online predators, and about how chatbots are fooling lots of people, especially when they're expecting something specific. It sounds like someone has put the two ideas together and created a chatbot that tries to catch pedophiles. It goes into chat rooms and starts talking like an ordinary teen, but looks for classic signs of an adult predator trying to find a child. If the chatbot suspects something is up, it emails the bot's creator the relevant transcript. He then reads the transcript to see if the situation looks suspicious, and will contact local police with all the info if it does. He calls the various bots ChatNannies, and claims no one has figured them out and that they've helped with police investigations (though there's no proof of either claim). To stay relevant, the chatbot apparently tries to learn from the conversations it's involved in, as well as surfing the web for other pop culture information. The article also includes a "sample chat" that seems fairly sophisticated for a bot, which actually makes me wonder how true it really is, though I haven't kept up with the state of the art in chatbots lately. Are they really that sophisticated?
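
The article doesn't explain how the flagging actually works, but the workflow it describes (sit in the chat, watch for suspicious phrasing, then hand the full transcript to a human for review) is simple enough to sketch. The snippet below is purely illustrative Python and assumes a naive keyword-scoring approach; the phrase list, threshold, and email addresses are invented and have nothing to do with the real ChatNannies system.

    # Hypothetical sketch only: score chat lines against a small watch-list and
    # email the transcript to a human reviewer once a threshold is crossed.
    import smtplib
    from email.message import EmailMessage

    GROOMING_PHRASES = ["how old are you", "are your parents home", "don't tell anyone"]
    REVIEWER_EMAIL = "reviewer@example.com"   # the human who makes the final call

    class ChatMonitor:
        def __init__(self, flag_threshold=3):
            self.flag_threshold = flag_threshold
            self.transcript = []   # (speaker, message) pairs seen so far
            self.score = 0

        def observe(self, speaker, message):
            """Record one line of chat and bump the score for each watched phrase."""
            self.transcript.append((speaker, message))
            text = message.lower()
            self.score += sum(phrase in text for phrase in GROOMING_PHRASES)
            if self.score >= self.flag_threshold:
                self.email_transcript()

        def email_transcript(self):
            """Send the whole transcript to a human reviewer rather than acting on it alone."""
            msg = EmailMessage()
            msg["Subject"] = "Flagged chat transcript"
            msg["From"] = "chatbot@example.com"
            msg["To"] = REVIEWER_EMAIL
            msg.set_content("\n".join(f"{who}: {said}" for who, said in self.transcript))
            with smtplib.SMTP("localhost") as smtp:   # assumes a local mail relay
                smtp.send_message(msg)

The point of the design, as described, is that the bot only flags and forwards; the judgment call stays with a person reading the transcript.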


Reader Comments



  • Matt, 18 Mar 2004 @ 5:54am

    sophisticated

    How sophisticated does a teen chatbot need to be? All it needs is a vocabulary of 500 words, the latest movie + music charts (including actors/singers), and to prefix everything with "so like.." and throw in a "whatever" occasionally :-)


  • Oliver Wendell Jones, 18 Mar 2004 @ 6:54am

    And then...?

    And then the police swoop in and arrest people for carrying on "illegal" conversations with a chat-bot?


    • Mikester, 18 Mar 2004 @ 9:01am

      Re: And then...?

      It doesn't matter if you're chatting with a bot. If it can be proven that you 'intended' to prey on a child, that makes you guilty.


      • Anonymous Coward, 18 Mar 2004 @ 12:43pm

        Re: And then...?


        What's "proof", Mike?

        It's not even a human!


        • Mikester, 18 Mar 2004 @ 3:41pm

          Re: And then...?

          The proof would be in the transcript of the chat between the bot and the pedophile.
          You don't really think your words vanish off into cyberspace when they roll off the chat window, do you? ;-)


  • Ed Halley, 18 Mar 2004 @ 8:02am

    No Subject Given

    I don't know the chat bots discussed here in particular, but I have considered a double-blind approach to bots appearing "human" while sitting in a number of chat channels.

    In this technique, the bot joins as person Alice on network A.NET, and as person Betty on a different network, B.NET. It picks a real, unrelated participant on A.NET (we'll call her Annie) and parrots most of Annie's speech on B.NET, so Betty says whatever Annie says. It likewise picks a random participant Bernice from B.NET and parrots most of Bernice's speech as Alice on A.NET.

    Once this blind is constructed, either Alice or Betty can add comments of her own to goad conversations in her respective chat group. If the bot detects a directed question, like "alice, a/s/l?", it can respond directly according to its goals, or it can simply funnel the question to the alter ego on the other network to fetch a genuinely produced reply from the real Bernice. (A rough sketch of this relay appears after the comments below.)


  • DanTekGeek, 18 Mar 2004 @ 1:36pm

    No Subject Given

    What I want to know is whether it could pass a Turing test.


  • maxx, 26 Mar 2004 @ 8:02am

    No Subject Given

    Like that ex-Interpol guy David Race Bannon (Race Against Evil), it's time to start killing child molesters. We need a mandatory death sentence for repeat offenders (legal).

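For what it's worth, the double-blind parroting idea Ed Halley describes above boils down to a small piece of routing logic. Here is a minimal, hypothetical Python sketch; the class name, the send callbacks, and the "latch onto the first real participant" rule are all invented, and no real chat protocol is implied.

    # Sketch of the cross-network parroting relay from the comment above.
    # Alice/Annie live on A.NET, Betty/Bernice on B.NET; the send_* callbacks are
    # placeholders for whatever chat protocol would actually be used.
    class ParrotRelay:
        def __init__(self, send_as_alice, send_as_betty):
            self.send_as_alice = send_as_alice   # posts a line as Alice on A.NET
            self.send_as_betty = send_as_betty   # posts a line as Betty on B.NET
            self.annie = None      # real A.NET user whose speech Betty parrots
            self.bernice = None    # real B.NET user whose speech Alice parrots

        def on_a_net_message(self, user, text):
            """Called for every line seen on A.NET."""
            if self.annie is None and user != "Alice":
                self.annie = user                 # latch onto a real participant
            if user == self.annie:
                self.send_as_betty(text)          # Betty repeats Annie's words on B.NET

        def on_b_net_message(self, user, text):
            """Called for every line seen on B.NET."""
            if self.bernice is None and user != "Betty":
                self.bernice = user
            if user == self.bernice:
                self.send_as_alice(text)          # Alice repeats Bernice's words on A.NET

A directed question like "alice, a/s/l?" could be handled the same way: forward it to the other network and relay whatever genuine answer comes back.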

