Army comes clean about its recruitment AI, accidentally discloses info about pedophile- and terrorist-catching chatbots that roam the net

Dave from the Electronic Frontier Foundation writes, "Not too long ago, Boing Boing covered EFF's (at the time) unsuccessful attempt to retrieve records about Sgt. Star (the Army's recruiter-bot) using the Freedom of Information Act. We've now received the files and compiled our research: It turns out Sgt. Star isn't the only government chatbot — the FBI and CIA had them first.

The information about the terrorist/child-abuser bots only came to light because the spy agencies failed to fully redact their responses (the type was legible through the black strikeouts).

Sgt. Star has a seemingly exhaustive supply of answers to questions about military service, from opportunities for dentists and veterinarians to whether soldiers are allowed to use umbrellas (only women, and only under certain conditions). He also has answers that simply exist to deepen his personality, such as his music and film preferences, and information about his Rottweiler, "Chomp." He will also deliver rather in-depth, scientific answers to throwaway questions, including "why is the sky blue?" and "why is grass green?"

For all his character quirks, a user would never mistake Sgt. Star for human—that's just not how he was designed. That can’t necessarily be said for other government bots. Military, law enforcement and intelligence agencies have employed virtual people capable of interacting with and surveilling the public on a massive scale, and every answer raises many, many more questions.

Answers and Questions About Military, Law Enforcement and Intelligence Agency Chatbots

(Thanks, Dave!)