Amid the chaos and turmoil at the Social Security Administration (SSA) caused by Elon Musk's so-called Department of Government Efficiency (DOGE), employees have now been asked to integrate a generative AI chatbot into their daily work.
But before they can use it, they must first watch a four-minute training video featuring an animated, four-fingered woman drawn in a style that would not look out of place on websites built in the early part of this century.
Beyond its Web 1.0-era graphics, the video also fails at its main goal: informing SSA staff about one of the most important aspects of using the chatbot, namely that they should not include any personally identifiable information (PII) when prompting the assistant.
The SSA pointed employees to the training video in a chatbot fact sheet shared in an email to staff last week. The fact sheet, which WIRED has reviewed, adds that employees using the chatbot should “refrain from uploading PII” into it.
Work on the chatbot, called the Agency Support Companion, began about a year ago, long before Musk or DOGE arrived at the agency, an SSA employee with knowledge of the app tells WIRED. The app had been in limited testing since February before being rolled out to all SSA employees last week.
In an email announcing its availability to all staff this week, which WIRED has also reviewed, the agency wrote that the chatbot was “created to help employees with daily tasks and increase productivity.”
Numerous SSA employees, including frontline office staff, tell WIRED that they ignored the chatbot email entirely because they were too busy with their existing work, compensating for reduced headcount in SSA offices. Others said they had briefly tested the chatbot but were not immediately impressed.
“Honestly, no one has really talked about it,” one source tells WIRED. “I’m not sure most of my coworkers have even watched the training video. I played around with the chatbot a little, and some of the answers I got from it were too vague and/or incorrect.”
Another source said their coworkers were making fun of the training video.
“You can hear my coworkers making fun of the graphics. No one I know is [using it]. It’s so clunky and bad,” the source says, adding that the chatbot also gave them incorrect information.