NYC’s government chatbot is lying about city laws and regulations

Mar 29, 2024 - arstechnica.com
The "MyCity" chatbot, run by the New York City government, has been found to provide incorrect information about local laws and municipal policies. The bot, launched as a pilot program in October, was designed to provide business owners with information from over 2,000 NYC Business webpages and articles. However, a report from The Markup and The City found the bot giving wrong information about city policies, including incorrect statements about Section 8 housing subsidies and worker pay regulations.

The inaccuracies are attributed to the token-based predictive models that power these chatbots, which use statistical associations across millions of tokens to guess the most likely next word in a sequence, without understanding the underlying information. Despite warnings that the bot may produce incorrect or biased content, it is being marketed as a tool to help business owners navigate government. The report highlights the risks of governments and corporations deploying chatbots before their accuracy and reliability have been fully vetted.
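To make the failure mode concrete, here is a minimal, purely illustrative sketch (not the MyCity system, whose internals are not described in the article) of greedy next-token prediction: the model emits whatever continuation was statistically most common in its training data, with no step that checks the resulting claim against actual law. All token strings and counts below are invented for the example.

```python
from collections import Counter

# Hypothetical counts of which word followed the phrase
# "landlords must accept ..." in some training corpus.
next_token_counts = Counter({
    "Section": 40,   # e.g. "... Section 8 vouchers"
    "cash": 25,
    "checks": 10,
})

def predict_next(counts: Counter) -> str:
    """Greedy decoding: return the single most frequent continuation."""
    token, _ = counts.most_common(1)[0]
    return token

# The output reflects statistical frequency only; nothing here verifies
# the sentence against NYC housing regulations.
print(predict_next(next_token_counts))  # -> "Section"
```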

Key takeaways:

  • The MyCity chatbot, run by the New York City government, has been providing incorrect information on important local laws and municipal policies, according to a report from The Markup and local nonprofit news site The City.
  • Examples of incorrect information include misinformation about Section 8 housing subsidies, worker pay and work hour regulations, and industry-specific information like funeral home pricing.
  • The chatbot, which is powered by Microsoft Azure, uses a complex process of statistical associations across millions of tokens to guess at the most likely next word in any given sequence, without any real understanding of the underlying information being conveyed.
  • The report highlights the danger of governments and corporations rolling out chatbots to the public before their accuracy and reliability have been fully vetted, with examples of other chatbots providing misleading or inaccurate information.