The inaccuracies are attributed to the token-based predictive models that power these chatbots, which use statistical associations across millions of tokens to guess the most likely next word in a sequence, without understanding the underlying information. Despite warnings that the bot may produce incorrect or biased content, it is being marketed as a tool to help business owners navigate government. The report highlights the risks of governments and corporations deploying chatbots before their accuracy and reliability have been fully vetted.
Key takeaways:
- The MyCity chatbot, run by the New York City government, has been providing incorrect information on important local laws and municipal policies, according to a report from The Markup and local nonprofit news site The City.
- The incorrect answers covered Section 8 housing subsidies, wage and hour rules for workers, and industry-specific topics such as funeral home pricing.
- The chatbot, which runs on Microsoft Azure, relies on statistical associations across millions of tokens to predict the most likely next word in a sequence, with no real understanding of the information being conveyed.
- The report highlights the danger of governments and corporations rolling out chatbots to the public before their accuracy and reliability have been fully vetted, and cites other chatbots that have given misleading or inaccurate answers.
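The prediction mechanism described above can be illustrated with a deliberately simplified toy: a bigram model that picks the word most frequently seen after the current one. This is a sketch of the general principle only, not how MyCity or any production LLM actually works (real models use neural networks over far larger contexts), and the function names and sample corpus below are invented for illustration. The point it demonstrates is that the model outputs whatever continuation is statistically most common, regardless of whether that continuation is factually correct.

```python
from collections import Counter, defaultdict

def train_bigram_model(corpus):
    """Count how often each word follows each other word."""
    model = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1
    return model

def predict_next(model, word):
    """Return the statistically most frequent follower of `word`, or None."""
    followers = model.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Hypothetical training text about local housing rules.
corpus = [
    "landlords must accept section 8 vouchers",
    "landlords must register rental units",
    "landlords must accept lawful income sources",
]
model = train_bigram_model(corpus)
print(predict_next(model, "must"))  # prints "accept" (seen twice vs. once)
```

If the training data happened to contain more sentences asserting the opposite rule, the model would just as fluently predict the wrong answer, which is why statistical likelihood is no substitute for verified accuracy.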