In other news, a New York attorney has landed in trouble for citing a case fabricated by ChatGPT, leading to the dismissal of her client's malpractice suit. This follows similar incidents in which lawyers relied on AI-generated material, underscoring the risks of using such tools in legal work. Lastly, a House of Lords report warns that the UK could miss out on the "AI gold rush" if it focuses too heavily on setting safety guardrails for Large Language Models, potentially stifling domestic innovation.
Key takeaways:
- More than half of UK undergraduates use AI tools like ChatGPT to complete assignments, according to a study by the Higher Education Policy Institute.
- The Italian Data Protection Authority alleges that OpenAI's ChatGPT violates the EU's GDPR; under the regulation, fines can reach €20 million or 4 percent of a company's global annual turnover, whichever is higher.
- Lawyers have repeatedly gotten into trouble for citing fake cases invented by AI; in one instance, a New York attorney's reliance on a fabricated citation led to the dismissal of her client's lawsuit.
- A report from the UK's House of Lords warns that the country could miss out on the 'AI gold rush' if it focuses too much on safety and regulation, potentially stifling domestic innovation.