The experts also highlighted the need for new architectures for large language models (LLMs), the models underpinning today's AI chatbots. Current LLMs struggle with basic causal reasoning, such as understanding cause and effect, as well as with tasks like image and video processing. Despite these limitations, LLMs have significant commercial value and are being widely deployed.
Key takeaways:
- Despite the hype around AI, experts at the World Economic Forum in Davos cautioned that AI in its current form still has a long way to go to reach real intelligence.
- One challenge is data-related: AI models like OpenAI's GPT-4 are trained mostly on publicly available internet data and cannot yet handle the more complex data that would come from 'embodied AI' or from experimentation.
- The other challenge is architectural: current models, which are not very good at basic cognitive logic such as cause and effect, will need new architectures to reach the next level of intelligence.
- Despite these limitations, AI models like LLMs have significant commercial value: they solve real problems, generate content, improve productivity, and are being widely deployed.