The article also provides a technical overview of how the chatbot was built, including the use of Python, Twilio's Sandbox for WhatsApp, and a Flask API. It explains how to test the API locally by exposing it with ngrok, and how to deploy it to Heroku using Docker. The authors note that while the current implementation is simple, it has been useful for exploring the technical feasibility of combining WhatsApp with generative AI, particularly for early-years education. Future iterations could potentially use WhatsApp messaging history to enable more natural interactions and allow users to ask follow-up questions.
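
To make the setup concrete, a minimal version of such a webhook might look roughly like the sketch below. It assumes the `flask`, `twilio` and `openai` Python packages; the route name, system prompt and model choice are illustrative assumptions rather than Nesta's actual code.

```python
# Illustrative sketch of a Twilio WhatsApp webhook backed by an LLM.
# Route name, prompt wording and model are hypothetical, not taken from
# Nesta's implementation.
import os

from flask import Flask, request
from openai import OpenAI
from twilio.twiml.messaging_response import MessagingResponse

app = Flask(__name__)
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])


@app.route("/whatsapp", methods=["POST"])
def whatsapp_reply():
    # Twilio posts the incoming WhatsApp message as form data in "Body".
    incoming_text = request.values.get("Body", "").strip()

    # Wrap the user's message in a pre-made prompt and ask the LLM for a reply.
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",  # hypothetical model choice
        messages=[
            {"role": "system", "content": "You suggest simple activities for toddlers."},
            {"role": "user", "content": incoming_text},
        ],
    )
    reply_text = completion.choices[0].message.content

    # Return TwiML so Twilio relays the reply back over WhatsApp.
    response = MessagingResponse()
    response.message(reply_text)
    return str(response)
```

Pointing the Twilio Sandbox webhook at the ngrok URL for this route is then enough to test the loop end to end from a WhatsApp conversation.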
Key takeaways:
- Nesta’s Discovery Hub has created a WhatsApp chatbot prototype that uses a large language model (LLM) to generate personalised activity ideas for toddlers and answer questions.
- The chatbot was built using Twilio (a communications platform), the Flask Python package, and the OpenAI API. It handles two types of user messages and generates responses from pre-made, tailored prompts (see the routing sketch after this list).
- The prototype can be tested locally by exposing the Flask API with ngrok, and deployed to Heroku as a Docker container so it keeps running even when the local machine is switched off (a minimal entry-point sketch follows the list).
- While the current implementation is simple, it highlights the potential of combining WhatsApp with generative AI, especially in the context of early-years education. Future iterations could use WhatsApp messaging history for more natural interactions and follow-up questions.
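
As a rough illustration of the two message types mentioned above, the sketch below assumes one type asks for an activity idea and the other asks a free-form question. The keywords, prompt wording and classification logic are hypothetical stand-ins, since the article's exact prompts are not reproduced here.

```python
# Hypothetical routing between two message types, each mapped to a
# pre-made prompt template. Keywords and prompt wording are illustrative.
ACTIVITY_PROMPT = (
    "Suggest one simple, low-cost activity for a toddler aged {age}. "
    "Keep it under 80 words."
)
QUESTION_PROMPT = (
    "Answer this question from a parent of a toddler in plain, friendly "
    "language: {question}"
)


def build_prompt(incoming_text: str) -> str:
    """Pick a tailored prompt based on how the message starts."""
    text = incoming_text.strip()
    if text.lower().startswith("idea"):
        # e.g. "Idea 2" -> treat the rest of the message as the child's age.
        age = text[4:].strip() or "2"
        return ACTIVITY_PROMPT.format(age=age)
    # Anything else is treated as a free-form question.
    return QUESTION_PROMPT.format(question=text)
```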
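
On the deployment side, one practical detail when running a Flask container on Heroku is that the platform injects the listening port at runtime. The hypothetical entry point below shows the idea; the prototype's actual module layout and server choice (for example gunicorn) may well differ.

```python
# Hypothetical run.py entry point for the Docker container: Heroku's runtime
# injects the PORT environment variable, so the server must bind to it.
import os

from app import app  # assumes the webhook sketch above is saved as app.py

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 5000)))
```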