Is it possible to speed up the loading time of the chatbot in the Storyline template?
Hello!
We use the Storyline Chatbot Template to create a chatbot solution for a customer. The aim is to run through several tasks and questions to create a brand identity. Basically, the solution works well, but the loading time ("processing") is often far too long. Is there any way to optimise this? We are of course also open to using a more expensive subscription model if this is necessary.
Thank you very much!
Posted on 11/19/2024
@support
Mathias, how long is the processing time when you use it? Also, where in the world are you located and where will the rest of the users be located?
Posted on 11/19/2024
@mathias
Thank you for your answer. There are waiting times of up to 15 seconds for longer answers. We are based in Germany and our customers also come from German-speaking countries.
Posted on 11/20/2024
@support
Is the chatbot generating a lot of text? If you've ever used ChatGPT online, you might have noticed that the more text it needs to generate, the longer it takes to complete the task. So one workaround I would recommend is to split the workload into separate tasks. Do the users need all of the text generated right away? If not, consider creating one chatbot that generates only the text that the users need at the beginning, then feed its output into a second chatbot that generates the rest of the text. That could help make it faster.
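To illustrate the idea, here is a minimal sketch of that two-step flow. The `generate` function is a hypothetical stand-in for whatever chat-API request your template actually makes, and the prompts are made-up examples; the point is simply that the first request stays short (so the user sees something quickly) and its result is fed into a second, longer-running request.

```python
def generate(prompt: str) -> str:
    """Hypothetical placeholder for the template's chat-API call.

    Swap this stub for the real request your Storyline template sends;
    it is kept as a stub here so the flow is easy to follow."""
    return f"[generated text for: {prompt}]"


def brand_identity_flow(user_input: str) -> tuple[str, str]:
    # Step 1: generate only what the user needs immediately.
    # A short prompt asking for a short answer finishes faster,
    # so the "processing" wait before the first reply shrinks.
    first = generate(f"Summarise the core brand values from: {user_input}")

    # Step 2: feed the first result into a second request that
    # produces the longer remaining text, while the user is
    # already reading the output of step 1.
    rest = generate(f"Expand these brand values into a full identity: {first}")
    return first, rest


first, rest = brand_identity_flow("eco-friendly coffee startup")
```

The same pattern works with two chatbot objects in the template: the first bot's answer becomes part of the second bot's prompt, so no single request has to generate all 15 seconds' worth of text at once.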