A need for AI chat functionality in Triggre - allowing custom code

I understand Triggre's choice not to allow custom code in the frontend, as it keeps the platform simpler and more robust.
However, with the AI revolution and its chat-based nature, I would suggest rethinking this decision, at least to allow installing third-party chat solutions. I cannot imagine a modern, easily accessible SaaS without a natural language interface, now or in the future.
Accessing one's business data in Triggre and then having to use a third-party solution outside of it just to 'chat to the data' seems like a very serious drawback.


Hi Tomas,

Thank you for your feedback. It is already possible to connect such AI tooling to your Triggre application via our WebAPI. We even have a template for that: AI-powered content creation no-code application template | AI Agent

I’ll ask our R&D team to add the chatbot-tool to the backlog.


I fully understand that I can call OpenAI endpoints from Triggre, receive replies, and use those replies to fill database tables. I would use the functionality shown in the video for filling/enriching the tables. But that is not the point of this post.
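To be concrete about the flow that already works today, here is a minimal sketch of parsing a reply from an OpenAI-style chat completions endpoint and mapping it to a table row. The field names (`choices`, `message`, `content`) come from OpenAI's API, not from Triggre; the sample response is illustrative, not captured from a real call.

```python
import json

# Hypothetical sample: the JSON shape an OpenAI-style
# /v1/chat/completions endpoint returns (non-streaming).
sample_response = json.dumps({
    "choices": [
        {"message": {"role": "assistant",
                     "content": "Enriched product description here."}}
    ]
})

def extract_reply(response_body: str) -> str:
    """Pull the assistant's text out of a chat-completions response."""
    data = json.loads(response_body)
    return data["choices"][0]["message"]["content"]

# The extracted text would then be written into a Triggre table field.
row = {"description": extract_reply(sample_response)}
print(row["description"])  # prints: Enriched product description here.
```

This request/response round trip is exactly the "fill the tables" pattern; what it does not give you is an interactive chat surface.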
What I am saying is that Triggre lacks:
a) the ability to have a CHAT interface inside Triggre (or, in layman's terms, a chat bubble in the bottom corner of the screen). Users should be able to have an AI-driven chat INSIDE Triggre, since a natural language CHAT interface is becoming the de facto standard for interacting with data and LLMs. Simply allowing LLMs to fill data into tables won't cut it.
b) It would be easy to install such an AI chat bubble if Triggre allowed inserting custom code snippets to bring third-party 'chat bubble' solutions into Triggre. Unfortunately, Triggre does not allow custom code to be inserted, which is why I am saying it might be time to rethink this policy.
c) Unless you have the time and resources to develop your own chat solution inside Triggre that would support different LLM APIs, streaming, etc.


Hi @tomas,

Thanks for the clear feedback. Right now the most important reasons not to allow JavaScript are security and backwards compatibility.

I fully agree that chatbots have improved dramatically over the past year and are now an asset (before, they seemed to me more like something meant to annoy customers :wink:).

Could you perhaps tell us which ones you would use right now? That may help us shape this feature.

Supporting your option C is definitely worth exploring, as it would keep the security and compatibility issues in check.

I would not like to be limited to specific LLM APIs. If you intend to build your own chat solution, then please build one that can connect to ANY chat API. Nearly everything needed for that is already inside Triggre. What is missing is a GUI element for the chat bubble (configured per page, with the ability to have different chat bubbles calling different endpoints depending on the user/page) and a minimal extension of the API functionality to support streaming.

Of course, where chat access to data stored INSIDE Triggre is required (and that data is not available anywhere outside Triggre), it gets more complicated. But for starters I would be completely satisfied with a chat bubble that could connect to various API endpoints and stream messages.
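To illustrate what "supporting streaming" would mean in practice: OpenAI-compatible chat endpoints stream their replies as Server-Sent Events, one `data:` line per token chunk, and the chat bubble appends each chunk as it arrives. Below is a hedged sketch of parsing such a stream; the sample stream and its `delta`/`content` field names follow OpenAI's streaming format and are illustrative, not captured from a real call.

```python
import json

# Hypothetical sample of an SSE chat-completions stream: each event is a
# "data:" line carrying a JSON delta; "[DONE]" is the end-of-stream sentinel.
sample_stream = """\
data: {"choices": [{"delta": {"content": "Hel"}}]}

data: {"choices": [{"delta": {"content": "lo!"}}]}

data: [DONE]
"""

def iter_chunks(stream_text: str):
    """Yield the text fragments from an SSE chat-completions stream."""
    for line in stream_text.splitlines():
        if not line.startswith("data: "):
            continue                      # skip blank separator lines
        payload = line[len("data: "):]
        if payload == "[DONE]":           # sentinel: stop consuming
            break
        delta = json.loads(payload)["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# A chat bubble would render each fragment immediately; joined, they
# form the full reply.
print("".join(iter_chunks(sample_stream)))  # prints: Hello!
```

The point is that the heavy lifting is in the endpoint; the platform mainly needs a UI element that can consume such a stream incrementally instead of waiting for a complete response.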

Hi @tomas,

Thanks for clarifying! We will consider this feature, though it likely won't be implemented very soon. I don't want to create false hopes here.

Features such as this one, which are very different from the concepts we currently use in Triggre, are first thoroughly thought through to see if and how we can build them with existing concepts or minimally different ones.

This is done to keep the platform as simple as we can. At the same time, we are working hard on some much-needed improvements to other large parts of the platform, mainly concerning the frontend of the applications you can build.

It seems logical to implement something like a chat interface after we have those frontend capabilities.