Using LM Studio instead of LocalAI

Hi folks.
I’m currently playing around with ideas at the moment. I currently use LM Studio instead of LocalAI, as I’ve found it to be more flexible and responsive. Is there a way to modify the LocalAI integration in Activepieces so it can use LM Studio instead?

For now I’ve just taken a chance on connecting to the server IP and port, and the LM Studio app is logging:

[ERROR] Unexpected endpoint or method. (GET /models). Returning 200 anyway
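
For anyone who hits the same log line: LM Studio serves an OpenAI-compatible API under the /v1 prefix, so the model list lives at GET /v1/models rather than GET /models. A minimal sketch to check the server is reachable, assuming LM Studio’s default address of localhost:1234 (adjust the host and port to your setup):

```ts
// Probe LM Studio's OpenAI-compatible /v1/models endpoint.
// The base URL below is an assumption (LM Studio's default); change it to match your server.
const BASE_URL = "http://localhost:1234/v1";

async function listModels(): Promise<void> {
  const res = await fetch(`${BASE_URL}/models`);
  if (!res.ok) {
    throw new Error(`Request failed: ${res.status} ${res.statusText}`);
  }
  const body = await res.json();
  // Like the OpenAI API, the response looks like { data: [{ id: "<model-name>", ... }] }.
  for (const model of body.data) {
    console.log(model.id);
  }
}

listModels().catch(console.error);
```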

Hey, so after some light digging I found that I can use the OpenAI piece and change the values in its configuration. I’m now successfully running a local LLM with a model of my choosing.
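
In case it helps anyone else: the same trick works anywhere the OpenAI client lets you override the base URL. Here’s a minimal sketch using the official openai Node SDK; the port and model name are assumptions about your LM Studio setup, and in Activepieces you’d set the equivalent base URL field on the OpenAI piece connection instead:

```ts
import OpenAI from "openai";

// Point the official OpenAI SDK at LM Studio's local server instead of api.openai.com.
// LM Studio doesn't validate the API key, but the SDK requires a non-empty value.
const client = new OpenAI({
  baseURL: "http://localhost:1234/v1", // assumption: LM Studio's default address
  apiKey: "lm-studio",
});

async function main(): Promise<void> {
  const completion = await client.chat.completions.create({
    model: "local-model", // hypothetical name; use the identifier LM Studio reports
    messages: [{ role: "user", content: "Hello from Activepieces!" }],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```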