Gemini as MCP Agent?

Is it possible to use Gemini instead of OpenAI for the MCP pieces? I have Gemini configured in Platform Admin → Setup → AI.

I’ve tried both Google Drive and Postgres MCPs but both fail with the following error message:

Please check if you have connected your OpenAI provider to Activepieces.

And in the Docker logs, there’s this snippet:

{"name":"AI_RetryError","reason":"maxRetriesExceeded","errors":[{"name":"AI_APICallError","cause":{"errno":-111,"code":"ECONNREFUSED","syscall":"connect","address":"127.0.0.1","port":8080},"url":"http://localhost:8080/api/v1/ai-providers/proxy/openai/v1/chat/completions"
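The log snippet above is truncated, but the visible fields already pinpoint the failure: the MCP agent calls Activepieces' OpenAI proxy route and gets `ECONNREFUSED` (errno -111) on `127.0.0.1:8080`, i.e. nothing is answering at the OpenAI proxy endpoint. As a sketch, here is a small Python helper that classifies such a payload; the completed dictionary is a hypothetical reconstruction of the truncated log (field names come from the snippet, the closing braces are assumed):

```python
# Hypothetical reconstruction of the truncated AI_RetryError payload from the
# Docker logs. Field names match the log snippet; structure past the
# truncation point is an assumption.
log_payload = {
    "name": "AI_RetryError",
    "reason": "maxRetriesExceeded",
    "errors": [
        {
            "name": "AI_APICallError",
            "cause": {
                "errno": -111,
                "code": "ECONNREFUSED",
                "syscall": "connect",
                "address": "127.0.0.1",
                "port": 8080,
            },
            "url": "http://localhost:8080/api/v1/ai-providers/proxy/openai/v1/chat/completions",
        }
    ],
}


def diagnose(payload: dict) -> str:
    """Distinguish 'proxy endpoint unreachable' from other API-call failures."""
    for err in payload.get("errors", []):
        cause = err.get("cause", {})
        if cause.get("code") == "ECONNREFUSED":
            # Connection refused: the proxy route itself never answered,
            # so this is not a provider-credentials problem.
            return f"proxy unreachable at {cause.get('address')}:{cause.get('port')}"
    return "other failure"


print(diagnose(log_payload))  # → proxy unreachable at 127.0.0.1:8080
```

In other words, the "connect your OpenAI provider" message is somewhat misleading here: the request never reaches any provider, because the agent is hard-wired to the `/proxy/openai/` route regardless of the Gemini configuration in Platform Admin.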

We currently use only Gemini as our LLM – is it possible to use that for MCP, or is OpenAI required?

We're currently running version 0.67.6 (the 0.68.X releases seem to have a regression that prevents using templates).

Hi Kyle,

Currently, we only support OpenAI for the MCP Agent. We will consider making both the provider and the model configurable.
We are aware of the templates regression and are working on a fix.

Thanks for the response, this answers my question.
