Sending data to Whisper with the OpenAI Piece

I run ActivePieces in a Docker container and it does not have a public URL.

I noticed that the Drive piece wrongly assumes you have a public URL, and that it is served on port 80. So it gives me a broken URL starting with my public IP (I guess it calls out to the internet to discover the external IP).

This is completely broken, as I don't plan to give this ActivePieces instance a public URL. It will also break for any user running it behind a reverse proxy.

I discovered that ReadFile, instead of reading the file internally, tries to fetch it from a URL, so I had to replace my public IP with "localhost:xxxx" (adding the local port as well) so ActivePieces can read from itself.
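In case it helps, here is a rough sketch of that substitution as it could be done in a Code step. The URL pattern, the port 8080 and the input name `fileUrl` are assumptions from my setup, not anything the Drive piece guarantees:

```typescript
// Rewrite the public URL produced by the Drive piece so the request
// stays inside the container (assumed port and URL shape, adjust to yours).
export const code = async (inputs: { fileUrl: string }) => {
  // e.g. "http://203.0.113.7/api/v1/files/abc" -> "http://localhost:8080/api/v1/files/abc"
  const localUrl = inputs.fileUrl.replace(/^https?:\/\/[^/]+/, 'http://localhost:8080');

  const response = await fetch(localUrl);
  if (!response.ok) {
    throw new Error(`Could not read the file locally: ${response.status}`);
  }

  // Return the raw bytes as base64 so a later step can reuse them.
  const bytes = Buffer.from(await response.arrayBuffer());
  return { base64: bytes.toString('base64') };
};
```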

This is a poor architectural design at its core. I don't understand why the pieces try to seem more intelligent than what they are asked to do. It breaks the S in the SOLID principles: the Drive piece is not responsible for discovering the external IP I use to reach the internet; it is responsible solely for interacting with Google Drive and nothing else. Every "extra fancy thing" they add is a potential point of breakage.

But that's not the topic of this question. I tackle that "file loading thing" in this other question: How do I read a file from Google Drive to be locally used? - The question here is: once I have the file in the ReadFile step, how do I forward it to the OpenAI piece?
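If there is no direct way to wire the two steps together, I imagine a Code step could call the Whisper transcription endpoint itself. Below is a sketch of what that might look like; the input shape (base64 from the previous step, an `apiKey` input) and the file name are my assumptions, not the OpenAI piece's actual contract:

```typescript
// Send audio bytes to OpenAI's transcription endpoint (Whisper).
export const code = async (inputs: { base64: string; apiKey: string }) => {
  const audio = Buffer.from(inputs.base64, 'base64');

  // Whisper expects multipart/form-data with a "model" and a "file" field.
  const form = new FormData();
  form.append('model', 'whisper-1');
  form.append('file', new Blob([audio], { type: 'audio/mpeg' }), 'audio.mp3');

  const response = await fetch('https://api.openai.com/v1/audio/transcriptions', {
    method: 'POST',
    headers: { Authorization: `Bearer ${inputs.apiKey}` },
    body: form,
  });
  if (!response.ok) {
    throw new Error(`Whisper request failed: ${response.status} ${await response.text()}`);
  }

  // Response looks like { text: "transcribed speech..." }
  return await response.json();
};
```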

I'd be more than happy to discuss architecture design in a video call with whoever is responsible for it. But my previous experience with other open-source tools such as n8n, Mautic and others, where I found architectural failures, was "this is how it is; if you don't like it, don't use it". Let's see if ActivePieces is different.

I wonder if ActivePieces follows the 12-factor principles (https://12factor.net/) or the newer 15-factor approach from Beyond the Twelve-Factor App [Book] - intro video here: https://www.youtube.com/watch?v=hV-mTITits0