Insecure usage recommendations #173
Would the dev be open to someone (or me) compiling instructions for how to set up with Tailscale? With Tailscale, you should be able to access your home instance from any of your devices connected to your tailnet.
If privacy is the reason for running ollama locally, I wouldn't use a privately owned VPN company. But with regards to security, it might be good enough for most. I just want to access ollama on 127.0.0.1:11434. Why go to the trouble of using VPNs and online services when it's right there on the same machine? Am I missing something?
If I'm traveling, having a dedicated home server is going to be much more powerful than running locally. For the VPN, with Tailscale, you're the VPN provider. Tailscale just initiates the connection.
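As a setup sketch (not tested end to end, and the IP is a placeholder), the Tailscale route would look roughly like this:

```shell
# On the server running Ollama, and on each client device:
tailscale up

# On the server, find its tailnet address (a 100.x.y.z IP):
tailscale ip -4

# Then point Enchanted's server URI at http://<that-ip>:11434
```

One caveat: Ollama binds only to loopback by default, so to reach it over the tailnet you'd likely also need to change its listen address (e.g. set the `OLLAMA_HOST` environment variable) or front it with something like `tailscale serve`.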
I see. I admit I only took a cursory look at Tailscale. I do see the usefulness of a setup like that for mobile use, I'm just naturally suspicious of venture capital funded companies.
For what it's worth, you could also run the open-source alternative Headscale.
Thanks for the tip, that's more down my alley :)
I have the same question. Opening up my Ollama server to the world seems like a bad idea. I am on my Mac, and what I want to do is:
Is this possible? How?
I had a look at the code, and it does actually support running against a local Ollama server! You just have to point the server URI at your local instance.
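For anyone wanting to sanity-check that locally first, here is a minimal Python sketch against Ollama's `GET /api/tags` endpoint (the loopback address is Ollama's default listen address; nothing here is Enchanted-specific):

```python
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434"  # Ollama's default listen address

def list_local_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Return the names of models installed on a local Ollama server,
    using its GET /api/tags endpoint."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        data = json.load(resp)
    return [model["name"] for model in data.get("models", [])]
```

If this returns a list of model names instead of raising, Enchanted pointed at the same URI should work too.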
Let's assume you're running Ollama on a local Ubuntu server and not in a container. You can use Caddy as a reverse proxy to reach the server on your local network; you'll need to edit the Caddyfile (/etc/caddy/Caddyfile).
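A minimal Caddyfile for that might look like the following (the hostname is an example, not from this thread; the `header_up` line is a commonly suggested workaround for Ollama rejecting requests with an unexpected Host header):

```caddyfile
# /etc/caddy/Caddyfile
ollama.example.com {
    reverse_proxy 127.0.0.1:11434 {
        # Forward a Host header Ollama expects
        header_up Host 127.0.0.1:11434
    }
}
```

For a public hostname Caddy obtains a certificate automatically; on a purely local network you'd use `tls internal` or a plain `http://` site address instead.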
However, the Tailscale approach is still easier, since you can access the server from anywhere and you don't have to keep switching addresses in Enchanted: just replace the address with your server's Tailscale IP. Caddy will also provision TLS certificates for a secure connection.
I was trying a similar approach. I have domain-based hosting on my server, and I wanted to see if there was a way to secure this, say using OAuth. I have Authelia set up with a lot of my services and it works really well. Wondering if something like this could be done, so that my local instance could be shared with friends and family securely.
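Caddy can front Ollama with Authelia via its `forward_auth` directive. A hedged sketch, assuming Authelia is reachable at localhost:9091 (hostnames and ports are examples, and the endpoint path depends on your Authelia version):

```caddyfile
ollama.example.com {
    # Every request is checked against Authelia before being proxied
    forward_auth localhost:9091 {
        uri /api/authz/forward-auth
        copy_headers Remote-User Remote-Groups Remote-Name Remote-Email
    }
    reverse_proxy 127.0.0.1:11434
}
```

Note that Enchanted itself would need to present the session cookie or credentials Authelia expects, which may be the harder part for a native client.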
My browser on iOS can connect to http://<server-ip>:11434, so why can't Enchanted? I agree with others here that exposing a public URL is risky.
Exposing ollama on a public port is a bad idea.
https://thehackernews.com/2024/11/critical-flaws-in-ollama-ai-framework.html
Why can't Enchanted access ollama on 127.0.0.1?
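The distinction is easy to check: Ollama's default bind is loopback-only, so a plain TCP probe tells "reachable from this machine" apart from "exposed to the network". A small sketch:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

With a default Ollama install, `port_open("127.0.0.1", 11434)` succeeds while the same probe against the machine's LAN IP fails, which is exactly why a client on the same machine should be able to use 127.0.0.1 directly instead of a public URL.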