Your own private search engine, connected to your local LLMs
This guide was written and tested on Ubuntu 24.04 LTS running inside a Proxmox LXC container. It should work on other Debian-based distros with minor adjustments. The MCP client side was tested with LM Studio on macOS.
Swap in your own IP addresses, subnets, and paths where noted. Do not blindly copy-paste network configs unless they match your environment. Always snapshot or back up before making system-level changes.
What Are We Building Here?
This guide gets you three things:
- SearXNG running in a Docker container on your server. It is a privacy-respecting metasearch engine that pulls results from Google, Brave, DuckDuckGo, and dozens of other engines without tracking you.
- A locked-down firewall using UFW so only your local network can reach the search engine and SSH.
- MCP (Model Context Protocol) integration with LM Studio so your locally-hosted LLMs can search the web through SearXNG. No cloud APIs. No data leaving your network. Your models get web access, and you keep your privacy.
The Architecture
+-------------------+   MCP (local process)   +-------------------+     HTTP JSON API      +-------------------+
|                   |                         |                   |    (over your LAN)     |                   |
|  LM Studio (LLM)  | <---------------------> |    MCP Server     | <--------------------> |      SearXNG      |
|   (your Mac/PC)   |                         |   (your Mac/PC)   |                        |   (your server)   |
|                   |                         |    Node.js app    |                        | Docker container  |
+-------------------+                         +-------------------+                        +-------------------+
The LLM runs on your machine. The MCP server is a lightweight Node.js bridge that runs on the same machine as LM Studio. SearXNG lives on your server. When the LLM wants to search the web, it asks the MCP server, which queries SearXNG's JSON API, which fetches results from dozens of search engines and sends them back. Everything stays on your LAN except the anonymized search queries SearXNG sends out.
Prerequisites
Before you start, you will need:
- A Linux server or LXC container running Ubuntu 24.04 (or similar Debian-based distro)
- A user account with sudo access
- Network connectivity (your server needs internet access for Docker pulls and SearXNG to query search engines)
- For the MCP part: A Mac or PC running LM Studio with Node.js installed
If you are running this inside an LXC container, you must enable the nesting feature on the container. Without it, Docker will not work inside LXC. In the Proxmox web UI: select your container > Options > Features > set nesting=1. The container can be unprivileged. Just make sure nesting is on.
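If you prefer the command line over the web UI, the same flag can be set with pct on the Proxmox host. The CTID 101 below is a placeholder; use your container's actual ID:

```shell
# Run on the Proxmox host, not inside the container.
# 101 is a placeholder CTID -- substitute your own.
pct set 101 --features nesting=1

# Verify the feature is set:
pct config 101 | grep features
```

Stop and start the container afterwards so the feature takes effect.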
Part 1: Install Docker
Reference: Docker official install docs for Ubuntu
Install prerequisites:
sudo apt-get update
sudo apt-get install -y ca-certificates curl gnupg
Add Docker's official GPG key and repository:
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg \
-o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] \
https://download.docker.com/linux/ubuntu \
$(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \
sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
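Optionally, sanity-check that apt now sees Docker's repository before installing anything; the exact version strings will differ on your system:

```shell
sudo apt-get update
apt-cache policy docker-ce
# The "Candidate:" line should reference a version from download.docker.com.
```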
Install Docker Engine:
sudo apt-get update
sudo apt-get install -y docker-ce docker-ce-cli containerd.io \
docker-buildx-plugin docker-compose-plugin
Add your user to the docker group:
Per the SearXNG container docs, you need your user in the docker group:
sudo usermod -aG docker $USER
The group change takes effect on your next login. Until then, prefix docker commands with sudo.
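If you want sudo-less docker right away without logging out, you can start a subshell with the new group active (an optional convenience):

```shell
# Opens a new shell in which the docker group membership applies.
newgrp docker

# This should now work without sudo:
docker run --rm hello-world
```

The group only applies inside that subshell; after your next full login it applies everywhere.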
Verify Docker is running:
sudo docker --version
sudo docker info --format '{{.ServerVersion}}'
You should see the Docker version printed. If not, check sudo systemctl status docker.
Part 2: Install SearXNG
Reference: SearXNG Container Installation Docs
Pull the SearXNG image:
sudo docker pull docker.io/searxng/searxng:latest
If you hit DockerHub rate limits, use the GHCR mirror instead: sudo docker pull ghcr.io/searxng/searxng:latest
Create directories for config and persistent data:
mkdir -p ~/searxng/config ~/searxng/data
Run the SearXNG container:
Replace YOUR_SERVER_IP with your server's actual IP address. Binding to the specific IP (instead of 0.0.0.0) means Docker only listens on that interface, which gives you tighter control.
sudo docker run --name searxng -d \
-p YOUR_SERVER_IP:8888:8080 \
-v "$HOME/searxng/config/:/etc/searxng/" \
-v "$HOME/searxng/data/:/var/cache/searxng/" \
docker.io/searxng/searxng:latest
For example, if your server IP is 192.168.1.100:
sudo docker run --name searxng -d \
-p 192.168.1.100:8888:8080 \
-v "$HOME/searxng/config/:/etc/searxng/" \
-v "$HOME/searxng/data/:/var/cache/searxng/" \
docker.io/searxng/searxng:latest
Make it survive reboots:
By default, the container will not start back up after a reboot. Fix that:
sudo docker update --restart unless-stopped searxng
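You can confirm the policy took effect by inspecting the container:

```shell
sudo docker inspect -f '{{ .HostConfig.RestartPolicy.Name }}' searxng
# Should print: unless-stopped
```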
Verify it is running:
sudo docker container list
curl -s -o /dev/null -w "%{http_code}" http://YOUR_SERVER_IP:8888/
You should see the container listed and an HTTP 200 response. Open http://YOUR_SERVER_IP:8888 in a browser to see the SearXNG search page.
You will see a couple of errors in the logs about ahmia and torch engines failing to load. Those are Tor-based search engines and they need a Tor proxy to work. Totally fine to ignore unless you plan to search .onion sites. Check logs anytime with: sudo docker logs searxng
Part 3: Lock It Down with UFW
Now let us make sure only your local network can reach this thing.
Set up the defaults:
sudo ufw default deny incoming
sudo ufw default allow outgoing
This blocks everything coming in and allows everything going out (SearXNG needs outbound access to query search engines).
Allow SSH from your LAN only:
Replace the subnet below with your actual LAN subnet.
sudo ufw allow from 192.168.1.0/24 to any port 22 proto tcp comment 'SSH from LAN'
Allow SearXNG from your LAN only:
sudo ufw allow from 192.168.1.0/24 to any port 8888 proto tcp comment 'SearXNG from LAN'
Enable the firewall:
sudo ufw --force enable
Verify your rules:
sudo ufw status verbose
You should see something like:
Status: active
Default: deny (incoming), allow (outgoing), deny (routed)
To Action From
-- ------ ----
22/tcp ALLOW IN 192.168.1.0/24 # SSH from LAN
8888/tcp ALLOW IN 192.168.1.0/24 # SearXNG from LAN
Docker manipulates iptables directly and can bypass UFW rules for published ports. This is a well-known issue. By binding the container to a specific IP address (as we did in Part 2 instead of 0.0.0.0), you add an extra layer of control. The container only listens on your LAN interface, and UFW handles the rest. If you are running a public-facing instance, you would want to dig deeper into the Docker/UFW conflict. For a private LAN setup, what we have done here is solid.
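To verify the lockdown in practice (192.168.1.100 and 192.168.1.0/24 are placeholders for your own server and subnet):

```shell
# From a machine ON your LAN -- should print 200:
curl -s -o /dev/null -w '%{http_code}\n' --max-time 5 http://192.168.1.100:8888/

# From a host outside the allowed subnet (a VPS, or a phone on
# cellular data), the same command should time out with curl exit
# code 28 instead of connecting.
```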
Part 4: Enable the JSON API
If you plan to connect SearXNG to LM Studio (or any other tool that talks to it programmatically), you need to enable the JSON output format. By default, only HTML is enabled.
Stop the container first:
sudo docker container stop searxng
Fix the file permissions so you can edit it:
The container's internal searxng user owns the config file. Change ownership temporarily:
sudo chown $USER:$USER ~/searxng/config/settings.yml
Edit the settings file:
Open ~/searxng/config/settings.yml in your preferred editor and find the formats section (around line 85):
formats:
  - html
Change it to:
formats:
  - html
  - json
Save the file.
The container will re-own this file on startup (via the $FORCE_OWNERSHIP setting), but it will not overwrite the contents as long as the file already exists. If you ever delete settings.yml and restart, it will regenerate from the template and you will lose your changes. That is why we back it up later.
Start the container back up:
sudo docker container start searxng
Verify the JSON API works:
curl -s 'http://YOUR_SERVER_IP:8888/search?q=test&format=json' | head -c 200
You should see JSON output with search results. If you get a 403 Forbidden, double-check that json is in the formats list and that the YAML indentation is correct (two spaces, not tabs).
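If you have jq installed (sudo apt-get install -y jq), you can pull readable fields out of the response; the query term here is just a placeholder:

```shell
# Print the title and URL of the first three results.
curl -s 'http://YOUR_SERVER_IP:8888/search?q=linux&format=json' \
  | jq -r '.results[:3][] | .title + " - " + .url'
```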
Part 5: Connect to LM Studio via MCP
This is where it gets fun. We are going to give your local LLMs the ability to search the web.
What is MCP?
MCP (Model Context Protocol) is a standard that lets LLMs call external tools. Think of it like giving your AI a set of abilities beyond just chatting. In our case, we are giving it the ability to search the web and read web pages.
How it works in this setup:
- LM Studio runs an LLM locally on your Mac/PC
- LM Studio launches a small MCP server process (a Node.js app) in the background
- When the LLM wants to search, it calls a tool through MCP
- The MCP server sends the search query to your SearXNG instance over your LAN
- SearXNG fetches results from dozens of search engines and returns them as JSON
- The MCP server passes the results back to the LLM
- The LLM reads the results and responds to you
Nothing leaves your network except the anonymized search queries SearXNG sends to the search engines. Your prompts, your conversations, your data - all local.
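Under the hood, each search tool call is just an HTTP GET against the JSON API you enabled in Part 4. Here is a minimal sketch of how that URL is built, assuming jq is available for URL-encoding and using a placeholder server address:

```shell
SEARXNG_URL="http://192.168.1.100:8888"   # placeholder -- your server

# Build the request URL the way the MCP bridge does: URL-encode the
# query (here via jq) and ask for JSON output.
build_search_url() {
  local q
  q=$(printf '%s' "$1" | jq -sRr @uri)
  printf '%s/search?q=%s&format=json\n' "$SEARXNG_URL" "$q"
}

build_search_url "linux kernel updates"
# -> http://192.168.1.100:8888/search?q=linux%20kernel%20updates&format=json
```

Feed that URL to curl and you get the same JSON payload the MCP server parses and hands back to the model.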
Prerequisites on your Mac/PC:
You need Node.js installed. On macOS with Homebrew:
brew install node
Verify it is working:
node --version
npx --version
Both should return version numbers.
Configure LM Studio:
In LM Studio, open the MCP configuration. You can find this in the developer/MCP settings area (look for the MCP icon or check under Settings). You will see a JSON config file.
Add the following configuration:
{
  "mcpServers": {
    "searxng": {
      "command": "npx",
      "args": ["-y", "mcp-searxng"],
      "env": {
        "SEARXNG_URL": "http://YOUR_SERVER_IP:8888"
      }
    }
  }
}
Replace YOUR_SERVER_IP with your SearXNG server's IP address. For example: http://192.168.1.100:8888
If you already have other MCP servers configured, just add the "searxng": { ... } block inside the existing "mcpServers" object.
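For instance, a config that already had another server (the "filesystem" entry below is purely illustrative) would end up looking like this:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you"]
    },
    "searxng": {
      "command": "npx",
      "args": ["-y", "mcp-searxng"],
      "env": {
        "SEARXNG_URL": "http://192.168.1.100:8888"
      }
    }
  }
}
```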
The npm package we are using is mcp-searxng. The npx -y command will download and run it automatically the first time.
What tools does the LLM get?
Once connected, your LLM will have access to two tools:
- searxng_web_search - Search the web with support for pagination, time filtering, language selection, and safe search
- web_url_read - Fetch a URL and convert the page content to markdown so the LLM can actually read web pages
Test it out:
Load up a model in LM Studio, make sure the searxng MCP server shows as connected (you should see it in the integrations panel), and ask something like:
"Search the web for the latest news about Linux kernel updates"
The LLM should call searxng_web_search, get results back, and summarize them for you. You might see it call web_url_read if it wants to dig into a specific page for more detail.
Part 6: Back Up Your Configs
Before you call it done, grab copies of the important files:
mkdir -p ~/backups
sudo cp ~/searxng/config/settings.yml ~/backups/settings.yml.bak
sudo cp /etc/ufw/user.rules ~/backups/ufw-user.rules.bak
sudo cp /etc/ufw/user6.rules ~/backups/ufw-user6.rules.bak
sudo chown $USER:$USER ~/backups/*
If you are on Proxmox, this is also a great time to take a container backup from the Proxmox UI.
Useful Management Commands
SearXNG Container:
# View logs
sudo docker logs searxng
# Restart (after config changes)
sudo docker container restart searxng
# Stop
sudo docker container stop searxng
# Start
sudo docker container start searxng
# Remove and recreate (if needed)
sudo docker container stop searxng
sudo docker container rm searxng
# Then re-run the docker run command from Part 2
# Shell into the container for troubleshooting
sudo docker container exec -it --user root searxng /bin/sh -l
# Update to latest image
sudo docker pull docker.io/searxng/searxng:latest
sudo docker container stop searxng
sudo docker container rm searxng
# Then re-run the docker run command from Part 2
UFW Firewall:
# Check status
sudo ufw status verbose
# Check status with rule numbers (for deleting)
sudo ufw status numbered
# Delete a specific rule
sudo ufw delete [rule number]
# Disable firewall temporarily
sudo ufw disable
# Re-enable firewall
sudo ufw enable
What You've Got Now
| Component | Details |
|---|---|
| SearXNG | Self-hosted, dozens of search engines, private, no tracking |
| Docker | Auto-starts on boot, container set to unless-stopped |
| UFW Firewall | Deny all incoming, SSH + SearXNG allowed from LAN only |
| JSON API | Enabled for programmatic access |
| MCP Integration | LM Studio connected via mcp-searxng, two tools available |
| Privacy | LLM runs locally, search is anonymized, no data leaves your network |
Your local LLMs can now search the web through your own infrastructure. No API keys needed. No cloud services involved. No one training on your queries. Just your hardware, your network, your data.
Troubleshooting
If something broke, the first thing to check is sudo docker logs searxng for SearXNG issues, and the MCP server status in LM Studio for connection problems. Make sure the JSON API is enabled and the firewall is not blocking traffic between your machines.