Ollama LLM Server

⚠️ Security: Depends · Status: Active · 🤖 Category: AI/ML

📋 Basic Information

Port Number
11434
Protocol
TCP
Service
Ollama
Popularity
🔥 High
Description
Ollama local LLM server for running open-source language models
Tags
ai, llm, local, open-source, api

🔒 Security Information

💡 Security Notes

Ollama binds to localhost (127.0.0.1) by default. If external access is needed (e.g. by setting OLLAMA_HOST to 0.0.0.0), restrict the port with a firewall; the API has no built-in authentication, so consider placing a reverse proxy with auth in front of it.
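If the server must be reachable from other hosts, a minimal sketch of a safer setup looks like the following (assumptions: a systemd-based Linux install and `ufw` as the firewall; the subnet is an example, adjust for your network):

```shell
# Bind Ollama to all interfaces via a systemd drop-in override
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/override.conf <<'EOF'
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
EOF

# Reload systemd and restart the service so the override takes effect
sudo systemctl daemon-reload
sudo systemctl restart ollama

# Allow only a trusted subnet to reach port 11434 (example subnet)
sudo ufw allow from 192.168.1.0/24 to any port 11434 proto tcp
```

Leaving the default localhost-only binding in place and tunneling over SSH is a simpler alternative when only one remote user needs access.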

🛠️ Installation

🐧 Linux

#!/bin/bash
# Ollama Installation Script for Ubuntu/Debian

# 1. Download and install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# 2. Start Ollama service
sudo systemctl start ollama
sudo systemctl enable ollama

# 3. Verify installation
ollama --version

# 4. Pull a model (example: llama2; this is a multi-GB download)
ollama pull llama2

# 5. Test the server
curl http://localhost:11434/api/version

echo "✅ Ollama installed successfully on port 11434!"
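Once the service is running, the HTTP API on port 11434 can be exercised directly. A minimal sketch (assumes the llama2 model has already been pulled as in step 4):

```shell
# Request a non-streaming completion from the local Ollama API
curl -s http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'

# List the models available on this server
curl -s http://localhost:11434/api/tags
```

With `"stream": false` the server returns a single JSON object containing the full response; omit it to receive a stream of JSON lines as tokens are generated.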

🍎 macOS

#!/bin/bash
# Ollama Installation Script for macOS

# 1. Install via Homebrew
brew install ollama

# 2. Start Ollama as a background service managed by Homebrew
brew services start ollama

# 3. Pull a model
ollama pull llama2

echo "✅ Ollama installed successfully on port 11434!"
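Day-to-day model management is done with the same `ollama` CLI on either platform; a quick sketch of the common commands (assumes `ollama` is on your PATH and the server is running):

```shell
# Show models downloaded to this machine
ollama list

# Start an interactive chat session with a model
ollama run llama2

# Remove a model to reclaim disk space
ollama rm llama2
```

Model weights are stored locally (several GB each), so `ollama rm` is the usual way to free space after experimenting.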

🔗 Related Ports