# Installing Ollama

This page walks you through downloading and installing Ollama with its native GUI interface.
## Step 1: Download Ollama

Visit the official Ollama download page:

[ollama.com/download](https://ollama.com/download)

Select your operating system and download the installer.
## Step 2: Install on Your Operating System

The installation process varies slightly by platform.
### macOS Installation

1. **Download the `.dmg` file** from [ollama.com/download](https://ollama.com/download)

2. **Open the downloaded `.dmg` file**

   Double-click `Ollama-[version].dmg` in your Downloads folder.

3. **Drag Ollama to Applications**

   A window will appear showing the Ollama icon and your Applications folder. Drag the Ollama icon to the Applications folder.

4. **Launch Ollama**

   - Open your Applications folder
   - Double-click the Ollama icon
   - If you see a security warning, click “Open” to confirm

5. **Grant Permissions (if prompted)**

   macOS may ask for permission to run Ollama. Click “OK” or “Allow” to grant the necessary permissions.
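After the drag-and-drop step, you can confirm the install from Terminal. This is a minimal sketch that only assumes the standard `/Applications` install location:

```shell
# Confirm the Ollama app bundle is where macOS expects it
# (assumes the standard /Applications install location)
app_path="/Applications/Ollama.app"
if [ -d "$app_path" ]; then
  status="installed"
else
  status="missing"
fi
echo "Ollama.app: $status"
```

If it reports `missing`, repeat the drag-to-Applications step before continuing.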
### Windows Installation

1. **Download the `.exe` installer** from [ollama.com/download](https://ollama.com/download)

2. **Run the installer**

   Double-click `OllamaSetup.exe` in your Downloads folder.

3. **Approve the User Account Control (UAC) prompt**

   Windows will ask if you want to allow the app to make changes. Click “Yes”.

4. **Follow the installation wizard**

   - Click “Next” to begin
   - Choose an installation location (the default is recommended)
   - Click “Install”
   - Wait for the installation to complete

5. **Launch Ollama**

   - The installer will offer to launch Ollama automatically
   - Or find Ollama in your Start Menu

   Ollama installs to `C:\Users\[YourName]\AppData\Local\Programs\Ollama` by default, and models are stored in `C:\Users\[YourName]\.ollama\models`.

### Linux Installation
1. **Install the Ollama CLI**

   Run the official installation script:

   ```bash
   curl -fsSL https://ollama.com/install.sh | sh
   ```

2. **Verify the installation**

   ```bash
   ollama --version
   ```

3. **Start the Ollama service**

   ```bash
   ollama serve
   ```

   This starts Ollama as a background service on `http://localhost:11434`.
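Once the service is up, you can confirm it is listening by probing its HTTP API on the default port. A minimal sketch, assuming `curl` is available:

```shell
# Probe the Ollama HTTP API on its default port (11434)
if curl -fsS http://localhost:11434/api/version >/dev/null 2>&1; then
  server_status="running"
else
  server_status="not reachable"
fi
echo "Ollama server: $server_status"
```

If it reports `not reachable`, make sure `ollama serve` is still running in another terminal.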
4. **(Optional) Install a GUI alternative**

   For a GUI experience on Linux, consider:

   - **Open WebUI**: a web-based interface

     ```bash
     docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway \
       -v open-webui:/app/backend/data --name open-webui --restart always \
       ghcr.io/open-webui/open-webui:main
     ```

   - **ChatBox**: a desktop application with a Linux GUI
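If you went the Open WebUI route, a quick way to confirm the container started is to filter `docker ps` by the container name used in the run command above. A minimal sketch, guarded so it degrades gracefully when Docker is not installed:

```shell
# Report the Open WebUI container's status, if Docker and the container exist
if command -v docker >/dev/null 2>&1; then
  msg=$(docker ps --filter "name=open-webui" --format "{{.Names}}: {{.Status}}")
  msg="${msg:-no open-webui container running}"
else
  msg="docker not found on PATH"
fi
echo "$msg"
```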
## Step 3: Verify Installation

After installation, verify that Ollama is running.
### Using the GUI (macOS/Windows)

1. **Look for the Ollama icon** in your system tray (Windows) or menu bar (macOS)
2. **Click the icon** to open the Ollama window
3. **Check the interface**: you should see the Ollama chat interface with a welcome message
### Using the CLI (All Platforms)

Open Terminal (macOS/Linux) or Command Prompt (Windows) and run:

```bash
ollama --version
```

Expected output (your version may be later):

```
ollama version is 0.10.0
```

## Step 4: Understanding the Ollama Interface
Once installed, Ollama provides two main interfaces:
### GUI Interface (Recommended for Beginners)
The native Ollama GUI includes:
- Chat Window: Main area for conversations
- New Chat Button: Start fresh conversations
- Model Selector: Dropdown at the bottom to choose and download models
- Message Input: “Send a message” box at the bottom
- Attachment Support: Add files to your conversations (+ button)
### CLI Interface (For Advanced Users)
The command-line interface allows you to:
```bash
ollama list          # List installed models
ollama pull llama3.2 # Download a model
ollama run llama3.2  # Start chatting with a model
ollama rm llama3.2   # Remove a model
ollama serve         # Start the Ollama server
```

## What’s Next?
Now that Ollama is installed, let’s download a model and start chatting.
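The typical first-run flow from the CLI is pull, then run. This sketch uses `llama3.2` as in the examples above, and is guarded so it explains itself if the CLI is not yet on your PATH:

```shell
# First-run workflow: download a model, confirm it, then send a one-shot prompt
if command -v ollama >/dev/null 2>&1; then
  ollama pull llama3.2                             # download the model
  ollama list                                      # confirm it appears
  ollama run llama3.2 "Say hello in one sentence." # one-shot prompt, then exit
  ran="yes"
else
  echo "ollama CLI not found on PATH - see the installation steps above"
  ran="no"
fi
```

Passing a quoted prompt to `ollama run` sends a single message and exits; omit it to open an interactive chat session instead.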
## Having Issues?

If installation didn’t work, check the Troubleshooting guide.