Setup & Installation
Setting Up Your Local LLM Environment
This section guides you through installing and configuring Ollama with its native GUI. By the end, you'll have a fully functional local LLM environment ready to use.
What is Ollama?
Ollama is an open-source tool that makes it easy to run large language models locally on your computer. As of 2025, Ollama includes:
- Native GUI Application: User-friendly desktop interface
- Model Library: One-click access to dozens of open-source models
- CLI Tools: Command-line interface for advanced users
- REST API: For integration with other applications (see the Python sketch after this list)
- Cross-Platform: Works on macOS, Windows, and Linux
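If you plan to integrate Ollama with other tools, the REST API is the usual entry point. The sketch below is a minimal example, assuming Ollama is running on its default port (11434) and that a model named `llama3.2` has already been pulled; the endpoint and field names match the current Ollama API but may change between releases.

```python
# Minimal sketch: call Ollama's local REST API using only the standard library.
# Assumes the Ollama server is running on localhost:11434 and the
# "llama3.2" model has already been downloaded.
import json
import urllib.request

request_body = json.dumps({
    "model": "llama3.2",
    "prompt": "Explain what a local LLM is in one sentence.",
    "stream": False,          # ask for a single JSON response instead of a stream
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=request_body,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read())

print(result["response"])     # the generated text
```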
Why Ollama?
Ollama has become the de facto standard for local LLMs because it:
✓ Simplifies Installation: One download, everything included
✓ Manages Models: Easy downloading, updating, and switching between models (see the sketch after this list)
✓ Optimizes Performance: Automatically configures for your hardware
✓ Provides Flexibility: GUI for beginners, CLI for experts
✓ Stays Updated: Regular updates with the latest models and features
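To illustrate the model-management point above, here is a small sketch that asks a running Ollama server which models are installed locally. It assumes the default port and the `/api/tags` endpoint exposed by current Ollama releases; treat it as a quick sanity check, not an official client.

```python
# Sketch: list the models Ollama currently has installed locally.
# Assumes Ollama is running on its default port (11434).
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    data = json.loads(resp.read())

for model in data.get("models", []):
    size_gb = model["size"] / 1e9
    print(f'{model["name"]:30s} {size_gb:.1f} GB')
```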
What You’ll Install
- Ollama Application: The main program with GUI and CLI
- Llama 3.2 Model: Our primary model for the workshop (2-3 GB download; a scripted pull example follows this list)
- (Optional) Additional Models: For experimentation
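You can pull Llama 3.2 from the GUI or with `ollama pull llama3.2` in a terminal; the sketch below does the same thing through the REST API, which is handy for scripted setups. It assumes a running Ollama server on the default port and the `/api/pull` endpoint as documented in current releases; with streaming disabled the call blocks until the download finishes, so expect it to take several minutes.

```python
# Sketch: download (pull) the llama3.2 model through Ollama's REST API.
# Roughly equivalent to running `ollama pull llama3.2` in a terminal.
# Assumes Ollama is running on localhost:11434; the call blocks until done.
import json
import urllib.request

request_body = json.dumps({
    "model": "llama3.2",      # parameter name per current API docs
    "stream": False,          # wait for one final status instead of progress events
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/pull",
    data=request_body,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read()))   # expect something like {"status": "success"}
```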
System Requirements Recap
Before proceeding, ensure your system meets these requirements (a quick check script follows the table):
| Component | Minimum | Recommended |
|---|---|---|
| RAM | 8 GB | 16 GB |
| Storage | 20 GB free | 50 GB free |
| OS | macOS 12+, Windows 11, Linux | Latest versions |
| Internet | Broadband connection | Required for initial downloads only |
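If you prefer to verify the table above programmatically, the following sketch checks free disk space with the standard library and, on macOS and Linux, estimates total RAM from `os.sysconf`. The thresholds mirror the minimum column and are assumptions you can adjust; on Windows, check RAM through Task Manager instead.

```python
# Sketch: rough check of the minimum requirements from the table above.
# The disk check is cross-platform; the RAM check uses os.sysconf and
# therefore only works on macOS/Linux.
import os
import platform
import shutil

MIN_RAM_GB = 8       # minimum RAM from the table above
MIN_DISK_GB = 20     # free space needed for Ollama plus a model or two

free_disk_gb = shutil.disk_usage(os.path.expanduser("~")).free / 1e9
print(f"OS: {platform.system()} {platform.release()}")
print(f"Free disk space: {free_disk_gb:.0f} GB "
      f"({'OK' if free_disk_gb >= MIN_DISK_GB else 'below minimum'})")

if hasattr(os, "sysconf") and "SC_PHYS_PAGES" in os.sysconf_names:
    total_ram_gb = os.sysconf("SC_PHYS_PAGES") * os.sysconf("SC_PAGE_SIZE") / 1e9
    print(f"Total RAM: {total_ram_gb:.0f} GB "
          f"({'OK' if total_ram_gb >= MIN_RAM_GB else 'below minimum'})")
else:
    print("RAM check skipped (not supported on this platform).")
```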
Setup Process Overview
The setup process has three main steps:
1. Install Ollama: Download and run the installer for your platform
2. Download a Model: Pull Llama 3.2 through the GUI or CLI
3. Test the Setup: Run a short prompt to confirm everything works
Estimated Time
- Installation: 5-10 minutes
- Model Download: 5-15 minutes (depending on internet speed)
- Testing: 5 minutes (see the smoke-test sketch after this list)
- Total: 15-30 minutes
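When you reach the testing step, you can verify the whole stack end to end from Python. This sketch first confirms that the server answers and that `llama3.2` is installed, then times a short generation; it assumes the default port and the `/api/tags` and `/api/generate` endpoints shown earlier.

```python
# Sketch: end-to-end smoke test for the finished setup.
# 1) Is the Ollama server reachable?  2) Is llama3.2 installed?
# 3) Can it answer a short prompt?  Assumes the default port 11434.
import json
import time
import urllib.request

BASE = "http://localhost:11434"
MODEL = "llama3.2"

# Steps 1 + 2: server reachability and installed models
with urllib.request.urlopen(f"{BASE}/api/tags") as resp:
    names = [m["name"] for m in json.loads(resp.read()).get("models", [])]
assert any(n.startswith(MODEL) for n in names), f"{MODEL} not found; pull it first"

# Step 3: a short, timed generation
body = json.dumps({"model": MODEL, "prompt": "Say hello in five words.",
                   "stream": False}).encode("utf-8")
req = urllib.request.Request(f"{BASE}/api/generate", data=body,
                             headers={"Content-Type": "application/json"})
start = time.time()
with urllib.request.urlopen(req) as resp:
    answer = json.loads(resp.read())["response"]
print(f"Model replied in {time.time() - start:.1f}s: {answer.strip()}")
```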
Ready to Start?
Let’s begin with Installation!