System Requirements
- macOS Sonoma (v14) or newer
- Apple M series (CPU and GPU support) or x86 (CPU only)
Filesystem Requirements
The preferred method of installation is to mount the `ollama.dmg` and drag and drop the Ollama application into the system-wide Applications folder. On startup, the Ollama app verifies that the `ollama` CLI is present in your PATH; if it is not detected, the app will prompt for permission to create a link in `/usr/local/bin`.
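To confirm the CLI is reachable after installation, you can check your PATH from a terminal. This is a minimal sketch; the commented-out `ln` command shows the kind of link the app would create, assuming the default Applications install location:

```shell
# Check whether the ollama CLI is already reachable on PATH
if command -v ollama >/dev/null 2>&1; then
  echo "ollama CLI found at: $(command -v ollama)"
else
  # If the app did not create the link, it can be created manually
  # (assumes the app lives in /Applications; sudo is usually required):
  # sudo ln -s /Applications/Ollama.app/Contents/Resources/ollama /usr/local/bin/ollama
  echo "ollama CLI not on PATH"
fi
```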
Once you’ve installed Ollama, you’ll need additional space to store large language models, which can range from tens to hundreds of GB in size. If your home directory doesn’t have enough space, you can change where the binaries are installed and where the models are stored.
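One way to move model storage is the `OLLAMA_MODELS` environment variable, which Ollama reads at startup. The sketch below uses `launchctl setenv` so the variable is visible to GUI apps on macOS; the target path is an example only, substitute a volume with enough free space:

```shell
# Relocate model storage via OLLAMA_MODELS. The path is an example.
MODELS_DIR="/Volumes/BigDisk/ollama-models"
if command -v launchctl >/dev/null 2>&1; then
  # launchctl setenv makes the variable visible to GUI apps on macOS;
  # restart the Ollama app afterwards so it picks up the new value.
  launchctl setenv OLLAMA_MODELS "$MODELS_DIR"
  echo "OLLAMA_MODELS set to $MODELS_DIR"
else
  echo "launchctl is macOS-only; elsewhere, export OLLAMA_MODELS instead"
fi
```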
Changing Install Location
To install the Ollama application somewhere other than `Applications`, place the Ollama application in the desired location, and make sure the CLI at `Ollama.app/Contents/Resources/ollama`, or a symlink to it, can be found in your PATH. On first start, decline the “Move to Applications?” request.
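For a custom install location, linking the bundled CLI onto your PATH can be sketched as follows. `APP_DIR` is an assumed example path; use wherever you actually placed Ollama.app:

```shell
# Expose the CLI from a custom install location. APP_DIR is an
# assumed example path -- use wherever you placed Ollama.app.
APP_DIR="$HOME/Apps"
CLI="$APP_DIR/Ollama.app/Contents/Resources/ollama"
echo "CLI path: $CLI"
# Writing to /usr/local/bin usually requires sudo:
# sudo ln -s "$CLI" /usr/local/bin/ollama
```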
Troubleshooting
Ollama on macOS stores files in a few different locations:
- `~/.ollama` contains models and configuration
- `~/.ollama/logs` contains logs
  - `app.log` contains the most recent logs from the GUI application
  - `server.log` contains the most recent server logs
- `<install location>/Ollama.app/Contents/Resources/ollama` is the CLI binary
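When troubleshooting, the server log is usually the most useful of the files above. A minimal way to inspect it:

```shell
# View the most recent server log from the locations listed above.
LOG="$HOME/.ollama/logs/server.log"
if [ -f "$LOG" ]; then
  tail -n 50 "$LOG"   # print the last 50 lines of the server log
else
  echo "no server log found at $LOG"
fi
```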

