| Command | Function | Description |
|---|---|---|
| ollama pull <model> | Download model | Download a model. e.g. ollama pull llama3.2 |
| ollama pull <model>:<tag> | Pull specific version | Pull a specific model version. e.g. llama3.2:1b |
| ollama list | List local models | Show all downloaded models. |
| ollama rm <model> | Remove model | Delete a downloaded model to free disk space. |
| ollama show <model> | Model info | Show model parameters, template, and system prompt. |
| ollama cp <src> <dst> | Copy model | Duplicate a model under a new name. |
| ollama push <model> | Publish model | Push a custom model to the Ollama registry. |
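The management commands above compose into a typical lifecycle. A minimal sketch, assuming you want the hypothetical 1B variant of llama3.2 and that `ollama` is installed (requires a network connection for the pull):

```shell
ollama pull llama3.2:1b        # download a specific version
ollama list                    # confirm it appears among local models
ollama show llama3.2:1b        # inspect parameters, template, system prompt
ollama cp llama3.2:1b my-base  # duplicate it under a new name
ollama rm my-base              # remove the copy to reclaim disk space
```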
| Command | Function | Description |
|---|---|---|
| ollama run <model> | Run interactive | Start an interactive chat with the model. |
| ollama run <model> "prompt" | Single prompt | Run a one-shot prompt and exit. |
| cat file.txt \| ollama run <model> | Pipe input | Pipe file content as the model input. |
| /bye | Exit chat | Exit the interactive chat session. |
| /clear | Clear history | Clear the current conversation history. |
| /set system <text> | Set system prompt | Assign a system prompt during chat. |
| /show info | Model details | Display current model information mid-session. |
| /show modelfile | Show Modelfile | Print the Modelfile for the current model. |
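The two non-interactive forms above can be combined in scripts. A sketch, assuming a locally pulled llama3.2 model and a hypothetical `notes.txt` file:

```shell
# One-shot prompt: prints the response and exits.
ollama run llama3.2 "Explain the difference between TCP and UDP in one sentence."

# Piped input: stdin becomes the model's input, so no chat session is opened.
cat notes.txt | ollama run llama3.2
```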
| Command | Function | Description |
|---|---|---|
| ollama serve | Start server | Start the Ollama API server on port 11434. |
| OLLAMA_HOST=0.0.0.0 ollama serve | Expose server | Allow external connections to the Ollama server. |
| curl localhost:11434/api/generate | Generate API | Call the REST API to generate a response. |
| curl localhost:11434/api/chat | Chat API | Call the REST API for a chat conversation. |
| curl localhost:11434/api/tags | List models API | List all available models via REST API. |
| OLLAMA_MODELS=<path> | Custom model dir | Set a custom path for storing model files. |
| OLLAMA_NUM_PARALLEL=4 | Parallel requests | Set number of concurrent inference requests. |
| OLLAMA_MAX_LOADED_MODELS=2 | Max loaded models | Set maximum models kept in GPU memory. |
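The REST endpoints above accept JSON POST bodies. A minimal stdlib-only sketch of building a `/api/generate` request; the model name `llama3.2` is an assumption (use any model you have pulled), and actually sending the request requires `ollama serve` to be running:

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> request.Request:
    """Build a POST request for Ollama's /api/generate endpoint.

    "stream": False asks for one complete JSON response instead of
    a stream of newline-delimited partial chunks.
    """
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,
    }).encode("utf-8")
    return request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("llama3.2", "Why is the sky blue?")

# To actually call the server (only works while `ollama serve` is up):
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```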
| Command | Function | Description |
|---|---|---|
| FROM <model> | Base model | Specify the base model to build from. |
| SYSTEM "<text>" | System prompt | Set a custom system prompt for the model. |
| PARAMETER temperature 0.7 | Temperature | Set response randomness; higher values are more creative (default 0.8). |
| PARAMETER num_ctx 4096 | Context length | Set the context window size in tokens. |
| TEMPLATE "{{ .Prompt }}" | Prompt template | Define a custom prompt formatting template. |
| ollama create <name> -f Modelfile | Build model | Create a custom model from a Modelfile. |
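The directives above combine into a complete Modelfile. A minimal sketch; the model name and system prompt are illustrative placeholders:

```
FROM llama3.2
SYSTEM "You are a concise coding assistant. Answer with short, runnable examples."
PARAMETER temperature 0.7
PARAMETER num_ctx 4096
```

Saved as `Modelfile`, it can then be built and run with `ollama create my-assistant -f Modelfile` followed by `ollama run my-assistant`.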