
nvidia-smi (GPU) Shortcuts & Commands — Complete Cheat Sheet

Complete nvidia-smi commands cheat sheet for GPU monitoring, management, and diagnostics on Linux. Essential reference for ML engineers, data scientists, and GPU server administrators.


All nvidia-smi (GPU) Shortcuts

GPU Monitoring

| Command | Action | Description |
| --- | --- | --- |
| `nvidia-smi` | GPU status summary | Show GPU utilization, temperature, memory, and processes. |
| `nvidia-smi -l 1` | Monitor every 1 s | Refresh the GPU status every second. |
| `watch -n 1 nvidia-smi` | Real-time monitor | Real-time GPU monitoring via the `watch` command. |
| `nvidia-smi -q` | Detailed info | Show all detailed GPU information. |
| `nvidia-smi -L` | List GPUs | List all GPUs with their UUIDs. |
| `nvidia-smi pmon` | Process monitor | Monitor per-process GPU usage. |
| `nvidia-smi dmon` | Device monitor | Monitor device metrics every second. |
| `nvidia-smi topo -m` | Topology | Show the GPU NVLink/PCIe topology matrix. |
| `nvidia-smi nvlink -s` | NVLink status | Check NVLink connection status and bandwidth. |
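The query form of these monitoring commands is easy to script. As a minimal sketch, the function below warns when any GPU crosses a temperature threshold (the 85 °C cutoff is an assumed value, not an NVIDIA default). It reads lines shaped like the output of `nvidia-smi --query-gpu=index,temperature.gpu --format=csv,noheader,nounits`; a `printf` sample stands in for live output so the script runs without a GPU — pipe real nvidia-smi output instead.

```shell
#!/bin/sh
# Warn when a GPU exceeds THRESHOLD degrees C (85 is an assumed cutoff).
THRESHOLD=85

check_temp() {
    # Input lines look like "0, 62" (index, temperature).
    while IFS=, read -r idx temp; do
        temp=${temp# }    # drop the leading space CSV puts after the comma
        if [ "$temp" -gt "$THRESHOLD" ]; then
            echo "WARNING: GPU $idx at ${temp}C"
        fi
    done
}

# Sample data standing in for:
#   nvidia-smi --query-gpu=index,temperature.gpu --format=csv,noheader,nounits
printf '0, 62\n1, 91\n' | check_temp
```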

GPU Management

| Command | Action | Description |
| --- | --- | --- |
| `nvidia-smi -pm 1` | Persistence mode on | Keep the driver loaded to reduce GPU initialization latency (run as root). |
| `nvidia-smi -i 0 -pl 250` | Set power limit | Cap GPU 0's maximum power draw at 250 W. |
| `nvidia-smi -i 0 -ac 1215,1410` | Set application clocks | Set memory and graphics clock speeds (MHz, in that order). |
| `nvidia-smi -rgc` | Reset clocks | Reset GPU clocks to their defaults. |
| `nvidia-smi -r -i 0` | Reset GPU | Reset GPU 0 (e.g. to clear ECC errors); no processes may be using the GPU. |
| `nvidia-smi -e 1` | Enable ECC | Enable ECC memory error correction (takes effect after a reboot). |
| `nvidia-smi -q -d POWER` | Power details | Show detailed GPU power information. |
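On multi-GPU servers these management commands are usually applied to every device. As a sketch, the function below prints the persistence-mode and power-limit commands for N GPUs rather than running them, so you can review the plan and pipe it to `sh` to execute; the GPU count (normally `nvidia-smi -L | wc -l`) and the 250 W cap are assumptions for illustration.

```shell
#!/bin/sh
# Print the management commands for n GPUs; pipe the output to `sh`
# (as root) to actually apply them.
plan_limits() {
    n=$1
    i=0
    while [ "$i" -lt "$n" ]; do
        echo "nvidia-smi -i $i -pm 1"      # persistence mode on
        echo "nvidia-smi -i $i -pl 250"    # assumed 250 W power cap
        i=$((i + 1))
    done
}

# Assumed two-GPU machine; in practice: plan_limits "$(nvidia-smi -L | wc -l)"
plan_limits 2
```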

Query & CSV

| Command | Action | Description |
| --- | --- | --- |
| `nvidia-smi --query-gpu=name,memory.total,memory.used --format=csv` | CSV query | Output GPU info in CSV format. |
| `nvidia-smi --query-gpu=utilization.gpu,temperature.gpu --format=csv -l 5` | CSV logging | Log utilization and temperature every 5 seconds. |
| `nvidia-smi --query-compute-apps=pid,used_memory --format=csv` | Per-process memory | Show memory usage per GPU process. |
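The CSV output pairs naturally with shell text tools. As a minimal sketch, the function below turns `nvidia-smi --query-gpu=name,memory.total,memory.used --format=csv,noheader,nounits` lines into a readable usage summary; the here-sample (an assumed A100 machine) stands in for live output so the script runs anywhere — pipe real nvidia-smi output instead.

```shell
#!/bin/sh
# Summarize GPU memory usage from name,memory.total,memory.used CSV lines.
summarize() {
    while IFS=, read -r name total used; do
        total=${total# }; used=${used# }   # CSV fields carry a leading space
        pct=$((100 * used / total))
        echo "$name: ${used}/${total} MiB (${pct}% used)"
    done
}

# Sample data (assumed two-GPU A100 box) standing in for live output:
printf '%s\n' \
    'NVIDIA A100-SXM4-40GB, 40960, 1024' \
    'NVIDIA A100-SXM4-40GB, 40960, 20480' | summarize
```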

Frequently Asked Questions

How do I check GPU usage on Linux?

Run 'nvidia-smi' to see a summary of all GPUs including utilization, temperature, memory usage, and running processes.

How do I monitor GPU in real time?

Use 'watch -n 1 nvidia-smi' or 'nvidia-smi -l 1' to refresh GPU status every second.

How do I check GPU memory usage per process?

Run 'nvidia-smi --query-compute-apps=pid,used_memory --format=csv' to see memory usage per process.
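A common follow-up is finding which process is the heaviest consumer. As a sketch, the function below sorts per-process CSV lines (shaped like the `--query-compute-apps=pid,used_memory --format=csv,noheader,nounits` output) by memory and reports the top one; the sample PIDs and sizes are made up for illustration — pipe real nvidia-smi output instead.

```shell
#!/bin/sh
# Report the process using the most GPU memory from "pid, used_memory" lines.
top_consumer() {
    sort -t, -k2 -rn | head -n 1 | while IFS=, read -r pid mem; do
        mem=${mem# }    # drop the leading space after the comma
        echo "PID $pid is using $mem MiB of GPU memory"
    done
}

# Made-up sample standing in for:
#   nvidia-smi --query-compute-apps=pid,used_memory --format=csv,noheader,nounits
printf '%s\n' '12345, 8192' '67890, 2048' '13579, 16384' | top_consumer
```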

How do I enable persistence mode?

Run 'nvidia-smi -pm 1' as root to keep the NVIDIA driver loaded, reducing GPU initialization latency.

How do I check GPU topology and NVLink?

Use 'nvidia-smi topo -m' to see GPU interconnect topology including NVLink and PCIe connections.
