Prerequisites
Flexstack AI Host requires an environment that meets the following criteria to ensure stable and efficient operation.
Python: Version 3.10. Ensure that Python is properly installed and configured on your system. Use the command `python --version` to check your current Python version. If you don't have Python yet, this guide is helpful for installing it.
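If Python 3.10 is not available, it can usually be installed through your operating system's package manager. The commands below are a sketch assuming a Debian/Ubuntu system; package names and commands differ on other platforms.

```bash
# Install Python 3.10 and the venv module (Debian/Ubuntu example)
sudo apt-get update
sudo apt-get install -y python3.10 python3.10-venv

# Confirm the interpreter is on PATH and reports 3.10.x
python3.10 --version
```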
CUDA: Version 12.1 or compatible.
Checking and Installing CUDA
Use the command `nvcc --version` in the terminal to check your current CUDA version. If you do not have CUDA installed, or if it is not version 12.1, please visit NVIDIA's official website to download and install CUDA 12.1.
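A quick way to confirm both the toolkit and the driver are in place is shown below: the release line from nvcc should read 12.1, and nvidia-smi should list your GPU with a driver-side CUDA version of 12.1 or newer.

```bash
# Toolkit check: the 'release' line should show 12.1
nvcc --version

# Driver check: the GPU should be listed with a compatible CUDA version
nvidia-smi
```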
G++: A G++ compiler is required.
Checking and Installing G++
To check the current version of G++ installed, open a terminal and type `g++ --version`. If G++ is not installed on your system, you will need to install it via your operating system's package manager.
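As an illustration, the commands below (a sketch assuming a Debian/Ubuntu or Fedora-based system; package names vary elsewhere) install G++ and verify that it is available:

```bash
# Debian/Ubuntu
sudo apt-get install -y g++

# Fedora/RHEL (alternative)
# sudo dnf install -y gcc-c++

# Confirm the compiler is installed
g++ --version
```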
GPU: If your system includes a GPU, a minimum of 9GB VRAM is required for image generation tasks. For LoRA Training tasks, a minimum of 16GB VRAM is required.
Checking VRAM
On systems with NVIDIA GPUs, you can use a tool like `nvidia-smi` to check VRAM capacity.
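For a more focused readout, nvidia-smi can report the total memory of each GPU directly; compare the value against the 9GB (image generation) and 16GB (LoRA training) requirements above.

```bash
# Print each GPU's name and total VRAM in MiB
nvidia-smi --query-gpu=name,memory.total --format=csv
```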