Tencent's open-source Hunyuan large model (HunyuanDiT): the source code is on GitHub and the pretrained weights are on Hugging Face.
Machine configuration: 16 GB RAM, one A100 GPU.
Once the server is provisioned, set up the dependencies and install as follows.
1. Clone the repository: git clone https://github.com/tencent/HunyuanDiT
[root@iZrj97v1slob86b6lvbjd3Z opt]# git clone https://github.com/tencent/HunyuanDiT
Cloning into 'HunyuanDiT'...
remote: Enumerating objects: 238, done.
remote: Counting objects: 100% (164/164), done.
remote: Compressing objects: 100% (121/121), done.
remote: Total 238 (delta 80), reused 90 (delta 40), pack-reused 74
Receiving objects: 100% (238/238), 158.99 MiB | 11.63 MiB/s, done.
Resolving deltas: 100% (88/88), done.
2. Enter the directory:
[root@iZrj97v1slob86b6lvbjd3Z HunyuanDiT]# ls
app dialoggen example_prompts.txt LICENSE.txt README.md sample_t2i.py
asset environment.yml hydit Notice requirements.txt utils
[root@iZrj97v1slob86b6lvbjd3Z HunyuanDiT]#
Among these files, environment.yml defines the Conda environment.
3. Install Conda:
mkdir -p ~/miniconda3
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda3/miniconda.sh
bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3
# ERROR: cannot verify repo.anaconda.com's certificate, issued by '/C=US/O=Let's Encrypt/CN=E1':
#   Issued certificate has expired.
# Workaround: skip certificate verification (note this disables TLS verification; prefer updating the system CA certificates if you can)
`wget --no-check-certificate https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda3/miniconda.sh`
Install:
`cd miniconda3`
`bash miniconda.sh`
The installer prints some information and the license agreement; follow the prompts step by step. Typically you press Enter to page through the license and then accept it.
The installer then installs Miniconda and sets up the environment variables; wait for it to finish.
Once it completes, you will see a message confirming the installation succeeded.
Finally, close the terminal window and open a new one so that the environment variables take effect.
Remove the installer script:
rm -rf ~/miniconda3/miniconda.sh
4. Create the Conda environment:
Back in the HunyuanDiT directory, run:
conda env create -f environment.yml
This command creates a new Conda environment from the configuration in environment.yml, which typically lists the required packages and their versions so that all dependencies are installed correctly and version conflicts are avoided.
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
#
# To activate this environment, use
#
# $ conda activate HunyuanDiT
#
# To deactivate an active environment, use
#
# $ conda deactivate
5. Activate the environment: conda activate HunyuanDiT
(base) [fuyou@iZrj97v1slob86b6lvbjd3Z HunyuanDiT]$ conda activate HunyuanDiT
(HunyuanDiT) [fuyou@iZrj97v1slob86b6lvbjd3Z HunyuanDiT]$
After the environment is created, this command activates the Conda environment named HunyuanDiT. Once activated, your shell session switches to that environment, so all package installation and execution take place inside this isolated environment.
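To confirm the activation worked, a quick sanity check can be run from Python (a minimal sketch; `CONDA_DEFAULT_ENV` is the environment variable conda sets in an activated shell):

```python
import os
import sys

def active_conda_env():
    """Return the name of the active conda environment, or None if none is active."""
    # conda exports CONDA_DEFAULT_ENV when an environment is activated.
    return os.environ.get("CONDA_DEFAULT_ENV")

if __name__ == "__main__":
    print("Active env:", active_conda_env())
    print("Python prefix:", sys.prefix)
```

In an activated HunyuanDiT session this should report `HunyuanDiT`, and `sys.prefix` should point inside your miniconda3 `envs` directory.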
6. Install the pip dependencies:
python -m pip install -r requirements.txt
With the HunyuanDiT environment activated, this command installs the required Python packages from requirements.txt, which lists all the Python libraries and versions the project needs.
(Optional) Install Flash Attention v2 (skipped here because CUDA is not installed on this machine):
python -m pip install git+https://github.com/Dao-AILab/flash-attention.git@v2.1.2.post3
This optional step installs Flash Attention v2, a library that accelerates model inference. Note that it requires CUDA 11.6 or later, so this step may not apply if your system does not meet that requirement.
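Since Flash Attention v2 needs CUDA 11.6 or newer, it is worth checking the toolkit version first (e.g. the release number reported by `nvcc --version`). A small helper for comparing version strings; the function name is my own illustration:

```python
def cuda_at_least(version_str, required=(11, 6)):
    """Return True if a CUDA version string like '11.6' or '12.1' meets the minimum."""
    parts = version_str.split(".")
    # Missing minor component (e.g. "11") is treated as .0
    major, minor = int(parts[0]), (int(parts[1]) if len(parts) > 1 else 0)
    return (major, minor) >= required
```

For example, `cuda_at_least("12.1")` is True while `cuda_at_least("11.4")` is False, matching the requirement stated above.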
Download the pretrained models
To download the models, first install huggingface-cli (see https://huggingface.co/docs/huggingface_hub/guides/cli):
pip install -U "huggingface_hub[cli]"
或 python -m pip install "huggingface_hub[cli]"
Then download the models with the following commands:
# Create a directory named 'ckpts' where the model will be saved, fulfilling the prerequisites for running the demo.
mkdir ckpts
# Use the huggingface-cli tool to download the model.
# The download time may vary from 10 minutes to 1 hour depending on network conditions.
huggingface-cli download Tencent-Hunyuan/HunyuanDiT --local-dir ./ckpts
Note: if an error like `No such file or directory: 'ckpts/.huggingface/.gitignore.lock'` appears during the download, you can ignore it and retry by running `huggingface-cli download Tencent-Hunyuan/HunyuanDiT --local-dir ./ckpts` again.
All models are downloaded automatically. For more information about the model, see the Hugging Face repository.
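Since lock-file errors like the one above are transient and the fix is simply to retry, the retry can be automated. A minimal sketch wrapping `huggingface_hub.snapshot_download` (the Python equivalent of the CLI command) in a generic retry helper; the helper itself is my own illustration:

```python
import time

def with_retries(fn, attempts=3, delay=5.0):
    """Call fn(), retrying up to `attempts` times on OSError (e.g. stale lock files)."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except OSError as err:
            if attempt == attempts:
                raise
            print(f"Attempt {attempt} failed ({err}); retrying in {delay}s...")
            time.sleep(delay)

# Example (needs `pip install -U "huggingface_hub[cli]"` and network access):
# from huggingface_hub import snapshot_download
# with_retries(lambda: snapshot_download(
#     repo_id="Tencent-Hunyuan/HunyuanDiT", local_dir="./ckpts"))
```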
Using Gradio
Make sure the conda environment is activated before running the following commands.
By default, we start a Chinese UI.
python app/hydit_app.py
Using Flash Attention for acceleration.
python app/hydit_app.py --infer-mode fa
You can disable the enhancement model if the GPU memory is insufficient.
The enhancement will be unavailable until you restart the app without the --no-enhance flag.
python app/hydit_app.py --no-enhance
Start with English UI
python app/hydit_app.py --lang en
Using the command line
We provide 3 modes for a quick start:
Prompt Enhancement + Text-to-Image. Torch mode
python sample_t2i.py --prompt "渔舟唱晚"
Only Text-to-Image. Torch mode
python sample_t2i.py --prompt "渔舟唱晚" --no-enhance
Only Text-to-Image. Flash Attention mode
python sample_t2i.py --infer-mode fa --prompt "渔舟唱晚"
Generate an image with other image sizes.
python sample_t2i.py --prompt "渔舟唱晚" --image-size 1280 768
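The command lines above all combine the same few flags, so when scripting batches of generations it can be handy to assemble them programmatically. A hypothetical helper (the function name is my own; the flags are the ones shown above):

```python
def build_sample_cmd(prompt, infer_mode=None, no_enhance=False, image_size=None):
    """Assemble a sample_t2i.py command line from the documented flags."""
    cmd = ["python", "sample_t2i.py", "--prompt", prompt]
    if infer_mode:  # e.g. "fa" for Flash Attention mode
        cmd += ["--infer-mode", infer_mode]
    if no_enhance:  # disable the prompt-enhancement model
        cmd.append("--no-enhance")
    if image_size:  # e.g. (1280, 768)
        cmd += ["--image-size", str(image_size[0]), str(image_size[1])]
    return cmd

# Run with e.g.:
#   import subprocess
#   subprocess.run(build_sample_cmd("渔舟唱晚", image_size=(1280, 768)))
```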