---
license: apache-2.0
base_model:
- xiabs/DreamOmni2
---

# Model Description
These GGUF files are a direct conversion of xiabs/DreamOmni2-7.6B.
Since they are quantized versions of the original model, all of its licensing terms and usage restrictions continue to apply.
## Usage
I created some custom ComfyUI nodes to run this model. You can download them here: DreamOmni2-GGUF. Place the model files in `ComfyUI/models/unet` and the LoRAs in `ComfyUI/models/loras`, then refer to the GitHub README for detailed installation instructions.
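If you prefer to fetch the files programmatically, here is a minimal sketch using `huggingface_hub` to download one of the quantized files listed below into the ComfyUI model folder. The repository id `rafacost/DreamOmni2-GGUF` and the local ComfyUI path are assumptions; adjust both to your setup.

```python
# Minimal sketch: download a quantized GGUF into ComfyUI's model folder.
# Assumptions: the repo id "rafacost/DreamOmni2-GGUF" and the ComfyUI
# install path below are placeholders -- adjust both to your setup.
from pathlib import Path
from huggingface_hub import hf_hub_download

comfyui_root = Path.home() / "ComfyUI"       # assumed install location
unet_dir = comfyui_root / "models" / "unet"
unet_dir.mkdir(parents=True, exist_ok=True)

local_path = hf_hub_download(
    repo_id="rafacost/DreamOmni2-GGUF",      # assumed repo id
    filename="DreamOmni2-Vlm-Model-7.6B-Q4_0.gguf",
    local_dir=unet_dir,
)
print(f"Model saved to {local_path}")
```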
## Model Details

### Model Description
- Converted by: rafacost
### Model Sources
- Repository: xiabs/DreamOmni2
- Paper: DreamOmni2: Multimodal Instruction-based Editing and Generation.
- GitHub: DreamOmni2 Project
## GGUF File List
| Filename | Quantization | Size | Notes |
|---|---|---|---|
| DreamOmni2-Vlm-Model-7.6B-F16.gguf | FP16 | 14.19 GB | |
| DreamOmni2-Vlm-Model-7.6B-Q2_K.gguf | Q2 | 2.81 GB | |
| DreamOmni2-Vlm-Model-7.6B-Q3_K_M.gguf | Q3 | 3.55 GB | |
| DreamOmni2-Vlm-Model-7.6B-Q3_K_S.gguf | Q3 | 3.25 GB | |
| DreamOmni2-Vlm-Model-7.6B-Q4_0.gguf | Q4 | 4.13 GB | Recommended |
| DreamOmni2-Vlm-Model-7.6B-Q4_K_M.gguf | Q4 | 4.36 GB | |
| DreamOmni2-Vlm-Model-7.6B-Q4_K_S.gguf | Q4 | 4.15 GB | |
| DreamOmni2-Vlm-Model-7.6B-Q5_0.gguf | Q5 | 4.95 GB | |
| DreamOmni2-Vlm-Model-7.6B-Q5_K_M.gguf | Q5 | 5.07 GB | |
| DreamOmni2-Vlm-Model-7.6B-Q5_K_S.gguf | Q5 | 4.95 GB | |
| DreamOmni2-Vlm-Model-7.6B-Q6_K.gguf | Q6 | 5.82 GB | |
| DreamOmni2-Vlm-Model-7.6B-Q8_0.gguf | Q8 | 7.54 GB | |
| mmproj-DreamOmni2-7.6B-GGUF-f16.gguf | FP16 | 1.26 GB | |
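To sanity-check a download, you can read the embedded metadata with the `gguf` Python package from the llama.cpp project. This is a minimal sketch, assuming the Q4_0 file from the table above sits in the current directory; the reader API usage reflects gguf-py as I understand it.

```python
# Minimal sketch: inspect a downloaded GGUF file's metadata and tensors.
# Assumes `pip install gguf` and that the path below points at one of the
# files from the table above.
from gguf import GGUFReader

reader = GGUFReader("DreamOmni2-Vlm-Model-7.6B-Q4_0.gguf")

# List the metadata keys stored in the GGUF header (architecture, quantization, etc.).
for key in reader.fields:
    print(key)

# Show how many tensors the file contains and a few of their names/shapes/types.
print(f"{len(reader.tensors)} tensors")
for tensor in reader.tensors[:5]:
    print(tensor.name, tensor.shape, tensor.tensor_type)
```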