---
license: apache-2.0
library_name: transformers
tags:
  - language
  - granite-4.0
  - gguf
base_model:
  - ibm-granite/granite-4.0-micro
---

# Granite 4.0 Micro (GGUF)

## πŸ“‹ Model Description

> [!NOTE]
> This repository contains GGUF conversions of an IBM Granite base model in a range of quantizations.
>
> Please refer to the base model's full model card here:
> https://huggingface.co/ibm-granite/granite-4.0-micro
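
Once one of the GGUF files listed below has been downloaded locally, it can be loaded with any GGUF-capable runtime such as llama-cpp-python. The snippet below is a minimal sketch, not part of the official card: the file name, context size, and thread count are illustrative assumptions you should adjust to your hardware.

```python
# Minimal sketch: run a local GGUF file with llama-cpp-python.
# Assumes `pip install llama-cpp-python` and that the chosen .gguf file
# (here the recommended Q4_0 quantization) sits in the current directory.
from llama_cpp import Llama

llm = Llama(
    model_path="granite-4.0-micro-Q4_0.gguf",  # pick any file from the table below
    n_ctx=4096,     # context window; lower it to reduce memory use
    n_threads=8,    # CPU threads; tune for your machine
)

# Uses the chat template embedded in the GGUF metadata, if present.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain GGUF quantization in one sentence."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```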

## πŸ“‚ GGUF File List

πŸ“ Filename πŸ“¦ Size ⚑ Download
granite-4.0-micro-Q2_K.gguf
LFS Q2
1.28 GB Download
granite-4.0-micro-Q3_K_L.gguf
LFS Q3
1.74 GB Download
granite-4.0-micro-Q3_K_M.gguf
LFS Q3
1.61 GB Download
granite-4.0-micro-Q3_K_S.gguf
LFS Q3
1.46 GB Download
granite-4.0-micro-Q4_0.gguf
Recommended LFS Q4
1.85 GB Download
granite-4.0-micro-Q4_1.gguf
LFS Q4
2.03 GB Download
granite-4.0-micro-Q4_K_M.gguf
LFS Q4
1.96 GB Download
granite-4.0-micro-Q4_K_S.gguf
LFS Q4
1.86 GB Download
granite-4.0-micro-Q5_0.gguf
LFS Q5
2.21 GB Download
granite-4.0-micro-Q5_1.gguf
LFS Q5
2.4 GB Download
granite-4.0-micro-Q5_K_M.gguf
LFS Q5
2.27 GB Download
granite-4.0-micro-Q5_K_S.gguf
LFS Q5
2.21 GB Download
granite-4.0-micro-Q6_K.gguf
LFS Q6
2.6 GB Download
granite-4.0-micro-Q8_0.gguf
LFS Q8
3.37 GB Download
granite-4.0-micro-f16.gguf
LFS FP16
6.34 GB Download
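
To fetch a single quantization programmatically rather than through the web UI, the huggingface_hub client can be used. This is a minimal sketch; the `repo_id` shown is a placeholder, not this repository's actual id, so substitute it accordingly.

```python
# Sketch: download one GGUF file from the Hub with huggingface_hub.
# Assumes `pip install huggingface_hub`; the repo_id is a placeholder.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="<owner>/granite-4.0-micro-GGUF",      # placeholder: use this repo's id
    filename="granite-4.0-micro-Q4_0.gguf",        # any filename from the table above
)
print(path)  # local cache path of the downloaded file
```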