---
license: apache-2.0
library_name: transformers
tags:
  - language
  - granite-4.0
  - gguf
base_model:
  - ibm-granite/granite-4.0-h-micro
---

# Granite 4.0 H-Micro (GGUF)

## πŸ“‹ Model Description

> [!NOTE]
> This repository contains models converted to the GGUF format, at various quantization levels, from an IBM Granite base model.
>
> Please refer to the base model's full model card here:
> https://huggingface.co/ibm-granite/granite-4.0-h-micro

## πŸ“‚ GGUF File List

πŸ“ Filename πŸ“¦ Size ⚑ Download
granite-4.0-h-micro-Q2_K.gguf
LFS Q2
1.14 GB Download
granite-4.0-h-micro-Q3_K_L.gguf
LFS Q3
1.53 GB Download
granite-4.0-h-micro-Q3_K_M.gguf
LFS Q3
1.45 GB Download
granite-4.0-h-micro-Q3_K_S.gguf
LFS Q3
1.36 GB Download
granite-4.0-h-micro-Q4_0.gguf
Recommended LFS Q4
1.73 GB Download
granite-4.0-h-micro-Q4_1.gguf
LFS Q4
1.9 GB Download
granite-4.0-h-micro-Q4_K_M.gguf
LFS Q4
1.81 GB Download
granite-4.0-h-micro-Q4_K_S.gguf
LFS Q4
1.74 GB Download
granite-4.0-h-micro-Q5_0.gguf
LFS Q5
2.08 GB Download
granite-4.0-h-micro-Q5_1.gguf
LFS Q5
2.25 GB Download
granite-4.0-h-micro-Q5_K_M.gguf
LFS Q5
2.12 GB Download
granite-4.0-h-micro-Q5_K_S.gguf
LFS Q5
2.08 GB Download
granite-4.0-h-micro-Q6_K.gguf
LFS Q6
2.44 GB Download
granite-4.0-h-micro-Q8_0.gguf
LFS Q8
3.16 GB Download
granite-4.0-h-micro-f16.gguf
LFS FP16
5.95 GB Download
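
To try one of these files locally, the minimal sketch below downloads the recommended Q4_0 quantization with `huggingface_hub` and loads it with `llama-cpp-python`. The repository ID shown is a placeholder (this card does not state it); substitute the ID displayed at the top of this page.

```python
# Minimal sketch: fetch the recommended Q4_0 file and run a single chat turn.
# NOTE: "your-username/granite-4.0-h-micro-GGUF" is a placeholder repo ID, not the real one.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama  # pip install llama-cpp-python

model_path = hf_hub_download(
    repo_id="your-username/granite-4.0-h-micro-GGUF",  # placeholder: replace with this repo's ID
    filename="granite-4.0-h-micro-Q4_0.gguf",
)

# Context size is chosen for illustration; adjust to your hardware and use case.
llm = Llama(model_path=model_path, n_ctx=4096)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Briefly explain what GGUF quantization is."}]
)
print(response["choices"][0]["message"]["content"])
```

The same GGUF files can also be served with any other llama.cpp-based runtime; the quantization level only trades file size and memory use against output quality.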