Model Description
gemma-3-270m-it GGUF

Base model: google/gemma-3-270m-it
Recommended way to run this model:

```sh
llama-server -hf ggml-org/gemma-3-270m-it-GGUF -c 0 -fa
```

Here `-c 0` uses the context length stored in the model metadata and `-fa` enables Flash Attention. Then open http://localhost:8080 in your browser to use the built-in web UI.
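Once the server is up, you can also query it programmatically; llama-server exposes an OpenAI-compatible chat completions endpoint. A minimal check from the command line (the prompt below is just an illustrative example):

```sh
# Send a single chat request to the local server started above.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "messages": [
          {"role": "user", "content": "Summarize the Gemma model family in one sentence."}
        ],
        "temperature": 0.7
      }'
```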
GGUF File List
| Filename | Quantization | Size | Notes |
|---|---|---|---|
| gemma-3-270m-it-Q8_0.gguf | Q8_0 | 278.04 MB | Recommended |
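If you prefer to download the file yourself instead of letting llama-server fetch it via `-hf`, a sketch using the Hugging Face CLI (repo and filename as listed above; `--local-dir .` is just a convenient choice):

```sh
# Download the Q8_0 GGUF from the ggml-org repo into the current directory.
huggingface-cli download ggml-org/gemma-3-270m-it-GGUF \
  gemma-3-270m-it-Q8_0.gguf --local-dir .
```

The downloaded file can then be served directly with `llama-server -m ./gemma-3-270m-it-Q8_0.gguf -c 0 -fa`.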