Why these models?

Coding:

codegemma:2b → lightweight, good for completions.
codellama:7b → solid for structured code (like Docker Compose).
mistral:7b → generalist, also good with logic in code.

Writing (tech docs & emails):

llama3.2:3b → smaller generalist.
gemma:7b → more natural writing.
neural-chat:7b → conversational, good for email tone.
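Assuming a local Ollama install, the models above can be pulled and tried from the shell. The model tags are taken from the list; the prompts are only illustrative examples, not part of the original note:

```shell
# Pull the coding models (one-time downloads)
ollama pull codegemma:2b
ollama pull codellama:7b
ollama pull mistral:7b

# Pull the writing models
ollama pull llama3.2:3b
ollama pull gemma:7b
ollama pull neural-chat:7b

# One-shot prompts to compare the two groups (example prompts)
ollama run codellama:7b "Write a docker-compose.yml for nginx serving ./html"
ollama run neural-chat:7b "Draft a short, friendly email announcing scheduled maintenance"

# Show which models are installed locally
ollama list
```

Each `run` starts the model, answers once, and exits; models stay cached after the first `pull`, so switching between them is cheap.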