Local LLM model page

Solar Pro (22B)

22B single-GPU advanced LLM. Compact but powerful. 50K downloads.

Parameters
22B
Minimum RAM
16 GB
Model size
13 GB
Quantization
Q4_K_M

Can Solar Pro (22B) run locally?

Solar Pro (22B) is well suited to mainstream Macs and PCs: at the recommended Q4_K_M quantization the model file is about 13 GB, so LocalClaw recommends at least 16 GB RAM.

Search term for LM Studio or compatible runtimes: solar-pro-22b

Hugging Face repository: upstage/solar-pro-preview-instruct-GGUF
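As a rough sanity check, the 13 GB figure above follows from the parameter count and the quantization bit-width. A minimal sketch, assuming Q4_K_M averages roughly 4.8 bits per weight (the exact figure varies by tensor mix):

```python
def estimate_gguf_size_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate on-disk size of a quantized model in gigabytes."""
    # params (in billions) * bits per weight / 8 bits per byte = GB
    return params_billion * bits_per_weight / 8

# 22B parameters at an assumed ~4.8 bits/weight for Q4_K_M
size = estimate_gguf_size_gb(22, 4.8)
print(f"~{size:.1f} GB")  # → ~13.2 GB, close to the 13 GB listed above
```

This ignores quantization metadata and the small number of tensors kept at higher precision, so treat it as an estimate, not an exact file size.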


Strengths

  • Compact 22B model that fits on a single GPU (about 13 GB at Q4_K_M) while remaining powerful for its size; widely used, with 50K downloads.

Limitations

  • Performance depends heavily on quantization, RAM bandwidth and runtime support.
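The RAM-bandwidth point can be made concrete with a common back-of-envelope: each generated token requires reading (roughly) all model weights from memory, so decode speed is bounded by bandwidth divided by model size. A hypothetical helper under that simplifying assumption:

```python
def decode_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper bound on decode throughput, assuming one full weight read per token."""
    return bandwidth_gb_s / model_size_gb

# e.g. ~100 GB/s of usable memory bandwidth (assumed) vs. the 13 GB Q4_K_M file
print(f"~{decode_tokens_per_sec(100, 13):.1f} tok/s upper bound")  # → ~7.7 tok/s
```

Real throughput is lower (KV-cache reads, compute overhead, runtime efficiency), which is why the same file can feel fast on a GPU and slow on bandwidth-limited system RAM.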

Best use cases

  • chat
  • general
  • power

Benchmarks

Speed: 5/10

Quality: 8/10

Coding: 7/10

Reasoning: 7/10

Technical details

Developer: Upstage

License: See model repository

Context window: Not specified (see model card)

Architecture: See model card

Released: 2024-09