Ollama releases v0.17.1-rc2 with Qwen 3.5-27B model support
Source: github.com

What's New

Ollama v0.17.1-rc2 introduces support for the Qwen 3.5-27B model, expanding the platform's catalog of available models for local deployment.

Key Changes

  • New Model Support: Qwen 3.5-27B model is now available for use within Ollama
  • Release Candidate Status: rc2 is a pre-release build intended for testing ahead of the stable v0.17.1 release

What Developers Need to Know

Developers can now pull and run the Qwen 3.5-27B model directly through Ollama's command-line interface. This 27-billion-parameter model joins Ollama's catalog of quantized and optimized models, letting developers run it locally without relying on cloud inference services.
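As a minimal sketch, pulling and running the model from the CLI would look like the following. Note that the exact model tag (`qwen3.5:27b` here) is an assumption, not confirmed by the release notes; check the Ollama model library for the published name.

```shell
# Pull the model weights to the local machine
# (the tag "qwen3.5:27b" is an assumption; verify the real tag in the Ollama library)
ollama pull qwen3.5:27b

# Start an interactive chat session with the model
ollama run qwen3.5:27b
```

Both commands require the Ollama daemon (v0.17.1-rc2 or later for this model) to be running locally.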

Next Steps

Test this release candidate and report any issues on the Ollama GitHub repository to help stabilize the upcoming v0.17.1 release.