TL;DR

Blaxel now integrates with HuggingFace, allowing you to connect to and deploy AI models (public, private, or gated) directly through the platform. Setup is easy with a HuggingFace access token, and you get full control over model deployment settings and endpoint management.

Our HuggingFace integration enables you to connect to serverless endpoints from HuggingFace—whether they're public, gated, or private—directly through your agents on Blaxel. But that's not all! This integration is bidirectional, meaning you can create new deployments on HuggingFace right from the Blaxel console.

Key Features That Make This Integration Shine

hf-integration.png

🔧 Getting Started is Simple

Setting up the integration is straightforward: just register a HuggingFace access token in your Blaxel workspace settings. The scope of this token determines which resources Blaxel can access on HuggingFace, including which public and private models you can connect to.
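To make the token mechanics concrete, here is a minimal sketch of how a scoped access token authenticates a call to a HuggingFace serverless inference endpoint. The model ID and token value are illustrative placeholders, and this shows the underlying HuggingFace Inference API convention, not Blaxel's internal implementation:

```python
# Sketch: how an access token is attached to a HuggingFace serverless
# Inference API request. The token ("hf_xxx") and model ID are placeholders.
API_BASE = "https://api-inference.huggingface.co/models"

def build_request(model_id: str, token: str) -> dict:
    """Return the URL and auth headers for a serverless inference call.

    Gated and private models succeed only if the token's scope
    grants access to them -- the same rule that governs what
    Blaxel can reach through the integration.
    """
    return {
        "url": f"{API_BASE}/{model_id}",
        "headers": {"Authorization": f"Bearer {token}"},
    }

req = build_request("google/gemma-2-2b-it", "hf_xxx")
```

In practice you would pass `req["url"]` and `req["headers"]` to an HTTP client; the key point is that a single bearer token carries the entire permission scope.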

When creating a new deployment, you have complete control over: