Huggingface.js documentation

import { createRepo, uploadFile } from "@huggingface/hub";
import { InferenceClient } from "@huggingface/inference";

const HF_TOKEN = "hf_...";
const inference = new InferenceClient(HF_TOKEN);

// Programmatically interact with the Hub

await createRepo({
  repo: { type: "model", name: "my-user/nlp-model" },
  accessToken: HF_TOKEN
});

await uploadFile({
  repo: "my-user/nlp-model",
  accessToken: HF_TOKEN,
  // Can work with native File in browsers
  file: {
    path: "pytorch_model.bin",
    content: new Blob(...)
  }
});

// Use all supported Inference Providers!

await inference.chatCompletion({
  model: "meta-llama/Llama-3.1-8B-Instruct",
  provider: "sambanova", // or together, fal-ai, replicate, cohere …
  messages: [
    {
      role: "user",
      content: "Hello, nice to meet you!",
    },
  ],
  max_tokens: 512,
  temperature: 0.5,
});

await inference.textToImage({
  model: "black-forest-labs/FLUX.1-dev",
  provider: "replicate",
  inputs: "a picture of a green bird",
});

// and much more…

Hugging Face JS libraries

This is a collection of JS libraries to interact with the Hugging Face API, with TS types included.

  • @huggingface/inference: Use all supported (serverless) Inference Providers or switch to Inference Endpoints (dedicated) to make calls to 100,000+ Machine Learning models
  • @huggingface/hub: Interact with huggingface.co to create or delete repos and commit / download files
  • @huggingface/mcp-client: A Model Context Protocol (MCP) client, and a tiny Agent library, built on top of InferenceClient.
  • @huggingface/gguf: A GGUF parser that works on remotely hosted files (see the sketch after this list).
  • @huggingface/dduf: Similar package for DDUF (DDUF Diffusers Unified Format)
  • @huggingface/tasks: The definition files and source of truth for the Hub's main primitives, like pipeline tasks, model libraries, etc.
  • @huggingface/jinja: A minimalistic JS implementation of the Jinja template engine, to be used for ML chat templates.
  • @huggingface/space-header: Use the Space mini_header outside Hugging Face
  • @huggingface/ollama-utils: Various utilities for maintaining Ollama compatibility with models on the Hugging Face Hub.
  • @huggingface/tiny-agents: A tiny, model-agnostic library for building AI agents that can use tools.
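
For instance, here is a minimal sketch of parsing a remotely hosted GGUF file with @huggingface/gguf; the model URL is just an example, and any .gguf file hosted on the Hub should work:

import { gguf } from "@huggingface/gguf";

// Fetches and parses only the GGUF header, so the full model
// weights are never downloaded.
// The URL below is an example; substitute any .gguf file on the Hub.
const { metadata, tensorInfos } = await gguf(
  "https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGUF/resolve/main/llama-2-7b-chat.Q2_K.gguf"
);
console.log(metadata["general.architecture"]); // e.g. "llama"
console.log(tensorInfos.length, "tensors");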

We use modern features to avoid polyfills and dependencies, so the libraries will only work on modern browsers / Node.js >= 18 / Bun / Deno.

The libraries are still very young, please help us by opening issues!

Installation

With NPM

To install via NPM, you can download the libraries as needed:

npm install @huggingface/inference
npm install @huggingface/hub
npm install @huggingface/mcp-client

Then import the libraries in your code:

import { InferenceClient } from "@huggingface/inference";
import { createRepo, commit, deleteRepo, listFiles } from "@huggingface/hub";
import { McpClient } from "@huggingface/mcp-client";
import type { RepoId } from "@huggingface/hub";

From CDN or Static hosting

You can run our packages with plain vanilla JS, without any bundler, by using a CDN or static hosting. Using ES modules, i.e. <script type="module">, you can import the libraries in your code:

<script type="module">
    import { InferenceClient } from 'https://cdn.jsdelivr.net/npm/@huggingface/inference@4.6.1/+esm';
    import { createRepo, commit, deleteRepo, listFiles } from "https://cdn.jsdelivr.net/npm/@huggingface/hub@2.4.1/+esm";
</script>

Deno

// esm.sh
import { InferenceClient } from "https://esm.sh/@huggingface/inference"

import { createRepo, commit, deleteRepo, listFiles } from "https://esm.sh/@huggingface/hub"
// or npm:
import { InferenceClient } from "npm:@huggingface/inference"

import { createRepo, commit, deleteRepo, listFiles } from "npm:@huggingface/hub"

Usage examples

Get your HF access token in your account settings.
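
In Node.js you may prefer to read the token from an environment variable rather than hardcoding it. A minimal sketch (the HF_TOKEN variable name is just a common convention, not something the libraries require):

import { InferenceClient } from "@huggingface/inference";

// Read the access token from the environment instead of hardcoding it.
// HF_TOKEN is a conventional name; any variable works.
const client = new InferenceClient(process.env.HF_TOKEN);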

@huggingface/inference examples

import { InferenceClient } from "@huggingface/inference";

const HF_TOKEN = "hf_...";

const client = new InferenceClient(HF_TOKEN);

// Chat completion API
const out = await client.chatCompletion({
  model: "meta-llama/Llama-3.1-8B-Instruct",
  messages: [{ role: "user", content: "Hello, nice to meet you!" }],
  max_tokens: 512
});
console.log(out.choices[0].message);

// Streaming chat completion API
for await (const chunk of client.chatCompletionStream({
  model: "meta-llama/Llama-3.1-8B-Instruct",
  messages: [{ role: "user", content: "Hello, nice to meet you!" }],
  max_tokens: 512
})) {
  console.log(chunk.choices[0].delta.content);
}

// Using a third-party provider:
await client.chatCompletion({
  model: "meta-llama/Llama-3.1-8B-Instruct",
  messages: [{ role: "user", content: "Hello, nice to meet you!" }],
  max_tokens: 512,
  provider: "sambanova", // or together, fal-ai, replicate, cohere …
})

await client.textToImage({
  model: "black-forest-labs/FLUX.1-dev",
  inputs: "a picture of a green bird",
  provider: "fal-ai",
})

// You can also omit "model" to use the recommended model for the task
await client.translation({
  inputs: "My name is Wolfgang and I live in Amsterdam",
  parameters: {
    src_lang: "en",
    tgt_lang: "fr",
  },
});

// pass multimodal files or URLs as inputs
await client.imageToText({
  model: 'nlpconnect/vit-gpt2-image-captioning',
  data: await (await fetch('https://picsum.photos/300/300')).blob(),
})

// Using your own dedicated inference endpoint: https://huggingface.co/docs/inference-endpoints/
const gpt2Client = client.endpoint('https://xyz.eu-west-1.aws.endpoints.huggingface.cloud/gpt2');
const { generated_text } = await gpt2Client.textGeneration({ inputs: 'The answer to the universe is' });

// Chat Completion
const llamaEndpoint = client.endpoint(
  "https://router.huggingface.co/hf-inference/models/meta-llama/Llama-3.1-8B-Instruct"
);
const out = await llamaEndpoint.chatCompletion({
  model: "meta-llama/Llama-3.1-8B-Instruct",
  messages: [{ role: "user", content: "Hello, nice to meet you!" }],
  max_tokens: 512,
});
console.log(out.choices[0].message);

@huggingface/hub examples

import { createRepo, uploadFile, deleteFiles } from "@huggingface/hub";

const HF_TOKEN = "hf_...";

await createRepo({
  repo: "my-user/nlp-model", // or { type: "model", name: "my-user/nlp-test" },
  accessToken: HF_TOKEN
});

await uploadFile({
  repo: "my-user/nlp-model",
  accessToken: HF_TOKEN,
  // Can work with native File in browsers
  file: {
    path: "pytorch_model.bin",
    content: new Blob(...)
  }
});

await deleteFiles({
  repo: { type: "space", name: "my-user/my-space" }, // or "spaces/my-user/my-space"
  accessToken: HF_TOKEN,
  paths: ["README.md", ".gitattributes"]
});

@huggingface/mcp-client examples

import { Agent } from '@huggingface/mcp-client';

const HF_TOKEN = "hf_...";

const agent = new Agent({
  provider: "auto",
  model: "Qwen/Qwen2.5-72B-Instruct",
  apiKey: HF_TOKEN,
  servers: [
    {
      // Playwright MCP
      command: "npx",
      args: ["@playwright/mcp@latest"],
    },
  ],
});

await agent.loadTools();

for await (const chunk of agent.run("What are the top 5 trending models on Hugging Face?")) {
    if ("choices" in chunk) {
        const delta = chunk.choices[0]?.delta;
        if (delta.content) {
            console.log(delta.content);
        }
    }
}

There are more features, of course; check out each library's README!

Formatting & testing

sudo corepack enable
pnpm install

pnpm -r format:check
pnpm -r lint:check
pnpm -r test

Building

pnpm -r build

This will generate ESM and CJS javascript files in packages/*/dist, e.g. packages/inference/dist/index.mjs.
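
Since both formats are emitted, the packages can also be consumed from CommonJS code. A minimal sketch (the ESM import syntax used throughout this page resolves to the .mjs build instead):

// CommonJS consumers resolve the CJS build from dist/
const { InferenceClient } = require("@huggingface/inference");

const client = new InferenceClient(process.env.HF_TOKEN);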
