
Bad request 400: While using the LLM request through OpenAI #8

Open
@ambilykk

Description

While calling the LLM through the OpenAI client library, we encountered a 400 Bad Request error.

Code Used

```js
import OpenAI from "openai";

const capiClient = new OpenAI({
  baseURL: "https://api.githubcopilot.com/",
  apiKey: tokenForUser,
  headers: {
    "Copilot-Integration-Id": "copilot-chat"
  },
});
console.log("capiclient request");
const response = await capiClient.chat.completions.create({
  stream: false,
  model: "gpt-4o",
  messages: [{
    role: "user",
    content: "What is GitHub Copilot"
  }]
});
```
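A possible cause, offered as an assumption: the openai Node SDK takes custom headers through the `defaultHeaders` constructor option rather than a top-level `headers` option, so the snippet above may never send `Copilot-Integration-Id` at all, which would explain the 400. A minimal sketch of the corrected options (the client call is left commented so the fragment stands alone; the token value is a placeholder):

```javascript
// Assumption: custom headers belong under `defaultHeaders`; with a
// top-level `headers` key the SDK silently ignores them, so the
// Copilot-Integration-Id header is never sent.
const tokenForUser = process.env.COPILOT_TOKEN ?? "example-token"; // placeholder
const clientOptions = {
  baseURL: "https://api.githubcopilot.com/",
  apiKey: tokenForUser,
  defaultHeaders: {
    "Copilot-Integration-Id": "copilot-chat",
  },
};
// const capiClient = new OpenAI(clientOptions); // same call as above
```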

Error Message
(screenshot of the 400 Bad Request response attached in the original issue)

Work-around

Once we replaced the SDK call with a direct `fetch` and hardcoded the `Copilot-Integration-Id` header, it started working.

```js
const copilotResponse = await fetch(
  "https://api.githubcopilot.com/chat/completions",
  {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${tokenForUser}`,
      "Copilot-Integration-Id": "vscode-chat",
    },
    body: JSON.stringify({
      messages: [{
        role: "user",
        content: "What is GitHub Copilot"
      }],
      max_tokens: 50,
      temperature: 0.5
    }),
  }
);
```
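When the 400 does occur, the bare status code hides the API's explanation. A small helper (the function name is ours, not from the issue) that surfaces the error body on failure and extracts the reply on success could make the next 400 debuggable:

```javascript
// Hypothetical helper: reads a chat/completions Response, throwing with
// the server's error body on a non-2xx status, and returning the first
// choice's message content on success.
async function readChatResponse(res) {
  if (!res.ok) {
    const detail = await res.text(); // the API typically returns a JSON error body
    throw new Error(`Copilot API ${res.status}: ${detail}`);
  }
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Used as `const answer = await readChatResponse(copilotResponse);` after the fetch above.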
