Qwen #6538

Changes from 5 commits
1 change: 1 addition & 0 deletions Chebichat.md
@@ -0,0 +1 @@
app/store/config.ts
22 changes: 22 additions & 0 deletions app/SyncOnFirstLoad.tsx
@@ -0,0 +1,22 @@
"use client";
import { useEffect } from "react";
import { useSyncStore } from "./store/sync";

export default function SyncOnFirstLoad() {
const syncStore = useSyncStore();

useEffect(() => {
// if (syncStore.lastSyncTime === 0) {
// // If this is the first time syncing, call sync()
// alert("[SyncOnFirstLoad] Dong bo hoa du lieu lan dau tien");

console.log("[SyncOnFirstLoad] Dong bo hoa du lieu lan dau tien");

console.log("Thoi gian dong bo lan cuoi: ", syncStore.lastSyncTime);

syncStore.sync();
// }

⚠️ Potential issue

Restore the conditional sync logic to match the component's intent.

The component name SyncOnFirstLoad suggests it should sync only on the first load, but the current implementation syncs unconditionally on every mount. The commented-out code shows the correct conditional logic that should be active.

Apply this diff to restore the intended first-load-only behavior:

   useEffect(() => {
-    // if (syncStore.lastSyncTime === 0) {
-    //   // If this is the first time syncing, call sync()
-    // alert("[SyncOnFirstLoad] Dong bo hoa du lieu lan dau tien");
-
-    console.log("[SyncOnFirstLoad] Dong bo hoa du lieu lan dau tien");
-
-    console.log("Thoi gian dong bo lan cuoi: ", syncStore.lastSyncTime);
-
-    syncStore.sync();
-    // }
+    if (syncStore.lastSyncTime === 0) {
+      console.log("[SyncOnFirstLoad] First-time data synchronization");
+      console.log("Last sync time: ", syncStore.lastSyncTime);
+      syncStore.sync();
+    }
   }, [syncStore]);
πŸ“ Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
// if (syncStore.lastSyncTime === 0) {
// // If this is the first time syncing, call sync()
// alert("[SyncOnFirstLoad] Dong bo hoa du lieu lan dau tien");
console.log("[SyncOnFirstLoad] Dong bo hoa du lieu lan dau tien");
console.log("Thoi gian dong bo lan cuoi: ", syncStore.lastSyncTime);
syncStore.sync();
// }
useEffect(() => {
- // if (syncStore.lastSyncTime === 0) {
- // // If this is the first time syncing, call sync()
- // alert("[SyncOnFirstLoad] Dong bo hoa du lieu lan dau tien");
-
- console.log("[SyncOnFirstLoad] Dong bo hoa du lieu lan dau tien");
-
- console.log("Thoi gian dong bo lan cuoi: ", syncStore.lastSyncTime);
-
- syncStore.sync();
- // }
+ if (syncStore.lastSyncTime === 0) {
+ console.log("[SyncOnFirstLoad] First-time data synchronization");
+ console.log("Last sync time: ", syncStore.lastSyncTime);
+ syncStore.sync();
+ }
}, [syncStore]);
πŸ€– Prompt for AI Agents
In app/SyncOnFirstLoad.tsx around lines 9 to 18, the code currently calls
syncStore.sync() unconditionally on every mount, which contradicts the
component's intent to sync only on the first load. Restore the original
conditional check by uncommenting the if statement that verifies if
syncStore.lastSyncTime is zero, and move the console logs and syncStore.sync()
call inside this if block to ensure syncing happens only on the first load.
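
For reference, a minimal sketch of the component with the first-load guard restored is shown below. This is a sketch, not the committed code: it assumes the store's lastSyncTime and sync() behave as used in the PR and keeps the empty dependency array from the committed version.

"use client";
import { useEffect } from "react";
import { useSyncStore } from "./store/sync";

export default function SyncOnFirstLoad() {
  const syncStore = useSyncStore();

  useEffect(() => {
    // Sync only when no previous sync has been recorded (first load).
    if (syncStore.lastSyncTime === 0) {
      console.log("[SyncOnFirstLoad] first-time data sync");
      syncStore.sync();
    }
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, []);

  return null;
}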

}, []);

return null;
}
4 changes: 4 additions & 0 deletions app/api/[provider]/[...path]/route.ts
@@ -20,7 +20,11 @@ async function handle(
req: NextRequest,
{ params }: { params: { provider: string; path: string[] } },
) {
// Handle OPTIONS request for CORS preflight
// params.provider = MODEL_PROVIDER;

const apiPath = `/api/${params.provider}`;

console.log(`[${params.provider} Route] params `, params);
switch (apiPath) {
case ApiPath.Azure:
104 changes: 76 additions & 28 deletions app/api/alibaba.ts
@@ -1,22 +1,16 @@
import { getServerSideConfig } from "@/app/config/server";
import {
ALIBABA_BASE_URL,
ApiPath,
ModelProvider,
ServiceProvider,
} from "@/app/constant";
import { ALIBABA_BASE_URL, ApiPath, ModelProvider } from "@/app/constant";
import { prettyObject } from "@/app/utils/format";
import { NextRequest, NextResponse } from "next/server";
import { auth } from "@/app/api/auth";
import { isModelNotavailableInServer } from "@/app/utils/model";

const serverConfig = getServerSideConfig();

export async function handle(
req: NextRequest,
{ params }: { params: { path: string[] } },
) {
console.log("[Alibaba Route] params ", params);
// console.log("[Alibaba Route] params ", params);

πŸ› οΈ Refactor suggestion

Remove commented-out console.log statements instead of leaving them in place.

Commented-out debug logs should be deleted entirely to keep the codebase clean, rather than left behind as dead code.

-  // console.log("[Alibaba Route] params ", params);
+
-    // console.error("[Alibaba] ", e);
+
-  // console.log("[Alibaba] fetchUrl", fetchUrl);
+
-  // console.log("[Proxy] Alibaba options: ", fetchOptions);
+
-        // console.log("[Alibaba] custom models", current_model);
+
-        // console.log("[Alibaba] request body json", jsonBody);
+
-      // console.log("[Alibaba] request body", fetchOptions.body);
+
-      // console.error(`[Alibaba] filter`, e);
+

Also applies to: 30-30, 65-65, 81-81, 103-103, 131-131, 138-138, 159-159

πŸ€– Prompt for AI Agents
In app/api/alibaba.ts at lines 13, 30, 65, 81, 103, 131, 138, and 159, remove
all commented-out console.log statements entirely instead of leaving them
commented out. This will clean up the codebase by eliminating unnecessary
commented debug logs.


if (req.method === "OPTIONS") {
return NextResponse.json({ body: "OK" }, { status: 200 });
@@ -42,7 +36,9 @@ async function request(req: NextRequest) {
const controller = new AbortController();

// alibaba use base url or just remove the path
let path = `${req.nextUrl.pathname}`.replaceAll(ApiPath.Alibaba, "");
let path = `${req.nextUrl.pathname}`
.replaceAll(ApiPath.Alibaba, "")
.replace("/api", "");

let baseUrl = serverConfig.alibabaUrl || ALIBABA_BASE_URL;

@@ -65,6 +61,9 @@ async function request(req: NextRequest) {
);

const fetchUrl = `${baseUrl}${path}`;

console.log("[Alibaba] fetchUrl", fetchUrl);

const fetchOptions: RequestInit = {
headers: {
"Content-Type": "application/json",
@@ -83,28 +82,77 @@ async function request(req: NextRequest) {
if (serverConfig.customModels && req.body) {
try {
const clonedBody = await req.text();
fetchOptions.body = clonedBody;
let jsonBody: any = {};

try {
jsonBody = JSON.parse(clonedBody);

// Move input.messages to messages at the root level if present
if (jsonBody.input && Array.isArray(jsonBody.input.messages)) {
jsonBody.messages = jsonBody.input.messages;

// Remove input.messages to avoid duplication
delete jsonBody.input;

jsonBody.stream = true;
}
Comment on lines +87 to +100

⚠️ Potential issue

Fix performance issue with delete operator.

The delete operator can impact performance when used on objects.

Replace the delete operation with object destructuring:

-        // Move input.messages to messages at the root level if present
-        if (jsonBody.input && Array.isArray(jsonBody.input.messages)) {
-          jsonBody.messages = jsonBody.input.messages;
-
-          // Remove input.messages to avoid duplication
-          delete jsonBody.input;
-
-          jsonBody.stream = true;
-        }
+        // Move input.messages to messages at the root level if present
+        if (jsonBody.input && Array.isArray(jsonBody.input.messages)) {
+          const { input, ...bodyWithoutInput } = jsonBody;
+          jsonBody = {
+            ...bodyWithoutInput,
+            messages: input.messages,
+            stream: true
+          };
+        }
πŸ“ Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
let jsonBody: any = {};
try {
jsonBody = JSON.parse(clonedBody);
// Move input.messages to messages at the root level if present
if (jsonBody.input && Array.isArray(jsonBody.input.messages)) {
jsonBody.messages = jsonBody.input.messages;
// Remove input.messages to avoid duplication
delete jsonBody.input;
jsonBody.stream = true;
}
let jsonBody: any = {};
try {
jsonBody = JSON.parse(clonedBody);
// Move input.messages to messages at the root level if present
if (jsonBody.input && Array.isArray(jsonBody.input.messages)) {
const { input, ...bodyWithoutInput } = jsonBody;
jsonBody = {
...bodyWithoutInput,
messages: input.messages,
stream: true
};
}
🧰 Tools
πŸͺ› Biome (1.9.4)

[error] 95-95: Avoid the delete operator which can impact performance.

Unsafe fix: Use an undefined assignment instead.

(lint/performance/noDelete)

πŸ€– Prompt for AI Agents
In app/api/alibaba.ts between lines 85 and 98, the code uses the delete operator
to remove the input property from jsonBody, which can cause performance issues.
To fix this, replace the delete operation by using object destructuring to
create a new jsonBody object without the input property, ensuring input.messages
is moved to the root messages property and stream is set to true without
mutating the original object with delete.


const current_model = jsonBody?.model;
console.log("[Alibaba] custom models", current_model);

//kiem tra xem model co phai la qwen-vl hay khong (vision model)
if (current_model && current_model.startsWith("qwen-vl")) {
console.log("[Alibaba] current model is qwen-vl");
console.log("xu ly hinh anh trong message");

// Reformat image objects in messages
if (Array.isArray(jsonBody.messages)) {
jsonBody.messages = jsonBody.messages.map((msg: any) => {
if (Array.isArray(msg.content)) {
msg.content = msg.content.map((item: any) => {
if (item && typeof item === "object" && "image" in item) {
return {
type: "image_url",
image_url: {
url: item.image,
},
};
}
return item;
});
}
return msg;
});
}
}
Comment on lines +105 to +129

πŸ› οΈ Refactor suggestion

Refactor vision model processing and use English comments.

The vision model processing logic has several maintainability issues:

  1. Comments should be in English for better team collaboration
  2. The complex nested transformation logic should be extracted to a helper function

Consider refactoring like this:

-        //kiem tra xem model co phai la qwen-vl hay khong (vision model)
-        if (current_model && current_model.startsWith("qwen-vl")) {
-          console.log("[Alibaba] current model is qwen-vl");
-          console.log("xu ly hinh anh trong message");
-
-          // Reformat image objects in messages
-          if (Array.isArray(jsonBody.messages)) {
-            jsonBody.messages = jsonBody.messages.map((msg: any) => {
-              if (Array.isArray(msg.content)) {
-                msg.content = msg.content.map((item: any) => {
-                  if (item && typeof item === "object" && "image" in item) {
-                    return {
-                      type: "image_url",
-                      image_url: {
-                        url: item.image,
-                      },
-                    };
-                  }
-                  return item;
-                });
-              }
-              return msg;
-            });
-          }
-        }
+        // Check if model is a qwen-vl vision model
+        if (current_model?.startsWith("qwen-vl")) {
+          console.log("[Alibaba] Processing vision model:", current_model);
+          jsonBody.messages = transformVisionMessages(jsonBody.messages);
+        }

Add this helper function:

function transformVisionMessages(messages: any[]): any[] {
  if (!Array.isArray(messages)) return messages;
  
  return messages.map((msg) => {
    if (!Array.isArray(msg.content)) return msg;
    
    return {
      ...msg,
      content: msg.content.map((item) => {
        if (item && typeof item === "object" && "image" in item) {
          return {
            type: "image_url",
            image_url: { url: item.image },
          };
        }
        return item;
      }),
    };
  });
}
🧰 Tools
πŸͺ› Biome (1.9.4)

[error] 106-106: Change to an optional chain.

Unsafe fix: Change to an optional chain.

(lint/complexity/useOptionalChain)

πŸ€– Prompt for AI Agents
In app/api/alibaba.ts between lines 105 and 129, refactor the vision model
processing by replacing the Vietnamese comments with English ones for clarity.
Extract the nested message transformation logic into a separate helper function
named transformVisionMessages that takes messages as input and returns the
transformed messages. Then, replace the inline transformation code with a call
to this helper function to improve maintainability and readability.


// console.log("[Alibaba] request body json", jsonBody);

fetchOptions.body = JSON.stringify(jsonBody);
} catch (e) {
fetchOptions.body = clonedBody; // fallback if not JSON
}

const jsonBody = JSON.parse(clonedBody) as { model?: string };
// console.log("[Alibaba] request body", fetchOptions.body);

// not undefined and is false
if (
isModelNotavailableInServer(
serverConfig.customModels,
jsonBody?.model as string,
ServiceProvider.Alibaba as string,
)
) {
return NextResponse.json(
{
error: true,
message: `you are not allowed to use ${jsonBody?.model} model`,
},
{
status: 403,
},
);
}
// if (
// isModelNotavailableInServer(
// serverConfig.customModels,
// jsonBody?.model as string,
// ServiceProvider.Alibaba as string,
// )
// ) {
// return NextResponse.json(
// {
// error: true,
// message: `you are not allowed to use ${jsonBody?.model} model`,
// },
// {
// status: 403,
// },
// );
// }
Comment on lines +141 to +157

πŸ› οΈ Refactor suggestion

Remove the commented-out model availability check.

This large block of commented code should either be removed entirely if no longer needed, or uncommented and fixed if the functionality is still required.

If this functionality is no longer needed, remove the commented code:

-      // not undefined and is false
-      // if (
-      //   isModelNotavailableInServer(
-      //     serverConfig.customModels,
-      //     jsonBody?.model as string,
-      //     ServiceProvider.Alibaba as string,
-      //   )
-      // ) {
-      //   return NextResponse.json(
-      //     {
-      //       error: true,
-      //       message: `you are not allowed to use ${jsonBody?.model} model`,
-      //     },
-      //     {
-      //       status: 403,
-      //     },
-      //   );
-      // }

If this functionality should be retained, please uncomment and ensure the imports are available.

Committable suggestion skipped: line range outside the PR's diff.

πŸ€– Prompt for AI Agents
In app/api/alibaba.ts between lines 93 and 109, there is a large block of
commented-out code checking model availability. Determine if this model
availability check is still required; if not, remove the entire commented block
to clean up the code. If it is needed, uncomment the code and verify that all
necessary imports, such as isModelNotavailableInServer and ServiceProvider, are
correctly included and the logic works as intended.
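
If the guard is to be kept, a minimal sketch of the re-enabled check is shown below. It assumes the existing isModelNotavailableInServer import from "@/app/utils/model" and a restored ServiceProvider import from "@/app/constant"; adapt it to the surrounding try/catch as needed.

// Reject models that the server-side customModels config does not allow.
if (
  isModelNotavailableInServer(
    serverConfig.customModels,
    jsonBody?.model as string,
    ServiceProvider.Alibaba as string,
  )
) {
  return NextResponse.json(
    {
      error: true,
      message: `you are not allowed to use ${jsonBody?.model} model`,
    },
    { status: 403 },
  );
}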

} catch (e) {
console.error(`[Alibaba] filter`, e);
}
55 changes: 35 additions & 20 deletions app/api/common.ts
@@ -6,14 +6,20 @@ import { getModelProvider, isModelNotavailableInServer } from "../utils/model";

const serverConfig = getServerSideConfig();

// Proxies a client request to OpenAI or Azure OpenAI: handles auth, configures the endpoint, validates the model, and returns the appropriate response.
export async function requestOpenai(req: NextRequest) {
// Create a controller so the request can be aborted on timeout
const controller = new AbortController();

// Check whether the request targets Azure OpenAI
const isAzure = req.nextUrl.pathname.includes("azure/deployments");

// Variables for the auth value and the auth header name
var authValue,
authHeaderName = "";

if (isAzure) {
// For Azure, take the api-key from the Authorization header
authValue =
req.headers
.get("Authorization")
@@ -23,33 +29,43 @@ export async function requestOpenai(req: NextRequest) {

authHeaderName = "api-key";
} else {
// For plain OpenAI, keep the Authorization header unchanged
authValue = req.headers.get("Authorization") ?? "";
authHeaderName = "Authorization";
}

// Rewrite the endpoint path to match OpenAI/Azure
let path = `${req.nextUrl.pathname}`.replaceAll("/api/openai/", "");

console.log("[Proxy] mac dinh ", path);


πŸ› οΈ Refactor suggestion

Remove or conditionally enable debug logs.

Debug console.log statements should not be in production code. Consider using a debug flag or removing them entirely.

-  console.log("[Proxy] mac dinh ", path);
+  if (process.env.DEBUG) {
+    console.log("[Proxy] default ", path);
+  }

-  console.log("fetchUrl", fetchUrl);
+  if (process.env.DEBUG) {
+    console.log("fetchUrl", fetchUrl);
+  }

Also applies to: 108-108

πŸ€– Prompt for AI Agents
In app/api/common.ts at lines 40-41 and line 108, the console.log debug
statements should be removed or wrapped in a conditional check using a debug
flag. Implement a mechanism such as an environment variable or a configuration
setting to enable these logs only in development or debugging mode, and ensure
they are disabled in production builds.
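
One possible shape for that gate is sketched below; the DEBUG environment variable name is an assumption, not an existing project convention.

// Hypothetical helper: emit debug logs only when the DEBUG env flag is set.
function debugLog(...args: unknown[]) {
  if (process.env.DEBUG) {
    console.log(...args);
  }
}

// Usage at the call sites flagged above:
// debugLog("[Proxy] default ", path);
// debugLog("fetchUrl", fetchUrl);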

// Get baseUrl from config, preferring the Azure URL for Azure requests
let baseUrl =
(isAzure ? serverConfig.azureUrl : serverConfig.baseUrl) || OPENAI_BASE_URL;

// Ensure baseUrl has an https prefix
if (!baseUrl.startsWith("http")) {
baseUrl = `https://${baseUrl}`;
}

// Strip a trailing "/" from baseUrl if present
if (baseUrl.endsWith("/")) {
baseUrl = baseUrl.slice(0, -1);
}

// Log the path and baseUrl for debugging
console.log("[Proxy] ", path);
console.log("[Base Url]", baseUrl);

// Set a 10-minute timeout for the request; abort when exceeded
const timeoutId = setTimeout(
() => {
controller.abort();
},
10 * 60 * 1000,
);

// For Azure, rewrite the path and api-version to match the Azure format
if (isAzure) {
const azureApiVersion =
req?.nextUrl?.searchParams?.get("api-version") ||
@@ -60,9 +76,7 @@ export async function requestOpenai(req: NextRequest) {
"",
)}?api-version=${azureApiVersion}`;

// Forward compatibility:
// if display_name(deployment_name) not set, and '{deploy-id}' in AZURE_URL
// then using default '{deploy-id}'
// If customModels and azureUrl are set, check and replace the deployment id when needed
if (serverConfig.customModels && serverConfig.azureUrl) {
const modelName = path.split("/")[1];
let realDeployName = "";
@@ -88,8 +102,12 @@ export async function requestOpenai(req: NextRequest) {
}
}

// Build the final fetch URL, optionally routed through the Cloudflare AI Gateway if configured
const fetchUrl = cloudflareAIGatewayUrl(`${baseUrl}/${path}`);

console.log("fetchUrl", fetchUrl);

// Set up the fetch options: headers, method, body, etc.
const fetchOptions: RequestInit = {
headers: {
"Content-Type": "application/json",
@@ -101,30 +119,30 @@ export async function requestOpenai(req: NextRequest) {
},
method: req.method,
body: req.body,
// to fix #2485: https://stackoverflow.com/questions/55920957/cloudflare-worker-typeerror-one-time-use-body
// Fix for the one-time-use body issue on Cloudflare Workers
redirect: "manual",
// @ts-ignore
duplex: "half",
signal: controller.signal,
};

// #1815 try to refuse gpt4 request
// Check whether the model is blocked from use (e.g. refusing GPT-4)
if (serverConfig.customModels && req.body) {
try {
const clonedBody = await req.text();
fetchOptions.body = clonedBody;

const jsonBody = JSON.parse(clonedBody) as { model?: string };

// not undefined and is false
// If the model is not allowed, return a 403 error
if (
isModelNotavailableInServer(
serverConfig.customModels,
jsonBody?.model as string,
[
ServiceProvider.OpenAI,
ServiceProvider.Azure,
jsonBody?.model as string, // support provider-unspecified model
jsonBody?.model as string, // support models without an explicit provider
],
)
) {
@@ -144,43 +162,40 @@ export async function requestOpenai(req: NextRequest) {
}

try {
// Send the request to OpenAI/Azure and receive the response
const res = await fetch(fetchUrl, fetchOptions);

// Extract the OpenAI-Organization header from the response
// Read the OpenAI-Organization header from the response (if present)
const openaiOrganizationHeader = res.headers.get("OpenAI-Organization");

// Check if serverConfig.openaiOrgId is defined and not an empty string
// If openaiOrgId is configured, log the value of this header
if (serverConfig.openaiOrgId && serverConfig.openaiOrgId.trim() !== "") {
// If openaiOrganizationHeader is present, log it; otherwise, log that the header is not present
console.log("[Org ID]", openaiOrganizationHeader);
} else {
console.log("[Org ID] is not set up.");
}

// to prevent browser prompt for credentials
// Xử lý lẑi headers trả về cho client
const newHeaders = new Headers(res.headers);
newHeaders.delete("www-authenticate");
// to disable nginx buffering
newHeaders.set("X-Accel-Buffering", "no");
newHeaders.delete("www-authenticate"); // Remove the header that triggers a credentials prompt
newHeaders.set("X-Accel-Buffering", "no"); // Disable nginx buffering

// Conditionally delete the OpenAI-Organization header from the response if [Org ID] is undefined or empty (not setup in ENV)
// Also, this is to prevent the header from being sent to the client
// If no Org ID is configured, remove this header from the response
if (!serverConfig.openaiOrgId || serverConfig.openaiOrgId.trim() === "") {
newHeaders.delete("OpenAI-Organization");
}

// The latest version of the OpenAI API forced the content-encoding to be "br" in json response
// So if the streaming is disabled, we need to remove the content-encoding header
// Because Vercel uses gzip to compress the response, if we don't remove the content-encoding header
// The browser will try to decode the response with brotli and fail
// Remove the content-encoding header to avoid decoding errors in the browser
newHeaders.delete("content-encoding");

// Return the response to the client with the adjusted headers
return new Response(res.body, {
status: res.status,
statusText: res.statusText,
headers: newHeaders,
});
} finally {
// Clear the timeout whether the request succeeded or failed
clearTimeout(timeoutId);
}
}
2 changes: 2 additions & 0 deletions app/api/openai.ts
@@ -59,6 +59,8 @@ export async function handle(
}

try {
console.log("mac dinh su dung OpenAI API");
console.log("[OpenAI Route] ", subpath);
const response = await requestOpenai(req);

// list models
7 changes: 7 additions & 0 deletions app/chebichatConstant.ts
@@ -0,0 +1,7 @@
export const ALIBABA_BASE_URL = "https://dashscope-intl.aliyuncs.com";
export const ALIBABA_PATH = `compatible-mode/v1/chat/completions`;
// The key used to store the last chat ID in local storage
export const UPSTASH_ENDPOINT = "https://fine-baboon-52580.upstash.io";
export const UPSTASH_APIKEY =
"Ac1kAAIjcDE2YjM4YmY3OGI4YzA0MTU2YjZhNmQyNzc5Yzc3NzEwYnAxMA";
Comment on lines +5 to +6

⚠️ Potential issue

Critical: Move API credentials to environment variables.

The Upstash API key is exposed in plain text, creating a significant security risk. This credential should be stored in environment variables instead.

-export const UPSTASH_APIKEY =
-  "Ac1kAAIjcDE2YjM4YmY3OGI4YzA0MTU2YjZhNmQyNzc5Yzc3NzEwYnAxMA";
+export const UPSTASH_APIKEY = process.env.UPSTASH_APIKEY || "";

Ensure the API key is added to your environment configuration and removed from version control.
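
As a sketch of the environment-based approach (the variable names are assumptions; adjust them to your configuration conventions), the constants file could read the values at startup and warn loudly when the key is missing:

// Sketch: read Upstash credentials from the environment instead of hardcoding them.
export const UPSTASH_ENDPOINT =
  process.env.UPSTASH_ENDPOINT ?? "https://fine-baboon-52580.upstash.io";
export const UPSTASH_APIKEY = process.env.UPSTASH_APIKEY ?? "";

if (!UPSTASH_APIKEY) {
  console.warn("[Config] UPSTASH_APIKEY is not set; Upstash sync will not work.");
}

The corresponding values would then live in .env.local or the deployment's secret store, excluded from version control.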

πŸ“ Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
export const UPSTASH_APIKEY =
"Ac1kAAIjcDE2YjM4YmY3OGI4YzA0MTU2YjZhNmQyNzc5Yzc3NzEwYnAxMA";
export const UPSTASH_APIKEY = process.env.UPSTASH_APIKEY || "";
🧰 Tools
πŸͺ› Gitleaks (8.26.0)

5-6: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.

(generic-api-key)

πŸ€– Prompt for AI Agents
In app/chebichatConstant.ts around lines 5 to 6, the Upstash API key is
hardcoded as a plain text string, which is a security risk. Remove the hardcoded
API key and instead access it via process.env by referencing an environment
variable (e.g., process.env.UPSTASH_APIKEY). Make sure to add this key to your
environment configuration files and exclude any files containing secrets from
version control.

export const STORAGE_KEY = "chebichat-backup";