Feature: Add support for ES modules #406
Thanks @michael-hhai!
Yes. Correct.
Yes, well known. That's why we have this workaround: it's indeed more of an inherent problem with Node.js, but hopefully it will be fixed.
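For context, the workaround referenced here is passing the already-imported modules to the SDK through `instrumentModules`, so no import hooking is needed. A minimal sketch of that pattern (the option name and shape match the reproduction later in this thread; the entry point and env variable are illustrative):

```typescript
// Explicit instrumentation: the app imports `openai` itself and hands the
// module object to the SDK, which can then patch it directly even under ESM.
import * as traceloop from "@traceloop/node-server-sdk";
import OpenAI from "openai";

traceloop.initialize({
  apiKey: process.env.TRACELOOP_API_KEY,
  disableBatch: true,
  instrumentModules: {
    openAI: OpenAI, // the imported module object, not a string
  },
});
```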
For what it's worth, I do not have a minimal reproduction immediately handy, but I don't think the above-linked workaround works for ES modules either.
@michael-hhai I'm 90% sure it works (wanted to say 99%, but maybe I need to be more modest).
I have the following reproduction of the issue with ES modules that I've tried to make as minimal as I can:

1. Create a `tracer-module` package.

a. Create `index.ts`:

```typescript
import { OpenAI } from 'openai';
import * as traceloop from "@traceloop/node-server-sdk";

class Tracer {
  public init(): void {
    traceloop.initialize({
      baseUrl: "http://example-url-does-not-exist.com/opentelemetry",
      apiKey: "FAKE-API-KEY",
      disableBatch: true,
      instrumentModules: {
        openAI: OpenAI,
      },
    });
  }

  public trace(fn: () => void): void {
    traceloop.withAssociationProperties(
      {
        thing: "thing",
      },
      fn,
    );
  }
}

export const tracer = new Tracer();
```

b. Create `package.json`:
"name": "tracer-module",
"version": "1.0.0",
"main": "dist/index.js",
"type": "module",
"exports": {
".": "./dist/index.js"
},
"scripts": {
"build": "tsc"
},
"dependencies": {
"@anthropic-ai/sdk": "^0.26.1",
"@aws-sdk/client-bedrock-runtime": "^3.632.0",
"@azure/openai": "^2.0.0-beta.1",
"@google-cloud/aiplatform": "^3.26.0",
"@google-cloud/vertexai": "^1.4.1",
"@pinecone-database/pinecone": "^3.0.0",
"@qdrant/js-client-rest": "^1.11.0",
"@traceloop/node-server-sdk": "^0.10.0",
"chromadb": "^1.8.1",
"cohere-ai": "^7.12.0",
"langchain": "^0.2.16",
"llamaindex": "^0.5.17",
"openai": "^4.56.0"
}
} c. Create {
"compilerOptions": {
"target": "ES6",
"sourceMap": true,
"module": "ESNext",
"strict": true,
"esModuleInterop": true,
"moduleResolution": "node",
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"outDir": "./dist",
"typeRoots": ["./node_modules/@types", "./types"]
},
"include": ["./**/*.ts", "./custom.d.ts"],
"exclude": ["node_modules"]
}
2. Create a consumer package that depends on `tracer-module`.

a. Create `src/index.ts`:

```typescript
import { BatchInterceptor } from '@mswjs/interceptors'
import { ClientRequestInterceptor } from '@mswjs/interceptors/ClientRequest'
import { XMLHttpRequestInterceptor } from '@mswjs/interceptors/XMLHttpRequest'

// Log every outgoing HTTP request so we can see whether any telemetry
// (or OpenAI traffic) actually leaves the process.
const interceptor = new BatchInterceptor({
  name: 'my-interceptor',
  interceptors: [
    new ClientRequestInterceptor(),
    new XMLHttpRequestInterceptor(),
  ],
})
interceptor.apply()
interceptor.on('request', ({ request, requestId, controller }) => {
  console.log(request.method, request.url)
})

// Initialize tracing before importing OpenAI. Note that under ES modules,
// import statements are hoisted, so `openai` is in fact loaded before
// init() runs; when compiled to CommonJS, the require() is emitted in
// place, after init().
import { tracer } from 'tracer-module';
tracer.init();

import OpenAI from 'openai';

const helloWorld = (): string => {
  return "Hello, World!";
};

const main = async () => {
  const resolvedTracer = await tracer; // Await the tracer if it is a promise
  await resolvedTracer.trace(async () => {
    // Example call to OpenAI
    const openai = new OpenAI({
      apiKey: process.env.OPENAI_API_KEY,
    });
    try {
      const response = await openai.chat.completions.create({
        model: "gpt-4o",
        messages: [{ role: 'user', content: "Say Hello, World!" }],
        max_tokens: 5,
      });
      console.log(response.choices[0]?.message?.content);
    } catch (error) {
      console.error("Error calling OpenAI API:", error);
    }
    // Original helloWorld function call
    console.log(helloWorld());
  });
};

main().catch((error) => console.error(error));
```

b. Create `tsconfig.json`:

```json
{
  "compilerOptions": {
    "target": "es2020",
    "sourceMap": true,
    "module": "commonjs",
    "strict": true,
    "esModuleInterop": true,
    "moduleResolution": "node",
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "outDir": "./dist",
    "typeRoots": ["./node_modules/@types", "./types"]
  },
  "include": ["./**/*.ts", "./custom.d.ts"],
  "exclude": ["node_modules"]
}
```

c. Create `package.json`:
```json
{
  "devDependencies": {
    "tsx": "^4.16.5"
  },
  "dependencies": {
    "@mswjs/interceptors": "^0.34.0",
    "@opentelemetry/instrumentation": "^0.52.1",
    "@traceloop/node-server-sdk": "^0.10.0",
    "honeyhive": "^0.6.4",
    "node-request-interceptor": "^0.6.3",
    "openai": "^4.54.0",
    "tracer-module": "file:../tracer-module"
  }
}
```

Note that when this runs, the interceptor shows no call to the configured telemetry endpoint (`http://example-url-does-not-exist.com/opentelemetry`).
Note that this does make a call to the OpenAI API. An interesting thing here is that this behavior is dependent on the module settings the consumer is compiled with.
Thanks! I think it's related to openai/openai-node#903. A possible workaround can be a small bootstrap file that sets up the instrumentation first; then, when running node, you'd import that file ahead of the application.
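For illustration, a sketch of what that bootstrap could look like, assuming the usual OpenTelemetry ESM loader-hook setup (the file name, the `register()` call, and the `--import` flag are assumptions, not the exact snippet from this comment):

```js
// instrumentation.mjs -- hypothetical bootstrap file
import { register } from "node:module";
import * as traceloop from "@traceloop/node-server-sdk";

// Register the OpenTelemetry ESM loader hook (import-in-the-middle) so that
// modules imported after this point can be patched. Requires Node >= 20.6.
register("@opentelemetry/instrumentation/hook.mjs", import.meta.url);

// Initialize tracing before any application module is loaded.
traceloop.initialize({ disableBatch: true });
```

which would then be loaded ahead of the app, e.g. `node --import ./instrumentation.mjs ./dist/index.js`.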
I've gone down that rabbit hole and I can't really get anything like that to work. Do you know what exactly needs to happen in order for that to work?
It is indeed a pain to get this to work, but it's definitely possible (I've done it). A couple of notes:
Would you mind spelling out what exactly all of that entails in terms of the minimal reproduction posted above? I'm not doing any sort of bundling there, so at least that's not an issue.
@ericallam Any help with the above? I still have no luck converting the minimal reproduction above into something that works (i.e., actually calls out to the telemetry endpoint).
It seems like OpenLLMetry auto-instrumentation doesn't work with ES modules. For what it's worth, this is chiefly an upstream problem with opentelemetry-js (see also open-telemetry/opentelemetry-js#4845). Just making a record of it here as well.
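To make the failure mode concrete, here is a minimal sketch of the pattern that breaks (illustrative, not from the thread): under CommonJS, `require("openai")` happens at runtime and can be intercepted by a require hook, but under ES modules both imports below are resolved and evaluated before `initialize()` ever runs, so there is nothing for the SDK to hook unless a loader is registered or the module is passed via `instrumentModules`.

```typescript
// ESM: both imports are hoisted and evaluated before any statement runs,
// so by the time initialize() executes, `openai` is already loaded and a
// require() hook (which never fires under ESM) cannot patch it.
import * as traceloop from "@traceloop/node-server-sdk";
import OpenAI from "openai";

traceloop.initialize({ apiKey: "FAKE-API-KEY", disableBatch: true });

// This client is constructed from the unpatched module: no spans are produced.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
```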