
Executing mockttp remotes from a Cypress test suite broken since mockttp@2.0.0 #188

Open
cefn opened this issue Mar 7, 2025 · 4 comments


cefn commented Mar 7, 2025

Since mockttp@2.0.0 the package cannot successfully load and execute getRemote() in any Cypress test out of the box. The error seems to arise from the (incorrect?) export configuration of brotli-wasm, which became a dependency in 2.0.0.

This is a real shame as the mockttp design is exactly what's needed to work around the limitation that Cypress tests all execute within a browser context, so if you want to control mocks, mockttp remoting is a perfect solution.

Is it still possible to run mockttp in a browser with a typical bundling process (I assume that Cypress has a mainstream bundling process that we can consider canonical)?

Is there any obvious workaround to prevent brotli-wasm bundling issues being fatal for all Cypress tests?

Repro

I started to investigate by forking the original cypress+mockttp working reference from @mvasin from 2020, linked at #35 (comment)

You can see my fork at https://github.com/cefn/cypress-mockttp/tree/283ce5e2facf0c166907c4abb0ca7cd9df7bf072

In my fork I upgraded all dependencies except mockttp to get it functioning in modern Chrome and Cypress, then upgraded to the latest mockttp 1.x release, which remained functional.

However, upgrading to either mockttp@2.0.0 or mockttp@latest failed. I know API changes are also needed to align with the latest mockttp, but this failure is profound and immediate: the newer mockttp includes elements which can't be bundled for the browser at all, so we never get that far. Aligning with the new API makes no difference.

Instructions

You can prove the repro runs correctly in 1.x.x by running...

yarn
yarn cy

...then clicking on the E2E Testing button that appears, clicking Start E2E Testing in Chrome on the next screen, and choosing the example dummy.spec.ts in the eventually-loaded Chrome instance. It correctly shows expected this is a mocked response to equal this is a mocked response.

(screenshot of the passing test omitted)

After proving that it's functional with mockttp 1.x, you can recreate the error within the project by running...

yarn add mockttp@2.0.0; DEBUG=cypress:* yarn cy

...or...

yarn add mockttp@latest; DEBUG=cypress:* yarn cy

The failure presents in the UI like this...

(screenshot of the failing test omitted)

Looking at the debug logs, there is more detail about the module load failure, which points to loading brotli-wasm as the cause.

[
  {
    moduleIdentifier:
      "/Users/myuser/workspace/github/cypress-mockttp/node_modules/brotli-wasm/pkg.bundler/brotli_wasm_bg.wasm",
    moduleName: "./node_modules/brotli-wasm/pkg.bundler/brotli_wasm_bg.wasm",
    loc: "1:0",
    message:
      "Module parse failed: Unexpected character '\x00' (1:0)\n" +
      "The module seem to be a WebAssembly module, but module is not flagged as WebAssembly module for webpack.\n" +
      "BREAKING CHANGE: Since webpack 5 WebAssembly is not enabled by default and flagged as experimental feature.\n" +
      "You need to enable one of the WebAssembly experiments via 'experiments.asyncWebAssembly: true' (based on async modules) or 'experiments.syncWebAssembly: true' (like webpack 4, deprecated).\n" +
      `For files that transpile to WebAssembly, make sure to set the module type in the 'module.rules' section of the config (e. g. 'type: "webassembly/async"').\n` +
      "(Source code omitted for this binary file)",
    moduleId: "./node_modules/brotli-wasm/pkg.bundler/brotli_wasm_bg.wasm",
    moduleTrace: [[Object], [Object], [Object], [Object], [Object], [Object]],
    details: undefined,
    stack:
      "ModuleParseError: Module parse failed: Unexpected character '\x00' (1:0)\n" +
      "The module seem to be a WebAssembly module, but module is not flagged as WebAssembly module for webpack.\n" +
      "BREAKING CHANGE: Since webpack 5 WebAssembly is not enabled by default and flagged as experimental feature.\n" +
      "You need to enable one of the WebAssembly experiments via 'experiments.asyncWebAssembly: true' (based on async modules) or 'experiments.syncWebAssembly: true' (like webpack 4, deprecated).\n" +
      `For files that transpile to WebAssembly, make sure to set the module type in the 'module.rules' section of the config (e. g. 'type: "webassembly/async"').\n` +
      "(Source code omitted for this binary file)\n" +
      "    at handleParseError (/Users/myuser/Library/Caches/Cypress/14.1.0/Cypress.app/Contents/Resources/app/node_modules/webpack/lib/NormalModule.js:982:19)\n" +
      "    at /Users/myuser/Library/Caches/Cypress/14.1.0/Cypress.app/Contents/Resources/app/node_modules/webpack/lib/NormalModule.js:1101:5\n" +
      "    at processResult (/Users/myuser/Library/Caches/Cypress/14.1.0/Cypress.app/Contents/Resources/app/node_modules/webpack/lib/NormalModule.js:806:11)\n" +
      "    at /Users/myuser/Library/Caches/Cypress/14.1.0/Cypress.app/Contents/Resources/app/node_modules/webpack/lib/NormalModule.js:866:5\n" +
      "    at /Users/myuser/Library/Caches/Cypress/14.1.0/Cypress.app/Contents/Resources/app/node_modules/loader-runner/lib/LoaderRunner.js:407:3\n" +
      "    at iterateNormalLoaders (/Users/myuser/Library/Caches/Cypress/14.1.0/Cypress.app/Contents/Resources/app/node_modules/loader-runner/lib/LoaderRunner.js:233:10)\n" +
      "    at /Users/myuser/Library/Caches/Cypress/14.1.0/Cypress.app/Contents/Resources/app/node_modules/loader-runner/lib/LoaderRunner.js:224:4\n" +
      "    at /Users/myuser/Library/Caches/Cypress/14.1.0/Cypress.app/Contents/Resources/app/node_modules/webpack/lib/NormalModule.js:840:15\n" +
      "    at Array.eval (eval at create (/Users/myuser/Library/Caches/Cypress/14.1.0/Cypress.app/Contents/Resources/app/node_modules/tapable/lib/HookCodeFactory.js:33:10), <anonymous>:12:1)\n" +
      "    at runCallbacks (/Users/myuser/Library/Caches/Cypress/14.1.0/Cypress.app/Contents/Resources/app/node_modules/enhanced-resolve/lib/CachedInputFileSystem.js:45:15)\n" +
      "    at /Users/myuser/Library/Caches/Cypress/14.1.0/Cypress.app/Contents/Resources/app/node_modules/enhanced-resolve/lib/CachedInputFileSystem.js:279:5\n" +
      "    at /Users/myuser/Library/Caches/Cypress/14.1.0/Cypress.app/Contents/Resources/app/node_modules/graceful-fs/graceful-fs.js:123:16\n" +
      "    at /Users/myuser/Library/Caches/Cypress/14.1.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/graceful-fs/graceful-fs.js:123:16\n" +
      "    at /Users/myuser/Library/Caches/Cypress/14.1.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/graceful-fs/graceful-fs.js:123:16\n" +
      "    at FSReqCallback.readFileAfterClose [as oncomplete] (node:internal/fs/read/context:68:3)",
  },
];

cefn commented Mar 8, 2025

I was able to hack my way to a running test case, aiming for the minimum intervention, but it's definitely a hack. You can see the changes required at https://github.com/cefn/cypress-mockttp/pull/3/files

Since mockttp by default references modules not suitable for the browser, I succeeded at intercepting those modules at build time, accepting the cost that calls to those modules may fail at runtime, and hoping that only using the getRemote() API will sidestep the problem.

Ideally mockttp would have a browser-oriented build which includes only getRemote and its requirements. Suggestion: create a separate package or a separate export intended for remote use, which people could maybe consume like...

import { getRemote } from "mockttp/remote"
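For illustration, a hypothetical exports entry enabling such an import might look like the following in mockttp's package.json (the "./remote" subpath and the dist file names here are assumptions for the sake of the sketch, not paths that exist today):

```json
{
  "exports": {
    "./remote": {
      "types": "./dist/remote.d.ts",
      "default": "./dist/remote.browser.js"
    }
  }
}
```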

Workaround

This section provides more detail about the workaround at https://github.com/cefn/cypress-mockttp/pull/3/files

I needed to explicitly configure webpack by adding @cypress/webpack-preprocessor, then override the bundle configuration to eliminate the resolution of multiple node-oriented dependencies as below, and to patch in a polyfill for streams.

This enables the single test case to run, but I don't know how broken mockttp will be in general as I use other elements of the getRemote() API surface, if it hasn't actually been designed to run in the browser. I thought it WAS intended to target the browser, which is why I started to integrate mockttp into my project. Was this a mistake? Maybe it no longer targets the browser?

Given all the intercepted resolutions, I take this to mean that the browser-targeted build (or the build that cypress believes is the browser build) is incorrectly composed right now, and contains much of the infrastructure required for the admin server and local server, when by definition these flows are impossible within a browser.

import webpack from "webpack"; // needed for ProvidePlugin below

const webpackOptions = {
  resolve: {
    extensions: [".ts", ".js", ".wasm"],
    fallback: {
      stream: require.resolve("stream-browserify"),
      zlib: false,
      querystring: false,
      path: false,
      fs: false
    }
  },
  module: {
    rules: [
      {
        test: /\.wasm$/,
        type: "asset/resource" // ensures WebAssembly files are handled correctly
      }
    ]
  },
  experiments: {
    asyncWebAssembly: true, // enable WebAssembly
    syncWebAssembly: true // (optional) enable synchronous wasm
  },
  plugins: [
    new webpack.ProvidePlugin({
      Buffer: ["buffer", "Buffer"],
      process: "process/browser"
    })
  ]
} as const;
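For completeness, a sketch (assuming Cypress 10+ config-file style and the @cypress/webpack-preprocessor package; untested outside the linked PR) of how options like these get wired into the Cypress preprocessor:

```typescript
// cypress.config.ts (sketch): hook custom webpack options into Cypress's
// file preprocessor so they apply when spec files are bundled.
import { defineConfig } from "cypress";
import webpackPreprocessor from "@cypress/webpack-preprocessor";

const webpackOptions = { /* ...options as above... */ };

export default defineConfig({
  e2e: {
    setupNodeEvents(on) {
      on("file:preprocessor", webpackPreprocessor({ webpackOptions }));
    },
  },
});
```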

@pimterry
Member

Ideally mockttp would have a browser-oriented build which includes only getRemote and its requirements.

Mockttp already has built-in configuration for a separate browser entrypoint at main.browser.ts; see the configuration in package.json:

mockttp/package.json

Lines 5 to 51 in b19499f

"exports": {
".": {
"types": "./dist/main.d.ts",
"node": "./dist/main.js",
"browser": "./dist/main.browser.js",
"default": "./dist/main.browser.js"
},
"./pluggable-admin": {
"types": "./dist/pluggable-admin-api/pluggable-admin.d.ts",
"node": "./dist/pluggable-admin-api/pluggable-admin.js",
"browser": "./dist/pluggable-admin-api/pluggable-admin.browser.js",
"default": "./dist/pluggable-admin-api/pluggable-admin.browser.js"
},
"./mockttp-pluggable-admin": {
"types": "./dist/pluggable-admin-api/mockttp-pluggable-admin.d.ts",
"node": "./dist/pluggable-admin-api/mockttp-pluggable-admin.js",
"browser": "./dist/pluggable-admin-api/mockttp-pluggable-admin.browser.js",
"default": "./dist/pluggable-admin-api/mockttp-pluggable-admin.browser.js"
},
"./dist/main.js": {
"node": "./dist/main.js",
"browser": "./dist/main.browser.js",
"default": "./dist/main.browser.js"
},
"./dist/*": {
"default": "./dist/*.js"
}
},
"main": "dist/main.js",
"types": "dist/main.d.ts",
"browser": {
"dist/main.js": "./dist/main.browser.js",
"dist/pluggable-admin-api/pluggable-admin.js": "./dist/pluggable-admin-api/pluggable-admin.browser.js",
"dist/pluggable-admin-api/mockttp-pluggable-admin.js": "./dist/pluggable-admin-api/mockttp-pluggable-admin.browser.js",
"dns": false,
"os": false,
"fs": false,
"net": false,
"tls": false,
"http": false,
"https": false,
"http2": false,
"http2-wrapper": false,
"cross-fetch": false,
"cacheable-lookup": false,
"@httptoolkit/proxy-agent": false
},

This entrypoint excludes the vast majority of Node-specific code. There are still a few node built-in modules used, like stream & buffer, but there are standard polyfills available for those (quite a few bundlers will handle that automatically, although not Webpack v5 specifically, which is what you're using here).

Regarding brotli-wasm, the main challenge is that it actually is required in the browser for quite a few modern Mockttp use cases, so it can't be excluded. That module allows you to inspect & mock Brotli-compressed traffic, and Brotli is a popular, widely used compression algorithm nowadays: you need brotli-wasm in the browser whenever you try to read any brotli-encoded body. Wasm is supported by all modern browsers, so it's quite possible to use this in the browser (Mockttp itself has a complete suite of browser tests that run in Chrome), and brotli-wasm has a specific browser-compatible entrypoint to support this, generated with a standard wasm-pack setup.

The issue is just when a bundler gets involved, and webpack v5 specifically which needs special configuration for wasm in general.

Your solution (configuring webpack explicitly to support wasm) is roughly the right approach here if you want to include brotli-wasm, although there are a few bits that are surprising:

  • Do you know where the dependencies on path and fs come from? There shouldn't be code in the bundle using these, as far as I'm aware, so that shouldn't be necessary.
  • Querystring I'm not sure about; that's probably a genuinely useful dependency. We could plausibly move to URLSearchParams for some cases to avoid this, but it depends how it's used.
  • The zlib dependency is also genuinely used: it provides (de)compression for gzip & deflate compressed content. That said, it looks like we could now replace this with the browser-standard (De)CompressionStream APIs, but we haven't yet. PRs to http-encoding to do so are welcome (this conveniently wouldn't need a separate entrypoint because modern Node now has the same API too!).
  • In future I'd like to remove Buffer & stream, but the Node modules for these are widely used internally, very useful & effective, and very easy to polyfill. Open to improvements there, but it's quite a bit of work and hasn't been a high priority. We could migrate to explicitly depending on polyfills for these, which would simplify bundler setup but complicate the situation slightly on the Node side.
  • The process global is rarely used directly, but is often used by some polyfills in turn (including stream). If we dropped those then it would potentially disappear too.

For the wasm configuration specifically, as a workaround you could just add brotli-wasm: false to your resolve list instead of enabling wasm. That will skip that module entirely. The module is always loaded on-demand to support this, so everything will work fine without that normally, but will crash hard if you actually try to read any Brotli-compressed content.
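A minimal sketch of that workaround (untested here; webpack 5 accepts false as an alias value to stub a module out entirely):

```typescript
// Skip bundling brotli-wasm altogether instead of enabling the wasm
// experiments. getRemote() etc. keep working, but reading any
// brotli-encoded body will then fail at runtime.
const webpackOptions = {
  resolve: {
    alias: {
      "brotli-wasm": false,
    },
  },
};
```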


cefn commented Mar 11, 2025

This is incredibly detailed and useful, thanks Tim. I'll check whether the fs and path dependencies were forced by the build or not.

In summary, for our use case, the need to manipulate mockttp specially at the bundler level is what takes the extra work and creates the complexity. So knowing that e.g. the polyfills could become explicit dependencies, and that (De)CompressionStream and URLSearchParams could fulfil those requirements so that no bundler configuration is needed at all, is the crucial threshold. Before mockttp was in the mix we had no webpack config whatsoever. Once any bundle configuration is needed, we've already stepped into the lava, and 6 packages versus 1 package is not a big difference (excepting the runtime exceptions we might potentially see).

I think I may have misunderstood the architecture in that I expected calls via getRemote to have no logic of their own, but just mirror a node-side API which did all the actual implementation. From your commentary the boundary isn't as clear as this. Perhaps it could be?

Our choice will probably be to retire Cypress in favour of Playwright (which will effectively run in Node), so I'll see how that policy change goes down with the team before putting more effort in this direction.

@pimterry
Member

I think I may have misunderstood the architecture in that I expected calls via getRemote to have no logic of their own, but just mirror a node-side API which did all the actual implementation. From your commentary the boundary isn't as clear as this. Perhaps it could be?

Mostly things do happen on the server side, but that's not always possible. The main challenge is when the client directly handles request or response data itself - this is where all the compression algorithms come in.

Any time the server needs to give the client request or response data, it sends the data as received, in its original raw still-encoded form, and the client decodes it dynamically when you try to read it, if required. There are a few reasons for this:

  • It's more efficient on the wire (we just send compressed data)
  • It saves a bunch of work if you never actually decode the data
  • It avoids duplication in memory/on the wire in the case you want to read the raw and decoded data
  • It lets the client handle issues with non-decodeable data independently
  • It's identical between Node-only and browser code, which is convenient: both receive an event with encoded data, and then decode it themselves.

The relevant stuff happens here:

export const buildBodyReader = (body: Buffer, headers: Headers): CompletedBody => {
    const completedBody = {
        buffer: body,
        async getDecodedBuffer() {
            return runAsyncOrUndefined(async () =>
                asBuffer(
                    await decodeBodyBuffer(this.buffer, headers)
                )
            );
        },
        async getText() {
            return runAsyncOrUndefined(async () =>
                (await this.getDecodedBuffer())!.toString()
            );
        },
        async getJson() {
            return runAsyncOrUndefined(async () =>
                JSON.parse((await completedBody.getText())!)
            )
        },
        async getUrlEncodedFormData() {
            return runAsyncOrUndefined(async () => {
                const contentType = headers["content-type"];
                if (contentType?.includes("multipart/form-data")) return; // Actively ignore multipart data - won't work as expected
                const text = await completedBody.getText();
                return text ? querystring.parse(text) : undefined;
            });
        },
        async getMultipartFormData() {
            return runAsyncOrUndefined(async () => {
                const contentType = headers["content-type"];
                if (!contentType?.includes("multipart/form-data")) return;

                const boundary = contentType.match(/;\s*boundary=(\S+)/);
                // https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Type#boundary
                // `boundary` is required for multipart entities.
                if (!boundary) return;

                const multipartBodyBuffer = asBuffer(await decodeBodyBuffer(this.buffer, headers));
                return multipart.parse(multipartBodyBuffer, boundary[1]);
            });
        },
        async getFormData(): Promise<querystring.ParsedUrlQuery | undefined> {
            return runAsyncOrUndefined(async () => {
                // Return multi-part data if present, or fall back to default URL-encoded
                // parsing for all other cases. Data is returned in the same format regardless.
                const multiPartBody = await completedBody.getMultipartFormData();
                if (multiPartBody) {
                    const formData: querystring.ParsedUrlQuery = {};

                    multiPartBody.forEach((part) => {
                        const name = part.name;
                        if (name === undefined) {
                            // https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Disposition#as_a_header_for_a_multipart_body,
                            // The header must include a `name` property to identify the field name.
                            // So we ignore parts without a name, treating them as invalid multipart form data.
                        } else {
                            // We do not use `filename` or `type` here, because the return value of `getFormData` must be string or string array.
                            const prevValue = formData[name];
                            if (prevValue === undefined) {
                                formData[name] = part.data.toString();
                            } else if (Array.isArray(prevValue)) {
                                prevValue.push(part.data.toString());
                            } else {
                                formData[name] = [prevValue, part.data.toString()];
                            }
                        }
                    });

                    return formData;
                } else {
                    return completedBody.getUrlEncodedFormData();
                }
            });
        }
    };
    return completedBody;
};

That code means we can store/transfer just a buffer of the raw received request body, and when you call request.body.getUrlEncodedFormData(), it'll automatically decompress the body, parse it as URL-encoded data, and give you the result.
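To make the pattern concrete, here is a hypothetical minimal mirror of that lazy-decode approach (makeLazyBody is an illustrative name, not mockttp's real implementation): keep the raw, possibly-compressed bytes, and only decode when a getter is actually called.

```typescript
import * as zlib from "zlib";

// Sketch: store the encoded body as-is; decode on demand when read.
function makeLazyBody(raw: Buffer, encoding?: string) {
    return {
        buffer: raw, // the original encoded bytes stay available
        async getText(): Promise<string | undefined> {
            try {
                const decoded = encoding === "gzip" ? zlib.gunzipSync(raw) : raw;
                return decoded.toString("utf8");
            } catch {
                return undefined; // non-decodeable data handled client-side
            }
        }
    };
}

const compressed = zlib.gzipSync(Buffer.from("hello=world"));
makeLazyBody(compressed, "gzip").getText().then((text) => console.log(text)); // logs "hello=world"
```

No decompression work happens until getText() runs, which matches the "saves a bunch of work if you never actually decode the data" point above.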

There's a few places where bodies like this are accessible on the client side:

  • Listening to events like server.on('response', (req) => ...) where you receive all the request data explicitly as it happens.
  • Reading data manually, with APIs like rule.getSeenRequests().
  • Rules that use callbacks like thenCallback(cb) or thenPassthrough({ beforeResponse: cb }) where the callback code is defined and needs to run on the client, and needs access to the request or response data directly.

In all these cases, the client needs to access the request/response body. If that's compressed, it needs to be able to decompress it, which currently uses zlib, brotli-wasm, etc.

I guess there's a few alternatives I can see:

  • We could transfer only the decompressed content - this means the client wouldn't have the encoded content. That's usually OK, although this can be a particular problem if the content can't be decoded, and you do often want to know the original size of the encoded content. Definitely a breaking change.
  • We could transfer nothing, and expose an on-demand server API to request the encoded or decoded content manually maybe? This means an extra round trip but quite efficient if you don't need that (and a localhost round trip is not a big deal). It does mean the server would need to store the body though, which it doesn't necessarily do right now (recording is optional - the above APIs normally store the data only during the callback/until the event has fired and then discard it)
  • We could expose multiple APIs somehow, so you get decoded content by default but you can opt into receiving the encoded content (and then somehow make it easy to opt into having a decoder available, similarly). Sounds like another breaking API change somewhere.
  • We could expose a generic "decode this" API from the server, where the client still always receives encoded data, but can just send it to the server to get the decoded result back, instead of decoding anything itself. That's less efficient for cases where you could decode it, but not a big deal and could avoid any API changes.

Any of these would be large tricky changes I think, but it's possible. The last is definitely easiest (we already use an async API for decoding here - we just create a server API for this, and then call that instead of decoding the content for ourselves).

So, just to recap:

  • path and fs shouldn't be required at all, I'm definitely interested to know where those are coming from
  • stream, buffer, querystring and zlib are required but could probably be removed with some pretty mechanical work to replace them with web standards
  • I think process is only required because of other node polyfills, so would disappear if they were gone
  • Avoiding brotli-wasm might be possible with some of the above encoding changes, but it's definitely tricky. In future or using other bundlers it may well work out of the box, as wasm is entirely web compatible and I don't think brotli-wasm is doing anything especially unusual there so bundlers should be able to work it out in theory.

PRs towards any of those welcome if you're interested. Removing all these completely is non-trivial but probably a good general direction of travel for us to aim for nowadays (most of these were added back when Webpack v4 automatically handled all these polyfills for you).
