🐛 BUG: Network Request Response no data available for large responses #1853

Open
leongrdic opened this issue Mar 15, 2024 · 4 comments
Labels: bug (Something isn't working), local dev

Comments


leongrdic commented Mar 15, 2024

Which Cloudflare product(s) does this pertain to?

Wrangler core, Miniflare

What version(s) of the tool(s) are you using?

3.34.2 [Wrangler]

What version of Node are you using?

20.11.1

What operating system and version are you using?

macOS Sonoma 14.4

Describe the Bug

Observed behavior

The Network > [specific request] > Response tab in DevTools shows a message stating "This request has no response data available":
[screenshot: DevTools Response tab showing "This request has no response data available"]
This happens when the response is large; I haven't managed to pinpoint the exact size at which it starts happening.

Expected behavior

The Response tab should show the response that was received in the worker.

Reproduction

// found this publicly available JSON endpoint whose response is large enough to reproduce this behavior
const response = await fetch('https://jsonplaceholder.typicode.com/photos');
const json = await response.json();
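For anyone trying this end to end, a minimal Worker along these lines should trigger it under wrangler dev (a sketch: the module-syntax wrapper is the standard Workers shape; only the endpoint above comes from the report):

export default {
  async fetch() {
    // proxy the large JSON payload so the outbound fetch shows up in the Network tab
    const response = await fetch('https://jsonplaceholder.typicode.com/photos');
    return Response.json(await response.json());
  }
};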
@leongrdic leongrdic added the bug Something isn't working label Mar 15, 2024
@penalosa penalosa transferred this issue from cloudflare/workers-sdk Mar 18, 2024
@penalosa penalosa removed this from workers-sdk Mar 18, 2024
penalosa (Collaborator) commented Mar 18, 2024

I've transferred this to the workerd repo, since I think this is a runtime bug. See the playground for a reproduction, but essentially it looks like cfResponse is empty on CDP Network.loadingFinished events when the body is too large. @irvinebroque would someone be able to take a look at this?
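One way to confirm this from outside DevTools is to attach to the inspector endpoint directly and watch the CDP traffic. A rough sketch using the chrome-remote-interface npm package, assuming wrangler dev is exposing its inspector on the default port 9229 (the port and setup are assumptions, not anything confirmed in this thread):

import CDP from 'chrome-remote-interface';

// attach to the local inspector that wrangler dev exposes (assumed port 9229)
const client = await CDP({ port: 9229 });
await client.Network.enable();

client.on('Network.loadingFinished', async ({ requestId }) => {
  try {
    // DevTools pulls the preview body the same way after loadingFinished
    const { body, base64Encoded } = await client.Network.getResponseBody({ requestId });
    console.log(requestId, 'body length:', body.length, 'base64:', base64Encoded);
  } catch (err) {
    // for large bodies this comes back empty, matching the screenshot
    console.log(requestId, 'no response body available:', err.message);
  }
});

If the report is right, small responses should log a body here while large ones hit the catch branch.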

leongrdic (Author)

Hi, are there any updates on this? Is there any way I could help resolve it?

kentonv (Member) commented Aug 7, 2024

Looks like this is by design:

// A proxy for OutputStream that internally buffers data as long as it's within a given limit.
// Also, it counts the size of all the data it has seen (whether it has hit the limit or not).
//
// We use this in the Network tab to report response stats and preview [decompressed] bodies,
// but we don't want to keep buffering extremely large ones, so just discard buffered data
// upon hitting a limit and don't return any body to the devtools frontend afterwards.
class Worker::Isolate::LimitedBodyWrapper: public kj::OutputStream {

Buffering really large bodies and then sending them over the JSON-oriented inspector protocol could be problematic. Although maybe the current 1MB limit is smaller than it needs to be?
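For readers who don't want to dig through the C++, the strategy that comment describes boils down to something like this toy JavaScript sketch (illustrative names only; the 1MB figure is the limit mentioned above, not a constant read out of workerd):

class LimitedBuffer {
  constructor(limit = 1024 * 1024) { // 1MB, per the limit discussed above
    this.limit = limit;
    this.size = 0;      // total bytes seen; still counted after the limit is hit
    this.chunks = [];   // buffered data, or null once the limit is exceeded
  }
  write(chunk) {
    this.size += chunk.byteLength;
    if (this.chunks && this.size > this.limit) {
      this.chunks = null; // discard everything already buffered
    } else if (this.chunks) {
      this.chunks.push(chunk);
    }
  }
}

Once chunks goes null there is simply nothing left to hand to the DevTools frontend, which is exactly the "no response data available" message from the original report, while the total size can still be reported because counting continues.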

leongrdic (Author) commented Aug 7, 2024

Thanks for the quick comment!

Correct me if I'm wrong, but isn't this only really relevant when running the worker in local development? I would argue there should be no limit (or at least not such a small one), even if there is a performance impact, because otherwise it's far too hard to debug issues when working with big datasets.

Relatedly, a very similar issue happens when console-logging too much data: DevTools also stops showing anything that was logged (and the terminal output is not always useful because it collapses nested objects and arrays).

It's possible that I'm also missing something!
