
batch lookup #181

Closed
Finistere opened this issue Mar 27, 2025 · 3 comments

Comments

@Finistere

Finistere commented Mar 27, 2025

Hi! I'm from Grafbase and we're considering implementing part of this spec to support entities across our subgraphs defined with extensions (REST services, NATS, Postgres, etc.).

I've noticed that there is no notion of batching in the spec, even though in practice we would often end up with fields like:

type Query {
  productsById(id: [ID!]!): [Product]! @lookup
}

type Product @key(fields: "id") {
  id: ID!
}

To my understanding, the current spec only supports the following:

type Query {
  productById(id: ID!): Product @lookup
}

type Product @key(fields: "id") {
  id: ID!
}

forcing us to "batch" manually with a query like:

{
  e1: productById(id: 1) { ... }
  e2: productById(id: 2) { ... }
  e3: productById(id: 3) { ... }
}

While it works, it's definitely less efficient if a batch field exists.
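To make the overhead concrete: a gateway can generate that aliased "batch" query mechanically, one aliased field per key value. A minimal Python sketch (the helper name and argument handling are illustrative assumptions, not part of the spec):

```python
import json

def build_aliased_batch(field, arg, values, selection):
    """Emit a query that batches a single-entity lookup via aliases,
    one aliased field (e1, e2, ...) per key value."""
    parts = [
        f"e{i}: {field}({arg}: {json.dumps(v)}) {{ {selection} }}"
        for i, v in enumerate(values, start=1)
    ]
    return "{\n  " + "\n  ".join(parts) + "\n}"

query = build_aliased_batch("productById", "id", ["1", "2", "3"], "id")
```

The server still runs one resolver call per alias, which is exactly why this only helps if the subgraph uses data loaders underneath.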

From what I've seen of the current specification, it seems like @lookup could automatically be treated as a batch whenever it returns a list and has a single argument that is a list? Any opinions? Do you have something else planned?

@dariuszkuc

Hello 👋
The main reason batch @lookups are not something we want to support (at least not in the first version of the spec) is that they quickly become problematic when combined with @require arguments.

Aliasing is one way batching can be implemented (but it also gets complex pretty fast). Our suggested approach for batching is variable batching (combined with request batching if necessary). See #25 and graphql/graphql-over-http#307.
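For reference, variable batching as proposed in graphql/graphql-over-http#307 sends the lookup query once, with the variables supplied as an array of variable sets that the server executes one by one. A rough sketch of such a request payload (the exact wire shape is still a proposal, not a settled spec):

```python
import json

# One parameterized lookup query, executed once per variable set by the server.
query = """
query ($id: ID!) {
  productById(id: $id) { id }
}
"""

# Variable batching: a single request carrying an array of variable objects.
# (Payload shape follows the graphql-over-http proposal and may change.)
payload = {
    "query": query,
    "variables": [{"id": "1"}, {"id": "2"}, {"id": "3"}],
}

body = json.dumps(payload)
```

Compared to aliasing, the query document stays fixed regardless of batch size, and only the variable array grows.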

@Finistere

Finistere commented Mar 27, 2025

/facepalm Sorry, I thought of creating an issue but somehow it didn't occur to me to search for an existing one. Thank you for the quick answer.

Hmm, I see the problem, but I'm not sure what to think of it. Without batching capabilities, subgraphs are forced to use data loaders to be even remotely efficient. However, in our gateway everything is already batched together, and most often an appropriate batch resolver will be available. So I see this as a performance hit.

Off the top of my head, I think I would handle it the following way:

  • Fields are batch resolvers if they accept a single list argument and return a list.
  • All key fields, for a given resolver, must be required.
  • All possible field requirements must be present and optional.

So in this case:

type Query {
    orderById(id: ID!): Order
    ordersById(ids: [ID!]!): [Order]!
}

type Order {
    id: ID!
    deliveryEstimate(dimension: ProductDimensionInput! @require(fields: "product { dimension }")): Int!
}

I would change the fields to:

type Query {
    orderById(id: OrderInput!): Order
    ordersById(ids: [OrderInput!]!): [Order]!
}

input OrderInput {
    id: ID!
    dimension: ProductDimensionInput
}

type Order {
    id: ID!
    deliveryEstimate(dimension: ProductDimensionInput! @require(fields: "product { dimension }")): Int!
}

It does require dedicated support from the subgraph GraphQL framework to make it more or less transparent, compared to variable batching and the like, so it's similar to Apollo Federation in that regard. At least it's less dynamic than _entities.
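To illustrate what that framework support could look like on the subgraph side, here is a hypothetical sketch of a batch lookup resolver: it takes the list of OrderInput objects, fetches all orders in one round trip, and returns results aligned with the input order, carrying along the optional requirement data. All names here are illustrative, not from the spec:

```python
def orders_by_id(inputs, fetch_many):
    """Hypothetical batch @lookup resolver for ordersById(ids: [OrderInput!]!).

    `inputs` is a list of OrderInput-shaped dicts ({"id": ..., "dimension": ...?});
    `fetch_many` fetches all matching orders at once (e.g. one SQL IN query).
    Returns one entry per input, in input order, with None for missing entities.
    """
    ids = [inp["id"] for inp in inputs]
    by_id = {order["id"]: order for order in fetch_many(ids)}

    results = []
    for inp in inputs:
        order = by_id.get(inp["id"])
        if order is not None and "dimension" in inp:
            # Attach the optional @require data so field resolvers
            # such as deliveryEstimate can pick it up later.
            order = {**order, "dimension": inp["dimension"]}
        results.append(order)
    return results
```

The key invariant is positional alignment between inputs and outputs, which is what lets the gateway map each result back to the entity that requested it.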

For the current use cases I can think of for extensions, that would work for us. I'm not sure how relevant it is for this specification, though. My primary goal is at least not to go against the specification. Let me know if I should write this up in #25.

@Finistere

Moved the conversation there, so closing this issue. Thanks again for pointing me to it.
