[useResponseCache] cached Async-Iterable result breaks clients #2236
Labels
kind/bug (Something isn't working)
stage/3-local-solution (A user has a possible solution that solves the issue)
Issue workflow progress (based on the Contributor Workflow): minimal reproduction available on Stackblitz.
Description
Caching Async-Iterable results is extremely useful, but GraphQL clients expect queries that use `@defer` and `@stream` to be executed in streaming mode. Instead, the current implementation of this plugin delivers the whole cached result in a single response.
This entirely breaks Relay, because it does not look inside the initial response for fragments marked with `@defer`; instead it keeps waiting for further parts carrying the deferred fragments. Additional information on Relay's behaviour is available in this GitHub issue.
Workaround
A workaround is to add an extension to the response to let clients know that the operation is complete.
In the case of Relay this must be `extensions: { is_final: true }`. The casing looks odd, so other clients might expect `isFinal` instead. This workaround can be implemented by adding a single line of code before line 615:
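A minimal sketch of the idea, assuming the plugin has the cached `ExecutionResult` in hand just before returning it; `markAsFinal` is a hypothetical helper, not the plugin's actual code:

```ts
import type { ExecutionResult } from 'graphql';

// Hedged sketch: mark a cached, already-merged execution result as final so
// that Relay stops waiting for additional streamed parts.
function markAsFinal(result: ExecutionResult): ExecutionResult {
  return {
    ...result,
    extensions: { ...result.extensions, is_final: true },
  };
}
```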
With this extension, Relay processes all the fragments available in the response, even though it is not delivered in streaming mode.
I hope you will consider accepting this fix quickly since this is my last blocker for migrating from Helix to Yoga.
Proper solution
Even with the workaround above, Relay still issues the following warning
This tells me that GraphQL clients expect a streamed response.
With this in mind, the only proper way to solve this problem is to cache the response as its individual parts, then replay them on a cache hit: deliver the first part straight away and yield the remaining parts sequentially.
This will take longer to implement, so the workaround is still valid in the meantime to avoid breaking clients.
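A hedged sketch of what caching-in-parts could look like, assuming the plugin can tee the executor's async iterable; the names (`recordParts`, `replayParts`, `partsCache`) and payload type are illustrative assumptions, not the plugin's API:

```ts
// Illustrative only: cache the individual payloads of an incremental
// (@defer/@stream) response and replay them as an async iterable on a cache
// hit, so clients still receive a streamed response.
type IncrementalPayload = Record<string, unknown>;

const partsCache = new Map<string, IncrementalPayload[]>();

// Cache miss: forward each payload to the client while recording it, and only
// store the entry once the stream has completed.
async function* recordParts(
  cacheKey: string,
  source: AsyncIterable<IncrementalPayload>,
): AsyncGenerator<IncrementalPayload> {
  const parts: IncrementalPayload[] = [];
  for await (const payload of source) {
    parts.push(payload);
    yield payload;
  }
  partsCache.set(cacheKey, parts);
}

// Cache hit: replay the stored payloads one by one instead of merging them
// into a single response.
async function* replayParts(cacheKey: string): AsyncGenerator<IncrementalPayload> {
  for (const payload of partsCache.get(cacheKey) ?? []) {
    yield payload;
  }
}
```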