
Michał Tyszkiewicz

12/14/2024

GraphQL cache: using LRU cache with GraphQL Zeus

A GraphQL cache can significantly improve performance and reduce redundancy, particularly when the same data is queried multiple times within a short timeframe. In this post, I'll go over implementing caching with GraphQL Zeus to fetch data from our custom CMS. No worries, this can be applied just as effectively to other GraphQL APIs, such as Hygraph or Hasura. Whether you're working with a headless CMS, an e-commerce backend, or any other GraphQL-powered service, caching is just as important in every scenario and the implementation should look very similar.

Why Add Caching?

Caching provides a number of advantages like:

  • Reduced Server Load: by serving repeated requests from the cache, you avoid making unnecessary calls to the GraphQL server, which can be essential for scaling applications with high traffic.
  • Improved Performance: frequently requested data can be served directly from memory, significantly reducing query response times.
  • Cost Efficiency: if your GraphQL API usage is metered (e.g., cloud-based APIs), caching significantly reduces the number of queries made, saving on usage costs.

GraphQL Zeus query example

Let’s start with a simple query I made using GraphQL Zeus, which fetches homepage data from our CMS:

export const getHomepageData = async () => {
  try {
    const result = await chain('query')({
      onehomepageBySlug: [{ slug: 'home' }, HomepageDataSelector],
    });
    return result;
  } catch (error) {
    console.error('failed to fetch home data', error);
  }
};

This query retrieves all the homepage data at once and passes it to the relevant components. This allows users to update the homepage content directly through the CMS without needing to modify any code. Currently, this function is called in server-side props, meaning it triggers an API call every time the page is refreshed or visited.
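
For context, here is roughly how such a helper might be wired into a Next.js page via getServerSideProps. This is a minimal sketch; the page path, import path, and prop names are assumptions rather than code from our project:

// pages/index.tsx: hypothetical wiring, only getHomepageData comes from the snippet above
import type { GetServerSideProps } from 'next';
import { getHomepageData } from '../lib/homepage'; // assumed location of the helper

export const getServerSideProps: GetServerSideProps = async () => {
  // Without a cache, this hits the GraphQL API on every visit or refresh
  const data = await getHomepageData();
  return { props: { homepage: data ?? null } };
};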

To optimize this and reduce redundant API calls, we'll implement caching using the node-lru-cache library.

LRU cache

LRU stands for "Least Recently Used," a caching strategy that automatically removes the least recently accessed items when the cache reaches its limit. The node-lru-cache library offers several configuration options, but in our case, to set up a basic GraphQL cache, we only need two key settings:

  • ttl (Time to Live): This setting defines how long an item stays in the cache before it’s considered expired, specified in milliseconds.
  • ttlAutopurge: When enabled, this option automatically deletes expired items from the cache. Without it, expired items are only purged when the cache is next accessed, which can let the cache grow unbounded.
import { LRUCache } from 'lru-cache';

const cache = new LRUCache({
  ttl: 1000 * 60 * 5, // Set time to live to 5 minutes
  ttlAutopurge: true,  // Automatically clean up expired entries
});

Caching the query

To cache the query I mentioned earlier, we need to do three simple things:

  1. Set up a cache key: This helps the cache know which entry to return when the same data is requested. You can name the key however you like, but it has to be unique to avoid conflicts.
  2. Check if the cache contains the result: If the cache has the entry for that key, return the cached result.
  3. Store the query result in the cache: After fetching the data, save the result in the cache for future use.
export const getHomepageData = async () => {
  const cacheKey = 'homepageData'; // Cache key for this entry
  if (cache.has(cacheKey)) { // Check if the cache contains this key
    return cache.get(cacheKey) as { onehomepageBySlug: HomepageDataType }; // Return cached result
  } 
  try {
    const result = await chain('query')({
      onehomepageBySlug: [{ slug: 'home' }, HomepageDataSelector],
    });
    cache.set(cacheKey, result); // Store query result in the cache
    return result;
  } catch (error) {
    console.error('failed to fetch home data', error);
  }
};
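
As a small aside, the 'as' cast above can be avoided by giving the cache an explicit value type. Here's a sketch of that variant, assuming the generics exposed by recent versions of lru-cache:

import { LRUCache } from 'lru-cache';

// Typed variant: this cache only ever stores homepage query results
type HomepageResult = { onehomepageBySlug: HomepageDataType };

const homepageCache = new LRUCache<string, HomepageResult>({
  ttl: 1000 * 60 * 5, // Same 5 minute time to live as before
  ttlAutopurge: true, // Automatically clean up expired entries
});

// homepageCache.get(cacheKey) is now typed as HomepageResult | undefined, so no cast is needed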

This approach results in two scenarios:

  • First visit: The query runs, fetches the data, and caches it for future use.
  • Subsequent visits (within the TTL): The cached result is returned, avoiding the need to make a new API request.

Query with params (slug, locale)

As mentioned earlier, the cache key must be unique to avoid conflicts. This is especially important for queries that include parameters, such as locale (for localization) or slug (for specific blog pages). Without unique cache keys, you may end up with the same content being served for different locales or blog pages. To avoid this, it's best to use a dynamic cache key based on the query parameters.

export const getBlogpostBySlug = async ({ slug }: { slug: string }) => {
  const cacheKey = `Blogpost:${slug}`; // Unique cache key based on the slug

By using this dynamic approach, we ensure that the correct data is returned for the given slug. If you were to use a static cache key like 'blogpost', you'd always get the first cached blog post, regardless of which slug is provided. The same issue would arise with localization: without unique keys, you'd end up serving the wrong content for different locales.
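
For completeness, the full cached version of that query follows the same pattern as the homepage one. The sketch below uses hypothetical field and selector names (oneblogpostBySlug, BlogpostDataSelector, BlogpostDataType) modeled on the homepage query:

export const getBlogpostBySlug = async ({ slug }: { slug: string }) => {
  const cacheKey = `Blogpost:${slug}`; // Unique cache key based on the slug
  if (cache.has(cacheKey)) {
    return cache.get(cacheKey) as { oneblogpostBySlug: BlogpostDataType }; // Return the cached post for this slug
  }
  try {
    const result = await chain('query')({
      oneblogpostBySlug: [{ slug }, BlogpostDataSelector], // Field and selector names are assumptions
    });
    cache.set(cacheKey, result); // Store the result under the slug-specific key
    return result;
  } catch (error) {
    console.error('failed to fetch blogpost data', error);
  }
};

The same idea extends to localization: include the locale in the key (for example Blogpost:en:my-post) so each localized version of a post is cached separately.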

It's all about performance

GraphQL is all about efficiency: retrieving only the data you need (unlike REST). It makes sense to take that a step further with caching. With a GraphQL cache set up like this, we can reduce redundant API calls and significantly speed up queries, especially for frequently accessed data like homepage content or blog posts. Dynamic cache keys based on query parameters ensure that the data is correctly localized or filtered for specific slugs, while GraphQL Zeus and its selectors keep everything type-safe. Together, this solution can greatly improve your app's speed and scalability. As mentioned, even if you're not using GraphQL Zeus, the implementation should still look very similar to the above, so give it a try!

