Sitecore JSS – Next.js – Exploring Incremental Static Regeneration (ISR)

Next.js allows you to create or update static pages after you’ve built your site. Incremental Static Regeneration (ISR) enables developers and content editors to use static-generation on a per-page basis, without needing to rebuild the entire site. With ISR, you can retain the benefits of static while scaling to millions of pages.

With ISR, static pages can be generated at runtime (on-demand) instead of at build time. Using analytics, A/B testing, or other metrics, you have the flexibility to make your own tradeoff on build times.

Consider an e-commerce store with 100,000 products. At a realistic 50ms to statically generate each product page, a full build would take roughly an hour and a half without ISR. With ISR, we can choose from the following options (the arithmetic is spelled out after the list):

Faster Builds → Generate the most popular 1,000 products at build-time. Requests made to other products will be a cache miss and statically generate on-demand: 1-minute builds.

Higher Cache Hit Rate → Generate 10,000 products at build-time, ensuring more products are cached ahead of a user’s request: 8-minute builds.
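For reference, the arithmetic behind those numbers:

100,000 pages × 50 ms = 5,000 s ≈ 83 minutes (full build, no ISR)
1,000 pages × 50 ms = 50 s → ~1-minute builds
10,000 pages × 50 ms = 500 s → ~8-minute builds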

Exploring ISR

In my previous post, I created a JSS Next.js app and deployed it to Vercel. I also created a webhook to trigger a full rebuild in Vercel (SSG). Now I'll explain how ISR works in that same app.

Fetching Data and Generating Paths

Data:

ISR uses the same Next.js API to generate static pages: getStaticProps.
By specifying revalidate: 5, we tell Next.js to use ISR for this page and regenerate it at most once every 5 seconds after it has been generated.

Check the src/pages/[[...path]].tsx file and the getStaticProps function:
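It looks roughly like this (simplified from the JSS Next.js sample app; the exact shape and the page-props factory import path can vary between JSS versions):

import { GetStaticProps } from 'next';
// Path of the page-props factory in the JSS sample app; adjust to your project structure.
import { sitecorePagePropsFactory } from 'lib/page-props-factory';

export const getStaticProps: GetStaticProps = async (context) => {
  // Fetch layout (and dictionary) data for the requested route from Sitecore.
  const props = await sitecorePagePropsFactory.create(context);

  return {
    props,
    // Next.js will attempt to re-generate the page:
    // - when a request comes in
    // - at most once every 5 seconds
    revalidate: 5, // in seconds
    notFound: props.notFound, // return a 404 page when the item doesn't exist
  };
};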

Paths:

Next.js defines which pages to generate at build-time based on the paths returned by
getStaticPaths. For example, you can generate the most popular 1,000 products at build-time by returning the paths for the top 1,000 product IDs in getStaticPaths.
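In the JSS sample app, getStaticPaths looks roughly like the following (simplified; the sitemapFetcher name and import path are assumptions, and the paths ultimately come from a Sitecore sitemap/Layout Service query):

import { GetStaticPaths } from 'next';
// Fetches the list of routes to pre-render from Sitecore; adjust to your project structure.
import { sitemapFetcher } from 'lib/sitemap-fetcher';

export const getStaticPaths: GetStaticPaths = async (context) => {
  if (process.env.NODE_ENV !== 'development') {
    // In production, pre-render the paths exposed by Sitecore at build-time.
    const paths = await sitemapFetcher.fetch(context);

    return {
      paths,
      // Paths not generated at build-time are rendered on-demand on first request.
      fallback: 'blocking',
    };
  }

  // In development, generate everything on-demand.
  return { paths: [], fallback: 'blocking' };
};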

With the revalidate: 5 setting in getStaticProps, I'm telling Next.js to enable ISR and to revalidate the page every 5 seconds. After that window, the first incoming request will still receive the old static version of the page, but it triggers the revalidation behind the scenes.

The Flow

  1. Next.js can define a revalidation time per-page (e.g. 5 seconds).
  2. The initial request to the page will show the cached page.
  3. The data for the page is updated in the CMS.
  4. Any request made after the initial request and within the 5-second window will show the cached (hit) page.
  5. After the 5-second window, the next request will still show the cached (stale) page, but Next.js triggers a regeneration of the page in the background.
  6. Once the page has been successfully generated, Next.js will invalidate the cache and show the updated product page. If the background regeneration fails, the old page remains unaltered.

Page Routing

Here’s a high-level overview of the routing process:

In the diagram above, you can see how the Next.js route is applied to Sitecore JSS.

The [[...path]].tsx Next.js route will catch any path and pass this information along to getStaticProps or getServerSideProps on the context object. The Page Props Factory uses the path information to construct a normalized Sitecore item path. It then makes a request to the Sitecore Layout Service REST API or Sitecore GraphQL Edge schema to fetch layout data for the item.
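For reference, a Layout Service REST request for an item looks roughly like this (host, item path, and API key are placeholders):

https://<your-sitecore-host>/sitecore/api/layout/render/jss?item=/some/path&sc_lang=en&sc_apikey=<your-api-key>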

Demo!

So, back to our previously deployed app on Vercel: log in to the Sitecore Content Editor and make a change to a field. I'm updating the heading field (/sitecore/content/sitecoreverceldemo/home/Page Components/home-jss-main-ContentBlock-1) by adding “ISR Rocks!”. Save the item and refresh the page deployed on Vercel. (Don't publish! That would trigger the webhook defined on the publish:end event.)

After refreshing the page, I can still see the old version:

But if I keep an eye on ngrok, I can see the requests being made to the Layout Service:

After refreshing the page again, I can see the changes there!

The page got updated without the need to rebuild and regenerate the whole site.

That’s it! I hope this post helps you understand how ISR works and how to get started with it in your Sitecore JSS implementation.

Thanks for reading and stay tuned for more Sitecore stuff!

Using Redis as a Sitecore custom cache

In this post I'll share how to use Azure Cache for Redis as a Sitecore custom cache provider.

Azure Cache for Redis is a fully managed, distributed, in-memory cache that enables high-performance and scalable architectures. You can use it to create cloud or hybrid deployments that handle millions of requests per second at sub-millisecond latency, all with the configuration, security and availability benefits of a managed service. More info here.

The first step is to create the Redis cache in Azure. Log in to the Azure Portal, add a new resource, search for “Azure Cache for Redis”, and choose a plan. For this demo I selected a “Basic C1” plan; we can scale it up later if needed.

Azure Redis Cache is now deployed and ready to connect to.

The next step is to get the connection string data and add a new entry “redis.sessions” into the connectionstrings.config file:
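The entry looks something like this, using the host name and access key from the Azure portal (the values below are placeholders):

<add name="redis.sessions"
     connectionString="your-cache-name.redis.cache.windows.net:6380,password=your-access-key,ssl=True,abortConnect=False" />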

Now our app is connected to the Redis cache. Let’s now have a look at a custom cache implementation.

We start by creating a cache provider:

[Service(typeof(IRedisCacheProvider), Lifetime = Lifetime.Singleton)]
public class RedisCacheProvider : IRedisCacheProvider
{
    private static readonly Lazy<ConnectionMultiplexer> LazyConnection = new Lazy<ConnectionMultiplexer>(() =>
    {
        var connectionString = ConfigurationManager.ConnectionStrings["redis.sessions"].ConnectionString;
        var options = ConfigurationOptions.Parse(connectionString);

        options.AllowAdmin = true;
        options.SyncTimeout = 60000;
        options.ConnectRetry = 5;

        return ConnectionMultiplexer.Connect(options);
    });

    public static ConnectionMultiplexer Connection => LazyConnection.Value;

    private readonly IDatabase _redisCache;

    public RedisCacheProvider()
    {
        _redisCache = Connection.GetDatabase();
    }

    public IDatabase GetRedisCache()
    {
        return _redisCache;
    }

    public IServer GetServer()
    {
        return Connection.GetServer(Connection.GetEndPoints().FirstOrDefault());
    }
}

Now we need to create a cache manager; this class will contain all the methods used to call the cache and communicate with Redis:

[Service(typeof(ICacheManager), Lifetime = Lifetime.Singleton)]
public class CacheManager : ICacheManager
{
    private readonly IDatabase _redisCache;
    private readonly IServer _redisServer;

    public CacheManager(IRedisCacheProvider redisCacheProvider)
    {
        _redisCache = redisCacheProvider.GetRedisCache();
        _redisServer = redisCacheProvider.GetServer();
    }

    private static readonly Dictionary<string, object> CacheKeyDictionary = new Dictionary<string, object>();

    public object Get(string key)
    {
        return Get(key, string.Empty);
    }

    public object Get(string key, string site)
    {
        var siteName = string.IsNullOrEmpty(site) ? Context.Site?.Name : site;
        var cacheKey = $"{siteName}{Context.Database?.Name}{Context.Language}{key}";
        var res = _redisCache.StringGet(cacheKey);

        return !string.IsNullOrEmpty(res) ? JsonConvert.DeserializeObject(res) : res;
    }

    public void Set(string key, object value)
    {
        Set(key, value, string.Empty);
    }

    public void Set(string key, object value, string site)
    {
        var siteName = string.IsNullOrEmpty(site) ? Context.Site?.Name : site;
        var cacheKey = $"{siteName}{Context.Database?.Name}{Context.Language}{key}";

        _redisCache.StringSet(cacheKey, JsonConvert.SerializeObject(value));
    }

    public IList<string> GetAllKeys()
    {
        return _redisServer.Keys().Select(k => k.ToString()).ToList();
    }

    public void Remove(string key)
    {
        _redisCache.KeyDelete(key);
    }

    public void ClearCache(object sender, EventArgs args)
    {
        Log.Info($"RedisCache Cache Clearer.", this);

        _redisServer.FlushAllDatabases();

        Log.Info("RedisCache Cache Clearer done.", (object)this);
    }

    public TObj GetCachedObject<TObj>(string cacheKey, Func<TObj> creator) where TObj : class
    {
        return GetCachedObject(cacheKey, creator, string.Empty);
    }

    public TObj GetCachedObject<TObj>(string cacheKey, Func<TObj> creator, string site) where TObj : class
    {
        if (string.IsNullOrEmpty(site))
        {
            site = Context.Site.Name;
        }

        var obj = Get(cacheKey, site) as TObj;

        if (obj == null)
        {
            // get the lock object
            var lockObject = GetCacheLockObject(cacheKey, site);

            try
            {
                lock (lockObject)
                {
                    obj = creator.Invoke();

                    Set(cacheKey, obj);
                }
            }
            finally
            {
                RemoveCacheLockObject(cacheKey, site);
            }
        }

        return obj;
    }

    private object GetCacheLockObject(string cacheKey, string site)
    {
        cacheKey += site;

        lock (CacheKeyDictionary)
        {
            if (!CacheKeyDictionary.ContainsKey(cacheKey))
            {
                CacheKeyDictionary.Add(cacheKey, new object());
            }

            return CacheKeyDictionary[cacheKey];
        }
    }

    private void RemoveCacheLockObject(string cacheKey, string site)
    {
        cacheKey += site;

        lock (CacheKeyDictionary)
        {
            if (CacheKeyDictionary.ContainsKey(cacheKey))
            {
                CacheKeyDictionary.Remove(cacheKey);
            }
        }
    }
}
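For completeness, the IRedisCacheProvider and ICacheManager interfaces aren't shown in the post; minimal versions, derived from the members used above, would look like this:

public interface IRedisCacheProvider
{
    IDatabase GetRedisCache();
    IServer GetServer();
}

public interface ICacheManager
{
    object Get(string key);
    object Get(string key, string site);
    void Set(string key, object value);
    void Set(string key, object value, string site);
    IList<string> GetAllKeys();
    void Remove(string key);
    void ClearCache(object sender, EventArgs args);
    TObj GetCachedObject<TObj>(string cacheKey, Func<TObj> creator) where TObj : class;
    TObj GetCachedObject<TObj>(string cacheKey, Func<TObj> creator, string site) where TObj : class;
}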

It's important to keep in mind that this is a distributed cache, meaning that all Sitecore instances connected to the same cache share it. For example, in a setup with one CM instance and two CDs, all of them share the same cache, whereas an in-memory cache is specific to each instance. That's why I'm adding the site name, database, and language to the cache key.

Almost done, but now we have to think about one of the most important things when working with caches: when and how to invalidate them.

We could just call ClearCache() on the publish:end and publish:end:remote events, but I wanted to make it a bit more flexible; since the cache is shared across instances, it's better to keep control over what gets invalidated rather than flushing everything on every publish action.

I decided to go with a custom event handler approach. Check the config patch; I'm introducing the customCache:rebuild and customCache:rebuild:remote events:

<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/" xmlns:set="http://www.sitecore.net/xmlconfig/set" xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <sitecore>
    <pipelines>
      <initialize>
        <processor type="Foundation.RedisCache.Pipelines.Initialize, Foundation.RedisCache" method="InitializeFromPipeline" />
      </initialize>
    </pipelines>
    <commands>
      <command name="rediscache:cleancache" type="Foundation.RedisCache.Commands.CleanCacheCommand, Foundation.RedisCache" />
    </commands>
    <events xdt:Transform="Insert">
      <event name="customCache:rebuild">
        <handler type="Foundation.RedisCache.Events.EventHandlers.CacheRebuildEventHandler, Foundation.RedisCache" method="OnCustomCacheRebuild" />
      </event>
      <event name="customCache:rebuild:remote">
        <handler type="Foundation.RedisCache.Events.EventHandlers.CacheRebuildEventHandler, Foundation.RedisCache" method="OnCustomCacheRebuild" />
      </event>
    </events>
  </sitecore>
</configuration>

The initialize pipeline:

public class Initialize
{
    /// <summary>
    /// Initializes event subscription
    /// </summary>
    /// <param name="args">Args</param>
    public virtual void InitializeFromPipeline(PipelineArgs args)
    {
        var action = new Action<CacheRebuildEvent>(RaiseRemoteEvent);

        Sitecore.Eventing.EventManager.Subscribe<CacheRebuildEvent>(action);
    }

    /// <summary>
    /// Raises remote event
    /// </summary>
    /// <param name="cacheRebuildEvent"></param>
    private void RaiseRemoteEvent(CacheRebuildEvent cacheRebuildEvent)
    {
        var eventArgs = new object[] { new CacheRebuildEventArgs(cacheRebuildEvent) };

        Sitecore.Events.Event.RaiseEvent(Constants.CustomCacheRebuildEventNameRemote, eventArgs);
    }
}
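The config patch and the classes above also reference CacheRebuildEvent, CacheRebuildEventArgs, CacheRebuildEventRaiser and CacheRebuildEventHandler, which aren't shown in this post. Here is a minimal sketch of what they could look like, based on how they are used here (the exact implementation in the repository may differ):

// Remote events are serialized through the Sitecore EventQueue, hence the data contract attributes.
[DataContract]
public class CacheRebuildEvent
{
    // The cache key to remove, or Constants.ClearAll to flush everything.
    [DataMember]
    public string CacheKey { get; set; }
}

public class CacheRebuildEventArgs : EventArgs
{
    public CacheRebuildEventArgs(CacheRebuildEvent @event)
    {
        CacheKey = @event.CacheKey;
    }

    public string CacheKey { get; }
}

public class CacheRebuildEventRaiser
{
    public void RaiseEvent(CacheRebuildEvent @event)
    {
        // Queue the event so remote instances (e.g. the CDs) pick it up through the event queue...
        Sitecore.Eventing.EventManager.QueueEvent(@event);

        // ...and raise the local event for the current instance.
        Sitecore.Events.Event.RaiseEvent("customCache:rebuild", new CacheRebuildEventArgs(@event));
    }
}

public class CacheRebuildEventHandler
{
    public void OnCustomCacheRebuild(object sender, EventArgs args)
    {
        var rebuildArgs = Sitecore.Events.Event.ExtractParameter(args, 0) as CacheRebuildEventArgs;

        if (rebuildArgs == null)
        {
            return;
        }

        // Resolve the cache manager from the Sitecore DI container.
        var cacheManager = Sitecore.DependencyInjection.ServiceLocator.ServiceProvider
            .GetService(typeof(ICacheManager)) as ICacheManager;

        if (rebuildArgs.CacheKey == Constants.ClearAll)
        {
            cacheManager.ClearCache(sender, args);
        }
        else
        {
            cacheManager.Remove(rebuildArgs.CacheKey);
        }
    }
}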

I've also decided to create a simple command that we can call from the Sitecore ribbon to flush this cache manually; this can help in case something goes wrong, and avoids having to flush the Redis cache from Azure manually.

[Serializable]
public class CleanCacheCommand : Sitecore.Shell.Framework.Commands.Command
{
    public override void Execute(Sitecore.Shell.Framework.Commands.CommandContext context)
    {
        var raiser = new CacheRebuildEventRaiser();
        var ev = new CacheRebuildEvent { CacheKey = Constants.ClearAll };

        raiser.RaiseEvent(ev);

        SheerResponse.Alert("Redis Cache flushed");
    }
}

That's pretty much it! Let's see it in action now.

So, to make use of this caching foundation, we just need to inject the ICacheManager and use the GetCachedObject method:

var cacheKey = $"RedisCacheTest-{path}";

            return _cacheManager.GetCachedObject(cacheKey, () =>
            {
                var slowMe = DateTime.Now + TimeSpan.FromSeconds(5);

                while (DateTime.Now < slowMe)
                {
                    //This is just an expensive operation...
                }

                return "/some/url";
            });

Please note that the final cache key is built as {Site Name}{Database Name}{Language Name}RedisCacheTest-{path}; for example, for the website site, the web database, and the en language, it would be websitewebenRedisCacheTest-/some/url.

Now let's check the Redis Console in Azure; we can run the command SCAN 0 COUNT 1000 MATCH * to list all the keys in the cache:

As you can see, the “RedisCacheTest” key is there!

Let me also take the opportunity to introduce the Redis Cache Visual Studio Code extension; you can find the details here.

The extension provides a quick and easy way to browse the contents of the Redis cache.

I hope you find this interesting!

You can find the full code on GitHub.