Sitecore media optimization with Azure Functions + Blob Storage + Magick.NET

In my previous post, I explained how to configure the Blob Storage module on a Sitecore 9.3+ instance. This post assumes you are already familiar with it and that your Sitecore instance is already using the Azure Blob Storage provider.

In this post I’ll show you how to use Azure Functions (with a blob trigger) to optimize (compress) images on the fly as they are uploaded to the media library, gaining performance with a serverless approach.

Media Compression Flow

About Azure Functions and Blob Trigger

Azure Functions is an event-driven, compute-on-demand experience that extends the existing Azure application platform with capabilities to implement code triggered by events occurring in Azure, in third-party services, or in on-premises systems. Azure Functions allows developers to take action by connecting to data sources or messaging solutions, making it easy to process and react to events. Developers can leverage Azure Functions to build HTTP-based API endpoints accessible by a wide range of applications, mobile and IoT devices. Azure Functions scales based on demand, so you pay only for the resources you consume. For more info please refer to the official MS documentation.

Azure Functions

Azure Functions integrates with Azure Storage via triggers and bindings. Integrating with Blob storage allows you to build functions that react to changes in blob data as well as read and write values.

Creating the Azure Function

For building the blob storage trigger function I’ll be using Visual Studio Code, so first of all make sure you have the Azure Functions extension for VS Code. You can get it from the marketplace or from the Extensions menu, or directly via the link: vscode:extension/ms-azuretools.vscode-azurefunctions.

Install the extension for Azure Functions
Azure Functions Plugin

Before proceeding, make sure you are logged in to your Azure subscription by running az login.

  1. Create an Azure Functions project: click on the add function icon, then select the blob trigger option and give the function a name.

  2. Choose the Blob Storage account you are using in your Sitecore instance (myblobtestazure_STORAGE in my case).

  3. Choose your blob container path (blobcontainer/{name}).

  4. The basics are now created and we can start working on our implementation.

Default function class
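For reference, the generated class looks roughly like this (a sketch of the standard C# blob trigger template, filled in with the function and container names chosen above):

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

namespace SitecoreImageCompressor
{
    public static class CompressBlob
    {
        [FunctionName("CompressBlob")]
        public static void Run([BlobTrigger("blobcontainer/{name}", Connection = "myblobtestazure_STORAGE")] Stream myBlob, string name, ILogger log)
        {
            // The default template only logs the name and size of the incoming blob.
            log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
        }
    }
}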

Generated project files

The project template creates a project in your chosen language and installs required dependencies. For any language, the new project has these files:

  • host.json: Lets you configure the Functions host. These settings apply when you’re running functions locally and when you’re running them in Azure. For more information, see host.json reference.
  • local.settings.json: Maintains settings used when you’re running functions locally. These settings are used only when you’re running functions locally. For more information, see Local settings file.

Edit the local.settings.json file to add the connection string of your blob storage:

local.settings.json
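With the connection string added, the file should look something like this (the values are placeholders; the key name must match the Connection property used by the trigger binding):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "myblobtestazure_STORAGE": "DefaultEndpointsProtocol=https;AccountName=myblobtestazure;AccountKey={KEY};EndpointSuffix=core.windows.net"
  }
}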

The function implementation

using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using ImageMagick;
using Microsoft.WindowsAzure.Storage.Blob;

namespace SitecoreImageCompressor
{
    public static class CompressBlob
    {
        [FunctionName("CompressBlob")]
        public static async Task Run([BlobTrigger("blobcontainer/{name}", Connection = "myblobtestazure_STORAGE")] CloudBlockBlob inputBlob, ILogger log)
        {
            log.LogInformation($"C# Blob trigger function Processed blob\n Name:{inputBlob.Name} \n Size: {inputBlob.Properties.Length} Bytes");

            // Skip blobs we've already compressed, otherwise the upload below
            // would re-trigger this function in an endless loop.
            if (inputBlob.Metadata.ContainsKey("Status") && inputBlob.Metadata["Status"] == "Processed")
            {
                log.LogInformation($"blob: {inputBlob.Name} has already been processed");
            }
            else
            {
                using (var memoryStream = new MemoryStream())
                {
                    // Download the blob into memory so Magick.NET can work on a seekable stream.
                    await inputBlob.DownloadToStreamAsync(memoryStream);
                    memoryStream.Position = 0;

                    var before = memoryStream.Length;
                    var optimizer = new ImageOptimizer { OptimalCompression = true, IgnoreUnsupportedFormats = true };

                    if (optimizer.IsSupported(memoryStream))
                    {
                        var compressionResult = optimizer.Compress(memoryStream);

                        if (compressionResult)
                        {
                            var after = memoryStream.Length;
                            var gain = 100 - (float)(after * 100) / before;

                            log.LogInformation($"Optimized {inputBlob.Name} - from: {before} to: {after} Bytes. Optimized {gain}%");

                            // Rewind before uploading, otherwise an empty stream is written back.
                            memoryStream.Position = 0;
                            await inputBlob.UploadFromStreamAsync(memoryStream);
                        }
                        else
                        {
                            log.LogInformation($"Image {inputBlob.Name} - compression failed...");
                        }
                    }
                    else
                    {
                        var info = MagickNET.GetFormatInformation(new MagickImageInfo(memoryStream).Format);

                        log.LogInformation($"Image {inputBlob.Name} - the format is not supported. Compression skipped - {info.Format}");
                    }
                }

                // Mark the blob as processed so the next trigger run exits early.
                inputBlob.Metadata["Status"] = "Processed";

                await inputBlob.SetMetadataAsync();
            }
        }
    }
}

As you can see, I’m creating an async task that gets triggered as soon as a new blob is added to the blob storage. Since we’re compressing and then uploading the modified image, we have to make sure the function doesn’t trigger itself again. To avoid that, I’m also updating the image metadata with a “Status = Processed” flag.

The next step is to get the image from the CloudBlockBlob and then compress it using the Magick.NET library. Please note that this library also provides a LosslessCompress method; for this implementation I chose to go with the full compression. Feel free to try both and compare the results.
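If you want to experiment with the lossless variant, only the compression call changes. Here’s a minimal standalone sketch (the local file name is just an illustration):

using System;
using System.IO;
using ImageMagick;

class LosslessCompressDemo
{
    static void Main()
    {
        var optimizer = new ImageOptimizer { OptimalCompression = true };

        using (var stream = new MemoryStream(File.ReadAllBytes("sample.jpg")))
        {
            // LosslessCompress keeps the pixel data intact and only optimizes
            // the encoding, so the gains are usually smaller than with Compress.
            var changed = optimizer.LosslessCompress(stream);

            Console.WriteLine($"Compressed: {changed}, new size: {stream.Length} bytes");
        }
    }
}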

Nuget references

So, in order to make it work we need to install the required dependencies. Please run the following commands to install the NuGet packages:

  • dotnet add package Azure.Storage.Blobs --version 12.8.0
  • dotnet add package Magick.NET-Q16-AnyCPU --version 7.23.2
  • dotnet add package Microsoft.Azure.WebJobs.Extensions.Storage --version 3.0.10
  • dotnet add package Microsoft.Azure.WebJobs.Host.Storage --version 4.0.1
  • dotnet add package Microsoft.NET.Sdk.Functions --version 1.0.38

Test and deploy

Now we have everything in place. Let’s press F5 and see if the function compiles and runs.

Terminal output

We are now ready to deploy to Azure and test the blob trigger! Click on the up arrow in order to deploy to Azure, choose your subscription and go!

Azure publish

Check the progress in the terminal and output window:

Testing the trigger

Now we can go to the Azure portal, go to the Azure function and double check that everything is there as expected:

Azure function from the portal

Go to “Monitor” and click on “Logs” so we can watch the live stream while uploading an image to the blob storage. Now, in your Sitecore instance, go to the Media Library and upload an image. This uploads the blob to Azure Storage, the trigger fires, and the image gets compressed.

Media Library Upload
Azure functions logs

As we can see in the logs, the image got compressed, gaining almost 15%:

2021-02-23T10:21:36.894 [Information] Optimized 6bdf3e56-c6fc-488b-a7bb-eee64ce04343 - from: 81147 to: 69158 Bytes. Optimized 14.774422%
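That matches the gain formula from the function: 100 - (69158 × 100 / 81147) ≈ 14.77%.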

Azure Blob Storage – With the trigger enabled
Azure Blob Storage – With the trigger disabled

Let’s check the browser for the final results:

Without the trigger: the image size is 81147 bytes.

With the trigger: the image size is 69158 bytes.

I hope you find this useful, you can also get the full implementation from GitHub.

Thanks for reading!

How to enable Azure Blob Storage on Sitecore 9.3+

In this post I’m explaining how to switch the blob storage provider to make use of Azure Blob Storage. Before Sitecore 9.3, we could store blobs in the database or on the filesystem; Azure Blob Storage was not supported out of the box, and even though it was possible, it required some customization to get working. Since Sitecore 9.3, a module has been released that makes the setup very straightforward, as you will see in this post.

By doing this we can significantly reduce costs and improve performance, as the database size won’t keep growing with media library items.


Introduction to Azure Blob storage

Azure Blob storage is Microsoft’s object storage solution for the cloud. Blob storage is optimized for storing massive amounts of unstructured data. Unstructured data is data that doesn’t adhere to a particular data model or definition, such as text or binary data.

Blob storage is designed for:

  • Serving images or documents directly to a browser.
  • Storing files for distributed access.
  • Streaming video and audio.
  • Writing to log files.
  • Storing data for backup and restore, disaster recovery, and archiving.
  • Storing data for analysis by an on-premises or Azure-hosted service.

Users or client applications can access objects in Blob storage via HTTP/HTTPS, from anywhere in the world. Objects in Blob storage are accessible via the Azure Storage REST API, Azure PowerShell, Azure CLI, or an Azure Storage client library.
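As a quick illustration, uploading an object with the .NET client library (Azure.Storage.Blobs v12) only takes a few lines; the connection string, container and file names below are placeholders:

using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

class BlobUploadDemo
{
    static async Task Main()
    {
        // Connect to a container using the storage account connection string.
        var container = new BlobContainerClient("<connection-string>", "blobcontainer");
        await container.CreateIfNotExistsAsync();

        // Upload a local file as a new blob named "sample.jpg".
        using (var file = File.OpenRead("sample.jpg"))
        {
            await container.UploadBlobAsync("sample.jpg", file);
        }
    }
}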

For more info, please refer to the official Azure Blob storage documentation.

Creating your blob storage resource

Azure Storage Account

Create the resource by following the wizard, then check the “Access keys” section; you’ll need the “Connection string” later.

Connection String and keys

Configuring your Sitecore instance

There are basically three main options to install the blob storage module into your instance:

  1. Install the Azure Blob Storage module in Sitecore PaaS.
    1. Use the Sitecore Azure Toolkit:
      1. Use a new Sitecore installation with Sitecore Azure Toolkit
      2. Use an existing Sitecore installation with Sitecore Azure Toolkit
    2. Use Sitecore in the Azure Marketplace (for new Sitecore installations only)
  2. Install the Azure Blob Storage module on an on-premise Sitecore instance.
  3. Manually install the Azure Blob Storage module in PaaS or on-premise.

This time I’ll be focusing on the last option: manually installing the module, no matter whether it’s a PaaS or an on-premise setup.

Manual installation steps

  1. Download the Azure Blob Storage module WDP from the Sitecore Downloads page.
  2. Extract (unzip) the WDP.
  3. Copy the contents of the bin folder of the WDP into the Sitecore web application bin folder.
  4. Copy the contents of the App_Config folder of the WDP into the Sitecore web application App_Config folder.
  5. Copy the contents of the App_Data folder of the WDP into the Sitecore web application App_Data folder.
  6. Add the following connection string to the App_Config\ConnectionStrings.config file of the Sitecore web application.
 <add name="azureblob" connectionString="DefaultEndpointsProtocol=https;AccountName=myblobtestazure;AccountKey={KEY};EndpointSuffix=core.windows.net"/>

  7. In the \App_Config\Modules\Sitecore.AzureBlobStorage\Sitecore.AzureBlobStorage.config file, ensure that <param name="blobcontainer"> is the name you gave to the container after creating the resource.
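For example, if you named your container “blobcontainer”, the param should end up as follows (a sketch showing only the relevant element; the surrounding provider configuration is omitted):

<param name="blobcontainer">blobcontainer</param>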

Let’s test it!

If everything went well, we can test it by uploading a media item to the Sitecore Media Library.

Let’s now have a look at the Storage Explorer in the Azure portal.

Here we go: the image is now uploaded to Azure Blob Storage, meaning the config is fine and working as expected.