In my previous post, I explained how to configure the Blob Storage module on a Sitecore 9.3+ instance. This post assumes you are already familiar with it and that your Sitecore instance is already using the Azure Blob Storage provider.
In this post I'll show how we can use an Azure Function (blob trigger) to optimize (compress) images on the fly as they are uploaded to the Media Library, gaining performance with a serverless approach.

About Azure Functions and Blob Trigger
Azure Functions is an event-driven, compute-on-demand experience that extends the existing Azure application platform with capabilities to implement code triggered by events occurring in Azure, in third-party services or in on-premises systems. Azure Functions allows developers to take action by connecting to data sources or messaging solutions, making it easy to process and react to events. Developers can leverage Azure Functions to build HTTP-based API endpoints accessible by a wide range of applications, mobile and IoT devices. Azure Functions is scale-based and on-demand, so you pay only for the resources you consume. For more info, please refer to the official MS documentation.

Azure Functions integrates with Azure Storage via triggers and bindings. Integrating with Blob storage allows you to build functions that react to changes in blob data as well as read and write values.
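To give an idea of the shape such a function takes, here is a minimal sketch of a blob trigger binding (the container and connection names here are placeholders, not the ones we'll use later in this post):

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class OnBlobAdded
{
    [FunctionName("OnBlobAdded")]
    public static void Run(
        [BlobTrigger("samples/{name}", Connection = "MyStorageConnection")] Stream blob,
        string name,
        ILogger log)
    {
        // Fires whenever a blob is created or updated in the "samples" container.
        log.LogInformation($"Blob added: {name} ({blob.Length} bytes)");
    }
}
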
Creating the Azure Function
For building the blob storage trigger function I'll be using Visual Studio Code, so first of all make sure you have the Azure Functions extension for VS Code. You can get it from the marketplace, from the Extensions menu, or via the link: vscode:extension/ms-azuretools.vscode-azurefunctions.

Before proceeding, make sure you are logged in to your Azure subscription (run az login).
1. Create an Azure Functions project: click the add function icon, select the blob trigger option, and give the function a name.

2. Choose the Blob Storage Account you are using in your Sitecore instance (myblobtestazure_STORAGE in my case).

3. Choose your blob container path (blobcontainer/{name}).

4. The basics are now created and we can start working on our implementation. (If you prefer the terminal, a CLI equivalent of these scaffolding steps is sketched right after this list.)
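
For reference, the same scaffolding can be done with the Azure Functions Core Tools. This is a sketch; the project and function names simply match the ones used later in this post:

func init SitecoreImageCompressor --dotnet
cd SitecoreImageCompressor
func new --name CompressBlob --template "Azure Blob Storage trigger"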

Generated project files
The project template creates a project in your chosen language and installs required dependencies. For any language, the new project has these files:
- host.json: Lets you configure the Functions host. These settings apply when you’re running functions locally and when you’re running them in Azure. For more information, see host.json reference.
- local.settings.json: Maintains settings used when you're running functions locally; these settings are not used when running in Azure. For more information, see Local settings file.
Edit the local.settings.json file to add the connection string of your blob storage:
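It should end up looking something like this (a sketch; the account name and key are placeholders, and the setting name must match the Connection value used by the function below):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "myblobtestazure_STORAGE": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"
  }
}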

The function implementation
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using ImageMagick;
using Microsoft.WindowsAzure.Storage.Blob;

namespace SitecoreImageCompressor
{
    public static class CompressBlob
    {
        [FunctionName("CompressBlob")]
        public static async Task Run(
            [BlobTrigger("blobcontainer/{name}", Connection = "myblobtestazure_STORAGE")] CloudBlockBlob inputBlob,
            ILogger log)
        {
            log.LogInformation($"C# Blob trigger function processed blob\n Name: {inputBlob.Name} \n Size: {inputBlob.Properties.Length} Bytes");

            // Overwriting the blob re-fires the trigger, so skip blobs we've already marked as processed.
            if (inputBlob.Metadata.ContainsKey("Status") && inputBlob.Metadata["Status"] == "Processed")
            {
                log.LogInformation($"blob: {inputBlob.Name} has already been processed");
                return;
            }

            using (var memoryStream = new MemoryStream())
            {
                // Download the blob into memory and rewind the stream before reading it.
                await inputBlob.DownloadToStreamAsync(memoryStream);
                memoryStream.Position = 0;
                var before = memoryStream.Length;
                var optimizer = new ImageOptimizer { OptimalCompression = true, IgnoreUnsupportedFormats = true };

                if (optimizer.IsSupported(memoryStream))
                {
                    var compressionResult = optimizer.Compress(memoryStream);
                    if (compressionResult)
                    {
                        var after = memoryStream.Length;
                        var gain = 100 - (float)(after * 100) / before;
                        log.LogInformation($"Optimized {inputBlob.Name} - from: {before} to: {after} Bytes. Optimized {gain}%");

                        // Rewind again before overwriting the original blob with the compressed image.
                        memoryStream.Position = 0;
                        await inputBlob.UploadFromStreamAsync(memoryStream);
                    }
                    else
                    {
                        log.LogInformation($"Image {inputBlob.Name} - compression failed...");
                    }
                }
                else
                {
                    var info = MagickNET.GetFormatInformation(new MagickImageInfo(memoryStream).Format);
                    log.LogInformation($"Image {inputBlob.Name} - the format is not supported. Compression skipped - {info.Format}");
                }
            }

            // Mark the blob so the next trigger invocation ignores it.
            inputBlob.Metadata.Add("Status", "Processed");
            await inputBlob.SetMetadataAsync();
        }
    }
}
As you can see, I'm creating an async task that is triggered as soon as a new blob is added to the blob storage. Since we're compressing and then uploading the modified image, we have to make sure the function is not triggered again on its own output. To avoid that, I'm also updating the blob metadata with a "Status = Processed" flag and skipping any blob that already carries it.
The next step is to get the image from the CloudBlockBlob and then compress it using the Magick.NET library. Please note that this library also provides a LosslessCompress method; for this implementation I chose to go with the full compression. Feel free to update and compare the results.
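For reference, switching to the lossless variant is a one-line change to the snippet above (a sketch; the gains are usually smaller, but pixel data is left untouched):

var optimizer = new ImageOptimizer { OptimalCompression = true, IgnoreUnsupportedFormats = true };
// LosslessCompress re-encodes and strips metadata without altering pixel data.
var compressionResult = optimizer.LosslessCompress(memoryStream);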
NuGet references
So, to make it work we need to install the required dependencies. Please run the following commands to install the NuGet packages:
- dotnet add package Azure.Storage.Blobs --version 12.8.0
- dotnet add package Magick.NET-Q16-AnyCPU --version 7.23.2
- dotnet add package Microsoft.Azure.WebJobs.Extensions.Storage --version 3.0.10
- dotnet add package Microsoft.Azure.WebJobs.Host.Storage --version 4.0.1
- dotnet add package Microsoft.NET.Sdk.Functions --version 1.0.38
Test and deploy
Now we have everything in place. Let's press F5 and make sure the function compiles and runs locally.

We are now ready to deploy to Azure and test the blob trigger! Click the up arrow (deploy) icon, choose your subscription and go!
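If you prefer the command line, the Azure Functions Core Tools can publish the project as well (the function app name below is a placeholder for an existing Function App in your subscription):

func azure functionapp publish <your-function-app-name>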

Check the progress in the terminal and output window:


Testing the trigger
Now we can go to the Azure portal, open the Azure Function and double-check that everything is there as expected:

Go to "Monitor" and click on "Logs" so we can watch the live log stream while uploading an image to the blob storage. Now, in your Sitecore instance, go to the Media Library and upload an image. This uploads the blob to Azure Storage, the trigger fires and the image gets compressed.


As we can see in the logs, the image got compressed, gaining almost 15%:
2021-02-23T10:21:36.894 [Information] Optimized 6bdf3e56-c6fc-488b-a7bb-eee64ce04343 - from: 81147 to: 69158 Bytes. Optimized 14.774422%


Let's check the browser for the final results.
Without the trigger: the image size is 81147 bytes.

With the trigger: the image size is 69158 bytes.

I hope you find this useful. You can also get the full implementation from GitHub.
Thanks for reading!