Thursday, January 1, 2026

Using Azure Functions with Managed Identity and SQL, Blob, Queue, and Event Grid Triggers

Like a good coding nerd, I spent (a surprising amount of) time figuring out how to use Azure Functions with the following trigger types, all while using a managed identity:

  • SQL Table
  • Blob
  • Blob using Event Grid
  • Queue
Microsoft's documentation and logging in Azure are spread out. Twitter's Grok AI did a great job of helping pull it together when things weren't working.

To start, I created the 2 usual Aspire projects using Visual Studio 2022 and .NET 9: the ServiceDefaults and AppHost projects.

I then created my Functions project in Visual Studio 2022 using .NET 9 and the Functions project template, choosing the queue trigger template. This brings me to my first 2 complaints.

Complaint #1: The Functions template does not support the use of central package management. If your solution already uses CPM, you will immediately have a broken project until you adjust the project file and Directory.Packages.props. Knowing this, I used a project that does not use CPM.

Complaint #2: Azure Functions tooling is available 3 different ways on Windows, none of which I consider standard. 
  • There's a command line tool "func" that you install with an MSI called Azure Functions Core Tools. I would have expected templates that you install with the dotnet CLI.
  • There's something in Visual Studio that you need to update from the Options menu. It doesn't update by itself as far as I know. It's not just an extension; you have to hunt for it in the menus.
    Click that button and there's no progress meter, just an ephemeral message in the bottom right corner of VS. I would have expected an extension, or something that updates automatically every couple of weeks the way VS itself does.
  • The Azure command line tool, azd. The docs say it requires the Azure Functions Core Tools, so maybe azd just calls func.
Which one is the latest? There's a GitHub repo for func, so maybe that one. I don't know where the VS tooling lives on the internet.

Here is my AppHost.cs. I started out using named connections for the queue and storage connections, but when it came to configuring them in Azure it got very complex, so it's using the defaults. The SQL connection is for the SQL Table trigger. More details below.

var builder = DistributedApplication.CreateBuilder(args);
 
var db = builder.AddConnectionString("db");
 
var migrations = builder.AddProject<Projects.MigrationService>("migrationservice")
    .WithReference(db)
    .WaitFor(db);
 
var storage = builder.AddAzureStorage("storage")
    .RunAsEmulator(az => az.WithLifetime(ContainerLifetime.Persistent))
;
 
builder.AddAzureFunctionsProject<Projects.QueueTriggerFunction>("queuetriggerfunction")
    .WithHostStorage(storage)
    .WaitFor(storage)
    .WithReference(db)
    .WaitFor(db)
    .WaitForCompletion(migrations)
    ;
 
builder.Build().Run();

Over the years, a couple of Function defaults have driven me to distraction, leading to my next 2 complaints.

Complaint #3: The local.settings.json file is assumed to contain secret values, so it is excluded from commits by the default .gitignore. However, one critical value that everyone who works with the repo needs is "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated".
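
For reference, a minimal local.settings.json that carries just that value looks something like this (a sketch based on the standard Functions template):

{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated"
  }
}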

Complaint #4: Something about the default logging setup ensures that only Warning-and-above logging makes it to the Azure output. The Azure console relies SOLELY on this output to determine whether your function ran, so logging something is critical. (Never mind that some other part of Azure knows it ran your function.)

I want local.settings.json committed, and I want to use the facility .NET Core already has for per-developer settings: secrets.json. Plus, using Aspire means the Function project never needs settings in a config file anyway.

I also found some code that removes the default exclusion of Information-and-below logging.

This is my Function project's Program.cs. I added a simple config class and put a value for it in secrets.json to make sure the code uses it.


using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Builder;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using QueueTriggerFunction;
using System.Reflection;
using Microsoft.Extensions.Logging;
using Data;
using Microsoft.EntityFrameworkCore;
 
var builder = FunctionsApplication.CreateBuilder(args);
 
builder.AddServiceDefaults();
 
builder.ConfigureFunctionsWebApplication();
 
builder.Services
    .AddApplicationInsightsTelemetryWorkerService()
    .ConfigureFunctionsApplicationInsights();
 
builder.Logging.Services.Configure<LoggerFilterOptions>(options =>
{
    // The Application Insights SDK adds a default logging filter that instructs ILogger to capture only Warning and more severe logs. Application Insights requires an explicit override.
    // Log levels can also be configured using appsettings.json. For more information, see /azure-monitor/app/worker-service#ilogger-logs
    LoggerFilterRule? defaultRule = options.Rules.FirstOrDefault(rule => rule.ProviderName
        == "Microsoft.Extensions.Logging.ApplicationInsights.ApplicationInsightsLoggerProvider");
    if (defaultRule is not null)
    {
        options.Rules.Remove(defaultRule);
    }
 
    // Add a new rule to capture Information and above for AI
    options.AddFilter("Microsoft.Extensions.Logging.ApplicationInsights.ApplicationInsightsLoggerProvider",
        LogLevel.Information);
    options.MinLevel = LogLevel.Information;
});
 
builder.Services.AddOptions<MyConfigurationSecrets>()
    .Configure<IConfiguration>((settings, configuration) =>
    {
        configuration.GetSection("MyConfigurationSecrets").Bind(settings);
    });
 
builder.Configuration
       .SetBasePath(Environment.CurrentDirectory)
       .AddJsonFile("local.settings.json", optional: true)
       .AddUserSecrets(Assembly.GetExecutingAssembly(), optional: true)
       .AddEnvironmentVariables();
 
builder.Services.AddDbContext<DataContext>(optionsBuilder =>
{
    optionsBuilder.UseSqlServer(
        builder.Configuration.GetConnectionString("db"),
        b =>
        {
            b.MigrationsAssembly("Data");
            b.EnableRetryOnFailure();
        });
});
 
builder.Build().Run();
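
The "simple config class" mentioned above is trivial; as a sketch, it only needs the property the function logs later:

// QueueTriggerFunction/MyConfigurationSecrets.cs (sketch)
namespace QueueTriggerFunction;

public class MyConfigurationSecrets
{
    public string? Secret { get; set; }
}

The per-developer value then lives in secrets.json (not local.settings.json), e.g.:

{
  "MyConfigurationSecrets": {
    "Secret": "only-on-my-machine"
  }
}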

There's a Migrations WebJob project called Data, like the ones seen in Aspire examples, that creates the DB and a simple table. How the table gets created isn't important, so I'm not including that here. I will show the steps for enabling table change tracking later, though.

A word on NuGet versions as of December 2025:

  • Aspire 13 libraries
  • Latest Functions libraries
  • Latest .NET 9 libraries, except for Microsoft.Extensions.Configuration.UserSecrets, which is the latest 10.x.
I let VS publish the function to Azure in a .NET 9, Linux App Service. It created most of the environment variables that start with AzureWebJobsStorage, but I don't remember exactly which ones. I think it populated the storage account to use, but I don't remember how it knew. I also don't remember if I had already configured the managed identity to use, which would have helped it fill in the values. Anyway, here is the complete configuration:
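
In sketch form, the identity-based settings follow the documented AzureWebJobsStorage__* convention (the exact set the publish step created may differ, and the placeholders are your own values):

AzureWebJobsStorage__accountName = <storage account name>
AzureWebJobsStorage__credential  = managedidentity
AzureWebJobsStorage__clientId    = <client ID of the user-assigned managed identity>
FUNCTIONS_WORKER_RUNTIME         = dotnet-isolated

There is also a per-service variant (AzureWebJobsStorage__blobServiceUri / AzureWebJobsStorage__queueServiceUri) instead of accountName; mixing the two styles is the kind of thing Complaint #6 below is about.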

The client ID value is the one with that name from the managed identity's properties.

The managed identity has the following role assignments for the storage account. They are probably more than are needed for just reading messages and blobs:
  • Storage Blob Data Owner
  • Storage Blob Data Contributor
  • Storage Queue Data Contributor
  • Storage Queue Data Message Processor
  • Storage Account Contributor
Something also assigned it the Monitoring Metrics Publisher role on the Application Insights resource that the function resource uses.

Some of these allow the managed identity to list blobs/messages, and certain other ones allow it to read individual blobs/messages. It is not as clear as AWS's explicit list of actions. Again, Grok helped fill in my knowledge. At one point it showed me how to enable diagnostic logging on the storage account and look up the authorization failure in a Log Analytics workspace.
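
If you prefer to script the role assignments instead of clicking through the portal, the Azure CLI form looks roughly like this (a sketch; repeat per role, and the IDs are placeholders):

az role assignment create \
  --assignee "<managed identity client ID>" \
  --role "Storage Queue Data Contributor" \
  --scope "/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"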


Queue Trigger

The queue trigger code is almost the stock template code.

using Azure.Storage.Queues.Models;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
 
namespace QueueTriggerFunction;
 
public class QueueTriggerFunction
{
    private readonly MyConfigurationSecrets _myConfigurationSecrets;
    private readonly ILogger<QueueTriggerFunction> _logger;
 
    public QueueTriggerFunction(ILogger<QueueTriggerFunction> logger, IOptions<MyConfigurationSecrets> myConfigurationSecrets)
    {
        _logger = logger;
        _myConfigurationSecrets = myConfigurationSecrets.Value;
    }
 
    [Function(nameof(QueueTriggerFunction))]
    public void Run([QueueTrigger("myqueue-items")] QueueMessage message)
    {
        _logger.LogInformation("Using secret: {Secret}", _myConfigurationSecrets.Secret);
        _logger.LogInformation("C# Queue trigger function processed: {messageText}", message.MessageText);
    }
}

At this point, if you create a queue called myqueue-items and add a text message, the trigger should fire. I did this using the Azure portal. For some reason, even though I'm an Owner on the account, I had to add the following rights for myself to use the storage explorer in Azure:
  • Reader
  • Storage Blob Data Contributor
  • Storage Queue Data Contributor
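
If you'd rather send the test message from code instead of the portal, a small sketch with the storage SDK works too. Note the Base64 option: the queue extension expects base64-encoded messages by default, and your own account needs the queue data roles listed above.

using Azure.Identity;
using Azure.Storage.Queues;

var queue = new QueueClient(
    new Uri("https://<storageaccount>.queue.core.windows.net/myqueue-items"),
    new DefaultAzureCredential(),
    new QueueClientOptions { MessageEncoding = QueueMessageEncoding.Base64 });

await queue.CreateIfNotExistsAsync();           // creates myqueue-items if it doesn't exist yet
await queue.SendMessageAsync("hello from the SDK");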


File Trigger

The file trigger code is also almost as simple as the template:

using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
 
namespace QueueTriggerFunction;
 
public class FileTriggerFunction
{
    private readonly ILogger<FileTriggerFunction> _logger;
 
    public FileTriggerFunction(ILogger<FileTriggerFunction> logger)
    {
        _logger = logger;
    }
 
    [Function(nameof(FileTriggerFunction))]
    public async Task Run([BlobTrigger("uploads/{name}")] Stream stream, string name)
    {
        using var blobStreamReader = new StreamReader(stream);
        var content = await blobStreamReader.ReadToEndAsync();
        _logger.LogInformation("C# Blob trigger function Processed blob\n Name: {name} \n Data: {content}", name, content);
    }
}

At this point, everything is already set up to allow the file trigger to run when you upload a text file to a blob container called uploads.
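
The same idea works for uploading a test blob from code instead of the portal (again a sketch, with the account name as a placeholder):

using Azure.Identity;
using Azure.Storage.Blobs;
using System.Text;

var container = new BlobContainerClient(
    new Uri("https://<storageaccount>.blob.core.windows.net/uploads"),
    new DefaultAzureCredential());

await container.CreateIfNotExistsAsync();

using var content = new MemoryStream(Encoding.UTF8.GetBytes("hello blob trigger"));
await container.UploadBlobAsync("test.txt", content);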


SQL Trigger

The SQL trigger code:

using Data;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Extensions.Sql;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
 
namespace QueueTriggerFunction;
 
public class SqlTriggerFunction
{
    private readonly ILogger _logger;
 
    public SqlTriggerFunction(ILoggerFactory loggerFactory)
    {
        _logger = loggerFactory.CreateLogger<SqlTriggerFunction>();
    }
 
    // Visit https://aka.ms/sqltrigger to learn how to use this trigger binding
    [Function("SqlTriggerFunction")]
    public void Run(
        [SqlTrigger("[dbo].[TodoItems]", "db")] IReadOnlyList<SqlChange<ToDoItem>> changes,
            FunctionContext context)
    {
        _logger.LogInformation("SQL Changes: " + JsonConvert.SerializeObject(changes));
    }
}
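
The ToDoItem entity itself isn't the interesting part; a minimal, hypothetical shape that satisfies the SqlChange<ToDoItem> binding could look like this (match whatever your migration actually creates; change tracking needs a primary key):

namespace Data;

public class ToDoItem
{
    public int Id { get; set; }              // primary key, required for change tracking
    public string Title { get; set; } = "";
    public bool IsComplete { get; set; }
}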

As I mentioned, I created the DB and its single table with a WebJob that ran an EF migration. We need to do a few things to make the SQL trigger work:
  1. Add the managed identity to the DB and give it the DB roles needed to read from the DB
    CREATE USER [queue-trigger-user] FROM EXTERNAL PROVIDER;
    ALTER ROLE db_datareader ADD MEMBER [queue-trigger-user];
    ALTER ROLE db_datawriter ADD MEMBER [queue-trigger-user];
    ALTER ROLE db_ddladmin ADD MEMBER [queue-trigger-user];
    
  2. Enable change tracking on the DB
    ALTER DATABASE [function-test]
    SET CHANGE_TRACKING = ON
    (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);
    
  3. Enable change tracking on the table
    ALTER TABLE [dbo].[ToDoItems]
    ENABLE CHANGE_TRACKING;
    
  4. Grant the DB identity the ability to see change tracking events on the table
    GRANT VIEW CHANGE TRACKING ON OBJECT::dbo.ToDoItems TO [queue-trigger-user];
    
We also need to configure the "db" connection string. None of the examples on the SQL DB connection string page in Azure are correct. The closest one is the "ADO.NET (Microsoft Entra integrated authentication)" one, except with the managed identity's Client ID in place of the User ID, and Authentication = "Active Directory Managed Identity".
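
The result is roughly this shape (a sketch; the server name is a placeholder, and function-test is the database from the change tracking step below):

Server=tcp:<your-server>.database.windows.net,1433;Initial Catalog=function-test;Authentication=Active Directory Managed Identity;User Id=<managed identity client ID>;Encrypt=True;Connection Timeout=30;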

If things are not working, one place I found errors is in the traces in the Live Metrics view of Application Insights. Nothing showed in the Failures view when the SQL trigger wasn't firing.

Complaint #5: Logging for Functions, and for Azure in general, is spread across so many places, unlike AWS and its CloudWatch system. A lot more seems to need to be enabled manually, like when diagnosing access problems to storage. Time display is also inconsistent: some views allow switching between local and UTC, some are local only, some are UTC only. And there are long delays in some cases, e.g. 5 minutes between a function running and it showing up in the console.

Complaint #6: There are too many ways to allow one Azure resource to access another. And once you've chosen a method, there can be many different ways to configure that one method, e.g. providing a config entry for each storage service (blob/table/queue) vs. a different one that points to the overall storage account. In the storage account case it tries one approach, then another. If you've accidentally provided config that's partially one and partially the other, you'll have a hard time understanding why it's unhappy.

At this point the SQL trigger was working properly.

Event Grid-Based Blob Trigger

I read somewhere that at scale the direct blob trigger is not the best choice and that you should use an Event Grid-based blob trigger. The code looks like this:

using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
 
namespace QueueTriggerFunction;
 
public class FileTriggerEventGridFunction
{
    private readonly ILogger<FileTriggerEventGridFunction> _logger;
 
    public FileTriggerEventGridFunction(ILogger<FileTriggerEventGridFunction> logger)
    {
        _logger = logger;
    }
 
    [Function(nameof(FileTriggerEventGridFunction))]
    public async Task Run([BlobTrigger("uploads-eventgrid/{name}", Source = BlobTriggerSource.EventGrid)] Stream stream, string name)
    {
        using var blobStreamReader = new StreamReader(stream);
        var content = await blobStreamReader.ReadToEndAsync();
        _logger.LogInformation("C# Blob trigger function via event grid Processed blob\n Name: {name} \n Data: {content}", name, content);
    }
}

Almost the same as the blob trigger, but the Source is set to Event Grid. This is not the same as a pure Event Grid trigger, which would receive an object that describes the event instead of a ready-to-use Stream.

You wire the storage event to the function as a webhook, so you have to build a funky URL to call. The Azure Functions docs detail how to build it. Overall it's

https://<FUNCTION_APP_NAME>.azurewebsites.net/runtime/webhooks/blobs?functionName=Host.Functions.<FunctionName>&code=<BLOB_EXTENSION_KEY>

The BLOB_EXTENSION_KEY comes from the App keys section of your function resource (the system key named blobs_extension).

In my case I wanted only uploads in a specific container to fire events, so I had to specify a subject filter ("Subject begins with") on the event subscription, e.g.

/blobServices/default/containers/uploads-eventgrid/blobs/
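
If you'd rather create the event subscription from the CLI than the portal, the pieces come together roughly like this (a sketch; the subscription name, resource IDs, and key are placeholders):

az eventgrid event-subscription create \
  --name blob-uploads-to-function \
  --source-resource-id "/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>" \
  --endpoint-type webhook \
  --endpoint "https://<FUNCTION_APP_NAME>.azurewebsites.net/runtime/webhooks/blobs?functionName=Host.Functions.FileTriggerEventGridFunction&code=<BLOB_EXTENSION_KEY>" \
  --included-event-types Microsoft.Storage.BlobCreated \
  --subject-begins-with "/blobServices/default/containers/uploads-eventgrid/blobs/"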

After all that, my Event Grid function started firing when I uploaded to uploads-eventgrid.

Summary

I do not find Azure easy to work with. Granted, I've only ever deployed my own little personal projects, while I have built an actual production-quality project in AWS with a team. But the difference in consistency of experience across services in Azure vs. AWS is glaring. There are fewer ways to do things in AWS, and that's probably due to the history and organization of each company. Microsoft is starting to introduce AI helpers directly in Azure, but I haven't found them nearly as helpful as pasting errors into Grok. As long as you can find those errors.