Like a good coding nerd, I spent (a surprising amount of) time figuring out how to build Azure Functions with the following trigger types, all while using a managed identity:
- SQL Table
- Blob
- Blob using Event Grid
- Queue
- There's a command-line tool, "func", that you install with an MSI called Azure Functions Core Tools. I would have expected templates that you install with the dotnet CLI.
- There's something in Visual Studio that you need to update from the Options menu. It doesn't update by itself as far as I know, and it's not just an extension; you have to hunt for it in the menus. Click that button and there's no progress meter, just an ephemeral message in the bottom-right corner of VS. I would have expected an extension, or something that updates with VS every couple of weeks the way VS itself does.
- There's the Azure command-line tool, azd. The docs say it requires the Azure Functions Core Tools, so maybe azd just calls func.

Tooling gripes aside, here's the Aspire AppHost's Program.cs that wires everything together:
var builder = DistributedApplication.CreateBuilder(args);

var db = builder.AddConnectionString("db");

var migrations = builder.AddProject<Projects.MigrationService>("migrationservice")
    .WithReference(db)
    .WaitFor(db);

var storage = builder.AddAzureStorage("storage")
    .RunAsEmulator(az => az.WithLifetime(ContainerLifetime.Persistent));

builder.AddAzureFunctionsProject<Projects.QueueTriggerFunction>("queuetriggerfunction")
    .WithHostStorage(storage)
    .WaitFor(storage)
    .WithReference(db)
    .WaitFor(db)
    .WaitForCompletion(migrations);

builder.Build().Run();
Over the years, a couple of Function defaults have driven me to distraction, leading to my next two complaints.
Complaint #3: local.settings.json is presumed to contain secret values, so the default .gitignore excludes it from commits. However, one critical value that everyone who works with the repo needs is "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated".
Complaint #4: something about the default logging setup ensures that only Warning-and-above logging makes it to the Azure output. The Azure console relies SOLELY on this output to determine whether your function ran, so logging something is critical. (Never mind that some other part of Azure clearly knows it ran your function.)
I want local.settings.json committed, and I want to use the facility .NET Core already has for per-developer settings: secrets.json. Plus, using Aspire means the Function project never needs settings in a config file anyway.
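To make that concrete, here's roughly what the two files end up holding. This is a sketch: the MyConfigurationSecrets section matches the options class wired up below, and the values are placeholders.

local.settings.json (committed; nothing secret in it):

{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated"
  }
}

secrets.json (per developer, managed with dotnet user-secrets):

{
  "MyConfigurationSecrets": {
    "Secret": "some-per-dev-value"
  }
}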
I also found some code that fixes the default filtering that drops Information-and-below logging.
This is my Function project's Program.cs. I added a simple config class and put a value for it in secrets.json to make sure the code uses it.
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Builder;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using QueueTriggerFunction;
using System.Reflection;
using Microsoft.Extensions.Logging;
using Data;
using Microsoft.EntityFrameworkCore;

var builder = FunctionsApplication.CreateBuilder(args);

builder.AddServiceDefaults();
builder.ConfigureFunctionsWebApplication();

builder.Services
    .AddApplicationInsightsTelemetryWorkerService()
    .ConfigureFunctionsApplicationInsights();

builder.Logging.Services.Configure<LoggerFilterOptions>(options =>
{
    // The Application Insights SDK adds a default logging filter that instructs ILogger to capture
    // only Warning and more severe logs. Application Insights requires an explicit override.
    // Log levels can also be configured using appsettings.json. For more information, see
    // /azure-monitor/app/worker-service#ilogger-logs
    LoggerFilterRule? defaultRule = options.Rules.FirstOrDefault(rule => rule.ProviderName
        == "Microsoft.Extensions.Logging.ApplicationInsights.ApplicationInsightsLoggerProvider");
    if (defaultRule is not null)
    {
        options.Rules.Remove(defaultRule);
    }

    // Add a new rule to capture Information and above for Application Insights
    options.AddFilter("Microsoft.Extensions.Logging.ApplicationInsights.ApplicationInsightsLoggerProvider",
        LogLevel.Information);
    options.MinLevel = LogLevel.Information;
});

builder.Services.AddOptions<MyConfigurationSecrets>()
    .Configure<IConfiguration>((settings, configuration) =>
    {
        configuration.GetSection("MyConfigurationSecrets").Bind(settings);
    });

builder.Configuration
    .SetBasePath(Environment.CurrentDirectory)
    .AddJsonFile("local.settings.json", optional: true)
    .AddUserSecrets(Assembly.GetExecutingAssembly(), optional: true)
    .AddEnvironmentVariables();

builder.Services.AddDbContext<DataContext>(optionsBuilder =>
{
    optionsBuilder.UseSqlServer(
        builder.Configuration.GetConnectionString("db"),
        b =>
        {
            b.MigrationsAssembly("Data");
            b.EnableRetryOnFailure();
        });
});

builder.Build().Run();
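For reference, the "simple config class" can be as small as a POCO with the one property the queue trigger logs later; a minimal sketch:

namespace QueueTriggerFunction;

public class MyConfigurationSecrets
{
    public string? Secret { get; set; }
}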
There's a migrations WebJob project, called Data, like the ones seen in Aspire examples; it creates the DB and a simple table. How the table gets created isn't important, so I'm not including it here. I will show the steps for enabling table change tracking later, though.
A word on NuGet versions as of December 2025:
- Aspire 13 libraries
- Latest Functions libraries
- Latest .NET 9 libraries, except for Microsoft.Extensions.Configuration.UserSecrets, which is the latest 10.x.
The client ID value is the one with that name from the managed identity's properties:
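As for where that client ID goes: with identity-based connections, the storage connection is configured as separate app settings instead of a key-based connection string. Since the triggers below don't specify a Connection name, they use the default AzureWebJobsStorage connection, which looks something like this (the account name is a placeholder; the client ID is the value above):

AzureWebJobsStorage__accountName = <storage account name>
AzureWebJobsStorage__credential = managedidentity
AzureWebJobsStorage__clientId = <managed identity client ID>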
The managed identity has the following role assignments for the storage account. They are probably more than is needed for just reading messages and blobs:
- Storage Blob Data Owner
- Storage Blob Data Contributor
- Storage Queue Data Contributor
- Storage Queue Data Message Processor
- Storage Account Contributor
Queue Trigger
using Azure.Storage.Queues.Models;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;

namespace QueueTriggerFunction;

public class QueueTriggerFunction
{
    private readonly MyConfigurationSecrets _myConfigurationSecrets;
    private readonly ILogger<QueueTriggerFunction> _logger;

    public QueueTriggerFunction(ILogger<QueueTriggerFunction> logger, IOptions<MyConfigurationSecrets> myConfigurationSecrets)
    {
        _logger = logger;
        _myConfigurationSecrets = myConfigurationSecrets.Value;
    }

    [Function(nameof(QueueTriggerFunction))]
    public void Run([QueueTrigger("myqueue-items")] QueueMessage message)
    {
        _logger.LogInformation("Using secret: {Secret}", _myConfigurationSecrets.Secret);
        _logger.LogInformation("C# Queue trigger function processed: {messageText}", message.MessageText);
    }
}
- Reader
- Storage Blob Data Contributor
- Storage Queue Data Contributor
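A quick way to drop a test message on that queue from a dev box is a small console app along these lines; this is a sketch with a placeholder account name, and note that the Functions queue extension expects base64-encoded messages by default, hence the QueueClientOptions:

using Azure.Identity;
using Azure.Storage.Queues;

// Connect to the queue the trigger watches, authenticating with whatever
// identity DefaultAzureCredential resolves (az login, Visual Studio, etc.)
var queueClient = new QueueClient(
    new Uri("https://<storage-account>.queue.core.windows.net/myqueue-items"),
    new DefaultAzureCredential(),
    new QueueClientOptions { MessageEncoding = QueueMessageEncoding.Base64 });

// Send a message; the trigger should log it shortly afterwards
await queueClient.SendMessageAsync("hello from my machine");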
File Trigger
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

namespace QueueTriggerFunction;

public class FileTriggerFunction
{
    private readonly ILogger<FileTriggerFunction> _logger;

    public FileTriggerFunction(ILogger<FileTriggerFunction> logger)
    {
        _logger = logger;
    }

    [Function(nameof(FileTriggerFunction))]
    public async Task Run([BlobTrigger("uploads/{name}")] Stream stream, string name)
    {
        using var blobStreamReader = new StreamReader(stream);
        var content = await blobStreamReader.ReadToEndAsync();
        _logger.LogInformation("C# Blob trigger function Processed blob\n Name: {name} \n Data: {content}", name, content);
    }
}
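To exercise this one, upload something to the uploads container with the same kind of credential; again a sketch with a placeholder account name:

using Azure.Identity;
using Azure.Storage.Blobs;

// Point at the container the BlobTrigger watches
var containerClient = new BlobContainerClient(
    new Uri("https://<storage-account>.blob.core.windows.net/uploads"),
    new DefaultAzureCredential());

// Upload a small text blob; the trigger should pick it up and log its contents
await containerClient.UploadBlobAsync("hello.txt", BinaryData.FromString("hello blob trigger"));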
SQL Trigger
using Data;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Extensions.Sql;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

namespace QueueTriggerFunction;

public class SqlTriggerFunction
{
    private readonly ILogger _logger;

    public SqlTriggerFunction(ILoggerFactory loggerFactory)
    {
        _logger = loggerFactory.CreateLogger<SqlTriggerFunction>();
    }

    // Visit https://aka.ms/sqltrigger to learn how to use this trigger binding
    [Function("SqlTriggerFunction")]
    public void Run(
        [SqlTrigger("[dbo].[ToDoItems]", "db")] IReadOnlyList<SqlChange<ToDoItem>> changes,
        FunctionContext context)
    {
        _logger.LogInformation("SQL Changes: {Changes}", JsonConvert.SerializeObject(changes));
    }
}
- Add the managed identity to the DB and give it the DB roles needed to read from the DB

CREATE USER [queue-trigger-user] FROM EXTERNAL PROVIDER;
ALTER ROLE db_datareader ADD MEMBER [queue-trigger-user];
ALTER ROLE db_datawriter ADD MEMBER [queue-trigger-user];
ALTER ROLE db_ddladmin ADD MEMBER [queue-trigger-user];

- Enable change tracking on the DB

ALTER DATABASE [function-test] SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

- Enable change tracking on the table

ALTER TABLE [dbo].[ToDoItems] ENABLE CHANGE_TRACKING;

- Grant the DB identity the ability to see change tracking events on the table

GRANT VIEW CHANGE TRACKING ON OBJECT::dbo.ToDoItems TO [queue-trigger-user];
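With that in place, the "db" connection setting the SqlTrigger points at can be an ordinary connection string that authenticates as the user-assigned identity. A sketch, with a placeholder server name and the database name from the script above:

Server=tcp:<your-server>.database.windows.net,1433;Database=function-test;Authentication=Active Directory Managed Identity;User Id=<managed identity client ID>;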
Event Grid-Based Blob Trigger
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

namespace QueueTriggerFunction;

public class FileTriggerEventGridFunction
{
    private readonly ILogger<FileTriggerEventGridFunction> _logger;

    public FileTriggerEventGridFunction(ILogger<FileTriggerEventGridFunction> logger)
    {
        _logger = logger;
    }

    [Function(nameof(FileTriggerEventGridFunction))]
    public async Task Run([BlobTrigger("uploads-eventgrid/{name}", Source = BlobTriggerSource.EventGrid)] Stream stream, string name)
    {
        using var blobStreamReader = new StreamReader(stream);
        var content = await blobStreamReader.ReadToEndAsync();
        _logger.LogInformation("C# Blob trigger function via event grid Processed blob\n Name: {name} \n Data: {content}", name, content);
    }
}
In my case, I wanted only uploads in a specific container to fire events, so I had to specify a filter on the event subscription.
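Blob-created events have a subject of the form /blobServices/default/containers/<container>/blobs/<blob name>, so scoping the subscription to the container used by the trigger above comes down to a "Subject Begins With" filter along these lines:

/blobServices/default/containers/uploads-eventgrid/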