Azure Blob Storage — Uploading, Downloading and Generating SAS URLs

Azure Blob Storage is the standard cloud storage solution for uploaded files in Azure-hosted ASP.NET Core applications. It provides massively scalable storage, integration with Azure CDN, and Shared Access Signature (SAS) URLs for time-limited, secure access to private files. The Azure.Storage.Blobs SDK integrates cleanly with ASP.NET Core’s DI container and supports both connection-string and passwordless managed-identity authentication — making it suitable from local development through to production without credential-management overhead.

Azure Blob Storage Service

// dotnet add package Azure.Storage.Blobs
// dotnet add package Azure.Identity  (for DefaultAzureCredential)

// ── Registration ──────────────────────────────────────────────────────────
builder.Services.AddSingleton(new BlobServiceClient(
    builder.Configuration.GetConnectionString("AzureStorage")));
// OR with managed identity (no connection string in production):
// builder.Services.AddSingleton(new BlobServiceClient(
//     new Uri("https://blogappstorage.blob.core.windows.net"),
//     new DefaultAzureCredential()));

builder.Services.AddScoped<IBlobStorageService, AzureBlobStorageService>();
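
// ── Abstraction (sketch) ──────────────────────────────────────────────────
// The IBlobStorageService interface and FileUploadResult record are not
// shown in this listing; the shapes below are assumptions matching the
// implementation that follows — adjust names and signatures to your codebase.
public record FileUploadResult(string FileName, string Url, long SizeBytes);

public interface IBlobStorageService
{
    Task<FileUploadResult> UploadAsync(string container, string fileName,
        Stream stream, string contentType, CancellationToken ct = default);
    Task<Stream> DownloadAsync(string container, string fileName,
        CancellationToken ct = default);
    Task DeleteAsync(string container, string fileName,
        CancellationToken ct = default);
    Uri GenerateSasUrl(string container, string fileName, TimeSpan validity,
        BlobSasPermissions permissions = BlobSasPermissions.Read);
}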

// ── Azure Blob Storage implementation ────────────────────────────────────
public class AzureBlobStorageService(BlobServiceClient blobClient) : IBlobStorageService
{
    public async Task<FileUploadResult> UploadAsync(
        string container, string fileName, Stream stream,
        string contentType, CancellationToken ct = default)
    {
        var containerClient = blobClient.GetBlobContainerClient(container);
        await containerClient.CreateIfNotExistsAsync(
            PublicAccessType.None, cancellationToken: ct);

        var blob = containerClient.GetBlobClient(fileName);
        await blob.UploadAsync(stream, new BlobUploadOptions
        {
            HttpHeaders = new BlobHttpHeaders { ContentType = contentType },
        }, ct);

        var sizeBytes = (await blob.GetPropertiesAsync(cancellationToken: ct)).Value.ContentLength;
        return new FileUploadResult(fileName, blob.Uri.ToString(), sizeBytes);
    }

    public async Task<Stream> DownloadAsync(
        string container, string fileName, CancellationToken ct = default)
    {
        var blob = blobClient.GetBlobContainerClient(container).GetBlobClient(fileName);
        var download = await blob.DownloadStreamingAsync(cancellationToken: ct);
        return download.Value.Content;
    }

    public async Task DeleteAsync(string container, string fileName, CancellationToken ct = default)
    {
        var blob = blobClient.GetBlobContainerClient(container).GetBlobClient(fileName);
        await blob.DeleteIfExistsAsync(cancellationToken: ct);
    }

    // ── SAS URL — time-limited access to private blobs ────────────────────
    public Uri GenerateSasUrl(string container, string fileName,
        TimeSpan validity, BlobSasPermissions permissions = BlobSasPermissions.Read)
    {
        var blob = blobClient.GetBlobContainerClient(container).GetBlobClient(fileName);
        var sasBuilder = new BlobSasBuilder
        {
            BlobContainerName = container,
            BlobName          = fileName,
            Resource          = "b",                       // "b" for blob
            ExpiresOn         = DateTimeOffset.UtcNow.Add(validity),
        };
        sasBuilder.SetPermissions(permissions);
        return blob.GenerateSasUri(sasBuilder);
        // Returns: https://...blob.core.windows.net/images/abc.jpg?sv=...&sig=...
        // Valid for the specified duration; expires automatically.
        // Note: GenerateSasUri requires shared key auth (connection string) and
        // throws otherwise. When authenticating with DefaultAzureCredential, build
        // a user delegation SAS via BlobServiceClient.GetUserDelegationKeyAsync.
    }
}
Note: DefaultAzureCredential is the recommended authentication approach for Azure-hosted applications. It checks multiple credential sources in order: environment variables, workload identity, managed identity, Azure CLI, Visual Studio. In local development it uses the developer’s Azure CLI credentials (az login); in production it uses the App Service or Container Apps managed identity — no secrets stored anywhere. The same application code works in both environments without configuration changes.
Tip: Generate SAS URLs for private blob access rather than making blobs publicly accessible. A SAS URL grants time-limited, permission-scoped access to a specific blob. A user downloading their uploaded document gets a SAS URL valid for 1 hour — after that, the URL stops working even if shared. Public blobs (for static site assets, public images) can use direct blob URLs. Private blobs (user uploads, documents) should always be accessed via SAS URLs generated server-side after authorisation checks.
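Server-side, that flow might look like the sketch below — the endpoint route and the `_authService`/`_blobStorage` members are assumptions, reusing the service defined above:

```csharp
// ── Hypothetical endpoint: hand out a 1-hour SAS URL after an auth check ──
[HttpGet("documents/{fileName}/download-url")]
[Authorize]
public async Task<IActionResult> GetDownloadUrl(string fileName, CancellationToken ct)
{
    // Authorisation happens server-side, before any URL is minted
    if (!await _authService.CanDownloadAsync(User.GetUserId(), "documents", fileName, ct))
        return Forbid();

    // Read-only SAS, valid for one hour, scoped to this single blob
    var url = _blobStorage.GenerateSasUrl("documents", fileName, TimeSpan.FromHours(1));
    return Ok(new { url });
}
```

The client then downloads directly from blob storage using the returned URL, keeping the large transfer off the API server.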
Warning: Never store Azure storage connection strings with write permissions in the Angular client or expose them in API responses. A connection string with write access allows anyone who obtains it to upload arbitrary data (potentially malicious files, vast quantities of data) directly to your storage account, bypassing all server-side validation. The Angular client should upload to your API, which validates and then uploads to Azure. For direct-to-Azure uploads (large file optimisation), generate a scoped SAS URL for a single upload operation from the server and return it to the client.
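For that last case, a sketch of a scoped upload SAS (method name is an assumption; it reuses GenerateSasUrl from the service above):

```csharp
// ── Hypothetical: short-lived, write-only SAS for a single client upload ──
// The server chooses the blob name (never the client) and grants only
// Create|Write for a few minutes — enough for one upload, nothing more.
public Uri GenerateUploadSasUrl(string container, string serverChosenName)
    => GenerateSasUrl(container, serverChosenName, TimeSpan.FromMinutes(10),
        BlobSasPermissions.Create | BlobSasPermissions.Write);
```

Because the name, container, permissions and expiry are all fixed server-side, a leaked upload URL is far less damaging than a leaked connection string.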

Streaming File Download

// ── Stream file download to client ────────────────────────────────────────
[HttpGet("{container}/{fileName}")]
[Authorize]
public async Task<IActionResult> Download(string container, string fileName, CancellationToken ct)
{
    // Verify the requesting user is authorised to download this file
    if (!await _authService.CanDownloadAsync(User.GetUserId(), container, fileName, ct))
        return Forbid();

    // Stream directly — do not buffer in memory
    var stream = await _blobStorage.DownloadAsync(container, fileName, ct);
    var contentType = "application/octet-stream";   // generic binary
    return File(stream, contentType, fileName);     // File() handles streaming
    // File() sets Content-Disposition: attachment; filename="..."
    // Streams response in chunks — works for any file size
}
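
The generic application/octet-stream type forces a download in most browsers; when the real MIME type matters (e.g. showing images inline), one option is an extension lookup — a sketch using ASP.NET Core's built-in FileExtensionContentTypeProvider:

```csharp
// Sketch: map a file extension to a MIME type, falling back to octet-stream.
// Requires: using Microsoft.AspNetCore.StaticFiles;
private static string GetContentType(string fileName)
{
    var provider = new FileExtensionContentTypeProvider();
    return provider.TryGetContentType(fileName, out var contentType)
        ? contentType
        : "application/octet-stream";
}
```

Alternatively, the content type stored on the blob at upload time (BlobHttpHeaders.ContentType) can be read back from the blob's properties and echoed to the client.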

Common Mistakes

Mistake 1 — Making all blobs publicly accessible (any URL exposes all files)

❌ Wrong — container set to PublicAccessType.Blob; user documents accessible to anyone with the URL.

✅ Correct — use PublicAccessType.None for private files; serve via SAS URLs or streaming endpoints with auth checks.

Mistake 2 — Downloading entire blob to MemoryStream before sending to client

❌ Wrong — 200MB file = 200MB memory allocation on the server per download.

✅ Correct — stream from Azure Blob directly to the HTTP response via return File(stream, contentType).

🧠 Test Yourself

A user requests a private document. The server generates a SAS URL valid for 1 hour and returns it to the Angular client. The user shares the URL publicly. What happens after 1 hour?