
Quick Example using Azure's Node.js SDK for Signed URLs

Way back in June (wait, that's only two months ago?) I wrote up a blog post showing how to use the AWS SDK for Signed URLs: "Quick example using AWS Node.js SDK V3 for Signed URLs". The idea was to cover a very specific set of functionality I needed to use along with Adobe's Firefly Services. Specifically, my needs are:

  • Create a readable URL for a cloud storage asset
  • Create a writable URL for a cloud storage asset

On top of that, I also needed to upload directly to cloud storage. I worked with the Azure Storage Blob SDK and came up with the following functions. Honestly, take these with a grain of salt as they "worked for me", but I can't make any promises about how reliable or safe this code is. That being said, I'd love any comments or suggestions.

Imports and Connecting

Once I installed the SDK, I began by importing what I needed:

import { BlobServiceClient, BlobSASPermissions, generateBlobSASQueryParameters, StorageSharedKeyCredential } from "@azure/storage-blob";

Next, I loaded in my credentials as well as an account and container name. To be clear, the credentials are an Azure key I got from my portal and a connection string. The account name also came from the portal, and finally, the container name is the 'bucket' where I'm working. I feel like the connection string could be constructed dynamically, but I hard coded it. All of these values are in my environment:

// Credentials for Azure
const AZURE_ACCOUNTNAME = process.env.AZURE_ACCOUNTNAME;
const AZURE_KEY = process.env.AZURE_KEY;
const AZURE_CONTAINERNAME = process.env.AZURE_CONTAINERNAME;
const AZURE_CONNECTIONSTRING = process.env.AZURE_CONNECTIONSTRING;

And finally, I created my client objects:

const blobServiceClient = BlobServiceClient.fromConnectionString(AZURE_CONNECTIONSTRING);
const containerClient = blobServiceClient.getContainerClient(AZURE_CONTAINERNAME);
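
If you did want to build the connection string dynamically instead of hard coding it, it follows a documented format, so a sketch could look like this. I haven't tested this variant myself, and it assumes the standard public-cloud endpoint suffix:

// Sketch: assemble the connection string from the account name and key.
// EndpointSuffix here is the public-cloud default; other Azure clouds differ.
function buildConnectionString(accountName, key) {
	return `DefaultEndpointsProtocol=https;AccountName=${accountName};AccountKey=${key};EndpointSuffix=core.windows.net`;
}

// This could then stand in for the AZURE_CONNECTIONSTRING environment variable:
const blobServiceClient = BlobServiceClient.fromConnectionString(buildConnectionString(AZURE_ACCOUNTNAME, AZURE_KEY));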

Creating Read URLs

To create readable URLs, I used two functions.

// Generates a SAS token (the query string portion of a signed URL) granting
// read access, signed with the account's shared key.
function createSASReadString(key, accountName, containerName, duration=5) {
	let permissions = new BlobSASPermissions();
	permissions.read = true;
	// Expire the token `duration` minutes from now.
	let currentDateTime = new Date();
	let expiryDateTime = new Date(currentDateTime.setMinutes(currentDateTime.getMinutes()+duration));
	let blobSasModel = {
		containerName,
		permissions,
		expiresOn: expiryDateTime
	};
	let credential = new StorageSharedKeyCredential(accountName,key);
	return generateBlobSASQueryParameters(blobSasModel,credential);
}

// Builds the full signed URL for a blob: the blob's URL plus the SAS token.
function getSignedDownloadUrl(name, key, accountName, containerName) {
	let b = containerClient.getBlockBlobClient(name);
	return b.url + '?' + createSASReadString(key, accountName, containerName);
}

Note that getSignedDownloadUrl chains to createSASReadString and doesn't pass along a duration; I could update that. And honestly, looking at this now, I think it should be one function. When I was building this, I thought I'd be reusing createSASReadString a few times, but I don't think I ever did. You could easily wrap the two together (something like the sketch below), and I may do so in the future.
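
For the curious, here's a sketch of what that combined, drop-in version might look like, with the duration exposed as an argument. Again, this is a rewrite suggestion, not code I actually ran:

// Sketch: one function that builds the SAS and the final URL together.
// Drop-in replacement for the two functions above (untested).
function getSignedDownloadUrl(name, key, accountName, containerName, duration=5) {
	let permissions = new BlobSASPermissions();
	permissions.read = true;
	let expiresOn = new Date(Date.now() + (duration * 60 * 1000));
	let credential = new StorageSharedKeyCredential(accountName, key);
	let sas = generateBlobSASQueryParameters({ containerName, permissions, expiresOn }, credential);
	let b = containerClient.getBlockBlobClient(name);
	return `${b.url}?${sas}`;
}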

Using it then is as simple as:

let inputURL = await getSignedDownloadUrl(fileName, AZURE_KEY, AZURE_ACCOUNTNAME, AZURE_CONTAINERNAME);

Note that I'm passing in my auth stuff (although getSignedDownloadUrl still relies on the global containerClient). In that previous blog post, the methods I wrote used the global s3 object, which is "bad" but also simpler. I thought the approach above was a bit more generic and pure.

I don't want to get too caught up in it though - feel free to modify what I built. ;)

Creating Write URLs

On the flip side, here's the method to create writable URLs. This can be handed off, for example, to the Photoshop APIs and used for outputs.

// Generates a complete signed URL granting write access for `duration` minutes.
// Unlike the read version, this uses the blob client's generateSasUrl helper,
// which picks up the shared key credential from the client itself.
async function getSignedUploadUrl(name, client, containerName, duration=5) {
	let permissions = new BlobSASPermissions();
	permissions.write = true;
	// Expire the URL `duration` minutes from now.
	let currentDateTime = new Date();
	let expiryDateTime = new Date(currentDateTime.setMinutes(currentDateTime.getMinutes()+duration));
	let blobSasModel = {
		containerName,
		permissions,
		expiresOn: expiryDateTime
	};
	let tempBlockBlobClient = client.getBlockBlobClient(name);
	return await tempBlockBlobClient.generateSasUrl(blobSasModel);
}

Using it looks like so:

let outputInvertedURL = await getSignedUploadUrl(fileName, containerClient, AZURE_CONTAINERNAME);
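
And since this one does expose the duration, you can pass a longer window if you need one:

// Fifteen minute window instead of the default five
let outputURL = await getSignedUploadUrl(fileName, containerClient, AZURE_CONTAINERNAME, 15);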

Uploading to Azure

Normally I didn't have to worry about uploading to Azure - if I made an upload URL and the API used it, there was nothing else for me to do. But I was curious how it would work. My 'usual' upload code failed because Azure requires a special header. Here's the function:

// Note: this relies on the built-in fs module (import fs from 'fs';)
// and the global fetch available in Node 18+.
async function uploadFile(url, filePath) {
	let size = fs.statSync(filePath).size;
	let resp = await fetch(url, {
		method:'PUT',
		headers: {
			'Content-Type':'image/*',
			'Content-Length':size,
			// Azure requires this header when PUTting straight to a SAS URL
			'x-ms-blob-type':'BlockBlob'
		},
		body: fs.readFileSync(filePath)
	});
	// Surface failures instead of silently swallowing them
	if(!resp.ok) throw new Error(`Upload failed: ${resp.status} ${resp.statusText}`);
}

That x-ms-blob-type is the special header you need. Also note I've hard coded an image content-type. You could make that an argument or get the value dynamically.
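
As one simple way of doing that dynamically, here's a little extension-to-MIME-type lookup. This is just a suggestion - the helper name is made up and the list is obviously incomplete:

// Hypothetical helper: guess a content type from the file extension.
const MIME_TYPES = {
	jpg: 'image/jpeg',
	jpeg: 'image/jpeg',
	png: 'image/png',
	gif: 'image/gif',
	webp: 'image/webp'
};

function guessContentType(filePath) {
	let ext = filePath.split('.').pop().toLowerCase();
	return MIME_TYPES[ext] || 'application/octet-stream';
}

You would then pass guessContentType(filePath) in as the Content-Type value instead of the hard coded 'image/*'.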

Using it just requires the URL, which you get from the previous method, and a file path:

// sourceInput is something like './cats_rules.jpg'
let fileName = sourceInput.split('/').pop();
let uploadURL = await getSignedUploadUrl(fileName, containerClient, AZURE_CONTAINERNAME);
await uploadFile(uploadURL, sourceInput);
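
As a quick sanity check, you could then grab a read URL for the same blob and fetch it to confirm the upload landed. This wasn't part of my original flow, just a suggestion:

// Read back the blob we just uploaded to confirm it landed
let verifyURL = await getSignedDownloadUrl(fileName, AZURE_KEY, AZURE_ACCOUNTNAME, AZURE_CONTAINERNAME);
let verifyRes = await fetch(verifyURL);
console.log(`Verification fetch status: ${verifyRes.status}`);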

That's it. I hope this helps - this is the post I wish I had found when I started. ;)
