
Azure Stack – Use Function To Generate SAS Token For Blob Content


Introduction

When it comes to Azure Stack, a lot of customers want to know what it can do for them or, in other words, what the use cases for Azure Stack are.

Let’s assume you need to exchange files or any other blob content, either between systems or users, on-premises. This could be a scheduling / batch system which needs to copy certain files temporarily to a location so they can be picked up by another batch job. Another reason could be that you want to make files available to your customers, but only for a certain amount of time, e.g. 5 minutes or 1 hour; after that time has expired, the user should not be able to access the content anymore. There is even more: you may want to granularly control what the users or the system should be able to do with this blob content, e.g. only read and list, but not delete.

One way we can implement such a scenario is to use a shared access signature (SAS). But what is a SAS?

A SAS gives you granular control over the type of access you grant to clients who have the SAS, including:

  • The interval over which the SAS is valid, including the start time and the expiry time.
  • The permissions granted by the SAS. For example, a SAS for a blob might grant read and write permissions to that blob, but not delete permissions.
  • An optional IP address or range of IP addresses from which Azure Storage will accept the SAS. For example, you might specify a range of IP addresses belonging to your organization.
  • The protocol over which Azure Storage will accept the SAS. You can use this optional parameter to restrict access to clients using HTTPS.
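To make these controls a bit more tangible: if you already have the storage account key, you could create such a SAS locally with the Azure.Storage PowerShell module (the same module we use later in this post). The following is just a minimal sketch with placeholder account, container and blob names, granting read-only access for 5 minutes over HTTPS only:

# Minimal sketch (placeholder names): create a read-only SAS for a single blob,
# valid for 5 minutes and only accepted over HTTPS.
Import-Module Azure.Storage
$ctx = New-AzureStorageContext -StorageAccountName "demostorage" -StorageAccountKey "<storage account key>"
$sas = New-AzureStorageBlobSASToken -Container "test" -Blob "AwesomePicture.jpg" `
        -Permission r -StartTime (Get-Date) -ExpiryTime (Get-Date).AddMinutes(5) `
        -Protocol HttpsOnly -Context $ctx
# $sas is a query string like "?sv=...&sr=b&sig=..." which gets appended to the blob URL.

In our scenario, however, we do not want to hand the storage account key to every consumer, which is exactly where the function comes in.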

OK, so far so good, but what does that mean in terms of Azure Stack and Functions?

Good question, what we want to achieve is the following process:

  1. Create a script to copy a file to a storage blob account.
  2. Trigger a function via HTTP POST, passing the file name and blob location within the POST request, to create a SAS for that specific file.
  3. Return the SAS to the consumer which then can access the file.

If we put it in a more graphical way it would look like this…

image
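In terms of the actual HTTP contract, it boils down to a small JSON exchange. The snippet below is only an illustrative sketch; the field names match what the sample script at the end of this post sends, and $functionURL is assumed to hold the function URL we will copy in a moment:

# Sketch of the call the consumer makes (field names as used by the script later in this post)
$body = '{"container":"test","blobName":"AwesomePicture.jpg","permissions":"Read"}'
$response = Invoke-WebRequest -Uri $functionURL -Method Post -Body $body -ContentType 'application/json'
# The function answers with JSON containing the SAS token in a "token" property,
# a query string which we append to the blob URL to grant time-limited access.
$sasToken = (ConvertFrom-Json $response.Content).token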

I hope you get the idea, but what is the real advantage of this solution? The huge advantage is that we can talk to the object storage via the HTTP(S) protocol. This means we are able to upload and download the content via HTTP(S). In addition, the expiration of the file access and the permissions are handled by the storage account itself. So we do not need to implement any complicated logic to handle the permissions at all; we basically just have to tell the blob who we want to grant access to the file, what they may do with it and for how long.

Now that we have talked a lot about theory, let’s build this solution.

Prerequisites

I am going to build this solution on Azure Stack, because I need to use it on-premises.

The following components are needed:

  • Azure Stack deployed in a connected or disconnected scenario. In my case I’ll use the Azure Stack Development Kit.
  • Azure Stack App Service deployed
  • Azure.Storage PowerShell module
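If the Azure.Storage module is not yet present on the machine that will run the upload script, it can be installed from the PowerShell Gallery. Keep in mind that Azure Stack expects compatible module versions, so treat this as a sketch and pick the version that matches your environment:

# Install the Azure.Storage module from the PowerShell Gallery for the current user.
# Choose a version that is compatible with your Azure Stack build / AzureRM profile.
Install-Module -Name Azure.Storage -Scope CurrentUser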

Create Function

Create a new Function App on Azure Stack; I call it DemoFunctionApp…

1

…notice here, I’ll create a new storage account which will store the function, but will also serve as our file storage. Then go to the Deployment Center…

2

…select External

3

…next I provided my GitHub account, where I have the code of the function stored. I downloaded the code from lindydonna’s public GitHub repository; she gets all the credit for her code. This will create an HTTP trigger function which executes the code when an HTTP request is received. Just have a look at the code. She has written this example for Azure in C#, but I also wanted to try it out on Azure Stack to see if it works. As I mentioned, I copied the code to my repo, so I can modify it the way I need it…

4

…finally accept everything and finish the wizard…

5

…start syncing the external repo. The function GetSasToken-Net will be created…

6

…and there is a notification telling us that we have source control enabled…

7

Next we need to copy the function URL…

image

…which looks like this…

8

Up to this point we have the function deployed which will create the SAS token. Next we need to create the blob container to store the files.

Create Storage Container

We will use the same storage account for our testing as we use for our function. Go to the function’s AzureWebJobsStorage application setting; this tells us which storage account is being used…

9

…navigate to that storage account and create a new container; we call it test…

10

…then go to Access keys and copy key1, because we will need it in our script…

11
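As an alternative to copying the key from the portal, you could also read it with PowerShell, assuming you are already signed in to your Azure Stack user subscription with the AzureRM cmdlets and know the resource group of the storage account (both names below are placeholders):

# Hypothetical sketch: read the first access key of the storage account instead of copying it from the portal.
$storageAccountKey = (Get-AzureRmStorageAccountKey -ResourceGroupName "DemoFunctionApp-RG" -Name "demofunctionapp")[0].Value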

Now we have everything prepared and the only thing which is missing is the PowerShell script.

PowerShell Script

I have written this small sample script and documented it line by line. It simply does what we described at the beginning…

image

…here is the code for copy / paste purposes…

# Set variable values
# Name of the blob to upload
$fileName = "AwesomePicture.jpg"
# Local path
$path = "C:\Temp\"
# Name of the container
$storageContainer="test"
# Storage account name
$storageAccountName = "demofunctionapp"
# Storage account key
$storageAccountKey = "0AY6FT71VIuDkyNWUNSRgg3rU5lWVkbUOGLeGufa8fhKDX45rJZqMxrzRhkKhnuD3Ih8s9AUNmcI8HjoNV646A=="
# Function URL to call
$functionURL = "http://demofunctionapp.appservice.local.azurestack.external/api/GetSasToken-Net?code=YXWi3QWBNj/cPHjdTSAzTGpVlblArHYPtafpK5CLyxONVKeaUHi1nQ=="

# Get Storage account context
$ctx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
# Get storage account container
Get-AzureStorageContainer -Context $ctx -Name $storageContainer
# Upload a file to storage account
Set-AzureStorageBlobContent -File (Join-Path $path $fileName) -Container $storageContainer -Blob $fileName -Context $ctx -Force

# Create SAS token
# Create the body of the webrequest
# Possible values are "Read", "Write", "Delete", "List", "Add", "Create". Comma-separate multiple permissions, such as "Read, Write, Create".
$body = '{"container":"' + $storageContainer +'","blobName":"' + $fileName + '", "permissions":"Read,List"}' 
# Invoke function and submit the JSON body
$token = Invoke-WebRequest $functionURL -Method Post -Body $body -ContentType 'application/json'
# Get access token
$blobToken = (ConvertFrom-Json $token.Content).token
# Create URL
$url = "https://$storageAccountName.blob.local.azurestack.external/$storageContainer/" + $fileName + $blobToken
Write-Host -ForegroundColor Green $url
# Put URL to clipboard
$url | clip.exe

…if we execute the script it looks like this…

image

…as we can see, we get an HTTPS URL returned, which we can put into our browser to download the file. I set the expiration of the file access to 5 minutes, which means that after we have created the SAS token we have 5 minutes left to download the file. If we try it later, we get an error, as expected…

image
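Of course, the browser is only one possible consumer. Because the returned URL is plain HTTPS, a script can download the file just as easily while the SAS is still valid; here is a small sketch, assuming $url and $fileName from the script above:

# Download the blob via the SAS URL before the token expires (the local path is just an example).
Invoke-WebRequest -Uri $url -OutFile (Join-Path "C:\Temp\Download" $fileName)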

I think this is a very cool way of exchanging files or any other blob content in an easy and secure way, without the need for file shares, permission hassle or any weird file share protocols.

I hope you enjoy this solution, let me know what you think!
