Introduction
In this post I’m going to detail how you can use the blob copy feature in AzCopy within an Azure Function.
This is especially useful if your requirement is to move or copy data from one Azure Storage Account to another based on a blob storage trigger event.
When a new upload to the blob container is detected, the function will trigger and copy the blob to the destination of your choice.
In my example, a test.csv file dropped into the source Storage Account is then copied over to the destination Storage Account.
I’m going to assume you already have two Storage Accounts with blob containers set up, in which you want to implement this or a similar solution.
Azure Function setup & trigger
Firstly, we’re going to create an Azure Function using the PowerShell runtime with the Code publish method. I’m using the Consumption plan type to reduce costs.
For simplicity, I’ve selected an existing Storage Account under ‘Hosting’ when creating my function, the one that holds the container I want to copy from. However, you can create a new Storage Account instead to keep the Azure Function’s containers separate from your existing Storage Account.
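If you prefer to script this part rather than click through the portal, something along these lines should work with the Az.Functions module. This is only a rough sketch; the resource group, function app name, storage account name and region below are placeholders you would swap for your own.

```powershell
# Hypothetical sketch (Az.Functions module): create a Windows, Consumption-plan
# Function App with the PowerShell runtime. All names and the location are placeholders.
New-AzFunctionApp -ResourceGroupName 'rg-blobcopy' `
    -Name 'func-blobcopy-demo' `
    -StorageAccountName 'mystorageaccount1' `
    -Runtime PowerShell `
    -FunctionsVersion 4 `
    -OSType Windows `
    -Location 'uksouth'
```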
Once the deployment completes, we want to create a trigger so the function runs when a new blob is uploaded to the container.
Trigger setup and config
There’s a handy event trigger for Azure Blob Storage which is perfect for this solution.
When selecting a blob storage trigger we’re presented with some additional options for the trigger criteria:
Name: This is the name of the trigger.
Path: This will be the path to the container/blob. In my example I’m using ‘test/{name}.csv‘, so the function only triggers on files with a .csv extension that are uploaded into the ‘test’ container (there’s a short illustration of this just below).
More on blob name patterns can be found here: Azure Blob storage trigger for Azure Functions | Microsoft Learn
Storage account connection: My connection is ‘AzureWebJobsStorage‘, as I’ve linked my Function to an existing Storage Account for this (as mentioned in the intro). If you want to use a different Storage Account, click ‘New’ and select the Storage Account you want as the trigger source. More on that here: Azure Blob storage trigger for Azure Functions | Microsoft Learn
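To illustrate the path pattern: with Path set to ‘test/{name}.csv’, the {name} binding expression is exposed to the PowerShell function via $TriggerMetadata. A minimal sketch of a default-style run.ps1, just to show what the trigger hands you:

```powershell
# Sketch of the default blob trigger run.ps1: with Path 'test/{name}.csv',
# uploading 'test/report.csv' fires the trigger and {name} resolves to 'report'.
param([byte[]] $InputBlob, $TriggerMetadata)

Write-Host "Blob trigger fired. Name: $($TriggerMetadata.Name), Size: $($InputBlob.Length) bytes"
```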
Finally, create the trigger and wait for deployment to complete.
Uploading AzCopy
Next we need to upload the AzCopy executable into the Function so it can be invoked when the function is triggered. I’ve detailed how to upload files into an Azure Function in a previous blog post here. Go check it out if you need a more detailed step-by-step guide for this part.
We will be using AzCopy v10. This can be downloaded from the Microsoft website here.
A quick recap on how to do this:
- Go to your Azure Function and locate Deployment on the left navigation pane
- Click Deployment Center and locate the FTPS credentials tab
- Copy the endpoint, username & password to connect to the Azure Function via an FTP client such as FileZilla
- Lastly, let’s drag & drop azcopy.exe from your machine to /site/wwwroot/BlobTrigger1 (or whatever your trigger name is set to)
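As an optional sanity check, you can confirm the executable is in place and runs by opening the Kudu PowerShell debug console on the function’s scm site and calling it directly. The path below assumes the default BlobTrigger1 folder name:

```powershell
# Run from the Kudu PowerShell debug console (https://YOURFUNCNAME.scm.azurewebsites.net/DebugConsole)
# to confirm AzCopy was uploaded and is executable. Path assumes the default trigger folder name.
C:\home\site\wwwroot\BlobTrigger1\azcopy.exe --version
```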
AzCopy
Next is the AzCopy copy command itself, so that when the Function is triggered it performs the required copy. AzCopy v10 natively supports blob-to-blob copy, which makes this quite simple to script.
- In the Azure Function, go to Functions and select the blob storage trigger that has been created.
- Once in the trigger, go to Code + Test and use the dropdown to select the ‘run.ps1’ file – this is what executes when the function is triggered.
- Below the default bindings and log output, we can enter our AzCopy command.
My working example looks like this:
C:\home\site\wwwroot\BlobTrigger1\azcopy.exe copy
'https://<source-storage-account-name>.blob.core.windows.net/<container-name>?<SAS-token>'
'https://<destination-storage-account-name>.blob.core.windows.net/<container-name>?<SAS-token>'
--recursive --include-pattern '*.csv'
In my example, I’m using two Blob SAS URLs that I have generated on each Storage Account (here’s a guide on how to do this). I’ve selected ‘Container’ under the allowed resource types and left the rest as defaults.
In my AzCopy command I’m using --recursive to ensure the same folder structure is copied across from the source. The --include-pattern '*.csv' option instructs AzCopy to copy only the CSV files within that structure.
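Putting it together, a minimal run.ps1 might look like the sketch below. The SAS URLs are placeholders, and I’m assuming the default BlobTrigger1 folder name from earlier:

```powershell
# Sketch of a complete run.ps1 (SAS URLs are placeholders; path assumes the default BlobTrigger1 folder).
param([byte[]] $InputBlob, $TriggerMetadata)

Write-Host "Blob trigger fired for: $($TriggerMetadata.Name)"

$azcopy = 'C:\home\site\wwwroot\BlobTrigger1\azcopy.exe'
$source = 'https://<source-storage-account-name>.blob.core.windows.net/<container-name>?<SAS-token>'
$dest   = 'https://<destination-storage-account-name>.blob.core.windows.net/<container-name>?<SAS-token>'

# Copy only CSV files, preserving the folder structure from the source container.
& $azcopy copy $source $dest --recursive --include-pattern '*.csv'

Write-Host "AzCopy exit code: $LASTEXITCODE"
```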
Alternatively, you can use many other pattern combinations to match different requirements, such as filename matches and more: Copy blobs between Azure storage accounts with AzCopy v10 | Microsoft Learn
Note: from my testing, a Shared Access Signature (SAS) is the only method I could get working for this AzCopy functionality. It appears I cannot utilise a managed identity, and there is no way to authenticate AzCopy within the Function in the way you can with the Az PS modules. Let me know if you know a way around this, I was unsuccessful!
More info on how to create a Shared Access Signature for a storage account: Create a service SAS for a container or blob – Azure Storage | Microsoft Learn
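If you’d rather generate the SAS from PowerShell than the portal, here’s a rough sketch using the Az.Storage module. The account name, key, permissions and expiry below are assumptions you would adjust (for example, the source side typically needs read/list while the destination needs write/list):

```powershell
# Hypothetical sketch (Az.Storage): generate a container SAS URL to drop into the AzCopy command.
# Adjust -Permission per side, e.g. 'rl' for the source container and 'wl' for the destination.
$ctx = New-AzStorageContext -StorageAccountName '<source-storage-account-name>' -StorageAccountKey '<account-key>'
New-AzStorageContainerSASToken -Name '<container-name>' `
    -Permission rl `
    -ExpiryTime (Get-Date).AddDays(7) `
    -Context $ctx `
    -FullUri
```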
Testing
Let’s start by testing the functionality.
I’m matching only .csv files for both the trigger event and the AzCopy copy command.
- Upload a test.csv to Storage Account 1 (source).
- The function will trigger.
- This then executes the AzCopy command in our run.ps1, which performs the copy.
Upload test.csv in the Azure Portal GUI to Storage Account 1:
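If you’d prefer to upload the test file from PowerShell rather than the portal GUI, a quick alternative with the Az.Storage module would look something like this (the storage account name and key are placeholders):

```powershell
# Hypothetical sketch (Az.Storage): upload test.csv to the source 'test' container to fire the trigger.
$ctx = New-AzStorageContext -StorageAccountName '<source-storage-account-name>' -StorageAccountKey '<account-key>'
Set-AzStorageBlobContent -File '.\test.csv' -Container 'test' -Blob 'test.csv' -Context $ctx
```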
After a few minutes I can see on the overview page that the function has executed (under Function Execution count). In my destination Storage Account container I can confirm the copy was successful, as test.csv now appears.
Checking the logs
Under the Function, select BlobTrigger1 (or whichever trigger name you specified) and locate the Monitor section under Developer.
Here we can review the execution logs and any error output. In my example you can see the Event Trigger output followed by the copy output, which was successful.
Closing thoughts
Rather than uploading AzCopy manually into the Function, I could download it via PowerShell on each run (although I wanted to eliminate any delay on execution, which is why I haven’t done that here).
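For reference, a rough sketch of what that download-on-each-run approach could look like, using the well-known aka.ms redirect for the Windows build:

```powershell
# Hypothetical sketch: fetch AzCopy at run time instead of uploading it over FTP.
$zipPath = Join-Path $env:TEMP 'azcopy.zip'
Invoke-WebRequest -Uri 'https://aka.ms/downloadazcopy-v10-windows' -OutFile $zipPath
Expand-Archive -Path $zipPath -DestinationPath $env:TEMP -Force

# The zip contains a versioned folder (e.g. azcopy_windows_amd64_10.x.x), so locate the exe.
$azcopy = Get-ChildItem -Path $env:TEMP -Recurse -Filter 'azcopy.exe' | Select-Object -First 1
& $azcopy.FullName --version
```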
Additionally, adding the Blob SAS URLs as application settings and reading them in run.ps1 as environment variables could be a way to standardise the code going forward.
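As a rough sketch of that idea, with hypothetical application settings named SourceSasUrl and DestSasUrl:

```powershell
# Hypothetical sketch: SAS URLs stored as Function App application settings
# (here named 'SourceSasUrl' and 'DestSasUrl') surface in run.ps1 as environment variables.
$source = $env:SourceSasUrl
$dest   = $env:DestSasUrl

& 'C:\home\site\wwwroot\BlobTrigger1\azcopy.exe' copy $source $dest --recursive --include-pattern '*.csv'
```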
I did run into an issue where the function ran out of memory when transferring larger files, even with a Premium plan – unfortunately I didn’t get to the bottom of resolving that. Let me know if you have a similar issue and manage to resolve it!
For housekeeping, you can also create a lifecycle management rule to keep things tidy and clean up old files.
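For example, a rough Az.Storage sketch that deletes blobs 30 days after their last modification. The rule name, prefix, threshold and account details are assumptions you would tailor:

```powershell
# Hypothetical sketch (Az.Storage): lifecycle rule to delete blobs 30 days after last modification.
$action = Add-AzStorageAccountManagementPolicyAction -BaseBlobAction Delete -DaysAfterModificationGreaterThan 30
$filter = New-AzStorageAccountManagementPolicyFilter -PrefixMatch 'test/' -BlobType blockBlob
$rule   = New-AzStorageAccountManagementPolicyRule -Name 'cleanup-old-csv' -Action $action -Filter $filter
Set-AzStorageAccountManagementPolicy -ResourceGroupName '<resource-group>' -StorageAccountName '<destination-storage-account-name>' -Rule $rule
```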
Hope you found this post useful and of interest, let me know your comments below.
Thanks!
Dan
Hello Dan,
Thank you for this wonderful article. It is very clear, concise and to the point. I am trying this solution for one of my hobby projects, but I am having an issue with the ‘Uploading AzCopy’ section, specifically when connecting to the FTP server. I am getting a 530 authentication error, but I am sure the username and password are correct. I tried again after resetting the credentials, but I get the same error.
In another blog post about resolving the 530 error, I found that the issue may be with the username, as it contains the special characters “/” and “$”, which the FTP client may not be handling. How can I get around this and connect to the FTP server, or could there be another issue?
Hi Yogesh,
Thanks for the comment.
That’s odd. I see the threads that mention the character issue. What FTP client are you trying to connect with?
I was using FileZilla in this example, which may be worth a try if you aren’t using that yet.
You could try restarting the function, and also check out the backend Kudu site; you may be able to simply drag & drop the AzCopy executable there instead as a temporary workaround:
https://YOURFUNCNAME.scm.azurewebsites.net/ZipDeployUI