# Upload data to Azure storage services using Azure Bicep
These examples demonstrate how to upload data to the different storage services in a storage account using Azure Bicep, for example to deploy seed or configuration data together with the infrastructure that hosts it.
Install the Bicep CLI:

```shell
az bicep install
```

Select the subscription to deploy to:

```shell
az account set --subscription <name or id of subscription>
```

Set a location for the deployment. To see the available locations, run `az account list-locations -o table`:

```shell
location=northeurope
```
A new storage account is created with a blob container, and the file in data/blob.txt is uploaded to the container. Deploy the example with the Azure CLI:

```shell
az deployment sub create \
  --name blob-example \
  --location $location \
  --template-file ./main.bicep \
  --parameters blobExample=true
```
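After the deployment completes you can check that the blob arrived. The helper below is a sketch, not part of the repository: it looks up the resource group by its application tag, takes the first storage account in it, and downloads the blob — the container name `container` and blob name `blob.txt` are assumptions.

```shell
verify_blob() {
  local rg account
  # Find the resource group via its application tag.
  rg=$(az group list \
    --query "[?tags.application == 'azure-bicep-upload-data-to-storage'] | [0].name" \
    --output tsv)
  # Take the first storage account in the group (assumption: there is only one).
  account=$(az storage account list --resource-group "$rg" --query "[0].name" --output tsv)
  # Assumption: container "container", blob name "blob.txt".
  az storage blob download \
    --account-name "$account" \
    --container-name container \
    --name blob.txt \
    --file /tmp/blob.txt \
    --auth-mode login
}
```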
A new storage account is created with a message queue, and the content of data/queue.xml is posted as a message in the queue. The operation requires a SAS token, which is generated with the Azure CLI. Since the Azure CLI has no direct support for posting messages to a storage queue, the REST API is called through the `az rest` command. The script that generates the SAS token and issues the REST call is available in scripts/queue.sh. Deploy the example with the Azure CLI:

```shell
az deployment sub create \
  --name queue-example \
  --location $location \
  --template-file ./main.bicep \
  --parameters queueExample=true
```
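The rough shape of the REST call in scripts/queue.sh can be sketched as follows. The account and queue names are placeholders, the SAS token is assumed to come from `az storage queue generate-sas` with add (`a`) permission, and the XML body follows the Queue service "Put Message" format:

```shell
# Hypothetical names, not taken from the repository.
account=mystorageaccount
queue=myqueue

# "Put Message" endpoint of the Queue service REST API.
url="https://${account}.queue.core.windows.net/${queue}/messages"
body='<QueueMessage><MessageText>hello</MessageText></QueueMessage>'

post_message() {
  # $1 is a SAS token, e.g. from: az storage queue generate-sas \
  #   --name $queue --account-name $account --permissions a --expiry <date>
  local sas=$1
  az rest --method post \
    --url "${url}?${sas}" \
    --headers "Content-Type=application/xml" \
    --body "$body" \
    --skip-authorization-header
}
```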
A new storage account is created with a storage table, and the row data in data/table.json is added as a row in the table. The operation requires a SAS token, which is generated with the Azure CLI. Since the Azure CLI has no direct support for adding a row to a table, the REST API is called through the `az rest` command. The script that generates the SAS token and issues the REST call is available in scripts/table.sh. Deploy the example with the Azure CLI:

```shell
az deployment sub create \
  --name table-example \
  --location $location \
  --template-file ./main.bicep \
  --parameters tableExample=true
```
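Similarly, the call in scripts/table.sh roughly follows the Table service "Insert Entity" operation. Everything below is a sketch: the account and table names, the PartitionKey/RowKey values, and the SAS source (`az storage table generate-sas`) are assumptions, not code from the repository:

```shell
# Hypothetical names, not taken from the repository.
account=mystorageaccount
table=mytable

# "Insert Entity" endpoint of the Table service REST API.
url="https://${account}.table.core.windows.net/${table}"
body='{"PartitionKey":"pk","RowKey":"rk","value":"hello"}'

insert_row() {
  # $1 is a SAS token, e.g. from: az storage table generate-sas \
  #   --name $table --account-name $account --permissions a --expiry <date>
  local sas=$1
  az rest --method post \
    --url "${url}?${sas}" \
    --headers "Content-Type=application/json" "Accept=application/json;odata=nometadata" \
    --body "$body" \
    --skip-authorization-header
}
```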
A new storage account is created with a file share, and the file in data/file.txt is uploaded to the file share. Deploy the example with the Azure CLI:

```shell
az deployment sub create \
  --name file-example \
  --location $location \
  --template-file ./main.bicep \
  --parameters fileExample=true
```
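As with the blob example, the upload can be checked afterwards. This helper is a sketch with assumed names: it resolves the resource group by tag, takes the first storage account, and downloads the file — the share name `share` and file name `file.txt` are guesses.

```shell
verify_file() {
  local rg account
  # Find the resource group via its application tag.
  rg=$(az group list \
    --query "[?tags.application == 'azure-bicep-upload-data-to-storage'] | [0].name" \
    --output tsv)
  # Take the first storage account in the group (assumption: there is only one).
  account=$(az storage account list --resource-group "$rg" --query "[0].name" --output tsv)
  # Assumption: share "share", file name "file.txt".
  az storage file download \
    --account-name "$account" \
    --share-name share \
    --path file.txt \
    --dest /tmp/file.txt
}
```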
Try all examples at the same time:

```shell
az deployment sub create \
  --name all-examples \
  --location $location \
  --template-file ./main.bicep \
  --parameters blobExample=true queueExample=true tableExample=true fileExample=true
```
Delete the resource group when you are done. It is looked up by the application tag that the deployment sets:

```shell
rgName=$(az group list \
  --query "[?tags.application == 'azure-bicep-upload-data-to-storage'] | [0].name" \
  --output tsv)

az group delete --name $rgName --yes --no-wait
```
The data files are embedded into the template with the Bicep `loadTextContent` function, which only accepts a compile-time constant file path. This prohibits us from using a loop construct for the deployment script resources, which would have allowed us to add several blobs, queue messages, table rows, and files in the same deployment. A workaround is to expand the script used to perform the upload in creative ways (e.g. use a loop in Bash).
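The Bash-loop workaround could look roughly like this inside a single deployment script. The `STORAGE_ACCOUNT` variable and the container name are assumptions for illustration:

```shell
# Upload every file passed as an argument with one script resource.
# STORAGE_ACCOUNT and the container name "container" are assumptions.
upload_blobs() {
  local f
  for f in "$@"; do
    az storage blob upload \
      --account-name "$STORAGE_ACCOUNT" \
      --container-name container \
      --name "$(basename "$f")" \
      --file "$f" \
      --auth-mode login
  done
}
```

Called as `upload_blobs data/*.txt`, this uploads every matching file while the template still contains only one deployment script resource.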