Az - Blob Storage


Basic Information

From the docs: Azure Blob storage is Microsoft's object storage solution for the cloud. Blob storage is optimized for storing massive amounts of unstructured data. Unstructured data is data that doesn't adhere to a particular data model or definition, such as text or binary data.

Blob storage offers three types of resources:

  • The storage account (unique name)

  • A container in the storage account (folder)

  • A blob in a container
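These three levels map directly onto the blob endpoint URL. A minimal sketch of the hierarchy (the account, container, and blob names are hypothetical):

```python
# Compose the public blob endpoint from the three resource levels.
# "mystorageacct", "backups" and "dump.sql" are hypothetical names.
account = "mystorageacct"
container = "backups"
blob = "dump.sql"

account_url = f"https://{account}.blob.core.windows.net"   # storage account
container_url = f"{account_url}/{container}"               # container
blob_url = f"{container_url}/{blob}"                       # blob

print(blob_url)  # https://mystorageacct.blob.core.windows.net/backups/dump.sql
```

The account name is globally unique, which is why enumerating candidate `<name>.blob.core.windows.net` hostnames is a common recon step.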

Different types of storage

Blob storage

https://<storage-account>.blob.core.windows.net
https://<storage-account>.blob.core.windows.net/<container-name>?restype=container&comp=list

Azure Data Lake Storage Gen2

https://<storage-account>.dfs.core.windows.net

Azure Files

https://<storage-account>.file.core.windows.net

Queue storage

https://<storage-account>.queue.core.windows.net

Table storage

https://<storage-account>.table.core.windows.net

Access to Storage

  • Azure AD principals: Use Azure AD principals that have been granted one of the supported RBAC roles.

  • Access Keys: Use access keys of the storage account. This provides full access to the storage account.

  • Shared Access Signature (SAS): Time limited and specific permissions.

    • You can generate a SAS URL with an access key (harder to detect).

    • As the SAS is generated from the access key, if the key gets renewed the SAS stops working.

Public Exposure

If "Allow Blob public access" is enabled (disabled by default), it's possible to:

  • Give public access to read blobs (you need to know the name).

  • List container blobs and read them.
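When public listing is allowed, the `?restype=container&comp=list` endpoint returns an XML document of blob entries. A minimal sketch of parsing such a response offline (the sample XML below is a trimmed, hypothetical reply):

```python
import xml.etree.ElementTree as ET

# Trimmed, hypothetical response from
# https://<account>.blob.core.windows.net/<container>?restype=container&comp=list
sample = """<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults>
  <Blobs>
    <Blob><Name>backup/db.bak</Name></Blob>
    <Blob><Name>config/app.secrets</Name></Blob>
  </Blobs>
</EnumerationResults>"""

root = ET.fromstring(sample)
names = [b.findtext("Name") for b in root.iter("Blob")]
print(names)  # ['backup/db.bak', 'config/app.secrets']
```

Once a blob name is known, it can be fetched directly from `https://<account>.blob.core.windows.net/<container>/<name>` without authentication.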

Connect to Storage

If you find any storage you can connect to, you can use the Microsoft Azure Storage Explorer tool to do so.


SAS URLs
From the docs: A shared access signature (SAS) provides secure delegated access to resources in your storage account. With a SAS, you have granular control over how a client can access your data. For example:

  • What resources the client may access.

  • What permissions they have to those resources.

  • How long the SAS is valid.

A SAS URL looks like this: https://<storage-account>.blob.core.windows.net/<container_name>?sv=<version>&sp=<permissions>&se=<expiry>&sig=<signature>
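The query string after the resource path is the SAS token itself, and its fields tell you exactly what the token grants. A sketch decoding the common parameters (`sp` permissions, `se` expiry, `sr` resource scope) from a hypothetical SAS URL with a fake signature:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical SAS URL; account/container names and the sig value are fake.
sas_url = ("https://mystorageacct.blob.core.windows.net/backups"
           "?sv=2022-11-02&sp=rl&se=2025-01-01T00:00:00Z&sr=c&sig=FAKESIG")

params = parse_qs(urlparse(sas_url).query)
print(params["sp"][0])  # rl -> read + list permissions
print(params["se"][0])  # expiry: the SAS stops working after this time
print(params["sr"][0])  # c  -> scoped to a container
```

Reading these fields first avoids wasted requests: a token with `sp=r` will never allow listing, and an expired `se` means the URL is dead.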

Use Storage Explorer to access the data or python:

#pip3 install azure-storage-blob
from azure.storage.blob import BlobServiceClient

# List containers
conn_str="<SAS URL>"
svc = BlobServiceClient.from_connection_string(conn_str=conn_str)
for c in svc.list_containers():
    print(c.name)

# List blobs inside container
container = svc.get_container_client(container="<container_name>")
for b in container.list_blobs():
    print(b.name)

# Download file
blob = svc.get_blob_client(container="<container_name>", blob="<blob_name>")
with open("download.out", "wb") as f:
    f.write(blob.download_blob().readall())

User delegation SAS

You can secure a shared access signature (SAS) token for access to a container, directory, or blob by using either Azure Active Directory (Azure AD) credentials or an account key. To create a user delegation SAS, you must first request a user delegation key, which you then use to sign the SAS.

Support is provided for a User Delegation Shared Access Signature (SAS) in both Azure Blob Storage and Azure Data Lake Storage Gen2. However, it's important to note that Stored Access Policies are not compatible with a User Delegation SAS.

Note that user delegation SAS is secured with Azure AD credentials instead of storage account keys. This prevents clients/applications from storing/retrieving storage keys to create SAS.

Service SAS

A service SAS is secured with the storage account key. A service SAS delegates access to a resource in only one of the Azure Storage services: Blob storage, Queue storage, Table storage, or Azure Files. The URI for a service-level SAS consists of the URI to the resource for which the SAS will delegate access, followed by the SAS token.

To use Azure Active Directory (Azure AD) credentials to secure a SAS for a container or blob, use a user delegation SAS.

Account SAS

An account SAS is secured with one of the storage account keys (there are 2). An account SAS delegates access to resources in one or more of the storage services. All of the operations available via a service or user delegation SAS are also available via an account SAS.

From the docs: By creating an account SAS, you can:

  • Delegate access to service-level operations that aren't currently available with a service-specific SAS, such as the Get/Set Service Properties and Get Service Stats operations.

  • Delegate access to more than one service in a storage account at a time. For example, you can delegate access to resources in both Azure Blob Storage and Azure Files by using an account SAS.

  • Delegate access to write and delete operations for containers, queues, tables, and file shares, which are not available with an object-specific SAS.

  • Specify an IP address or a range of IP addresses from which to accept requests.

  • Specify the HTTP protocol from which to accept requests (either HTTPS or HTTP/HTTPS).
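An account SAS advertises this wider scope in its `ss` (signed services) and `srt` (signed resource types) fields. A sketch mapping those fields, using a hypothetical token with a fake signature:

```python
from urllib.parse import parse_qs

# Single-letter codes used by account SAS tokens.
SERVICES = {"b": "Blob", "q": "Queue", "t": "Table", "f": "Files"}
RESOURCE_TYPES = {"s": "service", "c": "container", "o": "object"}

# Hypothetical account SAS token (sig is fake).
token = "sv=2022-11-02&ss=bf&srt=sco&sp=rwdl&se=2025-01-01T00:00:00Z&sig=FAKESIG"
params = parse_qs(token)

services = [SERVICES[ch] for ch in params["ss"][0]]
rtypes = [RESOURCE_TYPES[ch] for ch in params["srt"][0]]
print(services)  # ['Blob', 'Files']
print(rtypes)    # ['service', 'container', 'object']
```

A token with `ss=bfqt` and `srt=sco` is the widest possible account SAS: every service, at every resource level.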


# Get storage accounts
az storage account list #Get the account name from here

# Get keys to authenticate
az storage account keys list --account-name <name>

# Get shares
az storage share list --account-name <name> --account-key <key>

# Get dirs/files inside the share
az storage file list --account-name <name> --share-name <share-name> --account-key <key>
## If type is "dir", you can continue enumerating files inside of it
az storage file list --account-name <name> --share-name <prev_dir/share-name> --account-key <key>

# Download a complete share (with directories and files inside of them)
az storage file download-batch -d . --source <share-name> --account-name <name> --account-key <key>


