Public cloud storage breaches continue to make headlines with depressing regularity. S3 buckets, Azure Blob Storage containers, and Google Cloud Storage buckets holding sensitive data remain publicly accessible because someone set the wrong permission during creation and nobody ever reviewed it. The data sits there, indexed by specialised search engines, waiting for anyone to download it.
Cloud providers have added guardrails over the years. AWS blocks public access on new S3 buckets by default. Azure requires explicit configuration to enable anonymous access. Yet breaches continue because default settings can be overridden, legacy configurations predate the guardrails, and automation scripts copy insecure templates without updating permissions.
Why Misconfigurations Persist
Infrastructure-as-code templates created years ago propagate insecure defaults into every new deployment. A Terraform module written when public access was the S3 default continues to create publicly accessible buckets unless someone updates it. Teams that reuse shared modules across projects spread the misconfiguration silently.
Multi-account cloud architectures create management complexity. A security team that monitors the main production account may have limited visibility into developer sandbox accounts, staging environments, or accounts managed by different business units. Data in these secondary accounts receives less security attention despite often containing production data copied for testing or analysis purposes.
Signed URL and shared access signature mechanisms create temporary access grants that sometimes become permanent. A pre-signed URL generated for a one-off data transfer might be bookmarked, shared, or cached by CDNs well beyond its intended lifespan. SAS tokens in Azure with overly broad permissions and distant expiry dates grant persistent access that nobody tracks or revokes.
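The kind of check described above can be automated. The sketch below is a minimal, hypothetical example of auditing an Azure SAS URL for distant expiry and write-class permissions; the function name `audit_sas_url` and the seven-day threshold are our own choices, but the `se` (signed expiry) and `sp` (signed permissions) query parameters are the standard SAS fields.

```python
from datetime import datetime, timezone
from urllib.parse import parse_qs, urlparse

def audit_sas_url(url: str, max_days: int = 7) -> list[str]:
    """Flag risky properties of an Azure SAS URL: far-off expiry and
    broad permissions. Returns a list of human-readable findings."""
    params = parse_qs(urlparse(url).query)
    findings = []

    # 'se' is the signed expiry time (ISO 8601, e.g. 2030-01-01T00:00:00Z).
    expiry_raw = params.get("se", [None])[0]
    if expiry_raw:
        expiry = datetime.fromisoformat(expiry_raw.replace("Z", "+00:00"))
        remaining = (expiry - datetime.now(timezone.utc)).days
        if remaining > max_days:
            findings.append(f"expiry is {remaining} days away")

    # 'sp' is the signed permissions string (r=read, w=write, d=delete,
    # c=create, l=list). Write-class permissions deserve extra scrutiny.
    perms = params.get("sp", [""])[0]
    risky = set(perms) & set("wdc")
    if risky:
        findings.append(f"grants write-class permissions: {''.join(sorted(risky))}")

    return findings
```

Run against each SAS URL found in scripts, wikis, and CI configuration, this turns "nobody tracks or revokes" into a reviewable list of findings.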
William Fieldhouse, Director of Aardwolf Security Ltd, comments: “Cloud storage breaches are entirely preventable. Every major provider offers tools to detect and block public access. The problem is operational: configurations drift, exceptions accumulate, and nobody reviews permissions after initial deployment. We find publicly accessible storage in a significant percentage of cloud assessments, often containing data the organisation had no idea was exposed.”
Locking Down Cloud Storage
Enable account-level public access blocks: S3 Block Public Access in AWS, and disallowing anonymous blob access at the storage account level in Azure. Use service control policies and Azure Policy to prevent anyone from overriding these blocks without explicit approval. These preventive controls stop misconfigurations before they occur rather than detecting them after data is already exposed.
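On the AWS side, the block consists of four distinct flags, and all four must be enabled; the sketch below shows the configuration shape and a small verification helper. The helper name `is_fully_blocked` is ours, and the account ID is a placeholder, but the flag names match the S3 public access block configuration.

```python
# All four flags must be enabled; enabling only some still leaves
# paths to exposure (e.g. new public ACLs vs. existing bucket policies).
PUBLIC_ACCESS_BLOCK = {
    "BlockPublicAcls": True,        # reject requests that set public ACLs
    "IgnorePublicAcls": True,       # ignore public ACLs already present
    "BlockPublicPolicy": True,      # reject bucket policies granting public access
    "RestrictPublicBuckets": True,  # restrict access to buckets with public policies
}

def is_fully_blocked(config: dict) -> bool:
    """True only if every public-access control is enabled."""
    required = ("BlockPublicAcls", "IgnorePublicAcls",
                "BlockPublicPolicy", "RestrictPublicBuckets")
    return all(config.get(flag) is True for flag in required)

# Applied account-wide with boto3 (assuming credentials are configured):
#   boto3.client("s3control").put_public_access_block(
#       AccountId="123456789012",
#       PublicAccessBlockConfiguration=PUBLIC_ACCESS_BLOCK,
#   )
```

Checking the live configuration against `is_fully_blocked` in a scheduled job is one way to catch the drift the controls are meant to prevent.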
Commission AWS penetration testing and Azure penetration testing that specifically includes storage assessment. Testers should enumerate all storage resources, check access permissions, verify encryption settings, and assess whether stored data includes sensitive information that should not be in that location.
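One permission check a tester might automate is classifying ACL grants to the public AWS grantee groups. The sketch below is illustrative, not a full assessment tool: it assumes the `Grants` structure returned by the S3 `GetBucketAcl` API, and the function name `public_grants` is our own.

```python
# The two grantee group URIs that make an ACL grant public:
# AllUsers is anonymous access; AuthenticatedUsers is any AWS account.
PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_grants(acl: dict) -> list[str]:
    """Return the permissions granted to public groups in an S3 ACL,
    using the Grants structure returned by GetBucketAcl."""
    exposed = []
    for grant in acl.get("Grants", []):
        grantee = grant.get("Grantee", {})
        if grantee.get("Type") == "Group" and grantee.get("URI") in PUBLIC_GRANTEES:
            exposed.append(grant.get("Permission", "UNKNOWN"))
    return exposed
```

Any non-empty result for a bucket holding sensitive data is a finding worth escalating.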
Review all infrastructure-as-code templates for insecure storage configurations. Update shared modules and pipeline templates to enforce private access by default. Implement automated scanning that flags any storage resource created without encryption or with public access enabled.
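A crude version of that automated scan can be a simple text pass over Terraform source, as sketched below. This is a deliberately simplified illustration: dedicated scanners such as Checkov or tfsec do this job properly, and the `scan_terraform` function and its two checks (public ACLs, and an `aws_s3_bucket` with no accompanying `aws_s3_bucket_public_access_block`) are assumptions of ours, not a complete rule set.

```python
import re

# Matches Terraform lines such as: acl = "public-read" or "public-read-write"
PUBLIC_ACL = re.compile(r'acl\s*=\s*"public-read(?:-write)?"')

def scan_terraform(source: str) -> list[str]:
    """Crude line scan of Terraform source for risky storage settings."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        if PUBLIC_ACL.search(line):
            findings.append(f"line {lineno}: public ACL on storage resource")
    # A bucket defined with no public access block resource is suspect.
    if ('resource "aws_s3_bucket"' in source
            and 'resource "aws_s3_bucket_public_access_block"' not in source):
        findings.append("aws_s3_bucket defined without a public access block")
    return findings
```

Wired into a CI pipeline, even a check this simple stops the shared-module problem described earlier: an old template with a public ACL fails the build instead of silently propagating.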
Cloud storage breaches are not sophisticated attacks. They are configuration oversights that automated tools find in seconds. Fix the defaults, monitor for drift, and test regularly. The alternative is reading about your own data exposure in the morning news.

