Cloud implementation and migration projects create ample opportunity for mistakes in managing security controls in the new environment. This was demonstrated in dramatic fashion recently with the news that Viacom cloud configuration secrets had been exposed on the public internet due to misconfigured S3 bucket permissions.
We work with clients during our Cloud Execution projects to define the foundations of their cloud hosting environments. Designing and deploying cloud security controls and related compliance measures is woven throughout our multiple project phases. A primary consideration is maintaining adequate controls over all data and related management tools. Amazon Web Services, for example, provides non-public access rights to S3 buckets out of the box, the ability to centrally define and apply security policies, and a full suite of tools for monitoring changes to all cloud resource permissions and configurations.
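As a concrete sketch of applying such a policy, the snippet below uses boto3's `put_public_access_block` call to enforce S3 Block Public Access on a single bucket. The bucket name is a placeholder and the function needs AWS credentials when actually invoked; boto3 is imported lazily so the policy definition itself can be reviewed or reused without an AWS session.

```python
# The four settings below together close every path to public exposure
# of a bucket: new ACLs, existing ACLs, and public bucket policies.
BLOCK_ALL_PUBLIC = {
    "BlockPublicAcls": True,        # reject attempts to set public ACLs
    "IgnorePublicAcls": True,       # neutralize any existing public ACLs
    "BlockPublicPolicy": True,      # reject public bucket policies
    "RestrictPublicBuckets": True,  # restrict cross-account public access
}

def lock_down_bucket(bucket_name: str) -> None:
    """Apply the most restrictive Block Public Access settings to one bucket."""
    import boto3  # deferred: requires AWS credentials only at call time
    boto3.client("s3").put_public_access_block(
        Bucket=bucket_name,
        PublicAccessBlockConfiguration=BLOCK_ALL_PUBLIC,
    )

# usage (needs credentials): lock_down_bucket("example-data-bucket")
```

Running this as part of account provisioning, rather than relying on per-bucket defaults, turns the "non-public by default" posture into an enforced policy.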
While a simple enough concept, setting and maintaining security permissions across a multi-account environment can be challenging. Instituting independent checks of configuration settings, per a trust-but-verify operations model, provides an important cross-check against configuration mistakes and malicious access attempts. The monitoring scheme should regularly scan configurations for dangerous settings (public access, logging not enabled, encryption disabled), check files for sensitive data, verify that classification tags have been applied, and log changes to storage bucket configurations. As part of the overall data classification lifecycle for the environment, data resources such as storage volumes, object storage buckets, and file shares should be tagged and their permissions regularly evaluated for correctness. For example, data such as credentials and API secret keys should never be stored in unencrypted or otherwise unsecured source code. Managing cloud provider access secrets with appropriate methods can be tedious in larger environments, but it is essential to maintaining secure access to cloud provider resources and data.
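The configuration checks above can be sketched as a simple audit function. The input shape here is an assumption for illustration, modeled loosely on the kind of per-bucket snapshot a scanner might assemble from the provider's APIs; it is not itself an AWS data structure.

```python
def audit_bucket(config: dict) -> list[str]:
    """Return findings for one bucket's configuration snapshot.

    `config` is an assumed snapshot shape: flags for public access,
    logging, and encryption, plus a dict of resource tags.
    """
    findings = []
    if config.get("public_access"):
        findings.append("public access enabled")
    if not config.get("logging_enabled"):
        findings.append("access logging not enabled")
    if not config.get("encryption_enabled"):
        findings.append("default encryption disabled")
    if "classification" not in config.get("tags", {}):
        findings.append("missing data-classification tag")
    return findings
```

A snapshot with public access on, logging and encryption off, and no tags would yield all four findings; a correctly configured, tagged bucket yields an empty list, making the function easy to wire into a scheduled scan that alerts on any non-empty result.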
Training system architects and administrators on how to apply the technical tools and work within the overall compliance scheme should be a high priority. The ease of cloud storage services deployment combined with virtually unlimited “on demand” resources removes some of the checks and balances that a legacy environment traditionally imposes on operational mistakes. Training key personnel on the how and why of the tools and their controls can limit unforced errors committed by inexperienced staff.
Lastly, adopt a defense-in-depth strategy by designing cloud architectures to limit the blast radius in the event an error or attacker does expose sensitive information. Segregating resources in discrete account structures, limiting reuse of secrets, rotating access keys, and restricting permissions to the required minimums all contribute to a least privilege model that can protect against catastrophic loss of data and services.
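One of those housekeeping tasks, key rotation, lends itself to automation. The sketch below flags access keys older than a rotation window; the 90-day threshold is an assumed organizational policy, not a provider default, and the key listing would in practice come from the provider's IAM APIs.

```python
from datetime import datetime, timedelta, timezone

ROTATION_WINDOW = timedelta(days=90)  # assumed policy window, tune to taste

def keys_needing_rotation(keys, now=None):
    """keys: iterable of (key_id, created_at) pairs with aware datetimes.

    Returns the ids of keys older than the rotation window.
    """
    now = now or datetime.now(timezone.utc)
    return [kid for kid, created in keys if now - created > ROTATION_WINDOW]
```

Feeding this list into an alerting or automated-rotation workflow keeps stale credentials from accumulating quietly across accounts.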