Ante Miličević
January 18, 2024

24 Security Recommendations for Google Cloud Platform (GCP): Part 3

Welcome to part three of our GCP security series. Today, we’ll be covering Cloud Logging, Cloud SQL, and BigQuery.

If you have not yet read the first part of this series, it is worth catching up on it before continuing. Now let’s dive into the details of Cloud Logging.

Cloud Logging

Cloud Logging is a fully managed service for storing, searching, analyzing, monitoring, and alerting on log data and events from both Google Cloud and Amazon Web Services. It can also collect log data from more than 150 popular application components, on-premises systems, and hybrid cloud environments. In terms of GCP security best practices, particular attention should be paid to how Cloud Logging is configured.

16. Ensure Proper Configuration of Cloud Audit Logging for Comprehensive Oversight

Cloud Audit Logging is a crucial element to ensure comprehensive oversight of all services and users within a project. Two distinct audit logs—Admin Activity and Data Access—are maintained for each project, folder, and organization.

  • Admin Activity Logs: These logs encompass entries for API calls or administrative actions altering the configuration or metadata of resources. Admin Activity logs are enabled for all services and cannot be configured.
  • Data Access Audit Logs: These logs capture API calls related to the creation, modification, or retrieval of user-provided data. They are initially disabled and need to be enabled manually.

An effective default audit configuration tracks user activity and changes (tampering) to user data, and it must capture logs for all users. To achieve this, the project's IAM policy must be edited. Begin by downloading the policy as a YAML file:

<pre class="codeWrap"><code>gcloud projects get-iam-policy PROJECT_ID > /tmp/project_policy.yaml</code></pre>

Next, modify the /tmp/project_policy.yaml file, adding or modifying only the audit logs configuration as follows:

<pre class="codeWrap"><code>auditConfigs:
- auditLogConfigs:
  - logType: DATA_WRITE
  - logType: DATA_READ
 service: allServices</code></pre>

Please note that the exemptedMembers field should not be set, as audit logging must be enabled for all users. Finally, update the policy with the new changes:

<pre class="codeWrap"><code>gcloud projects set-iam-policy PROJECT_ID /tmp/project_policy.yaml</code></pre>

Note that enabling Data Access audit logs may increase your project's logging bill due to the additional log volume.
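
To confirm that the new configuration is in place, you can inspect the project's audit configuration. A quick check, assuming the jq utility is installed; the output should show allServices with both DATA_READ and DATA_WRITE log types:

<pre class="codeWrap"><code>gcloud projects get-iam-policy PROJECT_ID --format=json | jq '.auditConfigs'</code></pre>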

17. Ensure Proper Configuration of Sinks for Comprehensive Log Entry Management

Managing log entries effectively involves configuring sinks to export copies of all log entries. This setup allows the aggregation of logs from various projects, facilitating their export to a Security Information and Event Management (SIEM) system.

Exporting log entries entails creating a sink, which involves defining a filter to select specific log entries and choosing a destination: Cloud Storage, BigQuery, or Cloud Pub/Sub. Both the filter and the destination are encapsulated in an object known as a sink. To guarantee that all log entries are exported, the sink's filter must be left empty. To create a sink for exporting all log entries to a Google Cloud Storage bucket, execute the following command:

<pre class="codeWrap"><code>gcloud logging sinks create SINK_NAME storage.googleapis.com/BUCKET_NAME</code></pre>

While the example focuses on exporting events to a bucket, you may opt to use Cloud Pub/Sub or BigQuery as alternatives. Here's an illustrative Cloud Custodian rule to verify that sinks are configured without a filter:

<pre class="codeWrap"><code>- name: check-no-filters-in-sinks
  description: |
    It is recommended to create a sink that will export copies of
    all the log entries. This can help aggregate logs from multiple
    projects and export them to a Security Information and Event
    Management (SIEM).
  resource: gcp.log-project-sink
  filters:
    - type: value
      key: filter
      value: empty
</code></pre>

18. Ensure Log Bucket Retention Policies Are Configured Using Bucket Lock

You can enable retention policies on log buckets to protect logs stored in Cloud Storage from accidental deletion or overwriting. It is advisable to set retention policies and enable Bucket Lock on all storage buckets used as log sinks, in line with the previous best practices. To view all sinks directed to storage buckets:

<pre class="codeWrap"><code>gcloud logging sinks list --project=PROJECT_ID</code></pre>

For each storage bucket mentioned earlier, establish a retention policy and apply a lock:

<pre class="codeWrap"><code>gsutil retention set TIME_DURATION gs://BUCKET_NAME
gsutil retention lock gs://BUCKET_NAME</code></pre>

Locking a bucket is irreversible: you cannot shorten the retention period or remove the retention policy from a locked bucket.
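
To review the resulting configuration on a bucket, you can display its retention policy, which also shows whether it has been locked:

<pre class="codeWrap"><code>gsutil retention get gs://BUCKET_NAME</code></pre>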

19. Activate Encryption for Logs Router Using Customer-Managed Keys

Ensure that your Google Cloud Logs Router data is encrypted with a customer-managed key (CMK). This gives you full control over the encryption and decryption process and helps meet your compliance obligations. You need to add a binding to the IAM policy of the CMK, assigning the Cloud KMS "CryptoKey Encrypter/Decrypter" role to the relevant service account. In this step, use the keyring and CMK created in #13.

<pre class="codeWrap"><code>gcloud kms keys add-iam-policy-binding KEY_ID --keyring=KEY_RING_NAME --location=global --member=serviceAccount:PROJECT_NUMBER@gcp-sa-logging.iam.gserviceaccount.com --
role=roles/cloudkms.cryptoKeyEncrypterDecrypter</code></pre>
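
Granting the role makes the key usable, but the Log Router still has to be configured to use it. A minimal sketch of that step, assuming organization-level access and the keyring and key referenced above (substitute your own resource path):

<pre class="codeWrap"><code>gcloud logging cmek-settings update --organization=ORGANIZATION_ID --kms-key-name=projects/PROJECT_ID/locations/global/keyRings/KEY_RING_NAME/cryptoKeys/KEY_ID</code></pre>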

Cloud SQL

Cloud SQL is a fully managed relational database service for MySQL, PostgreSQL, and SQL Server. It lets you run the familiar relational databases, complete with their extensive extension collections, configuration flags, and developer ecosystems, while removing the burden of self-management.

Security Best Practices for Cloud SQL

20. Verify Mandatory SSL for Incoming Connections to Cloud SQL

Connections to SQL database instances could expose sensitive information, including credentials, queries, and query output, if intercepted (man-in-the-middle attacks). For enhanced security, it is advised to consistently employ SSL encryption when connecting to PostgreSQL, MySQL generation 1, and MySQL generation 2 instances. To enforce SSL encryption for an instance, execute the following command:

<pre class="codeWrap"><code>gcloud sql instances patch INSTANCE_NAME --require-ssl</code></pre>

Moreover, it's important to note that MySQL generation 1 instances need to be restarted for this configuration to take effect.
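
To verify that SSL is now required on an instance, a quick check with the gcloud CLI:

<pre class="codeWrap"><code>gcloud sql instances describe INSTANCE_NAME --format="value(settings.ipConfiguration.requireSsl)"</code></pre>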

The Cloud Custodian rule outlined below can assess instances lacking SSL enforcement:

<pre class="codeWrap"><code>- name: cloud-sql-instances-without-ssl-required
  description: |
    It is recommended to enforce all incoming connections to
    SQL database instance to use SSL.
  resource: gcp.sql-instance
  filters:
    - not:
        - type: value
          key: "settings.ipConfiguration.requireSsl"
          value: true
</code></pre>

21. Confirm Cloud SQL Instances Restrict Access from Untrusted Sources

It is imperative to limit access to Cloud SQL database instances exclusively to trusted and necessary IPs. This practice reduces the potential attack surface of the database server instance. The permitted networks should not include an IP/network configuration of 0.0.0.0/0, as this would grant access from anywhere in the world. Note that authorized networks apply solely to instances with public IPs. To update an instance's list of authorized networks:

<pre class="codeWrap"><code>gcloud sql instances patch INSTANCE_NAME --authorized-networks=IP_ADDR1,IP_ADDR2...</code></pre>

To proactively prevent the configuration of new SQL instances allowing connections from any IP addresses, establish a Restrict Authorized Networks Organization Policy on Cloud SQL instances.
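
A minimal sketch of enabling that constraint at the organization level, assuming you hold the Organization Policy Administrator role (the constraint name below is the standard boolean constraint for this restriction):

<pre class="codeWrap"><code>gcloud resource-manager org-policies enable-enforce sql.restrictAuthorizedNetworks --organization=ORGANIZATION_ID</code></pre>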

22. Confirm Cloud SQL Instances Lack Public IPs for Enhanced Security

To reduce the organization's vulnerability to potential attacks, it's crucial that Cloud SQL database instances do not possess public IPs. Utilizing private IPs enhances network security and minimizes application latency.

For each instance, eliminate its public IP and instead assign a private IP:

<pre class="codeWrap"><code>gcloud beta sql instances patch INSTANCE_NAME --network=VPC_NETWORK_NAME --no-assign-ip</code></pre>

To proactively thwart the configuration of new SQL instances with public IP addresses, implement a Restrict Public IP Access Organization Policy on Cloud SQL instances.
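
As with the previous recommendation, this can be enforced through the corresponding boolean constraint; a minimal sketch, again assuming organization-level permissions:

<pre class="codeWrap"><code>gcloud resource-manager org-policies enable-enforce sql.restrictPublicIp --organization=ORGANIZATION_ID</code></pre>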

23. Verify Automated Backup Configuration for Cloud SQL Database Instances

Establishing automated backups for Cloud SQL instances is essential for data recovery and safeguarding against instance-related issues. It is advisable to configure automatic backups for all instances holding crucial data susceptible to loss or damage. This recommendation is applicable to instances running SQL Server, PostgreSQL, MySQL generation 1, and MySQL generation 2.

Retrieve a list of all Cloud SQL database instances with the following command:

<pre class="codeWrap"><code>gcloud sql instances list</code></pre>

Activate automated backups for each Cloud SQL database instance:

<pre class="codeWrap"><code>gcloud sql instances patch INSTANCE_NAME --backup-start-time [HH:MM]</code></pre>

The backup-start-time parameter, specified in 24-hour time in the UTC±00 time zone, designates the commencement of a 4-hour backup window. Backups may initiate at any point during this specified backup window. It is essential to note that automated backups are not configured by default for Cloud SQL instances.
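
To confirm that automated backups are enabled on an instance, a quick check:

<pre class="codeWrap"><code>gcloud sql instances describe INSTANCE_NAME --format="value(settings.backupConfiguration.enabled)"</code></pre>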

Until automated backups are configured, no backups are taken for a Cloud SQL instance. While there are additional Cloud SQL best practices specific to MySQL, PostgreSQL, or SQL Server, the four practices above are arguably the most crucial.

BigQuery

BigQuery is a serverless, highly scalable, and cost-effective cloud data warehouse with an in-memory BI Engine and built-in machine learning capabilities. As with the other services covered, GCP security best practices are essential for optimal performance and data protection in BigQuery.

24. Confirm Restricted Access for BigQuery Datasets

It is imperative to prevent anonymous or public access to BigQuery datasets through careful management of IAM policies. Granting permissions to allUsers or allAuthenticatedUsers can potentially expose the dataset to unauthorized access, which is particularly concerning for datasets containing sensitive information.

Therefore, it is crucial to ensure that anonymous and/or public access is restricted. To achieve this, follow these steps. First, retrieve the dataset information to your local filesystem:

<pre class="codeWrap"><code>bq show --format=prettyjson PROJECT_ID:DATASET_NAME > dataset_info.json</code></pre>

Next, in the access section of dataset_info.json, modify the dataset information to eliminate all roles containing allUsers or allAuthenticatedUsers. Lastly, update the dataset:

<pre class="codeWrap"><code>bq update --source=dataset_info.json PROJECT_ID:DATASET_NAME</code></pre>

Implementing the Domain Restricted Sharing organization policy is an effective measure to prevent BigQuery datasets from becoming publicly accessible.

Compliance Standards and Benchmarks

Implementing and maintaining detection rules for your GCP environment is an ongoing task that can consume a significant amount of time, and it becomes even more challenging without a roadmap to guide you. A more effective approach is to align with the compliance standard(s) pertinent to your industry, which outline the requirements for securing your cloud environment effectively.

By adhering to industry-specific compliance standards, you can streamline the process of securing your infrastructure. Given the persistent nature of this work, it is advisable to execute benchmarks periodically. For example, the CIS Google Cloud Platform Foundation Benchmark can be run recurrently to audit your system and identify any non-conformities. This proactive approach helps maintain a robust security posture over time.

Conclusion

Venturing into the realm of cloud computing introduces a plethora of opportunities, accompanied by the necessity to grasp a distinct set of Google Cloud Platform security best practices. It's crucial to recognize that every new cloud service employed brings along its unique set of potential risks that demand careful consideration.

Fortunately, cloud-native security tools such as Falco and Cloud Custodian serve as valuable guides in navigating the intricacies of Google Cloud Platform security best practices. These tools not only assist in implementing recommended measures but also play a pivotal role in ensuring compliance with regulatory requirements. Embracing these tools is key to fortifying your cloud environment and safeguarding against potential security threats.
