Google Cloud Architecture Best Practices
48 Questions

Questions and Answers

What is the recommended approach to provide a bare-metal server application with access to Cloud Storage while adhering to security policies?

  • Assign a public IP address to the server
  • Set up a Cloud VPN connection (correct)
  • Use a Cloud Storage transfer service
  • Enable Cloud Storage API directly on the server

In a project with a single VPC and multiple regions, which method ensures the new instance in europe-west1 can access the application hosted in us-central1?

  • Ensure the Compute Engine instance has a public IP
  • Use a global load balancer to route traffic (correct)
  • Establish a VPN connection between the two regions
  • Deploy the new instance in the us-central1 region

What is the most efficient way to verify that all dependencies in a Deployment Manager template are satisfied before deployment?

  • Check dependencies manually before committing.
  • Create or update the deployment with the '--preview' flag and review it before applying. (correct)
  • Run the template in a staging environment.
  • Deploy the template in the production environment directly.

Which service should be used to minimize costs when sending logs from Compute Engine instances to a BigQuery dataset?

  • Use the Cloud Logging agent to filter logs (correct)

How can you enable TCP communication on port 8080 from tier #1 to tier #2 in a 3-tier Compute Engine solution?

  • Modify firewall rules to allow traffic on port 8080. (correct)

What should you do to ensure an application on Cloud Run processes messages from a Cloud Pub/Sub topic following recommended practices?

  • Set up authentication with IAM roles (correct)

How can you quickly disable excessive logging from a development GKE container while minimizing steps?

  • Use kubectl to change log verbosity (correct)

To create service cost estimates from multiple Google Cloud projects, which action should you take?

  • Run a standard SQL query against the billing data. (correct)

What steps should you take to enable Cloud Pub/Sub for your App Engine application while ensuring service account authentication?

  • Enable the Cloud Pub/Sub API in the Google Cloud Console. (correct)

What can be done to avoid storing a database password in plain text within a GKE deployment YAML file?

  • Access the password using a secret management service (correct)
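As a minimal sketch of this approach on GKE, the password can live in a Kubernetes Secret and be injected as an environment variable; the secret and key names below are hypothetical:

```shell
# Create a Secret instead of hard-coding the password in the deployment YAML.
kubectl create secret generic db-credentials \
    --from-literal=password='REPLACE_ME'

# Reference it from the pod spec instead of a plain-text value:
#   env:
#   - name: DB_PASSWORD
#     valueFrom:
#       secretKeyRef:
#         name: db-credentials
#         key: password
```

For stronger guarantees, the same pattern works with Secret Manager backing the secret rather than a literal value.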

What method can you use to deliver 1% of your website traffic to a test version hosted on App Engine?

  • Use traffic splitting in App Engine's version settings. (correct)

If multiple changes frequently need to be made by a small data science team using BigQuery, what is the best approach to provide them with necessary access?

  • Create a service account for the team (correct)

Which service configuration is most appropriate for point-in-time recovery?

  • Enable daily backups on Cloud SQL (correct)

Which approach should be implemented for storing audit log files in compliance with a 3-year retention policy for hundreds of Google Cloud projects?

  • Use Cloud Storage with lifecycle management. (correct)

For a Cloud Spanner application needing monitoring access without exposing table data to the support team, what is the recommended action?

  • Grant monitoring permissions without table access. (correct)

To implement a caching HTTP reverse proxy on GCP for a latency-sensitive website, which option should be prioritized?

  • Use a regional load balancer with caching capabilities. (correct)

What action should you take to assign permissions for an external auditor to review GCP Audit Logs and Data Access logs?

  • Assign a Cloud IAM role with specific log viewing permissions. (correct)

To access combined logs for all GCP projects for the past 60 days, what is the recommended approach?

  • Use Cloud Logging to aggregate logs across projects. (correct)

What is the most effective way to turn off all configured services in an existing GCP project to reduce service costs?

  • Use the Google Cloud Console to disable each service individually. (correct)

How should you enable a service account in a web-applications project to access BigQuery datasets in another project?

  • Use a Cloud IAM role that includes permissions for BigQuery access. (correct)

What steps should you take to investigate any access by a terminated employee to sensitive customer information?

  • Analyze GCP audit logs for actions taken by the employee. (correct)

When creating a custom IAM role for GCP, what is a key consideration to ensure its suitability for production use?

  • Ensure all included permissions are reviewed and approved. (correct)

What is the recommended method for making unstructured data accessible for ETL transformations on Google Cloud?

  • Store the data in Cloud Storage for access by Dataflow. (correct)

To efficiently manage multiple Google Cloud projects using the Google Cloud SDK CLI, what should you do?

  • Use the 'gcloud' command to set the desired project context. (correct)

What is the necessary configuration to allow communication on TCP port 8080 between instances in tier #1 and tier #2?

  • Create an ingress firewall rule targeting all instances with the tier #2 service account. (correct)

Which of the following configurations correctly establishes communication for TCP port 8080 between tier #2 and tier #3?

  • An ingress firewall rule targeting instances with the tier #3 service account, with the source set to the tier #2 service account. (correct)

What is the primary purpose of executing the Deployment Manager template with the --preview option?

  • To observe the state of interdependent resources. (correct)

Which of the following ingress firewall rule settings would NOT allow tier #1 to communicate with tier #2?

  • Targets: all instances; Source filter: IP ranges set to 10.0.2.0/24; Protocols: allow TCP: 8080. (correct)

What additional task should you perform besides creating necessary firewall rules to facilitate tier communication?

  • Ensure all service accounts have the necessary permissions. (correct)

Which of these ingress firewall rules allows both tiers to communicate effectively through TCP port 8080?

  • Ingress rule for the tier #2 service account from the tier #1 service account, allowing TCP: 8080. (correct)

When establishing communication on TCP port 8080, which approach can lead to potential security risks?

  • Allowing all protocols in ingress rules. (correct)

Which option defines the source filter of an ingress firewall rule that effectively allows communication between tier #2 and tier #3?

  • Service account associated with tier #2. (correct)

What is the best way to grant monitoring permissions to the support team without allowing access to table data in Cloud Spanner?

  • Add the support team group to the roles/monitoring.viewer role. (correct)

Which option minimizes costs while providing a 30-GB in-memory cache and an additional 2 GB for other processes for a caching HTTP reverse proxy?

  • Create a Cloud Memorystore for Redis instance with 32-GB capacity. (correct)

To run a single binary application that automatically scales based on CPU usage in a policy-compliant way on Google Cloud, what should you do?

  • Create an instance template, and use the template in a Managed Instance Group with autoscaling configured. (correct)

What is the purpose of exporting logs from Cloud Audit to BigQuery?

  • To analyze and store logs in a structured format. (correct)

When exporting logs to Cloud Pub/Sub, what is the typical use case?

  • For distributing logs to multiple subscribers. (correct)

If you want to grant the support team appropriate permissions for monitoring without exposing them to sensitive data in Cloud Spanner, which role is appropriate?

  • roles/monitoring.viewer (correct)

Which method provides a streamlined logging solution that does not require manual log management?

  • Utilizing the Cloud Logging (formerly Stackdriver) API to automate transfers. (correct)

Which design is most appropriate for a caching HTTP reverse proxy with very low CPU consumption needs?

  • Utilize Cloud Memorystore to manage the caching efficiently. (correct)

What should you do to effectively manage a rolling-action update with the specified configuration to maxSurge and maxUnavailable?

  • Perform a rolling-action start-update with maxSurge set to 1 and maxUnavailable set to 0. (correct)

To grant access for three users to view and edit table data on a Cloud Spanner instance, which action is correct?

  • Add the users to the roles/spanner.databaseUser role directly. (correct)

What is the first step required to create a new billing account and link it to an existing GCP project?

  • Verify you are Project Billing Manager for the GCP project. (correct)

When trying to verify user access activities in Cloud Storage buckets, what is the most efficient approach?

  • Use the GCP Console to filter the Activity log. (correct)

How do you ensure that a newly created Managed Instance Group functions correctly after linking with the backend service for the load balancer?

  • Monitor new instances until all are healthy before deleting the old group. (correct)

In a Managed Instance Group, what is the purpose of deleting instances to recreate them using a new instance template?

  • To efficiently apply the new application version without downtime. (correct)

What is a consequence of setting maxSurge to 0 during a rolling-action update?

  • Existing instances will be updated without additional instances being created. (correct)

What must you confirm before linking a new billing account to any GCP project?

  • You have Billing Administrator permissions for the billing account. (correct)

Flashcards

Validating Deployment Manager Changes

Use Deployment Manager's preview feature to validate the changes before applying them. This will help you identify any potential issues early on and avoid unexpected errors.

Restricting Communication between Tiers

Configure firewall rules to restrict communication between tiers based on the specified ports. Create firewall rules for each tier allowing inbound traffic from the specific tier that needs to communicate with it. For example, the firewall rule for tier #2 should allow inbound traffic from tier #1 on port 8080.
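As a sketch, the tier #1 → tier #2 rule might be created with service-account-based source and target filters; the project, network, and service-account names below are placeholders:

```shell
# Allow TCP:8080 into tier #2 instances, but only from tier #1 instances.
gcloud compute firewall-rules create allow-tier1-to-tier2 \
    --network=default \
    --direction=INGRESS \
    --action=ALLOW \
    --rules=tcp:8080 \
    --source-service-accounts=tier1-sa@my-project.iam.gserviceaccount.com \
    --target-service-accounts=tier2-sa@my-project.iam.gserviceaccount.com
```

Filtering by service account rather than IP range keeps the rule valid even as instances are added or recreated within each tier.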

Analyzing GCP Service Costs

Use BigQuery to query billing export data and create reports. Use the billing.export dataset to access billing data. Utilize the billing_export_table table for service cost details. You can use standard query syntax to group data by service type and generate cost estimates for the desired time period.
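A hedged sketch of such a query, run with the bq CLI; the project, dataset, and table names depend entirely on how your billing export was configured and are placeholders here:

```shell
# Summarize exported billing data by service (names are placeholders).
bq query --use_legacy_sql=false '
SELECT service.description AS service,
       ROUND(SUM(cost), 2) AS total_cost
FROM `my-project.billing_export.gcp_billing_export_v1_XXXXXX`
WHERE usage_start_time >= TIMESTAMP("2024-01-01")
GROUP BY service
ORDER BY total_cost DESC'
```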

Enabling Cloud Pub/Sub for App Engine

Enable the Cloud Pub/Sub API in your project. Create a service account with the necessary permissions. This will grant your App Engine application access to the Cloud Pub/Sub API.
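The two steps above can be sketched from the CLI; the project ID is a placeholder, and App Engine's default service account follows the `PROJECT_ID@appspot.gserviceaccount.com` pattern:

```shell
# Enable the Pub/Sub API for the project.
gcloud services enable pubsub.googleapis.com

# Grant the App Engine service account a Pub/Sub role.
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:my-project@appspot.gserviceaccount.com" \
    --role="roles/pubsub.subscriber"
```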


Testing New Website Version

Utilize App Engine's Split Traffic feature to randomly direct a percentage (1% in this case) of your users to the new test version. This ensures that the new version is tested on a small subset of users without major disruption to the majority.
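A minimal sketch of the split, assuming two hypothetical version IDs named `live` and `test`:

```shell
# Send 1% of traffic, chosen at random, to the test version.
gcloud app services set-traffic default \
    --splits=live=0.99,test=0.01 \
    --split-by=random
```

`--split-by` also accepts `ip` or `cookie` if you need each user to consistently land on the same version.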


Storing Audit Logs

Use Cloud Storage to store log audit files. Configure retention policies in Cloud Storage to automatically delete files older than 3 years. This solution provides a cost-effective approach for long-term storage and automatic cleanup.
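The lifecycle rule can be expressed as a small JSON configuration applied with gsutil; the bucket name is a placeholder, and 1095 days approximates the 3-year retention window:

```shell
# Delete objects once they are older than 3 years (1095 days).
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 1095}
    }
  ]
}
EOF
gsutil lifecycle set lifecycle.json gs://my-audit-logs-bucket
```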


Restricting Support Team Access

Create a Cloud Spanner database role for your support team with read-only permissions. This allows them to monitor the environment without granting access to sensitive data. This ensures that they can perform essential tasks without compromising data security.
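Granting monitoring-only visibility can be sketched with a single IAM binding; the project ID and group address are placeholders:

```shell
# Monitoring access only -- no Spanner data permissions are included.
gcloud projects add-iam-policy-binding my-project \
    --member="group:support-team@example.com" \
    --role="roles/monitoring.viewer"
```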


Caching HTTP Reverse Proxy

Utilize a Cloud Load Balancing service like HTTP(S) Load Balancing to distribute traffic across multiple instances. Configure it to act as a caching HTTP reverse proxy, ensuring low latency and high availability for your website.


Monitoring Cloud Spanner for Support Team

Granting permissions to a support team to monitor Cloud Spanner environment without access to table data. This approach follows Google-recommended practices.


Efficiently Running an HTTP Reverse Proxy on GCP

An HTTP reverse proxy with 30GB in-memory cache and minimal CPU usage, best implemented for cost efficiency.


Auto-Scaling a Single Binary Application on GCP

Scaling a single binary application on Google Cloud Platform based on CPU usage, ensuring operational efficiency and speed.


How to provide audit access to an external auditor?

Grant the auditor the "Cloud Audit Logs Viewer" and "Data Access Viewer" roles to access the corresponding logs.


How to access logs from multiple GCP projects?

Use Cloud Logging to combine logs from all projects into a single location. Enable Log Router to collect logs from all projects and route them to a shared sink. This allows for easier analysis and visualization.
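One way to sketch this is an aggregated sink at the organization level; the sink name, bucket, and organization ID below are placeholders:

```shell
# Route logs from every project under the organization to one bucket.
gcloud logging sinks create all-projects-sink \
    storage.googleapis.com/my-central-logs-bucket \
    --organization=123456789012 \
    --include-children \
    --log-filter='severity >= WARNING'
```

The sink's destination could equally be a BigQuery dataset if you plan to query the combined logs rather than just retain them.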


How to disable all services in a GCP project?

Navigate to the project's 'APIs & Services' page in the Google Cloud Console and disable each enabled API individually. (Shutting down the project entirely also stops all billable services.)


How to grant BigQuery access to service accounts across multiple projects?

Create a service account in the web-applications project and grant it the "BigQuery Data Viewer" role in the crm-databases-proj project. Allow the service account to access the BigQuery datasets. This ensures minimal access and adheres to Google-recommended practices.


How to investigate unauthorized user access?

Check the Cloud Audit Logs to investigate the employee's actions after their termination. The Audit Logs can provide information on who accessed sensitive data after their termination.


How to create a custom IAM role for production use?

Create a custom IAM role with suitable production permissions. Document the role's purpose, permissions, and intended use. This provides clarity, transparency, and control over custom roles.


How to prepare unstructured data for processing in Dataflow?

Use Cloud Storage to upload your unstructured data in various formats. This allows for storage and organization of data before being processed.


How to manage multiple Google Cloud projects?

Configure the Google Cloud SDK to include multiple projects. Manage the gcloud configurations to switch between projects easily.
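Switching between projects is handled with named gcloud configurations; the configuration and project names below are placeholders:

```shell
# Create one named configuration per project.
# 'configurations create' also activates the new configuration,
# so the following 'set project' applies to it.
gcloud config configurations create dev-project
gcloud config set project my-dev-project

gcloud config configurations create prod-project
gcloud config set project my-prod-project

# List configurations and switch back to dev.
gcloud config configurations list
gcloud config configurations activate dev-project
```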


Ingress Firewall Rule

A network security rule that controls incoming traffic to instances. An ingress firewall rule can specify the source of the traffic, the target instances, and the allowed protocols.


Egress Firewall Rule

A network security rule that controls outgoing traffic from instances. An egress firewall rule can specify the destination of the traffic, the source instances, and the allowed protocols.


Tier Service Account

A service account used to represent the identity of a group of instances within a specific tier of a multi-tier application.


Firewall Rule: Tier 1 to Tier 2

A rule that allows communication on TCP port 8080 between instances in Tier #1 and Tier #2.


Firewall Rule: Tier 2 to Tier 3

A rule that allows communication on TCP port 8080 between instances in Tier #2 and Tier #3.


Deployment Manager Template Execution in a Separate Project

The process of checking and validating the configuration of a deployment template against a different GCP project.


--preview Option

A Deployment Manager option that previews the planned changes without actually deploying the resources.


Monitoring Interdependent Resources

Observing the state of different resources during deployment, especially when resources are interdependent.


What is point-in-time recovery?

Point-in-time recovery allows you to restore your data to a specific point in time, even if the data has been modified or deleted. This is crucial for protecting your data against accidental deletions or modification and for disaster recovery.


How to support point-in-time recovery?

To support point-in-time recovery, you should implement a mechanism to regularly create backups or snapshots of your data. These backups can be stored in a separate location or on different storage media to ensure data redundancy and protection against data loss.


Securely access Cloud Storage from a private data center

To provide an application in your data center with access to Cloud Storage without public IPs, use a private service access connection. This enables secure communication between your on-premises environment and Google Cloud services through Private Google Access.


How to manage access to BigQuery for a data science team?

Use Cloud Identity and Access Management (IAM) to control access to your BigQuery data. You can create custom roles for your data science team, granting them only the necessary permissions to perform queries. Avoid giving them broad access to prevent potential misuse.


Deploying a new instance with access to an existing application

Because a VPC is a global resource, subnets in different regions of the same VPC communicate over internal IPs by default. Deploy the new Compute Engine instance in a europe-west1 subnet of the existing VPC, and it can reach the application in us-central1 without a VPN or public IP.


Building a time-series data processing pipeline

For processing time-series data, you can use Cloud Pub/Sub to ingest data in real-time, Cloud Storage to store large volumes of time-series data, Cloud Dataflow for batch processing of data, and BigQuery for storing processed data and performing analysis. This combination provides a robust and scalable solution for time-series data processing.


Minimizing BigQuery logging costs

To minimize cost, you can configure the Cloud Logging agent to filter logs based on severity levels. Only send critical or warning logs to your BigQuery dataset, and filter out less important information to reduce storage costs. This will ensure that you retain the most important logs while keeping costs under control.


Deploying a Cloud Run application triggered by Pub/Sub

To deploy an application on Cloud Run that processes messages from a Cloud Pub/Sub topic, use a Cloud Run service triggered by a Cloud Pub/Sub topic. This will automatically execute your application whenever a new message arrives on the topic. This approach ensures that your application reacts in real-time to new data and processes it efficiently.


Rolling Update with maxSurge and maxUnavailable

A rolling update starts updating instances one by one, limiting the number of instances being updated at a time. maxSurge sets the maximum number of extra instances that can be created during the update, while maxUnavailable sets the maximum number of instances that can be stopped for the update.
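The settings described above map directly onto a rolling-action command; the group, template, and zone names are placeholders:

```shell
# Create at most 1 extra instance during the update (maxSurge=1)
# and never take an existing instance offline (maxUnavailable=0).
gcloud compute instance-groups managed rolling-action start-update my-mig \
    --version=template=my-new-template \
    --max-surge=1 \
    --max-unavailable=0 \
    --zone=us-central1-a
```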


Updating Managed Instance Group with new Instance Template

Update the existing Managed Instance Group with a new instance template. This means updating the configuration of existing instances to use the new instance template. Usually done when you want to upgrade your application code or change resources available to running instances.


Creating New Managed Instance Group

Create a new Managed Instance Group with an updated instance template, which often means creating instances with the new template. This will allow you to add new functionality to your application or resource, increasing performance or reliability.


Health Check Before Deletion

Ensure all instances in the new Managed Instance Group are healthy and working as expected before deleting the old one, minimizing downtime and ensuring a seamless transition.


Granting Cloud Spanner Access to Users

You need to grant the users the spanner.databaseUser role, which allows them to view and edit data in the Cloud Spanner instance. You can do this either by directly adding them to this role or by creating a group and adding them to that group, and then adding the group to the role.


Linking Billing Account to Project

This step involves creating a new billing account separately from your Google Cloud project and then linking this new billing account to your existing Google Cloud project. This allows you to manage billing for your project under a separate billing account.


Checking Activity Log for Data Access

The Activity log is a detailed record of events in your Google Cloud project, including data access activities. You can filter it by specific users, buckets, and actions, providing you with a clear picture of who accessed what, when, and how.


Study Notes

Google Cloud Platform (GCP) Associate Cloud Engineer (ACE) Exam

  • GCP offers a range of services for building, deploying, and managing applications and infrastructure in the cloud.
  • The ACE exam covers fundamental GCP concepts and practical application-level skills.

Exam Questions and Answers

  • Question 1: Confirming dependencies in a Deployment Manager template: Use Deployment Manager's built-in dependency checking.
  • Question 2: Enabling communication between tiers in a 3-tier solution: Configure firewall rules to allow communication between instances of different tiers (1-2 and 2-3) on TCP port 8080.
  • Question 3: Estimating GCP service costs: Utilize standard query syntax within BigQuery to summarize service costs by type across multiple projects.
  • Question 4: Enabling Cloud Pub/Sub for an App Engine application: Enable the Cloud Pub/Sub API and configure the service account to authenticate the App Engine application.
  • Question 5: Deploying a new version of a website using App Engine: Deploy the new version without promoting it, then use traffic splitting to route 1% of user traffic to it.
  • Question 6: Implementing a cost-effective log file retention strategy: Utilize Cloud Storage's lifecycle management to retain audit logs for 3 years.
  • Question 7: Streamlining support team access to Cloud Spanner: Grant the correct permissions to the support team by using a Google-recommended approach without granting permissions to access the data.
  • Question 8: Optimizing GCP Costs for Caching HTTP Reverse Proxy: Use Cloud Memorystore for Redis with a 32 GB in-memory cache to minimize cost.
  • Question 9: Configuring automatic scaling for a binary application: Automatically scale the binary application based on CPU usage to be operationally efficient.
  • Question 10: Granting permissions for Compute Engine instances to write to Cloud Storage: Use service account permissions to enable writing to the Cloud Storage bucket for the Compute Engine instances.
  • Question 11: Sharing a Cloud Storage object with an external company: Securely share the object with the external company by using a Cloud Storage signed URL with an expiration time.
  • Question 12: Configuring an autoscaling Managed Instance Group for HTTPS Web Application: Use a health check on port 443 to automatically recreate unhealthy VMs.
  • Question 13: Setting up a Managed Instance Group to ensure only one instance runs per project: Configure the Managed Instance Group to ensure only one VM instance runs per project.
  • Question 14: Configuring VPC network and two subnets for production and test workloads: Configure a VPC network with two subnets in different regions to isolate production and test workloads.
  • Question 15: Configuring HTTPS load balancing services: Configure HTTPS load balancing services to terminate the client SSL session to minimize complexity.
  • Question 16: Deploying a new version of an application using a Managed Instance Group: Gradually roll out the new version while maintaining the available capacity of your web application.
  • Question 17: Granting access to Cloud Spanner instance table data: Grant three users access to view and modify table data for a Cloud Spanner instance.
  • Question 18: Linking a billing account and GCP project: Create a new billing account and link it to an existing project.
  • Question 19: Verifying Cloud Storage activity: Appropriately filter logs to verify Cloud Storage activity for a particular user.
  • Question 20: Estimating BigQuery query costs: Use on-demand pricing to query BigQuery and estimate the costs of a query that expects to return a lot of records.
  • Question 21: Monitoring resources across multiple projects: Collect and consolidate logs across projects into a single Cloud Monitoring (formerly Stackdriver) view.
  • Question 22: Dynamically provisioning VMs: Configure dynamic VM provisioning using a dedicated configuration file.
  • Question 23: Sharing sensitive Cloud Storage objects with external companies: Leverage Cloud Storage's signed URLs to share sensitive data with specific expiration times.

…and so on for questions 24-61. This is a sample, and the full list is extensive.


Description

This quiz assesses your understanding of best practices for implementing Google Cloud solutions, focusing on security, resource access, and efficient deployment. Questions cover cloud storage access, instance management across regions, logging optimization, and service estimates. Test your knowledge and enhance your skills in cloud architecture design.
