Professional-Cloud-Security-Engineer V15.35.pdf

Exam: Professional-Cloud-Security-Engineer
Title: Google Cloud Certified - Professional Cloud Security Engineer Exam
Vendor: Google
Version: V15.35

NO.1 Which two implied firewall rules are defined on a VPC network? (Choose two.)
A. A rule that allows all outbound connections
B. A rule that denies all inbound connections
C. A rule that blocks all inbound port 25 connections
D. A rule that blocks all outbound connections
E. A rule that allows all inbound port 80 connections
Answer: A B
Explanation
Implied IPv4 allow egress rule: an egress rule whose action is allow, destination is 0.0.0.0/0, and priority is the lowest possible (65535) lets any instance send traffic to any destination.
Implied IPv4 deny ingress rule: an ingress rule whose action is deny, source is 0.0.0.0/0, and priority is the lowest possible (65535) protects all instances by blocking incoming connections to them.
https://cloud.google.com/vpc/docs/firewalls?hl=en#default_firewall_rules
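The implied rules cannot be deleted, but because they sit at the lowest priority (65535), any explicit rule with a lower priority number takes precedence. A minimal sketch with the google-cloud-compute Python client that recreates the same behavior as explicit, auditable rules; the project ID and network are placeholders:

```python
from google.cloud import compute_v1

client = compute_v1.FirewallsClient()
project = "my-project"  # placeholder project ID

# Explicit equivalent of the implied allow-egress rule.
allow_egress = compute_v1.Firewall(
    name="explicit-allow-egress",
    network="global/networks/default",
    direction="EGRESS",
    priority=65534,  # lower number = evaluated before the implied 65535 rules
    allowed=[compute_v1.Allowed(I_p_protocol="all")],
    destination_ranges=["0.0.0.0/0"],
)

# Explicit equivalent of the implied deny-ingress rule.
deny_ingress = compute_v1.Firewall(
    name="explicit-deny-ingress",
    network="global/networks/default",
    direction="INGRESS",
    priority=65534,
    denied=[compute_v1.Denied(I_p_protocol="all")],
    source_ranges=["0.0.0.0/0"],
)

for rule in (allow_egress, deny_ingress):
    client.insert(project=project, firewall_resource=rule).result()
```

One reason to make the implied behavior explicit like this is that explicit rules support Firewall Rules Logging, which the implied rules do not.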
NO.2 You are deploying a web application hosted on Compute Engine. A business requirement mandates that application logs are preserved for 12 years and data is kept within European boundaries. You want to implement a storage solution that minimizes overhead and is cost-effective. What should you do?
A. Create a Cloud Storage bucket to store your logs in the EUROPE-WEST1 region. Modify your application code to ship logs directly to your bucket for increased efficiency.
B. Configure your Compute Engine instances to use the Google Cloud's operations suite Cloud Logging agent to send application logs to a custom log bucket in the EUROPE-WEST1 region with a custom retention of 12 years.
C. Use a Pub/Sub topic to forward your application logs to a Cloud Storage bucket in the EUROPE-WEST1 region.
D. Configure a custom retention policy of 12 years on your Google Cloud's operations suite log bucket in the EUROPE-WEST1 region.
Answer: B
Explanation
https://youtu.be/MI4iG2GIZMA

NO.3 Your organization operates virtual machines (VMs) with only private IPs in the Virtual Private Cloud (VPC), with internet access through Cloud NAT. Every day, you must patch all VMs with critical OS updates and provide summary reports. What should you do?
A. Validate that the egress firewall rules allow any outgoing traffic. Log in to each VM and execute OS-specific update commands. Configure a Cloud Scheduler job to update with critical patches daily.
B. Ensure that VM Manager is installed and running on the VMs. In the OS patch management service, configure the patch jobs to update with critical patches daily.
C. Assign public IPs to VMs. Validate that the egress firewall rules allow any outgoing traffic. Log in to each VM, and configure a daily cron job to run OS updates at night during low-activity periods.
D. Copy the latest patches to a Cloud Storage bucket. Log in to each VM, download the patches from the bucket, and install them.
Answer: B
Explanation
VM Manager is a suite of tools that can be used to manage operating systems for large virtual machine (VM) fleets running Windows and Linux on Compute Engine. It helps drive efficiency through automation and reduces the operational burden of maintaining these VM fleets. VM Manager includes several services such as OS patch management, OS inventory management, and OS configuration management. By using VM Manager, you can apply patches, collect operating system information, and install, remove, or auto-update software packages. The suite provides a high level of control and automation for managing large VM fleets on Google Cloud.
https://cloud.google.com/compute/docs/vm-manager

NO.4 You are backing up application logs to a shared Cloud Storage bucket that is accessible to both the administrator and analysts. Analysts should not have access to logs that contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible to the administrator. What should you do?
A. Upload the logs to both the shared bucket and the bucket with PII that is only accessible to the administrator. Use the Cloud Data Loss Prevention API to create a job trigger. Configure the trigger to delete any files that contain PII from the shared bucket.
B. On the shared bucket, configure Object Lifecycle Management to delete objects that contain PII.
C. On the shared bucket, configure a Cloud Storage trigger that is only triggered when PII is uploaded. Use Cloud Functions to capture the trigger and delete the files that contain PII.
D. Use Pub/Sub and Cloud Functions to trigger a Cloud Data Loss Prevention scan every time a file is uploaded to the administrator's bucket. If the scan does not detect PII, have the function move the objects into the shared Cloud Storage bucket.
Answer: B

NO.5 You have noticed an increased number of phishing attacks across your enterprise user accounts. You want to implement the Google 2-Step Verification (2SV) option that uses a cryptographic signature to authenticate a user and verify the URL of the login page. Which Google 2SV option should you use?
A. Titan Security Keys
B. Google prompt
C. Google Authenticator app
D. Cloud HSM keys
Answer: A
Explanation
https://cloud.google.com/titan-security-key
Security keys use public-key cryptography to verify a user's identity and the URL of the login page, ensuring attackers can't access your account even if you are tricked into providing your username and password.
NO.6 Your company's chief information security officer (CISO) is requiring business data to be stored in specific locations due to regulatory requirements that affect the company's global expansion plans. After working on a plan to implement this requirement, you determine the following:
The services in scope are included in the Google Cloud data residency requirements.
The business data remains within specific locations under the same organization.
The folder structure can contain multiple data residency locations.
The projects are aligned to specific locations.
You plan to use the Resource Location Restriction organization policy constraint with very granular control. At which level in the hierarchy should you set the constraint?
A. Organization
B. Resource
C. Project
D. Folder
Answer: C

NO.7 An organization receives an increasing number of phishing emails. Which method should be used to protect employee credentials in this situation?
A. Multifactor Authentication
B. A strict password policy
C. Captcha on login pages
D. Encrypted emails
Answer: A
Explanation
https://cloud.google.com/blog/products/g-suite/7-ways-admins-can-help-secure-accounts-against-phishing-g-suit
https://www.duocircle.com/content/email-security-services/email-security-in-cryptography#:~:text=Customer%2

NO.8 Your team needs to configure their Google Cloud Platform (GCP) environment so they can centralize the control over networking resources like firewall rules, subnets, and routes. They also have an on-premises environment where resources need access back to the GCP resources through a private VPN connection. The networking resources will need to be controlled by the network security team. Which type of networking design should your team use to meet these requirements?
A. Shared VPC Network with a host project and service projects
B. Grant Compute Admin role to the networking team for each engineering project
C. VPC peering between all engineering projects using a hub and spoke model
D. Cloud VPN Gateway between all engineering projects using a hub and spoke model
Answer: A

NO.9 A customer has an analytics workload running on Compute Engine that should have limited internet access. Your team created an egress firewall rule to deny (priority 1000) all traffic to the internet. The Compute Engine instances now need to reach out to the public repository to get security updates. What should your team do?
A. Create an egress firewall rule to allow traffic to the CIDR range of the repository with a priority greater than 1000.
B. Create an egress firewall rule to allow traffic to the CIDR range of the repository with a priority less than 1000.
C. Create an egress firewall rule to allow traffic to the hostname of the repository with a priority greater than 1000.
D. Create an egress firewall rule to allow traffic to the hostname of the repository with a priority less than 1000.
Answer: B
Explanation
https://cloud.google.com/vpc/docs/firewalls#priority_order_for_firewall_rules
NO.10 A manager wants to start retaining security event logs for 2 years while minimizing costs. You write a filter to select the appropriate log entries. Where should you export the logs?
A. BigQuery datasets
B. Cloud Storage buckets
C. Stackdriver Logging
D. Cloud Pub/Sub topics
Answer: B
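Cloud Storage is the low-cost destination for long-lived log archives. A hedged sketch with the google-cloud-storage Python client that creates a Coldline bucket for the exported entries and auto-deletes objects once the two-year retention window passes; the bucket name and location are placeholders:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("security-log-archive")  # placeholder bucket name
bucket.storage_class = "COLDLINE"  # cheap storage class for rarely read archives
new_bucket = client.create_bucket(bucket, location="US")

# Delete exported log objects once they are older than 2 years (730 days).
new_bucket.add_lifecycle_delete_rule(age=730)
new_bucket.patch()
```

The lifecycle rule caps storage costs automatically, so no scheduled cleanup job is needed on top of the log sink.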
NO.11 While migrating your organization's infrastructure to GCP, a large number of users will need to access the GCP Console. The Identity Management team already has a well-established way to manage your users and wants to keep using your existing Active Directory or LDAP server along with the existing SSO password. What should you do?
A. Manually synchronize the data in the Google domain with your existing Active Directory or LDAP server.
B. Use Google Cloud Directory Sync to synchronize the data in the Google domain with your existing Active Directory or LDAP server.
C. Users sign in directly to the GCP Console using the credentials from your on-premises Kerberos-compliant identity provider.
D. Users sign in using an OpenID (OIDC) compatible IdP, receive an authentication token, then use that token to log in to the GCP Console.
Answer: B
Explanation
https://cloud.google.com/architecture/identity/federating-gcp-with-active-directory-configuring-single-sign-on

NO.12 Your DevOps team uses Packer to build Compute Engine images by using this process:
1. Create an ephemeral Compute Engine VM.
2. Copy a binary from a Cloud Storage bucket to the VM's file system.
3. Update the VM's package manager.
4. Install external packages from the internet onto the VM.
Your security team just enabled the organizational policy constraints/compute.vmExternalIpAccess to restrict the usage of public IP addresses on VMs. In response, your DevOps team updated their scripts to remove public IP addresses on the Compute Engine VMs; however, the build pipeline is failing due to connectivity issues. What should you do? (Choose two.)
A. Provision a Cloud NAT instance in the same VPC and region as the Compute Engine VM.
B. Provision an HTTP load balancer with the VM in an unmanaged instance group to allow inbound connections from the internet to your VM.
C. Update the VPC routes to allow traffic to and from the internet.
D. Provision a Cloud VPN tunnel in the same VPC and region as the Compute Engine VM.
E. Enable Private Google Access on the subnet that the Compute Engine VM is deployed within.
Answer: A E
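Cloud NAT (answer A) restores outbound internet access for the private build VMs, while Private Google Access (answer E) covers the Cloud Storage copy without leaving Google's network. A sketch of the NAT half with the google-cloud-compute Python client, under the assumption that the builds run in the default network; the project, region, and names are placeholders:

```python
from google.cloud import compute_v1

# Cloud NAT is configured on a Cloud Router in the VMs' VPC and region.
router = compute_v1.Router(
    name="packer-nat-router",
    network="global/networks/default",  # same VPC as the build VMs
    nats=[
        compute_v1.RouterNat(
            name="packer-nat",
            nat_ip_allocate_option="AUTO_ONLY",  # Google allocates the NAT IPs
            source_subnetwork_ip_ranges_to_nat="ALL_SUBNETWORKS_ALL_IP_RANGES",
        )
    ],
)

op = compute_v1.RoutersClient().insert(
    project="my-project", region="us-central1", router_resource=router
)
op.result()  # block until the router and its NAT config are created
```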
NO.13 A website design company recently migrated all customer sites to App Engine. Some sites are still in progress and should only be visible to customers and company employees from any location. Which solution will restrict access to the in-progress sites?
A. Upload an .htaccess file containing the customer and employee user accounts to App Engine.
B. Create an App Engine firewall rule that allows access from the customer and employee networks and denies all other traffic.
C. Enable Cloud Identity-Aware Proxy (IAP), and allow access to a Google Group that contains the customer and employee user accounts.
D. Use Cloud VPN to create a VPN connection between the relevant on-premises networks and the company's GCP Virtual Private Cloud (VPC) network.
Answer: C
Explanation
https://cloud.google.com/iap/docs/concepts-overview#when_to_use_iap

NO.14 A customer wants to make it convenient for their mobile workforce to access a CRM web interface that is hosted on Google Cloud Platform (GCP). The CRM can only be accessed by someone on the corporate network. The customer wants to make it available over the internet. Your team requires an authentication layer in front of the application that supports two-factor authentication. Which GCP product should the customer implement to meet these requirements?
A. Cloud Identity-Aware Proxy
B. Cloud Armor
C. Cloud Endpoints
D. Cloud VPN
Answer: A
Explanation
Cloud IAP is integrated with Google Sign-In, through which multi-factor authentication can be enabled.
https://cloud.google.com/iap/docs/concepts-overview

NO.15 An engineering team is launching a web application that will be public on the internet. The web application is hosted in multiple GCP regions and will be directed to the respective backend based on the URL request. Your team wants to avoid exposing the application directly on the internet and wants to deny traffic from a specific list of malicious IP addresses. Which solution should your team implement to meet these requirements?
A. Cloud Armor
B. Network Load Balancing
C. SSL Proxy Load Balancing
D. NAT Gateway
Answer: A
Explanation
https://cloud.google.com/armor/docs/security-policy-overview#edge-security

NO.16 You are a member of the security team at an organization. Your team has a single GCP project with credit card payment processing systems alongside web applications and data processing systems. You want to reduce the scope of systems subject to PCI audit standards. What should you do?
A. Use multi-factor authentication for admin access to the web application.
B. Use only applications certified compliant with PA-DSS.
C. Move the cardholder data environment into a separate GCP project.
D. Use VPN for all connections between your office and cloud environments.
Answer: C
Explanation
https://cloud.google.com/solutions/best-practices-vpc-design
See the "Setting up your payment-processing environment" section in https://cloud.google.com/solutions/pci-dss-compliance-in-gcp.

NO.17 You need to enforce a security policy in your Google Cloud organization that prevents users from exposing objects in their buckets externally. There are currently no buckets in your organization. Which solution should you implement proactively to achieve this goal with the least operational overhead?
A. Create an hourly cron job to run a Cloud Function that finds public buckets and makes them private.
B. Enable the constraints/storage.publicAccessPrevention constraint at the organization level.
C. Enable the constraints/storage.uniformBucketLevelAccess constraint at the organization level.
D. Create a VPC Service Controls perimeter that protects the storage.googleapis.com service in your projects that contain buckets. Add any new project that contains a bucket to the perimeter.
Answer: B
Explanation
https://cloud.google.com/storage/docs/public-access-prevention
Public access prevention protects Cloud Storage buckets and objects from being accidentally exposed to the public. If your bucket is contained within an organization, you can enforce public access prevention by using the organization policy constraint storage.publicAccessPrevention at the project, folder, or organization level.
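The organization policy constraint handles this fleet-wide; the same switch also exists per bucket. A minimal sketch, assuming a placeholder bucket name, that enforces public access prevention on one existing bucket with the google-cloud-storage Python client:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("team-assets")  # placeholder bucket name

# Per-bucket equivalent of the org-level constraint: no object in this
# bucket can be made public, regardless of its ACLs or IAM bindings.
bucket.iam_configuration.public_access_prevention = "enforced"
bucket.patch()
```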
NO.18 Your team needs to make sure that their backend database can only be accessed by the frontend application and no other instances on the network. How should your team design this network?
A. Create an ingress firewall rule to allow access only from the application to the database using firewall tags.
B. Create a different subnet for the frontend application and database to ensure network isolation.
C. Create two VPC networks, and connect the two networks using Cloud VPN gateways to ensure network isolation.
D. Create two VPC networks, and connect the two networks using VPC peering to ensure network isolation.
Answer: A
Explanation
"However, even though it is possible to use tags for target filtering in this manner, we recommend that you use service accounts where possible. Target tags are not access-controlled and can be changed by someone with the instanceAdmin role while VMs are in service. Service accounts are access-controlled, meaning that a specific user must be explicitly authorized to use a service account. There can only be one service account per instance, whereas there can be multiple tags. Also, service accounts assigned to a VM can only be changed when the VM is stopped."

NO.19 You are part of a security team that wants to ensure that a Cloud Storage bucket in Project A can only be readable from Project B. You also want to ensure that data in the Cloud Storage bucket cannot be accessed from or copied to Cloud Storage buckets outside the network, even if the user has the correct credentials. What should you do?
A. Enable VPC Service Controls, create a perimeter with Projects A and B, and include the Cloud Storage service.
B. Enable the Domain Restricted Sharing organization policy and Bucket Policy Only on the Cloud Storage bucket.
C. Enable Private Access in the Project A and B networks with strict firewall rules to allow communication between the networks.
D. Enable VPC Peering between the Project A and B networks with strict firewall rules to allow communication between the networks.
Answer: A
Explanation
https://cloud.google.com/vpc-service-controls/docs/overview#isolate

NO.20 You are on your company's development team. You noticed that your web application hosted in staging on GKE dynamically includes user data in web pages without first properly validating the inputted data. This could allow an attacker to execute gibberish commands and display arbitrary content in a victim user's browser in a production environment. How should you prevent and fix this vulnerability?
A. Use Cloud IAP based on IP address or end-user device attributes to prevent and fix the vulnerability.
B. Set up an HTTPS load balancer, and then use Cloud Armor for the production environment to prevent the potential XSS attack.
C. Use Web Security Scanner to validate the usage of an outdated library in the code, and then use a secured version of the included library.
D. Use Web Security Scanner in staging to simulate an XSS injection attack, and then use a templating system that supports contextual auto-escaping.
Answer: D
Explanation
Web Security Scanner does simulate such attacks: "Web Security Scanner cross-site scripting (XSS) injection testing *simulates* an injection attack by inserting a benign test string into user-editable fields and then performing various user actions."
https://cloud.google.com/security-command-center/docs/how-to-remediate-web-security-scanner-findin
NO.21 You are in charge of creating a new Google Cloud organization for your company. Which two actions should you take when creating the super administrator accounts? (Choose two.)
A. Create an access level in the Google Admin console to prevent super admins from logging in to Google Cloud.
B. Disable any Identity and Access Management (IAM) roles for super admins at the organization level in the Google Cloud Console.
C. Use a physical token to secure the super admin credentials with multi-factor authentication (MFA).
D. Use a private connection to create the super admin accounts to avoid sending your credentials over the Internet.
E. Provide non-privileged identities to the super admin users for their day-to-day activities.
Answer: C E
Explanation
https://cloud.google.com/resource-manager/docs/super-admin-best-practices#discourage_super_admin_account_
- Use a security key or other physical authentication device to enforce two-step verification.
- Give super admins a separate account that requires a separate login.

NO.22 Your organization has implemented synchronization and SAML federation between Cloud Identity and Microsoft Active Directory. You want to reduce the risk of Google Cloud user accounts being compromised. What should you do?
A. Create a Cloud Identity password policy with strong password settings, and configure 2-Step Verification with security keys in the Google Admin console.
B. Create a Cloud Identity password policy with strong password settings, and configure 2-Step Verification with verification codes via text or phone call in the Google Admin console.
C. Create an Active Directory domain password policy with strong password settings, and configure post-SSO (single sign-on) 2-Step Verification with security keys in the Google Admin console.
D. Create an Active Directory domain password policy with strong password settings, and configure post-SSO (single sign-on) 2-Step Verification with verification codes via text or phone call in the Google Admin console.
Answer: C

NO.23 You need to use Cloud External Key Manager to create an encryption key to encrypt specific BigQuery data at rest in Google Cloud. Which steps should you do first?
A. 1. Create or use an existing key with a unique uniform resource identifier (URI) in your Google Cloud project. 2. Grant your Google Cloud project access to a supported external key management partner system.
B. 1. Create or use an existing key with a unique uniform resource identifier (URI) in Cloud Key Management Service (Cloud KMS). 2. In Cloud KMS, grant your Google Cloud project access to use the key.
C. 1. Create or use an existing key with a unique uniform resource identifier (URI) in a supported external key management partner system. 2. In the external key management partner system, grant access for this key to use your Google Cloud project.
D. 1. Create an external key with a unique uniform resource identifier (URI) in Cloud Key Management Service (Cloud KMS). 2. In Cloud KMS, grant your Google Cloud project access to use the key.
Answer: C
Explanation
https://cloud.google.com/kms/docs/ekm#how_it_works
- First, you create or use an existing key in a supported external key management partner system. This key has a unique URI or key path.
- Next, you grant your Google Cloud project access to use the key, in the external key management partner system.
- In your Google Cloud project, you create a Cloud EKM key, using the URI or key path for the externally managed key.
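Once the external key exists and your project is authorized in the partner system, the last step is pointing a Cloud KMS key at it. A hedged sketch with the google-cloud-kms Python client; the key ring, key ID, and external key URI are all placeholders:

```python
from google.cloud import kms_v1

client = kms_v1.KeyManagementServiceClient()
key_ring = client.key_ring_path("my-project", "us-east1", "ekm-ring")

# Create a Cloud EKM key shell; versions will reference the external key.
crypto_key = client.create_crypto_key(
    request={
        "parent": key_ring,
        "crypto_key_id": "bq-ekm-key",
        "crypto_key": {
            "purpose": kms_v1.CryptoKey.CryptoKeyPurpose.ENCRYPT_DECRYPT,
            "version_template": {
                "protection_level": kms_v1.ProtectionLevel.EXTERNAL,
                "algorithm": kms_v1.CryptoKeyVersion.CryptoKeyVersionAlgorithm.EXTERNAL_SYMMETRIC_ENCRYPTION,
            },
        },
        "skip_initial_version_creation": True,
    }
)

# Each version carries the URI of the key held by the partner EKM system.
client.create_crypto_key_version(
    request={
        "parent": crypto_key.name,
        "crypto_key_version": {
            "external_protection_level_options": {
                "external_key_uri": "https://ekm.example.com/v0/keys/abc123"  # placeholder URI
            }
        },
    }
)
```

The resulting key's resource name can then be supplied as the kmsKeyName for a CMEK-protected BigQuery dataset or table.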
NO.24 As adoption of the Cloud Data Loss Prevention (DLP) API grows within the company, you need to optimize usage to reduce cost. DLP target data is stored in Cloud Storage and BigQuery. The location and region are identified as a suffix in the resource name. Which cost reduction options should you recommend?
A. Set an appropriate rowsLimit value on BigQuery data hosted outside the US, and set an appropriate bytesLimitPerFile value on multiregional Cloud Storage buckets.
B. Set an appropriate rowsLimit value on BigQuery data hosted outside the US, and minimize transformation units on multiregional Cloud Storage buckets.
C. Use rowsLimit and bytesLimitPerFile to sample data, and use CloudStorageRegexFileSet to limit scans.
D. Use FindingLimits and TimespanConfig to sample data, and minimize transformation units.
Answer: C
Explanation
https://cloud.google.com/dlp/docs/inspecting-storage#sampling
https://cloud.google.com/dlp/docs/best-practices-

NO.25 Your organization wants to be General Data Protection Regulation (GDPR) compliant. You want to ensure that your DevOps teams can only create Google Cloud resources in the Europe regions. What should you do?
A. Use the org policy constraint "Restrict Resource Service Usage" on your Google Cloud organization node.
B. Use Identity and Access Management (IAM) custom roles to ensure that your DevOps team can only create resources in the Europe regions.
C. Use the org policy constraint "Google Cloud Platform - Resource Location Restriction" on your Google Cloud organization node.
D. Use Identity-Aware Proxy (IAP) with Access Context Manager to restrict the location of Google Cloud resources.
Answer: A
Explanation
https://cloud.google.com/resource-manager/docs/organization-policy/defining-locations

NO.26 A company is running their webshop on Google Kubernetes Engine and wants to analyze customer transactions in BigQuery. You need to ensure that no credit card numbers are stored in BigQuery. What should you do?
A. Create a BigQuery view with regular expressions matching credit card numbers to query and delete affected rows.
B. Use the Cloud Data Loss Prevention API to redact related infoTypes before data is ingested into BigQuery.
C. Leverage Security Command Center to scan for assets of type Credit Card Number in BigQuery.
D. Enable Cloud Identity-Aware Proxy to filter out credit card numbers before storing the logs in BigQuery.
Answer: B
Explanation
https://cloud.google.com/bigquery/docs/scan-with-dlp
The Cloud Data Loss Prevention API allows you to detect and redact or remove sensitive data before the comments or reviews are published. Cloud DLP will read information from BigQuery, Cloud Storage, or Datastore and scan it for sensitive data.
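A hedged sketch of the DLP call that would scrub card numbers from a transaction record before it is written to BigQuery, using the google-cloud-dlp Python client; the project ID and sample text are placeholders:

```python
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()
project = "my-project"  # placeholder project ID

response = dlp.deidentify_content(
    request={
        "parent": f"projects/{project}",
        # Look only for credit card numbers in the inbound record.
        "inspect_config": {"info_types": [{"name": "CREDIT_CARD_NUMBER"}]},
        "deidentify_config": {
            "info_type_transformations": {
                "transformations": [
                    {
                        # Replace each match with its infoType name,
                        # e.g. "[CREDIT_CARD_NUMBER]".
                        "primitive_transformation": {
                            "replace_with_info_type_config": {}
                        }
                    }
                ]
            }
        },
        "item": {"value": "Order 42 paid with 4111 1111 1111 1111"},
    }
)
print(response.item.value)  # redacted text, safe to ingest into BigQuery
```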
NO.27 Your organization acquired a new workload. The Web and Application (App) servers will be running on Compute Engine in a newly created custom VPC. You are responsible for configuring a secure network communication solution that meets the following requirements:
Only allows communication between the Web and App tiers.
Enforces consistent network security when autoscaling the Web and App tiers.
Prevents Compute Engine Instance Admins from altering network traffic.
What should you do?
A. 1. Configure all running Web and App servers with respective network tags. 2. Create an allow VPC firewall rule that specifies the target/source with respective network tags.
B. 1. Configure all running Web and App servers with respective service accounts. 2. Create an allow VPC firewall rule that specifies the target/source with respective service accounts.
C. 1. Re-deploy the Web and App servers with instance templates configured with respective network tags. 2. Create an allow VPC firewall rule that specifies the target/source with respective network tags.
D. 1. Re-deploy the Web and App servers with instance templates configured with respective service accounts. 2. Create an allow VPC firewall rule that specifies the target/source with respective service accounts.
Answer: D
Explanation
https://cloud.google.com/vpc/docs/firewalls#service-accounts-vs-tags
A service account represents an identity associated with an instance. Only one service account can be associated with an instance. You control access to the service account by controlling the grant of the Service Account User role for other IAM principals. For an IAM principal to start an instance by using a service account, that principal must have the Service Account User role to at least use that service account and appropriate permissions to create instances (for example, having the Compute Engine Instance Admin role to the project).
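A minimal sketch of the firewall rule from answer D using the google-cloud-compute Python client. Identities, not mutable tags, define the Web and App tiers, so an Instance Admin cannot re-label a VM into the allowed path; the service account emails, VPC, and port are placeholders:

```python
from google.cloud import compute_v1

rule = compute_v1.Firewall(
    name="allow-web-to-app",
    network="global/networks/app-vpc",  # placeholder custom VPC
    direction="INGRESS",
    priority=1000,
    # Only traffic from VMs running as the Web tier's service account...
    source_service_accounts=["web-tier@my-project.iam.gserviceaccount.com"],
    # ...may reach VMs running as the App tier's service account.
    target_service_accounts=["app-tier@my-project.iam.gserviceaccount.com"],
    allowed=[compute_v1.Allowed(I_p_protocol="tcp", ports=["8443"])],
)

compute_v1.FirewallsClient().insert(
    project="my-project", firewall_resource=rule
).result()
```

Because the rule keys on service accounts, any instance the MIG autoscales into existence inherits the policy as soon as it boots with the tier's identity.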
NO.28 You perform a security assessment on a customer architecture and discover that multiple VMs have public IP addresses. After providing a recommendation to remove the public IP addresses, you are told those VMs need to communicate to external sites as part of the customer's typical operations. What should you recommend to reduce the need for public IP addresses in your customer's VMs?
A. Google Cloud Armor
B. Cloud NAT
C. Cloud Router
D. Cloud VPN
Answer: B
Explanation
https://cloud.google.com/nat/docs/overview

NO.29 Your team sets up a Shared VPC Network where project co-vpc-prod is the host project. Your team has configured the firewall rules, subnets, and VPN gateway on the host project. They need to enable Engineering Group A to attach a Compute Engine instance to only the 10.1.1.0/24 subnet. What should your team grant to Engineering Group A to meet this requirement?
A. Compute Network User Role at the host project level.
B. Compute Network User Role at the subnet level.
C. Compute Shared VPC Admin Role at the host project level.
D. Compute Shared VPC Admin Role at the service project level.
Answer: B
Explanation
https://cloud.google.com/vpc/docs/shared-vpc#svc_proj_admins

NO.30 You are deploying regulated workloads on Google Cloud. The regulation has data residency and data access requirements. It also requires that support is provided from the same geographical location as where the data resides. What should you do?
A. Enable Access Transparency Logging.
B. Deploy resources only to regions permitted by data residency requirements.
C. Use Data Access logging and Access Transparency logging to confirm that no users are accessing data from another region.
D. Deploy Assured Workloads.
Answer: D
Explanation
Assured Workloads for Google Cloud allows you to deploy regulated workloads with data residency, access, and support requirements. It helps you configure your environment in a manner that aligns with specific compliance frameworks and standards.

NO.31 For compliance reasons, an organization needs to ensure that in-scope PCI Kubernetes Pods reside on "in-scope" Nodes only. These Nodes can only contain the "in-scope" Pods. How should the organization achieve this objective?
A. Add a nodeSelector field to the pod configuration to only use the Nodes labeled inscope: true.
B. Create a node pool with the label inscope: true and a Pod Security Policy that only allows the Pods to run on Nodes with that label.
C. Place a taint on the Nodes with the label inscope: true and effect NoSchedule, and a toleration to match in the Pod configuration.
D. Run all in-scope Pods in the namespace "in-scope-pci".
Answer: A
Explanation
nodeSelector is the simplest recommended form of node selection constraint. You can add the nodeSelector field to your Pod specification and specify the node labels you want the target node to have. Kubernetes only schedules the Pod onto nodes that have each of the labels you specify.
https://kubernetes.io/docs/concepts/scheduling-eviction/assign-pod-node/#nodeselector
Tolerations are applied to pods. Tolerations allow the scheduler to schedule pods with matching taints. Tolerations allow scheduling but don't guarantee scheduling: the scheduler also evaluates other parameters as part of its function.
https://kubernetes.io/docs/concepts/scheduling-eviction/taint-and-toleration/

NO.32 You control network traffic for a folder in your Google Cloud environment. Your folder includes multiple projects and Virtual Private Cloud (VPC) networks. You want to enforce on the folder level that egress connections are limited only to IP range 10.58.5.0/24 and only from the VPC network "dev-vpc". You want to minimize implementation and maintenance effort. What should you do?
A. 1. Attach external IP addresses to the VMs in scope. 2. Configure a VPC firewall rule in "dev-vpc" that allows egress connectivity to IP range 10.58.5.0/24 for all source addresses in this network.
B. 1. Attach external IP addresses to the VMs in scope. 2. Define and apply a hierarchical firewall policy on the folder level to deny all egress connections and to allow egress to IP range 10.58.5.0/24 from network "dev-vpc".
C. 1. Leave the network configuration of the VMs in scope unchanged. 2. Create a new project including a new VPC network "new-vpc". 3. Deploy a network appliance in "new-vpc" to filter access requests and only allow egress connections from "dev-vpc" to 10.58.5.0/24.
D. 1. Leave the network configuration of the VMs in scope unchanged. 2. Enable Cloud NAT for "dev-vpc" and restrict the target range in Cloud NAT to 10.58.5.0/24.
Answer: B
Explanation
This approach allows you to control network traffic at the folder level. By attaching external IP addresses to the VMs in scope, you can ensure that the VMs have a unique, routable IP address for outbound connections. Then, by defining and applying a hierarchical firewall policy at the folder level, you can enforce that egress connections are limited to the specified IP range and only from the specified VPC network.
NO.33 Your organization's customers must scan and upload their contract and driver's license into a web portal in Cloud Storage. You must remove all personally identifiable information (PII) from files that are older than 12 months. Also, you must archive the anonymized files for retention purposes. What should you do?
A. Set a time to live (TTL) of 12 months for the files in the Cloud Storage bucket that removes PII and moves the files to the archive storage class.
B. Schedule a Cloud Key Management Service (KMS) rotation period of 12 months for the encryption keys of the Cloud Storage files containing PII to de-identify them. Delete the original keys.
C. Create a Cloud Data Loss Prevention (DLP) inspection job that de-identifies PII in files created more than 12 months ago and archives them to another Cloud Storage bucket. Delete the original files.
D. Configure the Autoclass feature of the Cloud Storage bucket to de-identify PII. Archive the files that are older than 12 months. Delete the original files.
Answer: C

NO.34 Your company requires the security and network engineering teams to identify all network anomalies and be able to capture payloads within VPCs. Which method should you use?
A. Define an organization policy constraint.
B. Configure packet mirroring policies.
C. Enable VPC Flow Logs on the subnet.
D. Monitor and analyze Cloud Audit Logs.
Answer: B
Explanation
https://cloud.google.com/vpc/docs/packet-mirroring
Packet Mirroring clones the traffic of specified instances in your Virtual Private Cloud (VPC) network and forwards it for examination. Packet Mirroring captures all traffic and packet data, including payloads and headers.

NO.35 You are creating a new infrastructure CI/CD pipeline to deploy hundreds of ephemeral projects in your Google Cloud organization to enable your users to interact with Google Cloud. You want to restrict the use of the default networks in your organization while following Google-recommended best practices. What should you do?
A. Enable the constraints/compute.skipDefaultNetworkCreation organization policy constraint at the organization level.
B. Create a cron job to trigger a daily Cloud Function to automatically delete all default networks for each project.
C. Grant your users the IAM Owner role at the organization level. Create a VPC Service Controls perimeter around the project that restricts the compute.googleapis.com API.
D. Only allow your users to use your CI/CD pipeline with a predefined set of infrastructure templates they can deploy to skip the creation of the default networks.
Answer: A
Explanation
https://cloud.google.com/resource-manager/docs/organization-policy/org-policy-constraints
constraints/compute.skipDefaultNetworkCreation: this boolean constraint skips the creation of the default network and related resources during Google Cloud Platform project resource creation where this constraint is set to True. By default, a default network and supporting resources are automatically created when creating a project resource.

NO.36 An organization is starting to move its infrastructure from its on-premises environment to Google Cloud Platform (GCP). The first step the organization wants to take is to migrate its ongoing data backup and disaster recovery solutions to GCP. The organization's on-premises production environment is going to be the next phase for migration to GCP. Stable networking connectivity between the on-premises environment and GCP is also being implemented. Which GCP solution should the organization use?
A. BigQuery using a data pipeline job with continuous updates via Cloud VPN
B. Cloud Storage using a scheduled task and gsutil via Cloud Interconnect
C. Compute Engine Virtual Machines using Persistent Disk via Cloud Interconnect
D. Cloud Datastore using regularly scheduled batch upload jobs via Cloud VPN
Answer: B
Explanation
https://cloud.google.com/solutions/dr-scenarios-for-data#production_environment_is_on-premises
https://medium.com/@pvergadia/cold-disaster-recovery-on-google-cloud-for-applications-running-on-premises-1

NO.37 A customer needs an alternative to storing their plain text secrets in their source-code management (SCM) system. How should the customer achieve this using Google Cloud Platform?
A. Use Cloud Source Repositories, and store secrets in Cloud SQL.
B. Encrypt the secrets with a Customer-Managed Encryption Key (CMEK), and store them in Cloud Storage.
C. Run the Cloud Data Loss Prevention API to scan the secrets, and store them in Cloud SQL.
D. Deploy the SCM to a Compute Engine VM with local SSDs, and enable preemptible VMs.
Answer: B
NO.38 Your privacy team uses crypto-shredding (deleting encryption keys) as a strategy to delete personally identifiable information (PII). You need to implement this practice on Google Cloud while still utilizing the majority of the platform's services and minimizing operational overhead. What should you do?
A. Use client-side encryption before sending data to Google Cloud, and delete encryption keys on-premises.
B. Use Cloud External Key Manager to delete specific encryption keys.
C. Use customer-managed encryption keys to delete specific encryption keys.
D. Use Google default encryption to delete specific encryption keys.
Answer: C
Explanation
https://cloud.google.com/sql/docs/mysql/cmek
"You might have situations where you want to permanently destroy data encrypted with CMEK. To do this, you destroy the customer-managed encryption key version. You can't destroy the keyring or key, but you can destroy key versions of the key."

NO.39 You manage a mission-critical workload for your organization, which is in a highly regulated industry. The workload uses Compute Engine VMs to analyze and process the sensitive data after it is uploaded to Cloud Storage from the endpoint computers. Your compliance team has detected that this workload does not meet the data protection requirements for sensitive data. You need to meet these requirements:
Manage the data encryption key (DEK) outside the Google Cloud boundary.
Maintain full control of encryption keys through a third-party provider.
Encrypt the sensitive data before uploading it to Cloud Storage.
Decrypt the sensitive data during processing in the Compute Engine VMs.
Encrypt the sensitive data in memory while in use in the Compute Engine VMs.
What should you do? (Choose two.)
A. Create a VPC Service Controls service perimeter across your existing Compute Engine VMs and Cloud Storage buckets.
B. Migrate the Compute Engine VMs to Confidential VMs to access the sensitive data.
C. Configure Cloud External Key Manager to encrypt the sensitive data before it is uploaded to Cloud Storage, and decrypt the sensitive data after it is downloaded into your VMs.
D. Create Confidential VMs to access the sensitive data.
E. Configure Customer-Managed Encryption Keys to encrypt the sensitive data before it is uploaded to Cloud Storage, and decrypt the sensitive data after it is downloaded into your VMs.
Answer: C D
Explanation
https://cloud.google.com/confidential-computing/confidential-vm/docs/creating-cvm-instance#considerations
Confidential VM does not support live migration. You can only enable Confidential Computing on a VM when you first create the instance.
https://cloud.google.com/confidential-computing/confidential-vm/docs/creating-cvm-instance

NO.40 How should a customer reliably deliver Stackdriver logs from GCP to their on-premises SIEM system?
A. Send all logs to the SIEM system via an existing protocol such as syslog.
B. Configure every project to export all their logs to a common BigQuery dataset, which will be queried by the SIEM system.
C. Configure Organizational Log Sinks to export logs to a Cloud Pub/Sub topic, which will be sent to the SIEM via Dataflow.
D. Build a connector for the SIEM to query for all logs in real time from the GCP RESTful JSON APIs.
Answer: C
Explanation
Scenarios for exporting Cloud Logging data: Splunk. This scenario shows how to export selected logs from Cloud Logging to Pub/Sub for ingestion into Splunk. Splunk is a security information and event management (SIEM) solution that supports several ways of ingesting data, such as receiving streaming data out of Google Cloud through Splunk HTTP Event Collector (HEC) or by fetching data from Google Cloud APIs through Splunk Add-on for Google Cloud. Using the Pub/Sub to Splunk Dataflow template, you can natively forward logs and events from a Pub/Sub topic into Splunk HEC. If Splunk HEC is not available in your Splunk deployment, you can use the Add-on to collect the logs and events from the Pub/Sub topic.
https://cloud.google.com/solutions/exporting-stackdriver-logging-for-splunk

NO.41 A customer wants to run a batch processing system on VMs and store the output files in a Cloud Storage bucket. The networking and security teams have decided that no VMs may reach the public internet. How should this be accomplished?
A. Create a firewall rule to block internet traffic from the VM.
B. Provision a NAT Gateway to access the Cloud Storage API endpoint.
C. Enable Private Google Access on the VPC.
D. Mount a Cloud Storage bucket as a local filesystem on every VM.
Answer: C
Explanation
https://cloud.google.com/vpc/docs/private-google-access

NO.42 You need to centralize your team's logs for production projects. You want your team to be able to search and analyze the logs using Logs Explorer. What should you do?
A. Enable a Cloud Monitoring workspace, and add the production projects to be monitored.
B. Use Logs Explorer at the organization level and filter for production project logs.
C. Create an aggregate org sink at the parent folder of the production projects, and set the destination to a Cloud Storage bucket.
D. Create an aggregate org sink at the parent folder of the production projects, and set the destination to a logs bucket.
Answer: D
Explanation
https://cloud.google.com/logging/docs/export/aggregated_sinks#supported-destinations
You can use aggregated sinks to route logs within or between the same organizations and folders to the following destinations: another Cloud Logging bucket, i.e. log entries held in Cloud Logging log buckets.
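A hedged sketch of answer D with the google-cloud-logging Python client: an aggregated sink on the parent folder routes every child project's logs into one central Cloud Logging bucket, which Logs Explorer can then search. The folder ID, central project, and bucket name are placeholders:

```python
from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.cloud.logging_v2.types import LogSink

client = ConfigServiceV2Client()

sink = LogSink(
    name="prod-aggregated-sink",
    # Destination is a Cloud Logging bucket, so entries stay queryable
    # in Logs Explorer (unlike a Cloud Storage destination).
    destination=(
        "logging.googleapis.com/projects/central-logging"
        "/locations/global/buckets/prod-logs"
    ),
    include_children=True,  # pick up every project under the folder
)

client.create_sink(request={"parent": "folders/123456789", "sink": sink})
```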
NO.43 A customer's data science group wants to use Google Cloud Platform (GCP) for their analytics workloads. Company policy dictates that all data must be company-owned and all user authentications must go through their own Security Assertion Markup Language (SAML) 2.0 Identity Provider (IdP). The Infrastructure Operations Systems Engineer was trying to set up Cloud Identity for the customer and realized that their domain was already being used by G Suite. How should you best advise the Systems Engineer to proceed with the least disruption?
A. Contact Google Support and initiate the Domain Contestation Process to use the domain name in your new Cloud Identity domain.
B. Register a new domain name, and use that for the new Cloud Identity domain.
C. Ask Google to provision the data science manager's account as a Super Administrator in the existing domain.
D. Ask customer's management to discover any other uses of Google managed services, and work with the existing Super Administrator.
Answer: D
Explanation
https://support.google.com/cloudidentity/answer/7389973

NO.44 You are the project owner for a regulated workload that runs in a project you own and manage as an Identity and Access Management (IAM) admin. For an upcoming audit, you need to provide access reviews evidence. Which tool should you use?
A. Policy Troubleshooter
B. Policy Analyzer
C. IAM Recommender
D. Policy Simulator
Answer: B
Explanation
https://cloud.google.com/policy-intelligence/docs/policy-analyzer-overview
Policy Analyzer lets you find out which principals (for example, users, service accounts, groups, and domains) have what access to which Google Cloud resources based on your IAM allow policies.

NO.45 You have stored company-approved compute images in a single Google Cloud project that is used as an image repository. This project is protected with VPC Service Controls and exists in the perimeter along with other projects in your organization. This lets other projects deploy images from the image repository project. A team requires deploying a third-party disk image that is stored in an external Google Cloud organization. You need to grant read access to the disk image so that it can be deployed into the perimeter. What should you do?
A. 1. Update the perimeter. 2. Configure the egressTo field to set identityType to any_identity. 3. Configure the egressFrom field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.
B. Allow the external project by using the organizational policy constraints/compute.trustedImageProjects.
C. 1. Update the perimeter. 2. Configure the egressTo field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com. 3. Configure the egressFrom field to set identityType to any_identity.
D. 1. Update the perimeter. 2. Configure the ingressFrom field to set identityType to any_identity. 3. Configure the ingressTo field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.
Answer: A

NO.46 An organization is migrating from their current on-premises productivity software systems to G Suite. Some network security controls were in place that were mandated by a regulatory body in their region for their previous on-premises system. The organization's risk team wants to ensure that network security controls are maintained and effective in G Suite. A security architect supporting this migration has been asked to ensure that network security controls are in place as part of the new shared responsibility model between the organization and Google Cloud. What solution would help meet the requirements?
A. Ensure that firewall rules are in place to meet the required controls.
B. Set up Cloud Armor to ensure that network security controls can be managed for G Suite.
C. Network security is a built-in solution and Google Cloud's responsibility for SaaS products like G Suite.
D. Set up an array of Virtual Private Cloud (VPC) networks to control network security as mandated by the relevant regulation.
Answer: C
Explanation
https://gsuite.google.com/learn-more/security/security-whitepaper/page-1.html
Shared responsibility, "Security of the Cloud": GCP is responsible for protecting the infrastructure that runs all of the services offered in the GCP Cloud. This infrastructure is composed of the hardware, software, networking, and facilities that run GCP Cloud services.
NO.47 You are the security admin of your company. Your development team creates multiple GCP projects under the "implementation" folder for several dev, staging, and production workloads. You want to prevent data exfiltration by malicious insiders or compromised code by setting up a security perimeter. However, you do not want to restrict communication between the projects. What should you do?
A. Use a Shared VPC to enable communication between all projects, and use firewall rules to prevent data exfiltration.
B. Create access levels in Access Context Manager to prevent data exfiltration, and use a Shared VPC for communication between projects.
C. Use an infrastructure-as-code software tool to set up a single service perimeter and to deploy a Cloud Function that monitors the "implementation" folder via Stackdriver and Cloud Pub/Sub. When the function notices that a new project is added to the folder, it executes Terraform to add the new project to the associated perimeter.
D. Use an infrastructure-as-code software tool to set up three different service perimeters for dev, staging, and prod and to deploy a Cloud Function that monitors the "implementation" folder via Stackdriver and Cloud Pub/Sub. When the function notices that a new project is added to the folder, it executes Terraform to add the new project to the respective perimeter.
Answer: C
Explanation
https://cloud.google.com/vpc-service-controls/docs/overview#benefits
https://github.com/terraform-google-modules/terraform-google-vpc-service-controls/tree/master/examples/autom

NO.48 Your organization's customers must scan and upload their contract and driver's license into a web portal in Cloud Storage. You must remove all personally identifiable information (PII) from files that are older than 12 months. Also, you must archive the anonymized files for retention purposes. What should you do?
A. Set a time to live (TTL) of 12 months for the files in the Cloud Storage bucket that removes PII and moves the files to the archive storage class.
B. Create a Cloud Data Loss Prevention (DLP) inspection job that de-identifies PII in files created more than 12 months ago and archives them to another Cloud Storage bucket. Delete the original files.
C. Schedule a Cloud Key Management Service (KMS) rotation period of 12 months for the encryption keys of the Cloud Storage files containing PII to de-identify them. Delete the original keys.
D. Configure the Autoclass feature of the Cloud Storage bucket to de-identify PII. Archive the files that are older than 12 months. Delete the original files.
Answer: B

NO.49 Users are reporting an outage on your public-facing application that is hosted on Compute Engine. You suspect that a recent change to your firewall rules is responsible. You need to test whether your firewall rules are working properly. What should you do?
A. Enable Firewall Rules Logging on the latest rules that were changed. Use Logs Explorer to analyze whether the rules are working correctly.
B. Connect to a bastion host in your VPC. Use a network traffic analyzer to determine at which point your requests are being blocked.
C. In a pre-production environment, disable all firewall rules individually to determine which one is blocking user traffic.
D. Enable VPC Flow Logs in your VPC. Use Logs Explorer to analyze whether the rules are working correctly.
Answer: A
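Answer A requires switching logging on for the suspect rules first, since Firewall Rules Logging is disabled by default. A minimal sketch, assuming placeholder project and rule names, that enables it via the google-cloud-compute Python client:

```python
from google.cloud import compute_v1

client = compute_v1.FirewallsClient()
project, rule_name = "my-project", "allow-public-http"  # placeholders

# Patch only the log_config field of the existing rule to enable logging.
patch = compute_v1.Firewall(
    log_config=compute_v1.FirewallLogConfig(enable=True)
)
client.patch(
    project=project, firewall=rule_name, firewall_resource=patch
).result()

# Matching connection records then appear in Logs Explorer under the log
# name "projects/my-project/logs/compute.googleapis.com%2Ffirewall".
```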
NO.50 A customer terminates an engineer and needs to make sure the engineer's Google account is automatically deprovisioned. What should the customer do?
A. Use the Cloud SDK with their directory service to remove their IAM permissions in Cloud Identity.
B. Use the Cloud SDK with their directory service to provision and deprovision users from Cloud Identity.
C. Configure Cloud Directory Sync with their directory service to provision and deprovision users from Cloud Identity.
D. Configure Cloud Directory Sync with their directory service to remove their IAM permissions in Cloud Identity.
Answer: C
Explanation
https://cloud.google.com/identity/solutions/automate-user-provisioning#cloud_identity_automated_provisioning
"Cloud Identity has a catalog of automated provisioning connectors, which act as a bridge between Cloud Identity and third-party cloud apps."

NO.51 A customer wants to move their sensitive workloads to a Compute Engine-based cluster using Managed Instance Groups (MIGs). The jobs are bursty and must be completed quickly. They have a requirement to be able to manage and rotate the encryption keys. Which boot disk encryption solution should you use on the cluster to meet this customer's requirements?
A. Customer-supplied encryption keys (CSEK)
B. Customer-managed encryption keys (CMEK) using Cloud Key Management Service (KMS)
C. Encryption by default
D. Pre-encrypting files before transferring to Google Cloud Platform (GCP) for analysis
Answer: B
Explanation
Reference: https://cloud.google.com/kubernetes-engine/docs/how-to/dynamic-provisioning-cmek

NO.52 Your organization recently activated the Security Command Center (SCC) standard tier. There are a few Cloud Storage buckets that were accidentally made accessible to the public. You need to investigate the impact of the incident and remediate it. What should you do?
A. 1. Remove the Identity and Access Management (IAM) binding granting access to allUsers from the buckets. 2. Apply the organization policy storage.uniformBucketLevelAccess to prevent regressions. 3. Query the data access logs to report on unauthorized access.
B. 1. Change bucket permissions to limit access. 2. Query the data access audit logs for any unauthorized access to the buckets. 3. After the misconfiguration is corrected, mute the finding in the Security Command Center.
C. 1. Change permissions to limit access for authorized users. 2. Enforce a VPC Service Controls perimeter around all the production projects to immediately stop any unauthorized access. 3. Review the administrator activity audit logs to report on any unauthorized access.
D. 1. Change the bucket permissions to limit access. 2. Query the bucket's usage logs to report on unauthorized access to the data. 3. Enforce the organization policy storage.publicAccessPrevention to avoid regressions.
Answer: B

NO.53 A customer wants to deploy a large number of 3-tier web applications on Compute Engine. How should the customer ensure authenticated network separation between the different tiers of the application?
A. Run each tier in its own project, and segregate using project labels.
B. Run each tier with a different Service Account (SA), and use SA-based firewall rules.
C. Run each tier in its own subnet, and use subnet-based firewall rules.
D. Run each tier with its own VM tags, and use tag-based firewall rules.
Answer: B
Explanation
"Isolate VMs using service accounts when possible": "even though it is possible to use tags for target filtering in this manner, we recommend that you use service accounts where possible. Target tags are not access-controlled and can be changed by someone with the instanceAdmin role while VMs are in service. Service accounts are access-controlled, meaning that a specific user must be explicitly authorized to use a service account. There can only be one service account per instance, whereas there can be multiple tags. Also, service accounts assigned to a VM can only be changed when the VM is stopped."
https://cloud.google.com/solutions/best-practices-vpc-design#isolate-vms-service-accounts
NO.54 You have been tasked with inspecting IP packet data for invalid or malicious content. What should you do?
A. Use Packet Mirroring to mirror traffic to and from particular VM instances. Perform inspection using security software that analyzes the mirrored traffic.
B. Enable VPC Flow Logs for all subnets in the VPC. Perform inspection on the Flow Logs data using Cloud Logging.
C. Configure the Fluentd agent on each VM instance within the VPC. Perform inspection on the log data using Cloud Logging.
D. Configure Google Cloud Armor access logs to perform inspection on the log data.
Answer: A
Explanation
https://cloud.google.com/vpc/docs/packet-mirroring
Packet Mirroring clones the traffic of specified instances in your Virtual Private Cloud (VPC) network and forwards it for examination. Packet Mirroring captures all traffic and packet data, including payloads and headers.

NO.55 You are designing a new governance model for your organization's secrets that are stored in Secret Manager. Currently, secrets for Production and Non-Production applications are stored and accessed using service accounts. Your proposed solution must:
Provide granular access to secrets.
Give you control over the rotation schedules for the encryption keys that wrap your secrets.
Maintain environment separation.
Provide ease of management.
Which approach should you take?
A. 1. Use separate Google Cloud projects to store Production and Non-Production secrets. 2. Enforce access control to secrets using project-level Identity and Access Management (IAM) bindings. 3. Use customer-managed encryption keys to encrypt secrets.
B. 1. Use a single Google Cloud project to store both Production and Non-Production secrets. 2. Enforce access control to secrets using secret-level Identity and Access Management (IAM) bindings. 3. Use Google-managed encryption keys to encrypt secrets.
C. 1. Use separate Google Cloud projects to store Production and Non-Production secrets. 2. Enforce access control to secrets using secret-level Identity and Access Management (IAM) bindings. 3. Use Google-managed encryption keys to encrypt secrets.
D. 1. Use a single Google Cloud project to store both Production and Non-Production secrets. 2. Enforce access control to secrets using project-level Identity and Access Management (IAM) bindings. 3. Use customer-managed encryption keys to encrypt secrets.
Answer: A
Explanation
Provide granular access to secrets: enforce access control to secrets using project-level Identity and Access Management (IAM) bindings.
Give you control over the rotation schedules for the encryption keys that wrap your secrets: use customer-managed encryption keys to encrypt secrets.
Maintain environment separation: use separate Google Cloud projects to store Production and Non-Production secrets.
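A hedged sketch of answer A's key points with the google-cloud-secret-manager Python client: a secret in the dedicated Production project, wrapped with a customer-managed key whose rotation schedule you control in Cloud KMS. The project, key, and secret names are placeholders:

```python
from google.cloud import secretmanager

client = secretmanager.SecretManagerServiceClient()
prod_project = "secrets-prod"  # separate project per environment
kms_key = (
    "projects/secrets-prod/locations/us-central1"
    "/keyRings/secrets/cryptoKeys/prod-wrapping-key"  # CMEK you rotate
)

secret = client.create_secret(
    request={
        "parent": f"projects/{prod_project}",
        "secret_id": "payments-db-password",
        "secret": {
            "replication": {
                # CMEK is set per replica; its location must match the key's.
                "user_managed": {
                    "replicas": [
                        {
                            "location": "us-central1",
                            "customer_managed_encryption": {
                                "kms_key_name": kms_key
                            },
                        }
                    ]
                }
            }
        },
    }
)
print(secret.name)
```

Project-level IAM bindings on secrets-prod (per answer A) then grant the Production service accounts access, keeping Non-Production identities fully separate.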
NO.56 Your organization's Google Cloud VMs are deployed via an instance template that configures them with a public IP address in order to host web services for external users. The VMs reside in a service project that is attached to a host (VPC) project containing one custom Shared VPC for the VMs. You have been asked to reduce the exposure of the VMs to the internet while continuing to serve external users. You have already recreated the instance template without a public IP address configuration to launch the managed instance group (MIG). What should you do?
A. Deploy a Cloud NAT Gateway in the service project for the MIG.
B. Deploy a Cloud NAT Gateway in the host (VPC) project for the MIG.
C. Deploy an external HTTP(S) load balancer in the service project with the MIG as a backend.
D. Deploy an external HTTP(S) load balancer in the host (VPC) project with the MIG as a backend.
Answer: D
Explanation
https://cloud.google.com/load-balancing/docs/https#shared-vpc
While you can create all the load balancing components and backends in the Shared VPC host project, this model does not separate network administration and service development responsibilities.

NO.57 You discovered that sensitive personally identifiable information (PII) is being ingested to your Google Cloud environment in the daily ETL process from an on-premises environment to your BigQuery datasets. You need to redact this data to obfuscate the PII, but need to re-identify it for data analytics purposes. Which components should you use in your solution? (Choose two.)
A. Secret Manager
B. Cloud Key Management Service
C. Cloud Data Loss Prevention with cryptographic hashing
D. Cloud Data Loss Prevention with automatic text redaction
E. Cloud Data Loss Prevention with deterministic encryption using AES-SIV
Answer: B E
Explanation
B: you need KMS to store the CryptoKey. https://cloud.google.com/dlp/docs/reference/rest/v2/projects.deidentifyTemplates
E: for the de-identification you need to use CryptoReplaceFfxFpeConfig or CryptoDeterministicConfig. https://cloud.google.com/dlp/docs/reference/rest/v2/projects.deidentifyTemplates
https://cloud.google.com/dlp/docs/deidentify-sensitive-data
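A minimal sketch of option E with the google-cloud-dlp client is shown below: deterministic encryption produces a surrogate token that can later be reversed with a matching reidentify_content call. The project, KMS key name, and wrapped-key bytes are hypothetical placeholders.

```python
# Sketch: reversible tokenization with Cloud DLP deterministic encryption (AES-SIV).
# Project, KMS key name, and wrapped-key bytes are hypothetical placeholders; the
# wrapped key would come from encrypting a data key with your Cloud KMS key.
from google.cloud import dlp_v2

PROJECT = "my-project"  # hypothetical
KMS_KEY = "projects/my-project/locations/global/keyRings/dlp/cryptoKeys/dlp-key"
WRAPPED_KEY = b"..."    # placeholder for the KMS-wrapped data key

dlp = dlp_v2.DlpServiceClient()
response = dlp.deidentify_content(
    request={
        "parent": f"projects/{PROJECT}",
        "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
        "deidentify_config": {
            "info_type_transformations": {
                "transformations": [
                    {
                        "info_types": [{"name": "EMAIL_ADDRESS"}],
                        "primitive_transformation": {
                            "crypto_deterministic_config": {
                                "crypto_key": {
                                    "kms_wrapped": {
                                        "wrapped_key": WRAPPED_KEY,
                                        "crypto_key_name": KMS_KEY,
                                    }
                                },
                                # The surrogate annotation is what makes the token
                                # re-identifiable later via reidentify_content.
                                "surrogate_info_type": {"name": "EMAIL_TOKEN"},
                            }
                        },
                    }
                ]
            }
        },
        "item": {"value": "Contact jane.doe@example.com for details."},
    }
)
print(response.item.value)  # e.g. "Contact EMAIL_TOKEN(52):... for details."
```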
NO.58 You are working with protected health information (PHI) for an electronic health record system. The privacy officer is concerned that sensitive data is stored in the analytics system. You are tasked with anonymizing the sensitive data in a way that is not reversible. Also, the anonymized data should not preserve the character set and length. Which Google Cloud solution should you use?
A. Cloud Data Loss Prevention with deterministic encryption using AES-SIV
B. Cloud Data Loss Prevention with format-preserving encryption
C. Cloud Data Loss Prevention with cryptographic hashing
D. Cloud Data Loss Prevention with Cloud Key Management Service wrapped cryptographic keys
Answer: C

NO.59 You are a Cloud Identity administrator for your organization. In your Google Cloud environment, groups are used to manage user permissions. Each application team has a dedicated group. Your team is responsible for creating these groups, and the application teams can manage the team members on their own through the Google Cloud console. You must ensure that the application teams can only add users from within your organization to their groups. What should you do?
A. Change the configuration of the relevant groups in the Google Workspace Admin console to prevent external users from being added to the group.
B. Set an Identity and Access Management (IAM) policy that includes a condition that restricts group membership to user principals that belong to your organization.
C. Define an Identity and Access Management (IAM) deny policy that denies the assignment of principals that are outside your organization to the groups in scope.
D. Export the Cloud Identity logs to BigQuery. Configure an alert for external members added to groups. Have the alert trigger a Cloud Function instance that removes the external members from the group.
Answer: B

NO.60 You need to set up two network segments: one with an untrusted subnet and the other with a trusted subnet. You want to configure a virtual appliance such as a next-generation firewall (NGFW) to inspect all traffic between the two network segments. How should you design the network to inspect the traffic?
A. 1. Set up one VPC with two subnets: one trusted and the other untrusted. 2. Configure a custom route for all traffic (0.0.0.0/0) pointed to the virtual appliance.
B. 1. Set up one VPC with two subnets: one trusted and the other untrusted. 2. Configure a custom route for all RFC 1918 subnets pointed to the virtual appliance.
C. 1. Set up two VPC networks: one trusted and the other untrusted, and peer them together. 2. Configure a custom route on each network pointed to the virtual appliance.
D. 1. Set up two VPC networks: one trusted and the other untrusted. 2. Configure a virtual appliance using multiple network interfaces, with each interface connected to one of the VPC networks.
Answer: D
Explanation
Multiple network interfaces. The simplest way to connect multiple VPC networks through a virtual appliance is by using multiple network interfaces, with each interface connecting to one of the VPC networks. Internet and on-premises connectivity is provided over one or two separate network interfaces. With many NGFW products, internet connectivity is connected through an interface marked as untrusted in the NGFW software.
https://cloud.google.com/architecture/best-practices-vpc-design#l7
This architecture has multiple VPC networks that are bridged by an L7 next-generation firewall (NGFW) appliance, which functions as a multi-NIC bridge between VPC networks. An untrusted, outside VPC network is introduced to terminate hybrid interconnects and internet-based connections that terminate on the outside leg of the L7 NGFW for inspection. There are many variations on this design, but the key principle is to filter traffic through the firewall before the traffic reaches trusted VPC networks.

NO.61 You are a security engineer at a finance company. Your organization plans to store data on Google Cloud, but your leadership team is worried about the security of their highly sensitive data. Specifically, your company is concerned about internal Google employees' ability to access your company's data on Google Cloud. What solution should you propose?
A. Use customer-managed encryption keys.
B. Use Google's Identity and Access Management (IAM) service to manage access controls on Google Cloud.
C. Enable Admin Activity logs to monitor access to resources.
D. Enable Access Transparency logs with Access Approval requests for Google employees.
Answer: D
Explanation
https://cloud.google.com/access-transparency
Access Approval: explicitly approve access to your data or configurations on Google Cloud. Access Approval requests, when combined with Access Transparency logs, can be used to audit an end-to-end chain from support ticket to access request to approval, to eventual access.

NO.62 Your organization's record data exists in Cloud Storage. You must retain all record data for at least seven years. This policy must be permanent. What should you do?
A. 1. Identify buckets with record data. 2. Apply a retention policy, and set it to retain for seven years. 3. Monitor the buckets by using log-based alerts to ensure that no modifications to the retention policy occur.
B. 1. Identify buckets with record data. 2. Apply a retention policy, and set it to retain for seven years. 3. Remove any Identity and Access Management (IAM) roles that contain the storage.buckets.update permission.
C. 1. Identify buckets with record data. 2. Enable the bucket policy only setting to ensure that data is retained. 3. Enable bucket lock.
D. 1. Identify buckets with record data. 2. Apply a retention policy, and set it to retain for seven years. 3. Enable bucket lock.
Answer: D
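A minimal sketch of the retention-policy-plus-Bucket-Lock steps with the google-cloud-storage client, assuming a hypothetical bucket name:

```python
# Sketch: 7-year retention policy plus Bucket Lock with google-cloud-storage.
# The bucket name is a placeholder value.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("records-bucket")  # hypothetical bucket with record data

bucket.retention_period = 7 * 365 * 24 * 60 * 60  # seven years, in seconds
bucket.patch()

# Locking is irreversible: the policy can no longer be shortened or removed.
bucket.lock_retention_policy()
```

Once lock_retention_policy() succeeds, the retention period can be increased but never reduced or removed, which is what makes the policy permanent.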
NO.63 Your organization uses BigQuery to process highly sensitive, structured datasets. Following the "need to know" principle, you need to create the Identity and Access Management (IAM) design to meet the needs of these users:
* Business user: must access curated reports.
* Data engineer: must administer the data lifecycle in the platform.
* Security operator: must review user activity on the data platform.
What should you do?
A. Configure data access logs for the BigQuery service, and grant the Project Viewer role to security operators.
B. Generate a CSV data file based on the business users' needs, and send the data to their email addresses.
C. Create curated tables in a separate dataset, and assign the role roles/bigquery.dataViewer.
D. Set row-based access control based on the "region" column, and filter the records from the United States for data engineers.
Answer: C
Explanation
This option directly addresses the needs of the business user who must access curated reports. By creating curated tables in a separate dataset, you can control access to specific data. Assigning the roles/bigquery.dataViewer role allows the business user to view the data in BigQuery.
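As an illustration, a Python sketch with the google-cloud-bigquery client granting a hypothetical business-user group read access on a curated dataset (the basic READER dataset role corresponds to roles/bigquery.dataViewer):

```python
# Sketch: grant read-only access on a curated dataset to a business-user group.
# Dataset and group names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()
dataset = client.get_dataset("my-project.curated_reports")  # hypothetical

entries = list(dataset.access_entries)
# The basic READER role on a dataset corresponds to roles/bigquery.dataViewer.
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="groupByEmail",
        entity_id="business-users@example.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```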
NO.64 Your security team wants to reduce the risk of user-managed keys being mismanaged and compromised. To achieve this, you need to prevent developers from creating user-managed service account keys for projects in their organization. How should you enforce this?
A. Configure Secret Manager to manage service account keys.
B. Enable an organization policy to disable service accounts from being created.
C. Enable an organization policy to prevent service account keys from being created.
D. Remove the iam.serviceAccounts.getAccessToken permission from users.
Answer: C
Explanation
https://cloud.google.com/iam/docs/best-practices-for-managing-service-account-keys
"To prevent unnecessary usage of service account keys, use organization policy constraints: At the root of your organization's resource hierarchy, apply the Disable service account key creation and Disable service account key upload constraints to establish a default where service account keys are disallowed. When needed, override one of the constraints for selected projects to re-enable service account key creation or upload."
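A minimal sketch of enforcing this boolean constraint at the organization level, assuming the google-cloud-org-policy client; the organization ID is a hypothetical placeholder:

```python
# Sketch: enforce iam.disableServiceAccountKeyCreation at the organization level,
# assuming the google-cloud-org-policy client. The org ID is a placeholder value.
from google.cloud import orgpolicy_v2

client = orgpolicy_v2.OrgPolicyClient()
org_id = "123456789012"  # hypothetical organization ID

policy = orgpolicy_v2.Policy(
    name=f"organizations/{org_id}/policies/iam.disableServiceAccountKeyCreation",
    spec=orgpolicy_v2.PolicySpec(
        rules=[orgpolicy_v2.PolicySpec.PolicyRule(enforce=True)]
    ),
)
client.create_policy(
    request={"parent": f"organizations/{org_id}", "policy": policy}
)
```

Setting the constraint at the organization root makes the deny-by-default behavior inherit everywhere; individual projects can still be granted overrides if a legitimate key is ever needed.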
NO.65 You need to follow Google-recommended practices to leverage envelope encryption and encrypt data at the application layer. What should you do?
A. Generate a data encryption key (DEK) locally to encrypt the data, and generate a new key encryption key (KEK) in Cloud KMS to encrypt the DEK. Store both the encrypted data and the encrypted DEK.
B. Generate a data encryption key (DEK) locally to encrypt the data, and generate a new key encryption key (KEK) in Cloud KMS to encrypt the DEK. Store both the encrypted data and the KEK.
C. Generate a new data encryption key (DEK) in Cloud KMS to encrypt the data, and generate a key encryption key (KEK) locally to encrypt the key. Store both the encrypted data and the encrypted DEK.
D. Generate a new data encryption key (DEK) in Cloud KMS to encrypt the data, and generate a key encryption key (KEK) locally to encrypt the key. Store both the encrypted data and the KEK.
Answer: A
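The following Python sketch illustrates the flow in option A, pairing the `cryptography` package for the local DEK with Cloud KMS as the KEK; the key path is a hypothetical placeholder:

```python
# Sketch of envelope encryption: a local DEK encrypts the data, and Cloud KMS
# (the KEK) encrypts the DEK. The key path is a placeholder; requires the
# `cryptography` package alongside google-cloud-kms.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from google.cloud import kms

KEK = "projects/my-project/locations/global/keyRings/app/cryptoKeys/kek"  # hypothetical

def encrypt(plaintext: bytes) -> tuple[bytes, bytes, bytes]:
    dek = AESGCM.generate_key(bit_length=256)  # DEK generated locally
    nonce = os.urandom(12)
    ciphertext = AESGCM(dek).encrypt(nonce, plaintext, None)

    # Wrap the DEK with the KEK in Cloud KMS; store only the wrapped DEK.
    client = kms.KeyManagementServiceClient()
    wrapped_dek = client.encrypt(request={"name": KEK, "plaintext": dek}).ciphertext
    return ciphertext, nonce, wrapped_dek

def decrypt(ciphertext: bytes, nonce: bytes, wrapped_dek: bytes) -> bytes:
    client = kms.KeyManagementServiceClient()
    dek = client.decrypt(request={"name": KEK, "ciphertext": wrapped_dek}).plaintext
    return AESGCM(dek).decrypt(nonce, ciphertext, None)
```

Only the encrypted data and the wrapped DEK are persisted; the plaintext DEK exists in memory just long enough to encrypt or decrypt, which is the point of the envelope pattern.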
NO.66 Your team needs to prevent users from creating projects in the organization. Only the DevOps team should be allowed to create projects on behalf of the requester. Which two tasks should your team perform to handle this request? (Choose two.)
A. Remove all users from the Project Creator role at the organizational level.
B. Create an Organization Policy constraint, and apply it at the organizational level.
C. Grant the Project Editor role at the organizational level to a designated group of users.
D. Add a designated group of users to the Project Creator role at the organizational level.
E. Grant the Billing Account Creator role to the designated DevOps team.
Answer: A D
Explanation
https://cloud.google.com/resource-manager/docs/organization-policy/org-policy-constraints

NO.67 A customer's company has multiple business units. Each business unit operates independently, and each has their own engineering group. Your team wants visibility into all projects created within the company and wants to organize their Google Cloud Platform (GCP) projects based on different business units. Each business unit also requires separate sets of IAM permissions. Which strategy should you use to meet these needs?
A. Create an organization node, and assign folders for each business unit.
B. Establish standalone projects for each business unit, using gmail.com accounts.
C. Assign GCP resources in a project, with a label identifying which business unit owns the resource.
D. Assign GCP resources in a VPC for each business unit to separate network access.
Answer: A
Explanation
Refer to: https://cloud.google.com/resource-manager/docs/listing-all-resources
Also: https://wideops.com/mapping-your-organization-with-the-google-cloud-platform-resource-hierarchy/

NO.68 You have the following resource hierarchy. There is an organization policy at each node in the hierarchy as shown. Which load balancer types are denied in VPC A?
A. All load balancer types are denied in accordance with the global node's policy.
B. INTERNAL_TCP_UDP, INTERNAL_HTTP_HTTPS is denied in accordance with the folder's policy.
C. EXTERNAL_TCP_PROXY, EXTERNAL_SSL_PROXY are denied in accordance with the project's policy.
D. EXTERNAL_TCP_PROXY, EXTERNAL_SSL_PROXY, INTERNAL_TCP_UDP, and INTERNAL_HTTP_HTTPS are denied in accordance with the folder and project's policies.
Answer: D

NO.69 You manage your organization's Security Operations Center (SOC). You currently monitor and detect network traffic anomalies in your VPCs based on network logs. However, you want to explore your environment using network payloads and headers. Which Google Cloud product should you use?
A. Cloud IDS
B. VPC Service Controls logs
C. VPC Flow Logs
D. Google Cloud Armor
E. Packet Mirroring
Answer: E
Explanation
https://cloud.google.com/vpc/docs/packet-mirroring
Packet Mirroring clones the traffic of specified instances in your Virtual Private Cloud (VPC) network and forwards it for examination. Packet Mirroring captures all traffic and packet data, including payloads and headers.

NO.70 You are working with a client who plans to migrate their data to Google Cloud. You are responsible for recommending an encryption service to manage their encryption keys. You have the following requirements:
The master key must be rotated at least once every 45 days.
The solution that stores the master key must be FIPS 140-2 Level 3 validated.
The master key must be stored in multiple regions within the US for redundancy.
Which solution meets these requirements?
A. Customer-managed encryption keys with Cloud Key Management Service
B. Customer-managed encryption keys with Cloud HSM
C. Customer-supplied encryption keys
D. Google-managed encryption keys
Answer: B
Explanation
https://cloud.google.com/docs/security/key-management-deep-dive
https://cloud.google.com/kms/docs/faq
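As a sketch, the following Python snippet (google-cloud-kms) creates an HSM-protected key in the us multi-region with an automatic rotation period inside the 45-day requirement; project, key ring, and key names are hypothetical:

```python
# Sketch: HSM-protected key with a 30-day automatic rotation period (well inside
# the 45-day requirement), assuming a key ring already created in the "us"
# multi-region. Project, key ring, and key names are placeholder values.
import time
from google.cloud import kms

client = kms.KeyManagementServiceClient()
key_ring = "projects/my-project/locations/us/keyRings/finance"  # hypothetical

key = kms.CryptoKey(
    purpose=kms.CryptoKey.CryptoKeyPurpose.ENCRYPT_DECRYPT,
    version_template=kms.CryptoKeyVersionTemplate(
        protection_level=kms.ProtectionLevel.HSM,  # FIPS 140-2 Level 3 validated
        algorithm=kms.CryptoKeyVersion.CryptoKeyVersionAlgorithm.GOOGLE_SYMMETRIC_ENCRYPTION,
    ),
    rotation_period={"seconds": 30 * 24 * 60 * 60},
    next_rotation_time={"seconds": int(time.time()) + 30 * 24 * 60 * 60},
)
client.create_crypto_key(
    request={"parent": key_ring, "crypto_key_id": "master-key", "crypto_key": key}
)
```

The "us" multi-region location gives the in-US redundancy, the HSM protection level satisfies the FIPS requirement, and the rotation period satisfies the 45-day requirement.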
NO.71 A customer deploys an application to App Engine and needs to check for Open Web Application Security Project (OWASP) vulnerabilities. Which service should be used to accomplish this?
A. Cloud Armor
B. Google Cloud Audit Logs
C. Cloud Security Scanner
D. Forseti Security
Answer: C

NO.72 Your team wants to limit users with administrative privileges at the organization level. Which two roles should your team restrict? (Choose two.)
A. Organization Administrator
B. Super Admin
C. GKE Cluster Admin
D. Compute Admin
E. Organization Role Viewer
Answer: A B

NO.73 Your security team uses encryption keys to ensure confidentiality of user data. You want to establish a process to reduce the impact of a potentially compromised symmetric encryption key in Cloud Key Management Service (Cloud KMS). Which steps should your team take before an incident occurs? (Choose two.)
A. Disable and revoke access to compromised keys.
B. Enable automatic key version rotation on a regular schedule.
C. Manually rotate key versions on an ad hoc schedule.
D. Limit the number of messages encrypted with each key version.
E. Disable the Cloud KMS API.
Answer: B D
Explanation
As per the documentation, "Limiting the number of messages encrypted with the same key version helps prevent attacks enabled by cryptanalysis." https://cloud.google.com/kms/docs/key-rotation

NO.74 Your organization recently deployed a new application on Google Kubernetes Engine. You need to deploy a solution to protect the application. The solution has the following requirements:
Scans must run at least once per week
Must be able to detect cross-site scripting vulnerabilities
Must be able to authenticate using Google accounts
Which solution should you use?
A. Google Cloud Armor
B. Web Security Scanner
C. Security Health Analytics
D. Container Threat Detection
Answer: B

NO.75 You manage one of your organization's Google Cloud projects (Project A). A VPC Service Controls (VPC SC) perimeter is blocking API access requests to this project, including Pub/Sub. A resource running under a service account in another project (Project B) needs to collect messages from a Pub/Sub topic in your project. Project B is not included in a VPC SC perimeter. You need to provide access from Project B to the Pub/Sub topic in Project A using the principle of least privilege. What should you do?
A. Configure an ingress policy for the perimeter in Project A, and allow access for the service account in Project B to collect messages.
B. Create an access level that allows a developer in Project B to subscribe to the Pub/Sub topic that is located in Project A.
C. Create a perimeter bridge between Project A and Project B to allow the required communication between both projects.
D. Remove the Pub/Sub API from the list of restricted services in the perimeter configuration for Project A.
Answer: A

NO.76 You are a security administrator at your company. Per Google-recommended best practices, you implemented the domain restricted sharing organization policy to allow only required domains to access your projects. An engineering team is now reporting that users at an external partner outside your organization domain cannot be granted access to the resources in a project. How should you make an exception for your partner's domain while following the stated best practices?
A. Turn off the domain restricted sharing organization policy. Set the policy value to "Allow All."
B. Turn off the domain restricted sharing organization policy. Provide the external partners with the required permissions using Google's Identity and Access Management (IAM) service.
C. Turn off the domain restricted sharing organization policy. Add each partner's Google Workspace customer ID to a Google group, add the Google group as an exception under the organization policy, and then turn the policy back on.
D. Turn off the domain restricted sharing organization policy. Set the policy value to "Custom." Add each external partner's Cloud Identity or Google Workspace customer ID as an exception under the organization policy, and then turn the policy back on.
Answer: D
Explanation
https://cloud.google.com/resource-manager/docs/organization-policy/restricting-domains
The domain restriction constraint is a type of list constraint. Google Workspace customer IDs can be added and removed from the allowed_values list of a domain restriction constraint. The domain restriction constraint does not support denying values, and an organization policy can't be saved with IDs in the denied_values list. All domains associated with a Google Workspace account listed in the allowed_values will be allowed by the organization policy. All other domains will be denied by the organization policy.
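A sketch of the resulting policy update, assuming the google-cloud-org-policy client; the organization ID and Workspace customer IDs are hypothetical placeholders:

```python
# Sketch: re-enable the domain restriction constraint with the partner's
# Workspace customer ID in allowed_values, assuming the google-cloud-org-policy
# client. Organization ID and customer IDs are hypothetical placeholders.
from google.cloud import orgpolicy_v2

client = orgpolicy_v2.OrgPolicyClient()
org_id = "123456789012"  # hypothetical

policy = orgpolicy_v2.Policy(
    name=f"organizations/{org_id}/policies/iam.allowedPolicyMemberDomains",
    spec=orgpolicy_v2.PolicySpec(
        rules=[
            orgpolicy_v2.PolicySpec.PolicyRule(
                values=orgpolicy_v2.PolicySpec.PolicyRule.StringValues(
                    # Your own customer ID plus the partner's, both hypothetical.
                    allowed_values=["C01abcdef", "C02ghijkl"]
                )
            )
        ]
    ),
)
client.update_policy(request={"policy": policy})
```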
NO.77 An organization adopts Google Cloud Platform (GCP) for application hosting services and needs guidance on setting up password requirements for their Cloud Identity account. The organization has a password policy requirement that corporate employee passwords must have a minimum number of characters. Which Cloud Identity password guidelines can the organization use to inform their new requirements?
A. Set the minimum length for passwords to be 8 characters.
B. Set the minimum length for passwords to be 10 characters.
C. Set the minimum length for passwords to be 12 characters.
D. Set the minimum length for passwords to be 6 characters.
Answer: A
Explanation
The default minimum password length is 8 characters. https://support.google.com/cloudidentity/answer/33319?hl=en
https://support.google.com/cloudidentity/answer/139399?hl=en

NO.78 Applications often require access to "secrets" - small pieces of sensitive data at build or run time. The administrator managing these secrets on GCP wants to keep a track of "who did what, where, and when?" within their GCP projects. Which two log streams would provide the information that the administrator is looking for? (Choose two.)
A. Admin Activity logs
B. System Event logs
C. Data Access logs
D. VPC Flow logs
E. Agent logs
Answer: A C
Explanation
https://cloud.google.com/secret-manager/docs/audit-logging

NO.79 You manage a fleet of virtual machines (VMs) in your organization. You have encountered issues with a lack of patching in many VMs. You need to automate regular patching in your VMs and view the patch management data across multiple projects. What should you do? (Choose two.)
A. Deploy patches with VM Manager by using OS patch management.
B. View patch management data in VM Manager by using OS patch management.
C. Deploy patches with Security Command Center by using Rapid Vulnerability Detection.
D. View patch management data in a Security Command Center dashboard.
E. View patch management data in Artifact Registry.
Answer: A B
Explanation
https://cloud.google.com/compute/docs/os-patch-management
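For illustration, a minimal sketch that kicks off an on-demand patch job across a project's VMs with the google-cloud-os-config client (a recurring patch deployment would automate the same thing on a schedule); the project ID is a hypothetical placeholder:

```python
# Sketch: trigger an on-demand OS patch job across all VM Manager-managed VMs
# in a project with the OS Config API. The project ID is a placeholder value.
from google.cloud import osconfig_v1

client = osconfig_v1.OsConfigServiceClient()
job = client.execute_patch_job(
    request={
        "parent": "projects/my-project",       # hypothetical
        "description": "critical OS updates",
        "instance_filter": {"all": True},      # every VM Manager-managed VM
    }
)
print(job.name, job.state)
```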
NO.80 You are part of a security team investigating a compromised service account key. You need to audit which new resources were created by the service account. What should you do?
A. Query Data Access logs.
B. Query Admin Activity logs.
C. Query Access Transparency logs.
D. Query Stackdriver Monitoring Workspace.
Answer: B
Explanation
Admin Activity logs are always created to log entries for API calls or other actions that modify the configuration or metadata of resources. For example, these logs record when users create VM instances or change Identity and Access Management permissions.
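A minimal sketch of such a query with the google-cloud-logging client, filtering Admin Activity entries down to a hypothetical compromised service account:

```python
# Sketch: list Admin Activity audit entries written by a specific (potentially
# compromised) service account. Project and service account are placeholders.
from google.cloud import logging

client = logging.Client(project="my-project")  # hypothetical
log_filter = (
    'logName="projects/my-project/logs/cloudaudit.googleapis.com%2Factivity" '
    'AND protoPayload.authenticationInfo.principalEmail='
    '"suspect-sa@my-project.iam.gserviceaccount.com"'
)
for entry in client.list_entries(filter_=log_filter):
    # Each entry records an API call that created or modified a resource.
    print(entry.timestamp, entry.payload.get("methodName"))
```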
NO.81 You want to limit the images that can be used as the source for boot disks. These images will be stored in a dedicated project. What should you do?
A. Use the Organization Policy Service to create a compute.trustedImageProjects constraint on the organization level. List the trusted project as the whitelist in an allow operation.
B. Use the Organization Policy Service to create a compute.trustedImageProjects constraint on the organization level. List the trusted projects as the exceptions in a deny operation.
C. In Resource Manager, edit the project permissions for the trusted project. Add the organization as a member with the role Compute Image User.
D. In Resource Manager, edit the organization permissions. Add the project ID as a member with the role Compute Image User.
Answer: A
Explanation
https://cloud.google.com/compute/docs/images/restricting-image-access#trusted_images

NO.82 Which type of load balancer should you use to maintain client IP by default while using the standard network tier?
A. SSL Proxy
B. TCP Proxy
C. Internal TCP/UDP
D. TCP/UDP Network
Answer: D
Explanation
https://cloud.google.com/load-balancing/docs/load-balancing-overview
https://cloud.google.com/load-balancing/docs/load-balancing-overview#choosing_a_load_balancer

NO.83 You are migrating an application into the cloud. The application will need to read data from a Cloud Storage bucket. Due to local regulatory requirements, you need to hold the key material used for encryption fully under your control, and you require a valid rationale for accessing the key material. What should you do?
A. Encrypt the data in the Cloud Storage bucket by using customer-managed encryption keys. Configure an IAM deny policy for unauthorized groups.
B. Encrypt the data in the Cloud Storage bucket by using customer-managed encryption keys backed by a Cloud Hardware Security Module (HSM). Enable data access logs.
C. Generate a key in your on-premises environment and store it in a Hardware Security Module (HSM) that is managed on-premises. Use this key as an external key in the Cloud Key Management Service (KMS). Activate Key Access Justifications (KAJ) and set the external key system to reject unauthorized accesses.
D. Generate a key in your on-premises environment to encrypt the data before you upload the data to the Cloud Storage bucket. Upload the key to the Cloud Key Management Service (KMS). Activate Key Access Justifications (KAJ) and have the external key system reject unauthorized accesses.
Answer: C
Explanation
By generating a key in your on-premises environment and storing it in an HSM that you manage, you're ensuring that the key material is fully under your control. Using the key as an external key in Cloud KMS allows you to use the key with Google Cloud services without having the key stored on Google Cloud. Activating Key Access Justifications (KAJ) provides a reason every time the key is accessed, and you can configure the external key system to reject unauthorized access attempts.

NO.84 You are the Security Admin in your company. You want to synchronize all security groups that have an email address from your LDAP directory in Cloud IAM. What should you do?
A. Configure Google Cloud Directory Sync to sync security groups using LDAP search rules that have "user email address" as the attribute to facilitate one-way sync.
B. Configure Google Cloud Directory Sync to sync security groups using LDAP search rules that have "user email address" as the attribute to facilitate bidirectional sync.
C. Use a management tool to sync the subset based on the email address attribute. Create a group in the Google domain. A group created in a Google domain will automatically have an explicit Google Cloud Identity and Access Management (IAM) role.
D. Use a management tool to sync the subset based on the group object class attribute. Create a group in the Google domain. A group created in a Google domain will automatically have an explicit Google Cloud Identity and Access Management (IAM) role.
Answer: A
Explanation
Use search rules that have "user email address" as the attribute to facilitate one-way sync. Reference: https://support.google.com/a/answer/6126589?hl=en

NO.85 Your company recently published a security policy to minimize the usage of service account keys. On-premises Windows-based applications are interacting with Google Cloud APIs. You need to implement Workload Identity Federation (WIF) with your identity provider on-premises. What should you do?
A. Set up a workload identity pool with your corporate Active Directory Federation Service (ADFS). Configure a rule to let principals in the pool impersonate the Google Cloud service account.
B. Set up a workload identity pool with your corporate Active Directory Federation Service (ADFS). Let all principals in the pool impersonate the Google Cloud service account.
C. Set up a workload identity pool with an OpenID Connect (OIDC) service on the same machine. Configure a rule to let principals in the pool impersonate the Google Cloud service account.
D. Set up a workload identity pool with an OpenID Connect (OIDC) service on the same machine. Let all principals in the pool impersonate the Google Cloud service account.
Answer: A

NO.86 Your team wants to centrally manage GCP IAM permissions from their on-premises Active Directory Service. Your team wants to manage permissions by AD group membership. What should your team do to meet these requirements?
A. Set up Cloud Directory Sync to sync groups, and set IAM permissions on the groups.
B. Set up SAML 2.0 Single Sign-On (SSO), and assign IAM permissions to the groups.
C. Use the Cloud Identity and Access Management API to create groups and IAM permissions from Active Directory.
D. Use the Admin SDK to create groups and assign IAM permissions from Active Directory.
Answer: A
Explanation
"In order to keep using the existing identity management system, identities need to be synchronized between AD and GCP IAM. To do so, Google provides a tool called Cloud Directory Sync. This tool will read all identities in AD and replicate them within GCP. Once the identities have been replicated, it is possible to apply IAM permissions on the groups. After that, you will configure SAML so Google can act as a service provider, and either your ADFS or a third-party tool such as Ping or Okta will act as the identity provider. This way you effectively delegate authentication from Google to something that is under your control."
NO.87 Your organization wants to be compliant with the General Data Protection Regulation (GDPR) on Google Cloud. You must implement data residency and operational sovereignty in the EU. What should you do? (Choose two.)
A. Limit the physical location of a new resource with the Organization Policy Service resource locations constraint.
B. Use Cloud IDS to get east-west and north-south traffic visibility in the EU to monitor intra-VPC and inter-VPC communication.
C. Limit Google personnel access based on predefined attributes such as their citizenship or geographic location by using Key Access Justifications.
D. Use identity federation to limit access to Google Cloud resources from non-EU entities.
E. Use VPC Flow Logs to monitor intra-VPC and inter-VPC traffic in the EU.
Answer: A C
Explanation
https://cloud.google.com/architecture/framework/security/data-residency-sovereignty

NO.88 You are setting up a CI/CD pipeline to deploy containerized applications to your production clusters on Google Kubernetes Engine (GKE). You need to prevent containers with known vulnerabilities from being deployed. You have the following requirements for your solution:
Must be cloud-native
Must be cost-efficient
Minimize operational overhead
How should you accomplish this? (Choose two.)
A. Create a Cloud Build pipeline that will monitor changes to your container templates in a Cloud Source Repositories repository. Add a step to analyze Container Analysis results before allowing the build to continue.
B. Use a Cloud Function triggered by log events in Google Cloud's operations suite to automatically scan your container images in Container Registry.
C. Use a cron job on a Compute Engine instance to scan your existing repositories for known vulnerabilities and raise an alert if a non-compliant container image is found.
D. Deploy Jenkins on GKE and configure a CI/CD pipeline to deploy your containers to Container Registry. Add a step to validate your container images before deploying your container to the cluster.
E. In your CI/CD pipeline, add an attestation on your container image when no vulnerabilities have been found. Use a Binary Authorization policy to block deployments of containers with no attestation in your cluster.
Answer: A E

NO.89 A large financial institution is moving its Big Data analytics to Google Cloud Platform. They want to have maximum control over the encryption process of data stored at rest in BigQuery. What technique should the institution use?
A. Use Cloud Storage as a federated Data Source.
B. Use a Cloud Hardware Security Module (Cloud HSM).
C. Customer-managed encryption keys (CMEK).
D. Customer-supplied encryption keys (CSEK).
Answer: C
Explanation
If you want to manage the key encryption keys used for your data at rest, instead of having Google manage the keys, use Cloud Key Management Service to manage your keys. This scenario is known as customer-managed encryption keys (CMEK). https://cloud.google.com/bigquery/docs/encryption-at-rest

NO.90 In order to meet PCI DSS requirements, a customer wants to ensure that all outbound traffic is authorized. Which two cloud offerings meet this requirement without additional compensating controls? (Choose two.)
A. App Engine
B. Cloud Functions
C. Compute Engine
D. Google Kubernetes Engine
E. Cloud Storage
Answer: C D
Explanation
App Engine ingress firewall rules are available, but egress rules are not currently available. Per requirements 1.2.1 and 1.3.4, you must ensure that all outbound traffic is authorized. SAQ A-EP and SAQ D-type merchants must provide compensating controls or use a different Google Cloud product. Compute Engine and GKE are the preferred alternatives. https://cloud.google.com/solutions/pci-dss-compliance-in-gcp

NO.91 You have created an OS image that is hardened per your organization's security standards and is being stored in a project managed by the security team. As a Google Cloud administrator, you need to make sure all VMs in your Google Cloud organization can only use that specific OS image while minimizing operational overhead. What should you do? (Choose two.)
A. Grant users the compute.imageUser role in their own projects.
B. Grant users the compute.imageUser role in the OS image project.
C. Store the image in every project that is spun up in your organization.
D. Set up an image access organization policy constraint, and list the security team managed project in the projects allow list.
E. Remove VM instance creation permission from users of the projects, and only allow you and your team to create VM instances.
Answer: B D
Explanation
https://cloud.google.com/resource-manager/docs/organization-policy/org-policy-constraints (constraints/compute.trustedImageProjects)
This list constraint defines the set of projects that can be used for image storage and disk instantiation for Compute Engine. If this constraint is active, only images from trusted projects will be allowed as the source for boot disks for new instances.

NO.92 An organization is moving applications to Google Cloud while maintaining a few mission-critical applications on-premises. The organization must transfer the data at a bandwidth of at least 50 Gbps. What should they use to ensure secure continued connectivity between sites?
A. Dedicated Interconnect
B. Cloud Router
C. Cloud VPN
D. Partner Interconnect
Answer: A

NO.93 Your organization is using Active Directory and wants to configure Security Assertion Markup Language (SAML). You must set up and enforce single sign-on (SSO) for all users. What should you do?
A. 1. Manage SAML profile assignments. 2. Enable OpenID Connect (OIDC) in your Active Directory (AD) tenant. 3. Verify the domain.
B. 1. Create a new SAML profile. 2. Upload the X.509 certificate. 3. Enable the change password URL. 4. Configure Entity ID and ACS URL in your IdP.
C. 1. Create a new SAML profile. 2. Populate the sign-in and sign-out page URLs. 3. Upload the X.509 certificate. 4. Configure Entity ID and ACS URL in your IdP.
D. 1. Configure prerequisites for OpenID Connect (OIDC) in your Active Directory (AD) tenant. 2. Verify the AD domain. 3. Decide which users should use SAML. 4. Assign the pre-configured profile to the selected organizational units (OUs) and groups.
Answer: C
Explanation
When configuring SAML-based Single Sign-On (SSO) in an organization that's using Active Directory, the general steps involve setting up a SAML profile, specifying the necessary URLs for the sign-in and sign-out processes, uploading an X.509 certificate for secure communication, and setting up the Entity ID and Assertion Consumer Service (ACS) URL in the Identity Provider (which in this case would be Active Directory).

NO.94 You want to use the gcloud command-line tool to authenticate using a third-party single sign-on (SSO) SAML identity provider. Which options are necessary to ensure that authentication is supported by the third-party identity provider (IdP)? (Choose two.)
A. SSO SAML as a third-party IdP
B. Identity Platform
C. OpenID Connect
D. Identity-Aware Proxy
E. Cloud Identity
Answer: A C
Explanation
To provide users with SSO-based access to selected cloud apps, Cloud Identity as your IdP supports the OpenID Connect (OIDC) and Security Assertion Markup Language 2.0 (SAML) protocols. https://cloud.google.com/identity/solutions/enable-sso

NO.95 You are a Security Administrator at your organization. You need to restrict service account creation capability within production environments. You want to accomplish this centrally across the organization. What should you do?
A. Use Identity and Access Management (IAM) to restrict access of all users and service accounts that have access to the production environment.
B. Use the organization policy constraints/iam.disableServiceAccountKeyCreation boolean to disable the creation of new service accounts.
C. Use the organization policy constraints/iam.disableServiceAccountKeyUpload boolean to disable the creation of new service accounts.
D. Use the organization policy constraints/iam.disa
