
UDEMY set_4.pdf


Full Transcript


AWS Certified Cloud Practitioner Practice Test 4 - Results
Attempt 35, all domains: 65 questions, 64 correct, 1 incorrect, 0 skipped, 0 marked

Question 1 (Correct)
Which of the following are examples of Platform as a Service (PaaS) in the AWS cloud? (Select TWO.)
- Amazon CloudWatch
- AWS Elastic Beanstalk (your selection, correct)
- Amazon EC2
- Amazon S3
- AWS App Runner (your selection, correct)

Overall explanation

Correct options:

AWS Elastic Beanstalk
AWS Elastic Beanstalk is a fully managed service that makes it easy for developers to quickly deploy and manage applications in the AWS Cloud. Developers simply upload their applications, and Elastic Beanstalk automatically handles the deployment details of capacity provisioning, load balancing, scaling, and application health monitoring. Elastic Beanstalk supports applications developed in Go, Java, .NET, Node.js, Python, Ruby, and PHP. Because Elastic Beanstalk provides the infrastructure and managed services for deploying an application, it fits the definition of a Platform as a Service (PaaS) offering.

AWS App Runner
AWS App Runner is a fully managed service that makes it easy for developers to build, deploy, and scale containerized applications quickly. App Runner is designed for developers who want to go straight from code or a container image to a scalable and secure web application in the AWS Cloud. This simplifies the operational aspects and fits the definition of a Platform as a Service (PaaS).

Incorrect options:

Amazon EC2
Amazon Elastic Compute Cloud (Amazon EC2) provides secure, resizable compute capacity in the cloud. EC2 offers instances of virtual machines and options for networking and storage, but users must manage and maintain the operating systems and application software. This makes EC2 an Infrastructure as a Service (IaaS), not a PaaS.
Amazon CloudWatch
Amazon CloudWatch is a monitoring and observability service built for DevOps engineers, developers, site reliability engineers (SREs), and IT managers. CloudWatch provides data and actionable insights to monitor applications, respond to system-wide performance changes, optimize resource utilization, and get a unified view of operational health. It is a monitoring service, not a PaaS.

Amazon S3
Amazon Simple Storage Service (Amazon S3) is an object storage service that offers industry-leading scalability, data availability, security, and performance. It is designed to store and retrieve any amount of data from anywhere on the web. It is a storage service, not a PaaS.

References: https://aws.amazon.com/types-of-cloud-computing
Domain: Cloud Concepts

Question 2 (Correct)
A company has large-scale distributed datasets and wants to analyze them. Which AWS service should be used?
- Amazon EMR (your answer, correct)
- Amazon Redshift
- Amazon MQ
- Amazon Athena

Overall explanation

Correct option:

Amazon EMR
Amazon EMR (Elastic MapReduce) is a cloud big data platform for processing massive amounts of data using open-source tools such as Apache Hadoop, Spark, HBase, Presto, and Flink, among others. EMR is designed to efficiently process, analyze, and derive insights from large datasets by distributing the data across a resizable cluster of Amazon EC2 instances. It simplifies running big data frameworks, handling everything from provisioning and configuring the data processing infrastructure to scaling and managing the cluster. This service is ideal for tasks like web indexing, data transformations (ETL), log analysis, data warehousing, machine learning, financial analysis, scientific simulation, and bioinformatics, offering a cost-effective, scalable, and easy-to-use solution for big data processing.
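EMR's distributed frameworks (Hadoop, Spark) are far more involved, but the map/shuffle/reduce pattern they implement can be illustrated in miniature with pure Python. This is a conceptual sketch only, using no AWS or EMR APIs:

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit (word, 1) pairs from each document, as a Hadoop mapper would
    for doc in documents:
        for word in doc.lower().split():
            yield (word, 1)

def shuffle_phase(pairs):
    # Shuffle: group values by key (on EMR this happens across the cluster)
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: aggregate each key's values into a final count
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["big data on aws", "big data with emr"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
print(counts["big"])  # prints 2
```

On EMR, each phase would run in parallel across many EC2 instances; the point here is only the shape of the computation that such clusters distribute.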
Incorrect options:

Amazon Redshift
Amazon Redshift is a fast, fully managed, petabyte-scale data warehousing service that makes it simple and cost-effective to analyze all your data using your existing business intelligence tools. It is not designed for processing large-scale distributed datasets.

Amazon Athena
Amazon Athena is an interactive query service that makes it easy to analyze data in Amazon S3 using standard SQL. Athena is serverless, so there is no infrastructure to manage. While Athena can handle large datasets, it is better suited for ad hoc queries than for complex processing and analytics of large-scale distributed datasets.

Amazon MQ
Amazon MQ is a managed message broker service for Apache ActiveMQ that makes it easy to set up and operate message brokers in the cloud. It is used for decoupling and scaling microservices, distributed systems, and serverless applications, rather than for data analysis.

References: https://aws.amazon.com/emr
Domain: Cloud Technology and Services

Question 3 (Correct)
Which AWS Support plan responds within 15 minutes to business/mission-critical system downtime?
- Developer
- Basic
- Enterprise (your answer, correct)
- Business

Overall explanation

Correct option:

Enterprise
The AWS Enterprise Support plan is the highest tier of support offered by AWS and is designed for customers with large and mission-critical operations. When a business or mission-critical system experiences downtime, which is classified as a Severity 1 issue, AWS Enterprise Support commits to a response time of less than 15 minutes. This rapid support can be crucial for businesses where every minute of downtime can lead to significant revenue loss or impact critical operations.
The response times for the Enterprise Support plan are as follows:
- General guidance: < 24 hours
- System impaired: < 12 hours
- Production system impaired: < 4 hours
- Production system down: < 1 hour
- Business/mission-critical system down: < 15 minutes

Incorrect options:

Basic
AWS Basic Support is free and includes 24/7 customer service, documentation, whitepapers, and support forums. It does not include technical support and has no provision for one-on-one fast-track support for system downtime.

Developer
The Developer Support plan is designed for testing or early development phases. The plan offers a 24-hour response time for general guidance questions and a 12-hour response time for system-impaired issues. It does not offer a 15-minute response time for system downtime.

Business
The Business Support plan offers a one-hour response time when a production system is down (and a four-hour response time for production system impairment). However, this is not the same as a 15-minute response for mission-critical system downtime, which is only provided under the Enterprise Support plan.

References: https://aws.amazon.com/premiumsupport/plans/enterprise
Domain: Billing, Pricing, and Support

Question 4 (Correct)
An organization is preparing for an audit following SOC 2 standards. Which AWS service would provide insight into how AWS services impact organizations in meeting SOC 2 compliance requirements?
- AWS Compliance Center
- AWS Audit Manager (your answer, correct)
- AWS SOC2 Type II Toolkit
- AWS Marketplace

Overall explanation

Correct option:

AWS Audit Manager
AWS Audit Manager helps automate the process of assessing, managing, and reporting on compliance with regulations and industry standards. It simplifies audit preparation, execution, and remediation tasks, including generating audit-ready reports. For SOC 2 compliance, Audit Manager facilitates the collection of evidence, maps controls to the SOC 2 framework, and generates reports that demonstrate adherence to SOC 2 requirements.
It simplifies the audit process by providing a centralized platform for managing compliance activities within AWS environments.

Incorrect options:

AWS Compliance Center
AWS Compliance Center provides general information about the various compliance programs AWS participates in, but it does not offer specific reports or detailed insights into how individual AWS services affect SOC 2 compliance.

AWS Marketplace
AWS Marketplace is a platform for third-party sellers to offer software and services that run on AWS. It does not provide any AWS-specific compliance or SOC 2-related information.

AWS SOC2 Type II Toolkit
There is no AWS service or feature called the "AWS SOC2 Type II Toolkit"; this option is a distractor.

References: https://docs.aws.amazon.com/audit-manager/latest/userguide/what-is.html
Domain: Security and Compliance

Question 5 (Correct)
A media company wants to create French subtitles from videos. As a cloud practitioner, what combination of AWS services would you recommend? (Select TWO.)
- Amazon Polly
- Amazon Rekognition
- Amazon Translate (your selection, correct)
- Amazon Transcribe (your selection, correct)
- Amazon Textract

Overall explanation

Correct options:

Amazon Transcribe
Amazon Transcribe is an automatic speech recognition (ASR) service that converts spoken language into written text. It is well suited to transcription needs and can handle different accents, multiple speakers, and even low-quality audio. In this scenario, Amazon Transcribe can be used to convert the spoken language in the videos into text, creating a transcription that can serve as the basis for subtitles. It supports several languages, including French.

Amazon Translate
Amazon Translate is a neural machine translation service that delivers fast, high-quality, and affordable language translation. Once the audio in the video has been transcribed into text with Amazon Transcribe, Amazon Translate can be used to convert that text into French.
The service is highly accurate and capable of handling the complexities and nuances of different languages.

Incorrect options:

Amazon Textract
Amazon Textract is an AWS service designed to extract text and data from scanned documents; it is not used for video content or language translation.

Amazon Polly
Amazon Polly is a text-to-speech service. It can convert written text into spoken words, but it does not perform speech-to-text or translation.

Amazon Rekognition
Amazon Rekognition adds image and video analysis to applications, but it does not handle language translation or transcription. While it could potentially identify on-screen text, it would not help with generating subtitles from spoken language.

References:
https://aws.amazon.com/transcribe
https://aws.amazon.com/translate

Domain: Cloud Technology and Services

Question 6 (Correct)
A movie company requires periodic video-processing operations, and it does not matter how long each operation takes. Which Amazon EC2 purchasing option is the most cost-efficient for this requirement?
- Amazon EC2 Spot Instances (your answer, correct)
- Amazon EC2 On-Demand Instances
- Amazon EC2 Reserved Instances
- Amazon EC2 Dedicated Hosts

Overall explanation

Correct option:

Amazon EC2 Spot Instances
Amazon EC2 Spot Instances allow you to take advantage of unused EC2 capacity in the AWS Cloud at significant discounts compared to On-Demand Instances. The trade-off is that these instances can be interrupted, with a two-minute warning, when EC2 needs the capacity back. However, given that the company's video-processing operations are periodic and there is no strict time constraint for completion, Spot Instances are an excellent, cost-efficient choice. Spot Instances can save up to 90% compared to On-Demand Instances, making them a great fit for flexible, interruption-tolerant workloads.
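To make the cost difference concrete, here is some illustrative arithmetic. The prices and the 70% discount are assumptions for the example only; actual Spot discounts vary by instance type, Region, and current capacity, up to about 90%:

```python
# Hypothetical prices for illustration only; real prices vary by
# instance type, Region, and current Spot market conditions.
on_demand_hourly = 0.40   # assumed On-Demand price, USD per hour
spot_discount = 0.70      # assumed 70% Spot discount
hours_per_month = 200     # periodic batch-processing workload

spot_hourly = on_demand_hourly * (1 - spot_discount)
on_demand_cost = on_demand_hourly * hours_per_month
spot_cost = spot_hourly * hours_per_month
savings = on_demand_cost - spot_cost

print(f"On-Demand: ${on_demand_cost:.2f}, Spot: ${spot_cost:.2f}, saved: ${savings:.2f}")
# On-Demand: $80.00, Spot: $24.00, saved: $56.00
```

Because the workload tolerates interruptions and has no deadline, the saved capacity cost comes at essentially no operational penalty.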
Incorrect options:

Amazon EC2 On-Demand Instances
On-Demand Instances let you pay for compute capacity by the hour or second (with a minimum of 60 seconds) with no long-term commitments. They offer flexibility but are not the most cost-efficient option for periodic tasks that are not time-sensitive.

Amazon EC2 Reserved Instances
Reserved Instances provide a significant discount (up to 72%) compared to On-Demand pricing and provide a capacity reservation. They are ideal for applications with steady-state usage but are not the most cost-effective choice for periodic tasks that are not time-sensitive.

Amazon EC2 Dedicated Hosts
A Dedicated Host is a physical EC2 server dedicated to your use. It is pricier than other EC2 options, and given that the workload described does not require dedicated hardware, a Dedicated Host would not be the most cost-efficient choice.

References: https://aws.amazon.com/ec2/spot
Domain: Billing, Pricing, and Support

Question 7 (Correct)
A medical research organization wants to use EC2 instances to run an analytics application with a fault-tolerant architecture. The application requires high-performance hardware disks for I/O operations. What storage would you recommend as a cost-effective solution?
- Amazon EC2 Instance Store (your answer, correct)
- Amazon EBS
- Amazon EFS
- Amazon S3

Overall explanation

Correct option:

Amazon EC2 Instance Store
Amazon EC2 Instance Store provides temporary block-level storage for your instances. This storage is located on disks that are physically attached to the host computer. The advantage of an instance store is that it offers very high input/output operations per second (IOPS), which makes it a great choice for applications that require high-performance hardware disks. Additionally, instance storage comes at no additional cost, making it a cost-effective solution for temporary storage with high IOPS.
However, the data on an instance store volume is lost if the instance is stopped or terminated, so it is only suitable for temporary, transitory data, or for data that can be recreated easily.

Incorrect options:

Amazon EBS
Amazon Elastic Block Store (EBS) provides persistent block storage volumes for use with EC2 instances, but they are more expensive than instance store volumes. EBS volumes offer high durability and ease of use, but they are not the most cost-effective option when raw I/O performance is the primary concern.

Amazon EFS
Amazon Elastic File System (EFS) is scalable file storage for EC2 instances. While it can be attached to multiple EC2 instances at the same time, it is not designed for high I/O performance.

Amazon S3
Amazon Simple Storage Service (S3) is an object storage service, ideal for storing and retrieving any amount of data at any time. It is not designed for high-performance I/O operations and cannot be attached directly to an EC2 instance like block or file storage.

References: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/InstanceStorage.html
Domain: Cloud Technology and Services

Question 8 (Correct)
Which AWS service provides recommendations for optimal configuration to minimize costs?
- AWS Cost Explorer
- Amazon Redshift
- AWS Compute Optimizer (your answer, correct)
- AWS Budgets

Overall explanation

Correct option:

AWS Compute Optimizer
AWS Compute Optimizer analyzes the configuration and utilization metrics of your AWS resources and recommends opportunities to reduce costs, improve system performance, or both. It makes recommendations based on your past usage, considering metrics such as CPU utilization, disk I/O, and network I/O. For example, if you have an EC2 instance that is consistently underutilized, Compute Optimizer might suggest a smaller instance type that could do the same job at a lower cost. This makes AWS Compute Optimizer an excellent tool for optimizing costs while still meeting your performance needs.
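Compute Optimizer's actual models are internal to AWS, but the idea of a utilization-based rightsizing recommendation can be sketched with a simple threshold heuristic. The function, thresholds, and labels below are all hypothetical, chosen only to illustrate the concept:

```python
def rightsizing_hint(cpu_samples, low=40.0, high=80.0):
    """Toy heuristic: suggest a size change from CPU utilization samples (%).

    This is NOT Compute Optimizer's algorithm; the real service also weighs
    memory, disk I/O, and network I/O against your historical usage patterns.
    """
    avg = sum(cpu_samples) / len(cpu_samples)
    peak = max(cpu_samples)
    if peak < low:
        return "downsize"  # never busy: a smaller instance may suffice
    if avg > high:
        return "upsize"    # consistently saturated: a larger instance may be needed
    return "keep"

# A consistently idle instance is flagged as a downsizing candidate
print(rightsizing_hint([5, 8, 12, 9]))  # prints downsize
```

The real service makes the same kind of judgment over weeks of CloudWatch metrics and maps it to concrete instance-type suggestions with estimated savings.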
Incorrect options:

AWS Cost Explorer
AWS Cost Explorer lets you visualize, understand, and manage your AWS costs and usage over time. It provides insights into your spending and can help you identify trends, spot cost drivers, and detect anomalies, but it does not provide recommendations for optimal configurations to reduce costs.

AWS Budgets
AWS Budgets allows you to set custom cost and usage budgets that alert you when your costs or usage exceed (or are forecasted to exceed) the budgeted amount. It does not provide configuration recommendations for reducing those costs.

Amazon Redshift
Amazon Redshift is a fast, scalable data warehouse that makes it simple and cost-effective to analyze all your data across your data warehouse and data lake. It is a tool for data analysis and does not offer recommendations for optimal configurations to reduce costs.

References: https://aws.amazon.com/compute-optimizer
Domain: Billing, Pricing, and Support

Question 9 (Correct)
A healthcare company needs to run machine learning models on 10 TB datasets locally and move the trained models to AWS for deployment. The facility frequently experiences network disruptions. Which AWS service is suitable for these requirements?
- AWS Snowball Edge (your answer, correct)
- AWS Snowmobile
- AWS Storage Gateway
- AWS Direct Connect

Overall explanation

Correct option:

AWS Snowball Edge
AWS Snowball Edge is a data transfer and edge computing device designed to securely move large amounts of data into and out of AWS. It integrates with AWS services and provides computing capabilities, enabling edge processing and storage in remote or disconnected environments. Snowball Edge supports data migration and edge computing applications, and it can run AWS Lambda functions to process data on-site before transferring it to AWS. This rugged device ensures data transfer efficiency and operational flexibility, which is particularly useful in scenarios requiring offline data collection or processing at the edge.
The device can store up to 80 TB of data and includes computing power for processing data locally. This mitigates the impact of network disruptions and suits the healthcare company's need to handle large datasets and deploy trained models to AWS securely and efficiently.

Incorrect options:

AWS Snowmobile
AWS Snowmobile is a ruggedized shipping container that can transfer exabyte-scale data (up to 100 PB per Snowmobile). It is far too large for a 10 TB dataset, and its logistics and cost make it unsuitable for this requirement.

AWS Direct Connect
AWS Direct Connect offers a dedicated network connection between your premises and AWS, providing more consistent network performance. However, it does not solve the problem of frequent network disruptions experienced at the healthcare facility.

AWS Storage Gateway
AWS Storage Gateway is a hybrid cloud storage service that acts as a bridge between on-premises data and cloud storage. It is suitable for extending on-premises storage capacity, but it does not address the need for local processing of large datasets or the critical issue of frequent network disruptions.

References: https://docs.aws.amazon.com/snowball/latest/developer-guide/whatisedge.html
Domain: Cloud Technology and Services

Question 10 (Correct)
A company operates multiple departments, each with different responsibilities, and employees often move from one department to another. The company wants to manage permissions based on job responsibility. Which AWS IAM resource would best meet this requirement with the minimum operational overhead?
- IAM instance profiles
- IAM user groups
- IAM roles (your answer, correct)
- IAM policies for individual users

Overall explanation

Correct option:

IAM roles
AWS IAM roles are used to delegate access to AWS resources securely. They eliminate the need for long-term credentials by enabling temporary permissions through policies.
Roles are versatile, supporting identity federation, cross-account access, and service-to-service access within AWS, enhancing security and operational efficiency. IAM roles are an ideal solution for the company's requirement of managing permissions based on job responsibilities with minimal operational overhead. Unlike IAM user groups or individual user policies, roles simplify the process when employees move between departments because permissions are assigned to roles rather than users. When job responsibilities change, users can simply assume different roles without the need to rewrite or reorganize user-specific policies. This approach streamlines permission management and enhances security by ensuring employees have only the permissions required for their current responsibilities.

Incorrect options:

IAM user groups
IAM user groups allow you to manage permissions for a set of users collectively, but they are less flexible when dealing with frequent changes in job responsibilities. Whenever an employee moves to a new department, an administrator must manually add or remove them from groups, which increases operational overhead and the potential for error.

IAM instance profiles
IAM instance profiles are meant to provide temporary credentials to applications running on EC2 instances. They are not designed for managing user permissions and would be inappropriate in this context.

IAM policies for individual users
Managing permissions for individual users through IAM policies can become cumbersome and inefficient, especially in environments with frequent changes and numerous employees. It increases the administrative burden and the chance of inconsistent permissions, making it a less optimal solution for the company's needs.

References: https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html
Domain: Security and Compliance

Question 11 (Correct)
What is an effective approach when considering a migration strategy for cloud adoption?
- Ignoring the need for a dedicated migration team.
- Migrating all applications simultaneously to reduce transitional complexity.
- Prioritizing migration based solely on the age of the application.
- Using a phased approach by starting with less critical applications. (your answer, correct)

Overall explanation

Correct option:

Using a phased approach by starting with less critical applications.
Adopting a phased approach to cloud migration is a widely recommended strategy. Starting with less critical applications allows organizations to gain experience and confidence in the cloud environment with minimal risk to core business operations. This step-by-step approach helps identify potential issues and learn from them before migrating more critical applications. It also allows for the development of best practices, employee training, and gradual adjustment to the cloud without disrupting business continuity.

Incorrect options:

Migrating all applications simultaneously to reduce transitional complexity.
Migrating all applications at once is generally not advisable, as it introduces significant risk and complexity. It can lead to overwhelming challenges in managing the transition, potential downtime, and difficulty resolving issues that arise.

Ignoring the need for a dedicated migration team.
A dedicated migration team is crucial for a successful cloud migration. This team should possess the necessary skills and expertise to plan, execute, and manage the migration process, ensuring minimal disruption to the business.

Prioritizing migration based solely on the age of the application.
Prioritizing migration based solely on the age of the application is not a recommended strategy. The decision to migrate should be based on factors such as cloud readiness, business value, and complexity, rather than just the age of the application.
References: https://aws.amazon.com/cloud-migration
Domain: Cloud Concepts

Question 12 (Correct)
A company faces the challenge of a centralized group of users with large file storage needs that have exceeded the available on-premises capacity. The company wants to extend its file storage while retaining the performance advantages of local content sharing. Which AWS solution is most suitable for improving operational efficiency for these requirements?
- Deploy an Amazon EC2 instance with an attached Amazon EBS Provisioned IOPS volume and share this volume with the users.
- Use AWS Storage Gateway's file gateway option and link it to each user's computer. (your answer, correct)
- Move the users' work environments to Amazon WorkSpaces and create an Amazon WorkDocs account for each one.
- Set up an individual Amazon S3 bucket for every user and use a utility to mount each S3 bucket as a file system.

Overall explanation

Correct option:

Use AWS Storage Gateway's file gateway option and link it to each user's computer.
The AWS Storage Gateway file gateway setup enables on-premises applications to seamlessly use AWS Cloud storage. It acts as a bridge between on-site storage environments and Amazon S3, providing a secure, scalable, and reliable storage solution. This setup maintains the performance benefits of local content sharing while extending storage capabilities into the AWS Cloud. It is operationally efficient because it minimizes changes to existing applications and workflows and reduces the need for additional on-premises storage infrastructure. The file gateway caches frequently accessed data locally for low-latency access, while asynchronously backing up data to S3 for durability and scalability.

Incorrect options:

Set up an individual Amazon S3 bucket for every user and use a utility to mount each S3 bucket as a file system.
Mounting an S3 bucket with a file system utility introduces complexity and potential performance issues, especially for applications not designed for high-latency environments. This method also lacks the seamless integration and local caching benefits provided by AWS Storage Gateway, potentially leading to slower access times and decreased user productivity.

Move the users' work environments to Amazon WorkSpaces and create an Amazon WorkDocs account for each one.
Moving each user's environment to Amazon WorkSpaces and setting up Amazon WorkDocs accounts would significantly change the company's operational model and could lead to higher costs and complexity. This option does not directly address the need for extended file storage capabilities while retaining local sharing benefits.

Deploy an Amazon EC2 instance with an attached Amazon EBS Provisioned IOPS volume and share this volume with the users.
Sharing an Amazon EBS Provisioned IOPS volume directly with users is not practical or recommended, because EBS is designed as storage for EC2 instances, not as a shared or distributed file system. This approach lacks the scalability and flexibility required for efficient file storage and sharing among multiple users.

References: https://aws.amazon.com/storagegateway/file
Domain: Cloud Technology and Services

Question 13 (Correct)
A company plans to move a monolithic application to AWS from on-premises servers. The strategy involves breaking down the application into microservices. Which best practice of the AWS Well-Architected Framework does this approach exemplify?
- Deploy the application across several geographical areas on AWS.
- Integrate functional testing within the AWS deployment process.
- Use automation for implementing changes on AWS.
- Design the application with loosely coupled components. (your answer, correct)

Overall explanation

Correct option:

Design the application with loosely coupled components.
By dividing a monolithic application into microservices, the company is adopting the principle of designing systems with loosely coupled components, a core recommendation of the AWS Well-Architected Framework. This approach enhances the application's scalability and maintainability, allowing each service to operate independently. It simplifies development and deployment, since updates to one service can be made without impacting others. Moreover, this strategy improves the system's resilience and agility, facilitating easier fault isolation, scaling, and integration of new features. The transition to a microservices architecture on AWS not only aligns with modern software development practices but also leverages AWS services like Amazon ECS, Lambda, and API Gateway to support these distributed components effectively.

Incorrect options:

Integrate functional testing within the AWS deployment process.
Functional testing as part of the AWS deployment process is a good practice for ensuring code quality and reliability, but it does not relate directly to the architectural principle of creating loosely coupled systems. Functional testing ensures that each part of the application works as expected but does not address system design or inter-service dependencies.

Use automation for implementing changes on AWS.
Using automation to deploy changes is a key practice for achieving efficiency and reducing human error, but it is about operational excellence rather than architectural design. Automation supports the deployment of microservices but is not specifically about designing systems with loosely coupled components.

Deploy the application across several geographical areas on AWS.
Deploying an application to multiple locations relates to reliability and geographical redundancy rather than to the architectural principle of loose coupling.
This strategy ensures high availability and disaster recovery but does not directly address how the components of the application interact with each other.

References:
https://aws.amazon.com/architecture/well-architected
https://docs.aws.amazon.com/wellarchitected/latest/framework/rel_prevent_interaction_failure_loosely_coupled_system.html

Domain: Cloud Concepts

Question 14 (Correct)
A company wants to store sensitive data in an Amazon S3 bucket and encrypt it after upload, and it wants to manage its own encryption keys in AWS services. Which of the following would meet this requirement?
- AWS Owned Key
- IAM Access Key
- Customer Managed Key (your answer, correct)
- AWS Managed Key

Overall explanation

Correct option:

Customer Managed Key
A customer managed key (CMK) is a key that is generated and managed within AWS Key Management Service (KMS) by the customer. This gives the customer full control over the cryptographic key, including its lifecycle, policies, and grants. When dealing with sensitive data that needs encryption, a CMK allows the customer to manage the keys used for encryption, meeting the company's requirements. Furthermore, a CMK can be used for Amazon S3 bucket encryption and provides an additional layer of security by letting the company manage its own keys rather than having AWS manage them.

Incorrect options:

AWS Managed Key
An AWS managed key is a key that is automatically created, managed, and protected by AWS on behalf of the customer. The customer does not have direct control over this type of key, which does not satisfy the company's requirement of managing its own keys.

AWS Owned Key
AWS owned keys are owned and managed by AWS and used across multiple AWS accounts, primarily to encrypt data stored in AWS services. This key type is not suitable for the company's requirement because it does not let the customer manage their own keys.
IAM Access Key
An IAM access key is a combination of an access key ID and a secret access key that AWS Identity and Access Management (IAM) users can use to authenticate programmatically to AWS services. It is not used for data encryption and therefore cannot meet the company's requirement of managing its own keys for encryption.

References: https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#customer-cmk
Domain: Security and Compliance

Question 15 (Correct)
An organization wants to improve its data security by identifying and protecting sensitive information within its Amazon S3 buckets. Which AWS service will meet this requirement?
- Amazon Inspector
- AWS Shield
- AWS WAF
- Amazon Macie (your answer, correct)

Overall explanation

Correct option:

Amazon Macie
Amazon Macie is a fully managed data security and data privacy service that uses machine learning and pattern matching to discover and protect sensitive data in AWS. Macie is specifically designed to identify and protect sensitive data stored in Amazon S3 buckets. It automatically provides an inventory of S3 buckets and analyzes them to detect sensitive data such as personally identifiable information (PII), financial information, and credentials. Once identified, Macie generates detailed findings that can be used to mitigate potential data security and privacy risks. This service is tailored for organizations aiming to comply with privacy regulations and enhance their data protection strategies without manual effort. The automatic discovery of sensitive data and the continuous monitoring for data security and privacy risks make Amazon Macie the optimal choice for protecting data stored in Amazon S3 buckets.

Incorrect options:

AWS Shield
AWS Shield is a managed Distributed Denial of Service (DDoS) protection service that safeguards applications running on AWS. It provides automatic inline mitigation techniques that minimize application downtime and latency.
Shield does not offer capabilities for discovering or protecting sensitive data within S3 buckets. Amazon Inspector Amazon Inspector is an automated security assessment service that helps improve the security and compliance of applications deployed on AWS. Inspector automatically assesses applications for vulnerabilities or deviations from best practices. It does not support discovering or protecting sensitive data in S3 buckets. AWS WAF AWS WAF (Web Application Firewall) provides protection for web applications by allowing users to configure rules that block common attack patterns, such as SQL injection or cross-site scripting (XSS). It does not have the capability to discover or protect sensitive data stored in S3 buckets. References: https://aws.amazon.com/macie Domain Security and Compliance Question 16Correct According to the AWS Shared Responsibility Model, which controls do customers fully inherit from AWS? Configuration Management controls Communications controls Your answer is correct Physical and Environmental controls Awareness & Training controls Overall explanation Correct Options: Physical and Environmental controls In the AWS Shared Responsibility Model, customers fully inherit Physical and Environmental controls from AWS. This is because AWS is responsible for the physical security of the infrastructure that supports its cloud services. This includes the security of the buildings, data centers, and the physical hardware that AWS operates. AWS takes care of all aspects of physical security, such as guarding the premises, monitoring surveillance equipment, and managing environmental risks like fire and flood. Customers do not have to worry about these controls, as they are inherently managed by AWS. This arrangement allows customers to focus more on their applications and data, while AWS ensures the physical integrity and security of the infrastructure.
The AWS Shared Responsibility Model clearly states that while AWS handles infrastructure security, customers are responsible for securing their data and applications running on that infrastructure. Incorrect Options: Awareness & Training controls Awareness & Training controls are not fully inherited from AWS; they are typically the responsibility of the customer. In the AWS Shared Responsibility Model, customers are responsible for managing their own data, which includes ensuring that their employees are aware of and trained in best practices for security and compliance. While AWS provides resources and tools to assist with training and awareness, the ultimate responsibility for educating users and administrators about security practices lies with the customer. Configuration Management controls Configuration Management controls are the customer's responsibility in the AWS Shared Responsibility Model. This includes tasks such as ensuring that the software and applications running on AWS instances are up to date, properly configured, and securely managed. AWS provides the infrastructure and tools to facilitate configuration management, but the execution of these tasks is up to the customer. This includes updates, security patches, and configuration changes. Communications controls Communications controls are primarily the responsibility of the customer in the AWS environment. This includes securing the transmission of data, ensuring that communication protocols are secure, and managing the way data is exchanged between different services and users. While AWS provides the capability to secure communication (such as VPCs and security groups), it is up to the customer to implement and manage these controls effectively. References: https://aws.amazon.com/compliance/shared-responsibility-model Domain Security and Compliance Question 17Correct How does AWS Cloud help to achieve cost savings? 
AWS offers the ability to increase resource allocation manually only during peak hours. Your answer is correct AWS enables cost savings through a pay-as-you-go pricing model and allows users to pay only for the resources consumed. AWS requires a long-term commitment for all services to guarantee lower prices. AWS Trusted Advisor automatically optimizes resources for cost savings without manual intervention. Overall explanation Correct Options: AWS enables cost savings through a pay-as-you-go pricing model and allows users to pay only for the resources consumed. AWS's pay-as-you-go pricing model is one of its fundamental cost-saving benefits. This approach allows organizations to scale their resources up or down based on current needs, which avoids upfront capital expenses and reduces ongoing costs. Users are billed for the compute power, storage, and other resources they use, without long-term commitment or complex licensing requirements. Incorrect Options: AWS requires a long-term commitment for all services to guarantee lower prices. AWS does not require long-term commitments for all its services. While AWS offers savings plans and reserved instances that involve a commitment to use a specific level of resources for a term in exchange for discounted rates, the majority of services can be used on a pay-as-you-go basis without long-term commitments. AWS offers the ability to increase resource allocation manually only during peak hours. AWS provides the capability for auto-scaling, which automatically adjusts resources to maintain consistent, predictable performance at the lowest possible cost. Manual intervention is not the only way to manage resource allocation. AWS Trusted Advisor automatically optimizes resources for cost savings without manual intervention. AWS Trusted Advisor provides recommendations for cost optimization, but it does not automatically optimize resources. It is up to the user to implement the recommendations provided by the Trusted Advisor.
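The pay-as-you-go saving is easiest to see with a small back-of-the-envelope calculation. The sketch below compares billing for the instance-hours actually consumed against provisioning for peak demand around the clock; the hourly rate and usage pattern are hypothetical, not real AWS prices.

```python
# Illustrative comparison of pay-as-you-go billing vs. provisioning for
# peak capacity. The rate and usage figures below are hypothetical.

HOURLY_RATE = 0.10  # hypothetical on-demand price per instance-hour

def pay_as_you_go_cost(instances_per_hour, rate=HOURLY_RATE):
    """Bill only the instance-hours actually consumed."""
    return sum(instances_per_hour) * rate

def peak_provisioned_cost(instances_per_hour, rate=HOURLY_RATE):
    """Bill as if peak capacity ran for every hour (fixed-capacity style)."""
    return max(instances_per_hour) * len(instances_per_hour) * rate

# A day where demand spikes to 10 instances for 2 hours, else 2 instances.
usage = [2] * 22 + [10] * 2
print(pay_as_you_go_cost(usage))      # 6.4
print(peak_provisioned_cost(usage))   # 24.0
```

The gap between the two numbers is exactly the idle capacity that a fixed, peak-sized deployment pays for but never uses.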
References: https://aws.amazon.com/pricing Domain Cloud Concepts Question 18Correct Within the AWS shared responsibility model, what is a customer's responsibility when using AWS Lambda? Ensuring the physical security of AWS data centers. Managing the underlying server and compute resources. Your answer is correct Managing the uploaded code to AWS Lambda functions. Updating the AWS Lambda service and its related components. Overall explanation Correct Option: Managing the uploaded code to AWS Lambda functions. The responsibility of customers using AWS Lambda involves managing the code that is executed by the service. This encompasses writing, uploading, and maintaining the function code, including handling any dependencies it might have. It also involves ensuring that the code is secure, free from vulnerabilities, and efficiently executes the tasks it's designed for. AWS Lambda abstracts away the underlying infrastructure, server provisioning, scaling, and maintenance, allowing developers to focus solely on code. This division of responsibilities is a cornerstone of the AWS shared responsibility model, where AWS manages the cloud infrastructure and services, while customers manage their data and application resources. By adhering to this model, customers can leverage AWS Lambda to build and deploy applications quickly and more securely, benefiting from the scalability, reliability, and cost-efficiency of AWS. Incorrect Options: Ensuring the physical security of AWS data centers. The physical security of AWS data centers is solely the responsibility of AWS. This includes all aspects of environmental security, such as access control, surveillance, and disaster protection for the infrastructure that hosts AWS services. Customers do not need to concern themselves with these aspects, as AWS ensures the physical integrity and security of their data centers globally. Managing the underlying server and compute resources.
AWS Lambda is a serverless compute service, which means that AWS manages the compute infrastructure, including server maintenance, capacity provisioning, automatic scaling, and patching. Customers using AWS Lambda do not have access to nor need to manage the underlying servers or compute resources directly. Updating the AWS Lambda service and its related components. AWS is responsible for maintaining and updating the AWS Lambda service, including all its underlying components and infrastructure. This ensures that the service remains reliable, secure, and up-to-date without any required intervention from the customer. Users need only to manage their applications and the code that runs within the Lambda environment. References: https://docs.aws.amazon.com/whitepapers/latest/security-overview-aws-lambda/the-shared-responsibility-model.html Domain Security and Compliance Question 19Correct Which security service is automatically enabled for all AWS customers at no additional cost? AWS Web Application Firewall Your answer is correct AWS Shield Standard AWS Shield Advanced Amazon GuardDuty Overall explanation Correct Options: AWS Shield Standard AWS Shield Standard is a managed Distributed Denial of Service (DDoS) protection service that safeguards web applications running on AWS. It provides automatic DDoS protection at no additional cost for all AWS customers. This protection covers common and most frequently observed DDoS attacks. It's integrated with AWS services such as Amazon CloudFront and Amazon Route 53 to provide scalable protection against DDoS attacks. AWS Shield Standard is automatically enabled for all AWS customers, offering cost-effective DDoS protection and enabling businesses to maintain high availability and performance. Incorrect Options: Amazon GuardDuty Amazon GuardDuty is a threat detection service that continuously monitors for malicious activity and unauthorized behavior to protect your AWS accounts and workloads.
However, it's not automatically enabled for all AWS customers at no additional cost. AWS Shield Advanced AWS Shield Advanced provides advanced DDoS protection by offering additional DDoS mitigation capabilities over AWS Shield Standard. It also includes cost protection and risk management benefits. This service is not automatically enabled and it incurs extra costs. AWS Web Application Firewall The AWS Web Application Firewall (WAF) is a security service that protects your web applications against common web exploits that could affect application availability, compromise security, or consume excessive resources. But this is not an automatically enabled service and it carries additional costs. References: https://aws.amazon.com/shield https://aws.amazon.com/shield/features https://aws.amazon.com/shield/pricing Domain Security and Compliance Question 20Correct Which AWS service provides forecasts that help you get an idea of what your future costs and usage might be? AWS Simple Monthly Calculator AWS Budgets AWS Cost and Usage Report Your answer is correct AWS Cost Explorer Overall explanation Correct Options: AWS Cost Explorer AWS Cost Explorer includes a forecasting feature that uses machine learning algorithms to predict your future AWS spending based on your historical cost data. This service helps you understand your AWS costs and usage over time and provides insights that can influence your decision-making process. By providing forecasts, it allows you to better anticipate what your future costs might be and make budgetary adjustments as necessary. Incorrect Options: AWS Cost and Usage Report The AWS Cost and Usage Report tracks AWS usage and provides comprehensive data about your costs. It does not offer a forecasting feature to predict future costs and usage. AWS Budgets AWS Budgets allows you to set custom cost and usage budgets and sends alerts if costs or usage exceed your budgeted amount. It doesn't provide future cost and usage forecasts. 
AWS Simple Monthly Calculator The AWS Simple Monthly Calculator provides an estimated cost of your AWS usage. It helps estimate costs based on expected usage. It doesn't generate forecasts based on your historical usage data. References: https://aws.amazon.com/aws-cost-management/aws-cost-explorer Domain Billing, Pricing, and Support Question 21Correct Which pillar of the AWS Well-Architected Framework focuses on continuously improving processes and procedures? Performance efficiency Security Your answer is correct Operational excellence Reliability Overall explanation Correct Options: Operational excellence The Operational Excellence pillar focuses on developing and running workloads effectively, gaining insight into their operations, and continuously improving processes and procedures. This pillar includes practices such as defining and measuring key performance indicators (KPIs), automating tasks and processes, implementing feedback mechanisms, and continuously reviewing and refining processes to optimize performance. Incorrect Options: Security Security focuses on protecting systems and data from unauthorized access, misuse, and attacks. This pillar includes practices such as implementing access controls, monitoring and logging, encrypting data at rest and in transit, and maintaining compliance with relevant regulations and standards. Reliability Reliability focuses on ensuring that systems can operate continuously and without interruption, even in the face of failures or disruptions. This pillar includes practices such as designing for failure, implementing automated recovery processes, and monitoring system health and performance. Performance efficiency Performance efficiency focuses on optimizing the use of resources to ensure that systems operate efficiently and cost-effectively. This pillar includes practices such as selecting appropriate instance types, using auto-scaling to adjust capacity to demand, and optimizing storage and network usage. 
References: https://docs.aws.amazon.com/wellarchitected/latest/operational-excellence-pillar/design-principles.html https://wa.aws.amazon.com/wellarchitected/2020-07-02T19-33-23/wat.pillar.operationalExcellence.en.html Domain Cloud Concepts Question 22Correct Which AWS service allows you to subscribe to an RSS feed to be notified of interruptions to each individual service? AWS Personal Health Dashboard AWS Resource Access Manager Your answer is correct AWS Service Health Dashboard AWS Security Hub Overall explanation Correct Options: AWS Service Health Dashboard This service offers a real-time overview of the operational status for all AWS services across various regions. It also provides historical data about service uptime. One of its important features is the provision for subscribing to an RSS feed that notifies subscribers about interruptions or changes to each individual AWS service. This allows customers to stay informed about the status of AWS services that are relevant to their applications and workloads. You can visit the dashboard at any time to get the current status and availability information for each individual service. You can also subscribe to an RSS feed to be notified of interruptions to each individual service. The AWS Service Health Dashboard is available at this link: https://status.aws.amazon.com Incorrect Options: AWS Security Hub AWS Security Hub provides you with a comprehensive view of your security state within AWS and helps you with compliance monitoring by collecting and aggregating findings from AWS services and supported third-party products. It does not offer features related to service health notifications or RSS feeds. AWS Resource Access Manager AWS Resource Access Manager (RAM) helps you securely share your resources across AWS accounts or within your AWS organization. It is primarily used to simplify the process of sharing resources and does not provide any service health information or notification capabilities.
AWS Personal Health Dashboard The AWS Personal Health Dashboard provides alerts and remediation guidance when AWS is experiencing events that may impact your account. While it does provide notifications, these are specifically tailored to the user's own resources and do not include an RSS feed for broader AWS service health updates. References: https://status.aws.amazon.com Domain Security and Compliance Question 23Correct Which AWS service supports in-memory data storage which can accelerate database performance? Amazon RDS Amazon S3 Glacier Your answer is correct Amazon ElastiCache Amazon SQS Overall explanation Correct Options: Amazon ElastiCache Amazon ElastiCache provides in-memory data storage and retrieval. It offers fully managed Redis and Memcached for your most demanding applications that require sub-millisecond response times. By using an in-memory data store, ElastiCache improves application performance by allowing you to retrieve data from fast, managed, in-memory caches, instead of relying entirely on slower disk-based databases. Incorrect Options: Amazon RDS Amazon Relational Database Service (RDS) makes it easier to set up, operate, and scale a relational database in the cloud. It doesn't provide in-memory data storage capabilities. Amazon S3 Glacier Amazon S3 Glacier is a secure, durable, and low-cost storage class for data archiving and long-term backup. It does not provide in-memory data storage. Amazon SQS Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. It's not an in-memory data store. References: https://aws.amazon.com/elasticache Domain Cloud Technology and Services Question 24Correct Your company uses multiple Amazon S3 buckets containing substantial data volumes. With recent changes in regulatory compliance, you must ensure a retention period of at least 5 years for all bucket contents. 
Which solution should you use to prevent the early deletion of any object? Apply an S3 Bucket Policy that denies delete permissions. Use AWS Config to track changes and prevent deletions. Create an AWS Lambda function to monitor deletions and restore objects. Your answer is correct Set up S3 Object Lock with a retention period of 5 years. Overall explanation Correct Options: Set up S3 Object Lock with a retention period of 5 years. Amazon S3 Object Lock provides a way to store objects using a "Write Once, Read Many" (WORM) model. It can prevent objects from being deleted or overwritten for a fixed amount of time or indefinitely. By setting up S3 Object Lock with a retention period of 5 years, you ensure that objects can't be deleted before the specified period, fulfilling the compliance requirements. This applies even if a root user attempts to delete the object. Incorrect Options: Create an AWS Lambda function to monitor deletions and restore objects. Using a Lambda function this way can introduce complexities and isn't foolproof. An object might still get deleted, and the function would need to restore it, which can lead to data inconsistencies and doesn't guarantee that the object is retained in its original state for 5 years. Use AWS Config to track changes and prevent deletions. While AWS Config can monitor and record configuration changes to S3 buckets and other AWS resources, it does not provide a mechanism to prevent deletions based on a retention period. Apply an S3 Bucket Policy that denies delete permissions. While a bucket policy can be used to deny delete permissions, it is not as foolproof as S3 Object Lock. Denying delete permissions entirely would mean that after the 5-year period, manual intervention would be needed to allow deletions. Moreover, certain users like root could still override this policy.
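To make the Write Once, Read Many idea concrete, here is a toy Python model of a store that refuses deletes until a retention date has passed. This is purely illustrative of the WORM behavior; real S3 Object Lock is configured on the bucket and its objects and enforced by AWS itself, not by client code.

```python
# Toy model of the WORM (Write Once, Read Many) behavior that S3 Object
# Lock enforces. Conceptual sketch only -- not the S3 API.
from datetime import datetime, timedelta

class WormStore:
    def __init__(self):
        self._objects = {}  # key -> (data, retain_until)

    def put(self, key, data, retention_days):
        retain_until = datetime.utcnow() + timedelta(days=retention_days)
        self._objects[key] = (data, retain_until)

    def get(self, key):
        return self._objects[key][0]

    def delete(self, key, now=None):
        now = now or datetime.utcnow()
        _, retain_until = self._objects[key]
        if now < retain_until:
            # Mirrors compliance-mode behavior: no user, including root,
            # can remove the object before the retention period ends.
            raise PermissionError(f"{key} is locked until {retain_until}")
        del self._objects[key]

store = WormStore()
store.put("records.csv", b"sensitive", retention_days=5 * 365)
try:
    store.delete("records.csv")  # raises PermissionError: still retained
except PermissionError as e:
    print("Blocked:", e)
```

Once the retention date passes, the same delete call succeeds without any policy change, which is the key operational difference from a permanent deny-delete bucket policy.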
References: https://docs.aws.amazon.com/AmazonS3/latest/userguide/object-lock.html Domain Cloud Technology and Services Question 25Correct A company uses various AWS services for security management. Now they need to consolidate security alerts into a centralized dashboard. Which AWS service will meet these requirements? AWS WAF AWS Trusted Advisor AWS Shield Your answer is correct AWS Security Hub Overall explanation Correct Options: AWS Security Hub AWS Security Hub centrally manages and monitors security across an AWS environment. It aggregates security findings from AWS services like GuardDuty, Inspector, and Macie, as well as from integrated third-party solutions. Security Hub continuously scans for security vulnerabilities, configuration issues, and potential threats, providing a unified view of security posture through dashboards and insights. It simplifies compliance checks with industry standards and automates remediation actions, enhancing overall security visibility and governance to help organizations effectively manage and respond to security threats. Incorrect Options: AWS Shield AWS Shield is a managed DDoS protection service that safeguards applications running on AWS against DDoS attacks. It provides protection and mitigates potential impacts from such attacks but does not offer a centralized dashboard for general security alerts. AWS WAF AWS Web Application Firewall (WAF) helps protect your web applications from common web exploits and vulnerabilities. It does not offer a centralized security dashboard for a wide array of AWS services. AWS Trusted Advisor AWS Trusted Advisor provides real-time guidance to help optimize AWS resources with best practices. Its focus is on cost optimization, performance, security, and fault tolerance but it does not aggregate security alerts into a single dashboard.
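The aggregation idea behind Security Hub can be sketched in a few lines: findings from several detection services are normalized into one list and ranked by severity for a single dashboard view. The service names and findings below are made up for illustration and do not reflect real Security Hub output or its finding format.

```python
# Conceptual sketch of centralized finding aggregation, the core idea
# behind Security Hub. All findings here are hypothetical examples.
SEVERITY_ORDER = {"CRITICAL": 0, "HIGH": 1, "MEDIUM": 2, "LOW": 3}

def aggregate_findings(*sources):
    """Flatten per-service finding lists and sort them, most severe first."""
    merged = [f for findings in sources for f in findings]
    return sorted(merged, key=lambda f: SEVERITY_ORDER[f["severity"]])

guardduty = [{"service": "GuardDuty", "severity": "HIGH", "title": "Unusual API calls"}]
macie = [{"service": "Macie", "severity": "CRITICAL", "title": "PII in public bucket"}]
inspector = [{"service": "Inspector", "severity": "LOW", "title": "Outdated package"}]

# One unified, severity-ranked view instead of three separate consoles.
for f in aggregate_findings(guardduty, macie, inspector):
    print(f["severity"], "-", f["service"], "-", f["title"])
```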
References: https://docs.aws.amazon.com/securityhub/latest/userguide/what-is-securityhub.html Domain Security and Compliance Question 26Incorrect Which software development framework can define AWS cloud resources using programming languages? AWS CodeStar AWS Command Line Interface (CLI) Correct answer AWS Cloud Development Kit (AWS CDK) Your answer is incorrect AWS Software Development Kit (SDK) Overall explanation Correct Options: AWS Cloud Development Kit (AWS CDK) The AWS Cloud Development Kit (AWS CDK) is an open-source software development framework to define cloud infrastructure in code and provision it through AWS CloudFormation. It uses familiar programming languages, including JavaScript, TypeScript, Python, C#, and Java, enabling developers to harness the full power of these languages to define reusable cloud components. The CDK integrates fully with AWS services and allows developers to easily model and provision cloud application resources using well-known programming languages. This approach gives developers the high-level interfaces they need to define infrastructure without needing to interact directly with the underlying CloudFormation service. Incorrect Options: AWS CodeStar AWS CodeStar is a cloud-based service for creating, managing, and working with software development projects on AWS. It allows you to develop, build, and deploy applications on AWS. It does not define cloud application resources using programming languages. AWS Software Development Kit (SDK) AWS SDKs are a set of libraries and tools for developers to create, deploy, and manage applications on AWS. The SDKs provide a range of features for connecting to and working with AWS services. However, they do not allow developers to define cloud application resources using programming languages, as the AWS CDK does. AWS Command Line Interface (CLI) The AWS Command Line Interface (CLI) is a unified tool that allows you to manage your AWS services.
With just one tool to download and configure, you can control multiple AWS services from the command line and automate them through scripts. It provides a way to interact with AWS services but it does not define cloud application resources using programming languages like the AWS CDK does. References: https://aws.amazon.com/cdk Domain Cloud Technology and Services Question 27Correct A company wants to migrate to AWS and use the same security software it uses on-premises. The security software vendor offers its security software as a service on AWS. Where can the company purchase the security solution? AWS Partner Solutions Finder AWS Support Center AWS Management Console Your answer is correct AWS Marketplace Overall explanation Correct Options: AWS Marketplace AWS Marketplace is an online store where companies can find, buy, and immediately start using the software and services they need to build products and run their businesses. If a company's security software vendor offers its software as a service on AWS, then the company can purchase the security solution directly from the AWS Marketplace. This not only allows the company to leverage the same security software it uses on-premises but also streamlines procurement by consolidating billing through their AWS account. Incorrect Options: AWS Partner Solutions Finder AWS Partner Solutions Finder helps you identify AWS Partner Network (APN) partners that can help you in your cloud journey, providing services in a variety of areas like migration, managed services, etc. However, it is not a platform where you can directly purchase software solutions. AWS Support Center The AWS Support Center is where AWS customers can go to find answers to frequently asked questions, post or browse through help forums, and contact AWS Support. The AWS Support Center doesn't sell software or services.
AWS Management Console The AWS Management Console is a browser-based interface for managing and monitoring AWS resources such as EC2 instances, S3 buckets, and more. While it does provide access to various AWS services but it's not a marketplace where you can purchase third-party software solutions. References: https://aws.amazon.com/marketplace Domain Cloud Technology and Services Question 28Correct As per the AWS Shared Responsibility Model, which of the following tasks fall under AWS’s responsibility when managing an Amazon DynamoDB service? (Select TWO.) Your selection is correct Patching the underlying infrastructure of the DynamoDB service Managing client-side encryption for data stored in DynamoDB tables Your selection is correct Ensuring the high availability of the DynamoDB service Configuring DynamoDB secondary indexes for query optimization Conducting penetration testing on the DynamoDB service environment Overall explanation Correct Options: Ensuring the high availability of the DynamoDB service Ensuring the high availability of the DynamoDB service is part of AWS's responsibility. AWS designs and manages DynamoDB to deliver high availability and durability. This includes tasks like hardware maintenance, network and power redundancy, and deploying the service across multiple geographical regions or Availability Zones. This feature is integral to the value proposition of using AWS managed services, where AWS guarantees a certain level of service availability. Patching the underlying infrastructure of the DynamoDB service Patching the underlying infrastructure of the DynamoDB service is AWS’s responsibility. AWS takes care of the operational burden of the underlying hardware and software infrastructure, which includes updating and patching to ensure security and stability. Customers do not have access to the physical servers or the software that powers the DynamoDB service, therefore, they rely on AWS to maintain the infrastructure. 
Incorrect Options: Managing client-side encryption for data stored in DynamoDB tables Managing client-side encryption for data stored in DynamoDB tables is the customer's responsibility. AWS provides the DynamoDB service, but the customer must implement and manage data encryption on the client side before it is sent to AWS to be stored in DynamoDB. Configuring DynamoDB secondary indexes for query optimization Configuring DynamoDB secondary indexes for query optimization is the customer's responsibility. AWS provides the DynamoDB service, which includes the ability to create secondary indexes, but it is the customer's responsibility to configure these indexes according to their application's specific access patterns. Conducting penetration testing on the DynamoDB service environment Conducting penetration testing on the DynamoDB service environment is generally prohibited by AWS's acceptable use policy without prior authorization. Customers can request permission to conduct penetration testing on their own instances and services, but not on the underlying service infrastructure, which is managed by AWS. References: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Introduction.html https://aws.amazon.com/compliance/shared-responsibility-model Domain Security and Compliance Question 29Correct An application uses an Amazon S3 bucket to keep user-generated content. It needs to ensure that deleted objects can be retrieved. As a cloud practitioner, which feature should you suggest to meet this requirement? S3 Transfer Acceleration S3 PutObject S3 Lifecycle Management Your answer is correct S3 Versioning Overall explanation Correct Options: S3 Versioning Amazon S3 Versioning allows you to keep multiple versions of an object in a single bucket. When versioning is enabled, each time an object is updated or deleted, its previous version is preserved and can be retrieved.
This ensures that even if an object is deleted, there is a record of its previous versions, which can be restored if needed. Enabling versioning on your S3 bucket is essential for data protection and recovery scenarios, as it provides a way to recover from accidental deletions and unintended overwrites, ensuring that deleted content can be restored. Incorrect Options: S3 Transfer Acceleration S3 Transfer Acceleration is a feature designed to speed up the transfer of files to and from Amazon S3 over long distances. It uses Amazon CloudFront's globally distributed edge locations to accelerate uploads and downloads. It does not provide any functionality related to data recovery or the ability to retrieve deleted objects. S3 PutObject The S3 PutObject operation is used to upload or replace an object in an S3 bucket. While it is a fundamental operation for storing data, it does not provide any capabilities for data recovery or retaining deleted versions of objects. S3 Lifecycle Management S3 Lifecycle Management allows you to define policies to manage your objects during their lifecycle. You can use lifecycle rules to transition objects to different storage classes or to expire them after a certain period. While lifecycle policies can help manage storage costs and ensure data retention, they do not specifically address the need to preserve and retrieve deleted objects. References: https://docs.aws.amazon.com/AmazonS3/latest/userguide/Versioning.html Domain Security and Compliance Question 30Correct When estimating the Total Cost of Ownership (TCO) for a cloud infrastructure versus an on-premises setup, which of the following should be considered? The investment in developer training for cloud service utilization. The expense of office space rental for IT staff. Your answer is correct The cost of electricity for running on-premises servers. The cost of physical security measures for the AWS data center facilities.
Overall explanation Correct Options: The cost of electricity for running on-premises servers. Electricity cost is a significant part of running on-premises servers and should be included in TCO calculations. It not only encompasses the direct cost of power consumption by the hardware but also the cooling systems necessary to maintain optimal operating temperatures, which can be substantial depending on the scale of the data center. Incorrect Options: The cost of physical security measures for the AWS data center facilities. Physical security costs are generally not considered in TCO calculations for cloud infrastructure since this is a cost borne by the cloud provider, not the cloud consumer. The expense of office space rental for IT staff. Office space rental for IT staff does not relate to the costs of cloud infrastructure versus on-premises infrastructure. This expense would likely remain constant regardless of the infrastructure choice. The investment in developer training for cloud service utilization. While training is an essential part of transitioning to the cloud, it is not a direct comparison point for TCO between cloud and on-premises infrastructure. It's more about the cost of adapting to new technology rather than the ongoing costs of maintaining and operating infrastructure. References: https://aws.amazon.com/blogs/mt/estimating-total-cost-of-ownership-tco-for-modernizing-workloads-on-aws-using-containerization-part-1 Domain Cloud Concepts Question 31Correct Which AWS service should be used to prepare data for analysis? Amazon CloudWatch Your answer is correct AWS Glue Amazon Redshift AWS SMS Overall explanation Correct Options: AWS Glue AWS Glue is a fully managed extract, transform, and load (ETL) service that makes it easy for users to prepare and load their data for analytics. You can create and run an ETL job with a few clicks in the AWS Management Console.
The service provides a comprehensive and robust set of capabilities for data preparation. AWS Glue discovers your data and stores the associated metadata (e.g., table definition and schema) in the AWS Glue Data Catalog. Once cataloged, your data is immediately searchable, queryable, and available for ETL. Incorrect Options: AWS SMS AWS Server Migration Service (SMS) makes it easier and faster for you to migrate thousands of on-premises workloads to AWS. It does not provide any features related to preparing data for analytics. Amazon CloudWatch Amazon CloudWatch is a monitoring service for AWS resources and the applications you run on AWS. It collects and tracks metrics, collects and monitors log files, and responds to system-wide performance changes. It doesn't offer features to prepare data for analytics. Amazon Redshift Amazon Redshift is a fast, scalable data warehouse that makes it simple and cost-effective to analyze all your data across your data warehouse and data lake. It is used for analyzing data with standard SQL and your existing business intelligence tools, not for preparing data for analysis. References: https://aws.amazon.com/glue Domain Cloud Technology and Services Question 32Correct A company has several standalone AWS accounts for various projects. How can they reduce their AWS monthly costs? Delete unused accounts Your answer is correct Use AWS Organizations with Consolidated Billing Migrate to on-demand pricing for all resources Enable AWS Shield Overall explanation Correct Options: Use AWS Organizations with Consolidated Billing AWS Organizations allows you to manage multiple AWS accounts under a single master account, simplifying billing processes and providing several cost-saving benefits. One of the main advantages is that AWS offers volume pricing discounts on aggregated usage across all accounts in the organization. This means that as usage increases, costs can fall through these volume discounts. 
Additionally, costs can be tracked and monitored centrally, making it easier to identify cost-saving opportunities and efficiently allocate resources. Incorrect Options: Enable AWS Shield AWS Shield is a managed Distributed Denial of Service (DDoS) protection service. While it is essential for protecting applications from DDoS attacks, it does not offer any cost-saving benefits related to the reduction of AWS monthly costs. Delete unused accounts Deleting unused accounts may not reduce monthly costs, because AWS charges based on resource usage, not per account. Deleting unused accounts can protect against unauthorized access and reduce security risks, but it lowers the bill only if those accounts were running billable resources. Migrate to on-demand pricing for all resources On-demand pricing is the most flexible but also the most expensive pricing model AWS offers. Migrating all resources to on-demand pricing can actually increase monthly costs. It is typically more cost-efficient to use Reserved Instances or Savings Plans for predictable workloads rather than relying entirely on on-demand pricing. References: https://docs.aws.amazon.com/organizations/latest/userguide/orgs_introduction.html https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/consolidated-billing.html Domain Billing, Pricing, and Support Question 33Correct Which AWS service provides a Git repository management system that allows the versioning of application code? AWS CodeBuild AWS CodeDeploy Your answer is correct AWS CodeCommit AWS CodePipeline Overall explanation Correct Options: AWS CodeCommit AWS CodeCommit is a fully managed source control service that hosts secure Git-based repositories. It's the go-to AWS service when you want versioning of application code. With CodeCommit, you can collaboratively work on code with your team, maintain version history, manage updates, and control who can make changes to your code. Incorrect Options: AWS CodePipeline AWS CodePipeline is a continuous integration and continuous deployment (CI/CD) service. 
It's used to automate your release pipelines for fast and reliable updates. It does not manage Git repositories. AWS CodeBuild AWS CodeBuild is a fully managed build service that compiles source code, runs tests, and produces software packages. It's used for building and testing your code but doesn't manage Git repositories. AWS CodeDeploy AWS CodeDeploy automates software deployments to a variety of compute services like Amazon EC2, AWS Fargate, AWS Lambda, and your on-premises servers. CodeDeploy aids in the deployment process but does not offer Git repository management. References: https://aws.amazon.com/codecommit Domain Cloud Technology and Services Question 34Correct What function does an Internet Gateway offer within an AWS Virtual Private Cloud (VPC)? Limiting the bandwidth for internet-bound traffic Distributing incoming internet traffic evenly across several EC2 instances Your answer is correct Enabling the VPC to communicate with the internet Creating a Virtual Private Network (VPN) connection with the VPC Overall explanation Correct Option: Enabling the VPC to communicate with the internet An AWS Internet Gateway is designed to enable communication between instances in a Virtual Private Cloud (VPC) and the internet. This gateway facilitates both inbound and outbound internet traffic, ensuring that the instances within the VPC can access the internet and be accessed from the internet, under the right security and routing configurations. Essentially, it acts as a bridge between AWS's internal network and the broader internet, playing a crucial role in managing and directing internet-bound traffic to and from the VPC. Incorrect Options: Creating a Virtual Private Network (VPN) connection with the VPC The purpose of a VPN connection is to provide secure, encrypted connections between the VPC and other networks, such as a corporate network, over the internet, not to provide general internet access to the VPC resources. 
Limiting the bandwidth for internet-bound traffic An Internet Gateway does not impose bandwidth constraints on internet traffic. Its primary function is to provide a route for traffic between the VPC and the internet. Distributing incoming internet traffic evenly across several EC2 instances Load balancing across Amazon EC2 instances is achieved through the use of Elastic Load Balancing (ELB), not an Internet Gateway. ELB automatically distributes incoming application traffic across multiple targets, such as Amazon EC2 instances, containers, and IP addresses, to increase the scalability and reliability of applications. References: https://docs.aws.amazon.com/vpc/latest/userguide/VPC_Internet_Gateway.html Domain Cloud Technology and Services Question 35Correct A company hosts an application in an AWS region and needs a solution to create an identical environment in another AWS region to protect against data loss. Additionally, they require the ability to quickly activate a standby environment if the primary one fails. Which AWS service would best meet these requirements? Amazon S3 Glacier AWS DataSync Your answer is correct AWS Elastic Disaster Recovery AWS Backup Overall explanation Correct Options: AWS Elastic Disaster Recovery AWS Elastic Disaster Recovery (EDR) provides automated, cross-region disaster recovery for critical workloads. It allows businesses to replicate their applications and data from one AWS region to another, ensuring high availability and resilience. EDR simplifies the setup and management of standby environments, enabling rapid activation in the event of primary environment failure. This service helps organizations maintain business continuity by minimizing downtime and protecting against data loss, thereby ensuring that applications remain operational and accessible even during unforeseen disruptions. Incorrect Options: AWS Backup AWS Backup simplifies the process of backing up application data across AWS services. 
It provides centralized management and does not replicate entire environments or offer quick failover capabilities as Elastic Disaster Recovery does. AWS Backup is better suited for long-term data retention and protection strategies, rather than immediate recovery and failover solutions. Amazon S3 Glacier Amazon S3 Glacier is designed for archival storage and is cost-effective for storing data that is infrequently accessed. Although it provides data durability, it does not offer the capabilities required for quick failover, such as real-time synchronization and fast activation of a standby environment. Therefore, it does not meet the critical requirements for disaster recovery. AWS DataSync AWS DataSync automates data movement between on-premises storage and AWS storage services. It does not offer the comprehensive disaster recovery features needed for quickly activating a standby environment. Its primary use is for data transfer and synchronization rather than full environment replication and failover. References: https://aws.amazon.com/disaster-recovery Domain Cloud Technology and Services Question 36Correct An e-commerce company needs a database solution that continues functioning during an Availability Zone (AZ) outage without requiring manual administrative intervention. Which solution would best meet this requirement? Configure cross-region replication for the RDS database. Deploy the database in a read replica setup across multiple AZs. Your answer is correct Use Amazon RDS Multi-AZ deployment with automatic failover. Use EC2 Instance to deploy the database in a single AZ. Overall explanation Correct Options: Use Amazon RDS Multi-AZ deployment with automatic failover. Amazon RDS Multi-AZ (Availability Zone) deployment is designed precisely for scenarios that require high availability and automatic failover. This option ensures that the database continues to function even if there is an outage in one AZ. 
When you use Multi-AZ deployment, RDS automatically synchronizes data to a standby replica in another AZ. In the event of a failure (such as AZ outage, hardware failure, or software patching), RDS automatically performs a failover to the standby replica without administrative intervention. This means the application can continue to function with minimal downtime, ensuring a seamless experience for the end-users. RDS Multi-AZ also takes care of automated backups, software patching, and DB instance maintenance, reducing the operational burden on the administrative team. By providing high availability and durability, RDS Multi-AZ deployment offers a robust and automated solution to manage fault tolerance for database operations. Incorrect Options: Deploy the database in a read replica setup across multiple AZs. Amazon RDS read replicas are intended primarily for horizontal scaling of read operations and not for automatic failover. While they can be promoted to standalone databases, this process is manual and does not provide the same level of automated high availability and reliability as Multi-AZ deployment. Use EC2 Instance to deploy the database in a single AZ. Deploying a database on an EC2 instance in a single AZ does not provide high availability. In the event of an AZ outage, the database instance would be inaccessible, leading to service disruption. This option requires significant manual intervention to restore services. Configure cross-region replication for the RDS database. Cross-region replication is designed for disaster recovery and geographic redundancy rather than handling AZ outages within a specific region. While it can be used to replicate data across different regions, it introduces additional latency and does not provide the same immediate failover capability as Multi-AZ deployments within the same region. 
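The synchronous-replication-plus-automatic-failover idea described above can be illustrated with a short conceptual sketch. This is a simulation of the concept only, not the actual RDS mechanism; the class, AZ names, and keys are all hypothetical.

```python
# Conceptual sketch of Multi-AZ behavior (not the real RDS implementation):
# writes are synchronously replicated to a standby copy in a second AZ, and
# an outage of the primary AZ promotes the standby without manual steps.

class MultiAZDatabase:
    def __init__(self):
        # One data copy per AZ: a primary and a standby (hypothetical AZ names).
        self.replicas = {"us-east-1a": {}, "us-east-1b": {}}
        self.primary_az = "us-east-1a"

    def write(self, key, value):
        # Synchronous replication: every write lands in both AZs before returning.
        for data in self.replicas.values():
            data[key] = value

    def read(self, key):
        # Reads are served from whichever AZ currently holds the primary role.
        return self.replicas[self.primary_az][key]

    def simulate_az_outage(self, failed_az):
        # Automatic failover: promote a replica in a healthy AZ to primary.
        if self.primary_az == failed_az:
            self.primary_az = next(az for az in self.replicas if az != failed_az)

db = MultiAZDatabase()
db.write("order-42", "shipped")
db.simulate_az_outage("us-east-1a")
print(db.primary_az)        # us-east-1b
print(db.read("order-42"))  # shipped
```

The point of the sketch is that the application keeps reading its data after the outage without any administrative action, which is the property the Multi-AZ option provides and the other answer choices lack.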
References: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Concepts.MultiAZ.html Domain Cloud Technology and Services Question 37Correct A company has a legacy application and wants to migrate to the AWS cloud based on a microservice architecture. Which AWS service should be used to decouple components? Your answer is correct Amazon SQS Amazon Lightsail Amazon API Gateway Amazon VPC Overall explanation Correct Options: Amazon SQS (Simple Queue Service) Amazon SQS is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. SQS eliminates the complexity and overhead associated with managing and operating message-oriented middleware, and empowers developers to focus on differentiating work. SQS offers two types of message queues: standard queues, which offer maximum throughput, best-effort ordering, and at-least-once delivery; and FIFO queues, which are designed to guarantee that messages are processed exactly once, in the exact order that they are sent. Therefore, for a company migrating a legacy application to the AWS cloud based on a microservice architecture, Amazon SQS is an excellent choice to decouple components. Incorrect Options: Amazon VPC (Virtual Private Cloud) Amazon VPC lets you launch AWS resources in a logically isolated virtual network that you define. It does not have functionality related to the decoupling of application components. Amazon Lightsail Amazon Lightsail is an easy-to-use cloud platform that offers you everything needed to build an application or website, plus a cost-effective, monthly plan. It is more suited for simpler applications with predictable traffic patterns and does not provide the decoupling functionality necessary for a microservices architecture. Amazon API Gateway Amazon API Gateway is a fully managed service that makes it easy for developers to create, publish, maintain, monitor, and secure APIs at any scale. 
API Gateway does not provide the messaging queue functionality that allows for the decoupling of services, as Amazon SQS does. It's more about providing a single entry point for microservice APIs. References: https://aws.amazon.com/sqs Domain Cloud Technology and Services Question 38Correct A company plans to store sensitive user data in Amazon S3 and needs to ensure that the data is encrypted at rest. Which of the following encryption options can be used to encrypt data at rest? (Select TWO.) Your selection is correct AWS KMS-managed keys (SSE-KMS) Your selection is correct AWS S3 managed keys (SSE-S3) AWS Glacier Vault Lock Amazon Redshift cluster encryption AWS Elastic File System (EFS) encryption Overall explanation Correct Options: AWS KMS-managed keys (SSE-KMS) Amazon S3 provides the option to use Server-Side Encryption with AWS Key Management Service (SSE-KMS). With SSE-KMS, Amazon S3 automatically encrypts the object data on the server-side as it writes it to the disk in its data centers and automatically decrypts it for you when you access it. You can either let AWS manage the encryption key for you or you can use a customer master key (CMK) from KMS. AWS S3 managed keys (SSE-S3) Another option available in Amazon S3 is Server-Side Encryption with Amazon S3-Managed Keys (SSE-S3). When you use SSE-S3, Amazon S3 handles and manages the encryption keys for you. Objects are encrypted using a unique key and, as an additional safeguard, this key itself is encrypted using a master key that is rotated regularly. Incorrect Options: AWS Elastic File System (EFS) encryption AWS EFS encryption is used for encrypting data at rest in the Elastic File System, not in Amazon S3. Although it also uses keys from AWS KMS, it's a different service and not relevant to S3 data encryption. Amazon Redshift cluster encryption Amazon Redshift encryption is used for encrypting data within Redshift clusters. 
It's specific to the Redshift service and not applicable to Amazon S3's encryption requirements. AWS Glacier Vault Lock While AWS Glacier is designed for long-term archival storage, Vault Lock is specifically used for regulatory and compliance archiving by creating a lockable policy. It doesn’t define the encryption method for Amazon S3. References: https://docs.aws.amazon.com/AmazonS3/latest/userguide/serv-side-encryption.html Domain Security and Compliance Question 39Correct Which AWS service can be used to perform vulnerability assessments and security audits of AWS resources? AWS WAF Your answer is correct Amazon Inspector AWS GuardDuty AWS KMS Overall explanation Correct Options: Amazon Inspector Amazon Inspector is an automated security assessment service that helps improve the security and compliance of applications deployed on AWS. It analyzes the behavior of the AWS resources and helps identify potential security issues, vulnerabilities, or deviations from best practices. By running Amazon Inspector, AWS customers can receive a detailed report on the security state of their AWS resources and actionable recommendations to mitigate any identified risks. Incorrect Options: AWS GuardDuty GuardDuty is a threat detection service that continuously monitors for malicious or unauthorized activity. It doesn't perform vulnerability assessments or security audits. AWS WAF AWS WAF (Web Application Firewall) protects web applications from common web exploits. It is used for protection rather than performing vulnerability assessments or audits. AWS KMS AWS Key Management Service (KMS) is used to create and manage cryptographic keys and control their use across a wide range of AWS services and applications. It doesn't perform vulnerability assessments or security audits. References: https://aws.amazon.com/inspector Domain Security and Compliance Question 40Correct An organization operates multiple VPCs spread across different regions. 
They need these VPCs to communicate with one another under centralized network management, without using the public internet, for security reasons. As a Cloud Practitioner, which AWS service would you recommend to meet this requirement? VPC Peering AWS VPN Application Load Balancer Your answer is correct AWS Transit Gateway Overall explanation Correct Options: AWS Transit Gateway AWS Transit Gateway provides a way to connect multiple VPCs and on-premises networks through a single gateway. It simplifies the network architecture and allows centralized control over routing and security. Traffic between the connected VPCs does not traverse the public internet, making it a suitable solution for the given compliance requirements. With Transit Gateway, you can route traffic between VPCs in different regions without exposing it to the public internet. Incorrect Options: AWS VPN AWS VPN allows users to securely connect on-premises data centers and networks to Amazon's cloud infrastructure using a Virtual Private Network. It provides encrypted connections, ensuring data privacy and integrity during transit. It doesn't provide centralized network management to connect multiple VPCs. VPC Peering VPC Peering allows for the connection of two VPCs to share resources. However, each VPC peering connection is between two VPCs specifically and doesn't allow for transitive peering. Therefore, managing connections centrally for multiple VPCs becomes complex, especially as the number of VPCs increases. Application Load Balancer (ALB) An ALB is designed for distributing incoming application traffic across multiple targets, such as EC2 instances. It doesn't serve as a mechanism to connect VPCs or route traffic between them. References: https://docs.aws.amazon.com/vpc/latest/tgw/what-is-transit-gateway.html Domain Cloud Technology and Services Question 41Correct How does AWS help users focus on business value by increasing speed and agility? By providing a range of programming languages and tools. 
By offering a wide variety of pre-built templates and solutions. By providing access to a global network of data centers. Your answer is correct By providing automatic scaling and deployment capabilities. Overall explanation Correct Options: By providing automatic scaling and deployment capabilities AWS helps users focus on business value by increasing speed and agility through automatic scaling and deployment capabilities. Services like AWS Elastic Beanstalk, AWS Lambda, and AWS Auto Scaling allow developers to deploy applications rapidly without worrying about underlying infrastructure. They can scale up or down automatically based on demand, enabling businesses to react quickly to changing needs without over-provisioning or under-provisioning resources. This allows users to focus more on their core business tasks and less on managing IT infrastructure. Incorrect Options: By providing access to a global network of data centers While AWS's global network of data centers provides advantages like lower latency and data residency, it doesn't in itself increase speed and agility in the context of application development and deployment. By offering a wide variety of pre-built templates and solutions AWS does offer pre-built templates and solutions, which can speed up certain tasks, but they are not specifically focused on automatic scaling and deployment, which are key to achieving high speed and agility in a cloud environment. By providing a range of programming languages and tools While AWS supports a wide range of programming languages and provides various tools to assist in development, this broad support isn't directly tied to the concept of increasing speed and agility through automatic scaling and deployment. References: https://docs.aws.amazon.com/whitepapers/latest/aws-overview/six-advantages-of-cloud-computing.html Domain Cloud Concepts Question 42Correct Which of the following are the advantages of using the AWS Cloud? (Select TWO.) 
Your selection is correct Increased speed and agility Data is automatically secure AWS audits user data AWS manages compliance needs Your selection is correct Stop guessing about capacity Overall explanation Correct Options: Stop guessing about capacity In traditional infrastructure management, it's a common problem to estimate the capacity needs of an application or service. With the AWS cloud, you don't have to worry about this because you can scale your resources up or down as needed. This eliminates the cost of idle resources and the risk of not having enough capacity during peak times. With AWS, you can provision the amount of resources that you actually need. If your needs increase, you can easily scale up, and if your needs decrease, you can reduce resources to save costs. Increased speed and agility AWS cloud services allow businesses to move quickly and reduce the time to deliver their products or services. AWS provides a broad set of products and services that allow an organization to innovate faster, lower operational costs, and scale applications. The use of AWS services eliminates the need for costly and time-consuming infrastructure setup and maintenance, resulting in increased speed and agility in business processes and development cycles. Incorrect Options: AWS manages compliance needs While AWS provides several compliance and governance services and tools, such as AWS Config, AWS Security Hub, and AWS Audit Manager, these are designed to assist organizations with their compliance needs. It is still the customer's responsibility to ensure that their specific compliance requirements are met. AWS operates on a shared responsibility model where AWS is responsible for the security of the cloud, and customers are responsible for security in the cloud. AWS audits user data AWS does not audit user data. 
AWS provides various services and features to help users implement auditing and governance, but it does not access or audit user data as part of its standard operating procedures. The control and ownership of data lie with the AWS customers. Data is automatically secure AWS manages the security of the underlying infrastructure, but customers are responsible for securing their workloads and data. While AWS provides tools and services to secure data, it's not accurate to say data is automatically secure. Users need to configure these tools and follow security best practices. References: https://docs.aws.amazon.com/whitepapers/latest/aws-overview/six-advantages-of-cloud-computing.html Domain Cloud Concepts Question 43Correct Which AWS service allows you to receive a notification after inserting data into Amazon RDS? AWS Batch Amazon EC2 Amazon EKS Your answer is correct AWS Lambda Overall explanation Correct Options: AWS Lambda AWS Lambda is a serverless compute service that allows you to run your code in response to events, such as changes to data in an Amazon RDS database, and automatically manages the compute resources for you. You can configure AWS Lambda to execute code in response to insertions or modifications in an Amazon RDS instance, and it can receive notifications about these events. By integrating with Amazon SNS, Amazon SQS, or other AWS services, these notifications can be delivered to the end user. Incorrect Options: Amazon EC2 Amazon EC2 (Elastic Compute Cloud) provides scalable compute capacity in the cloud, but it's not designed to receive notifications of changes in an Amazon RDS database. Amazon EKS Amazon Elastic Kubernetes Service (EKS) is a managed service that allows you to run Kubernetes on AWS without needing to install, operate, and maintain your own Kubernetes control plane or nodes. It's not designed to receive notifications of changes in an Amazon RDS database. 
AWS Batch AWS Batch enables developers, scientists, and engineers to easily and efficiently run hundreds of thousands of batch computing jobs on AWS. It doesn't have the built-in capability to receive notifications when data is inserted into an Amazon RDS database. References: https://docs.aws.amazon.com/lambda/latest/dg/services-rds.html Domain Cloud Technology and Services Question 44Correct A company wants to move from on-premises infrastructure to AWS. What benefits do they get in terms of Operating Expenses (OpEx)? Invariable electricity and cooling costs Constant licensing fees for proprietary software Increased costs of hardware maintenance Your answer is correct Reduced upfront hardware investment costs Overall explanation Correct Options: Reduced upfront hardware investment costs When moving to AWS, one of the main benefits is the transition from a capital expenditure (CapEx) model to an operational expenditure (OpEx) model. With AWS, there's no need to invest in purchasing and maintaining physical hardware. Instead, the company pays only for the compute, storage, and other resources they use, which can lead to cost savings over time, especially when considering hardware depreciation and the need for periodic upgrades in an on-premises environment. Incorrect Options: Increased costs of hardware maintenance One of the benefits of migrating to the cloud is the elimination or significant reduction of hardware maintenance costs. With AWS, the responsibility for maintaining the underlying hardware lies with AWS, not the customer. Invariable electricity and cooling costs Electricity and cooling costs associated with running on-premises data centers can be substantial. By migrating to AWS, these costs will typically decrease, as there's no need to power and cool physical servers. 
Constant licensing fees for proprietary software While AWS provides a variety of licensing options, including License Included and Bring Your Own License (BYOL), the licensing fees may vary depending on the chosen model and software. It's not guaranteed that these fees will remain constant, especially if there's a transition from on-premises licenses to cloud-based models. References: https://aws.amazon.com/economics Domain Cloud Concepts Question 45Correct A company wants to deploy a PostgreSQL database in AWS Cloud. The database should be autoscaled and backup-enabled. As a Cloud Practitioner, which AWS service should you recommend? Amazon DynamoDB Your answer is correct Amazon Aurora Amazon DocumentDB Amazon Neptune Overall explanation Correct Options: Amazon Aurora Amazon Aurora is a relational database service that combines the speed and availability of high-end commercial databases with the simplicity and cost-effectiveness of open-source databases. It provides up to five times better performance than the typical MySQL database and three times the performance of the typical PostgreSQL database. Aurora features a distributed, fault-tolerant, self-healing storage system that auto-scales up to 64TB per database instance. It also delivers high performance and availability with up to 15 low- latency read replicas, point-in-time recovery, continuous backup to Amazon S3, and replication across three Availability Zones. Incorrect Options: Amazon DynamoDB Amazon DynamoDB is a key-value and document database that delivers single-digit millisecond performance at any scale. However, it is not suitable for PostgreSQL database deployment because it does not support relational data structures. Amazon Neptune Amazon Neptune is a fast, reliable, fully-managed graph database service that makes it easy to build and run applications that work with highly connected datasets. However, it does not support relational databases. 
Amazon DocumentDB Amazon DocumentDB is a fast, scalable, highly available, and fully managed document database service that supports MongoDB workloads. It does not host PostgreSQL databases. References: https://aws.amazon.com/rds/aurora Domain Cloud Technology and Services Question 46Correct Which service provides recommendations that help you reduce costs? AWS Pricing Calculator Your answer is correct AWS Trusted Advisor AWS Cost Explorer Amazon Inspector Overall explanation Correct Options: AWS Trusted Advisor AWS Trusted Advisor inspects AWS environment and makes recommendations to help you save money, improve system performance, and close security gaps. Trusted Advisor provides real-time insight into your usage patterns, configurations, and resources, then compares it to AWS best practices. Among its various checks, it includes cost optimization recommendations that help you identify underutilized resources, which can be resized or shut down to save costs. It also suggests ways to leverage AWS pricing models and discounts more effectively. Incorrect Options: AWS Pricing Calculator The AWS Pricing Calculator provides an estimate of the cost to use AWS services, based on the details you provide about expected usage. It is useful for cost estimation and planning and it does not provide recommendations to reduce costs. AWS Cost Explorer AWS Cost Explorer allows you to visualize, understand, and manage your AWS costs and usage over time. It provides data and insights that can help you make decisions to optimize costs. It does not provide recommendations for cost reduction. Amazon Inspector Amazon Inspector is an automated security assessment service that helps improve the security and compliance of applications deployed on AWS. It assesses applications for vulnerabilities or deviations from best practices, but it does not provide recommendations for cost savings or optimization. 
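The kind of cost-optimization check Trusted Advisor performs can be illustrated with a small sketch. The fleet data, threshold, and function below are invented for illustration only; the real service inspects your actual usage and configurations against AWS best practices.

```python
# Illustrative sketch of a "low utilization" cost check, in the spirit of a
# Trusted Advisor recommendation: flag instances whose average CPU usage
# suggests they are oversized. All instance data and thresholds are made up.

instances = [
    {"id": "i-0a1", "type": "m5.xlarge",  "avg_cpu_percent": 3.2},
    {"id": "i-0b2", "type": "m5.large",   "avg_cpu_percent": 61.0},
    {"id": "i-0c3", "type": "c5.2xlarge", "avg_cpu_percent": 7.8},
]

UNDERUTILIZED_CPU_THRESHOLD = 10.0  # illustrative cutoff, not an AWS value

def underutilized(fleet, threshold=UNDERUTILIZED_CPU_THRESHOLD):
    """Return IDs of instances that look like candidates for downsizing."""
    return [i["id"] for i in fleet if i["avg_cpu_percent"] < threshold]

print(underutilized(instances))  # ['i-0a1', 'i-0c3']
```

Acting on such a recommendation, by resizing or stopping the flagged instances, is what turns the check into an actual cost reduction, which is why Trusted Advisor is the answer here rather than the estimation or reporting tools.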
References: https://aws.amazon.com/premiumsupport/technology/trusted-advisor Domain Billing, Pricing, and Support Question 47Correct Which AWS design principle emphasizes the need for systems to be designed to handle failure? Scalability Security Flexibility Your answer is correct Resiliency Overall explanation Correct Options: Resiliency The resiliency principle emphasizes the need for systems to handle failure. This includes designing systems to be fault-tolerant, using backup and recovery strategies, and implementing monitoring and alerting to quickly detect and respond to issues. By focusing on resiliency, a system can better withstand unexpected events and provide reliable and consistent performance. Incorrect Options: Security The security principle emphasizes the need for systems to protect data and resources, prevent unauthorized access, and maintain compliance with industry standards and regulations. Scalability The scalability principle emphasizes the need for systems to automatically scale resources up and down based on usage patterns. While scalability can help a system handle increased traffic or workloads, it does not necessarily address the issue of system failure. Flexibility The flexibility principle emphasizes the need for systems to meet changing business requirements, integrate with other systems and services, and allow experimentation and innovation. While flexibility can help a system adapt to changing conditions, it does not necessarily address the issue of system failure. References: https://wa.aws.amazon.com/wellarchitected/2020-07-02T19-33- 23/wat.concept.resiliency.en.html https://docs.aws.amazon.com/wellarchitected/latest/reliability-pillar/resiliency-and-the- components-of-reliability.html Domain Cloud Concepts Question 48Correct Which one can establish a connection between VPC and DynamoDB table without a public internet connection? 
Your answer is correct VPC Endpoints Internet Gateway AWS VPN Amazon API Gateway Overall explanation Correct Options: VPC Endpoints VPC Endpoints allow you to privately connect a Virtual Private Cloud (VPC) to supported AWS services and VPC endpoint services powered by PrivateLink without requiring a public internet connection. Instances in the VPC do not require public IP addresses to communicate with resources in the service. Traffic between your VPC and the other service does not leave the Amazon network, providing a more secure connection. With respect to Amazon DynamoDB, using a VPC Endpoint allows the EC2 instances within your VPC to access the DynamoDB table without having to traverse the public internet. Incorrect Options: AWS VPN AWS VPN is a virtual private network service that enables secure and encrypted communication between AWS resources and on-premises networks, allowing organizations to extend their network infrastructure to the cloud. It does not provide a direct private connection from a VPC to a DynamoDB table.
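The routing effect of a Gateway VPC endpoint can be sketched conceptually. This is a simplification: the route-table entries, prefix-list name, and resource IDs below are all hypothetical, and a real route table matches on CIDR ranges rather than string keys.

```python
# Conceptual sketch of how a Gateway VPC endpoint keeps DynamoDB traffic on
# the Amazon network: creating the endpoint installs a route for the service's
# prefix list, so matching traffic targets the endpoint instead of the
# internet gateway. All IDs and names here are hypothetical.

route_table = {"0.0.0.0/0": "igw-0abc123"}  # default route to the internet

def add_dynamodb_endpoint(routes):
    # The endpoint adds a route-table entry for the DynamoDB prefix list.
    routes["pl-dynamodb (com.amazonaws.us-east-1.dynamodb)"] = "vpce-0def456"

def next_hop(routes, destination):
    # Simplified lookup: a specific service-prefix entry beats the default route.
    return routes.get(destination, routes["0.0.0.0/0"])

add_dynamodb_endpoint(route_table)
print(next_hop(route_table, "pl-dynamodb (com.amazonaws.us-east-1.dynamodb)"))  # vpce-0def456
print(next_hop(route_table, "203.0.113.9"))  # igw-0abc123
```

Because the DynamoDB-bound traffic never takes the internet-gateway route, the instances need no public IP addresses to reach the table, which is exactly the requirement in the question.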