Summary

This document contains AWS SAA-C03 practice exam questions. The questions cover topics such as implementing AWS services, cloud security, and cost optimization, and are intended as preparation material for the AWS Certified Solutions Architect - Associate exam.

Full Transcript

Certshared now are offering 100% pass ensure SAA-C03 dumps! https://www.certshared.com/exam/SAA-C03/ (0 Q&As) Amazon-Web-Services Exam Questions SAA-C03...

AWS Certified Solutions Architect - Associate (SAA-C03)

NEW QUESTION 1
A company needs guaranteed Amazon EC2 capacity in three specific Availability Zones in a specific AWS Region for an upcoming event that will last 1 week. What should the company do to guarantee the EC2 capacity?
A. Purchase Reserved Instances that specify the Region needed.
B. Create an On-Demand Capacity Reservation that specifies the Region needed.
C. Purchase Reserved Instances that specify the Region and three Availability Zones needed.
D. Create an On-Demand Capacity Reservation that specifies the Region and three Availability Zones needed.
Answer: D
Explanation:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-capacity-reservations.html: "When you create a Capacity Reservation, you specify: The Availability Zone in which to reserve the capacity."

NEW QUESTION 2
A company uses a popular content management system (CMS) for its corporate website. However, the required patching and maintenance are burdensome. The company is redesigning its website and wants a new solution. The website will be updated four times a year and does not need to have any dynamic content available. The solution must provide high scalability and enhanced security. Which combination of changes will meet these requirements with the LEAST operational overhead? (Choose two.)
A. Deploy an AWS WAF web ACL in front of the website to provide HTTPS functionality.
B. Create and deploy an AWS Lambda function to manage and serve the website content.
C. Create the new website and an Amazon S3 bucket. Deploy the website on the S3 bucket with static website hosting enabled.
D. Create the new website. Deploy the website by using an Auto Scaling group of Amazon EC2 instances behind an Application Load Balancer.
Answer: AD

NEW QUESTION 3
A company has two applications: a sender application that sends messages with payloads to be processed and a processing application intended to receive the messages with payloads. The company wants to implement an AWS service to handle messages between the two applications. The sender application can send about 1,000 messages each hour. The messages may take up to 2 days to be processed. If the messages fail to process, they must be retained so that they do not impact the processing of any remaining messages. Which solution meets these requirements and is the MOST operationally efficient?
A. Set up an Amazon EC2 instance running a Redis database. Configure both applications to use the instance. Store, process, and delete the messages, respectively.
B. Use an Amazon Kinesis data stream to receive the messages from the sender application. Integrate the processing application with the Kinesis Client Library (KCL).
C. Integrate the sender and processor applications with an Amazon Simple Queue Service (Amazon SQS) queue. Configure a dead-letter queue to collect the messages that failed to process.
D. Subscribe the processing application to an Amazon Simple Notification Service (Amazon SNS) topic to receive notifications to process. Integrate the sender application to write to the SNS topic.
Answer: C
Explanation:
https://aws.amazon.com/blogs/compute/building-loosely-coupled-scalable-c-applications-with-amazon-sqs-and-
https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-dead-letter-queues.html

NEW QUESTION 4
A company has created an image analysis application in which users can upload photos and add photo frames to their images. The users upload images and metadata to indicate which photo frames they want to add to their images. The application uses a single Amazon EC2 instance and Amazon DynamoDB to store the metadata. The application is becoming more popular, and the number of users is increasing. The company expects the number of concurrent users to vary significantly depending on the time of day and day of week. The company must ensure that the application can scale to meet the needs of the growing user base. Which solution meets these requirements?
A. Use AWS Lambda to process the photos. Store the photos and metadata in DynamoDB.
B. Use Amazon Kinesis Data Firehose to process the photos and to store the photos and metadata.
C. Use AWS Lambda to process the photos. Store the photos in Amazon S3. Retain DynamoDB to store the metadata.
D. Increase the number of EC2 instances to three. Use Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volumes to store the photos and metadata.
Answer: C

NEW QUESTION 5
An application runs on an Amazon EC2 instance in a VPC. The application processes logs that are stored in an Amazon S3 bucket. The EC2 instance needs to access the S3 bucket without connectivity to the internet. Which solution will provide private network connectivity to Amazon S3?
A. Create a gateway VPC endpoint to the S3 bucket.
B. Stream the logs to Amazon CloudWatch Logs. Export the logs to the S3 bucket.
C. Create an instance profile on Amazon EC2 to allow S3 access.
D. Create an Amazon API Gateway API with a private link to access the S3 endpoint.
Answer: A

NEW QUESTION 6
A solutions architect is designing the architecture of a new application being deployed to the AWS Cloud. The application will run on Amazon EC2 On-Demand Instances and will automatically scale across multiple Availability Zones. The EC2 instances will scale up and down frequently throughout the day. An Application Load Balancer (ALB) will handle the load distribution. The architecture needs to support distributed session data management. The company is willing to make changes to code if needed. What should the solutions architect do to ensure that the architecture supports distributed session data management?
A. Use Amazon ElastiCache to manage and store session data.
B. Use session affinity (sticky sessions) of the ALB to manage session data.
C. Use Session Manager from AWS Systems Manager to manage the session.
D. Use the GetSessionToken API operation in AWS Security Token Service (AWS STS) to manage the session.
Answer: A
Explanation:
https://aws.amazon.com/vi/caching/session-management/
In order to address scalability and to provide a shared data storage for sessions that can be accessible from any individual web server, you can abstract the HTTP sessions from the web servers themselves. A common solution for this is to leverage an in-memory key/value store such as Redis or Memcached. ElastiCache offerings for in-memory key/value stores include ElastiCache for Redis, which can support replication, and ElastiCache for Memcached, which does not support replication.

NEW QUESTION 7
A company hosts an application on multiple Amazon EC2 instances. The application processes messages from an Amazon SQS queue, writes to an Amazon RDS table, and deletes the message from the queue. Occasional duplicate records are found in the RDS table. The SQS queue does not contain any duplicate messages. What should a solutions architect do to ensure messages are being processed once only?
A. Use the CreateQueue API call to create a new queue.
B. Use the AddPermission API call to add appropriate permissions.
C. Use the ReceiveMessage API call to set an appropriate wait time.
D. Use the ChangeMessageVisibility API call to increase the visibility timeout.
Answer: D
Explanation:
The visibility timeout begins when Amazon SQS returns a message. During this time, the consumer processes and deletes the message. However, if the consumer fails before deleting the message and your system doesn't call the DeleteMessage action for that message before the visibility timeout expires, the message becomes visible to other consumers and the message is received again. If a message must be received only once, your consumer should delete it within the duration of the visibility timeout. https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-visibility-timeout.html
Keyword: the application processes SQS messages and writes to an Amazon RDS table. From this, option D best suits, and the other options are ruled out (option A only introduces one more queue alongside the existing one, option B only sets permissions, and option C only retrieves messages). FIFO queues are designed to never introduce duplicate messages. However, your message producer might introduce duplicates in certain scenarios: for example, if the producer sends a message, does not receive a response, and then resends the same message. Amazon SQS APIs provide deduplication functionality that prevents your message producer from sending duplicates. Any duplicates introduced by the message producer are removed within a 5-minute deduplication interval. For standard queues, you might occasionally receive a duplicate copy of a message (at-least-once delivery). If you use a standard queue, you must design your applications to be idempotent (that is, they must not be affected adversely when processing the same message more than once).

NEW QUESTION 8
A company is implementing a shared storage solution for a media application that is hosted in the AWS Cloud. The company needs the ability to use SMB clients to access data. The solution must be fully managed. Which AWS solution meets these requirements?
A. Create an AWS Storage Gateway volume gateway. Create a file share that uses the required client protocol. Connect the application server to the file share.
B. Create an AWS Storage Gateway tape gateway. Configure tapes to use Amazon S3. Connect the application server to the tape gateway.
C. Create an Amazon EC2 Windows instance. Install and configure a Windows file share role on the instance. Connect the application server to the file share.
D. Create an Amazon FSx for Windows File Server file system. Attach the file system to the origin server. Connect the application server to the file system.
Answer: D

NEW QUESTION 9
A company collects temperature, humidity, and atmospheric pressure data in cities across multiple continents. The average volume of data collected per site each day is 500 GB. Each site has a high-speed internet connection. The company's weather forecasting applications are based in a single Region and analyze the data daily. What is the FASTEST way to aggregate data from all of these global sites?
A. Enable Amazon S3 Transfer Acceleration on the destination bucket. Use multipart uploads to directly upload site data to the destination bucket.
B. Upload site data to an Amazon S3 bucket in the closest AWS Region. Use S3 cross-Region replication to copy objects to the destination bucket.
C. Schedule AWS Snowball jobs daily to transfer data to the closest AWS Region. Use S3 cross-Region replication to copy objects to the destination bucket.
D. Upload the data to an Amazon EC2 instance in the closest Region. Store the data in an Amazon Elastic Block Store (Amazon EBS) volume. Once a day take an EBS snapshot and copy it to the centralized Region. Restore the EBS volume in the centralized Region and run an analysis on the data daily.
Answer: A
Explanation:
You might want to use Transfer Acceleration on a bucket for various reasons, including the following: You have customers that upload to a centralized bucket from all over the world. You transfer gigabytes to terabytes of data on a regular basis across continents. You are unable to utilize all of your available bandwidth over the internet when uploading to Amazon S3. https://docs.aws.amazon.com/AmazonS3/latest/dev/transfer-acceleration.html
https://aws.amazon.com/s3/transfer-acceleration/: "Amazon S3 Transfer Acceleration can speed up content transfers to and from Amazon S3 by as much as 50-500% for long-distance transfer of larger objects. Customers who have either web or mobile applications with widespread users or applications hosted far away from their S3 bucket can experience long and variable upload and download speeds over the Internet."
https://docs.aws.amazon.com/AmazonS3/latest/userguide/mpuoverview.html: "Improved throughput - You can upload parts in parallel to improve throughput."

NEW QUESTION 10
The management account has an Amazon S3 bucket that contains project reports. The company wants to limit access to this S3 bucket to only users of accounts within the organization in AWS Organizations. Which solution meets these requirements with the LEAST amount of operational overhead?
A. Add the aws:PrincipalOrgID global condition key with a reference to the organization ID to the S3 bucket policy.
B. Create an organizational unit (OU) for each department. Add the aws:PrincipalOrgPaths global condition key to the S3 bucket policy.
C. Use AWS CloudTrail to monitor the CreateAccount, InviteAccountToOrganization, LeaveOrganization, and RemoveAccountFromOrganization events. Update the S3 bucket policy accordingly.
D. Tag each user that needs access to the S3 bucket. Add the aws:PrincipalTag global condition key to the S3 bucket policy.
Answer: A
Explanation:
https://aws.amazon.com/blogs/security/control-access-to-aws-resources-by-using-the-aws-organization-of-iam-principals/
The aws:PrincipalOrgID global key provides an alternative to listing all the account IDs for all AWS accounts in an organization. For example, the following Amazon S3 bucket policy allows members of any account in the XXX organization to add an object into the examtopics bucket.
{"Version": "2012-10-17", "Statement": {"Sid": "AllowPutObject", "Effect": "Allow", "Principal": "*", "Action": "s3:PutObject", "Resource": "arn:aws:s3:::examtopics/*", "Condition": {"StringEquals": {"aws:PrincipalOrgID": ["XXX"]}}}}
https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html

NEW QUESTION 10
A company observes an increase in Amazon EC2 costs in its most recent bill. The billing team notices unwanted vertical scaling of instance types for a couple of EC2 instances. A solutions architect needs to create a graph comparing the last 2 months of EC2 costs and perform an in-depth analysis to identify the root cause of the vertical scaling. How should the solutions architect generate the information with the LEAST operational overhead?
A. Use AWS Budgets to create a budget report and compare EC2 costs based on instance types.
B. Use Cost Explorer's granular filtering feature to perform an in-depth analysis of EC2 costs based on instance types.
C. Use graphs from the AWS Billing and Cost Management dashboard to compare EC2 costs based on instance types for the last 2 months.
D. Use AWS Cost and Usage Reports to create a report and send it to an Amazon S3 bucket. Use Amazon QuickSight with Amazon S3 as a source to generate an interactive graph based on instance types.
Answer: B
Explanation:
AWS Cost Explorer is a tool that enables you to view and analyze your costs and usage.
You can explore your usage and costs using the main graph, the Cost Explorer cost and usage reports, or the Cost Explorer RI reports. You can view data for up to the last 12 months, forecast how much you're likely to spend for the next 12 months, and get recommendations for what Reserved Instances to purchase. You can use Cost Explorer to identify areas that need further inquiry and see trends that you can use to understand your costs. https://docs.aws.amazon.com/cost-management/latest/userguide/ce-what-is.html

NEW QUESTION 14
A company needs to review its AWS Cloud deployment to ensure that its Amazon S3 buckets do not have unauthorized configuration changes. What should a solutions architect do to accomplish this goal?
A. Turn on AWS Config with the appropriate rules.
B. Turn on AWS Trusted Advisor with the appropriate checks.
C. Turn on Amazon Inspector with the appropriate assessment template.
D. Turn on Amazon S3 server access logging. Configure Amazon EventBridge (Amazon CloudWatch Events).
Answer: A

NEW QUESTION 16
A company runs an online marketplace web application on AWS. The application serves hundreds of thousands of users during peak hours. The company needs a scalable, near-real-time solution to share the details of millions of financial transactions with several other internal applications. Transactions also need to be processed to remove sensitive data before being stored in a document database for low-latency retrieval. What should a solutions architect recommend to meet these requirements?
A. Store the transactions data into Amazon DynamoDB. Set up a rule in DynamoDB to remove sensitive data from every transaction upon write. Use DynamoDB Streams to share the transactions data with other applications.
B. Stream the transactions data into Amazon Kinesis Data Firehose to store data in Amazon DynamoDB and Amazon S3. Use AWS Lambda integration with Kinesis Data Firehose to remove sensitive data. Other applications can consume the data stored in Amazon S3.
C. Stream the transactions data into Amazon Kinesis Data Streams. Use AWS Lambda integration to remove sensitive data from every transaction and then store the transactions data in Amazon DynamoDB. Other applications can consume the transactions data off the Kinesis data stream.
D. Store the batched transactions data in Amazon S3 as files. Use AWS Lambda to process every file and remove sensitive data before updating the files in Amazon S3. The Lambda function then stores the data in Amazon DynamoDB. Other applications can consume transaction files stored in Amazon S3.
Answer: C
Explanation:
The destination of your Kinesis Data Firehose delivery stream. Kinesis Data Firehose can send data records to various destinations, including Amazon Simple Storage Service (Amazon S3), Amazon Redshift, Amazon OpenSearch Service, and any HTTP endpoint that is owned by you or any of your third-party service providers. The following are the supported destinations: Amazon OpenSearch Service, Amazon S3, Datadog, Dynatrace, Honeycomb, HTTP Endpoint, LogicMonitor, MongoDB Cloud, New Relic, Splunk, and Sumo Logic. https://docs.aws.amazon.com/firehose/latest/dev/create-name.html
https://aws.amazon.com/kinesis/data-streams/: Amazon Kinesis Data Streams (KDS) is a massively scalable and durable real-time data streaming service. KDS can continuously capture gigabytes of data per second from hundreds of thousands of sources such as website clickstreams, database event streams, financial transactions, social media feeds, IT logs, and location-tracking events.

NEW QUESTION 17
A company is preparing to launch a public-facing web application in the AWS Cloud. The architecture consists of Amazon EC2 instances within a VPC behind an Elastic Load Balancer (ELB). A third-party service is used for the DNS. The company's solutions architect must recommend a solution to detect and protect against large-scale DDoS attacks. Which solution meets these requirements?
A. Enable Amazon GuardDuty on the account.
B. Enable Amazon Inspector on the EC2 instances.
C. Enable AWS Shield and assign Amazon Route 53 to it.
D. Enable AWS Shield Advanced and assign the ELB to it.
Answer: D

NEW QUESTION 20
A company is building an application in the AWS Cloud. The application will store data in Amazon S3 buckets in two AWS Regions. The company must use an AWS Key Management Service (AWS KMS) customer managed key to encrypt all data that is stored in the S3 buckets. The data in both S3 buckets must be encrypted and decrypted with the same KMS key. The data and the key must be stored in each of the two Regions. Which solution will meet these requirements with the LEAST operational overhead?
A. Create an S3 bucket in each Region. Configure the S3 buckets to use server-side encryption with Amazon S3 managed encryption keys (SSE-S3). Configure replication between the S3 buckets.
B. Create a customer managed multi-Region KMS key. Create an S3 bucket in each Region. Configure replication between the S3 buckets. Configure the application to use the KMS key with client-side encryption.
C. Create a customer managed KMS key and an S3 bucket in each Region. Configure the S3 buckets to use server-side encryption with Amazon S3 managed encryption keys (SSE-S3). Configure replication between the S3 buckets.
D. Create a customer managed KMS key and an S3 bucket in each Region. Configure the S3 buckets to use server-side encryption with AWS KMS keys (SSE-KMS). Configure replication between the S3 buckets.
Answer: B
Explanation:
From https://docs.aws.amazon.com/kms/latest/developerguide/custom-key-store-overview.html: For most users, the default AWS KMS key store, which is protected by FIPS 140-2 validated cryptographic modules, fulfills their security requirements. There is no need to add an extra layer of maintenance responsibility or a dependency on an additional service. However, you might consider creating a custom key store if your organization has any of the following requirements: Key material cannot be stored in a shared environment. Key material must be subject to a secondary, independent audit path. The HSMs that generate and store key material must be certified at FIPS 140-2 Level 3.

NEW QUESTION 22
A company is hosting a static website on Amazon S3 and is using Amazon Route 53 for DNS. The website is experiencing increased demand from around the world. The company must decrease latency for users who access the website. Which solution meets these requirements MOST cost-effectively?
A. Replicate the S3 bucket that contains the website to all AWS Regions. Add Route 53 geolocation routing entries.
B. Provision accelerators in AWS Global Accelerator. Associate the supplied IP addresses with the S3 bucket. Edit the Route 53 entries to point to the IP addresses of the accelerators.
C. Add an Amazon CloudFront distribution in front of the S3 bucket. Edit the Route 53 entries to point to the CloudFront distribution.
D. Enable S3 Transfer Acceleration on the bucket. Edit the Route 53 entries to point to the new endpoint.
Answer: C

NEW QUESTION 24
A company's application integrates with multiple software-as-a-service (SaaS) sources for data collection.
The company runs Amazon EC2 instances to receive the data and to upload the data to an Amazon S3 bucket for analysis. The same EC2 instance that receives and uploads the data also sends a notification to the user when an upload is complete. The company has noticed slow application performance and wants to improve the performance as much as possible. Which solution will meet these requirements with the LEAST operational overhead?
A. Create an Auto Scaling group so that EC2 instances can scale out. Configure an S3 event notification to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3 bucket is complete.
B. Create an Amazon AppFlow flow to transfer data between each SaaS source and the S3 bucket. Configure an S3 event notification to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3 bucket is complete.
C. Create an Amazon EventBridge (Amazon CloudWatch Events) rule for each SaaS source to send output data. Configure the S3 bucket as the rule's target. Create a second EventBridge (CloudWatch Events) rule to send events when the upload to the S3 bucket is complete. Configure an Amazon Simple Notification Service (Amazon SNS) topic as the second rule's target.
D. Create a Docker container to use instead of an EC2 instance. Host the containerized application on Amazon Elastic Container Service (Amazon ECS). Configure Amazon CloudWatch Container Insights to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3 bucket is complete.
Answer: B

NEW QUESTION 29
A company has an on-premises application that generates a large amount of time-sensitive data that is backed up to Amazon S3. The application has grown and there are user complaints about internet bandwidth limitations. A solutions architect needs to design a long-term solution that allows for timely backups to Amazon S3 with minimal impact on internet connectivity for internal users. Which solution meets these requirements?
A. Establish AWS VPN connections and proxy all traffic through a VPC gateway endpoint.
B. Establish a new AWS Direct Connect connection and direct backup traffic through this new connection.
C. Order daily AWS Snowball devices. Load the data onto the Snowball devices and return the devices to AWS each day.
D. Submit a support ticket through the AWS Management Console. Request the removal of S3 service limits from the account.
Answer: B

NEW QUESTION 34
A company stores call transcript files on a monthly basis. Users access the files randomly within 1 year of the call, but users access the files infrequently after 1 year. The company wants to optimize its solution by giving users the ability to query and retrieve files that are less than 1-year-old as quickly as possible. A delay in retrieving older files is acceptable. Which solution will meet these requirements MOST cost-effectively?
A. Store individual files with tags in Amazon S3 Glacier Instant Retrieval. Query the tags to retrieve the files from S3 Glacier Instant Retrieval.
B. Store individual files in Amazon S3 Intelligent-Tiering. Use S3 Lifecycle policies to move the files to S3 Glacier Flexible Retrieval after 1 year. Query and retrieve the files that are in Amazon S3 by using Amazon Athena. Query and retrieve the files that are in S3 Glacier by using S3 Glacier Select.
C. Store individual files with tags in Amazon S3 Standard storage. Store search metadata for each archive in Amazon S3 Standard storage. Use S3 Lifecycle policies to move the files to S3 Glacier Instant Retrieval after 1 year. Query and retrieve the files by searching for metadata from Amazon S3.
D. Store individual files in Amazon S3 Standard storage. Use S3 Lifecycle policies to move the files to S3 Glacier Deep Archive after 1 year. Store search metadata in Amazon RDS. Query the files from Amazon RDS. Retrieve the files from S3 Glacier Deep Archive.
Answer: C

NEW QUESTION 37
A company is developing an application that provides order shipping statistics for retrieval by a REST API. The company wants to extract the shipping statistics, organize the data into an easy-to-read HTML format, and send the report to several email addresses at the same time every morning. Which combination of steps should a solutions architect take to meet these requirements? (Choose two.)
A. Configure the application to send the data to Amazon Kinesis Data Firehose.
B. Use Amazon Simple Email Service (Amazon SES) to format the data and to send the report by email.
C. Create an Amazon EventBridge (Amazon CloudWatch Events) scheduled event that invokes an AWS Glue job to query the application's API for the data.
D. Create an Amazon EventBridge (Amazon CloudWatch Events) scheduled event that invokes an AWS Lambda function to query the application's API for the data.
E. Store the application data in Amazon S3. Create an Amazon Simple Notification Service (Amazon SNS) topic as an S3 event destination to send the report by email.
Answer: DE

NEW QUESTION 40
A company needs to store its accounting records in Amazon S3. The records must be immediately accessible for 1 year and then must be archived for an additional 9 years. No one at the company, including administrative users and root users, can be able to delete the records during the entire 10-year period. The records must be stored with maximum resiliency. Which solution will meet these requirements?
A. Store the records in S3 Glacier for the entire 10-year period. Use an access control policy to deny deletion of the records for a period of 10 years.
B. Store the records by using S3 Intelligent-Tiering. Use an IAM policy to deny deletion of the records. After 10 years, change the IAM policy to allow deletion.
C. Use an S3 Lifecycle policy to transition the records from S3 Standard to S3 Glacier Deep Archive after 1 year. Use S3 Object Lock in compliance mode for a period of 10 years.
D. Use an S3 Lifecycle policy to transition the records from S3 Standard to S3 One Zone-Infrequent Access (S3 One Zone-IA) after 1 year. Use S3 Object Lock in governance mode for a period of 10 years.
Answer: C

NEW QUESTION 41
A solutions architect is designing the cloud architecture for a new application being deployed on AWS. The process should run in parallel while adding and removing application nodes as needed based on the number of jobs to be processed. The processor application is stateless. The solutions architect must ensure that the application is loosely coupled and the job items are durably stored. Which design should the solutions architect use?
A. Create an Amazon SNS topic to send the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch configuration that uses the AMI. Create an Auto Scaling group using the launch configuration. Set the scaling policy for the Auto Scaling group to add and remove nodes based on CPU usage.
B. Create an Amazon SQS queue to hold the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch configuration that uses the AMI. Create an Auto Scaling group using the launch configuration. Set the scaling policy for the Auto Scaling group to add and remove nodes based on network usage.
C. Create an Amazon SQS queue to hold the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch template that uses the AMI. Create an Auto Scaling group using the launch template. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of items in the SQS queue.
D. Create an Amazon SNS topic to send the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch template that uses the AMI. Create an Auto Scaling group using the launch template. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of messages published to the SNS topic.
Answer: C
Explanation:
"Create an Amazon SQS queue to hold the jobs that need to be processed. Create an Amazon EC2 Auto Scaling group for the compute application. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of items in the SQS queue." In this case we need to find a durable and loosely coupled solution for storing jobs. Amazon SQS is ideal for this use case and can be configured to use dynamic scaling based on the number of jobs waiting in the queue. To configure this scaling you can use the backlog per instance metric with the target value being the acceptable backlog per instance to maintain. You can calculate these numbers as follows: Backlog per instance: To calculate your backlog per instance, start with the ApproximateNumberOfMessages queue attribute to determine the length of the SQS queue.

NEW QUESTION 42
A company hosts a serverless application on AWS. The application uses Amazon API Gateway, AWS Lambda, and an Amazon RDS for PostgreSQL database. The company notices an increase in application errors that result from database connection timeouts during times of peak traffic or unpredictable traffic.
The company needs a solution that reduces the application failures with the least amount of change to the code.
What should a solutions architect do to meet these requirements?
A. Reduce the Lambda concurrency rate.
B. Enable RDS Proxy on the RDS DB instance.
C. Resize the RDS DB instance class to accept more connections.
D. Migrate the database to Amazon DynamoDB with on-demand scaling.

Answer: B

NEW QUESTION 46
A gaming company is moving its public scoreboard from a data center to the AWS Cloud. The company uses Amazon EC2 Windows Server instances behind an Application Load Balancer to host its dynamic application. The company needs a highly available storage solution for the application. The application consists of static files and dynamic server-side code.
Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)
A. Store the static files on Amazon S3. Use Amazon CloudFront to cache objects at the edge.
C. Store the static files on Amazon S3. Use Amazon ElastiCache to cache objects at the edge.
D. Store the server-side code on Amazon Elastic File System (Amazon EFS). Mount the EFS volume on each EC2 instance to share the files.
E. Store the server-side code on Amazon FSx for Windows File Server. Mount the FSx for Windows File Server volume on each EC2 instance to share the files.
G. Store the server-side code on a General Purpose SSD (gp2) Amazon Elastic Block Store (Amazon EBS) volume. Mount the EBS volume on each EC2 instance to share the files.
Answer: AE

NEW QUESTION 49
A company is storing sensitive user information in an Amazon S3 bucket. The company wants to provide secure access to this bucket from the application tier running on Amazon EC2 instances inside a VPC.
Which combination of steps should a solutions architect take to accomplish this? (Select TWO.)
A. Configure a VPC gateway endpoint for Amazon S3 within the VPC.
B. Create a bucket policy to make the objects in the S3 bucket public.
C. Create a bucket policy that limits access to only the application tier running in the VPC.
D. Create an IAM user with an S3 access policy and copy the IAM credentials to the EC2 instance.
E. Create a NAT instance and have the EC2 instances use the NAT instance to access the S3 bucket.

Answer: BD

NEW QUESTION 50
A company is migrating its on-premises PostgreSQL database to Amazon Aurora PostgreSQL. The on-premises database must remain online and accessible during the migration. The Aurora database must remain synchronized with the on-premises database.
Which combination of actions must a solutions architect take to meet these requirements? (Select TWO.)
A. Create an ongoing replication task.
B. Create a database backup of the on-premises database.
C. Create an AWS Database Migration Service (AWS DMS) replication server.
D. Convert the database schema by using the AWS Schema Conversion Tool (AWS SCT).
E. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to monitor the database synchronization.

Answer: CD

NEW QUESTION 52
A company has a three-tier web application that is deployed on AWS. The web servers are deployed in a public subnet in a VPC. The application servers and database servers are deployed in private subnets in the same VPC. The company has deployed a third-party virtual firewall appliance from AWS Marketplace in an inspection VPC. The appliance is configured with an IP interface that can accept IP packets.
A solutions architect needs to integrate the web application with the appliance to inspect all traffic to the application before the traffic reaches the web server.
Which solution will meet these requirements with the LEAST operational overhead?
A. Create a Network Load Balancer in the public subnet of the application's VPC to route the traffic to the appliance for packet inspection.
B. Create an Application Load Balancer in the public subnet of the application's VPC to route the traffic to the appliance for packet inspection.
C. Deploy a transit gateway in the inspection VPC. Configure route tables to route the incoming packets through the transit gateway.
D. Deploy a Gateway Load Balancer in the inspection VPC. Create a Gateway Load Balancer endpoint to receive the incoming packets and forward the packets to the appliance.

Answer: D

NEW QUESTION 54
A company is running a critical business application on Amazon EC2 instances behind an Application Load Balancer. The EC2 instances run in an Auto Scaling group and access an Amazon RDS DB instance. The design did not pass an operational review because the EC2 instances and the DB instance are all located in a single Availability Zone. A solutions architect must update the design to use a second Availability Zone.
Which solution will make the application highly available?
A. Provision a subnet in each Availability Zone. Configure the Auto Scaling group to distribute the EC2 instances across both Availability Zones. Configure the DB instance with connections to each network.
B. Provision two subnets that extend across both Availability Zones. Configure the Auto Scaling group to distribute the EC2 instances across both Availability Zones. Configure the DB instance with connections to each network.
C.
Provision a subnet in each Availability Zone. Configure the Auto Scaling group to distribute the EC2 instances across both Availability Zones. Configure the DB instance for Multi-AZ deployment.
D. Provision a subnet that extends across both Availability Zones. Configure the Auto Scaling group to distribute the EC2 instances across both Availability Zones. Configure the DB instance for Multi-AZ deployment.

Answer: C

NEW QUESTION 57
A company's order system sends requests from clients to Amazon EC2 instances. The EC2 instances process the orders and then store the orders in a database on Amazon RDS. Users report that they must reprocess orders when the system fails. The company wants a resilient solution that can process orders automatically if a system outage occurs.
What should a solutions architect do to meet these requirements?
A. Move the EC2 instances into an Auto Scaling group. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to target an Amazon Elastic Container Service (Amazon ECS) task.
B. Move the EC2 instances into an Auto Scaling group behind an Application Load Balancer (ALB). Update the order system to send messages to the ALB endpoint.
C. Move the EC2 instances into an Auto Scaling group. Configure the order system to send messages to an Amazon Simple Queue Service (Amazon SQS) queue. Configure the EC2 instances to consume messages from the queue.
D. Create an Amazon Simple Notification Service (Amazon SNS) topic. Create an AWS Lambda function, and subscribe the function to the SNS topic. Configure the order system to send messages to the SNS topic. Send a command to the EC2 instances to process the messages by using AWS Systems Manager Run Command.

Answer: C

NEW QUESTION 58
A company is using a SQL database to store movie data that is publicly accessible.
The database runs on an Amazon RDS Single-AZ DB instance. A script runs queries at random intervals each day to record the number of new movies that have been added to the database. The script must report a final total during business hours. The company's development team notices that the database performance is inadequate for development tasks when the script is running. A solutions architect must recommend a solution to resolve this issue.
Which solution will meet this requirement with the LEAST operational overhead?
A. Modify the DB instance to be a Multi-AZ deployment.
B. Create a read replica of the database. Configure the script to query only the read replica.
C. Instruct the development team to manually export the entries in the database at the end of each day.
D. Use Amazon ElastiCache to cache the common queries that the script runs against the database.

Answer: B

NEW QUESTION 59
A company has five organizational units (OUs) as part of its organization in AWS Organizations. Each OU correlates to one of the five businesses that the company owns. The company's research and development (R&D) business is separating from the company and will need its own organization. A solutions architect creates a separate new management account for this purpose.
A. Have the R&D AWS account be part of both organizations during the transition.
B. Invite the R&D AWS account to be part of the new organization after the R&D AWS account has left the prior organization.
C. Create a new R&D AWS account in the new organization. Migrate resources from the prior R&D AWS account to the new R&D AWS account.
E. Have the R&D AWS account move into the new organization. Make the new management account a member of the prior organization.

Answer: B

NEW QUESTION 62
A company has a web-based map application that provides status information about ongoing repairs. The application sometimes has millions of users.
Repair teams have a mobile app that sends current location and status in a JSON message to a REST-based endpoint. Few repairs occur on most days. The company wants the application to be highly available and to scale when large numbers of repairs occur after natural disasters. Customers use the application most often during these times. The company does not want to pay for idle capacity.
A. Create a webpage that is based on Amazon S3 to display information. Use Amazon API Gateway and AWS Lambda to receive the JSON status data. Store the JSON data in Amazon S3.
B. Use Amazon EC2 instances as web servers across multiple Availability Zones. Run the EC2 instances in an Auto Scaling group. Use Amazon API Gateway and AWS Lambda to receive the JSON status data. Store the JSON data in Amazon S3.
C. Use Amazon EC2 instances as web servers across multiple Availability Zones. Run the EC2 instances in an Auto Scaling group. Use a REST endpoint on the EC2 instances to receive the JSON status data. Store the JSON data in an Amazon RDS Multi-AZ DB instance.
D. Use Amazon EC2 instances as web servers across multiple Availability Zones. Run the EC2 instances in an Auto Scaling group. Use a REST endpoint on the EC2 instances to receive the JSON status data. Store the JSON data in an Amazon DynamoDB table.

Answer: D

NEW QUESTION 66
A company has a web application that is based on Java and PHP. The company plans to move the application from on premises to AWS. The company needs the ability to test new site features frequently. The company also needs a highly available and managed solution that requires minimum operational overhead.
Which solution will meet these requirements?
A. Create an Amazon S3 bucket. Enable static web hosting on the S3 bucket. Upload the static content to the S3 bucket. Use AWS Lambda to process all dynamic content.
B.
Deploy the web application to an AWS Elastic Beanstalk environment. Use URL swapping to switch between multiple Elastic Beanstalk environments for feature testing.
C. Deploy the web application to Amazon EC2 instances that are configured with Java and PHP. Use Auto Scaling groups and an Application Load Balancer to manage the website's availability.
D. Containerize the web application. Deploy the web application to Amazon EC2 instances. Use the AWS Load Balancer Controller to dynamically route traffic between containers that contain the new site features for testing.

Answer: D

NEW QUESTION 67
A company runs multiple Windows workloads on AWS. The company's employees use Windows file shares that are hosted on two Amazon EC2 instances. The file shares synchronize data between themselves and maintain duplicate copies. The company wants a highly available and durable storage solution that preserves how users currently access the files.
A. Migrate all the data to Amazon S3. Set up IAM authentication for users to access files.
B. Set up an Amazon S3 File Gateway. Mount the S3 File Gateway on the existing EC2 instances.
C. Extend the file share environment to Amazon FSx for Windows File Server with a Multi-AZ configuration. Migrate all the data to FSx for Windows File Server.
D. Extend the file share environment to Amazon Elastic File System (Amazon EFS) with a Multi-AZ configuration. Migrate all the data to Amazon EFS.

Answer: C

NEW QUESTION 68
A company is building an ecommerce application and needs to store sensitive customer information. The company needs to give customers the ability to complete purchase transactions on the website. The company also needs to ensure that sensitive customer data is protected, even from database administrators.
Which solution meets these requirements?
A.
Store sensitive data in an Amazon Elastic Block Store (Amazon EBS) volume. Use EBS encryption to encrypt the data. Use an IAM instance role to restrict access.
B. Store sensitive data in Amazon RDS for MySQL. Use AWS Key Management Service (AWS KMS) client-side encryption to encrypt the data.
C. Store sensitive data in Amazon S3. Use AWS Key Management Service (AWS KMS) server-side encryption to encrypt the data. Use S3 bucket policies to restrict access.
D. Store sensitive data in Amazon FSx for Windows Server. Mount the file share on application servers. Use Windows file permissions to restrict access.

Answer: C

NEW QUESTION 72
A company has an application that runs on Amazon EC2 instances and uses an Amazon Aurora database. The EC2 instances connect to the database by using user names and passwords that are stored locally in a file. The company wants to minimize the operational overhead of credential management.
What should a solutions architect do to accomplish this goal?
A. Use AWS Secrets Manager. Turn on automatic rotation.
C. Use AWS Systems Manager Parameter Store. Turn on automatic rotation.
E. Create an Amazon S3 bucket to store objects that are encrypted with an AWS Key Management Service (AWS KMS) encryption key. Migrate the credential file to the S3 bucket. Point the application to the S3 bucket.
I. Create an encrypted Amazon Elastic Block Store (Amazon EBS) volume for each EC2 instance. Attach the new EBS volume to each EC2 instance. Migrate the credential file to the new EBS volume. Point the application to the new EBS volume.

Answer: C

NEW QUESTION 77
A company has an application that loads documents into an Amazon S3 bucket and converts the documents into another format.
The application stores the converted documents in another S3 bucket and saves the document name and URLs in an Amazon DynamoDB table. The DynamoDB entries are used during subsequent days to access the documents. The company uses a DynamoDB Accelerator (DAX) cluster in front of the table.
Recently, traffic to the application has increased. Document processing tasks are timing out during the scheduled DAX maintenance window. A solutions architect must ensure that the documents continue to load during the maintenance window.
What should the solutions architect do to accomplish this goal?
A. Modify the application to write to the DAX cluster. Configure the DAX cluster to write to the DynamoDB table when the maintenance window is complete.
B. Enable Amazon DynamoDB Streams for the DynamoDB table.
C. Modify the application to write to the stream. Configure the stream to load the data when the maintenance window is complete.
D. Convert the application to an AWS Lambda function. Configure the Lambda function runtime to be longer than the maintenance window. Create an Amazon CloudWatch alarm to monitor Lambda timeouts.
E. Modify the application to write the document name and URLs to an Amazon Simple Queue Service (Amazon SQS) queue. Create an AWS Lambda function to read the SQS queue and write to DynamoDB.

Answer: C

NEW QUESTION 79
A company is running an ASP.NET MVC application on a single Amazon EC2 instance. A recent increase in application traffic is causing slow response times for users during lunch hours. The company needs to resolve this concern with the least amount of configuration.
What should a solutions architect recommend to meet these requirements?
A. Move the application to AWS Elastic Beanstalk.
Configure load-based auto scaling and time-based scaling to handle scaling during lunch hours.
C. Move the application to Amazon Elastic Container Service (Amazon ECS). Create an AWS Lambda function to handle scaling during lunch hours.
D. Move the application to Amazon Elastic Container Service (Amazon ECS). Configure scheduled scaling for AWS Application Auto Scaling during lunch hours.
E. Move the application to AWS Elastic Beanstalk. Configure load-based auto scaling, and create an AWS Lambda function to handle scaling during lunch hours.

Answer: A

Explanation:
Scheduled scaling is the solution here, while using the least amount of settings possible. Elastic Beanstalk vs. moving to ECS: ECS requires MORE configuration and settings (task and service definitions, configuring the ECS container agent) than Elastic Beanstalk (upload application code).
https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/environments-cfg-autoscaling-scheduledactions.html
Elastic Beanstalk supports time-based scaling, which fits because the application performance slows down during lunch hours.
https://aws.amazon.com/about-aws/whats-new/2015/05/aws-elastic-beanstalk-supports-time-based-scaling/

NEW QUESTION 82
Availability Zone. The company wants the application to be highly available with minimum downtime and minimum loss of data.
Which solution will meet these requirements with the LEAST operational effort?
A. Place the EC2 instances in different AWS Regions. Use Amazon Route 53 health checks to redirect traffic. Use Aurora PostgreSQL Cross-Region Replication.
B. Configure the Auto Scaling group to use multiple Availability Zones. Configure the database as Multi-AZ. Configure an Amazon RDS Proxy instance for the database.
C. Configure the Auto Scaling group to use one Availability Zone. Generate hourly snapshots of the database. Recover the database from the snapshots in the event of a failure.
D.
Configure the Auto Scaling group to use multiple AWS Regions. Write the data from the application to Amazon S3. Use S3 Event Notifications to launch an AWS Lambda function to write the data to the database.

Answer: B

NEW QUESTION 87
A company has a web application that runs on Amazon EC2 instances. The company wants end users to authenticate themselves before they use the web application. The web application accesses AWS resources, such as Amazon S3 buckets, on behalf of users who are logged on.
Which combination of actions must a solutions architect take to meet these requirements? (Select TWO.)
A. Configure AWS App Mesh to log on users.
B. Enable and configure AWS Single Sign-On in AWS Identity and Access Management (IAM).
C. Define a default IAM role for authenticated users.
D. Use AWS Identity and Access Management (IAM) for user authentication.
E. Use Amazon Cognito for user authentication.

Answer: BE

NEW QUESTION 90
A company is hosting a website from an Amazon S3 bucket that is configured for public hosting. The company's security team mandates the usage of secure connections for access to the website. However, HTTP-based URLs and HTTPS-based URLs must be functional.
What should a solutions architect recommend to meet these requirements?
A. Create an S3 bucket policy to explicitly deny non-HTTPS traffic.
B. Enable S3 Transfer Acceleration. Select the HTTPS Only bucket property.
C. Place the website behind an Elastic Load Balancer that is configured to redirect HTTP traffic to HTTPS.
D. Serve the website through an Amazon CloudFront distribution that is configured to redirect HTTP traffic to HTTPS.

Answer: D

NEW QUESTION 94
A company is running a publicly accessible serverless application that uses Amazon API Gateway and AWS Lambda. The application's traffic recently spiked due to fraudulent requests from botnets.
Which steps should a solutions architect take to block requests from unauthorized users? (Select TWO.)
A.
Create a usage plan with an API key that is shared with genuine users only.
B. Integrate logic within the Lambda function to ignore the requests from fraudulent IP addresses.
C. Implement an AWS WAF rule to target malicious requests and trigger actions to filter them out.
D. Convert the existing public API to a private API. Update the DNS records to redirect users to the new API endpoint.
E. Create an IAM role for each user attempting to access the API. A user will assume the role when making the API call.

Answer: CD

NEW QUESTION 97
A company is developing a new machine learning (ML) model solution on AWS. The models are developed as independent microservices that fetch approximately 1 GB of model data from Amazon S3 at startup and load the data into memory. Users access the models through an asynchronous API. Users can send a request or a batch of requests and specify where the results should be sent.
The company provides models to hundreds of users. The usage patterns for the models are irregular. Some models could be unused for days or weeks. Other models could receive batches of thousands of requests at a time.
Which design should a solutions architect recommend to meet these requirements?
A. Direct the requests from the API to a Network Load Balancer (NLB). Deploy the models as AWS Lambda functions that are invoked by the NLB.
B. Direct the requests from the API to an Application Load Balancer (ALB). Deploy the models as Amazon Elastic Container Service (Amazon ECS) services that read from an Amazon Simple Queue Service (Amazon SQS) queue. Use AWS App Mesh to scale the instances of the ECS cluster based on the SQS queue size.
C.
Direct the requests from the API into an Amazon Simple Queue Service (Amazon SQS) queue. Deploy the models as AWS Lambda functions that are invoked by SQS events. Use AWS Auto Scaling to increase the number of vCPUs for the Lambda functions based on the SQS queue size.
D. Direct the requests from the API into an Amazon Simple Queue Service (Amazon SQS) queue. Deploy the models as Amazon Elastic Container Service (Amazon ECS) services that read from the queue. Enable AWS Auto Scaling on Amazon ECS for both the cluster and copies of the service based on the queue size.

Answer: C

NEW QUESTION 102
A company has enabled AWS CloudTrail logs to deliver log files to an Amazon S3 bucket for each of its developer accounts. The company has created a central AWS account for streamlining management and audit reviews. An internal auditor needs to access the CloudTrail logs, yet access needs to be restricted for all developer account users. The solution must be secure and optimized.
How should a solutions architect meet these requirements?
A. Configure an AWS Lambda function in each developer account to copy the log files to the central account. Create an IAM role in the central account for the auditor. Attach an IAM policy providing read-only permissions to the bucket.
B. Configure CloudTrail from each developer account to deliver the log files to an S3 bucket in the central account. Create an IAM user in the central account for the auditor. Attach an IAM policy providing full permissions to the bucket.
C. Configure CloudTrail from each developer account to deliver the log files to an S3 bucket in the central account. Create an IAM role in the central account for the auditor. Attach an IAM policy providing read-only permissions to the bucket.
D.
Configure an AWS Lambda function in the central account to copy the log files from the S3 bucket in each developer account. Create an IAM user in the central account for the auditor. Attach an IAM policy providing full permissions to the bucket.

Answer: C

Explanation:
https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudtrail-sharing-logs.html

NEW QUESTION 105
A company is deploying a web portal. The company wants to ensure that only the web portion of the application is publicly accessible. To accomplish this, the VPC was designed with two public subnets and two private subnets. The application will run on several Amazon EC2 instances in an Auto Scaling group. SSL termination must be offloaded from the EC2 instances.
What should a solutions architect do to ensure these requirements are met?
A. Configure a Network Load Balancer in the public subnets. Configure the Auto Scaling group in the private subnets and associate it with an Application Load Balancer.
B. Configure a Network Load Balancer in the public subnets. Configure the Auto Scaling group in the public subnets and associate it with an Application Load Balancer.
C. Configure an Application Load Balancer in the public subnets. Configure the Auto Scaling group in the private subnets and associate it with the Application Load Balancer.
D. Configure an Application Load Balancer in the private subnets. Configure the Auto Scaling group in the private subnets and associate it with the Application Load Balancer.

Answer: C

NEW QUESTION 109
A company wants to build a data lake on AWS from data that is stored in an on-premises Oracle relational database. The data lake must receive ongoing updates from the on-premises database.
Which solution will meet these requirements with the LEAST operational overhead?
A. Use AWS DataSync to transfer the data to Amazon S3. Use AWS Glue to transform the data and integrate the data into a data lake.
B. Use AWS Snowball to transfer the data to Amazon S3.
Use AWS Batch to transform the data and integrate the data into a data lake.
C. Use AWS Database Migration Service (AWS DMS) to transfer the data to Amazon S3. Use AWS Glue to transform the data and integrate the data into a data lake.
D. Use an Amazon EC2 instance to transfer the data to Amazon S3. Configure the EC2 instance to transform the data and integrate the data into a data lake.

Answer: C

NEW QUESTION 111
A solutions architect is designing a customer-facing application for a company. The application's database will have a clearly defined access pattern throughout the year and will have a variable number of reads and writes that depend on the time of year. The company must retain audit records for the database for 7 days. The recovery point objective (RPO) must be less than 5 hours.
Which solution meets these requirements?
A. Use Amazon DynamoDB with auto scaling. Use on-demand backups and Amazon DynamoDB Streams.
B. Use Amazon Redshift. Configure concurrency scaling. Activate audit logging. Perform database snapshots every 4 hours.
F. Use Amazon RDS with Provisioned IOPS. Activate the database auditing parameter. Perform database snapshots every 5 hours.
G. Use Amazon Aurora MySQL with auto scaling. Activate the database auditing parameter.

Answer: B

NEW QUESTION 113
An online photo application lets users upload photos and perform image editing operations. The application offers two classes of service: free and paid. Photos submitted by paid users are processed before those submitted by free users. Photos are uploaded to Amazon S3, and the job information is sent to Amazon SQS.
Which configuration should a solutions architect recommend?
A. Use one SQS FIFO queue. Assign a higher priority to the paid photos so they are processed first.
B.
Use two SQS FIFO queues: one for paid and one for free. Set the free queue to use short polling and the paid queue to use long polling.
C. Use two SQS standard queues: one for paid and one for free. Configure Amazon EC2 instances to prioritize polling for the paid queue over the free queue.
D. Use one SQS standard queue. Set the visibility timeout of the paid photos to zero. Configure Amazon EC2 instances to prioritize visibility settings so paid photos are processed first.

Answer: C

Explanation:
https://acloud.guru/forums/guru-of-the-week/discussion/-L7Be8rOao3InQxdQcXj/
https://aws.amazon.com/sqs/features/
Priority: Use separate queues to provide prioritization of work.
https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-short-and-long-polling.html

NEW QUESTION 118
A company is running an application in a private subnet in a VPC with an attached internet gateway. The company needs to provide the application access to the internet while restricting public access to the application. The company does not want to manage additional infrastructure and wants a solution that is highly available and scalable.
Which solution meets these requirements?
A. Create a NAT gateway in the private subnet. Create a route table entry from the private subnet to the internet gateway.
B. Create a NAT gateway in a public subnet. Create a route table entry from the private subnet to the NAT gateway.
C. Launch a NAT instance in the private subnet. Create a route table entry from the private subnet to the internet gateway.
D. Launch a NAT instance in a public subnet. Create a route table entry from the private subnet to the NAT instance.
Answer: A

NEW QUESTION 122
To meet security requirements, a company needs to encrypt all of its application data in transit while communicating with an Amazon RDS MySQL DB instance. A recent security audit revealed that encryption at rest is enabled using AWS Key Management Service (AWS KMS), but data in transit is not enabled.
What should a solutions architect do to satisfy the security requirements?
A. Enable IAM database authentication on the database.
B. Provide self-signed certificates. Use the certificates in all connections to the RDS instance.
C. Take a snapshot of the RDS instance. Restore the snapshot to a new instance with encryption enabled.
D. Download AWS-provided root certificates. Provide the certificates in all connections to the RDS instance.

Answer: C

Explanation:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Overview.Encryption.html#Overview.Encryption

NEW QUESTION 126
A company runs its ecommerce application on AWS. Every new order is published as a message in a RabbitMQ queue that runs on an Amazon EC2 instance in a single Availability Zone. These messages are processed by a different application that runs on a separate EC2 instance. This application stores the details in a PostgreSQL database on another EC2 instance. All the EC2 instances are in the same Availability Zone.
The company needs to redesign its architecture to provide the highest availability with the least operational overhead.
What should a solutions architect do to meet these requirements?
A. Migrate the queue to a redundant pair (active/standby) of RabbitMQ instances on Amazon MQ.
B. Create a Multi-AZ Auto Scaling group for EC2 instances that host the application.
C. Create another Multi-AZ Auto Scaling group for EC2 instances that host the PostgreSQL database.
D. Migrate the queue to a redundant pair (active/standby) of RabbitMQ instances on Amazon MQ.
E. Create a Multi-AZ Auto Scaling group for EC2 instances that host the application.
F.
Migrate the database to run on a Multi-AZ deployment of Amazon RDS for PostgreSQL.
G. Create a Multi-AZ Auto Scaling group for EC2 instances that host the RabbitMQ queue.
H. Create another Multi-AZ Auto Scaling group for EC2 instances that host the application.
I. Migrate the database to run on a Multi-AZ deployment of Amazon RDS for PostgreSQL.
J. Create a Multi-AZ Auto Scaling group for EC2 instances that host the RabbitMQ queue.
K. Create another Multi-AZ Auto Scaling group for EC2 instances that host the application.
L. Create a third Multi-AZ Auto Scaling group for EC2 instances that host the PostgreSQL database.

Answer: C

NEW QUESTION 128
A research company runs experiments that are powered by a simulation application and a visualization application. The simulation application runs on Linux and outputs intermediate data to an NFS share every 5 minutes. The visualization application is a Windows desktop application that displays the simulation output and requires an SMB file system.
The company maintains two synchronized file systems. This strategy is causing data duplication and inefficient resource usage. The company needs to migrate the applications to AWS without making code changes to either application.
Which solution will meet these requirements?
A. Migrate both applications to AWS Lambda. Create an Amazon S3 bucket to exchange data between the applications.
B. Migrate both applications to Amazon Elastic Container Service (Amazon ECS). Configure Amazon FSx File Gateway for storage.
C. Migrate the simulation application to Linux Amazon EC2 instances.
D. Migrate the visualization application to Windows EC2 instances.
E. Configure Amazon Simple Queue Service (Amazon SQS) to exchange data between the applications.
F. Migrate the simulation application to Linux Amazon EC2 instances.
G.
Migrate the visualization application to Windows EC2 instances. Configure Amazon FSx for NetApp ONTAP for storage.

Answer: D

NEW QUESTION 130
A solutions architect is using an AWS CloudFormation template to deploy a three-tier web application. The web application consists of a web tier and an application tier that stores and retrieves user data in Amazon DynamoDB tables. The web and application tiers are hosted on Amazon EC2 instances, and the database tier is not publicly accessible. The application EC2 instances need to access the DynamoDB tables without exposing API credentials in the template.
What should the solutions architect do to meet these requirements?

A. Create an IAM role to read the DynamoDB tables. Associate the role with the application instances by referencing an instance profile.
B. Create an IAM role that has the required permissions to read and write from the DynamoDB tables. Add the role to the EC2 instance profile, and associate the instance profile with the application instances.
C. Use the parameters section in the AWS CloudFormation template to have the user input access and secret keys from an already-created IAM user that has the required permissions to read and write from the DynamoDB tables.
D. Create an IAM user in the AWS CloudFormation template that has the required permissions to read and write from the DynamoDB tables. Use the GetAtt function to retrieve the access and secret keys, and pass them to the application instances through the user data.

Answer: B

NEW QUESTION 135
A company wants to run its critical applications in containers to meet requirements for scalability and availability. The company prefers to focus on maintenance of the critical applications. The company does not want to be responsible for provisioning and managing the underlying infrastructure that runs the containerized workload.
What should a solutions architect do to meet these requirements?

A. Use Amazon EC2 instances, and install Docker on the instances.
B.
Use Amazon Elastic Container Service (Amazon ECS) on Amazon EC2 worker nodes.
C. Use Amazon Elastic Container Service (Amazon ECS) on AWS Fargate.
D. Use Amazon EC2 instances from an Amazon Elastic Container Service (Amazon ECS)-optimized Amazon Machine Image (AMI).

Answer: C

Explanation:
Amazon ECS on AWS Fargate meets the scalability and availability requirements without the company having to provision and manage the underlying infrastructure that runs the containerized workload. https://docs.aws.amazon.com/AmazonECS/latest/userguide/what-is-fargate.html

NEW QUESTION 139
A company has an application with a REST-based interface that allows data to be received in near-real time from a third-party vendor. Once received, the application processes and stores the data for further analysis. The application is running on Amazon EC2 instances.
The third-party vendor has received many 503 Service Unavailable errors when sending data to the application. When the data volume spikes, the compute capacity reaches its maximum limit and the application is unable to process all requests.
Which design should a solutions architect recommend to provide a more scalable solution?

A. Use Amazon Kinesis Data Streams to ingest the data. Process the data using AWS Lambda functions.
B. Use Amazon API Gateway on top of the existing application. Create a usage plan with a quota limit for the third-party vendor.
C. Use Amazon Simple Notification Service (Amazon SNS) to ingest the data. Put the EC2 instances in an Auto Scaling group behind an Application Load Balancer.
D.
Repackage the application as a container. Deploy the application using Amazon Elastic Container Service (Amazon ECS) with the EC2 launch type and an Auto Scaling group.

Answer: A

NEW QUESTION 140
The DNS provider that hosts a company's domain name records is experiencing outages that cause service disruption for a website running on AWS. The company needs to migrate to a more resilient managed DNS service and wants the service to run on AWS.
What should a solutions architect do to rapidly migrate the DNS hosting service?

A. Create an Amazon Route 53 public hosted zone for the domain name. Import the zone file containing the domain records hosted by the previous provider.
B. Create an Amazon Route 53 private hosted zone for the domain name. Import the zone file containing the domain records hosted by the previous provider.
C. Create a Simple AD directory in AWS. Enable zone transfer between the DNS provider and AWS Directory Service for Microsoft Active Directory for the domain records.
D. Create an Amazon Route 53 Resolver inbound endpoint in the VPC. Specify the IP addresses that the provider's DNS will forward DNS queries to. Configure the provider's DNS to forward DNS queries for the domain to the IP addresses that are specified in the inbound endpoint.

Answer: A

NEW QUESTION 141
A company needs to retain application log files for a critical application for 10 years. The application team regularly accesses logs from the past month for troubleshooting, but logs older than 1 month are rarely accessed. The application generates more than 10 TB of logs per month.
Which storage option meets these requirements MOST cost-effectively?

A. Store the logs in Amazon S3. Use AWS Backup to move logs more than 1 month old to S3 Glacier Deep Archive.
B.
Store the logs in Amazon S3. Use S3 Lifecycle policies to move logs more than 1 month old to S3 Glacier Deep Archive.
C. Store the logs in Amazon CloudWatch Logs. Use AWS Backup to move logs more than 1 month old to S3 Glacier Deep Archive.
D. Store the logs in Amazon CloudWatch Logs. Use Amazon S3 Lifecycle policies to move logs more than 1 month old to S3 Glacier Deep Archive.

Answer: B

NEW QUESTION 146
An online retail company needs to run near-real-time analytics on website traffic to analyze top-selling products across different locations. The product purchase data and the user location details are sent to a third-party application that runs on premises. The application processes the data and moves the data into the company's analytics engine. The company needs to implement a cloud-based solution to make the data available for near-real-time analytics.
Which solution will meet these requirements with the LEAST operational overhead?

A. Use Amazon Kinesis Data Streams to ingest the data. Use AWS Lambda to transform the data. Configure Lambda to write the data to Amazon OpenSearch Service (Amazon Elasticsearch Service).
B. Configure Amazon Kinesis Data Streams to write the data to an Amazon S3 bucket. Schedule an AWS Glue crawler job to enrich the data and update the AWS Glue Data Catalog. Use Amazon Athena for analytics.
C. Configure Amazon Kinesis Data Streams to write the data to an Amazon S3 bucket. Add an Apache Spark job on Amazon EMR to enrich the data in the S3 bucket and write the data to Amazon OpenSearch Service (Amazon Elasticsearch Service).
D. Use Amazon Kinesis Data Firehose to ingest the data. Enable Kinesis Data Firehose data transformation with AWS Lambda. Configure Kinesis Data Firehose to write the data to Amazon OpenSearch Service (Amazon Elasticsearch Service).

Answer: D

NEW QUESTION 147
A company has hired a solutions architect to design a reliable architecture for its application.
The application consists of one Amazon RDS DB instance and two manually provisioned Amazon EC2 instances that run web servers. The EC2 instances are located in a single Availability Zone.
What should the solutions architect do to maximize the reliability of the application infrastructure?

A. Delete one EC2 instance and enable termination protection on the other EC2 instance. Update the DB instance to be Multi-AZ, and enable deletion protection.
B. Update the DB instance to be Multi-AZ, and enable deletion protection. Place the EC2 instances behind an Application Load Balancer, and run them in an EC2 Auto Scaling group across multiple Availability Zones.
C. Create an additional DB instance along with an Amazon API Gateway and an AWS Lambda function. Configure the application to invoke the Lambda function through API Gateway. Have the Lambda function write the data to the two DB instances.
D. Place the EC2 instances in an EC2 Auto Scaling group that has multiple subnets located in multiple Availability Zones. Use Spot Instances instead of On-Demand Instances. Set up Amazon CloudWatch alarms to monitor the health of the instances. Update the DB instance to be Multi-AZ, and enable deletion protection.

Answer: B

Explanation:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-spot-instances.html

NEW QUESTION 150
A company wants to reduce the cost of its existing three-tier web architecture. The web, application, and database servers are running on Amazon EC2 instances for the development, test, and production environments. The EC2 instances average 30% CPU utilization during peak hours and 10% CPU utilization during non-peak hours.
Which EC2 instance purchasing solution will meet the company's requirements MOST cost-effectively?

A. Use Spot Instances for the production EC2 instances. Use Reserved Instances for the development and test EC2 instances.
B. Use Reserved Instances for the production EC2 instances.
Use On-Demand Instances for the development and test EC2 instances.
C. Use Spot blocks for the production EC2 instances. Use Reserved Instances for the development and test EC2 instances.
D. Use On-Demand Instances for the production EC2 instances. Use Spot blocks for the development and test EC2 instances.

Answer: B

NEW QUESTION 154
A company hosts its web applications in the AWS Cloud. The company configures Elastic Load Balancers to use certificates that are imported into AWS Certificate Manager (ACM). The company's security team must be notified 30 days before the expiration of each certificate.
What should a solutions architect recommend to meet the requirement?

A. Add a rule in ACM to publish a custom message to an Amazon Simple Notification Service (Amazon SNS) topic every day, beginning 30 days before any certificate will expire.
B. Create an AWS Config rule that checks for certificates that will expire within 30 days. Configure Amazon EventBridge (Amazon CloudWatch Events) to invoke a custom alert by way of Amazon Simple Notification Service (Amazon SNS) when AWS Config reports a noncompliant resource.
C. Use AWS Trusted Advisor to check for certificates that will expire within 30 days. Create an Amazon CloudWatch alarm that is based on Trusted Advisor metrics for check status changes. Configure the alarm to send a custom alert by way of Amazon Simple Notification Service (Amazon SNS).
D. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to detect any certificates that will expire within 30 days. Configure the rule to invoke an AWS Lambda function. Configure the Lambda function to send a custom alert by way of Amazon Simple Notification Service (Amazon SNS).

Answer: B

NEW QUESTION 158
A company uses NFS to store large video files in on-premises network-attached storage.
Each video file ranges in size from 1 MB to 500 GB. The total storage is 70 TB and is no longer growing. The company decides to migrate the video files to Amazon S3. The company must migrate the video files as soon as possible while using the least possible network bandwidth.
Which solution will meet these requirements?

A. Create an S3 bucket. Create an IAM role that has permissions to write to the S3 bucket. Use the AWS CLI to copy all files locally to the S3 bucket.
B. Create an AWS Snowball Edge job. Receive a Snowball Edge device on premises. Use the Snowball Edge client to transfer data to the device. Return the device so that AWS can import the data into Amazon S3.
C. Deploy an S3 File Gateway on premises. Create a public service endpoint to connect to the S3 File Gateway. Create an S3 bucket. Create a new NFS file share on the S3 File Gateway. Point the new file share to the S3 bucket. Transfer the data from the existing NFS file share to the S3 File Gateway.
D. Set up an AWS Direct Connect connection between the on-premises network and AWS. Deploy an S3 File Gateway on premises. Create a public virtual interface (VIF) to connect to the S3 File Gateway. Create an S3 bucket. Create a new NFS file share on the S3 File Gateway. Point the new file share to the S3 bucket. Transfer the data from the existing NFS file share to the S3 File Gateway.

Answer: B

NEW QUESTION 163
A company runs a global web application on Amazon EC2 instances behind an Application Load Balancer. The application stores data in Amazon Aurora. The company needs to create a disaster recovery solution and can tolerate up to 30 minutes of downtime and potential data loss. The solution does not need to handle the load when the primary infrastructure is healthy.
What should a solutions architect do to meet these requirements?

A.
Deploy the application with the required infrastructure elements in place. Use Amazon Route 53 to configure active-passive failover. Create an Aurora Replica in a second AWS Region.
B. Host a scaled-down deployment of the application in a second AWS Region. Use Amazon Route 53 to configure active-active failover. Create an Aurora Replica in the second Region.
C. Replicate the primary infrastructure in a second AWS Region. Use Amazon Route 53 to configure active-active failover. Create an Aurora database that is restored from the latest snapshot.
D. Back up data with AWS Backup. Use the backup to create the required infrastructure in a second AWS Region. Use Amazon Route 53 to configure active-passive failover. Create an Aurora second primary instance in the second Region.

Answer: A

NEW QUESTION 166
A company wants to create a mobile app that allows users to stream slow-motion video clips on their mobile devices. Currently, the app captures video clips and uploads the video clips in raw format into an Amazon S3 bucket. The app retrieves these video clips directly from the S3 bucket. However, the videos are large in their raw format. Users are experiencing issues with buffering and playback on mobile devices. The company wants to implement solutions to maximize the performance and scalability of the app while minimizing operational overhead.
Which combination of solutions will meet these requirements? (Select TWO.)

A. Deploy Amazon CloudFront for content delivery and caching.
B. Use AWS DataSync to replicate the video files across AWS Regions in other S3 buckets.
C. Use Amazon Elastic Transcoder to convert the video files to more appropriate formats.
D. Deploy an Auto Scaling group of Amazon EC2 instances in Local Zones for content delivery and caching.
E.
Deploy an Auto Scaling group of Amazon EC2 instances to convert the video files to more appropriate formats.

Answer: AC

NEW QUESTION 171
A company is building a solution that will report Amazon EC2 Auto Scaling events across all the applications in an AWS account. The company needs to use a serverless solution to store the EC2 Auto Scaling status data in Amazon S3. The company then will use the data in Amazon S3 to provide near-real-time updates in a dashboard. The solution must not affect the speed of EC2 instance launches.
How should the company move the data to Amazon S3 to meet these requirements?

A. Use an Amazon CloudWatch metric stream to send the EC2 Auto Scaling status data to Amazon Kinesis Data Firehose. Store the data in Amazon S3.
B. Launch an Amazon EMR cluster to collect the EC2 Auto Scaling status data and send the data to Amazon Kinesis Data Firehose. Store the data in Amazon S3.
C. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to invoke an AWS Lambda function on a schedule. Configure the Lambda function to send the EC2 Auto Scaling status data directly to Amazon S3.
D. Use a bootstrap script during the launch of an EC2 instance to install Amazon Kinesis Agent. Configure Kinesis Agent to collect the EC2 Auto Scaling status data and send the data to Amazon Kinesis Data Firehose. Store the data in Amazon S3.
Answer: A

NEW QUESTION 176
A company hosts an application on AWS Lambda functions that are invoked by an Amazon API Gateway API. The Lambda functions save customer data to an Amazon Aurora MySQL database. Whenever the company upgrades the database, the Lambda functions fail to establish database connections until the upgrade is complete. The result is that customer data is not recorded for some of the events.
A solutions architect needs to design a solution that stores customer data that is created during database upgrades.
Which solution will meet these requirements?

A. Provision an Amazon RDS proxy to sit between the Lambda functions and the database. Configure the Lambda functions to connect to the RDS proxy.
B. Increase the run time of the Lambda functions to the maximum. Create a retry mechanism in the code that stores the customer data in the database.
C. Persist the customer data to Lambda local storage. Configure new Lambda functions to scan the local storage to save the customer data to the database.
D. Store the customer data in an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Create a new Lambda function that polls the queue and stores the customer data in the database.

Answer: D

NEW QUESTION 181
A payment processing company records all voice communication with its customers and stores the audio files in an Amazon S3 bucket. The company needs to capture the text from the audio files. The company must remove from the text any personally identifiable information (PII) that belongs to customers.
What should a solutions architect do to meet these requirements?

A. Process the audio files by using Amazon Kinesis Video Streams. Use an AWS Lambda function to scan for known PII patterns.
B. When an audio file is uploaded to the S3 bucket, invoke an AWS Lambda function to start an Amazon Textract task to analyze the call recordings.
C. Configure an Amazon Transcribe transcription job with PII redaction turned on.
When an audio file is uploaded to the S3 bucket, invoke an AWS Lambda function to start the transcription job. Store the output in a separate S3 bucket.
D. Create an Amazon Connect contact flow that ingests the audio files with transcription turned on. Embed an AWS Lambda function to scan for known PII patterns. Use Amazon EventBridge (Amazon CloudWatch Events) to start the contact flow when an audio file is uploaded to the S3 bucket.

Answer: C

NEW QUESTION 183
A company has a document management application that contains PDF documents. The company hosts the application on Amazon EC2 instances. According to regulations, the instances must not have access to the internet. The application must be able to read and write to a persistent storage system that provides native versioning capabilities. A solutions architect needs to design secure storage that maximizes resiliency and facilitates data sharing across instances.
Which solution meets these requirements?

A. Place the instances in a public subnet. Use Amazon S3 for storage. Access S3 objects by using URLs.
B. Place the instances in a private subnet. Use Amazon S3 for storage. Use a VPC endpoint to access S3 objects.
C. Use the instances with a Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volume.
D. Use Amazon Elastic File System (Amazon EFS) Standard-Infrequent Access (Standard-IA) to store data and provide shared access to the instances.

Answer: B

NEW QUESTION 186
......
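As an aside to NEW QUESTION 158 above, a back-of-envelope calculation shows why a Snowball Edge device (option B) fits the "least possible network bandwidth" requirement better than an online copy. The 1 Gbps link speed and 80% utilization below are hypothetical assumptions, not figures from the question:

```python
# Rough estimate for NEW QUESTION 158: how long pushing 70 TB over the
# network would take. Link speed and utilization are assumed, not given.

def transfer_days(total_tb: float, link_gbps: float, utilization: float = 0.8) -> float:
    """Days needed to move `total_tb` terabytes over a `link_gbps` link."""
    total_bits = total_tb * 1e12 * 8              # TB -> bits (decimal units)
    effective_bps = link_gbps * 1e9 * utilization  # usable bits per second
    return total_bits / effective_bps / 86_400     # seconds -> days

days = transfer_days(70, 1.0)
print(f"~{days:.0f} days at 1 Gbps")  # roughly 8 days of saturated transfer
```

Even a fully saturated 10 Gbps link would need the better part of a day of sustained transfer, and either way the migration would consume the very bandwidth the question says must be minimized — which an offline device avoids by definition.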
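To make the S3 Lifecycle answer in NEW QUESTION 141 (option B) concrete, here is a minimal sketch of the lifecycle configuration it describes: transition objects to S3 Glacier Deep Archive after 30 days, then expire them at the end of the 10-year retention window. The rule ID and key prefix are illustrative placeholders, not values from the question:

```python
import json

# Illustrative S3 Lifecycle configuration for NEW QUESTION 141 (option B).
# "archive-app-logs" and the "logs/" prefix are made-up placeholders.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "archive-app-logs",
            "Filter": {"Prefix": "logs/"},
            "Status": "Enabled",
            "Transitions": [
                # Rarely accessed after 1 month -> Deep Archive
                {"Days": 30, "StorageClass": "DEEP_ARCHIVE"}
            ],
            # Delete after the 10-year retention period (~3650 days)
            "Expiration": {"Days": 3650},
        }
    ]
}

print(json.dumps(lifecycle_configuration, indent=2))
```

With boto3, a dictionary shaped like this could be passed to the S3 client's put_bucket_lifecycle_configuration call; that step is omitted here since it requires an AWS account and bucket.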