
AWS Certified Solutions Architect - Associate (SAA-C03) Questions and Answers

Question 4

A company is building a serverless application to process ecommerce orders. The application must handle bursts of traffic and process orders asynchronously in the order received.

Which solution will meet these requirements?

Options:

A.

Use Amazon SNS with AWS Lambda.

B.

Use Amazon SQS FIFO with AWS Lambda.

C.

Use Amazon SQS standard with AWS Batch.

D.

Use Amazon SNS with AWS Batch.
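Option B's ordering guarantee comes down to two parameters on the SQS SendMessage call. A minimal sketch of how a producer could build them, assuming a hypothetical order payload (the FIFO queue itself, e.g. `orders.fifo`, is created separately):

```python
import hashlib
import json

def build_fifo_message(order: dict) -> dict:
    """Build parameters for an SQS send_message call to a FIFO queue.

    MessageGroupId preserves arrival order within a group; MessageDeduplicationId
    (a content hash here) suppresses accidental duplicate sends.
    """
    body = json.dumps(order, sort_keys=True)
    return {
        "MessageBody": body,
        # One group for all orders gives strict global ordering; per-customer
        # group IDs would trade that for more Lambda concurrency.
        "MessageGroupId": "orders",
        "MessageDeduplicationId": hashlib.sha256(body.encode()).hexdigest(),
    }

params = build_fifo_message({"order_id": "1001", "sku": "ABC-1", "qty": 2})
```

A Lambda event source mapping then consumes the queue, processing each message group in order.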

Question 5

An online video game company must maintain ultra-low latency for its game servers. The game servers run on Amazon EC2 instances. The company needs a solution that can handle millions of UDP internet traffic requests each second.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Configure an Application Load Balancer with the required protocol and ports. Specify the EC2 instances as targets.

B.

Configure a Gateway Load Balancer for the internet traffic. Specify the EC2 instances as targets.

C.

Configure a Network Load Balancer with the required protocol and ports. Specify the EC2 instances as targets.

D.

Launch identical game servers in separate Regions. Route traffic to both sets of servers.

Question 6

A solutions architect runs a web application on multiple Amazon EC2 instances that are in individual target groups behind an Application Load Balancer (ALB). Users can reach the application through a public website.

The solutions architect wants to allow engineers to use a development version of the website to access one specific development EC2 instance to test new features for the application. The solutions architect wants to use an Amazon Route 53 hosted zone to give the engineers access to the development instance. The solution must automatically route to the development instance even if the development instance is replaced.

Which solution will meet these requirements?

Options:

A.

Create an A record for the development website that has the value set to the ALB. Create a listener rule on the ALB that forwards requests for the development website to the target group that contains the development instance.

B.

Recreate the development instance with a public IP address. Create an A record for the development website that has the value set to the public IP address of the development instance.

C.

Create an A record for the development website that has the value set to the ALB. Create a listener rule on the ALB to redirect requests for the development website to the public IP address of the development instance.

D.

Place all the instances in the same target group. Create an A record for the development website. Set the value to the ALB. Create a listener rule on the ALB that forwards requests for the development website to the target group.
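For reference, the listener rule in option A can be written out as the structure the `elbv2` CreateRule API accepts (ListenerArn omitted; the host name and target group ARN below are hypothetical):

```python
def dev_listener_rule(host: str, target_group_arn: str, priority: int = 10) -> dict:
    """ALB listener rule: forward requests whose Host header matches `host`
    to the development target group. Because the action points at a target
    group rather than a specific instance, a replaced development instance
    is picked up automatically once it registers in the group."""
    return {
        "Priority": priority,
        "Conditions": [
            {"Field": "host-header", "HostHeaderConfig": {"Values": [host]}},
        ],
        "Actions": [
            {"Type": "forward", "TargetGroupArn": target_group_arn},
        ],
    }

rule = dev_listener_rule(
    "dev.example.com",
    "arn:aws:elasticloadbalancing:us-east-1:123456789012:targetgroup/dev/abc123",
)
```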

Question 7

A global ecommerce company runs its critical workloads on AWS. The workloads use an Amazon RDS for PostgreSQL DB instance that is configured for a Multi-AZ deployment.

Customers have reported application timeouts when the company undergoes database failovers. The company needs a resilient solution to reduce failover time.

Which solution will meet these requirements?

Options:

A.

Create an Amazon RDS Proxy. Assign the proxy to the DB instance.

B.

Create a read replica for the DB instance. Move the read traffic to the read replica.

C.

Enable Performance Insights. Monitor the CPU load to identify the timeouts.

D.

Take regular automatic snapshots. Copy the automatic snapshots to multiple AWS Regions.

Question 8

A company regularly receives route status updates from its delivery trucks as events in Amazon EventBridge. The company is building an API-based application in a VPC that will consume and process the events to create a delivery status dashboard. The API application must not be available by using public IP addresses because of security and compliance requirements.

How should the company send events from EventBridge to the API application?

Options:

A.

Create an AWS Lambda function that runs in the same VPC as the API application. Configure the function as an EventBridge target. Use the function to send events to the API.

B.

Create an internet-facing Application Load Balancer (ALB) in front of the API application. Associate a security group with rules that block access from all external sources except for EventBridge. Configure the ALB as an EventBridge target.

C.

Create an internet-facing Network Load Balancer (NLB) in front of the API application. Associate a security group with rules that block access from all external sources except for EventBridge. Configure the NLB as an EventBridge target.

D.

Use the application API endpoint in the VPC as a target for EventBridge. Send events directly to the application API endpoint from EventBridge.

Question 9

A company needs a solution to process customer orders from a global ecommerce platform. The solution must automatically start processing new orders immediately and must maintain a history of all order processing attempts.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.

Create an Amazon EventBridge rule that invokes an AWS Lambda function once every minute to check for new orders. Configure the Lambda function to process orders and store results in Amazon Aurora.

B.

Create an Amazon EventBridge event pattern that monitors the ecommerce platform's order events. Configure an EventBridge rule to invoke an AWS Lambda function when the platform receives a new order. Configure the function to store the results in Amazon DynamoDB.

C.

Use an Amazon EC2 instance to poll the ecommerce platform for new orders. Configure the instance to invoke an AWS Lambda function to process new orders. Configure the function to log results to Amazon CloudWatch.

D.

Use an Amazon SQS queue to invoke an AWS Lambda function when the platform receives a new order. Configure the function to process batches of orders and to store results in an Amazon EFS file system.
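Option B's event pattern is declarative JSON: EventBridge matches an event when every field in the pattern appears in the event with one of the listed values. A sketch with assumed `source` and `detail-type` names (not values defined by the question), plus a tiny matcher to illustrate the semantics:

```python
# Hypothetical pattern: the source and detail-type names are assumptions.
ORDER_EVENT_PATTERN = {
    "source": ["ecommerce.orders"],
    "detail-type": ["OrderPlaced"],
    "detail": {"status": ["NEW"]},
}

def matches(pattern: dict, event: dict) -> bool:
    """Simplified EventBridge matching: each pattern key must be present in
    the event, and the event value must be one of the candidates (nested
    dicts recurse). Real EventBridge supports more operators than this."""
    for key, candidates in pattern.items():
        if key not in event:
            return False
        if isinstance(candidates, dict):
            if not matches(candidates, event[key]):
                return False
        elif event[key] not in candidates:
            return False
    return True

hit = matches(ORDER_EVENT_PATTERN, {
    "source": "ecommerce.orders",
    "detail-type": "OrderPlaced",
    "detail": {"status": "NEW", "orderId": "1001"},
})
```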

Question 10

A company is designing a new application that uploads files to an Amazon S3 bucket. The uploaded files are processed to extract metadata.

Processing must take less than 5 seconds. The volume and frequency of the uploads vary from a few files each hour to hundreds of concurrent uploads.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Configure AWS CloudTrail trails to log Amazon S3 API calls. Use AWS AppSync to process the files.

B.

Configure an S3 event notification for new object creation within the bucket to invoke an AWS Lambda function to process the files.

C.

Configure Amazon Kinesis Data Streams to deliver the files to the S3 bucket. Invoke an AWS Lambda function to process the files.

D.

Deploy an Amazon EC2 instance. Create a script that lists all files in the S3 bucket and processes new files. Use a cron job that runs every minute to run the script.
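Option B's wiring is a single bucket notification configuration. A sketch of the structure that `put_bucket_notification_configuration` takes (the function ARN is hypothetical):

```python
def metadata_extractor_notification(function_arn: str, suffix: str = "") -> dict:
    """S3 NotificationConfiguration that invokes a Lambda function whenever
    an object is created, optionally filtered to a key suffix."""
    rule = {
        "LambdaFunctionArn": function_arn,
        "Events": ["s3:ObjectCreated:*"],  # fires once per new object
    }
    if suffix:
        rule["Filter"] = {
            "Key": {"FilterRules": [{"Name": "suffix", "Value": suffix}]}
        }
    return {"LambdaFunctionConfigurations": [rule]}

cfg = metadata_extractor_notification(
    "arn:aws:lambda:us-east-1:123456789012:function:extract-metadata"
)
```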

Question 11

A gaming company is building an application that uses a database to store user data. The company wants the database to have an active-active configuration that allows data writes to a secondary AWS Region. The database must achieve a sub-second recovery point objective (RPO).

Which solution will meet these requirements?

Options:

A.

Deploy an Amazon ElastiCache (Redis OSS) cluster. Configure a global data store for disaster recovery. Configure the ElastiCache cluster to cache data from an Amazon RDS database that is deployed in the primary Region.

B.

Deploy an Amazon DynamoDB table in the primary Region and the secondary Region. Configure Amazon DynamoDB Streams to invoke an AWS Lambda function to write changes from the table in the primary Region to the table in the secondary Region.

C.

Deploy an Amazon Aurora MySQL database in the primary Region. Configure a global database for the secondary Region.

D.

Deploy an Amazon DynamoDB table in the primary Region. Configure global tables for the secondary Region.

Question 12

A gaming company is developing a game that requires significant compute resources to process game logic, player interactions, and real-time updates. The company needs a compute solution that can dynamically scale based on fluctuating player demand while maintaining high performance. The company must use a relational database that can run complex queries.

Which solution will meet these requirements?

Options:

A.

Deploy Amazon EC2 instances to supply compute capacity. Configure Auto Scaling groups to achieve dynamic scaling based on player count. Use Amazon RDS for MySQL as the database.

B.

Refactor the game logic into small, stateless functions. Use AWS Lambda to process the game logic. Use Amazon DynamoDB as the database.

C.

Deploy an Amazon Elastic Container Service (Amazon ECS) cluster on AWS Fargate to supply compute capacity. Scale the ECS tasks based on player demand. Use Amazon Aurora Serverless v2 as the database.

D.

Use AWS ParallelCluster for high performance computing (HPC). Provision compute nodes that have GPU instances to process the game logic and player interactions. Use Amazon RDS for MySQL as the database.

Question 13

A company is building an Amazon Elastic Kubernetes Service (Amazon EKS) cluster for its workloads. All secrets that are stored in Amazon EKS must be encrypted in the Kubernetes etcd key-value store.

Which solution will meet these requirements?

Options:

A.

Create a new AWS Key Management Service (AWS KMS) key. Use AWS Secrets Manager to manage, rotate, and store all secrets in Amazon EKS.

B.

Create a new AWS Key Management Service (AWS KMS) key. Enable Amazon EKS KMS secrets encryption on the Amazon EKS cluster.

C.

Create the Amazon EKS cluster with default options. Use the Amazon Elastic Block Store (Amazon EBS) Container Storage Interface (CSI) driver as an add-on.

D.

Create a new AWS Key Management Service (AWS KMS) key with the alias/aws/ebs alias. Enable default Amazon Elastic Block Store (Amazon EBS) volume encryption for the account.
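Option B corresponds to the `encryptionConfig` block on the EKS CreateCluster call. A parameter sketch (the cluster name and key ARN are placeholders):

```python
def eks_secrets_encryption(cluster_name: str, kms_key_arn: str) -> dict:
    """CreateCluster parameters that envelope-encrypt Kubernetes secrets in
    etcd with a customer-managed KMS key; 'secrets' is the only resource
    type EKS accepts here."""
    return {
        "name": cluster_name,
        "encryptionConfig": [
            {"resources": ["secrets"], "provider": {"keyArn": kms_key_arn}},
        ],
    }

params = eks_secrets_encryption(
    "workloads-cluster",
    "arn:aws:kms:us-east-1:123456789012:key/1234abcd-12ab-34cd-56ef-1234567890ab",
)
```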

Question 14

A company uses Amazon S3 to host its static website. The company wants to add a contact form to the webpage. The contact form will have dynamic server-side components for users to input their name, email address, phone number, and user message.

The company expects fewer than 100 site visits each month. The contact form must notify the company by email when a customer fills out the form.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Host the dynamic contact form in Amazon Elastic Container Service (Amazon ECS). Set up Amazon Simple Email Service (Amazon SES) to connect to a third-party email provider.

B.

Create an Amazon API Gateway endpoint that returns the contact form from an AWS Lambda function. Configure another Lambda function on the API Gateway to publish a message to an Amazon Simple Notification Service (Amazon SNS) topic.

C.

Host the website by using AWS Amplify Hosting for static content and dynamic content. Use server-side scripting to build the contact form. Configure Amazon Simple Queue Service (Amazon SQS) to deliver the message to the company.

D.

Migrate the website from Amazon S3 to Amazon EC2 instances that run Windows Server. Use Internet Information Services (IIS) for Windows Server to host the webpage. Use client-side scripting to build the contact form. Integrate the form with Amazon WorkMail.

Question 15

A company manages an application that stores data on an Amazon RDS for PostgreSQL Multi-AZ DB instance. High traffic on the application is causing increased latency for many read queries.

A solutions architect must improve the performance of the application.

Which solution will meet this requirement?

Options:

A.

Enable Amazon RDS Performance Insights. Configure storage capacity to scale automatically.

B.

Configure the DB instance to use DynamoDB Accelerator (DAX).

C.

Create a read replica of the DB instance. Serve read traffic from the read replica.

D.

Use Amazon Data Firehose between the application and Amazon RDS to increase the concurrency of database requests.

Question 16

A company hosts an application that processes highly sensitive customer transactions on AWS. The application uses Amazon RDS as its database. The company manages its own encryption keys to secure the data in Amazon RDS.

The company needs to update the customer-managed encryption keys at least once each year.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Set up automatic key rotation in AWS Key Management Service (AWS KMS) for the encryption keys.

B.

Configure AWS Key Management Service (AWS KMS) to alert the company to rotate the encryption keys annually.

C.

Schedule an AWS Lambda function to rotate the encryption keys annually.

D.

Create an AWS CloudFormation stack to run an AWS Lambda function that deploys new encryption keys once each year.

Question 17

A company has a large amount of data in an Amazon DynamoDB table. A large batch of data is appended to the table once each day. The company wants a solution that will make all the existing and future data in DynamoDB available for analytics on a long-term basis.

Which solution meets these requirements with the LEAST operational overhead?

Options:

A.

Configure DynamoDB incremental exports to Amazon S3.

B.

Configure Amazon DynamoDB Streams to write records to Amazon S3.

C.

Configure Amazon EMR to copy DynamoDB data to Amazon S3.

D.

Configure Amazon EMR to copy DynamoDB data to Hadoop Distributed File System (HDFS).

Question 18

A telemarketing company is designing its customer call center functionality on AWS. The company needs a solution that provides multiple speaker recognition and generates transcript files. The company wants to query the transcript files to analyze the business patterns.

Which solution will meet these requirements?

Options:

A.

Use Amazon Rekognition for multiple speaker recognition. Store the transcript files in Amazon S3. Use machine learning (ML) models to analyze the transcript files.

B.

Use Amazon Transcribe for multiple speaker recognition. Use Amazon Athena to analyze the transcript files.

C.

Use Amazon Translate for multiple speaker recognition. Store the transcript files in Amazon Redshift. Use SQL queries to analyze the transcript files.

D.

Use Amazon Rekognition for multiple speaker recognition. Store the transcript files in Amazon S3. Use Amazon Textract to analyze the transcript files.

Question 19

An insurance company runs an application on premises to process contracts. The application processes jobs that consist of many tasks. The individual tasks run for up to 5 minutes. Some jobs can take up to 24 hours in total to finish. If a task fails, the task must be reprocessed.

The company wants to migrate the application to AWS. The company will use Amazon S3 as part of the solution. The company wants to configure jobs to start automatically when a contract is uploaded to an S3 bucket.

Which solution will meet these requirements?

Options:

A.

Use AWS Lambda functions to process individual tasks. Create a primary Lambda function to handle the overall job processing by calling individual Lambda functions in sequence. Configure the S3 bucket to send an event notification to invoke the primary Lambda function to begin processing.

B.

Use a state machine in AWS Step Functions to handle the overall contract processing job. Configure the S3 bucket to send an event notification to Amazon EventBridge. Create a rule in Amazon EventBridge to target the state machine.

C.

Use an AWS Batch job to handle the overall contract processing job. Configure the S3 bucket to send an event notification to initiate the Batch job.

D.

Use an S3 event notification to notify an Amazon Simple Queue Service (Amazon SQS) queue when a contract is uploaded. Configure an AWS Lambda function to read messages from the queue and to run the contract processing job.
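The reprocess-on-failure requirement maps directly to the `Retry` field of an Amazon States Language task state, which is the mechanism option B relies on. A minimal state sketch (the function ARN and retry numbers are illustrative):

```python
def contract_task_state(function_arn: str) -> dict:
    """One Step Functions Task state: a failed task is retried up to three
    times with exponential backoff before the execution fails."""
    return {
        "Type": "Task",
        "Resource": function_arn,
        "TimeoutSeconds": 300,  # individual tasks run for up to 5 minutes
        "Retry": [{
            "ErrorEquals": ["States.ALL"],  # retry on any error
            "IntervalSeconds": 5,
            "MaxAttempts": 3,
            "BackoffRate": 2.0,
        }],
        "End": True,
    }

state = contract_task_state(
    "arn:aws:lambda:us-east-1:123456789012:function:process-contract-task"
)
```

Step Functions also suits the 24-hour total job duration, which exceeds Lambda's per-invocation limit.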

Question 20

A company needs to migrate a MySQL database from an on-premises data center to AWS within 2 weeks. The database is 180 TB in size. The company cannot partition the database.

The company wants to minimize downtime during the migration. The company's internet connection speed is 100 Mbps.

Which solution will meet these requirements?

Options:

A.

Order an AWS Snowball Edge Storage Optimized device. Use AWS Database Migration Service (AWS DMS) and the AWS Schema Conversion Tool (AWS SCT) to migrate the database to Amazon RDS for MySQL and replicate ongoing changes. Send the Snowball Edge device back to AWS to finish the migration. Continue to replicate ongoing changes.

B.

Establish an AWS Site-to-Site VPN connection between the data center and AWS. Use AWS Database Migration Service (AWS DMS) and the AWS Schema Conversion Tool (AWS SCT) to migrate the database to Amazon RDS for MySQL and replicate ongoing changes.

C.

Establish a 10 Gbps dedicated AWS Direct Connect connection between the data center and AWS. Use AWS DataSync to replicate the database to Amazon S3. Create a script to import the data from Amazon S3 to a new Amazon RDS for MySQL database instance.

D.

Use the company's existing internet connection. Use AWS DataSync to replicate the database to Amazon S3. Create a script to import the data from Amazon S3 to a new Amazon RDS for MySQL database instance.
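A quick feasibility check explains why the internet-based options cannot meet the 2-week window for 180 TB:

```python
def transfer_days(size_tb: float, link_mbps: float) -> float:
    """Days needed to move size_tb terabytes over a link_mbps line,
    assuming 100% sustained utilization (a best case)."""
    bits = size_tb * 1e12 * 8            # TB -> bits (decimal units)
    seconds = bits / (link_mbps * 1e6)   # divide by line rate in bits/second
    return seconds / 86_400              # seconds -> days

days = transfer_days(180, 100)  # about 167 days over 100 Mbps
```

Even at perfect utilization the 100 Mbps link needs roughly 167 days, so a Snowball Edge device combined with DMS replication of ongoing changes is the only listed approach that fits the window.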

Question 21

A company performs a security review of its AWS workloads and finds that all the company's IAM users have the AdministratorAccess IAM managed policy directly attached. The company's IAM users belong to either an engineering department or an operations department. Engineering users require full read and write access to all resources. Operations users require only read access to all resources.

The company must apply the principle of least privilege to user access.

Which solution will meet this requirement in the MOST operationally efficient way?

Options:

A.

Create an IAM group for each department. Add either the AdministratorAccess or ReadOnlyAccess IAM managed policy to each group as appropriate. Add each department user to the appropriate IAM group. Remove existing IAM permissions from the users.

B.

Create an IAM group named Staff. Apply both the AdministratorAccess and ReadOnlyAccess IAM managed policy to the Staff IAM group. Add all IAM users to the Staff group. Remove existing IAM permissions from the users.

C.

Add the ReadOnlyAccess IAM managed policy to IAM users that belong to the operations department. Remove existing AdministratorAccess IAM permissions from the operations department users. Add a tag of Operations to the operations department IAM users.

D.

Add the ReadOnlyAccess inline policy statement to IAM users that belong to the operations department. Remove the existing AdministratorAccess IAM permissions from operations department users. Add a tag of Operations to the operations department IAM users.
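Option A's layout reduces to a small department-to-policy mapping. A sketch of the parameters that CreateGroup and AttachGroupPolicy calls would take (group names are illustrative):

```python
# Least-privilege mapping: each department gets exactly one AWS managed policy.
DEPARTMENT_POLICIES = {
    "engineering": "arn:aws:iam::aws:policy/AdministratorAccess",
    "operations": "arn:aws:iam::aws:policy/ReadOnlyAccess",
}

def group_setup(department: str) -> dict:
    """Parameters for creating an IAM group and attaching its managed policy."""
    return {
        "GroupName": department,
        "PolicyArn": DEPARTMENT_POLICIES[department],
    }

ops = group_setup("operations")
```

New hires then need only a group membership, not individually attached policies.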

Question 22


A company uses AWS Organizations to manage multiple AWS accounts. Each department in the company has its own AWS account. A security team needs to implement centralized governance and control to enforce security best practices across all accounts. The team wants to have control over which AWS services each account can use. The team needs to restrict access to sensitive resources based on IP addresses or geographic regions. The root user must be protected with multi-factor authentication (MFA) across all accounts.

Which solution will meet these requirements?

Options:

A.

Use AWS Identity and Access Management (IAM) to manage IAM users and IAM roles in each account. Implement MFA for the root user in each account. Enforce service restrictions by using AWS managed prefix lists.

B.

Use AWS Control Tower to establish a multi-account environment. Use service control policies (SCPs) to enforce service restrictions in AWS Organizations. Configure MFA for the root user across all accounts.

C.

Use AWS Systems Manager to enforce service restrictions across multiple accounts. Use IAM policies to enforce MFA for the root user across all accounts.

D.

Use AWS IAM Identity Center to manage user access and to enforce service restrictions by using permissions boundaries in each account.

Question 23

A solutions architect has created an AWS Lambda function that is written in Java. A company will use the Lambda function as a new microservice for its application. The company's customers must be able to call an HTTPS endpoint to reach the microservice. The microservice must use AWS Identity and Access Management (IAM) to authenticate calls.

Which solution will meet these requirements?

Options:

A.

Create an Amazon API Gateway REST API. Configure an API method to use the Lambda function. Create a second Lambda function that is configured as an authorizer.

B.

Create an AWS Lambda function URL for the Lambda function. Specify AWS_IAM as the authentication type.
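Option B is a single API call. A sketch of the CreateFunctionUrlConfig parameters (the function name is hypothetical):

```python
def function_url_params(function_name: str) -> dict:
    """Lambda function URL with IAM auth: callers must SigV4-sign requests
    and hold lambda:InvokeFunctionUrl permission on the function."""
    return {
        "FunctionName": function_name,
        "AuthType": "AWS_IAM",  # IAM authenticates every HTTPS call
    }

params = function_url_params("order-microservice")
```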

C.

Create an Amazon CloudFront distribution. Deploy the Lambda function to Lambda@Edge. Integrate IAM authentication logic into the Lambda@Edge function.

D.

Create an Amazon CloudFront distribution. Deploy the Lambda function to CloudFront Functions. Specify AWS_IAM as the authentication type.

Question 24

A company runs a MySQL database on a single Amazon EC2 instance.

The company needs to improve availability of the database to prepare for power outages.

Which solution will meet this requirement?

Options:

A.

Add an Application Load Balancer (ALB) in front of the EC2 instance.

B.

Configure EC2 automatic instance recovery to move the instance to another Availability Zone.

C.

Migrate the MySQL database to Amazon RDS and enable Multi-AZ deployment.

D.

Enable termination protection for the EC2 instance.

Question 25

A company has a production Amazon RDS for MySQL database. The company needs to create a new application that will read frequently changing data from the database with minimal impact on the database's overall performance. The application will rarely perform the same query more than once.

What should a solutions architect do to meet these requirements?

Options:

A.

Set up an Amazon ElastiCache cluster. Query the results in the cluster.

B.

Set up an Application Load Balancer (ALB). Query the results in the ALB.

C.

Set up a read replica for the database. Query the read replica.

D.

Set up querying of database snapshots. Query the database snapshots.

Question 26

A company runs multiple workloads in separate AWS environments. The company wants to optimize its AWS costs but must maintain the same level of performance for the environments.

The company's production environment requires resources to be highly available. The other environments do not require highly available resources.

Each environment has the same set of networking components, including the following:

• 1 VPC

• 1 Application Load Balancer

• 4 subnets distributed across 2 Availability Zones (2 public subnets and 2 private subnets)

• 2 NAT gateways (1 in each public subnet)

• 1 internet gateway

Which solution will meet these requirements?

Options:

A.

Do not change the production environment workload. For each non-production workload, remove one NAT gateway and update the route tables for private subnets to target the remaining NAT gateway for the destination 0.0.0.0/0.

B.

Reduce the number of Availability Zones that all workloads in all environments use.

C.

Replace every NAT gateway with a t4g.large NAT instance. Update the route tables for each private subnet to target the NAT instance that is in the same Availability Zone for the destination 0.0.0.0/0.

D.

In each environment, create one transit gateway and remove one NAT gateway. Configure routing on the transit gateway to forward traffic for the destination 0.0.0.0/0 to the remaining NAT gateway. Update private subnet route tables to target the transit gateway for the destination 0.0.0.0/0.

Question 27

A solutions architect is designing a web application that will run on Amazon EC2 instances behind an Application Load Balancer (ALB). The company strictly requires that the application be resilient against malicious internet activity and attacks, and be protected against new common vulnerabilities and exposures.

What should the solutions architect recommend?

Options:

A.

Leverage Amazon CloudFront with the ALB endpoint as the origin.

B.

Deploy an appropriate managed rule for AWS WAF and associate it with the ALB.

C.

Subscribe to AWS Shield Advanced and ensure common vulnerabilities and exposures are blocked.

D.

Configure network ACLs and security groups to allow only ports 80 and 443 to access the EC2 instances.

Question 28

A company uses a single Amazon S3 bucket to store data that multiple business applications must access. The company hosts the applications on Amazon EC2 Windows instances that are in a VPC. The company configured a bucket policy for the S3 bucket to grant the applications access to the bucket.

The company continually adds more business applications to the environment. As the number of business applications increases, the policy document becomes more difficult to manage. The S3 bucket policy document will soon reach its policy size quota. The company needs a solution to scale its architecture to handle more business applications.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.

Migrate the data from the S3 bucket to an Amazon Elastic File System (Amazon EFS) volume. Ensure that all application owners configure their applications to use the EFS volume.

B.

Deploy an AWS Storage Gateway appliance for each application. Reconfigure the applications to use a dedicated Storage Gateway appliance to access the S3 objects instead of accessing the objects directly.

C.

Create a new S3 bucket for each application. Configure S3 replication to keep the new buckets synchronized with the original S3 bucket. Instruct application owners to use their respective S3 buckets.

D.

Create an S3 access point for each application. Instruct application owners to use their respective S3 access points.

Question 29

A company tracks customer satisfaction by using surveys that the company hosts on its website. The surveys sometimes reach thousands of customers every hour. Survey results are currently sent in email messages to the company so company employees can manually review results and assess customer sentiment.

The company wants to automate the customer survey process. Survey results must be available for the previous 12 months.

Which solution will meet these requirements in the MOST scalable way?

Options:

A.

Send the survey results data to an Amazon API Gateway endpoint that is connected to an Amazon Simple Queue Service (Amazon SQS) queue. Create an AWS Lambda function to poll the SQS queue, call Amazon Comprehend for sentiment analysis, and save the results to an Amazon DynamoDB table. Set the TTL for all records to 365 days in the future.

B.

Send the survey results data to an API that is running on an Amazon EC2 instance. Configure the API to store the survey results as a new record in an Amazon DynamoDB table, call Amazon Comprehend for sentiment analysis, and save the results in a second DynamoDB table. Set the TTL for all records to 365 days in the future.

C.

Write the survey results data to an Amazon S3 bucket. Use S3 Event Notifications to invoke an AWS Lambda function to read the data and call Amazon Rekognition for sentiment analysis. Store the sentiment analysis results in a second S3 bucket. Use S3 Lifecycle policies on each bucket to expire objects after 365 days.

D.

Send the survey results data to an Amazon API Gateway endpoint that is connected to an Amazon Simple Queue Service (Amazon SQS) queue. Configure the SQS queue to invoke an AWS Lambda function that calls Amazon Lex for sentiment analysis and saves the results to an Amazon DynamoDB table. Set the TTL for all records to 365 days in the future.
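The 12-month retention in options A, B, and D is implemented by writing an epoch-seconds TTL attribute 365 days ahead on every DynamoDB item. A sketch with hypothetical attribute names:

```python
import time

SECONDS_PER_DAY = 86_400

def survey_item(survey_id: str, sentiment: str, now=None) -> dict:
    """DynamoDB item (low-level attribute-value format) whose 'expires_at'
    attribute -- assumed to be the table's configured TTL attribute -- is
    set 365 days ahead, so DynamoDB deletes the record automatically."""
    now = time.time() if now is None else now
    return {
        "survey_id": {"S": survey_id},
        "sentiment": {"S": sentiment},
        "expires_at": {"N": str(int(now + 365 * SECONDS_PER_DAY))},
    }

item = survey_item("s-1001", "POSITIVE")
```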

Question 30

A company needs to design a resilient web application to process customer orders. The web application must automatically handle increases in web traffic and application usage without affecting the customer experience or losing customer orders.

Which solution will meet these requirements?

Options:

A.

Use a NAT gateway to manage web traffic. Use Amazon EC2 Auto Scaling groups to receive, process, and store processed customer orders. Use an AWS Lambda function to capture and store unprocessed orders.

B.

Use a Network Load Balancer (NLB) to manage web traffic. Use an Application Load Balancer to receive customer orders from the NLB. Use Amazon Redshift with a Multi-AZ deployment to store unprocessed and processed customer orders.

C.

Use a Gateway Load Balancer (GWLB) to manage web traffic. Use Amazon Elastic Container Service (Amazon ECS) to receive and process customer orders. Use the GWLB to capture and store unprocessed orders. Use Amazon DynamoDB to store processed customer orders.

D.

Use an Application Load Balancer to manage web traffic. Use Amazon EC2 Auto Scaling groups to receive and process customer orders. Use Amazon Simple Queue Service (Amazon SQS) to store unprocessed orders. Use Amazon RDS with a Multi-AZ deployment to store processed customer orders.

Question 31

A company is developing a social media application that must scale rapidly and handle long-running, ordered processes that store large amounts of relational data. Components must scale independently and evolve without downtime.

Which combination of AWS services will meet these requirements?

Options:

A.

Amazon ECS with Fargate, Amazon RDS, and Amazon SQS

B.

Amazon ECS with Fargate, Amazon RDS, and Amazon SNS

C.

AWS Lambda, Amazon DynamoDB Streams, and AWS Step Functions

D.

AWS Elastic Beanstalk, Amazon RDS, and Amazon SNS

Question 32

A company has 5 TB of datasets. The datasets consist of 1 million user profiles and 10 million connections. The user profiles have connections as many-to-many relationships. The company needs a performance-efficient way to find mutual connections up to five levels.

Which solution will meet these requirements?

Options:

A.

Use an Amazon S3 bucket to store the datasets. Use Amazon Athena to perform SQL JOIN queries to find connections.

B.

Use Amazon Neptune to store the datasets with edges and vertices. Query the data to find connections.

C.

Use an Amazon S3 bucket to store the datasets. Use Amazon QuickSight to visualize connections.

D.

Use Amazon RDS to store the datasets with multiple tables. Perform SQL JOIN queries to find connections.
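What makes option B performance-efficient: a graph store answers "connections within five hops" with a depth-limited traversal instead of five chained SQL self-joins. A plain-Python BFS sketch of that traversal over a toy adjacency list:

```python
from collections import deque

def within_hops(adj: dict, start: str, max_hops: int = 5) -> set:
    """Breadth-first search returning every profile reachable from `start`
    in at most max_hops edges -- what a Neptune/Gremlin traversal does
    natively over stored vertices and edges."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue  # do not expand past the hop limit
        for neighbor in adj.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, depth + 1))
    seen.discard(start)
    return seen

# Toy chain graph: "g" sits 6 hops from "a", so it falls outside the limit.
adj = {"a": ["b"], "b": ["c"], "c": ["d"], "d": ["e"], "e": ["f"], "f": ["g"]}
reachable = within_hops(adj, "a")
```

Mutual connections of two users are then the intersection of their two reachable sets.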

Question 33

A marketing company receives a large amount of new clickstream data in Amazon S3 from a marketing campaign. The company needs to analyze the clickstream data in Amazon S3 quickly. Then the company needs to determine whether to process the data further in the data pipeline.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create external tables in a Spark catalog. Configure jobs in AWS Glue to query the data.

B.

Configure an AWS Glue crawler to crawl the data. Configure Amazon Athena to query the data.

C.

Create external tables in a Hive metastore. Configure Spark jobs in Amazon EMR to query the data.

D.

Configure an AWS Glue crawler to crawl the data. Configure Amazon Kinesis Data Analytics to use SQL to query the data.

Question 34

A solutions architect needs to implement a solution that can handle up to 5,000 messages per second. The solution must publish messages as events to multiple consumers. The messages are up to 500 KB in size. The message consumers need to have the ability to use multiple programming languages to consume the messages with minimal latency. The solution must retain published messages for more than 3 months. The solution must enforce strict ordering of the messages.

Which solution will meet these requirements?

Options:

A.

Publish messages to an Amazon Kinesis Data Streams data stream. Enable enhanced fan-out. Ensure that consumers ingest the data stream by using dedicated throughput.

B.

Publish messages to an Amazon Simple Notification Service (Amazon SNS) topic. Ensure that consumers use an Amazon Simple Queue Service (Amazon SQS) FIFO queue to subscribe to the topic.

C.

Publish messages to Amazon EventBridge. Allow each consumer to create rules to deliver messages to the consumer's own target.

D.

Publish messages to an Amazon Simple Notification Service (Amazon SNS) topic. Ensure that consumers use Amazon Data Firehose to subscribe to the topic.

Questions 35

A company runs an HPC workload that uses a 200-TB file system on premises. The company needs to migrate this data to Amazon FSx for Lustre. Internet capacity is 10 Mbps, and all data must be migrated within 30 days.

Which solution will meet this requirement?

Options:

A.

Use AWS DMS to transfer data into S3 and link FSx for Lustre to the bucket.

B.

Deploy AWS DataSync on premises and transfer directly into FSx for Lustre.

C.

Use AWS Storage Gateway Volume Gateway to move data into FSx for Lustre.

D.

Use an AWS Snowball Edge storage-optimized device to transfer data into S3 and link FSx for Lustre to the bucket.
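
The feasibility of the internet link can be checked with quick arithmetic. This sketch assumes decimal units and an ideal, fully utilized 10 Mbps link; real-world throughput would be lower.

```python
# Ideal-case transfer time for 200 TB over a 10 Mbps internet link.
data_bits = 200e12 * 8          # 200 TB expressed in bits (decimal units)
link_bps = 10e6                 # 10 Mbps sustained, 100% utilization
seconds = data_bits / link_bps
days = seconds / 86400          # seconds per day
```

At roughly 1,852 days even under ideal conditions, an online transfer cannot meet the 30-day window, which is why an offline device is the usual approach at this scale.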

Questions 36

A company collects 10 GB of telemetry data every day from multiple devices. The company stores the data in an Amazon S3 bucket that is in a source data account.

The company has hired several consulting agencies to analyze the company's data. Each agency has a unique AWS account. Each agency requires read access to the company's data.

The company needs a secure solution to share the data from the source data account to the consulting agencies.

Which solution will meet these requirements with the LEAST operational effort?

Options:

A.

Set up an Amazon CloudFront distribution. Use the S3 bucket as the origin.

B.

Make the S3 bucket public for a limited time. Inform only the agencies that the bucket is publicly accessible.

C.

Configure cross-account access for the S3 bucket to the accounts that the agencies own.

D.

Set up an IAM user for each agency in the source data account. Grant each agency IAM user access to the company's S3 bucket.

Questions 37

A company is designing a new internal web application in the AWS Cloud. The new application must securely retrieve and store multiple employee usernames and passwords from an AWS managed service. Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Store the employee credentials in AWS Systems Manager Parameter Store. Use AWS CloudFormation and the BatchGetSecretValue API to retrieve usernames and passwords from Parameter Store.

B.

Store the employee credentials in AWS Secrets Manager. Use AWS CloudFormation and AWS Batch with the BatchGetSecretValue API to retrieve the usernames and passwords from Secrets Manager.

C.

Store the employee credentials in AWS Systems Manager Parameter Store. Use AWS CloudFormation and AWS Batch with the BatchGetSecretValue API to retrieve the usernames and passwords from Parameter Store.

D.

Store the employee credentials in AWS Secrets Manager. Use AWS CloudFormation and the BatchGetSecretValue API to retrieve the usernames and passwords from Secrets Manager.

Questions 38

A city's weather forecast team is using Amazon DynamoDB in the data tier for an application. The application has several components. The analysis component of the application requires repeated reads against a large dataset. The application has started to temporarily consume all the read capacity in the DynamoDB table and is negatively affecting other applications that need to access the same data.

Which solution will resolve this issue with the LEAST development effort?

Options:

A.

Use DynamoDB Accelerator (DAX).

B.

Use Amazon CloudFront in front of DynamoDB.

C.

Create a DynamoDB table with a local secondary index (LSI).

D.

Use Amazon ElastiCache in front of DynamoDB.

Questions 39

A company runs an application on Amazon EC2 instances. The instances need to access an Amazon RDS database by using specific credentials. The company uses AWS Secrets Manager to store the credentials that the EC2 instances must use.

Which solution will meet this requirement?

Options:

A.

Create an IAM role, and attach the role to each EC2 instance profile. Use an identity-based policy to grant the new IAM role access to the secret that contains the database credentials.

B.

Create an IAM user, and attach the user to each EC2 instance profile. Use a resource-based policy to grant the new IAM user access to the secret that contains the database credentials.

C.

Create a resource-based policy for the secret that contains the database credentials. Use EC2 Instance Connect to access the secret.

D.

Create an identity-based policy for the secret that contains the database credentials. Grant direct access to the EC2 instances.

Questions 40

A company has applications that run in an organization in AWS Organizations. The company outsources operational support of the applications. The company needs to provide access for the external support engineers without compromising security.

The external support engineers need access to the AWS Management Console. The external support engineers also need operating system access to the company's fleet of Amazon EC2 instances that run Amazon Linux in private subnets.

Which solution will meet these requirements MOST securely?

Options:

A.

Confirm that AWS Systems Manager Agent (SSM Agent) is installed on all instances. Assign an instance profile with the necessary policy to connect to Systems Manager. Use AWS IAM Identity Center to provide the external support engineers console access. Use Systems Manager Session Manager to assign the required permissions.

B.

Confirm that AWS Systems Manager Agent (SSM Agent) is installed on all instances. Assign an instance profile with the necessary policy to connect to Systems Manager. Use Systems Manager Session Manager to provide local IAM user credentials in each AWS account to the external support engineers for console access.

C.

Confirm that all instances have a security group that allows SSH access only from the external support engineers' source IP address ranges. Provide local IAM user credentials in each AWS account to the external support engineers for console access. Provide each external support engineer an SSH key pair to log in to the application instances.

D.

Create a bastion host in a public subnet. Set up the bastion host security group to allow access from only the external engineers' IP address ranges. Ensure that all instances have a security group that allows SSH access from the bastion host. Provide each external support engineer an SSH key pair to log in to the application instances. Provide local account IAM user credentials to the engineers for console access.

Questions 41

A genomics research company is designing a scalable architecture for a loosely coupled workload. Tasks in the workload are independent and can be processed in parallel. The architecture needs to minimize management overhead and provide automatic scaling based on demand.

Which solution will meet these requirements?

Options:

A.

Use a cluster of Amazon EC2 instances. Use AWS Systems Manager to manage the workload.

B.

Implement a serverless architecture that uses AWS Lambda functions.

C.

Use AWS ParallelCluster to deploy a dedicated high-performance cluster.

D.

Implement vertical scaling for each workload task.

Questions 42

A company is developing a photo-hosting application in the us-east-1 Region. The application gives users across multiple countries the ability to upload and view photos. Some photos are heavily viewed for months, while other photos are viewed for less than a week. The application allows users to upload photos that are up to 20 MB in size. The application uses photo metadata to determine which photos to display to each user.

The company needs a cost-effective storage solution to support the application.

Which solution will meet these requirements?

Options:

A.

Store the photos in Amazon DynamoDB. Turn on DynamoDB Accelerator (DAX).

B.

Store the photos in the Amazon S3 Intelligent-Tiering storage class. Store the photo metadata and the S3 location URLs in Amazon DynamoDB.

C.

Store the photos in the Amazon S3 Standard storage class. Set up an S3 Lifecycle policy to move photos older than 30 days to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Use object tags to keep track of metadata.

D.

Store the photos in an Amazon DynamoDB table. Use the DynamoDB Standard-Infrequent Access (DynamoDB Standard-IA) storage class. Store the photo metadata in Amazon ElastiCache.

Questions 43

A company runs several websites on AWS for its different brands. Each website generates tens of gigabytes of web traffic logs each day. A solutions architect needs to design a scalable solution to give the company's developers the ability to analyze traffic patterns across all the company's websites. This analysis by the developers will occur on demand once a week over the course of several months. The solution must support queries with standard SQL.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Store the logs in Amazon S3. Use Amazon Athena for analysis.

B.

Store the logs in Amazon RDS. Use a database client for analysis.

C.

Store the logs in Amazon OpenSearch Service. Use OpenSearch Service for analysis.

D.

Store the logs in an Amazon EMR cluster. Use a supported open-source framework for SQL-based analysis.

Questions 44

As part of budget planning, management wants a report of AWS billed items listed by user. The data will be used to create department budgets. A solutions architect needs to determine the most efficient way to obtain this report information.

Which solution meets these requirements?

Options:

A.

Run a query with Amazon Athena to generate the report.

B.

Create a report in Cost Explorer and download the report.

C.

Access the bill details from the billing dashboard and download the bill.

D.

Modify a cost budget in AWS Budgets to alert with Amazon Simple Email Service (Amazon SES).

Questions 45

A company is using an Amazon Redshift cluster to run analytics queries for multiple sales teams. In addition to the typical workload, on the last Monday morning of each month, thousands of users run reports. Users have reported slow response times during the monthly surge.

The company must improve query performance without impacting the availability of the Redshift cluster.

Which solution will meet these requirements?

Options:

A.

Resize the Redshift cluster by using the classic resize capability of Amazon Redshift before every monthly surge. Reduce the cluster to its original size after each surge.

B.

Resize the Redshift cluster by using the elastic resize capability of Amazon Redshift before every monthly surge. Reduce the cluster to its original size after each surge.

C.

Enable the concurrency scaling feature for the Redshift cluster for specific workload management (WLM) queues.

D.

Enable Amazon Redshift Spectrum for the Redshift cluster before every monthly surge.

Questions 46

A company wants to enhance its ecommerce order-processing application that is deployed on AWS. The application must process each order exactly once without affecting the customer experience during unpredictable traffic surges.

Which solution will meet these requirements?

Options:

A.

Create an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Put all the orders in the SQS queue. Configure an AWS Lambda function as the target to process the orders.

B.

Create an Amazon Simple Notification Service (Amazon SNS) standard topic. Publish all the orders to the SNS standard topic. Configure the application as a notification target.

C.

Create a flow by using Amazon AppFlow. Send the orders to the flow. Configure an AWS Lambda function as the target to process the orders.

D.

Configure AWS X-Ray in the application to track the order requests. Configure the application to process the orders by pulling the orders from Amazon CloudWatch.

Questions 47

An analytics application runs on multiple Amazon EC2 Linux instances that use Amazon Elastic File System (Amazon EFS) Standard storage. The files vary in size and access frequency. The company accesses the files infrequently after 30 days. However, users sometimes request older files to generate reports.

The company wants to reduce storage costs for files that are accessed infrequently. The company also wants throughput to adjust based on the size of the file system. The company wants to use the TransitionToIA Amazon EFS lifecycle policy to transition files to Infrequent Access (IA) storage after 30 days.

Which solution will meet these requirements?

Options:

A.

Configure files to transition back to Standard storage when a user accesses the files again. Specify the provisioned throughput mode.

B.

Specify the provisioned throughput mode only.

C.

Configure files to transition back to Standard storage when a user accesses the files again. Specify the bursting throughput mode.

D.

Specify the bursting throughput mode only.
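
For reference, EFS lifecycle management is expressed as a small list of policy objects, matching the shape accepted by the PutLifecycleConfiguration API. The helper below only builds that structure locally; the set of valid transition values shown is a subset of the documented ones, and no API call is made.

```python
# Subset of documented TransitionToIA values (assumption: list is not exhaustive).
VALID_IA = {"AFTER_7_DAYS", "AFTER_14_DAYS", "AFTER_30_DAYS",
            "AFTER_60_DAYS", "AFTER_90_DAYS"}

def make_lifecycle(transition_to_ia="AFTER_30_DAYS", transition_back_on_access=True):
    """Build the LifecyclePolicies list in the shape EFS PutLifecycleConfiguration accepts."""
    if transition_to_ia not in VALID_IA:
        raise ValueError(f"unsupported transition value: {transition_to_ia}")
    policies = [{"TransitionToIA": transition_to_ia}]  # move cold files to IA
    if transition_back_on_access:
        # Move files back to Standard storage on first access.
        policies.append({"TransitionToPrimaryStorageClass": "AFTER_1_ACCESS"})
    return policies
```

The `TransitionToPrimaryStorageClass` entry is what implements "transition back to Standard storage when a user accesses the files again" in the options above.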

Questions 48

A company is developing an ecommerce application that will consist of a load-balanced front end, a container-based application, and a relational database. A solutions architect needs to create a highly available solution that operates with as little manual intervention as possible.

Which solutions meet these requirements? (Select TWO.)

Options:

A.

Create an Amazon RDS DB instance in Multi-AZ mode.

B.

Create an Amazon RDS DB instance and one or more replicas in another Availability Zone.

C.

Create an Amazon EC2 instance-based Docker cluster to handle the dynamic application load.

D.

Create an Amazon ECS cluster with a Fargate launch type to handle the dynamic application load.

E.

Create an Amazon ECS cluster with an Amazon EC2 launch type to handle the dynamic application load.

Questions 49

A company uses Amazon FSx for NetApp ONTAP in its primary AWS Region for CIFS and NFS file shares. Applications that run on Amazon EC2 instances access the file shares. The company needs a storage disaster recovery (DR) solution in a secondary Region. The data that is replicated in the secondary Region needs to be accessed by using the same protocols as the primary Region.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an AWS Lambda function to copy the data to an Amazon S3 bucket. Replicate the S3 bucket to the secondary Region.

B.

Create a backup of the FSx for ONTAP volumes by using AWS Backup. Copy the volumes to the secondary Region. Create a new FSx for ONTAP instance from the backup.

C.

Create an FSx for ONTAP instance in the secondary Region. Use NetApp SnapMirror to replicate data from the primary Region to the secondary Region.

D.

Create an Amazon EFS volume. Migrate the current data to the volume. Replicate the volume to the secondary Region.

Questions 50

A company hosts a web application on Amazon EC2 instances that are part of an Auto Scaling group behind an Application Load Balancer (ALB). The application experiences spikes in requests that come through the ALB throughout each day. The traffic spikes last between 15 and 20 minutes.

The company needs a solution that uses a standard or custom metric to scale the EC2 instances based on the number of requests that come from the ALB.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Configure an Amazon CloudWatch alarm to monitor the ALB RequestCount metric. Configure a simple scaling policy to scale the EC2 instances in response to the metric.

B.

Configure a predictive scaling policy based on the ALB RequestCount metric to scale the EC2 instances.

C.

Configure an Amazon CloudWatch alarm to monitor the ALB UnhealthyHostCount metric. Configure a target tracking policy to scale the EC2 instances in response to the metric.

D.

Create an Amazon CloudWatch alarm to monitor a user-defined metric for GET requests. Configure a target tracking policy threshold to scale the EC2 instances.
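
For context on how a request-count scaling policy is expressed, the dict below mirrors the TargetTrackingConfiguration shape used with the ALBRequestCountPerTarget predefined metric. The resource-label ARN fragments and the target value are illustrative assumptions; nothing is deployed here.

```python
# TargetTrackingConfiguration as passed to put_scaling_policy for an
# Auto Scaling group behind an ALB. The ResourceLabel ties the policy
# to one specific ALB target group (illustrative ARN fragments).
target_tracking_config = {
    "PredefinedMetricSpecification": {
        "PredefinedMetricType": "ALBRequestCountPerTarget",
        "ResourceLabel": "app/my-alb/1234567890abcdef/targetgroup/my-tg/0987654321fedcba",
    },
    "TargetValue": 1000.0,  # desired requests per target per minute (assumed value)
}
```

A target tracking policy reacts continuously to the metric, which suits short 15- to 20-minute spikes better than manual threshold tuning.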

Questions 51

A company uses AWS Lambda functions in a private subnet in a VPC to run application logic. The Lambda functions must not have access to the public internet. Additionally, all data communication must remain within the private network. As part of a new requirement, the application logic needs access to an Amazon DynamoDB table.

What is the MOST secure way to meet this new requirement?

Options:

A.

Provision the DynamoDB table inside the same VPC that contains the Lambda functions.

B.

Create a gateway VPC endpoint for DynamoDB to provide access to the table.

C.

Use a network ACL to only allow access to the DynamoDB table from the VPC.

D.

Use a security group to only allow access to the DynamoDB table from the VPC.
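
As background for the gateway endpoint approach, a gateway VPC endpoint can carry an endpoint policy that limits what the private network may reach. The dict below shows the JSON shape of such a policy; the table name, Region, account ID, and action list are illustrative assumptions, and no AWS resources are created.

```python
import json

# Endpoint policy attached to a gateway VPC endpoint for DynamoDB.
# Only the listed actions on one (illustrative) table are allowed,
# so traffic stays on the AWS network and never traverses the internet.
endpoint_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": ["dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:Query"],
            "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/app-table",
        }
    ],
}

policy_json = json.dumps(endpoint_policy)  # the string form the endpoint accepts
```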

Questions 52

A website uses EC2 instances with Auto Scaling and EFS. How can the company optimize costs?

Options:

A.

Reconfigure the Auto Scaling group to set a desired number of instances. Turn off scheduled scaling.

B.

Create a new launch template version that uses larger EC2 instances.

C.

Reconfigure the Auto Scaling group to use a target tracking scaling policy.

D.

Replace the EFS volume with instance store volumes.

Questions 53

An online food delivery company wants to optimize its storage costs. The company has been collecting operational data for the last 10 years in a data lake that was built on Amazon S3 by using a Standard storage class. The company does not keep data that is older than 7 years. A solutions architect frequently uses data from the past 6 months for reporting and runs queries on data from the last 2 years about once a month. Data that is more than 2 years old is rarely accessed and is only used for audit purposes.

Which combination of solutions will optimize the company's storage costs? (Select TWO.)

Options:

A.

Create an S3 Lifecycle configuration rule to transition data that is older than 6 months to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Create another S3 Lifecycle configuration rule to transition data that is older than 2 years to the S3 Glacier Deep Archive storage class.

B.

Create an S3 Lifecycle configuration rule to transition data that is older than 6 months to the S3 One Zone-Infrequent Access (S3 One Zone-IA) storage class. Create another S3 Lifecycle configuration rule to transition data that is older than 2 years to the S3 Glacier Flexible Retrieval storage class.

C.

Use the S3 Intelligent-Tiering storage class to store data instead of the S3 Standard storage class.

D.

Create an S3 Lifecycle expiration rule to delete data that is older than 7 years.

E.

Create an S3 Lifecycle configuration rule to transition data that is older than 7 years to the S3 Glacier Deep Archive storage class.
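
The lifecycle rules discussed in these options have a direct JSON shape. The dict below mirrors the request body of the S3 put_bucket_lifecycle_configuration API for a transition-plus-expiration setup; the rule IDs are illustrative, the day counts follow the stem, and no API call is made.

```python
# S3 lifecycle configuration combining tiering transitions with expiration.
lifecycle_config = {
    "Rules": [
        {
            "ID": "tier-then-archive",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # apply to the whole bucket
            "Transitions": [
                {"Days": 180, "StorageClass": "STANDARD_IA"},   # ~6 months old
                {"Days": 730, "StorageClass": "DEEP_ARCHIVE"},  # 2 years old
            ],
        },
        {
            "ID": "expire-after-7-years",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},
            "Expiration": {"Days": 2555},  # 7 years, since older data is not kept
        },
    ]
}
```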

Questions 54

A company wants to use automatic machine learning (ML) to create and visualize forecasts of complex scenarios and trends.

Which solution will meet these requirements with the LEAST management overhead?

Options:

A.

Use an AWS Glue ML job to transform the data and create forecasts. Use Amazon QuickSight to visualize the data.

B.

Use Amazon QuickSight to visualize the data. Use ML-powered forecasting in QuickSight to create forecasts.

C.

Use a prebuilt ML AMI from the AWS Marketplace to create forecasts. Use Amazon QuickSight to visualize the data.

D.

Use Amazon SageMaker AI inference pipelines to create and update forecasts. Use Amazon QuickSight to visualize the combined data.

Questions 55

A company is building a serverless web application with multiple interdependent workflows that millions of users worldwide will access. The application needs to handle bursts of traffic.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Deploy an Amazon API Gateway HTTP API with a usage plan and throttle settings. Use AWS Step Functions with a Standard Workflow.

B.

Deploy an Amazon API Gateway HTTP API with a usage plan and throttle settings. Use AWS Step Functions with an Express Workflow.

C.

Deploy an Amazon API Gateway HTTP API without a usage plan. Use AWS Step Functions with an Express Workflow.

D.

Deploy an Amazon API Gateway HTTP API without a usage plan. Use AWS Step Functions and multiple AWS Lambda functions with reserved concurrency.

Questions 56

A gaming company is building an application with Voice over IP capabilities. The application will serve traffic to users across the world. The application needs to be highly available with automated failover across AWS Regions. The company wants to minimize the latency of users without relying on IP address caching on user devices.

What should a solutions architect do to meet these requirements?

Options:

A.

Use AWS Global Accelerator with health checks.

B.

Use Amazon Route 53 with a geolocation routing policy.

C.

Create an Amazon CloudFront distribution that includes multiple origins.

D.

Create an Application Load Balancer that uses path-based routing.

Questions 57

A solutions architect needs to build a log storage solution for a client. The client has an application that produces user activity logs that track user API calls to the application. The application typically produces 50 GB of logs each day. The client needs a storage solution that makes the logs available for occasional querying and analytics.

Which solution will meet these requirements?

Options:

A.

Store user activity logs in an Amazon S3 bucket. Use Amazon Athena to perform queries and analytics.

B.

Store user activity logs in an Amazon OpenSearch Service cluster. Use OpenSearch Dashboards to perform queries and analytics.

C.

Store user activity logs in an Amazon RDS instance. Use an Open Database Connectivity (ODBC) connector to perform queries and analytics.

D.

Store user activity logs in an Amazon CloudWatch Logs log group. Use CloudWatch Logs Insights to perform queries and analytics.

Questions 58

A media company hosts a web application on AWS. The application gives users the ability to upload and view videos. The application stores the videos in an Amazon S3 bucket. The company wants to ensure that only authenticated users can upload videos. Authenticated users must have the ability to upload videos only within a specified time frame after authentication. Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Configure the application to generate IAM temporary security credentials for authenticated users.

B.

Create an AWS Lambda function that generates pre-signed URLs when a user authenticates.

C.

Develop a custom authentication service that integrates with Amazon Cognito to control and log direct S3 bucket access through the application.

D.

Use AWS Security Token Service (AWS STS) to assume a pre-defined IAM role that grants authenticated users temporary permissions to upload videos directly to the S3 bucket.

Questions 59

A company needs to ensure that an IAM group that contains database administrators can perform operations only within Amazon RDS. The company must ensure that the members of the IAM group cannot access any other AWS services.

Which solution will meet these requirements?

Options:

A.

Create an IAM policy that includes a statement that has the Effect "Allow" and the Action "rds:*". Attach the IAM policy to the IAM group.

B.

Create an IAM policy that includes two statements. Configure the first statement to have the Effect "Allow" and the Action "rds:*". Configure the second statement to have the Effect "Deny" and the Action "*". Attach the IAM policy to the IAM group.

C.

Create an IAM policy that includes a statement that has the Effect "Deny" and the NotAction "rds:*". Attach the IAM policy to the IAM group.

D.

Create an IAM policy with a statement that includes the Effect "Allow" and the Action "rds:*". Include a permissions boundary that has the Effect "Allow" and the Action "rds:*". Attach the IAM policy to the IAM group.
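
The NotAction construction referenced in option C has a compact JSON form. The sketch below builds such a policy document as a plain Python dict (nothing is created in AWS); pairing an explicit Deny on NotAction "rds:*" with an Allow on "rds:*" is the standard way to confine a principal to a single service namespace.

```python
def rds_only_policy():
    """Build an IAM policy document that confines a group to Amazon RDS.

    The explicit Deny matches every action OUTSIDE the rds: namespace
    (NotAction), so it blocks all other services without blocking RDS.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {"Effect": "Allow", "Action": "rds:*", "Resource": "*"},
            {"Effect": "Deny", "NotAction": "rds:*", "Resource": "*"},
        ],
    }
```

By contrast, a Deny on Action "*" would override the Allow and block RDS as well, since an explicit Deny always wins in IAM evaluation.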

Questions 60

A company is designing a microservice-based architecture for a new application on AWS. Each microservice will run on its own set of Amazon EC2 instances. Each microservice will need to interact with multiple AWS services such as Amazon S3 and Amazon Simple Queue Service (Amazon SQS).

The company wants to manage permissions for each EC2 instance based on the principle of least privilege.

Which solution will meet this requirement?

Options:

A.

Assign an IAM user to each microservice. Use access keys stored within the application code to authenticate AWS service requests.

B.

Create a single IAM role that has permission to access all AWS services. Associate the IAM role with all EC2 instances that run the microservices.

C.

Use AWS Organizations to create a separate account for each microservice. Manage permissions at the account level.

D.

Create individual IAM roles based on the specific needs of each microservice. Associate the IAM roles with the appropriate EC2 instances.

Questions 61

A company is designing a new ecommerce application for a high-traffic retail website. The application needs to process a large volume of customer orders. The application must scale to handle spikes in order volume during peak shopping events.

Which solution will meet these requirements?

Options:

A.

Use a single large Amazon EC2 instance to run processing logic and to store order information. Run a relational database on the same EC2 instance.

B.

Use a single Amazon EC2 instance to run processing logic. Control the flow of orders into the EC2 instance by using an Amazon SQS queue. Use an Amazon S3 bucket to store order information.

C.

Use an Amazon API Gateway HTTP API and an AWS Lambda function to process orders. Use Amazon DynamoDB in on-demand mode to store order information.

D.

Use an Application Load Balancer (ALB) to distribute order processing traffic across multiple Amazon EC2 instances that run processing logic. Use Amazon Aurora with multiple reader nodes as the database.

Questions 62

A company is building a data analysis platform on AWS by using AWS Lake Formation. The platform will ingest data from different sources such as Amazon S3 and Amazon RDS. The company needs a secure solution to prevent access to portions of the data that contain sensitive information.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an IAM role that includes permissions to access Lake Formation tables.

B.

Create data filters to implement row-level security and cell-level security.

C.

Create an AWS Lambda function that removes sensitive information before Lake Formation ingests the data.

D.

Create an AWS Lambda function that periodically queries and removes sensitive information from Lake Formation tables.

Questions 63

A company runs an application on Amazon EC2 instances behind an Application Load Balancer (ALB). The company uses Amazon Route 53 to route traffic to the ALB. The ALB is a resource in an AWS Shield Advanced protection group.

The company is preparing for a blue/green deployment in which traffic will shift to a new ALB. The company wants to protect against DDoS attacks during the deployment.

Which solution will meet this requirement?

Options:

A.

Add the new ALB to the Shield Advanced protection group. Select Sum as the aggregation type for the volume of traffic for the whole group.

B.

Add the new ALB to the Shield Advanced protection group. Select Mean as the aggregation type for the volume of traffic for the whole group.

C.

Create a new Shield Advanced protection group. Add the new ALB to the new protection group. Select Sum as the aggregation type for the volume of traffic.

D.

Set up an Amazon CloudFront distribution. Add the CloudFront distribution and the new ALB to the Shield Advanced protection group. Select Max as the aggregation type for the volume of traffic for the whole group.

Questions 64

A company uses Amazon EC2 instances to host its internal systems. As part of a deployment operation, an administrator tries to use the AWS CLI to terminate an EC2 instance. However, the administrator receives a 403 (Access Denied) error message.

The administrator is using an IAM role that has the following IAM policy attached:

What is the cause of the unsuccessful request?

Options:

A.

The EC2 instance has a resource-based policy with a Deny statement.

B.

The principal has not been specified in the policy statement.

C.

The "Action" field does not grant the actions that are required to terminate the EC2 instance.

D.

The request to terminate the EC2 instance does not originate from the CIDR blocks 192.0.2.0/24 or 203.0.113.0/24.

Questions 65

A solutions architect needs to ensure that only resources in VPC vpc-11aabb22 can access an S3 bucket in account 123456789012 with Block Public Access enabled.

Which solution meets this requirement?

Options:

A.

Create a bucket policy with Deny and a Condition using "StringNotEquals": {"aws:SourceVpc": "vpc-11aabb22"}.

B.

Create a bucket policy with Allow and Resource "arn:aws:ec2:us-west-2:123456789012:vpc/vpc-11aabb22".

C.

Create a bucket policy with Allow and a Condition using "StringNotEquals": {"aws:SourceVpc": "vpc-11aabb22"}.

D.

Create a bucket policy with Deny and "StringNotEquals": {"aws:PrincipalAccount": "123456789012"}.
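
The aws:SourceVpc condition-key pattern referenced in these options looks like the following as JSON. The bucket name is an illustrative assumption; the structure is a Deny statement that matches any request whose source VPC is not vpc-11aabb22, and no policy is actually attached here.

```python
# Bucket policy denying S3 access from anywhere except one VPC.
# StringNotEquals makes the Deny match every request that does NOT
# arrive through the named VPC (via its gateway endpoint).
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyAccessFromOutsideVpc",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::example-bucket",      # bucket name is illustrative
                "arn:aws:s3:::example-bucket/*",
            ],
            "Condition": {"StringNotEquals": {"aws:SourceVpc": "vpc-11aabb22"}},
        }
    ],
}
```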

Questions 66

A global media streaming company is migrating its user authentication and content delivery services to AWS. The company wants to use Amazon API Gateway for user authentication and authorization. The company needs a solution that restricts API access to AWS Regions in the United States and ensures minimal latency.

Which solution will meet these requirements?

Options:

A.

Create an API Gateway REST API. Configure an AWS WAF firewall in the same Region. Implement AWS WAF rules to deny requests that originate from Regions outside the United States. Associate the AWS WAF firewall with the API Gateway REST API.

B.

Create an API Gateway HTTP API. Configure an AWS WAF firewall in a different Region. Implement AWS WAF rules to deny requests that originate from Regions outside the United States. Associate the AWS WAF firewall with the API Gateway HTTP API.

C.

Create an API Gateway REST API. Configure an AWS WAF firewall in a different Region. Implement AWS WAF rules to deny requests that originate from Regions outside the United States. Associate the AWS WAF firewall with the API Gateway REST API.

D.

Create an API Gateway HTTP API. Configure an AWS WAF firewall in the same Region. Implement AWS WAF rules to deny requests that originate from Regions outside the United States. Associate the AWS WAF firewall with the API Gateway HTTP API.

Questions 67

A company is deploying an application that processes streaming data in near-real time. The company plans to use Amazon EC2 instances for the workload. The network architecture must be configurable to provide the lowest possible latency between nodes.

Which networking solution meets these requirements?

Options:

A.

Place the EC2 instances in multiple VPCs, and configure VPC peering.

B.

Attach an Elastic Fabric Adapter (EFA) to each EC2 instance.

C.

Run the EC2 instances in a spread placement group.

D.

Use Amazon Elastic Block Store (Amazon EBS) optimized instance types.

Questions 68

A company wants to store a large amount of data as objects for analytics and long-term archiving. Resources from outside AWS need to access the data. The external resources need to access the data with unpredictable frequency. However, the external resource must have immediate access when necessary.

The company needs a cost-optimized solution that provides high durability and data security.

Which solution will meet these requirements?

Options:

A.

Store the data in Amazon S3 Standard. Apply S3 Lifecycle policies to transition older data to S3 Glacier Deep Archive.

B.

Store the data in Amazon S3 Intelligent-Tiering.

C.

Store the data in Amazon S3 Glacier Flexible Retrieval. Use expedited retrieval to provide immediate access when necessary.

D.

Store the data in Amazon Elastic File System (Amazon EFS) Infrequent Access (IA). Use lifecycle policies to archive older files.

Questions 69

A company has built an application that uses an Amazon Simple Queue Service (Amazon SQS) standard queue and an AWS Lambda function. The Lambda function writes messages to the SQS queue. The company needs a solution to ensure that the consumer of the SQS queue never receives duplicate messages.

Which solution will meet this requirement with the FEWEST changes to the current architecture?

Options:

A.

Modify the SQS queue to enable long polling for the queue.

B.

Delete the existing SQS queue. Recreate the queue as a FIFO queue. Enable content-based deduplication for the queue.

C.

Modify the SQS queue to enable content-based deduplication for the queue.

D.

Delete the SQS queue. Create an Amazon MQ message broker. Configure the broker to deduplicate messages.

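Content-based deduplication, as described in option B, works because a FIFO queue derives a deduplication ID from a SHA-256 hash of the message body and discards repeats that arrive within a 5-minute interval. A minimal local sketch of that behavior (the message bodies and timestamps are illustrative; this simulates the queue rather than calling Amazon SQS):

```python
import hashlib

DEDUP_WINDOW_SECONDS = 300  # FIFO queues deduplicate within a 5-minute interval


def dedup_id(body: str) -> str:
    """Content-based deduplication: SHA-256 hash of the message body."""
    return hashlib.sha256(body.encode("utf-8")).hexdigest()


def deliver(messages, window=DEDUP_WINDOW_SECONDS):
    """Simulate FIFO delivery for (timestamp, body) pairs.

    A body whose deduplication ID was already seen inside the window is
    dropped; outside the window it is treated as a new message.
    """
    seen = {}  # dedup_id -> timestamp of the accepted copy
    delivered = []
    for ts, body in messages:
        key = dedup_id(body)
        first = seen.get(key)
        if first is None or ts - first > window:
            seen[key] = ts
            delivered.append(body)
    return delivered


orders = [(0, "order-1001"), (1, "order-1001"), (400, "order-1001")]
print(deliver(orders))  # the retry at t=1 is dropped; t=400 is outside the window
```

After the interval elapses, an identical body is accepted again, which is why producers that may retry for longer typically supply an explicit `MessageDeduplicationId` instead of relying on content hashing.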
Questions 70

A company needs to run a critical data processing workload that uses a Python script every night. The workload takes 1 hour to finish.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Deploy an Amazon Elastic Container Service (Amazon ECS) cluster with the AWS Fargate launch type. Use the Fargate Spot capacity provider. Schedule the job to run once every night.

B.

Deploy an Amazon Elastic Container Service (Amazon ECS) cluster with the Amazon EC2 launch type. Schedule the job to run once every night.

C.

Create an AWS Lambda function that uses the existing Python code. Configure Amazon EventBridge to invoke the function once every night.

D.

Create an Amazon EC2 On-Demand Instance that runs Amazon Linux. Migrate the Python script to the instance. Use a cron job to schedule the script. Create an AWS Lambda function to start and stop the instance once every night.

Questions 71

A company stores sensitive financial information for an application in Amazon RDS for MySQL. The company requires a stateful solution to ensure that only a specific on-premises IP address can access the RDS database instances. The company wants to rotate database credentials automatically. The company does not want to hardcode the credentials into the application.

Which solution will meet these requirements?

Options:

A.

Use security groups to allow access only from the specified IP addresses. Store the database credentials in AWS Secrets Manager. Configure automatic rotation for the credentials.

B.

Use IAM policies to restrict access based on IP address. Manage database credentials in the application code. Configure an AWS Lambda function to rotate the database credentials.

C.

Use a network ACL to allow access only from the specified IP addresses. Store the database credentials in an encrypted Amazon S3 bucket. Configure an AWS Lambda function to rotate the database credentials.

D.

Use security groups to allow access only from the specified IP addresses. Store the database credentials in AWS KMS. Configure automatic rotation for the credentials.

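Option A pairs two mechanisms: a security group ingress rule with a /32 CIDR admits exactly one source address (security groups are stateful, so return traffic is allowed automatically), and AWS Secrets Manager rotates the credential on a schedule. A sketch of both settings as plain parameter dictionaries (the IP address, secret name, and Lambda ARN are placeholders, and no AWS call is made here):

```python
# Security group ingress rule: only one on-premises IP may reach MySQL on
# port 3306. The /32 suffix pins the rule to a single host address.
ingress_rule = {
    "IpProtocol": "tcp",
    "FromPort": 3306,
    "ToPort": 3306,
    "IpRanges": [
        {"CidrIp": "203.0.113.10/32", "Description": "On-premises app server"}
    ],
}

# Secrets Manager rotation settings: rotate the database credential on a
# schedule by invoking a rotation Lambda function (ARN is illustrative).
rotation_config = {
    "SecretId": "prod/financial-app/mysql",
    "RotationLambdaARN": "arn:aws:lambda:us-east-1:111122223333:function:rotate-mysql",
    "RotationRules": {"AutomaticallyAfterDays": 30},
}

print(ingress_rule["IpRanges"][0]["CidrIp"].endswith("/32"))  # single-host rule
```

The application then fetches the credential from Secrets Manager at runtime instead of hardcoding it, so rotation never requires a code change.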
Questions 72

A company stores 5 PB of archived data on physical tapes. The company needs to preserve the data for another 10 years. The data center that stores the tapes has a 10 Gbps Direct Connect connection to an AWS Region. The company wants to migrate the data to AWS within the next 6 months.

Which solution will meet these requirements?

Options:

A.

Read the data from the tapes on premises. Use local storage to stage the data. Use AWS DataSync to migrate the data to Amazon S3 Glacier Flexible Retrieval storage.

B.

Use an on-premises backup application to read the data from the tapes. Use the backup application to write directly to Amazon S3 Glacier Deep Archive storage.

C.

Order multiple AWS Snowball Edge devices. Copy the physical tapes to virtual tapes on the Snowball Edge devices. Ship the Snowball Edge devices to AWS. Create an S3 Lifecycle policy to move the tapes to Amazon S3 Glacier Instant Retrieval storage.

D.

Configure an on-premises AWS Storage Gateway Tape Gateway. Create virtual tapes in the AWS Cloud. Use backup software to copy the physical tapes to the virtual tapes. Move the virtual tapes to Amazon S3 Glacier Deep Archive storage.

Questions 73

A company is building a serverless application to process clickstream data from its website. The clickstream data is sent to an Amazon Kinesis Data Streams data stream from the application web servers.

The company wants to enrich the clickstream data by joining the clickstream data with customer profile data from an Amazon Aurora Multi-AZ database. The company wants to use Amazon Redshift to analyze the enriched data. The solution must be highly available.

Which solution will meet these requirements?

Options:

A.

Use an AWS Lambda function to process and enrich the clickstream data. Use the same Lambda function to write the clickstream data to Amazon S3. Use Amazon Redshift Spectrum to query the enriched data in Amazon S3.

B.

Use an Amazon EC2 Spot Instance to poll the data stream and enrich the clickstream data. Configure the EC2 instance to use the COPY command to send the enriched results to Amazon Redshift.

C.

Use an Amazon Elastic Container Service (Amazon ECS) task with AWS Fargate Spot capacity to poll the data stream and enrich the clickstream data. Configure an Amazon EC2 instance to use the COPY command to send the enriched results to Amazon Redshift.

D.

Use Amazon Kinesis Data Firehose to load the clickstream data from Kinesis Data Streams to Amazon S3. Use AWS Glue crawlers to infer the schema and populate the AWS Glue Data Catalog. Use Amazon Athena to query the raw data in Amazon S3.

Questions 74

A company wants to migrate hundreds of gigabytes of unstructured data from an on-premises location to an Amazon S3 bucket. The company has a 100-Mbps internet connection on premises. The company needs to encrypt the data in transit to the S3 bucket. The company will store new data directly in Amazon S3.

Which solution will meet these requirements?

Options:

A.

Use AWS Database Migration Service (AWS DMS) to synchronize the on-premises data to a destination S3 bucket.

B.

Use AWS DataSync to migrate the data from the on-premises location to an S3 bucket.

C.

Use an AWS Snowball Edge device to migrate the data to an S3 bucket. Use an AWS CloudHSM key to encrypt the data on the Snowball Edge device.

D.

Set up an AWS Direct Connect connection between the on-premises location and AWS. Use the s3 cp command to move the data directly to an S3 bucket.

Questions 75

An insurance company wants to migrate an application that calculates insurance premiums to AWS. The company must run calculations immediately when a customer submits information through the application. The application usually takes 10 seconds to process a calculation.

Which solution will meet this requirement?

Options:

A.

Set up an Amazon API Gateway HTTP API to receive the data. Use an AWS Lambda function to process the data immediately.

B.

Upload the customer data to an Amazon S3 bucket. Start an Amazon EC2 Spot Instance to process every data upload.

C.

Set up AWS Transfer Family to receive the customer data. Configure an Amazon EKS job to process the customer data on a schedule.

D.

Upload the data to an Amazon S3 bucket. Invoke an AWS Batch job to process every customer data upload.

Questions 76

A company is developing a new online gaming application. The application will run on Amazon EC2 instances in multiple AWS Regions and will have a high number of globally distributed users. A solutions architect must design the application to optimize network latency for the users.

Which actions should the solutions architect take to meet these requirements? (Select TWO.)

Options:

A.

Configure AWS Global Accelerator. Create Regional endpoint groups in each Region where an EC2 fleet is hosted.

B.

Create a content delivery network (CDN) by using Amazon CloudFront. Enable caching for static and dynamic content, and specify a high expiration period.

C.

Integrate AWS Client VPN into the application. Instruct users to select which Region is closest to them after they launch the application. Establish a VPN connection to that Region.

D.

Create an Amazon Route 53 weighted routing policy. Configure the routing policy to give the highest weight to the EC2 instances in the Region that has the largest number of users.

E.

Configure an Amazon API Gateway endpoint in each Region where an EC2 fleet is hosted. Instruct users to select which Region is closest to them after they launch the application. Use the API Gateway endpoint that is closest to them.

Questions 77

A company has a VPC with multiple private subnets that host multiple applications. The applications must not be accessible to the internet. However, the applications need to access multiple AWS services. The applications must not use public IP addresses to access the AWS services.

Which solution will meet these requirements?

Options:

A.

Configure interface VPC endpoints for the required AWS services. Route traffic from the private subnets through the interface VPC endpoints.

B.

Deploy a NAT gateway in each private subnet. Route traffic from the private subnets through the NAT gateways.

C.

Deploy internet gateways in each private subnet. Route traffic from the private subnets through the internet gateways.

D.

Set up an AWS Direct Connect connection between the private subnets. Route traffic from the private subnets through the Direct Connect connection.

Questions 78

A company hosts an application on Amazon EC2 instances behind an Application Load Balancer (ALB). The company wants the application to be accessible only from inside the VPC that hosts the ALB.

The company creates an alias record of example.com in Amazon Route 53. The DNS record for the application must be resolvable only in the VPC where the application runs.

Which solution will meet these requirements?

Options:

A.

Use an internet-facing ALB. Create a Route 53 public hosted zone for the application DNS name.

B.

Use an internal ALB. Create a Route 53 public hosted zone for the application DNS name.

C.

Use an internet-facing ALB. Create a Route 53 private hosted zone for the application DNS name.

D.

Use an internal ALB. Create a Route 53 private hosted zone for the application DNS name.

Questions 79

A company runs an application on Amazon EC2 instances. EC2 instance usage is higher during daytime hours than nighttime hours.

A solutions architect wants to automatically optimize Amazon EC2 costs based on this usage pattern.

Which AWS service or purchasing option will meet this requirement?

Options:

A.

Spot Instances

B.

Reserved Instances

C.

AWS CloudFormation

D.

AWS Auto Scaling

Questions 80

An e-commerce company stores inventory, order, and user information in multiple Amazon Redshift clusters. The Redshift clusters must comply with the company's security policies. The company must receive notifications about any security configuration violations.

Which solution will meet these requirements?

Options:

A.

Create an Amazon EventBridge rule that uses the Redshift clusters as the source. Create an AWS Lambda function to evaluate the Redshift cluster security configuration. Configure the Lambda function to notify the company of any violations of the security policies. Add the Lambda function as a target of the EventBridge rule.

B.

Create an AWS Lambda function to check the validity of the Redshift cluster security configurations. Create an Amazon EventBridge rule that invokes the Lambda function when Redshift clusters are created. Notify the company of any violations of security policies.

C.

Set up Amazon Redshift Advisor in the company's AWS account to monitor cluster configurations. Configure Redshift Advisor to generate notifications for security items that the company must address.

D.

Create an AWS Lambda function to check the Redshift clusters for any violation of the security configurations. Create an AWS Config custom rule to invoke the Lambda function when Redshift cluster security configurations are modified. Provide the compliance state of each Redshift cluster to AWS Config. Configure AWS Config to notify the company of any violations of the security policies.

Questions 81

A company is testing an application that runs on an Amazon EC2 Linux instance. A single 500 GB Amazon Elastic Block Store (Amazon EBS) General Purpose SSD (gp2) volume is attached to the EC2 instance.

The company will deploy the application on multiple EC2 instances in an Auto Scaling group. All instances require access to the data that is stored in the EBS volume. The company needs a highly available and resilient solution that does not introduce significant changes to the application's code.

Which solution will meet these requirements?

Options:

A.

Provision an EC2 instance that uses NFS server software. Attach a single 500 GB gp2 EBS volume to the instance.

B.

Provision an Amazon FSx for Windows File Server file system. Configure the file system as an SMB file store within a single Availability Zone.

C.

Provision an EC2 instance with two 250 GB Provisioned IOPS SSD EBS volumes.

D.

Provision an Amazon Elastic File System (Amazon EFS) file system. Configure the file system to use General Purpose performance mode.

Questions 82

A company has AWS Lambda functions that use environment variables. The company does not want its developers to see environment variables in plaintext.

Which solution will meet these requirements?

Options:

A.

Deploy code to Amazon EC2 instances instead of using Lambda functions.

B.

Configure SSL encryption on the Lambda functions to use AWS CloudHSM to store and encrypt the environment variables.

C.

Create a certificate in AWS Certificate Manager (ACM). Configure the Lambda functions to use the certificate to encrypt the environment variables.

D.

Create an AWS Key Management Service (AWS KMS) key. Enable encryption helpers on the Lambda functions to use the KMS key to store and encrypt the environment variables.

Questions 83

A financial company is migrating its banking applications to a set of AWS accounts managed by AWS Organizations. The applications will store sensitive customer data on Amazon Elastic Block Store (Amazon EBS) volumes. The company will take regular snapshots for backup purposes.

The company wants to implement controls across all AWS accounts to prevent sharing EBS snapshots publicly.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Enable AWS Config rules for each organizational unit (OU) in Organizations to monitor EBS snapshot permissions.

B.

Enable block public access for EBS snapshots at the organization level.

C.

Create an IAM policy in the root account of the organization that prevents users from modifying snapshot permissions.

D.

Use AWS CloudTrail to track snapshot permission changes.

Questions 84

A company is building an application on an Amazon ECS cluster that uses the AWS Fargate launch type. The application must read files from a private Amazon S3 bucket.

The company needs to design a security solution to allow ECS tasks to retrieve data from the S3 bucket.

Which solution will meet these requirements with the LEAST administrative effort?

Options:

A.

Assign an inline IAM policy to the task role that is configured in the ECS task definition. Configure the policy to grant access to the S3 bucket.

B.

Create an IAM user that has programmatic access to the S3 bucket. Store the IAM user credentials as a parameter in AWS Systems Manager Parameter Store. Configure the ECS task definition to read the parameter during runtime.

C.

Assign an IAM policy to the task execution role that is configured in the ECS task definition. Configure the policy to grant access to the S3 bucket.

D.

Create an IAM user and access keys for the S3 bucket. Store the access credentials as a secret in AWS Secrets Manager. Configure the ECS task definition to read the secret during runtime.

Questions 85

A media company hosts a mobile app backend in the AWS Cloud. The company is releasing a new feature to allow users to upload short videos and apply special effects by using the mobile app. The company uses AWS Amplify to store the videos that customers upload in an Amazon S3 bucket.

The videos must be processed immediately. Users must receive a notification when processing is finished.

Which solution will meet these requirements?

Options:

A.

Use Amazon EventBridge Scheduler to schedule an AWS Lambda function to process the videos. Save the processed videos to the S3 bucket. Use Amazon Simple Notification Service (Amazon SNS) to send push notifications to customers when processing is finished.

B.

Use Amazon EventBridge Scheduler to schedule AWS Fargate to process the videos. Save the processed videos to the S3 bucket. Use Amazon Simple Notification Service (Amazon SNS) to send push notifications to customers when processing is finished.

C.

Use an S3 trigger to invoke an AWS Lambda function to process the videos. Save the processed videos to the S3 bucket. Use Amazon Simple Notification Service (Amazon SNS) to send push notifications to customers when processing is finished.

D.

Use an S3 trigger to invoke an AWS Lambda function to process the videos. Save the processed videos to the S3 bucket. Use AWS Amplify to send push notifications to customers when processing is finished.

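With the S3 trigger approach in options C and D, each upload invokes the Lambda function with an event that names the bucket and object key, and the function notifies the user when processing finishes. A minimal handler sketch (the topic ARN and the `process_video` helper are placeholders; the notification is shown as the parameter dictionary a publish call would use rather than a live call):

```python
def process_video(bucket: str, key: str) -> str:
    """Placeholder for the real effects pipeline; returns the processed key."""
    return f"processed/{key}"


def handler(event, context=None):
    # An S3 event notification carries the bucket and object key per record.
    notifications = []
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        processed_key = process_video(bucket, key)
        # Parameters a push-notification publish would use (ARN illustrative).
        notifications.append({
            "TopicArn": "arn:aws:sns:us-east-1:111122223333:video-done",
            "Message": f"Your video {key} is ready at {processed_key}",
        })
    return notifications


sample_event = {"Records": [{"s3": {"bucket": {"name": "uploads"},
                                    "object": {"key": "clips/cat.mp4"}}}]}
print(handler(sample_event)[0]["Message"])
```

Because the trigger fires on each object-created event, processing starts immediately on upload instead of waiting for a schedule.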
Questions 86

A company is planning to deploy a managed MySQL database solution for its non-production applications. The company plans to run the system for several years on AWS.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Create an Amazon RDS for MySQL instance. Purchase a Reserved Instance.

B.

Create an Amazon RDS for MySQL instance. Use the instance on an on-demand basis.

C.

Create an Amazon Aurora MySQL cluster with writer and reader nodes. Use the cluster on an on-demand basis.

D.

Create an Amazon EC2 instance. Manually install and configure MySQL Server on the instance.

Questions 87

A company uses AWS to run its ecommerce platform. The platform is critical to the company's operations and has a high volume of traffic and transactions. The company configures a multi-factor authentication (MFA) device to secure its AWS account root user credentials. The company wants to ensure that it will not lose access to the root user account if the MFA device is lost.

Which solution will meet these requirements?

Options:

A.

Set up a backup administrator account that the company can use to log in if the company loses the MFA device.

B.

Add multiple MFA devices for the root user account to handle the disaster scenario.

C.

Create a new administrator account when the company cannot access the root account.

D.

Attach the administrator policy to another IAM user when the company cannot access the root account.

Questions 88

An internal product team is deploying a new application to a private VPC in a company's AWS account. The application runs on Amazon EC2 instances that are in a security group named App1. The EC2 instances store application data in an Amazon S3 bucket and use AWS Secrets Manager to store application service credentials. The company's security policy prohibits applications in a private VPC from using public IP addresses to communicate.

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.

Configure gateway endpoints for Amazon S3 and AWS Secrets Manager.

B.

Configure interface VPC endpoints for Amazon S3 and AWS Secrets Manager.

C.

Add routes to the endpoints in the VPC route table.

D.

Associate the App1 security group with the interface VPC endpoints. Configure a self-referencing security group rule to allow inbound traffic.

E.

Associate the App1 security group with the gateway endpoints. Configure a self-referencing security group rule to allow inbound traffic.

Questions 89

A company must follow strict regulations for the management of data encryption keys. The company manages its own key externally and imports the key into AWS Key Management Service (AWS KMS). The company must control the imported key material and must rotate the key material on a regular schedule.

A solutions architect needs to import the key material into AWS KMS and rotate the key without interrupting applications that use the key.

Which solution will meet these requirements?

Options:

A.

Create a new AWS KMS key that has the same key ID as the existing key. Import new key material into the key.

B.

Schedule the existing AWS KMS key for deletion. Create a new KMS key that has new key material.

C.

Import new key material into the existing AWS KMS key. Set an expiration time for the old key material.

D.

Enable automatic key rotation for the existing AWS KMS key.

Questions 90

A company needs to save confidential medical results in an Amazon S3 bucket. The repository must allow a few approved users to add new files. The repository must restrict all other users to read-only access by using a write once, read many (WORM) approach. The company must keep every file in the repository for a minimum of 1 year after its creation date.

Which solution will meet these requirements with the LEAST implementation effort?

Options:

A.

Configure the S3 bucket with multi-factor authentication (MFA) delete. Do not share the MFA secret with users to avoid deletion.

B.

Use S3 Object Lock in compliance mode with a retention period of 1 year. Use an IAM policy that restricts file access to specified approved users.

C.

Use an IAM role to restrict all users from deleting or changing objects in the S3 bucket. Use an S3 bucket policy to only allow the IAM role.

D.

Configure the S3 bucket to invoke an AWS Lambda function every time an object is added. Configure the function to track the hash of the saved object so that modified objects can be marked accordingly.

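The Object Lock approach in option B attaches a retain-until date to each object; in compliance mode that date cannot be shortened and the lock cannot be removed, which gives the WORM behavior the scenario requires. A sketch that computes a 1-year retain-until date and the parameter shape such a retention setting would use (the bucket and key are placeholders; nothing is sent to AWS here):

```python
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 365  # keep every file at least 1 year after creation


def retention_params(bucket: str, key: str, created: datetime) -> dict:
    """Build the parameters for an S3 Object Lock retention setting.

    COMPLIANCE mode means no user, including the root user, can shorten the
    retention period or delete the object until the retain-until date passes.
    """
    return {
        "Bucket": bucket,
        "Key": key,
        "Retention": {
            "Mode": "COMPLIANCE",
            "RetainUntilDate": created + timedelta(days=RETENTION_DAYS),
        },
    }


created = datetime(2024, 1, 15, tzinfo=timezone.utc)
params = retention_params("medical-results", "labs/patient-42.pdf", created)
print(params["Retention"]["RetainUntilDate"].date())
```

In practice the retention is usually set as a bucket default so every new upload is locked automatically, while a separate IAM policy limits who may write at all.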
Questions 91

A solutions architect is building an Amazon S3 data lake for a company. The company uses Amazon Kinesis Data Firehose to ingest customer personally identifiable information (PII) and transactional data in near real-time to an S3 bucket. The company needs to mask all PII data before storing the data in the data lake.

Which solution will meet these requirements?

Options:

A.

Create an AWS Lambda function to detect and mask PII. Invoke the function from Kinesis Data Firehose.

B.

Use Amazon Macie to scan the S3 bucket. Configure Macie to detect and mask PII.

C.

Enable server-side encryption (SSE) on the S3 bucket.

D.

Create an AWS Lambda function that integrates with AWS CloudHSM. Configure the function to detect and mask PII.

Questions 92

A solutions architect manages a containerized application that is deployed on Amazon ECS. The application stores data in an Amazon DynamoDB database. The solutions architect must implement a solution to rotate the database credentials every 30 days.

Which solution will meet these requirements with the LEAST development effort?

Options:

A.

Store the credentials as an ECS environment variable. Enable encryption by using AWS KMS with scheduled rotation configured.

B.

Store the credentials as a secure string parameter in AWS Systems Manager Parameter Store. Configure automated rotation of the parameter on a schedule.

C.

Store the credentials as a secret in AWS Secrets Manager. Configure automated rotation of the secret on a schedule.

D.

Store the ciphertext as an application environment variable. Implement client-side encryption and scheduled rotation by using code.

Questions 93

A company is developing a latency-sensitive application. Part of the application includes several AWS Lambda functions that need to initialize as quickly as possible. The Lambda functions are written in Java and contain initialization code outside the handlers to load libraries, initialize classes, and generate unique IDs.

Which solution will meet the startup performance requirement MOST cost-effectively?

Options:

A.

Move all the initialization code to the handlers for each Lambda function. Activate Lambda SnapStart for each Lambda function. Configure SnapStart to reference the $LATEST version of each Lambda function.

B.

Publish a version of each Lambda function. Create an alias for each Lambda function. Configure each alias to point to its corresponding version. Set up a provisioned concurrency configuration for each Lambda function to point to the corresponding alias.

C.

Publish a version of each Lambda function. Set up a provisioned concurrency configuration for each Lambda function to point to the corresponding version. Activate Lambda SnapStart for the published versions of the Lambda functions.

D.

Update the Lambda functions to add a pre-snapshot hook. Move the code that generates unique IDs into the handlers. Publish a version of each Lambda function. Activate Lambda SnapStart for the published versions of the Lambda functions.

Questions 94

A company needs to run a critical Python data processing job each night. The job runs for approximately 1 hour and must not be interrupted.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Deploy an Amazon ECS cluster with the AWS Fargate launch type. Use the Fargate Spot capacity provider. Schedule the job to run once each night.

B.

Create an AWS Step Functions Express workflow. Define a state machine for the process. Use Amazon EventBridge to schedule the workflow.

C.

Create an AWS Lambda function that uses the existing Python code. Configure Amazon EventBridge to invoke the function once each night.

D.

Deploy an Amazon EC2 On-Demand Instance that runs Amazon Linux. Migrate the Python script to the EC2 instance. Use a cron job to schedule the script. Create an AWS Lambda function to start and stop the instance once each night.

Questions 95

A company is planning to migrate multiple workloads to Amazon EC2 instances and needs to determine an appropriate AWS account structure. The workloads must be isolated from one another and belong to separate business units. The company needs to be able to perform chargeback to the business units by using a consolidated monthly view.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.

Create a separate standalone AWS account for each business unit. Create a script to call AWS Cost Explorer APIs from each account to perform chargeback.

B.

Create a single organization in AWS Organizations. Create a member account for each business unit. Use the bill from the organization management account to perform chargeback.

C.

Create a single AWS account for all the business units. Assign tags to the EC2 instances that correspond with the business units. Activate the tags for cost allocation to perform chargeback by using AWS Cost Explorer.

D.

Create a separate organization in AWS Organizations for each business unit. Use the bill in each organization management account to perform chargeback.

Questions 96

A company stores medical reports and images in Amazon S3 Standard storage. The company accesses each medical report only once each year. However, the company must be able to access the medical reports in real time when necessary. The company rarely accesses the medical images, but the company must retain each image for 7 years. The company can tolerate flexible retrieval times for the medical images.

The company wants to optimize storage costs for the medical reports and images.

Which solution will meet this requirement MOST cost-effectively?

Options:

A.

Store the medical reports and images in S3 Glacier Deep Archive.

B.

Store the medical reports in S3 Glacier Instant Retrieval. Store the medical images in S3 Glacier Deep Archive.

C.

Store the medical reports in S3 Intelligent-Tiering. Store the medical images in S3 Glacier Deep Archive.

D.

Store the medical reports in S3 Glacier Flexible Retrieval. Store the medical images in S3 Glacier Deep Archive.

Questions 97

A company needs to allow a vendor to access CloudWatch Logs in the company’s AWS account by using IAM roles for cross-account access.

Which solution will meet these requirements?

Options:

A.

Create roles in both accounts and trust the company role.

B.

Create a role in the vendor account and trust the company role.

C.

Create a role in the company account and trust the company role.

D.

Create a role in the company account with permissions and trust the vendor role.

Questions 98

A marketing team wants to build a campaign for an upcoming multi-sport event. The team has news reports from the past five years in PDF format. The team needs a solution to extract insights about the content and the sentiment of the news reports. The solution must use Amazon Textract to process the news reports.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Provide the extracted insights to Amazon Athena for analysis. Store the extracted insights and analysis in an Amazon S3 bucket.

B.

Store the extracted insights in an Amazon DynamoDB table. Use Amazon SageMaker to build a sentiment model.

C.

Provide the extracted insights to Amazon Comprehend for analysis. Save the analysis to an Amazon S3 bucket.

D.

Store the extracted insights in an Amazon S3 bucket. Use Amazon QuickSight to visualize and analyze the data.

Questions 99

A company runs multiple applications on Amazon EC2 instances in a VPC.

Application A runs in a private subnet that has a custom route table and network ACL.

Application B runs in a second private subnet in the same VPC.

The company needs to prevent Application A from sending traffic to Application B.

Which solution will meet this requirement?

Options:

A.

Add a deny outbound rule to a security group associated with Application B. Configure the rule to prevent Application B from sending traffic to Application A.

B.

Add a deny outbound rule to a security group associated with Application A. Configure the rule to prevent Application A from sending traffic to Application B.

C.

Add a deny outbound rule to the custom network ACL for the Application B subnet. Configure the rule to prevent Application B from sending traffic to the IP addresses associated with Application A.

D.

Add a deny outbound rule to the custom network ACL for the Application A subnet. Configure the rule to prevent Application A from sending traffic to the IP addresses associated with Application B.

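Option D works at the subnet boundary: network ACL rules are stateless and are evaluated in ascending rule-number order, so a low-numbered deny egress rule blocks traffic toward Application B's CIDR before any broader allow rule is reached. (Security groups, by contrast, support only allow rules, which rules out options A and B.) A sketch of such a rule as a plain parameter dictionary (the ACL ID and CIDR block are placeholders; no AWS call is made):

```python
# Deny outbound (egress) traffic from Application A's subnet toward
# Application B's CIDR range. All identifiers below are illustrative.
deny_rule = {
    "NetworkAclId": "acl-0123456789abcdef0",
    "RuleNumber": 50,           # evaluated before higher-numbered allow rules
    "Protocol": "-1",           # all protocols
    "RuleAction": "deny",
    "Egress": True,             # outbound from the subnet
    "CidrBlock": "10.0.2.0/24", # Application B's subnet
}

allow_rule_number = 100  # a typical broad allow rule comes after the deny
print(deny_rule["RuleNumber"] < allow_rule_number)  # the deny is matched first
```

Because NACLs are stateless, a matching deny on egress is enough; no corresponding ingress change is needed in Application B's subnet.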
Questions 100

A company is migrating its on-premises Oracle database to an Amazon RDS for Oracle database. The company needs to retain data for 90 days to meet regulatory requirements. The company must also be able to restore the database to a specific point in time for up to 14 days.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create Amazon RDS automated backups. Set the retention period to 90 days.

B.

Create an Amazon RDS manual snapshot every day. Delete manual snapshots that are older than 90 days.

C.

Use the Amazon Aurora Clone feature for Oracle to create a point-in-time restore. Delete clones that are older than 90 days.

D.

Create a backup plan that has a retention period of 90 days by using AWS Backup for Amazon RDS.

Questions 101

A company wants to reduce the cost of its existing three-tier web application. The web servers, application servers, and database servers run on Amazon EC2 On-Demand instances in development, test, and production environments. The EC2 instances average 30% CPU utilization during peak hours and 10% CPU utilization during non-peak hours.

The production EC2 instances run 24 hours a day all year. The development and test EC2 instances run for at least 8 hours a day all year. The company wants to implement automation to stop the development and test EC2 instances when those EC2 instances are not in use.

Which EC2 instance purchasing solution will meet these requirements MOST cost-effectively?

Options:

A.

Use Reserved Instances for the production EC2 instances. Use Reserved Instances for the development and test EC2 instances.

B.

Use Reserved Instances for the production EC2 instances. Use On-Demand Instances for the development and test EC2 instances.

C.

Use a Spot Fleet for the production EC2 instances. Use Reserved Instances for the development and test EC2 instances.

D.

Use On-Demand Instances for the production EC2 instances. Use a Spot Fleet for the development and test EC2 instances.

Questions 102

A company deployed a three-tier web application in a single Availability Zone in the us-east-1 Region on a single Amazon EC2 instance. Usage of the application is growing.

A solutions architect needs to ensure that the application can handle the growing amount of traffic and that the application is resilient. The solution must be cost-effective.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Create two additional EC2 instances spread across two separate Availability Zones. Create an Application Load Balancer (ALB). Configure the ALB to route traffic to a target group that contains all three instances. Create an Amazon CloudWatch alarm to scale the EC2 instances vertically to handle the application traffic.

B.

Create eight additional EC2 instances spread across three separate Availability Zones. Create an Application Load Balancer (ALB). Configure the ALB to route traffic to a target group that contains all nine instances. Create an Amazon CloudWatch alarm to scale the EC2 instances horizontally to handle the application traffic.

C.

Create an EC2 Auto Scaling group that contains a minimum of three EC2 instances in the same Availability Zone. Create an Application Load Balancer (ALB). Configure the ALB to route traffic to a target group that contains all the instances. Configure scheduled scaling for the Auto Scaling group.

D.

Create an EC2 Auto Scaling group that contains a minimum of three EC2 instances spread across Availability Zones. Create an Application Load Balancer (ALB). Configure the ALB to route traffic to a target group that contains all the instances. Create an Amazon CloudWatch alarm to scale the EC2 instances horizontally to handle the application traffic.

Questions 103

A weather forecasting company collects temperature readings from various sensors on a continuous basis. An existing data ingestion process collects the readings and aggregates the readings into larger Apache Parquet files. Then the process encrypts the files by using client-side encryption with KMS managed keys (CSE-KMS). Finally, the process writes the files to an Amazon S3 bucket with separate prefixes for each calendar day.

The company wants to run occasional SQL queries on the data to take sample moving averages for a specific calendar day.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Configure Amazon Athena to read the encrypted files. Run SQL queries on the data directly in Amazon S3.

B.

Use Amazon S3 Select to run SQL queries on the data directly in Amazon S3.

C.

Configure Amazon Redshift to read the encrypted files. Use Redshift Spectrum and Redshift query editor v2 to run SQL queries on the data directly in Amazon S3.

D.

Configure Amazon EMR Serverless to read the encrypted files. Use Apache SparkSQL to run SQL queries on the data directly in Amazon S3.

Questions 104

A company is building a new application that uses multiple serverless architecture components. The application architecture includes an Amazon API Gateway REST API and AWS Lambda functions to manage incoming requests.

The company needs a service to send messages that the REST API receives to multiple target Lambda functions for processing. The service must filter messages so each target Lambda function receives only the messages the function needs.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Send the requests from the REST API to an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe multiple Amazon Simple Queue Service (Amazon SQS) queues to the SNS topic. Configure the target Lambda functions to poll the SQS queues.
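To make option A's fan-out concrete, each SQS queue subscription to the SNS topic can carry a filter policy so its Lambda consumer receives only the messages it needs. The filter-policy shape below follows SNS's JSON syntax; the `event_type` attribute name is a hypothetical example, not something the question specifies.

```python
import json

# Sketch of an SNS subscription filter policy for the fan-out in option A.
# SNS matches the policy against message attributes at publish time, so each
# queue (and its Lambda consumer) sees only the message types listed here.

def make_filter_policy(allowed_types: list[str]) -> str:
    """SNS filter policies are JSON documents keyed by message attribute name."""
    return json.dumps({"event_type": allowed_types})

orders_policy = make_filter_policy(["order_created", "order_updated"])
# Applied when subscribing the queue:
# sns.subscribe(TopicArn=topic_arn, Protocol="sqs", Endpoint=queue_arn,
#               Attributes={"FilterPolicy": orders_policy})
```

This keeps the filtering inside the managed service, which is what gives option A its low operational overhead compared with filtering in EC2 or in the functions themselves.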

B.

Send the requests from the REST API to a set of Amazon EC2 instances that are configured to process messages. Configure the instances to filter messages and to invoke the target Lambda functions.

C.

Send the requests from the REST API to Amazon Managed Streaming for Apache Kafka (Amazon MSK). Configure Amazon MSK to publish the messages to the target Lambda functions.

D.

Send the requests from the REST API to multiple Amazon Simple Queue Service (Amazon SQS) queues. Configure the target Lambda functions to poll the SQS queues.

Questions 105

A company is using Amazon CloudFront with its website. The company has enabled logging on the CloudFront distribution, and logs are saved in one of the company's Amazon S3 buckets. The company needs to perform advanced analyses on the logs and build visualizations.

What should a solutions architect do to meet these requirements?

Options:

A.

Use standard SQL queries in Amazon Athena to analyze the CloudFront logs in the S3 bucket. Visualize the results with AWS Glue.

B.

Use standard SQL queries in Amazon Athena to analyze the CloudFront logs in the S3 bucket. Visualize the results with Amazon QuickSight.

C.

Use standard SQL queries in Amazon DynamoDB to analyze the CloudFront logs in the S3 bucket. Visualize the results with AWS Glue.

D.

Use standard SQL queries in Amazon DynamoDB to analyze the CloudFront logs in the S3 bucket. Visualize the results with Amazon QuickSight.

Questions 106

A company wants to design a microservices architecture for an application. Each microservice must perform operations that can be completed within 30 seconds.

The microservices need to expose RESTful APIs and must automatically scale in response to varying loads. The APIs must also provide client access control and rate limiting to maintain equitable usage and service availability.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use Amazon Elastic Container Service (Amazon ECS) on Amazon EC2 to host each microservice. Use Amazon API Gateway to manage the RESTful API requests.

B.

Deploy each microservice as a set of AWS Lambda functions. Use Amazon API Gateway to manage the RESTful API requests.

C.

Host each microservice on Amazon EC2 instances in Auto Scaling groups behind an Elastic Load Balancing (ELB) load balancer. Use the ELB to manage the RESTful API requests.

D.

Deploy each microservice on Amazon Elastic Beanstalk. Use Amazon CloudFront to manage the RESTful API requests.

Questions 107

A solutions architect is designing the architecture for a company website that is composed of static content. The company's target customers are located in the United States and Europe.

Which architecture should the solutions architect recommend to MINIMIZE cost?

Options:

A.

Store the website files on Amazon S3 in the us-east-2 Region. Use an Amazon CloudFront distribution with the price class configured to limit the edge locations in use.

B.

Store the website files on Amazon S3 in the us-east-2 Region. Use an Amazon CloudFront distribution with the price class configured to maximize the use of edge locations.

C.

Store the website files on Amazon S3 in the us-east-2 Region and the eu-west-1 Region. Use an Amazon CloudFront geolocation routing policy to route requests to the closest Region to the user.

D.

Store the website files on Amazon S3 in the us-east-2 Region and the eu-west-1 Region. Use an Amazon CloudFront distribution with an Amazon Route 53 latency routing policy to route requests to the closest Region to the user.

Questions 108

An ecommerce company is migrating its on-premises workload to the AWS Cloud. The workload currently consists of a web application and a backend Microsoft SQL database for storage.

The company expects a high volume of customers during a promotional event. The new infrastructure in the AWS Cloud must be highly available and scalable.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.

Migrate the web application to two Amazon EC2 instances across two Availability Zones behind an Application Load Balancer. Migrate the database to Amazon RDS for Microsoft SQL Server with read replicas in both Availability Zones.

B.

Migrate the web application to an Amazon EC2 instance that runs in an Auto Scaling group across two Availability Zones behind an Application Load Balancer. Migrate the database to two EC2 instances across separate AWS Regions with database replication.

C.

Migrate the web application to Amazon EC2 instances that run in an Auto Scaling group across two Availability Zones behind an Application Load Balancer. Migrate the database to Amazon RDS with Multi-AZ deployment.

D.

Migrate the web application to three Amazon EC2 instances across three Availability Zones behind an Application Load Balancer. Migrate the database to three EC2 instances across three Availability Zones.

Questions 109

A company is developing a latency-sensitive application. Part of the application includes several AWS Lambda functions that need to initialize as quickly as possible. The Lambda functions are written in Java and contain initialization code outside the handlers to load libraries, initialize classes, and generate unique IDs.

Which solution will meet the startup performance requirement MOST cost-effectively?

Options:

A.

Move all the initialization code to the handlers for each Lambda function. Activate Lambda SnapStart for each Lambda function. Configure SnapStart to reference the $LATEST version of each Lambda function.

B.

Publish a version of each Lambda function. Create an alias for each Lambda function. Configure each alias to point to its corresponding version. Set up provisioned concurrency configuration for each Lambda function to point to the corresponding alias.

C.

Publish a version of each Lambda function. Set up a provisioned concurrency configuration for each Lambda function to point to the corresponding version. Activate Lambda SnapStart for the published versions of the Lambda functions.

D.

Update the Lambda functions to add a pre-snapshot hook. Move the code that generates unique IDs into the handlers. Publish a version of each Lambda function. Activate Lambda SnapStart for the published versions of the Lambda functions.

Questions 110

A company temporarily stages transactional datasets in an Amazon S3 bucket before the company moves the datasets to their final destinations. Some datasets include personally identifiable information (PII).

The company must remove PII data during staging before the company moves the datasets to their destinations. A solutions architect needs to configure Amazon Macie to continuously monitor the datasets.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an AWS Lambda function to launch an Amazon Macie discovery job when a new dataset is stored in the target S3 bucket if a Macie discovery job is not already running. Create a second Lambda function to remove the PII data that the Macie discovery job finds.

B.

Set up Amazon Macie automated sensitive data discovery. Create an AWS Lambda function to remove the PII data that Macie finds. Configure an Amazon EventBridge rule to invoke the Lambda function when Macie discovers PII data.
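To illustrate the event-driven wiring in option B, the EventBridge rule pattern below matches Macie sensitive-data findings and would route them to the PII-removal Lambda function. The `source` and `detail-type` strings are how Macie findings are commonly delivered to EventBridge; verify them against the current Macie documentation before relying on them.

```python
import json

# Sketch of an EventBridge rule pattern for Macie findings (option B).
# The "SensitiveData" prefix narrows matches to sensitive-data finding types.

event_pattern = {
    "source": ["aws.macie"],
    "detail-type": ["Macie Finding"],
    "detail": {"type": [{"prefix": "SensitiveData"}]},
}

pattern_json = json.dumps(event_pattern)
# With real credentials:
# events.put_rule(Name="macie-pii-findings", EventPattern=pattern_json)
# events.put_targets(Rule="macie-pii-findings", Targets=[{"Id": "1", "Arn": lambda_arn}])
```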

C.

Schedule a daily Amazon Macie discovery job. Create an AWS Lambda function to run once every day to remove the PII data that the daily Macie job finds.

D.

Create an AWS Lambda function that runs once each day to list all datasets that are saved to the S3 bucket every day. Call Amazon Macie on the list of datasets. Create a second Lambda function to remove the PII data that Macie finds. Configure an Amazon EventBridge rule to invoke the PII removal Lambda function every day.

Questions 111

A company is launching a new gaming application. The company will use Amazon EC2 Auto Scaling groups to deploy the application. The application stores user data in a relational database.

The company has office locations around the world that need to run analytics on the user data in the database. The company needs a cost-effective database solution that provides cross-Region disaster recovery with low-latency read performance across AWS Regions.

Which solution will meet these requirements?

Options:

A.

Create an Amazon ElastiCache for Redis cluster in the Region where the application is deployed. Create read replicas in Regions where the company offices are located. Ensure the company offices read from the read replica instances.

B.

Create Amazon DynamoDB global tables. Deploy the tables to the Regions where the company offices are located and to the Region where the application is deployed. Ensure that each company office reads from the tables that are in the same Region as the office.

C.

Create an Amazon Aurora global database. Configure the primary cluster to be in the Region where the application is deployed. Configure the secondary Aurora replicas to be in the Regions where the company offices are located. Ensure the company offices read from the Aurora replicas.

D.

Create an Amazon RDS Multi-AZ DB cluster deployment in the Region where the application is deployed. Ensure the company offices read from read replica instances.

Questions 112

A company runs a content management system on an Amazon Elastic Container Service (Amazon ECS) cluster. The system allows visitors to provide feedback about the company's products by uploading documents and photos of the products to an Amazon S3 bucket.

The company has a workflow on AWS that processes uploaded documents to perform sentiment analysis of photos and text. The processing workflow calls multiple AWS services.

The company needs a solution to automate the processing workflow. The solution must handle any failed uploads.

Which solution will meet these requirements with the LEAST effort?

Options:

A.

Use S3 Event Notifications to publish events to an Amazon Simple Notification Service (Amazon SNS) topic. Deploy a web application on the Amazon ECS cluster to subscribe to the SNS topic and listen for events to orchestrate the processing workflow.

B.

Use S3 Event Notifications to publish events to an Amazon Simple Queue Service (Amazon SQS) queue. Configure long polling. Deploy an Amazon EC2 instance that runs a script to orchestrate the processing workflow.

C.

Use S3 Event Notifications to publish events to an Amazon Simple Queue Service (Amazon SQS) queue. Create an ECS cluster that scales based on the number of messages in the queue. Configure the cluster to orchestrate the processing workflow.

D.

Use S3 Event Notifications to invoke an Amazon EventBridge rule. Configure the rule to initiate an AWS Step Functions workflow that orchestrates the processing workflow.

Questions 113

A global ecommerce company is designing a three-tier application on AWS. The application includes a web tier that serves static content, an application tier that handles business logic, and a database tier that stores product information and user data. The application interacts with a relational database.

The company needs a highly available application architecture to serve global users with low latency, with the least operational overhead.

Which solution will meet these requirements?

Options:

A.

Deploy Amazon EC2 instances in an Auto Scaling group for the application tier and web tier in a single AWS Region. Use an Application Load Balancer to distribute web traffic. Use an Amazon RDS database and Multi-AZ deployments for the database tier.

B.

Set up an Amazon CloudFront distribution that uses an Amazon S3 bucket as the origin. Use Amazon Elastic Container Service (Amazon ECS) containers on AWS Fargate to deploy the application tier to each AWS Region where the company operates. Use an Amazon Aurora global database for the database tier.

C.

Use an Amazon S3 bucket to store the static web content. Use Amazon EC2 Auto Scaling and EC2 Spot Instances for the application tier. Use Amazon RDS for MySQL with read replicas for the database tier. Use AWS Database Migration Service (AWS DMS) to replicate data to secondary AWS Regions.

D.

Use an Amazon S3 bucket to store static web content. Use AWS Lambda functions to handle serverless backend logic in the application tier. Use Amazon API Gateway to invoke the Lambda functions for web requests. Use an Amazon DynamoDB database for the database tier. Deploy the DynamoDB database across multiple AWS Regions.

Questions 114

A company generates SSL certificates from a third-party provider. The company imports the certificates into AWS Certificate Manager (ACM) to use with public web applications.

A solutions architect must implement a solution to notify the company's security team 30 days before an imported certificate expires. The company already has an Amazon Simple Queue Service (Amazon SQS) queue. The company also has an Amazon Simple Notification Service (Amazon SNS) topic that has the security team's email address as a subscriber.

Which solution will provide the security team with the required notification about certificates?

Options:

A.

Create an AWS Lambda function to scan for expiring certificates. Program the Lambda function to list the certificates in a JSON message and to deliver the message to the SQS queue.

B.

Create an AWS Lambda function to scan for expiring certificates. Program the Lambda function to list the certificates in a JSON message and to deliver the message to the SNS topic.

C.

Create an Amazon EventBridge rule that specifies the ACM Certificate Approaching Expiration event type. Set the SQS queue as the rule's target.

D.

Create an Amazon EventBridge rule that specifies the ACM Certificate Approaching Expiration event type. Set the SNS topic as the rule's target.
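As a sketch, option D reduces to an EventBridge rule whose pattern matches the ACM expiration event named in the question, with the existing SNS topic as the target. The topic ARN below is a placeholder.

```python
import json

# Sketch of the EventBridge rule behind option D: ACM emits a daily
# "ACM Certificate Approaching Expiration" event starting 45 days before
# expiry, which EventBridge can deliver straight to the SNS topic.

event_pattern = {
    "source": ["aws.acm"],
    "detail-type": ["ACM Certificate Approaching Expiration"],
}

rule_target = {
    "Id": "security-team-topic",
    "Arn": "arn:aws:sns:us-east-1:123456789012:security-alerts",  # placeholder ARN
}

pattern_json = json.dumps(event_pattern)
# With real credentials:
# events.put_rule(Name="acm-expiry", EventPattern=pattern_json)
# events.put_targets(Rule="acm-expiry", Targets=[rule_target])
```

No Lambda scanner is needed, which is the point of distinguishing options C and D from A and B.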

Questions 115

A company runs a Java-based job on an Amazon EC2 instance. The job runs every hour and takes 10 seconds to run. The job runs on a scheduled interval and consumes 1 GB of memory. The CPU utilization of the instance is low except for short surges during which the job uses the maximum CPU available. The company wants to optimize the costs to run the job.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Use AWS App2Container (A2C) to containerize the job. Run the job as an Amazon Elastic Container Service (Amazon ECS) task on AWS Fargate with 0.5 virtual CPU (vCPU) and 1 GB of memory.

B.

Copy the code into an AWS Lambda function that has 1 GB of memory. Create an Amazon EventBridge scheduled rule to run the code each hour.

C.

Use AWS App2Container (A2C) to containerize the job. Install the container in the existing Amazon Machine Image (AMI). Ensure that the schedule stops the container when the task finishes.

D.

Configure the existing schedule to stop the EC2 instance at the completion of the job and restart the EC2 instance when the next job starts.
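The arithmetic behind option B is worth doing once. The sketch below estimates the monthly cost of an hourly 10-second, 1 GB Lambda run; the per-GB-second and per-request rates are the published us-east-1 prices at the time of writing, so treat them as assumptions and recheck current pricing.

```python
# Back-of-the-envelope Lambda cost for option B: one 10-second, 1 GB run per hour.

GB_SECOND_RATE = 0.0000166667    # USD per GB-second (us-east-1, assumed)
REQUEST_RATE = 0.20 / 1_000_000  # USD per invocation (assumed)

invocations_per_month = 730                    # roughly one run per hour
gb_seconds = invocations_per_month * 10 * 1.0  # 10 s at 1 GB each

monthly_cost = gb_seconds * GB_SECOND_RATE + invocations_per_month * REQUEST_RATE
print(f"~${monthly_cost:.2f}/month")
```

The result lands around twelve cents a month, versus an always-on EC2 instance that bills for 730 idle hours.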

Questions 116

A company deploys its applications on Amazon Elastic Kubernetes Service (Amazon EKS) behind an Application Load Balancer in an AWS Region. The application needs to store data in a PostgreSQL database engine. The company wants the data in the database to be highly available. The company also needs increased capacity for read workloads.

Which solution will meet these requirements with the MOST operational efficiency?

Options:

A.

Create an Amazon DynamoDB database table configured with global tables.

B.

Create an Amazon RDS database with Multi-AZ deployments.

C.

Create an Amazon RDS database with Multi-AZ DB cluster deployment.

D.

Create an Amazon RDS database configured with cross-Region read replicas.

Questions 117

A company is building new learning management applications on AWS. The company is using Amazon Elastic Container Service (Amazon ECS) on Amazon EC2 to host the applications. The company must ensure that container images are secure. Company administrators must receive notifications of any security vulnerabilities in the images.

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.

Modify the ECS cluster properties to use privileged mode. Enable host-based logging.

B.

Use the AWS Config conformance pack for Amazon ECS. Use AWS Config to notify administrators if any security vulnerabilities are detected.

C.

Configure AWS WAF to invoke an Amazon CloudWatch alarm when a new security vulnerability is detected.

D.

Use Amazon Inspector to scan container images in Amazon Elastic Container Registry (Amazon ECR).

E.

Use AWS Systems Manager Parameter Store to encrypt container images.

Questions 118

A company provides a trading platform to customers. The platform uses an Amazon API Gateway REST API, AWS Lambda functions, and an Amazon DynamoDB table. Each trade that the platform processes invokes a Lambda function that stores the trade data in Amazon DynamoDB. The company wants to ingest trade data into a data lake in Amazon S3 for near real-time analysis.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use Amazon DynamoDB Streams to capture the trade data changes. Configure DynamoDB Streams to invoke a Lambda function that writes the data to Amazon S3.

B.

Use Amazon DynamoDB Streams to capture the trade data changes. Configure DynamoDB Streams to invoke a Lambda function that writes the data to Amazon Data Firehose. Write the data from Data Firehose to Amazon S3.

C.

Enable Amazon Kinesis Data Streams on the DynamoDB table to capture the trade data changes. Configure Kinesis Data Streams to invoke a Lambda function that writes the data to Amazon S3.

D.

Enable Amazon Kinesis Data Streams on the DynamoDB table to capture the trade data changes. Configure a data stream to be the input for Amazon Data Firehose. Write the data from Data Firehose to Amazon S3.
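For option B's flavor of pipeline, the Lambda function between DynamoDB Streams and Firehose mostly reshapes records. The sketch below pulls the new trade image out of a stream record and formats it as line-delimited JSON for a Firehose batch. The attribute names (`trade_id`, `price`) are hypothetical; real stream records do carry DynamoDB's typed attribute format shown in the sample.

```python
import json

# Sketch of a DynamoDB Streams -> Firehose transform (option B).
# DynamoDB stream records wrap each value in a type tag like {"S": "..."}
# or {"N": "..."}; this strips the tags and emits newline-delimited JSON.

def stream_record_to_firehose_entry(record: dict) -> dict:
    image = record["dynamodb"]["NewImage"]
    flat = {k: list(v.values())[0] for k, v in image.items()}  # drop the type tags
    return {"Data": (json.dumps(flat) + "\n").encode()}

sample = {"dynamodb": {"NewImage": {"trade_id": {"S": "t-1"}, "price": {"N": "101.5"}}}}
entry = stream_record_to_firehose_entry(sample)
# With real credentials:
# firehose.put_record_batch(DeliveryStreamName="trades-to-s3", Records=[entry])
```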

Questions 119

A company is developing an application in the AWS Cloud. The application's HTTP API contains critical information that is published in Amazon API Gateway. The critical information must be accessible from only a limited set of trusted IP addresses that belong to the company's internal network.

Which solution will meet these requirements?

Options:

A.

Set up an API Gateway private integration to restrict access to a predefined set of IP addresses.

B.

Create a resource policy for the API that denies access to any IP address that is not specifically allowed.

C.

Directly deploy the API in a private subnet. Create a network ACL. Set up rules to allow the traffic from specific IP addresses.

D.

Modify the security group that is attached to API Gateway to allow inbound traffic from only the trusted IP addresses.
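Option B can be sketched as a standard IAM-style resource policy attached to the API: allow invocation in general, then deny any caller whose source IP falls outside the trusted ranges. The CIDR blocks are placeholders for the company's internal network.

```python
import json

# Sketch of an API Gateway resource policy (option B). The explicit Deny with
# a NotIpAddress condition overrides the broad Allow for untrusted callers.

TRUSTED_CIDRS = ["203.0.113.0/24", "198.51.100.0/24"]  # placeholder corporate ranges

resource_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": "execute-api:Invoke",
            "Resource": "execute-api:/*",
        },
        {
            "Effect": "Deny",
            "Principal": "*",
            "Action": "execute-api:Invoke",
            "Resource": "execute-api:/*",
            "Condition": {"NotIpAddress": {"aws:SourceIp": TRUSTED_CIDRS}},
        },
    ],
}

policy_json = json.dumps(resource_policy)  # attached to the API in API Gateway
```

This works because API Gateway is a managed endpoint: options C and D fail since you cannot attach security groups to it or place a REST API directly in a subnet.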

Questions 120

An ecommerce company hosts an analytics application on AWS. The company deployed the application to one AWS Region. The application generates 300 MB of data each month. The application stores the data in JSON format. The data must be accessible in milliseconds when needed. The company must retain the data for 30 days. The company requires a disaster recovery solution to back up the data.

Which solution will meet these requirements?

Options:

A.

Deploy an Amazon OpenSearch Service cluster in the primary Region and in a second Region. Enable OpenSearch Service cluster replication. Configure the clusters to expire data after 30 days. Modify the application to use OpenSearch Service to store the data.

B.

Deploy an Amazon S3 bucket in the primary Region and in a second Region. Enable versioning on both buckets. Use the Standard storage class. Configure S3 Lifecycle policies to expire objects after 30 days. Configure S3 Cross-Region Replication from the primary bucket to the backup bucket.

C.

Deploy an Amazon Aurora PostgreSQL global database. Configure cluster replication between the primary Region and a second Region. Use a replicated cluster endpoint during outages in the primary Region.

D.

Deploy an Amazon RDS for PostgreSQL cluster in the same Region where the application is deployed. Configure a read replica in a second Region as a backup.

Questions 121

A company uses an Amazon EC2 Auto Scaling group to host an API. The EC2 instances are in a target group that is associated with an Application Load Balancer (ALB). The company stores data in an Amazon Aurora PostgreSQL database.

The API has a weekly maintenance window. The company must ensure that the API returns a static maintenance response during the weekly maintenance window.

Which solution will meet this requirement with the LEAST operational overhead?

Options:

A.

Create a table in Aurora PostgreSQL that has fields to contain keys and values. Create a key for a maintenance flag. Set the flag when the maintenance window starts. Configure the API to query the table for the maintenance flag and to return a maintenance response if the flag is set. Reset the flag when the maintenance window is finished.

B.

Create an Amazon Simple Queue Service (Amazon SQS) queue. Subscribe the EC2 instances to the queue. Publish a message to the queue when the maintenance window starts. Configure the API to return a maintenance message if the instances receive a maintenance start message from the queue. Publish another message to the queue when the maintenance window is finished to restore normal operation.

C.

Create a listener rule on the ALB to return a maintenance response when the path on a request matches a wildcard. Set the rule priority to one. Perform the maintenance. When the maintenance window is finished, delete the listener rule.

D.

Create an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe the EC2 instances to the topic. Publish a message to the topic when the maintenance window starts. Configure the API to return a maintenance response if the instances receive the maintenance start message from the topic. Publish another message to the topic when the maintenance window finishes to restore normal operation.
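Option C can be sketched as a single `elbv2.create_rule` call: a priority-1 wildcard path rule with a fixed-response action, deleted when the window ends. The listener ARN is a placeholder.

```python
# Sketch of the ALB maintenance rule from option C, shaped as boto3
# elbv2.create_rule parameters. Priority 1 means it is evaluated before
# every other listener rule, so all requests get the static response.

maintenance_rule = {
    "ListenerArn": (
        "arn:aws:elasticloadbalancing:us-east-1:123456789012:"
        "listener/app/example/0123456789abcdef/0123456789abcdef"  # placeholder
    ),
    "Priority": 1,
    "Conditions": [{"Field": "path-pattern", "Values": ["*"]}],
    "Actions": [
        {
            "Type": "fixed-response",
            "FixedResponseConfig": {
                "StatusCode": "503",
                "ContentType": "application/json",
                "MessageBody": '{"status": "maintenance"}',
            },
        }
    ],
}

# With real credentials: elbv2.create_rule(**maintenance_rule)
# and elbv2.delete_rule(RuleArn=...) when the maintenance window closes.
```

Because the response comes from the load balancer itself, no application, database, or instance change is needed, which is what makes this the lowest-overhead option.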

Questions 122

An ecommerce company is migrating its on-premises workload to the AWS Cloud. The workload consists of a web application and a backend Microsoft SQL Server database. The company expects a high volume of customers during a promotional event. The new AWS infrastructure must be highly available and scalable.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.

Migrate the web application to two EC2 instances across two Availability Zones behind an Application Load Balancer. Migrate the database to Amazon RDS for Microsoft SQL Server with read replicas in both Availability Zones.

B.

Migrate the web application to an EC2 instance in an Auto Scaling group across two Availability Zones behind an Application Load Balancer. Migrate the database to two EC2 instances across separate Regions with database replication.

C.

Migrate the web application to EC2 instances in an Auto Scaling group across two Availability Zones behind an Application Load Balancer. Migrate the database to Amazon RDS with a Multi-AZ deployment.

D.

Migrate the web application to three EC2 instances across three Availability Zones behind an Application Load Balancer. Migrate the database to three EC2 instances across three Availability Zones.

Questions 123

A company has resources across multiple AWS Regions and accounts. A newly hired solutions architect discovers that a previous employee did not provide details about the resources inventory. The solutions architect needs to build and map the relationship details of the various workloads across all accounts.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.

Use AWS Systems Manager Inventory to generate a map view from the detailed view report.

B.

Use AWS Step Functions to collect workload details. Build architecture diagrams of the workloads manually.

C.

Use Workload Discovery on AWS to generate architecture diagrams of the workloads.

D.

Use AWS X-Ray to view the workload details. Build architecture diagrams with relationships.

Questions 124

A company has a production Amazon RDS for MySQL database. The company needs to create a new application that will read frequently changing data from the database with minimal impact on the database ' s overall performance. The application will rarely perform the same query more than once.

What should a solutions architect do to meet these requirements?

Options:

A.

Set up an Amazon ElastiCache cluster. Query the results in the cluster.

B.

Set up an Application Load Balancer (ALB). Query the results in the ALB.

C.

Set up a read replica for the database. Query the read replica.

D.

Set up querying of database snapshots. Query the database snapshots.

Questions 125

A company wants to standardize its Amazon Elastic Block Store (Amazon EBS) volume encryption strategy. The company also wants to minimize the cost and configuration effort required to operate the volume encryption check.

Which solution will meet these requirements?

Options:

A.

Write API calls to describe the EBS volumes and to confirm the EBS volumes are encrypted. Use Amazon EventBridge to schedule an AWS Lambda function to run the API calls.

B.

Write API calls to describe the EBS volumes and to confirm the EBS volumes are encrypted. Run the API calls on an AWS Fargate task.

C.

Create an AWS Identity and Access Management (IAM) policy that requires the use of tags on EBS volumes. Use AWS Cost Explorer to display resources that are not properly tagged. Encrypt the untagged resources manually.

D.

Create an AWS Config rule for Amazon EBS to evaluate if a volume is encrypted and to flag the volume if it is not encrypted.

Questions 126

A website runs on Amazon EC2 behind an ALB with Amazon CloudFront in front. The site is receiving a high rate of unwanted requests from specific IP addresses.

How should the solutions architect address this problem?

Options:

A.

Use AWS Shield to configure IP deny rules.

B.

Increase Auto Scaling capacity.

C.

Configure VPC network ACL deny rules.

D.

Use AWS WAF with a rate-based rule on the CloudFront distribution.
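Option D's rate-based rule can be sketched as the `Statement` a WAFv2 `create_web_acl` call accepts. The 2,000-requests-per-5-minutes limit is an arbitrary example threshold, not a value from the question.

```python
# Sketch of a WAFv2 rate-based rule (option D). WAF tracks request counts per
# source IP over a rolling 5-minute window and blocks IPs over the limit.

rate_based_rule = {
    "Name": "block-flooding-ips",
    "Priority": 0,
    "Statement": {
        "RateBasedStatement": {
            "Limit": 2000,            # requests per 5-minute window, per IP (example)
            "AggregateKeyType": "IP",
        }
    },
    "Action": {"Block": {}},
    "VisibilityConfig": {
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "block-flooding-ips",
    },
}
# Attached to a web ACL created with Scope="CLOUDFRONT" so the blocking
# happens at the edge, in front of the ALB.
```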

Questions 127

A company recently migrated a data warehouse to AWS. The company has an AWS Direct Connect connection to AWS. Company users query the data warehouse by using a visualization tool. The average size of the queries that the data warehouse returns is 50 MB. The average visualization that the visualization tool produces is 500 KB in size. The result sets that the data warehouse returns are not cached.

The company wants to optimize costs for data transfers between the data warehouse and the company.

Which solution will meet this requirement?

Options:

A.

Host the visualization tool on premises. Connect to the data warehouse directly through the internet.

B.

Host the visualization tool in the same AWS Region as the data warehouse. Access the visualization tool through the internet.

C.

Host the visualization tool on premises. Connect to the data warehouse through the Direct Connect connection.

D.

Host the visualization tool in the same AWS Region as the data warehouse. Access the visualization tool through the Direct Connect connection.

Questions 128

A company uses AWS Lake Formation to govern its S3 data lake. It wants to visualize data in QuickSight by joining S3 data with Aurora MySQL operational data. The marketing team must see only specific columns.

Which solution provides column-level authorization with the least operational overhead?

Options:

A.

Use EMR to ingest database data into SPICE with only required columns.

B.

Use AWS Glue Studio to ingest database data into S3 and use IAM policies for column control.

C.

Use AWS Glue Elastic Views to create materialized S3 views with column restrictions.

D.

Use a Lake Formation blueprint to ingest database data to S3. Use Lake Formation for column-level access control. Use Athena as the QuickSight data source.

Questions 129

A company runs an application on Amazon EC2 instances. The application is deployed in private subnets in three Availability Zones of the us-east-1 Region. The instances must be able to connect to the internet to download files. The company wants a design that is highly available across the Region.

Which solution should be implemented to ensure that there are no disruptions to internet connectivity?

Options:

A.

Deploy a NAT instance in a private subnet of each Availability Zone.

B.

Deploy a NAT gateway in a public subnet of each Availability Zone.

C.

Deploy a transit gateway in a private subnet of each Availability Zone.

D.

Deploy an internet gateway in a public subnet of each Availability Zone.

Questions 130

A company asks a solutions architect to review the architecture for its messaging application. The application uses TCP and UDP traffic. The company is planning to deploy a new VoIP feature, but its 10 test users in other countries are reporting poor call quality.

The VoIP application runs on an Amazon EC2 instance that has more than enough resources. The HTTP portion of the company's application, which runs behind an Application Load Balancer, has no issues.

What should the solutions architect recommend for the company to do to address the VoIP performance issues?

Options:

A.

Use AWS Global Accelerator.

B.

Implement Amazon CloudFront into the architecture.

C.

Use an Amazon Route 53 geoproximity routing policy.

D.

Migrate from Application Load Balancers to Network Load Balancers.

Questions 131

A disaster response team is using drones to collect images of recent storm damage. The response team's laptops lack the storage and compute capacity to transfer the images and process the data.

While the team has Amazon EC2 instances for processing and Amazon S3 buckets for storage, network connectivity is intermittent and unreliable. The images need to be processed to evaluate the damage.

What should a solutions architect recommend?

Options:

A.

Use AWS Snowball Edge devices to process and store the images.

B.

Upload the images to Amazon Simple Queue Service (Amazon SQS) during intermittent connectivity to EC2 instances.

C.

Configure Amazon Data Firehose to create multiple delivery streams aimed separately at the S3 buckets for storage and the EC2 instances for processing images.

D.

Use AWS Storage Gateway pre-installed on a hardware appliance to cache the images locally and upload them to Amazon S3 for processing when connectivity becomes available.

Questions 132

A company is building a solution to provide customers with an API that accesses financial data. The API backend needs to compute tax data for each request. The company anticipates greater demand to access the data during the last 3 months of each year.

A solutions architect needs to design a scalable solution that can meet the regular demand and the peak demand at the end of each year.

Which solution will meet these requirements?

Options:

A.

Host the API on an Amazon EC2 instance that runs third-party software. Configure the EC2 instance to perform tax computations.

B.

Deploy an Amazon API Gateway REST API. Create an AWS Lambda function to perform tax computations. Integrate the Lambda function with the REST API.

C.

Create an Application Load Balancer (ALB) in front of two Amazon EC2 instances. Configure the EC2 instances to perform tax computations.

D.

Deploy an Amazon API Gateway REST API. Configure an Amazon EC2 instance to perform tax computations. Integrate the EC2 instance with the REST API.

Questions 133

A company discovers that an Amazon DynamoDB Accelerator (DAX) cluster for the company's web application workload is not encrypting data at rest. The company needs to resolve the security issue.

Which solution will meet this requirement?

Options:

A.

Stop the existing DAX cluster. Enable encryption at rest for the existing DAX cluster, and start the cluster again.

B.

Delete the existing DAX cluster. Recreate the DAX cluster, and configure the new cluster to encrypt the data at rest.

C.

Update the configuration of the existing DAX cluster to encrypt the data at rest.

D.

Integrate the existing DAX cluster with AWS Security Hub to automatically enable encryption at rest.

Questions 134

A company wants to share data that is collected from self-driving cars with the automobile community. The data will be made available from within an Amazon S3 bucket. The company wants to minimize its cost of making this data available to other AWS accounts.

What should a solutions architect do to accomplish this goal?

Options:

A.

Create an S3 VPC endpoint for the bucket.

B.

Configure the S3 bucket to be a Requester Pays bucket.

C.

Create an Amazon CloudFront distribution in front of the S3 bucket.

D.

Require that the files be accessible only with the use of the BitTorrent protocol.

Questions 135

A company wants to send data from its on-premises systems to Amazon S3 buckets. The company created the S3 buckets in three different accounts. The company must send the data privately without the data traveling across the internet. The company has no existing dedicated connectivity to AWS.

Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)

Options:

A.

Establish a networking account in the AWS Cloud. Create a private VPC in the networking account. Set up an AWS Direct Connect connection with a private VIF between the on-premises environment and the private VPC.

B.

Establish a networking account in the AWS Cloud. Create a private VPC in the networking account. Set up an AWS Direct Connect connection with a public VIF between the on-premises environment and the private VPC.

C.

Create an Amazon S3 interface endpoint in the networking account.

D.

Create an Amazon S3 gateway endpoint in the networking account.

E.

Establish a networking account in the AWS Cloud. Create a private VPC in the networking account. Peer VPCs from the accounts that host the S3 buckets with the VPC in the network account.

Questions 136

A company is launching a new application that requires a structured database to store user profiles, application settings, and transactional data. The database must be scalable with application traffic and must offer backups.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Deploy a self-managed database on Amazon EC2 instances by using open source software. Use Spot Instances for cost optimization. Configure automated backups to Amazon S3.

B.

Use Amazon RDS. Use on-demand capacity mode for the database with General Purpose SSD storage. Configure automatic backups with a retention period of 7 days.

C.

Use Amazon Aurora Serverless for the database. Use serverless capacity scaling. Configure automated backups to Amazon S3.

D.

Deploy a self-managed NoSQL database on Amazon EC2 instances. Use Reserved Instances for cost optimization. Configure automated backups directly to Amazon S3 Glacier Flexible Retrieval.

Questions 138

A company runs a container application on a Kubernetes cluster in the company's data center. The application uses Advanced Message Queuing Protocol (AMQP) to communicate with a message queue. The data center cannot scale fast enough to meet the company's expanding business needs. The company wants to migrate the workloads to AWS.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Migrate the container application to Amazon ECS. Use Amazon SQS to retrieve the messages.

B.

Migrate the container application to Amazon EKS. Use Amazon MQ to retrieve the messages.

C.

Use highly available Amazon EC2 instances to run the application. Use Amazon MQ to retrieve the messages.

D.

Use AWS Lambda functions to run the application. Use Amazon SQS to retrieve the messages.

Questions 139

A company hosts an application on AWS. The application gives users the ability to upload photos and store the photos in an Amazon S3 bucket. The company wants to use Amazon CloudFront and a custom domain name to upload the photo files to the S3 bucket in the eu-west-1 Region.

Which combination of actions will meet these requirements? (Select TWO.)

Options:

A.

Use AWS Certificate Manager (ACM) to create a public certificate in the us-east-1 Region. Use the certificate in CloudFront.

B.

Use AWS Certificate Manager (ACM) to create a public certificate in eu-west-1. Use the certificate in CloudFront.

C.

Configure Amazon S3 to allow uploads from CloudFront. Configure S3 Transfer Acceleration.

D.

Configure Amazon S3 to allow uploads from CloudFront origin access control (OAC).

E.

Configure Amazon S3 to allow uploads from CloudFront. Configure an Amazon S3 website endpoint.

Questions 140

A company uses an organization in AWS Organizations to manage multiple AWS accounts. The company is migrating users from IAM to AWS IAM Identity Center.

The company wants to ensure that no new IAM users can be created in any of the member accounts. The company wants to allow only existing IAM users to have access to the accounts.

Which solution will meet these requirements?

Options:

A.

Create a service control policy (SCP) that denies the iam:CreateUser action. Apply the SCP to all the member accounts in the organization.

B.

Create an IAM policy that denies all IAM write operations. Attach the policy to all the users.

C.

Create an IAM group in each account. Attach a policy that denies the iam:CreateAccessKey action to the IAM group. Add the existing IAM users to the IAM group.

D.

Create a permissions boundary that denies the iam:CreateAccessKey action. Attach the permissions boundary to all IAM users and IAM groups in the organization.
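
For reference, the SCP in option A is a short policy document. A minimal sketch, with an illustrative statement ID:

```python
import json

# Sketch of the service control policy (SCP) described in option A. When
# attached to member accounts, it denies creation of new IAM users while
# leaving existing IAM users untouched.
scp = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyNewIamUsers",
            "Effect": "Deny",
            "Action": "iam:CreateUser",
            "Resource": "*",
        }
    ],
}
print(json.dumps(scp))
```

Because an SCP caps what any principal in the account can do, it works even for administrators, which per-user IAM policies and permissions boundaries cannot guarantee.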

Questions 141

A company runs a critical three-tier web application that consists of multiple virtual machines (VMs) and virtual databases in an on-premises environment. The company wants to set up a disaster recovery (DR) environment in AWS.

The company requires a 15-minute recovery time objective (RTO). The company must be able to test the failover solution to validate the recovery. The solution must provide an automated failover mechanism.

Which solution will meet these requirements?

Options:

A.

Use AWS Backup to create backups of the on-premises VMs and to restore the backups in AWS. Configure recovery to Amazon EC2 instances to meet the RTO requirement.

B.

Use AWS Database Migration Service (AWS DMS) to replicate the on-premises databases to Amazon RDS. Set up AWS Storage Gateway for baseline and incremental data replication to AWS to meet the RTO requirement.

C.

Use AWS DataSync and AWS Storage Gateway to migrate the baseline and incremental data to AWS. Use Amazon EC2, Amazon S3, and an Application Load Balancer to set up the DR environment.

D.

Use AWS Elastic Disaster Recovery to replicate the VMs incrementally to AWS. Configure Elastic Disaster Recovery to automate the DR process.

Questions 142

A company runs a production application on a fleet of Amazon EC2 instances. The application reads messages from an Amazon Simple Queue Service (Amazon SQS) queue and processes the messages in parallel. The message volume is unpredictable and highly variable.

The company must ensure that the application continually processes messages without any downtime.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Use only Spot Instances to handle the maximum capacity required.

B.

Use only Reserved Instances to handle the maximum capacity required.

C.

Use Reserved Instances to handle the baseline capacity. Use Spot Instances to provide additional capacity when required.

D.

Use Reserved Instances in an EC2 Auto Scaling group to handle the minimum capacity. Configure an auto scaling policy that is based on the SQS queue backlog.
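
The queue-backlog scaling policy in option D typically uses a "backlog per instance" calculation. A minimal sketch of the arithmetic, with illustrative throughput numbers:

```python
# Sketch of the "backlog per instance" math behind option D: size the Auto
# Scaling group from SQS queue depth instead of CPU. The throughput figure
# (100 messages per instance) is an illustrative assumption.
def desired_capacity(queue_depth: int, msgs_per_instance: int, minimum: int) -> int:
    """Return enough instances so each handles at most msgs_per_instance messages."""
    needed = -(-queue_depth // msgs_per_instance)  # ceiling division
    return max(minimum, needed)

print(desired_capacity(queue_depth=950, msgs_per_instance=100, minimum=2))  # prints 10
```

Reserved Instances cover the always-on minimum cheaply, and the scaling policy adds capacity only while the backlog demands it.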

Questions 143

A company needs a data encryption solution for a machine learning (ML) process. The solution must use an AWS managed service. The ML process currently reads a large number of objects in Amazon S3 that are encrypted by a customer managed AWS KMS key. The current process incurs significant costs because of excessive calls to AWS Key Management Service (AWS KMS) to decrypt S3 objects. The company wants to reduce the costs of API calls to decrypt S3 objects.

Which solution will meet this requirement?

Options:

A.

Switch from a customer managed KMS key to an AWS managed KMS key.

B.

Remove the AWS KMS encryption from the S3 bucket. Use a bucket policy to encrypt the data instead.

C.

Recreate the KMS key in AWS CloudHSM.

D.

Use S3 Bucket Keys to perform server-side encryption with AWS KMS keys (SSE-KMS) to encrypt and decrypt objects from Amazon S3.
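
Option D maps to the bucket encryption configuration that the S3 `PutBucketEncryption` API accepts. A sketch, with a placeholder key ARN:

```python
import json

# Sketch of SSE-KMS with S3 Bucket Keys enabled (option D). With a bucket
# key, S3 derives per-object data keys from a bucket-level key instead of
# calling AWS KMS for every object, which reduces KMS request costs. The
# key ARN is a placeholder.
encryption_config = {
    "Rules": [
        {
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",
                "KMSMasterKeyID": "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE",
            },
            "BucketKeyEnabled": True,  # the setting that cuts KMS API calls
        }
    ]
}
print(json.dumps(encryption_config))
```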

Questions 144

A company sets up an organization in AWS Organizations that contains 10 AWS accounts. A solutions architect must design a solution to provide access to the accounts for several thousand employees. The company has an existing identity provider (IdP). The company wants to use the existing IdP for authentication to AWS.

Which solution will meet these requirements?

Options:

A.

Create IAM users for the employees in the required AWS accounts. Connect IAM users to the existing IdP. Configure federated authentication for the IAM users.

B.

Set up AWS account root users with user email addresses and passwords that are synchronized from the existing IdP.

C.

Configure AWS IAM Identity Center. Connect IAM Identity Center to the existing IdP. Provision users and groups from the existing IdP.

D.

Use AWS Resource Access Manager (AWS RAM) to share access to the AWS accounts with the users in the existing IdP.

Questions 145

A company plans to use an Amazon S3 bucket to archive backup data. Regulations require the company to retain the backup data for 7 years.

During the retention period, the company must prevent users, including administrators, from deleting the data. The company can delete the data after 7 years.

Which solution will meet these requirements?

Options:

A.

Create an S3 bucket policy that denies delete operations for 7 years. Create an S3 Lifecycle policy to delete the data after 7 years.

B.

Create an S3 Object Lock default retention policy that retains data for 7 years in governance mode. Create an S3 Lifecycle policy to delete the data after 7 years.

C.

Create an S3 Object Lock default retention policy that retains data for 7 years in compliance mode. Create an S3 Lifecycle policy to delete the data after 7 years.

D.

Create an S3 Batch Operations job to set a legal hold on each object for 7 years. Create an S3 Lifecycle policy to delete the data after 7 years.
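
Option C corresponds to an Object Lock default retention configuration. A minimal sketch; compliance mode is the detail that blocks deletion even by administrators, whereas governance mode can be bypassed by anyone with the s3:BypassGovernanceRetention permission:

```python
# Sketch of the S3 Object Lock default retention in option C. COMPLIANCE
# mode prevents any user, including administrators, from deleting or
# overwriting locked objects until the retention period expires.
object_lock_config = {
    "ObjectLockEnabled": "Enabled",
    "Rule": {
        "DefaultRetention": {
            "Mode": "COMPLIANCE",  # GOVERNANCE could be bypassed by admins
            "Years": 7,
        }
    },
}
print(object_lock_config["Rule"]["DefaultRetention"]["Mode"])  # prints COMPLIANCE
```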

Questions 146

A company wants to migrate an Oracle database to AWS. The database consists of a single table that contains millions of geographic information systems (GIS) images that are high resolution and are identified by a geographic code.

When a natural disaster occurs, tens of thousands of images get updated every few minutes. Each geographic code has a single image or row that is associated with it. The company wants a solution that is highly available and scalable during such events.

Which solution will meet these requirements?

Options:

A.

Store the images and geographic codes in a database table. Use Oracle running on an Amazon RDS Multi-AZ DB instance.

B.

Store the images in Amazon S3 buckets. Use Amazon DynamoDB with the geographic code as the key and the image S3 URL as the value.

C.

Store the images and geographic codes in an Amazon DynamoDB table. Configure DynamoDB Accelerator (DAX) during times of high load.

D.

Store the images in Amazon S3 buckets. Store geographic codes and image S3 URLs in a database table. Use Oracle running on an Amazon RDS Multi-AZ DB instance.

Questions 147

A company is migrating its online shopping platform to AWS and wants to adopt a serverless architecture.

The platform has a user profile and preference service that does not have a defined schema. The platform allows user-defined fields.

Profile information is updated several times daily. The company must store profile information in a durable and highly available solution. The solution must capture modifications to profile data for future processing.

Which solution will meet these requirements?

Options:

A.

Use an Amazon RDS for PostgreSQL instance to store profile data. Use a log stream in Amazon CloudWatch Logs to capture modifications.

B.

Use an Amazon DynamoDB table to store profile data. Use Amazon DynamoDB Streams to capture modifications.

C.

Use an Amazon ElastiCache (Redis OSS) cluster to store profile data. Use Amazon Data Firehose to capture modifications.

D.

Use an Amazon Aurora Serverless v2 cluster to store the profile data. Use a log stream in Amazon CloudWatch Logs to capture modifications.
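
Option B pairs a schemaless DynamoDB table with a stream. A sketch of the table definition, with illustrative table and attribute names:

```python
# Sketch of the DynamoDB table from option B. The table is schemaless beyond
# its key, so user-defined profile fields need no migrations, and the stream
# records every modification for future processing. Names are illustrative.
table_params = {
    "TableName": "UserProfiles",
    "KeySchema": [{"AttributeName": "user_id", "KeyType": "HASH"}],
    "AttributeDefinitions": [{"AttributeName": "user_id", "AttributeType": "S"}],
    "BillingMode": "PAY_PER_REQUEST",
    "StreamSpecification": {
        "StreamEnabled": True,
        # Old and new item images let consumers diff each change.
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
}
print(table_params["StreamSpecification"]["StreamEnabled"])  # prints True
```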

Questions 148

An ecommerce company runs a multi-tier application on AWS. The frontend and backend tiers run on Amazon EC2 instances. The database tier runs on an Amazon RDS for MySQL DB instance.

The application makes frequent calls to return identical datasets from the database. These frequent calls cause performance slowdowns. A solutions architect must improve the performance of the application backend.

Which solution will meet this requirement?

Options:

A.

Configure an Amazon Simple Notification Service (Amazon SNS) topic between the EC2 instances and the RDS DB instance.

B.

Configure an Amazon ElastiCache (Redis OSS) cache. Configure the backend EC2 instances to read from the cache.

C.

Configure an Amazon DynamoDB Accelerator (DAX) cluster. Configure the backend EC2 instances to read from the cluster.

D.

Configure Amazon Data Firehose to stream the calls to the database.

Questions 150

A media company hosts its video processing workload on AWS. The workload uses Amazon EC2 instances in an Auto Scaling group to handle varying levels of demand. The workload stores the original videos and the processed videos in an Amazon S3 bucket.

The company wants to ensure that the video processing workload is scalable. The company wants to prevent failed processing attempts because of resource constraints. The architecture must be able to handle sudden spikes in video uploads without impacting the processing capability.

Which solution will meet these requirements with the LEAST overhead?

Options:

A.

Migrate the workload from Amazon EC2 instances to AWS Lambda functions. Configure an Amazon S3 event notification to invoke the Lambda functions when a new video is uploaded. Configure the Lambda functions to process videos directly and to save processed videos back to the S3 bucket.

B.

Migrate the workload from Amazon EC2 instances to AWS Lambda functions. Use Amazon S3 to invoke an Amazon Simple Notification Service (Amazon SNS) topic when a new video is uploaded. Subscribe the Lambda functions to the SNS topic. Configure the Lambda functions to process the videos asynchronously and to save processed videos back to the S3 bucket.

C.

Configure an Amazon S3 event notification to send a message to an Amazon Simple Queue Service (Amazon SQS) queue when a new video is uploaded. Configure the existing Auto Scaling group to poll the SQS queue, process the videos, and save processed videos back to the S3 bucket.

D.

Configure an Amazon S3 upload trigger to invoke an AWS Step Functions state machine when a new video is uploaded. Configure the state machine to orchestrate the video processing workflow by placing a job message in the Amazon SQS queue. Configure the job message to invoke the EC2 instances to process the videos. Save processed videos back to the S3 bucket.
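
The S3-to-SQS wiring in option C is a bucket notification configuration. A sketch, with a placeholder queue ARN and prefix:

```python
import json

# Sketch of the S3 event notification from option C: each new upload sends a
# message to an SQS queue that the Auto Scaling group's instances poll. The
# queue decouples upload spikes from processing capacity. ARN and prefix are
# placeholders.
notification_config = {
    "QueueConfigurations": [
        {
            "QueueArn": "arn:aws:sqs:us-east-1:111122223333:video-jobs",
            "Events": ["s3:ObjectCreated:*"],
            "Filter": {
                "Key": {"FilterRules": [{"Name": "prefix", "Value": "uploads/"}]}
            },
        }
    ]
}
print(json.dumps(notification_config))
```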

Questions 151

A company is migrating some of its applications to AWS. The company wants to migrate and modernize the applications quickly after it finalizes networking and security strategies. The company has set up an AWS Direct Connect connection in a central network account.

The company expects to have hundreds of AWS accounts and VPCs in the near future. The corporate network must be able to access the resources on AWS seamlessly and also must be able to communicate with all the VPCs. The company also wants to route its cloud resources to the internet through its on-premises data center.

Which combination of steps will meet these requirements? (Select THREE.)

Options:

A.

Create a Direct Connect gateway in the central account. In each of the accounts, create an association proposal by using the Direct Connect gateway and the account ID for every virtual private gateway.

B.

Create a Direct Connect gateway and a transit gateway in the central network account. Attach the transit gateway to the Direct Connect gateway by using a transit VIF.

C.

Provision an internet gateway. Attach the internet gateway to subnets. Allow internet traffic through the gateway.

D.

Share the transit gateway with other accounts. Attach VPCs to the transit gateway.

E.

Provision VPC peering as necessary.

F.

Provision only private subnets. Open the necessary route on the transit gateway and customer gateway to allow outbound internet traffic from AWS to flow through NAT services that run in the data center.

Questions 152

A law firm needs to make hundreds of files readable for the general public. The law firm must prevent members of the public from modifying or deleting the files before a specified future date. Which solution will meet these requirements MOST securely?

Options:

A.

Upload the files to an Amazon S3 bucket that is configured for static website hosting. Grant read-only IAM permissions to any AWS principals that access the S3 bucket until the specified date.

B.

Create a new Amazon S3 bucket. Enable S3 Versioning. Use S3 Object Lock and set a retention period based on the specified date. Create an Amazon CloudFront distribution to serve content from the bucket. Use an S3 bucket policy to restrict access to the CloudFront origin access control (OAC).

C.

Create a new Amazon S3 bucket. Enable S3 Versioning. Configure an event trigger to run an AWS Lambda function if a user modifies or deletes an object. Configure the Lambda function to replace the modified or deleted objects with the original versions of the objects from a private S3 bucket.

D.

Upload the files to an Amazon S3 bucket that is configured for static website hosting. Select the folder that contains the files. Use S3 Object Lock with a retention period based on the specified date. Grant read-only IAM permissions to any AWS principals that access the S3 bucket.

Questions 153

A company is storing data that will not be frequently accessed in the AWS Cloud. If the company needs to access the data, the data must be retrieved within 12 hours. The company wants a solution that is cost-effective for storage costs per gigabyte.

Which Amazon S3 storage class will meet these requirements?

Options:

A.

S3 Standard

B.

S3 Glacier Flexible Retrieval

C.

S3 One Zone-Infrequent Access (S3 One Zone-IA)

D.

S3 Standard-Infrequent Access (S3 Standard-IA)

Questions 154

A solutions architect is designing the storage architecture for a new web application used for storing and viewing engineering drawings. All application components will be deployed on the AWS infrastructure. The application design must support caching to minimize the amount of time that users wait for the engineering drawings to load. The application must be able to store petabytes of data.

Which combination of storage and caching should the solutions architect use?

Options:

A.

Amazon S3 with Amazon CloudFront

B.

Amazon S3 Glacier Deep Archive with Amazon ElastiCache

C.

Amazon Elastic Block Store (Amazon EBS) volumes with Amazon CloudFront

D.

AWS Storage Gateway with Amazon ElastiCache

Questions 155

A developer needs to export the contents of several Amazon DynamoDB tables into Amazon S3 buckets to comply with company data regulations. The developer uses the AWS CLI to run commands to export from each table to the proper S3 bucket. The developer sets up AWS credentials correctly and grants resources appropriate permissions. However, the exports of some tables fail.

What should the developer do to resolve this issue?

Options:

A.

Ensure that point-in-time recovery is enabled on the DynamoDB tables.

B.

Ensure that the target S3 bucket is in the same AWS Region as the DynamoDB table.

C.

Ensure that DynamoDB streaming is enabled for the tables.

D.

Ensure that DynamoDB Accelerator (DAX) is enabled.
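
Behind option A: DynamoDB's export-to-S3 feature is built on point-in-time recovery (PITR), so an export fails for any table where PITR is disabled. A sketch of a pre-check, using an illustrative status dict rather than the full DescribeContinuousBackups response shape:

```python
# Sketch of a pre-check for option A: DynamoDB's ExportTableToPointInTime
# operation requires point-in-time recovery (PITR) on the source table, so
# exports of tables without PITR fail. The status dict shape is illustrative.
def can_export(table_status: dict) -> bool:
    """Return True only when PITR is ENABLED on the table."""
    return table_status.get("PointInTimeRecoveryStatus", "DISABLED") == "ENABLED"

print(can_export({"PointInTimeRecoveryStatus": "DISABLED"}))  # prints False
```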

Questions 156

A company plans to store sensitive user data on Amazon S3. Internal security compliance requirements mandate encryption of data before sending it to Amazon S3.

What should a solutions architect recommend to satisfy these requirements?

Options:

A.

Server-side encryption with customer-provided encryption keys

B.

Client-side encryption with Amazon S3 managed encryption keys

C.

Server-side encryption with keys stored in AWS Key Management Service (AWS KMS)

D.

Client-side encryption with a key stored in AWS Key Management Service (AWS KMS)

Questions 157

A finance company collects streaming data for a real-time search and visualization system. The company wants to migrate to AWS and use a native solution for ingestion, search, and visualization.

Options:

A.

Use Amazon EC2 to ingest and process the data into Amazon S3. Use Amazon Athena for search and Amazon Managed Grafana for visualization.

B.

Use Amazon EMR to ingest and process the data into Amazon Redshift. Use Amazon Redshift Spectrum for search and Amazon QuickSight for visualization.

C.

Use Amazon EKS to ingest and process the data into Amazon DynamoDB. Use Amazon CloudWatch dashboards for visualization.

D.

Use Amazon Kinesis Data Streams to ingest the data. Use Amazon OpenSearch Service for search and Amazon QuickSight for visualization.

Questions 158

A company processes streaming data by using Amazon Kinesis Data Streams and an AWS Lambda function. The streaming data comes from devices that are connected to the internet. The company is experiencing scaling problems and needs to implement shard-level control and custom checkpointing.

Which solution will meet these requirements with the LEAST latency?

Options:

A.

Connect Kinesis Data Streams to Amazon Data Firehose to ingest incoming data to an Amazon S3 bucket. Configure S3 Event Notifications to invoke the Lambda function.

B.

Increase the provisioned concurrency settings for the Lambda function. Stream the data from Kinesis Data Streams to an Amazon Simple Queue Service (Amazon SQS) standard queue. Invoke the Lambda function to process the messages.

C.

Run the Lambda function code in an Amazon Elastic Container Service (Amazon ECS) container that runs on AWS Fargate. Change the code to use the Kinesis Client Library (KCL).

D.

Increase the memory and provisioned concurrency settings for the Lambda function. Stream the data from Kinesis Data Streams to an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Configure the Lambda function to be invoked by the SQS queue.

Questions 159

A company needs to set up a centralized solution to audit API calls to AWS for workloads that run on AWS services and non-AWS services. The company must store logs of the audits for 7 years.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Set up a data lake in Amazon S3. Incorporate AWS CloudTrail logs and logs from non-AWS services into the data lake. Use CloudTrail to store the logs for 7 years.

B.

Configure custom integrations for AWS CloudTrail Lake to collect and store CloudTrail events from AWS services and non-AWS services. Use CloudTrail Lake to store the logs for 7 years.

C.

Enable AWS CloudTrail for AWS services. Ingest non-AWS services into CloudTrail to store the logs for 7 years.

D.

Create new Amazon CloudWatch Logs log groups. Send the audit data from non-AWS services to the log groups. Enable AWS CloudTrail for workloads that run on AWS. Use CloudTrail to store the logs for 7 years.

Questions 160

A company stores sensitive financial reports in an Amazon S3 bucket. To comply with auditing requirements, the company must encrypt the data at rest. Users must not have the ability to change the encryption method or remove encryption when the users upload data. The company must be able to audit all encryption and storage actions. Which solution will meet these requirements and provide the MOST granular control?

Options:

A.

Enable default server-side encryption with Amazon S3 managed keys (SSE-S3) for the S3 bucket. Apply a bucket policy that denies any upload requests that do not include the x-amz-server-side-encryption header.

B.

Configure server-side encryption with AWS KMS (SSE-KMS) keys. Use an S3 bucket policy to reject any data that is not encrypted by the designated key.

C.

Use client-side encryption before uploading the reports. Store the encryption keys in AWS Secrets Manager.

D.

Enable default server-side encryption with Amazon S3 managed keys (SSE-S3). Use AWS Identity and Access Management (IAM) to prevent users from changing S3 bucket settings.
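
Option B's enforcement is a bucket policy that denies any PutObject request not encrypted with the designated KMS key. A sketch, with placeholder bucket name and key ARN:

```python
import json

# Sketch of the bucket policy behind option B: deny any PutObject that is not
# encrypted with the designated KMS key, so users cannot change or remove the
# encryption on upload. Bucket name and key ARN are placeholders.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyWrongEncryptionKey",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::financial-reports/*",
            "Condition": {
                "StringNotEquals": {
                    "s3:x-amz-server-side-encryption-aws-kms-key-id":
                        "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE"
                }
            },
        }
    ],
}
print(json.dumps(policy))
```

AWS CloudTrail data events on the bucket and AWS KMS key-usage logs then provide the audit trail of encryption and storage actions.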

Questions 161

A company uses Amazon RDS for PostgreSQL to run its applications in the us-east-1 Region. The company also uses machine learning (ML) models to forecast annual revenue based on near real-time reports. The reports are generated by using the same RDS for PostgreSQL database. The database performance slows during business hours. The company needs to improve database performance.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Create a cross-Region read replica. Configure the reports to be generated from the read replica.

B.

Activate Multi-AZ DB instance deployment for RDS for PostgreSQL. Configure the reports to be generated from the standby database.

C.

Use AWS Database Migration Service (AWS DMS) to logically replicate data to a new database. Configure the reports to be generated from the new database.

D.

Create a read replica in us-east-1. Configure the reports to be generated from the read replica.

Questions 162

The DNS provider that hosts a company's domain name records is experiencing outages that cause service disruption for a website running on AWS. The company needs to migrate to a more resilient managed DNS service and wants the service to run on AWS.

What should a solutions architect do to rapidly migrate the DNS hosting service?

Options:

A.

Create an Amazon Route 53 public hosted zone for the domain name. Import the zone file containing the domain records hosted by the previous provider.

B.

Create an Amazon Route 53 private hosted zone for the domain name. Import the zone file containing the domain records hosted by the previous provider.

C.

Create a Simple AD directory in AWS. Enable zone transfer between the DNS provider and AWS Directory Service for Microsoft Active Directory for the domain records.

D.

Create an Amazon Route 53 Resolver inbound endpoint in the VPC. Specify the IP addresses that the provider's DNS will forward DNS queries to. Configure the provider's DNS to forward DNS queries for the domain to the IP addresses that are specified in the inbound endpoint.

Questions 163

A company runs a three-tier web application in a VPC on AWS. The company deployed an Application Load Balancer (ALB) in a public subnet. The web tier and application tier Amazon EC2 instances are deployed in a private subnet. The company uses a self-managed MySQL database that runs on EC2 instances in an isolated private subnet for the database tier.

The company wants a mechanism that will give a DevOps team the ability to use SSH to access all the servers. The company also wants to have a centrally managed log of all connections made to the servers.

Which combination of solutions will meet these requirements with the MOST operational efficiency? (Select TWO.)

Options:

A.

Create a bastion host in the public subnet. Configure security groups in the public, private, and isolated subnets to allow SSH access.

B.

Create an interface VPC endpoint for AWS Systems Manager Session Manager. Attach the endpoint to the VPC.

C.

Create an IAM policy that grants access to AWS Systems Manager Session Manager. Attach the IAM policy to the EC2 instances.

D.

Create a gateway VPC endpoint for AWS Systems Manager Session Manager. Attach the endpoint to the VPC.

E.

Attach an AmazonSSMManagedInstanceCore AWS managed IAM policy to all the EC2 instance roles.

Questions 164

A company uses an organization in AWS Organizations to manage multiple AWS accounts. Multiple teams access each AWS account by assuming IAM roles. Each team has a unique IAM role. Each IAM role has a unique set of permissions.

A security team wants to automate some security tasks by deploying AWS Lambda functions within each AWS account. The security team wants to ensure that only members of the security team can modify the Lambda functions directly.

Which solution will meet these requirements?

Options:

A.

Create a service control policy (SCP) that prevents any entity from making changes to Lambda functions except for the IAM role of the security team that is specified in the Principal key. Attach the SCP to the root of the organization.

B.

Create an IAM policy that denies all changes to the Amazon Resource Names (ARNs) of the Lambda functions. Attach the IAM policy to the root user of each AWS account.

C.

Create a service control policy (SCP) that denies all changes to Lambda functions. Attach the SCP to the root of the organization.

D.

Create a service control policy (SCP) that prevents any entity from making changes to Lambda functions except for the IAM role of the security team that is specified in the Condition clause. Attach the SCP to the root of the organization.
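
The pattern in option D can be sketched as follows. The role name is an assumption; the key point is that SCPs do not accept a Principal element, so the security-team exception must live in the Condition clause:

```python
import json

# Hypothetical SCP illustrating option D: deny Lambda modification actions
# unless the caller is the security team's role. SCPs cannot name a
# Principal, so the exception is expressed as an ArnNotLike condition.
SECURITY_ROLE = "arn:aws:iam::*:role/SecurityTeamRole"  # assumed role name

scp = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyLambdaChangesExceptSecurityTeam",
            "Effect": "Deny",
            "Action": [
                "lambda:UpdateFunctionCode",
                "lambda:UpdateFunctionConfiguration",
                "lambda:DeleteFunction",
            ],
            "Resource": "*",
            "Condition": {"ArnNotLike": {"aws:PrincipalArn": SECURITY_ROLE}},
        }
    ],
}

scp_json = json.dumps(scp, indent=2)
```

Attached to the organization root, this deny applies to every member account, including the unique per-team IAM roles.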

Questions 165

A company hosts a photo sharing web application on AWS. Users upload and share thousands of photos each hour. The company needs a durable storage solution that provides retrieval mechanisms for the photos. Most uploaded photos are not accessed often after 30 days, but the company does not want to delete older photos.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.

Store the photos in an Amazon EFS file system for immediate use. Use AWS Backup with on-demand backups and point-in-time recovery (PITR) to store photos that are older than 30 days.

B.

Store the photos in an Amazon S3 bucket. Use Amazon S3 Lifecycle configurations to move photos that are older than 30 days to S3 Intelligent-Tiering.

C.

Store the photos in Amazon DynamoDB for immediate use. Use AWS Backup with on-demand backups and point-in-time recovery (PITR) to store photos that are older than 30 days.

D.

Store the photos in Amazon FSx for Lustre for immediate use. Use AWS Backup with continuous backups and point-in-time recovery (PITR) to store photos that are older than 30 days.
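
Option B's lifecycle rule might look like the following sketch (the key prefix is an assumption); with boto3 it would be passed to `put_bucket_lifecycle_configuration`:

```python
# Hypothetical S3 Lifecycle rule for option B: transition photos older
# than 30 days to S3 Intelligent-Tiering instead of deleting them, so
# rarely accessed objects move to cheaper access tiers automatically.
lifecycle = {
    "Rules": [
        {
            "ID": "photos-to-intelligent-tiering",
            "Status": "Enabled",
            "Filter": {"Prefix": "photos/"},  # assumed key prefix
            "Transitions": [
                {"Days": 30, "StorageClass": "INTELLIGENT_TIERING"}
            ],
        }
    ]
}
```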

Questions 166

A company wants to use AWS Direct Connect to connect on-premises networks to AWS. The company runs many VPCs in a single Region and plans to scale to hundreds of VPCs.

Which service will simplify and scale the network architecture?

Options:

A.

VPC endpoints

B.

AWS Transit Gateway

C.

Amazon Route 53

D.

AWS Secrets Manager

Questions 167

A company needs a solution to integrate transaction data from several Amazon DynamoDB tables into an existing Amazon Redshift data warehouse. The solution must maintain the provisioned throughput of DynamoDB.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an Amazon S3 bucket. Configure DynamoDB to export to the bucket on a regular schedule. Use an Amazon Redshift COPY command to read from the S3 bucket.

B.

Use an Amazon Redshift COPY command to read directly from each DynamoDB table.

C.

Create an Amazon S3 bucket. Configure an AWS Lambda function to read from the DynamoDB tables and write to the S3 bucket on a regular schedule. Use Amazon Redshift Spectrum to access the data in the S3 bucket.

D.

Use Amazon Athena Federated Query with a DynamoDB connector and an Amazon Redshift connector to read directly from the DynamoDB tables.

Questions 168

A solutions architect is designing the architecture for a two-tier web application. The web application consists of an internet-facing Application Load Balancer (ALB) that forwards traffic to an Auto Scaling group of Amazon EC2 instances.

The EC2 instances must be able to access an Amazon RDS database. The company does not want to rely solely on security groups or network ACLs. Only the minimum resources that are necessary should be routable from the internet.

Which network design meets these requirements?

Options:

A.

Place the ALB, EC2 instances, and RDS database in private subnets.

B.

Place the ALB in public subnets. Place the EC2 instances and RDS database in private subnets.

C.

Place the ALB and EC2 instances in public subnets. Place the RDS database in private subnets.

D.

Place the ALB outside the VPC. Place the EC2 instances and RDS database in private subnets.

Questions 169

A company runs several applications on Amazon EC2 instances. The company stores configuration files in an Amazon S3 bucket.

A solutions architect must provide the company's applications with access to the configuration files. The solutions architect must follow AWS best practices for security.

Which solution will meet these requirements?

Options:

A.

Use the AWS account root user access keys.

B.

Use the AWS access key ID and the EC2 secret access key.

C.

Use an IAM role to grant the necessary permissions to the applications.

D.

Activate multi-factor authentication (MFA) and versioning on the S3 bucket.

Questions 170

A company runs an environment where data is stored in an Amazon S3 bucket. The objects are accessed frequently throughout the day. The company has strict data encryption requirements for data that is stored in the S3 bucket. The company currently uses AWS Key Management Service (AWS KMS) for encryption.

The company wants to optimize costs associated with encrypting S3 objects without making additional calls to AWS KMS.

Which solution will meet these requirements?

Options:

A.

Use server-side encryption with Amazon S3 managed keys (SSE-S3).

B.

Use an S3 Bucket Key for server-side encryption with AWS KMS keys (SSE-KMS) on the new objects.

C.

Use client-side encryption with AWS KMS customer managed keys.

D.

Use server-side encryption with customer-provided keys (SSE-C) stored in AWS KMS.
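
Option B's default-encryption configuration can be sketched like this (the key ARN is an assumed placeholder). The S3 Bucket Key lets S3 reuse a bucket-level data key instead of calling AWS KMS for every object, which is what cuts the KMS request costs:

```python
# Hypothetical default-encryption configuration for option B: SSE-KMS
# with an S3 Bucket Key enabled. This is the structure a bucket's
# server-side-encryption configuration takes.
encryption_config = {
    "Rules": [
        {
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",
                # assumed key ARN for illustration
                "KMSMasterKeyID": "arn:aws:kms:us-east-1:111122223333:key/example-key-id",
            },
            "BucketKeyEnabled": True,  # reuse a bucket-level data key
        }
    ]
}
```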

Questions 171

A company needs to grant a team of developers access to the company's AWS resources. The company must maintain a high level of security for the resources.

The company requires an access control solution that will prevent unauthorized access to the sensitive data.

Which solution will meet these requirements?

Options:

A.

Share the IAM user credentials for each development team member with the rest of the team to simplify access management and to streamline development workflows.

B.

Define IAM roles that have fine-grained permissions based on the principle of least privilege. Assign an IAM role to each developer.

C.

Create IAM access keys to grant programmatic access to AWS resources. Allow only developers to interact with AWS resources through API calls by using the access keys.

D.

Create an Amazon Cognito user pool. Grant developers access to AWS resources by using the user pool.

Questions 172

A company runs a production database on Amazon RDS for MySQL. The company wants to upgrade the database version for security compliance reasons. Because the database contains critical data, the company wants a quick solution to upgrade and test functionality without losing any data.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an RDS manual snapshot. Upgrade to the new version of Amazon RDS for MySQL.

B.

Use native backup and restore. Restore the data to the upgraded new version of Amazon RDS for MySQL.

C.

Use AWS DMS to replicate the data to the upgraded new version of Amazon RDS for MySQL.

D.

Use Amazon RDS Blue/Green Deployments to deploy and test production changes.

Questions 173

AWS Lambda functions need shared access to internal libraries and reference data that are updated independently by different teams.

Which solution will meet these requirements?

Options:

A.

Use Amazon EBS Multi-Attach.

B.

Store data in the Lambda /tmp directory.

C.

Use Amazon EFS mounted to Lambda.

D.

Use Amazon FSx for Windows File Server.

Questions 174

A company has a web application that uses several web servers that run on Amazon EC2 instances. The instances use a shared Amazon RDS for MySQL database.

The company requires a secure method to store database credentials. The credentials must be automatically rotated every 30 days without affecting application availability.

Which solution will meet these requirements?

Options:

A.

Store database credentials in AWS Secrets Manager. Create an AWS Lambda function to automatically rotate the credentials. Use Amazon EventBridge to run the Lambda function on a schedule. Grant the necessary IAM permissions to allow the web servers to access Secrets Manager.

B.

Store database credentials in AWS Systems Manager OpsCenter. Grant the necessary IAM permissions to allow the web servers to access OpsCenter.

C.

Store database credentials in an Amazon S3 bucket. Create an AWS Lambda function to automatically rotate the credentials. Use Amazon EventBridge to run the Lambda function on a schedule. Grant the necessary IAM permissions to allow the web servers to retrieve credentials from the S3 bucket.

D.

Store the credentials in a local file on each of the web servers. Use an AWS KMS key to encrypt the credentials. Create a cron job on each server to rotate the credentials every 30 days.
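
A sketch of how the 30-day cadence in option A is expressed. The secret name and Lambda ARN are assumptions; the rotation function itself would hold the credential-swap logic:

```python
# Hypothetical rotation setup for option A. The 30-day cadence is a
# rotation rule; the (assumed) Lambda function would contain the actual
# rotation logic for the MySQL credentials.
rotation_request = {
    "SecretId": "prod/webapp/mysql",  # assumed secret name
    "RotationLambdaARN": (
        "arn:aws:lambda:us-east-1:111122223333:function:rotate-mysql"
    ),
    "RotationRules": {"AutomaticallyAfterDays": 30},
}

# e.g. boto3.client("secretsmanager").rotate_secret(**rotation_request)
```

Because the application fetches the secret at connection time, a rotation that follows this pattern does not interrupt availability.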

Questions 175

A company is using an Amazon Elastic Kubernetes Service (Amazon EKS) cluster. The company must ensure that Kubernetes service accounts in the EKS cluster have secure and granular access to specific AWS resources by using IAM roles for service accounts (IRSA).

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.

Create an IAM policy that defines the required permissions. Attach the policy directly to the IAM role of the EKS nodes.

B.

Implement network policies within the EKS cluster to prevent Kubernetes service accounts from accessing specific AWS services.

C.

Modify the EKS cluster's IAM role to include permissions for each Kubernetes service account. Ensure a one-to-one mapping between IAM roles and Kubernetes roles.

D.

Define an IAM role that includes the necessary permissions. Annotate the Kubernetes service accounts with the Amazon Resource Name (ARN) of the IAM role.

E.

Set up a trust relationship between the IAM roles for the service accounts and an OpenID Connect (OIDC) identity provider.
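
Options D and E work together, and the IAM side can be sketched as below. The OIDC provider ID, account ID, namespace, and service-account name are all assumptions for illustration:

```python
# Hypothetical IAM trust policy for IRSA: the role trusts the EKS
# cluster's OIDC identity provider (option E), scoped to one Kubernetes
# service account; the service account is then annotated with the role
# ARN (option D).
OIDC = "oidc.eks.us-east-1.amazonaws.com/id/EXAMPLE1234"  # assumed provider

trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Federated": f"arn:aws:iam::111122223333:oidc-provider/{OIDC}"
            },
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Condition": {
                "StringEquals": {
                    # restrict the role to a single service account
                    f"{OIDC}:sub": "system:serviceaccount:default:app-sa"
                }
            },
        }
    ],
}
```

The matching Kubernetes side is the annotation `eks.amazonaws.com/role-arn: <role ARN>` on the service account.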

Questions 176

A company currently runs a Linux-based application in a self-managed Docker container that runs on Amazon EC2 instances. The application runs a lightweight data processing tool that always completes its job within 3 minutes. The company wants an alternative deployment solution for the application to reduce infrastructure management overhead. The company is willing to make any required changes to the image.

Which solution will meet this requirement with the LEAST operational overhead?

Options:

A.

Deploy the application as an AWS Lambda function that uses the container image.

B.

Deploy the application on Amazon EKS with the AWS Fargate launch type.

C.

Deploy the application on Amazon ECS with the AWS Fargate launch type.

D.

Deploy the application as a custom Amazon Machine Image (AMI) by using AWS Batch.

Questions 177

An ecommerce company runs a PostgreSQL database on an Amazon EC2 instance. The database stores data in Amazon Elastic Block Store (Amazon EBS) volumes. The daily peak input/output operations per second (IOPS) do not exceed 15,000 IOPS. The company wants to migrate the database to Amazon RDS for PostgreSQL and to provision disk IOPS performance that is independent of disk storage capacity.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Configure General Purpose SSD (gp2) EBS volumes. Provision a 5 TiB volume.

B.

Configure Provisioned IOPS SSD (io1) EBS volumes. Provision 15,000 IOPS.

C.

Configure General Purpose SSD (gp3) EBS volumes. Provision 15,000 IOPS.

D.

Configure magnetic EBS volumes to achieve maximum IOPS.
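
A back-of-the-envelope comparison shows why gp3 wins here. The volume size and per-unit prices below are assumptions for illustration only (list prices change; check the EBS pricing page); the structural point is that gp3 includes a 3,000 IOPS baseline at no extra charge, while io1 bills every provisioned IOPS:

```python
# Rough monthly cost sketch for 15,000 IOPS. All prices are assumed
# illustrative values, not current list prices.
STORAGE_GIB = 500                 # assumed volume size
GP3_GIB, GP3_IOPS = 0.08, 0.005   # $/GiB-month, $/IOPS-month above 3,000
IO1_GIB, IO1_IOPS = 0.125, 0.065  # $/GiB-month, $/provisioned IOPS-month

# gp3: only IOPS beyond the included 3,000 baseline are billed
gp3_cost = STORAGE_GIB * GP3_GIB + max(0, 15_000 - 3_000) * GP3_IOPS
# io1: every provisioned IOPS is billed
io1_cost = STORAGE_GIB * IO1_GIB + 15_000 * IO1_IOPS

assert gp3_cost < io1_cost  # gp3 delivers 15,000 IOPS for far less
```

Note also that option A is wrong in kind, not just price: gp2 IOPS scale with volume size, so they are not independent of storage capacity.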

Questions 178

A finance company uses an on-premises search application to collect streaming data from various producers. The application provides real-time updates to search and visualization features. The company is planning to migrate to AWS and wants to use an AWS native solution. Which solution will meet these requirements?

Options:

A.

Use Amazon EC2 instances to ingest and process the data streams to Amazon S3 buckets for storage. Use Amazon Athena to search the data. Use Amazon Managed Grafana to create visualizations.

B.

Use Amazon EMR to ingest and process the data streams to Amazon Redshift for storage. Use Amazon Redshift Spectrum to search the data. Use Amazon QuickSight to create visualizations.

C.

Use Amazon Elastic Kubernetes Service (Amazon EKS) to ingest and process the data streams to Amazon DynamoDB for storage. Use Amazon CloudWatch to create graphical dashboards to search and visualize the data.

D.

Use Amazon Kinesis Data Streams to ingest and process the data streams to Amazon OpenSearch Service. Use OpenSearch Service to search the data. Use Amazon QuickSight to create visualizations.

Questions 179

A company has an on-premises volume backup solution that has reached its end of life. The company wants to use AWS as part of a new backup solution and wants to maintain local access to all the data while it is backed up on AWS. The company wants to ensure that the data backed up on AWS is automatically and securely transferred.

Which solution meets these requirements?

Options:

A.

Use AWS Snowball to migrate data out of the on-premises solution to Amazon S3. Configure on-premises systems to mount the Snowball S3 endpoint to provide local access to the data.

B.

Use AWS Snowball Edge to migrate data out of the on-premises solution to Amazon S3. Use the Snowball Edge file interface to provide on-premises systems with local access to the data.

C.

Use AWS Storage Gateway and configure a cached volume gateway. Run the Storage Gateway software appliance on premises and configure a percentage of data to cache locally. Mount the gateway storage volumes to provide local access to the data.

D.

Use AWS Storage Gateway and configure a stored volume gateway. Run the Storage Gateway software appliance on premises and map the gateway storage volumes to on-premises storage. Mount the gateway storage volumes to provide local access to the data.

Questions 180

A company needs to design a solution to process videos that users upload to an Amazon S3 bucket. Each video file is approximately 1 GB in size and takes approximately 20 minutes to process. During peak hours, the company expects to process approximately 100 simultaneous uploads. The video file processing is stateless and can run in parallel as soon as the video files arrive in the S3 bucket.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.

Use an AWS Lambda function to process each video. Split the video files into chunks, and use AWS Step Functions to orchestrate multiple processing steps.

B.

Use an Amazon EKS cluster with AWS Fargate profiles to deploy one container for each uploaded video. Configure an Amazon EventBridge rule to invoke the cluster when a user uploads a video.

C.

Use Amazon EC2 On-Demand Instances in an Auto Scaling group to process each file. Configure the Auto Scaling policy to increase the number of instances based on the number of files that the application needs to process.

D.

Use an Amazon ECS cluster with the AWS Fargate launch type. Use Fargate Spot capacity to run one container task for each uploaded video. Configure an Amazon EventBridge rule to invoke the cluster when a user uploads a video.

Questions 181

A company is building an ecommerce platform that will allow customers to place orders online. Customer traffic varies significantly. An order-processing microservice is running on a group of Amazon EC2 instances. A solutions architect must ensure that the application remains responsive and decoupled from the frontend. The application must also be able to reprocess orders that the application fails to process on the first attempt. Which solution will meet these requirements?

Options:

A.

Deploy an Application Load Balancer in front of the order-processing microservice. Configure the Amazon EC2 instances to scale out automatically based on CPU utilization metrics as traffic increases.

B.

Deploy an Amazon SQS queue to integrate the frontend and the order-processing microservice. Configure the frontend to send messages to the queue. Configure the EC2 instances to process messages from the queue.

C.

Establish direct HTTPS connections from the frontend to the microservice. Use a dynamically expanding thread pool to handle concurrency at the microservice layer.

D.

Use Amazon Kinesis Data Streams to ingest all order requests from the frontend. Configure the Amazon EC2 instances to continuously poll the stream and process orders in near real time.
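
The decoupling-and-retry behavior in option B can be shown with a minimal in-process sketch (a deque stands in for the SQS queue; in SQS the "requeue" step happens automatically when the visibility timeout expires on an unacknowledged message):

```python
from collections import deque

# Minimal sketch of option B's pattern: the frontend enqueues orders,
# workers poll the queue, and a failed order goes back on the queue for
# reprocessing, keeping the frontend decoupled from the microservice.
queue = deque()

def frontend_place_order(order_id):
    queue.append({"order_id": order_id, "attempts": 0})

def worker_poll(process):
    msg = queue.popleft()
    msg["attempts"] += 1
    try:
        process(msg)
        return ("processed", msg)
    except Exception:
        queue.append(msg)  # becomes visible again for a retry
        return ("requeued", msg)

def flaky_processor(msg):
    if msg["attempts"] == 1:  # simulate a first-attempt failure
        raise RuntimeError("transient error")

frontend_place_order("order-1")
status1, _ = worker_poll(flaky_processor)    # first attempt fails, requeued
status2, msg = worker_poll(flaky_processor)  # retry succeeds
```

In production, a redrive policy to a dead-letter queue would cap the retries instead of looping forever.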

Questions 182

A solutions architect is creating a data processing job that runs once daily and can take up to 2 hours to complete. If the job is interrupted, it has to restart from the beginning.

How should the solutions architect address this issue in the MOST cost-effective manner?

Options:

A.

Create a script that runs locally on an Amazon EC2 Reserved Instance that is triggered by a cron job.

B.

Create an AWS Lambda function triggered by an Amazon EventBridge scheduled event.

C.

Use an Amazon Elastic Container Service (Amazon ECS) Fargate task triggered by an Amazon EventBridge scheduled event.

D.

Use an Amazon Elastic Container Service (Amazon ECS) task running on Amazon EC2 triggered by an Amazon EventBridge scheduled event.

Questions 183

A company has an on-premises application that uses SFTP to collect financial data from multiple vendors. The company is migrating to the AWS Cloud. The company has created an application that uses Amazon S3 APIs to upload files from vendors.

Some vendors run their systems on legacy applications that do not support S3 APIs. The vendors want to continue to use SFTP-based applications to upload data. The company wants to use managed services for the needs of the vendors that use legacy applications.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an AWS Database Migration Service (AWS DMS) instance to replicate data from the storage of the vendors that use legacy applications to Amazon S3. Provide the vendors with the credentials to access the AWS DMS instance.

B.

Create an AWS Transfer Family endpoint for vendors that use legacy applications.

C.

Configure an Amazon EC2 instance to run an SFTP server. Instruct the vendors that use legacy applications to use the SFTP server to upload data.

D.

Configure an Amazon S3 File Gateway for vendors that use legacy applications to upload files to an SMB file share.

Questions 184

A company wants to protect AWS-hosted resources, including Application Load Balancers and CloudFront distributions. They need near real-time visibility into attacks and a dedicated AWS response team for DDoS events.

Which AWS service meets these requirements?

Options:

A.

AWS WAF

B.

AWS Shield Standard

C.

Amazon Macie

D.

AWS Shield Advanced

Questions 185

An events company runs a web application on Amazon EKS that uses an Amazon DynamoDB table. The table has 1,000 RCUs and 500 WCUs provisioned. The application uses eventually consistent reads.

Traffic is usually low but occasionally spikes. During spikes, DynamoDB throttles requests, causing user-facing errors.

What should a solutions architect do to reduce these errors?

Options:

A.

Change the DynamoDB table to on-demand capacity mode.

B.

Create a DynamoDB read replica.

C.

Purchase DynamoDB reserved capacity.

D.

Use strongly consistent reads.

Questions 186

An ecommerce company is preparing to deploy a web application on AWS to ensure continuous service for customers. The architecture includes a web application that the company hosts on Amazon EC2 instances, a relational database in Amazon RDS, and static assets that the company stores in Amazon S3.

The company wants to design a robust and resilient architecture for the application.

Which solution will meet these requirements?

Options:

A.

Deploy Amazon EC2 instances in a single Availability Zone. Deploy an RDS DB instance in the same Availability Zone. Use Amazon S3 with versioning enabled to store static assets.

B.

Deploy Amazon EC2 instances in an Auto Scaling group across multiple Availability Zones. Deploy a Multi-AZ RDS DB instance. Use Amazon CloudFront to distribute static assets.

C.

Deploy Amazon EC2 instances in a single Availability Zone. Deploy an RDS DB instance in a second Availability Zone for cross-AZ redundancy. Serve static assets directly from the EC2 instances.

D.

Use AWS Lambda functions to serve the web application. Use Amazon Aurora Serverless v2 for the database. Store static assets in Amazon Elastic File System (Amazon EFS) One Zone-Infrequent Access (One Zone-IA).

Questions 187

A social media company wants to store its database of user profiles, relationships, and interactions in the AWS Cloud. The company needs an application to monitor any changes in the database. The application needs to analyze the relationships between the data entities and to provide recommendations to users.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use Amazon Neptune to store the information. Use Amazon Kinesis Data Streams to process changes in the database.

B.

Use Amazon Neptune to store the information. Use Neptune Streams to process changes in the database.

C.

Use Amazon Quantum Ledger Database (Amazon QLDB) to store the information. Use Amazon Kinesis Data Streams to process changes in the database.

D.

Use Amazon Quantum Ledger Database (Amazon QLDB) to store the information. Use Neptune Streams to process changes in the database.

Questions 188

A company needs to design a hybrid network architecture. The company's workloads are currently stored in the AWS Cloud and in on-premises data centers. The workloads require single-digit millisecond latencies to communicate. The company uses AWS Transit Gateway to connect multiple VPCs.

Which combination of steps will meet these requirements MOST cost-effectively? (Select TWO.)

Options:

A.

Establish an AWS Site-to-Site VPN connection to each VPC.

B.

Associate an AWS Direct Connect gateway with the transit gateway that is attached to the VPCs.

C.

Establish an AWS Site-to-Site VPN connection to an AWS Direct Connect gateway.

D.

Establish an AWS Direct Connect connection. Create a transit virtual interface (VIF) to a Direct Connect gateway.

E.

Associate AWS Site-to-Site VPN connections with the transit gateway that is attached to the VPCs.

Questions 189

A company is building an application that runs on several Linux-based containers in Amazon ECS. The containers must have shared access to log files and configuration data. The application requires a POSIX-compliant file system that provides high availability and scalability.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.

Configure an Amazon EFS file system with elastic throughput.

B.

Deploy an Amazon S3 bucket and configure shared access and object-level permissions.

C.

Configure an Amazon FSx for Lustre file system with the Persistent 2 deployment type.

D.

Attach an Amazon EBS volume to an Amazon EC2 instance to share access across containers.

Questions 190

An online gaming company is transitioning user data storage to Amazon DynamoDB to support the company's growing user base. The current architecture includes DynamoDB tables that contain user profiles, achievements, and in-game transactions.

The company needs to design a robust, continuously available, and resilient DynamoDB architecture to maintain a seamless gaming experience for users.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Create DynamoDB tables in a single AWS Region. Use on-demand capacity mode. Use global tables to replicate data across multiple Regions.

B.

Use DynamoDB Accelerator (DAX) to cache frequently accessed data. Deploy tables in a single AWS Region and enable auto scaling. Configure Cross-Region Replication manually to additional Regions.

C.

Create DynamoDB tables in multiple AWS Regions. Use on-demand capacity mode. Use DynamoDB Streams for Cross-Region Replication between Regions.

D.

Use DynamoDB global tables for automatic multi-Region replication. Deploy tables in multiple AWS Regions. Use provisioned capacity mode. Enable auto scaling.

Questions 191

A company is building a new web application that serves static and dynamic content from an API. Users will access the application from around the world. The company wants to minimize latency in the most cost-effective way.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Deploy the static content to an Amazon S3 bucket. Use an Amazon API Gateway HTTP API to serve the dynamic content. Create an Amazon CloudFront distribution that uses the S3 bucket and the HTTP API as origins. Enable caching for static content.

B.

Deploy the static content to an Amazon S3 bucket. Provide the bucket website endpoint to users. Use an Amazon API Gateway HTTP API with caching enabled to serve the dynamic content.

C.

Deploy the static content to an Amazon S3 bucket. Use two Amazon EC2 instances as web servers. Deploy an Application Load Balancer to distribute traffic. Create an Amazon CloudFront distribution in front of the S3 bucket to cache static content.

D.

Deploy the static content to an Amazon S3 bucket. Provide the bucket website endpoint to users. Create an Amazon CloudFront distribution in front of the S3 bucket to cache static content.

Questions 192

A developer used the AWS SDK to create an application that aggregates and produces log records for 10 services. The application delivers data to an Amazon Kinesis Data Streams stream.

Each record contains a log message with a service name, creation timestamp, and other log information. The stream has 15 shards in provisioned capacity mode. The stream uses service name as the partition key.

The developer notices that when all the services are producing logs, ProvisionedThroughputExceededException errors occur during PutRecord requests. The stream metrics show that the application's write throughput is below the provisioned capacity.

How should the developer resolve this issue?

Options:

A.

Change the capacity mode from provisioned to on-demand.

B.

Double the number of shards until the throttling errors stop occurring.

C.

Change the partition key from service name to creation timestamp.

D.

Use a separate Kinesis stream for each service to generate the logs.
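
The root cause can be demonstrated with a short sketch. Kinesis maps each partition key to a shard by MD5-hashing the key into a 128-bit space that the shards split evenly (the even split is the CreateStream default), so 10 distinct service names can land on at most 10 of the 15 shards, and a chatty service saturates its single shard even though aggregate capacity is underused:

```python
import hashlib

# Sketch of hot-shard skew: map 10 service-name partition keys onto 15
# evenly split shard hash ranges, the way Kinesis assigns records.
NUM_SHARDS = 15
services = [f"service-{i}" for i in range(10)]  # assumed service names

def shard_for(key):
    # MD5 of the partition key, interpreted as a 128-bit integer, then
    # bucketed into one of NUM_SHARDS equal hash ranges
    h = int(hashlib.md5(key.encode()).hexdigest(), 16)
    return h * NUM_SHARDS // 2**128

used = {shard_for(s) for s in services}
assert len(used) <= 10  # at least 5 shards can never receive a record
```

A higher-cardinality partition key (option C's timestamp, or a composite key) spreads the writes across all shards.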

Questions 193

A company is creating a payment processing application that supports TLS connections from IPv4 clients. The application requires outbound access to the public internet. The application must allow users to access the application from a single entry point while maintaining the lowest possible attack surface.

The company wants to use Amazon ECS tasks to deploy the application. The company wants to enable awsvpc network mode.

Which solution will meet these requirements?

Options:

A.

Create a VPC that has an internet gateway, public subnets, and private subnets. Deploy a Network Load Balancer (NLB) and a NAT gateway in the public subnets. Deploy the ECS tasks in the private subnets.

B.

Create a VPC that has an egress-only internet gateway, public subnets, and private subnets. Deploy an Application Load Balancer (ALB) and a NAT gateway in the public subnets. Deploy the ECS tasks in the private subnets.

C.

Create a VPC that has an internet gateway, public subnets, and private subnets. Deploy an Application Load Balancer (ALB) in the public subnets. Deploy the ECS tasks in the public subnets.

D.

Create a VPC that has an egress-only internet gateway, public subnets, and private subnets. Deploy a Network Load Balancer (NLB) in the public subnets. Deploy the ECS tasks in the public subnets.

Questions 194

A company runs an ecommerce application on Amazon EC2 instances behind an Application Load Balancer. The instances run in an Amazon EC2 Auto Scaling group across multiple Availability Zones. The Auto Scaling group scales based on CPU utilization metrics. The ecommerce application stores the transaction data in a MySQL 8.0 database that is hosted on a large EC2 instance.

The database's performance degrades quickly as application load increases. The application handles more read requests than write transactions. The company wants a solution that will automatically scale the database to meet the demand of unpredictable read workloads while maintaining high availability.

Which solution will meet these requirements?

Options:

A.

Use Amazon Redshift with a single node for leader and compute functionality.

B.

Use Amazon RDS with a Single-AZ deployment. Configure Amazon RDS to add reader instances in a different Availability Zone.

C.

Use Amazon Aurora with a Multi-AZ deployment. Configure Aurora Auto Scaling with Aurora Replicas.

D.

Use Amazon ElastiCache (Memcached) with EC2 Spot Instances.

Questions 195

A company has a large data workload that runs for 6 hours each day. The company cannot lose any data while the process is running. A solutions architect is designing an Amazon EMR cluster configuration to support this critical data workload.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Configure a long-running cluster that runs the primary node and core nodes on On-Demand Instances and the task nodes on Spot Instances.

B.

Configure a transient cluster that runs the primary node and core nodes on On-Demand Instances and the task nodes on Spot Instances.

C.

Configure a transient cluster that runs the primary node on an On-Demand Instance and the core nodes and task nodes on Spot Instances.

D.

Configure a long-running cluster that runs the primary node on an On-Demand Instance, the core nodes on Spot Instances, and the task nodes on Spot Instances.

Questions 196

A company hosts an application on AWS and has generated approximately 2.5 TB of data over 12 years. The data is stored on Amazon EBS volumes.

The company wants a cost-effective backup solution for long-term storage and must be able to retrieve the data within minutes for audits.

Which solution will meet these requirements?

Options:

A.

Create EBS snapshots.

B.

Use Amazon S3 Glacier Deep Archive.

C.

Use Amazon S3 Glacier Flexible Retrieval.

D.

Use Amazon Elastic File System (Amazon EFS).

Questions 197

A company requires centralized auditing for all AWS accounts and compliance monitoring against AWS Foundational Security Best Practices (FSBP) with minimal operational overhead.

Which solution will meet these requirements?

Options:

A.

Deploy AWS Control Tower in the management account. Enable AWS Security Hub and Account Factory.

B.

Deploy AWS Control Tower in a member account.

C.

Use AWS Managed Services (AMS) with GuardDuty.

D.

Use AWS Managed Services (AMS) with Security Hub.

Questions 198

A company decides to use AWS Key Management Service (AWS KMS) for data encryption operations. The company must create a KMS key and automate the rotation of the key. The company also needs the ability to deactivate the key and schedule the key for deletion.

Which solution will meet these requirements?

Options:

A.

Create an asymmetric customer managed KMS key. Enable automatic key rotation.

B.

Create a symmetric customer managed KMS key. Disable the envelope encryption option.

C.

Create a symmetric customer managed KMS key. Enable automatic key rotation.

D.

Create an asymmetric customer managed KMS key. Disable the envelope encryption option.
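As context for the key-lifecycle operations this question describes (create, rotate automatically, deactivate, schedule deletion), here is a minimal sketch of the request parameters each AWS KMS API call takes. The key ID is hypothetical, and the calls are shown as parameter dicts rather than executed against AWS:

```python
# Sketch of the KMS key lifecycle: symmetric encryption keys are the kind
# that support automatic rotation. Method names in the comments are real
# boto3 KMS client methods; the key ID below is a placeholder.

def kms_lifecycle_requests(key_id="hypothetical-key-id"):
    """Return the request parameters for each step of the key lifecycle."""
    return {
        # kms.create_key: a symmetric customer managed key
        "create": {"KeySpec": "SYMMETRIC_DEFAULT", "KeyUsage": "ENCRYPT_DECRYPT"},
        # kms.enable_key_rotation: turn on automatic annual rotation
        "rotate": {"KeyId": key_id},
        # kms.disable_key: deactivate the key without deleting it
        "deactivate": {"KeyId": key_id},
        # kms.schedule_key_deletion: waiting period is 7-30 days
        "delete": {"KeyId": key_id, "PendingWindowInDays": 7},
    }
```

Asymmetric KMS keys do not support automatic rotation, which is why the symmetric key spec appears in the create step.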

Questions 199

A company has an e-commerce site. The site is designed as a distributed web application hosted in multiple AWS accounts under one AWS Organizations organization. The web application comprises multiple microservices. All microservices expose their AWS services either through Amazon CloudFront distributions or public Application Load Balancers (ALBs). The company wants to protect public endpoints from malicious attacks and monitor security configurations.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use AWS WAF to protect the public endpoints. Use AWS Firewall Manager from a dedicated security account to manage rules in AWS WAF. Use AWS Config rules to monitor the Regional and global WAF configurations.

B.

Use AWS WAF to protect the public endpoints. Apply AWS WAF rules in each account. Use AWS Config rules and AWS Security Hub to monitor the WAF configurations of the ALBs and the CloudFront distributions.

C.

Use AWS WAF to protect the public endpoints. Use AWS Firewall Manager from a dedicated security account to manage the rules in AWS WAF. Use Amazon Inspector and AWS Security Hub to monitor the WAF configurations of the ALBs and the CloudFront distributions.

D.

Use AWS Shield Advanced to protect the public endpoints. Use AWS Config rules to monitor the Shield Advanced configuration for each account.

Questions 200

A developer is creating a serverless application that performs video encoding. The encoding process runs as background jobs and takes several minutes to encode each video. The process must not send an immediate result to users.

The developer is using Amazon API Gateway to manage an API for the application. The developer needs to run test invocations and request validations. The developer must distribute API keys to control access to the API.

Which solution will meet these requirements?

Options:

A.

Create an HTTP API. Create an AWS Lambda function to handle the encoding jobs. Integrate the function with the HTTP API. Use the Event invocation type to call the Lambda function.

B.

Create a REST API with the default endpoint type. Create an AWS Lambda function to handle the encoding jobs. Integrate the function with the REST API. Use the Event invocation type to call the Lambda function.

C.

Create an HTTP API. Create an AWS Lambda function to handle the encoding jobs. Integrate the function with the HTTP API. Use the RequestResponse invocation type to call the Lambda function.

D.

Create a REST API with the default endpoint type. Create an AWS Lambda function to handle the encoding jobs. Integrate the function with the REST API. Use the RequestResponse invocation type to call the Lambda function.

Questions 201

A company is deploying an application in three AWS Regions using an Application Load Balancer. Amazon Route 53 will be used to distribute traffic between these Regions.

Which Route 53 configuration should a solutions architect use to provide the MOST high-performing experience?

Options:

A.

Create an A record with a latency policy.

B.

Create an A record with a geolocation policy.

C.

Create a CNAME record with a failover policy.

D.

Create a CNAME record with a geoproximity policy.

Questions 202

A company is developing a rating system for its ecommerce web application. The company needs a solution to save ratings that users submit in an Amazon DynamoDB table. The company wants to ensure that developers do not need to interact directly with the DynamoDB table. The solution must be scalable and reusable.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an Application Load Balancer (ALB). Create an AWS Lambda function, and set the function as a target group in the ALB. Invoke the Lambda function by using the PutItem method through the ALB.

B.

Create an AWS Lambda function. Configure the Lambda function to interact with the DynamoDB table by using the PutItem method from Boto3. Invoke the Lambda function from the web application.

C.

Create an Amazon SQS queue and an AWS Lambda function that has an SQS trigger type. Instruct the developers to add customer ratings to the SQS queue as JSON messages. Configure the Lambda function to fetch the ratings from the queue and store the ratings in DynamoDB.

D.

Create an Amazon API Gateway REST API. Define a resource and create a new POST method. Choose AWS as the integration type, and select DynamoDB as the service. Set the action to PutItem.
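To make the direct service integration concrete, here is a sketch of the PutItem request body that an API Gateway AWS-type integration would forward to DynamoDB. The table and attribute names are hypothetical; the typed-JSON attribute format (`S`, `N`) is DynamoDB's wire format:

```python
import json

def rating_put_item_payload(table_name, product_id, user_id, rating):
    """Build the PutItem request body an API Gateway mapping template
    would produce for a direct DynamoDB integration. Attribute values
    use DynamoDB's typed JSON format."""
    return json.dumps({
        "TableName": table_name,
        "Item": {
            "productId": {"S": product_id},
            "userId": {"S": user_id},
            "rating": {"N": str(rating)},  # DynamoDB numbers are sent as strings
        },
    })
```

With this pattern the web application calls the REST API's POST method and API Gateway performs the PutItem, so developers never touch the table directly.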

Questions 203

A company runs a non-production application on an Amazon EC2 instance that has the Amazon CloudWatch agent installed. The CloudWatch agent monitors application processes and sends custom metrics to CloudWatch.

The application has a critical bug that causes crashes that require an instance reboot. The company does not currently have the resources to address the bug, but the server needs to remain as operational as possible. The company manually reboots the instance several times each day. The company needs a solution to automate the instance reboots until the company can address the root cause of the bug.

Which solution will meet this requirement with the LEAST amount of operational overhead?

Options:

A.

Use a CloudWatch alarm state change event to invoke Amazon EventBridge to run AWS Systems Manager Run Command to restart the instance.

B.

Use a CloudWatch alarm to invoke an AWS Lambda function to run AWS Systems Manager Run Command to restart the instance.

C.

Use a CloudWatch alarm to invoke an Amazon SNS topic that notifies the operations team to restart the instance.

D.

Use a CloudWatch alarm to invoke an AWS Lambda function that automatically notifies the company through chat to restart the instance.

Questions 204

A company hosts an application on Amazon EC2 On-Demand Instances in an Auto Scaling group. Application peak hours occur at the same time each day. Application users experience slow application performance at the start of peak hours. The application performs normally 2–3 hours after peak hours begin. The company wants to ensure that the application works properly at the start of peak hours.

Which solution will meet these requirements?

Options:

A.

Configure an Application Load Balancer to distribute traffic properly to the instances.

B.

Configure a dynamic scaling policy for the Auto Scaling group to launch new instances based on memory utilization.

C.

Configure a dynamic scaling policy for the Auto Scaling group to launch new instances based on CPU utilization.

D.

Configure a scheduled scaling policy for the Auto Scaling group to launch new instances before peak hours.
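A scheduled scaling action of the kind option D describes can be sketched as the parameters for the `put_scheduled_update_group_action` Auto Scaling API call. The group name, capacities, and peak time are hypothetical; the point is that capacity is provisioned shortly before the known daily peak rather than reactively:

```python
def scheduled_scale_out_request(group_name="hypothetical-asg"):
    """Parameters for autoscaling.put_scheduled_update_group_action:
    scale out 30 minutes before an assumed 09:00 UTC daily peak so
    instances are already in service when traffic arrives."""
    return {
        "AutoScalingGroupName": group_name,
        "ScheduledActionName": "pre-peak-scale-out",
        "Recurrence": "30 8 * * *",  # cron expression, evaluated in UTC
        "MinSize": 6,
        "MaxSize": 12,
        "DesiredCapacity": 8,
    }
```

A matching second action after peak hours would scale the group back down to control cost.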

Questions 205

A news company that has reporters all over the world is hosting its broadcast system on AWS. The reporters send live broadcasts to the broadcast system. The reporters use software on their phones to send live streams through the Real Time Messaging Protocol (RTMP).

A solutions architect must design a solution that gives the reporters the ability to send the highest quality streams. The solution must provide accelerated TCP connections back to the broadcast system.

What should the solutions architect use to meet these requirements?

Options:

A.

Amazon CloudFront

B.

AWS Global Accelerator

C.

AWS Client VPN

D.

Amazon EC2 instances and Elastic IP addresses

Questions 206

A company wants to migrate an on-premises video processing application to AWS. Processing times range from 5–30 minutes. The application must run multiple jobs in parallel. The application processes videos that users upload to an Amazon S3 bucket.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Configure the S3 bucket to send S3 event notifications to an Amazon SQS standard queue. Deploy the application on an Amazon ECS cluster. Configure automatic scaling for AWS Fargate tasks based on the SQS queue size.

B.

Configure the S3 bucket to send S3 event notifications to an Amazon SQS FIFO queue. Deploy the application on Amazon EC2 instances. Create an Auto Scaling group to scale based on the SQS queue size.

C.

Configure the S3 bucket to send S3 event notifications to an Amazon SQS standard queue. Deploy the application as an AWS Lambda function. Configure the Lambda function to poll the SQS queue.

D.

Configure the S3 bucket to send S3 event notifications to an Amazon SNS topic. Deploy the application as an AWS Lambda function. Configure the SNS topic to invoke the Lambda function.
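The S3-to-SQS wiring that several of these options rely on can be sketched as the bucket notification configuration passed to `put_bucket_notification_configuration`. The queue ARN is hypothetical; an ECS service would then scale its Fargate tasks on the queue's `ApproximateNumberOfMessagesVisible` metric:

```python
def s3_to_sqs_notification(queue_arn):
    """S3 bucket notification configuration that sends new-object events
    to an SQS queue, decoupling uploads from the processing fleet."""
    return {
        "QueueConfigurations": [
            {
                "QueueArn": queue_arn,
                "Events": ["s3:ObjectCreated:*"],  # fire on every new upload
            }
        ]
    }
```

The SQS queue policy must also allow the S3 service principal to send messages, which is omitted here for brevity.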

Questions 207

An application is experiencing performance issues based on increased demand. This increased demand is on read-only historical records that are pulled from an Amazon RDS-hosted database with custom views and queries. A solutions architect must improve performance without changing the database structure.

Which approach will improve performance and MINIMIZE management overhead?

Options:

A.

Deploy Amazon DynamoDB, move all the data, and point to DynamoDB.

B.

Deploy Amazon ElastiCache (Redis OSS) and cache the data for the application.

C.

Deploy Memcached on Amazon EC2 and cache the data for the application.

D.

Deploy Amazon DynamoDB Accelerator (DAX) on Amazon RDS to improve cache performance.

Questions 208

A company runs a Windows-based ecommerce application on Amazon EC2 instances. The application has a very high transaction rate. The company requires a durable storage solution that can deliver 200,000 IOPS for each EC2 instance.

Which solution will meet these requirements?

Options:

A.

Host the application on EC2 instances that have Provisioned IOPS SSD (io2) Block Express Amazon Elastic Block Store (Amazon EBS) volumes attached.

B.

Install the application on an Amazon EMR cluster. Use Hadoop Distributed File System (HDFS) with General Purpose SSD (gp3) Amazon Elastic Block Store (Amazon EBS) volumes.

C.

Use Amazon FSx for Lustre as shared storage across the EC2 instances that run the application.

D.

Host the application on EC2 instances that have SSD instance store volumes and General Purpose SSD (gp3) Amazon Elastic Block Store (Amazon EBS) volumes attached.

Questions 209

A company is designing an application to run in a VPC on AWS. The application consists of Amazon EC2 instances that run in private subnets as part of an Auto Scaling group. The application stores data in an Amazon RDS DB instance.

The company attaches a security group named web-servers to the EC2 instances. The company attaches a security group named database to the DB instance.

The company needs a solution to establish communication between the EC2 instances and the DB instance.

Which solution will meet this requirement?

Options:

A.

Configure the inbound rule for the database security group to allow access from the current set of IP addresses that the EC2 instances use.

B.

Configure the inbound rule of the database security group to allow access from the web-servers security group. Configure an outbound rule for the web-servers security group to allow access to the database security group.

C.

Configure the inbound rule of the database security group to allow access by specifying the Auto Scaling group ID.

D.

Configure the outbound rule of the database security group to allow access to the web-servers security group. Configure an inbound rule for the web-servers security group to allow access from the database security group.
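The security-group-to-security-group reference that this question turns on can be sketched as the parameters for `ec2.authorize_security_group_ingress`. The group IDs and the MySQL port are illustrative assumptions:

```python
def db_ingress_rule(database_sg_id, web_servers_sg_id):
    """Ingress rule for the database security group that allows traffic
    from any instance carrying the web-servers security group, rather
    than from a fixed set of IP addresses that would break as the Auto
    Scaling group replaces instances."""
    return {
        "GroupId": database_sg_id,
        "IpPermissions": [
            {
                "IpProtocol": "tcp",
                "FromPort": 3306,  # default MySQL port; adjust for the engine in use
                "ToPort": 3306,
                "UserIdGroupPairs": [{"GroupId": web_servers_sg_id}],
            }
        ],
    }
```

Because security groups are stateful, the response traffic is allowed automatically once the ingress rule matches.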

Questions 210

A company needs a solution to enforce data encryption at rest on Amazon EC2 instances. The solution must automatically identify noncompliant resources and enforce compliance policies on findings.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.

Use an IAM policy that allows users to create only encrypted Amazon Elastic Block Store (Amazon EBS) volumes. Use AWS Config and AWS Systems Manager to automate the detection and remediation of unencrypted EBS volumes.

B.

Use AWS Key Management Service (AWS KMS) to manage access to encrypted Amazon Elastic Block Store (Amazon EBS) volumes. Use AWS Lambda and Amazon EventBridge to automate the detection and remediation of unencrypted EBS volumes.

C.

Use Amazon Macie to detect unencrypted Amazon Elastic Block Store (Amazon EBS) volumes. Use AWS Systems Manager Automation rules to automatically encrypt existing and new EBS volumes.

D.

Use Amazon Inspector to detect unencrypted Amazon Elastic Block Store (Amazon EBS) volumes. Use AWS Systems Manager Automation rules to automatically encrypt existing and new EBS volumes.

Questions 211

A company has customers located across the world. The company wants to use automation to secure its systems and network infrastructure. The company's security team must be able to track and audit all incremental changes to the infrastructure.

Which solution will meet these requirements?

Options:

A.

Use AWS Organizations to set up the infrastructure. Use AWS Config to track changes

B.

Use AWS CloudFormation to set up the infrastructure. Use AWS Config to track changes.

C.

Use AWS Organizations to set up the infrastructure. Use AWS Service Catalog to track changes.

D.

Use AWS CloudFormation to set up the infrastructure. Use AWS Service Catalog to track changes.

Questions 212

A company wants to relocate its on-premises MySQL database to AWS. The database accepts regular imports from a client-facing application, which causes a high volume of write operations. The company is concerned that the amount of traffic might be causing performance issues within the application.

Which solution will meet these requirements?

Options:

A.

Provision an Amazon RDS for MySQL DB instance with Provisioned IOPS SSD storage. Monitor write operation metrics by using Amazon CloudWatch. Adjust the provisioned IOPS if necessary.

B.

Provision an Amazon RDS for MySQL DB instance with General Purpose SSD storage. Place an Amazon ElastiCache cluster in front of the DB instance. Configure the application to query ElastiCache instead.

C.

Provision an Amazon DocumentDB (with MongoDB compatibility) instance with a memory-optimized instance type. Monitor Amazon CloudWatch for performance-related issues. Change the instance class if necessary.

D.

Provision an Amazon Elastic File System (Amazon EFS) file system in General Purpose performance mode. Monitor Amazon CloudWatch for IOPS bottlenecks. Change to Provisioned Throughput performance mode if necessary.

Questions 213

A company uses server-side encryption with AWS KMS keys (SSE-KMS) to encrypt objects that the company stores in an Amazon S3 bucket. The company requires all objects in the S3 bucket to be replicated to a secondary AWS account in the same AWS Region. All objects in the source account S3 bucket must be available in the secondary account within several minutes. All replicated objects must be immediately accessible. The company has already modified the key policy for the KMS key that encrypts the bucket in the source account to allow access from the secondary account.

Which solution will meet these requirements?

Options:

A.

Create a new S3 bucket in the secondary account. Configure an AWS PrivateLink connection between the new S3 bucket and the existing S3 bucket. Grant PrivateLink permission to access the KMS keys that encrypt the data.

B.

Create an AWS Backup job for the source S3 bucket. Create a backup vault in the secondary AWS account. Configure the backup plan to copy the backup jobs to the new backup vault.

C.

Create a new S3 bucket in the secondary account. Configure an S3 replication rule on the source bucket to replicate objects to the secondary account. Enable S3 Replication Time Control (S3 RTC).

D.

Configure an AWS Lambda function in the source account to automatically invoke an Amazon S3 Batch Operations job to copy the objects to the secondary account S3 bucket every five minutes.
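The cross-account replication setup with S3 RTC can be sketched as the replication configuration document passed to `put_bucket_replication`. All ARNs and the account ID below are placeholders:

```python
def replication_rule(role_arn, destination_bucket_arn, account_id, kms_key_arn):
    """S3 replication configuration that replicates SSE-KMS-encrypted
    objects to a bucket in a second account with S3 Replication Time
    Control (RTC) enabled, which carries a 15-minute replication SLA."""
    return {
        "Role": role_arn,  # IAM role S3 assumes to replicate
        "Rules": [
            {
                "Status": "Enabled",
                "Priority": 1,
                "Filter": {},  # empty filter: replicate the whole bucket
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "SourceSelectionCriteria": {
                    "SseKmsEncryptedObjects": {"Status": "Enabled"}
                },
                "Destination": {
                    "Bucket": destination_bucket_arn,
                    "Account": account_id,
                    "EncryptionConfiguration": {"ReplicaKmsKeyID": kms_key_arn},
                    "ReplicationTime": {"Status": "Enabled", "Time": {"Minutes": 15}},
                    "Metrics": {"Status": "Enabled", "Time": {"Minutes": 15}},
                },
            }
        ],
    }
```

`ReplicationTime` and `Metrics` must be enabled together; the metrics surface replication latency and any objects that miss the SLA.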

Questions 214

A company's software development team needs an Amazon RDS Multi-AZ cluster. The RDS cluster will serve as a backend for a desktop client that is deployed on premises. The desktop client requires direct connectivity to the RDS cluster.

The company must give the development team the ability to connect to the cluster by using the client when the team is in the office.

Which solution provides the required connectivity MOST securely?

Options:

A.

Create a VPC and two public subnets. Create the RDS cluster in the public subnets. Use AWS Site-to-Site VPN with a customer gateway in the company's office.

B.

Create a VPC and two private subnets. Create the RDS cluster in the private subnets. Use AWS Site-to-Site VPN with a customer gateway in the company's office.

C.

Create a VPC and two private subnets. Create the RDS cluster in the private subnets. Use RDS security groups to allow the company's office IP ranges to access the cluster.

D.

Create a VPC and two public subnets. Create the RDS cluster in the public subnets. Create a cluster user for each developer. Use RDS security groups to allow the users to access the cluster.

Questions 215

A company hosts an industrial control application that receives sensor input through Amazon Kinesis Data Streams. The application needs to support new sensors for real-time anomaly detection in monitored equipment.

The company wants to integrate the new sensors in a loosely coupled, fully managed, and serverless way. The company cannot modify the application code.

Which solution will meet these requirements?

Options:

A.

Forward the existing stream in Kinesis Data Streams to Amazon Managed Service for Apache Flink for anomaly detection. Use a second stream in Kinesis Data Streams to send the Flink output to the application.

B.

Use Amazon Data Firehose to stream data to Amazon S3. Use Amazon Redshift Spectrum to perform anomaly detection on the S3 data. Use S3 Event Notifications to invoke an AWS Lambda function that sends analyzed data to the application through a second stream in Kinesis Data Streams.

C.

Configure Amazon EC2 instances in an Auto Scaling group to consume data from the data stream and to perform anomaly detection. Create a second stream in Kinesis Data Streams to send data from the EC2 instances to the application.

D.

Configure an Amazon Elastic Container Service (Amazon ECS) task that uses Amazon EC2 instances to consume data from the data stream and to perform anomaly detection. Create a second stream in Kinesis Data Streams to send data from the containers to the application.

Questions 216

A company wants to migrate a Microsoft SQL Server database server from an on-premises data center to AWS. The company needs access to the operating system of the SQL Server database.

Which solution will meet these requirements?

Options:

A.

Migrate the database to Amazon Aurora Serverless.

B.

Migrate the database to Amazon RDS for SQL Server.

C.

Migrate the database to Amazon EC2 instances that run SQL Server.

D.

Migrate the database to Amazon Redshift.

Questions 217

A company is deploying a critical application by using Amazon RDS for MySQL. The application must be highly available and must recover automatically. The company needs to support interactive users (transactional queries) and batch reporting (analytical queries) with no more than a 4-hour lag. The analytical queries must not affect the performance of the transactional queries.

Which solution will meet these requirements?

Options:

A.

Configure Amazon RDS for MySQL in a Multi-AZ DB instance deployment with one standby instance. Point the transactional queries to the primary DB instance. Point the analytical queries to a secondary DB instance that runs in a different Availability Zone.

B.

Configure Amazon RDS for MySQL in a Multi-AZ DB cluster deployment with two standby instances. Point the transactional queries to the primary DB instance. Point the analytical queries to the reader endpoint.

C.

Configure Amazon RDS for MySQL to use multiple read replicas across multiple Availability Zones. Point the transactional queries to the primary DB instance. Point the analytical queries to one of the replicas in a different Availability Zone.

D.

Configure Amazon RDS for MySQL as the primary database for the transactional queries with automated backups enabled. Each night, create a read-only database from the most recent snapshot to support the analytical queries. Terminate the previously created database.

Questions 218

A company runs a mobile game app that stores session data (up to 256 KB) for up to 48 hours. The data updates frequently and must be deleted automatically after expiration. Restorability is also required.

Options:

A.

Use an Amazon DynamoDB table to store the session data. Enable point-in-time recovery (PITR) and TTL.

B.

Use Amazon MemoryDB and enable PITR and TTL.

C.

Store session data in S3 Standard. Enable Versioning and a Lifecycle rule to expire objects after 48 hours.

D.

Store data in S3 Intelligent-Tiering with Versioning and a Lifecycle rule to expire after 48 hours.
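The TTL mechanism these options refer to can be sketched as the item a game backend might write: DynamoDB deletes an item automatically once the epoch timestamp in the table's designated TTL attribute has passed, while point-in-time recovery (PITR) covers restorability. Attribute names here are illustrative:

```python
import time

def session_item(session_id, data, ttl_hours=48, now=None):
    """Build a DynamoDB session item whose TTL attribute expires 48 hours
    ahead. A 256 KB payload fits well under DynamoDB's 400 KB item limit."""
    now = int(now if now is not None else time.time())
    return {
        "sessionId": session_id,
        "data": data,
        # Epoch seconds; configure this attribute name as the table's TTL attribute.
        "expiresAt": now + ttl_hours * 3600,
    }
```

TTL deletion is a background process, so expired items can linger briefly; queries that must exclude them should also filter on the timestamp.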

Questions 219

A company uses two AWS accounts named Account A and Account B. Account A hosts a data analytics application. Account B hosts a data lake in an Amazon S3 bucket. Data analysts in Account A need to access the data lake in Account B. The access solution must be secure, use temporary credentials, enforce the principle of least privilege, and avoid long-term access keys.

Which solution will meet these requirements?

Options:

A.

Create IAM users in Account B and share the access keys for the users with analysts in Account A.

B.

Use an S3 bucket policy to configure the S3 bucket in Account B to be publicly accessible.

C.

Configure a resource-based policy for the S3 bucket in Account B to allow access from an IAM role in Account A.

D.

Use a bastion host in Account B to proxy analyst requests from Account A through an Amazon EC2 instance.
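The resource-based-policy approach from these options can be sketched as the bucket policy Account B would attach to the data lake bucket. The bucket name and role ARN are hypothetical; analysts in Account A assume the role and receive temporary credentials from AWS STS, so no long-term keys are shared:

```python
import json

def data_lake_bucket_policy(bucket_name, analyst_role_arn):
    """Bucket policy granting a specific IAM role in another account
    read access to the data lake, scoped to least privilege."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowAnalystRoleRead",
                "Effect": "Allow",
                "Principal": {"AWS": analyst_role_arn},
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket_name}",       # ListBucket targets the bucket
                    f"arn:aws:s3:::{bucket_name}/*",     # GetObject targets the objects
                ],
            }
        ],
    })
```

The role's identity-based policy in Account A must grant the matching `s3:*` actions; cross-account access requires both sides to allow it.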

Questions 220

A company is running a web-based game in two Availability Zones in the us-west-2 Region. The web servers use an Application Load Balancer (ALB) in public subnets. The ALB has an SSL certificate from AWS Certificate Manager (ACM) with a custom domain name. The game is written in JavaScript and runs entirely in a user's web browser.

The game is increasing in popularity in many countries around the world. The company wants to update the application architecture and optimize costs without compromising performance.

What should a solutions architect do to meet these requirements?

Options:

A.

Use Amazon CloudFront and create a global distribution that points to the ALB. Reuse the existing certificate from ACM for the CloudFront distribution. Use Amazon Route 53 to update the application alias to point to the distribution.

B.

Use AWS CloudFormation to deploy the application stack to AWS Regions near countries where the game is popular. Use ACM to create a new certificate for each application instance. Use Amazon Route 53 with a geolocation routing policy to direct traffic to the local application instance.

C.

Use Amazon S3 and create an S3 bucket in AWS Regions near countries where the game is popular. Deploy the HTML and JavaScript files to each S3 bucket. Use ACM to create a new certificate for each S3 bucket. Use Amazon Route 53 with a geolocation routing policy to direct traffic to the local S3 bucket.

D.

Use Amazon S3 and create an S3 bucket in us-west-2. Deploy the HTML and JavaScript files to the S3 bucket. Use Amazon CloudFront and create a global distribution with the S3 bucket as the origin. Use ACM to create a new certificate for the distribution. Use Amazon Route 53 to update the application alias to point to the distribution.

Questions 221

A company runs a web application on Amazon EC2 instances in an Auto Scaling group that has a target group. The company designed the application to work with session affinity (sticky sessions) for a better user experience.

The application must be available publicly over the internet as an endpoint. A WAF must be applied to the endpoint for additional security. Session affinity (sticky sessions) must be configured on the endpoint.

Which combination of steps will meet these requirements? (Select TWO.)

Options:

A.

Create a public Network Load Balancer. Specify the application target group.

B.

Create a Gateway Load Balancer. Specify the application target group.

C.

Create a public Application Load Balancer. Specify the application target group.

D.

Create a second target group. Add Elastic IP addresses to the EC2 instances.

E.

Create a web ACL in AWS WAF. Associate the web ACL with the endpoint.

Questions 222

A company uses Amazon EC2 instances and Amazon Elastic Block Store (Amazon EBS) volumes to run an application. The company creates one snapshot of each EBS volume every day.

The company needs to prevent users from accidentally deleting the EBS volume snapshots. The solution must not change the administrative rights of a storage administrator user.

Which solution will meet these requirements with the LEAST administrative effort?

Options:

A.

Create an IAM role that has permission to delete snapshots. Attach the role to a new EC2 instance. Use the AWS CLI from the new EC2 instance to delete snapshots.

B.

Create an IAM policy that denies snapshot deletion. Attach the policy to the storage administrator user.

C.

Add tags to the snapshots. Create tag-level retention rules in the Recycle Bin for EBS snapshots. Configure rule lock settings for the retention rules.

D.

Take EBS snapshots by using the EBS direct APIs. Copy the snapshots to an Amazon S3 bucket. Configure S3 Versioning and Object Lock on the bucket.

Questions 223

A solutions architect must design a database solution for a high-traffic ecommerce web application. The database stores customer profiles and shopping cart information. The database must support a peak load of several million requests each second and deliver responses in milliseconds. The operational overhead for managing and scaling the database must be minimized.

Which database solution should the solutions architect recommend?

Options:

A.

Amazon Aurora

B.

Amazon DynamoDB

C.

Amazon RDS

D.

Amazon Redshift

Questions 224

A company is planning to connect a remote office to its AWS infrastructure. The office requires permanent and secure connectivity to AWS. The connection must provide secure access to resources in two VPCs. However, the VPCs must not be able to access each other.

Which solution will meet these requirements?

Options:

A.

Create two transit gateways. Set up one AWS Site-to-Site VPN connection from the remote office to each transit gateway. Connect one VPC to the transit gateway. Configure route table propagation to the appropriate transit gateway based on the destination VPC IP range.

B.

Set up one AWS Site-to-Site VPN connection from the remote office to each of the VPCs. Update the VPC route tables with static routes to the remote office resources.

C.

Set up one AWS Site-to-Site VPN connection from the remote office to one of the VPCs. Set up VPC peering between the two VPCs. Update the VPC route tables with static routes to the remote office and peered resources.

D.

Create a transit gateway. Set up an AWS Direct Connect gateway and one Direct Connect connection between the remote office and the Direct Connect gateway. Associate the transit gateway with the Direct Connect gateway. Configure a separate private virtual interface (VIF) for each VPC, and configure routing.

Questions 225

A company uses an Amazon RDS for MySQL database with provisioned IOPS in a Multi-AZ deployment. The company recently migrated the database to Amazon DynamoDB tables successfully. However, the company needs to retain the RDS for MySQL database for several months for occasional post-migration testing and debugging.

The company took a snapshot of the RDS database immediately after the migration. The RDS database must be available to query within 10 minutes when needed.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.

Use the stop-db-cluster AWS CLI command and the stop-db-instance CLI command to stop the RDS database. Restart the database as needed by using CLI commands.

B.

Create a new RDS database. Attach Amazon EBS magnetic volumes that contain the original RDS database snapshot to the new database. Terminate the original RDS database.

C.

Create a new RDS database in a single Availability Zone based on the original RDS database snapshot. Terminate the original RDS database.

D.

Create an Amazon Aurora MySQL Serverless v2 cluster based on the RDS database snapshot. Terminate the original RDS database.

Questions 226

A company hosts a website on Amazon EC2 instances behind an Application Load Balancer (ALB). The instances run Amazon Linux in an Auto Scaling group. Each instance stores product manuals on Amazon EBS volumes.

New instances often start with outdated data and can take up to 30 minutes to download updates. The company needs a solution that ensures all instances always have up-to-date product manuals, allows rapid scaling, and does not require application code changes.

Which solution will meet these requirements?

Options:

A.

Store the product manuals on instance store volumes attached to each EC2 instance.

B.

Store the product manuals in an Amazon S3 bucket. Configure EC2 instances to download updates from the bucket.

C.

Store the product manuals in an Amazon EFS file system. Mount the EFS volume on the EC2 instances.

D.

Store the product manuals in an S3 bucket using S3 Standard-IA. Configure EC2 instances to download updates from S3.

Questions 227

A company runs a web application that uses Amazon RDS for MySQL to store relational data. Data in the database does not change frequently.

A solutions architect notices that during peak usage times, the database has performance issues when it serves the data. The company wants to improve the performance of the database.

Which combination of steps will meet these requirements? (Select TWO.)

Options:

A.

Integrate AWS WAF with the application.

B.

Create a read replica for the database. Redirect read traffic to the read replica.

C.

Create an Amazon ElastiCache (Memcached) cluster. Configure the application and the database to integrate with the cluster.

D.

Use the Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA) storage class to store the data that changes infrequently.

E.

Migrate the database to Amazon DynamoDB. Configure the application to use the DynamoDB database.

Questions 228

A company stores customer data in a multitenant Amazon S3 bucket. Each customer's data is stored in a prefix that is unique to the customer. The company needs to migrate data for specific customers to a new, dedicated S3 bucket that is in the same AWS Region as the source bucket. The company must preserve object metadata such as creation dates and version IDs.

After the migration is finished, the company must delete the source data for the migrated customers from the original multitenant S3 bucket.

Which combination of solutions will meet these requirements with the LEAST overhead? (Select THREE.)

Options:

A.

Create a new S3 bucket as a destination bucket. Enable versioning on the new bucket.

B.

Use S3 batch operations to copy objects from the specified prefixes to the destination bucket.

C.

Use the S3 CopyObject API, and create a script to copy data to the destination S3 bucket.

D.

Configure S3 Same-Region Replication (SRR) to replicate existing data from the specified prefixes in the source bucket to the destination bucket.

E.

Configure AWS DataSync to migrate data from the specified prefixes in the source bucket to the destination bucket.

F.

Use an S3 Lifecycle policy to delete objects from the source bucket after the data is migrated to the destination bucket.
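To make option C's scripted approach concrete, the sketch below builds one CopyObject-style request per object under a customer's prefix. Bucket names, keys, and the prefix are hypothetical; in practice each entry would be passed to the S3 CopyObject API (for example, boto3's `s3.copy_object`), which performs a server-side copy within the Region.

```python
# Sketch of a prefix-scoped copy plan for S3 objects (hypothetical bucket
# and prefix names). Each entry mirrors the parameters that a CopyObject
# call would take.

def build_copy_requests(source_bucket, dest_bucket, keys, customer_prefix):
    """Return one CopyObject-style request per object under the prefix."""
    requests = []
    for key in keys:
        if not key.startswith(customer_prefix):
            continue  # skip other tenants' data
        requests.append({
            "Bucket": dest_bucket,
            "Key": key,
            "CopySource": {"Bucket": source_bucket, "Key": key},
            "MetadataDirective": "COPY",  # carry over user-defined metadata
        })
    return requests

plan = build_copy_requests(
    "multitenant-bucket", "customer-a-bucket",
    ["customer-a/orders.csv", "customer-b/orders.csv"], "customer-a/",
)
```

Note that a copy creates new objects, so system metadata such as creation dates and version IDs are newly generated by the destination bucket; that limitation is exactly why the question distinguishes copying from replication, which preserves version IDs.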

Questions 229

A company operates a food delivery service. Because of recent growth, the company's order processing system is experiencing scaling problems during peak traffic hours. The current architecture includes Amazon EC2 instances in an Auto Scaling group that collect orders from an application. A second group of EC2 instances in an Auto Scaling group fulfills the orders.

The order collection process occurs quickly, but the order fulfillment process can take longer. Data must not be lost because of a scaling event.

A solutions architect must ensure that the order collection process and the order fulfillment process can both scale adequately during peak traffic hours.

Which solution will meet these requirements?

Options:

A.

Use Amazon CloudWatch to monitor the CPUUtilization metric for each instance in both Auto Scaling groups. Configure each Auto Scaling group's minimum capacity to meet its peak workload value.

B.

Use Amazon CloudWatch to monitor the CPUUtilization metric for each instance in both Auto Scaling groups. Configure a CloudWatch alarm to invoke an Amazon SNS topic to create additional Auto Scaling groups on demand.

C.

Provision two Amazon SQS queues. Use one SQS queue for order collection. Use the second SQS queue for order fulfillment. Configure the EC2 instances to poll their respective queues. Scale the Auto Scaling groups based on notifications that the queues send.

D.

Provision two Amazon SQS queues. Use one SQS queue for order collection. Use the second SQS queue for order fulfillment. Configure the EC2 instances to poll their respective queues. Scale the Auto Scaling groups based on the number of messages in each queue.
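Scaling on queue depth usually means tracking a backlog-per-instance target derived from the queue's visible-message count. The arithmetic can be sketched as follows; the message rate and latency target are hypothetical numbers for illustration.

```python
import math

# Illustrative backlog-per-instance calculation for scaling consumers on
# an SQS queue (driven by the ApproximateNumberOfMessagesVisible metric).

def desired_capacity(visible_messages, seconds_per_message, target_latency_s):
    """Instances needed so the queue drains within the target latency."""
    acceptable_backlog_per_instance = target_latency_s / seconds_per_message
    return math.ceil(visible_messages / acceptable_backlog_per_instance)

# 1,200 queued orders, 0.5 s to fulfill each, drain within 60 s:
print(desired_capacity(1200, 0.5, 60))  # -> 10
```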

Questions 230

A company is building a serverless application to process orders from an ecommerce site. The application needs to handle bursts of traffic during peak usage hours and to maintain high availability. The orders must be processed asynchronously in the order the application receives them.

Which solution will meet these requirements?

Options:

A.

Use an Amazon Simple Notification Service (Amazon SNS) topic to receive orders. Use an AWS Lambda function to process the orders.

B.

Use an Amazon Simple Queue Service (Amazon SQS) FIFO queue to receive orders. Use an AWS Lambda function to process the orders.

C.

Use an Amazon Simple Queue Service (Amazon SQS) standard queue to receive orders. Use AWS Batch jobs to process the orders.

D.

Use an Amazon Simple Notification Service (Amazon SNS) topic to receive orders. Use AWS Batch jobs to process the orders.

Questions 231

A company wants to visualize its AWS spend and resource usage. The company wants to use an AWS managed service to provide visual dashboards.

Which solution will meet these requirements?

Options:

A.

Configure an export in AWS Data Exports. Use Amazon QuickSight to create a cost and usage dashboard. View the data in QuickSight.

B.

Configure one custom budget in AWS Budgets for costs. Configure a second custom budget for usage. Schedule daily AWS Budgets reports by using the two budgets as sources.

C.

Configure AWS Cost Explorer to use user-defined cost allocation tags with hourly granularity to generate detailed data.

D.

Configure an export in AWS Data Exports. Use the standard export option. View the data in Amazon Athena.

Questions 232

A company runs a web application that uses an Amazon RDS for MySQL database. A company employee caused data loss by accidentally editing information in a database table.

The company must be able to recover from similar incidents in the future. The company must be able to restore the database to a specific point in time within the previous 30 days. The solution must restore the database with a maximum of 5 minutes of data loss.

Which solution will meet these requirements?

Options:

A.

Read replicas

B.

Manual snapshots

C.

Automated backups

D.

Multi-AZ deployments
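A point-in-time restore from automated backups is issued as a restore request against the source instance; the sketch below builds the parameters that boto3's `rds.restore_db_instance_to_point_in_time` would take. Instance identifiers and the timestamp are hypothetical, and note that the restore always produces a new DB instance.

```python
from datetime import datetime, timedelta, timezone

# Sketch of an RDS point-in-time restore request. Automated backups with a
# 30-day retention period make any moment in that window restorable,
# typically to within about the last 5 minutes.

def pitr_request(source_id, target_id, restore_time):
    return {
        "SourceDBInstanceIdentifier": source_id,
        "TargetDBInstanceIdentifier": target_id,  # restore creates a NEW instance
        "RestoreTime": restore_time,
    }

# Restore to one minute before the accidental edit:
bad_edit_at = datetime(2024, 5, 1, 14, 30, tzinfo=timezone.utc)
req = pitr_request("orders-db", "orders-db-restored",
                   bad_edit_at - timedelta(minutes=1))
```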

Questions 233

A company is creating a mobile financial app that gives users the ability to sign up and store personal information. The app uses an Amazon DynamoDB table to store user details and preferences.

The app generates a credit score report by using the data that is stored in DynamoDB. The app sends credit score reports to users once every month.

The company needs to provide users with an option to remove their data and preferences. The app must delete customer data within one month of receiving a request to delete the data.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an AWS Lambda function to delete user information. Create an Amazon EventBridge rule that runs when a specified TTL expires. Configure the EventBridge rule to invoke the Lambda function.

B.

Create a DynamoDB stream. Create an AWS Lambda function to delete user information. When a specified TTL expires, write user information to the DynamoDB stream from the DynamoDB table. Configure the DynamoDB stream to invoke the Lambda function to delete user information.

C.

Enable TTL in DynamoDB. Set the expiration date as an attribute. Create an AWS Lambda function to set the TTL based on the expiration date value. Invoke the Lambda function when a user requests to delete personal data.

D.

Enable TTL in DynamoDB. Create an AWS Lambda function to delete user information. Configure AWS Config to detect the DynamoDB state change when TTL expires and to invoke the Lambda function.
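The TTL mechanic the options refer to stores an epoch-seconds timestamp in a designated attribute; DynamoDB deletes items whose timestamp has passed, typically within a few days of expiry, which fits a one-month deletion window. A minimal sketch, with a hypothetical attribute name and item shape:

```python
import time

# Sketch of marking a DynamoDB item for TTL deletion ~30 days out.
# The "ttl" attribute name is hypothetical; TTL must be enabled on the
# table for whichever attribute name is chosen.

THIRTY_DAYS = 30 * 24 * 60 * 60

def mark_for_deletion(item, now=None):
    """Set the item's TTL attribute 30 days from the delete request."""
    now = int(now if now is not None else time.time())
    item["ttl"] = now + THIRTY_DAYS
    return item

item = mark_for_deletion({"user_id": "u-123"}, now=1_700_000_000)
print(item["ttl"])  # -> 1702592000
```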

Questions 234

A company hosts a video streaming web application in a VPC. The company uses a Network Load Balancer (NLB) to handle TCP traffic for real-time data processing. There have been unauthorized attempts to access the application.

The company wants to improve application security with minimal architectural change to prevent unauthorized attempts to access the application.

Which solution will meet these requirements?

Options:

A.

Implement a series of AWS WAF rules directly on the NLB to filter out unauthorized traffic.

B.

Recreate the NLB with a security group to allow only trusted IP addresses.

C.

Deploy a second NLB in parallel with the existing NLB configured with a strict IP address allow list.

D.

Use AWS Shield Advanced to provide enhanced DDoS protection and prevent unauthorized access attempts.

Questions 235

A finance company uses backup software to back up its data to physical tape storage on-premises. To comply with regulations, the company needs to store the data for 7 years. The company must be able to restore archived data within one week when necessary.

The company wants to migrate the backup data to AWS to reduce costs. The company does not want to change the current backup software.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Use AWS Storage Gateway Tape Gateway to copy the data to virtual tapes. Use AWS DataSync to migrate the virtual tapes to Amazon S3 Standard-Infrequent Access (S3 Standard-IA). Change the target of the backup software to S3 Standard-IA.

B.

Convert the physical tapes to virtual tapes. Use AWS DataSync to migrate the virtual tapes to Amazon S3 Glacier Flexible Retrieval. Change the target of the backup software to S3 Glacier Flexible Retrieval.

C.

Use AWS Storage Gateway Tape Gateway to copy the data to virtual tapes. Migrate the virtual tapes to Amazon S3 Glacier Deep Archive. Change the target of the backup software to the virtual tapes.

D.

Convert the physical tapes to virtual tapes. Use AWS Snowball Edge storage-optimized devices to migrate the virtual tapes to Amazon S3 Glacier Flexible Retrieval. Change the target of the backup software to S3 Glacier Flexible Retrieval.

Questions 236

A company needs to archive an on-premises relational database. The company wants to retain the data. The company needs to be able to run SQL queries on the archived data to create annual reports.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use AWS DMS to migrate the on-premises database to an Amazon RDS instance. Retire the on-premises database. Maintain the RDS instance in a stopped state until the data is needed for reports.

B.

Set up database replication from the on-premises database to an Amazon EC2 instance. Retire the on-premises database. Make a snapshot of the EC2 instance. Maintain the EC2 instance in a stopped state until the data is needed for reports.

C.

Create a database backup on premises. Use AWS DataSync to transfer the data to Amazon S3. Create an S3 Lifecycle configuration to move the data to S3 Glacier Deep Archive. Restore the backup to Amazon EC2 instances to run reports.

D.

Use AWS DMS to migrate the on-premises databases to Amazon S3 in Apache Parquet format. Store the data in S3 Glacier Flexible Retrieval. Use Amazon Athena to run reports.

Questions 237

A company uses Amazon S3 to store customer data that contains personally identifiable information (PII) attributes. The company needs to make the customer information available to company resources through an AWS Glue Data Catalog. The company needs fine-grained access control for the data so that only specific IAM roles can access the PII data.

Which solution will meet these requirements?

Options:

A.

Create one IAM policy that grants access to PII. Create a second IAM policy that grants access to non-PII data. Assign the PII policy to the specified IAM roles.

B.

Create one IAM role that grants access to PII. Create a second IAM role that grants access to non-PII data. Assign the PII policy to the specified IAM roles.

C.

Use AWS Lake Formation to provide the specified IAM roles access to the PII data.

D.

Use AWS Glue to create one view for PII data. Create a second view for non-PII data. Provide the specified IAM roles access to the PII view.

Questions 238

A company has developed an API by using an Amazon API Gateway REST API and AWS Lambda functions. The API serves static and dynamic content to users worldwide. The company wants to decrease the latency of transferring content for API requests.

Which solution will meet these requirements?

Options:

A.

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.

B.

Deploy the REST API as a Regional API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.

C.

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.

D.

Deploy the REST API as a Regional API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.

Questions 239

A company's data platform uses an Amazon Aurora MySQL database. The database has multiple read replicas and multiple DB instances across different Availability Zones. Users have recently reported errors from the database that indicate that there are too many connections. The company wants to reduce the failover time by 20% when a read replica is promoted to primary writer.

Which solution will meet this requirement?

Options:

A.

Switch from Aurora to Amazon RDS with Multi-AZ cluster deployment.

B.

Use Amazon RDS Proxy in front of the Aurora database.

C.

Switch to Amazon DynamoDB with DynamoDB Accelerator (DAX) for read connections.

D.

Switch to Amazon Redshift with relocation capability.

Questions 240

A company has an application that serves clients that are deployed in more than 20,000 retail storefront locations around the world. The application consists of backend web services that are exposed over HTTPS on port 443. The application is hosted on Amazon EC2 instances behind an Application Load Balancer (ALB). The retail locations communicate with the web application over the public internet. The company allows each retail location to register the IP address that the retail location has been allocated by its local ISP.

The company's security team recommends increasing the security of the application endpoint by restricting access to only the IP addresses registered by the retail locations.

What should a solutions architect do to meet these requirements?

Options:

A.

Associate an AWS WAF web ACL with the ALB. Use IP rule sets on the ALB to filter traffic. Update the IP addresses in the rule to include the registered IP addresses.

B.

Deploy AWS Firewall Manager to manage the ALB. Configure firewall rules to restrict traffic to the ALB. Modify the firewall rules to include the registered IP addresses.

C.

Store the IP addresses in an Amazon DynamoDB table. Configure an AWS Lambda authorization function on the ALB to validate that incoming requests are from the registered IP addresses.

D.

Configure the network ACL on the subnet that contains the public interface of the ALB. Update the ingress rules on the network ACL with entries for each of the registered IP addresses.
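Whichever filtering layer is chosen, the registered addresses have to be held in CIDR notation, which is what the AWS WAF `CreateIPSet` API expects. A minimal sketch of normalizing registrations with the standard `ipaddress` module (the sample addresses are documentation-range placeholders):

```python
import ipaddress

# Sketch of preparing registered store IPs for an AWS WAF IP set. Bare
# IPv4 addresses become /32 entries; malformed registrations raise a
# ValueError instead of being silently accepted.

def to_ip_set_entries(registered):
    entries = []
    for raw in registered:
        net = ipaddress.ip_network(raw.strip(), strict=True)
        entries.append(str(net))
    return entries

print(to_ip_set_entries(["203.0.113.10", "198.51.100.0/24"]))
# -> ['203.0.113.10/32', '198.51.100.0/24']
```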

Questions 241

A company stores a large number of image files in an Amazon S3 bucket. The images need to be readily available for 180 days. The company rarely accesses images that are older than 180 days. However, the company must be able to access images immediately when necessary.

The company wants to archive images that are older than 360 days, but the company must be able to access the images instantly when required. The images cannot be deleted. The company requires high availability and redundancy throughout the entire lifecycle of the files.

The company will use S3 Standard storage for the first 180 days. The company needs to configure S3 Lifecycle rules to handle the remaining lifecycle stages of the files.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Transition the objects to S3 One Zone-Infrequent Access (S3 One Zone-IA) after 180 days. Transition the objects to S3 Glacier Instant Retrieval after 360 days.

B.

Transition the objects to S3 One Zone-Infrequent Access (S3 One Zone-IA) after 180 days. Transition the objects to S3 Glacier Flexible Retrieval after 360 days.

C.

Transition the objects to S3 Standard-Infrequent Access (S3 Standard-IA) after 180 days. Transition the objects to S3 Glacier Instant Retrieval after 360 days.

D.

Transition the objects to S3 Standard-Infrequent Access (S3 Standard-IA) after 180 days. Transition the objects to S3 Glacier Flexible Retrieval after 360 days.
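Transitions like these are expressed as an S3 Lifecycle configuration, in the shape that `put_bucket_lifecycle_configuration` accepts. The sketch below uses one possible tiering (Standard-IA at 180 days, Glacier Instant Retrieval at 360 days) purely for illustration; the rule ID and prefix are hypothetical.

```python
# Sketch of an S3 Lifecycle configuration implementing a two-stage
# transition. S3 Standard applies implicitly for the first 180 days.

lifecycle = {
    "Rules": [{
        "ID": "images-archive",
        "Status": "Enabled",
        "Filter": {"Prefix": ""},  # apply to every object in the bucket
        "Transitions": [
            {"Days": 180, "StorageClass": "STANDARD_IA"},
            {"Days": 360, "StorageClass": "GLACIER_IR"},  # instant retrieval
        ],
    }],
}
```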

Questions 242

A company runs a critical public application on Amazon Elastic Kubernetes Service (Amazon EKS) clusters. The application has a microservices architecture. The company needs to implement a solution that collects, aggregates, and summarizes metrics and logs from the application in a centralized location.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.

Run the Amazon CloudWatch agent in the existing EKS cluster. Use a CloudWatch dashboard to view the metrics and logs.

B.

Configure a data stream in Amazon Kinesis Data Streams. Use Amazon Kinesis Data Firehose to read events and to deliver the events to an Amazon S3 bucket. Use Amazon Athena to view the events.

C.

Configure AWS CloudTrail to capture data events. Use Amazon OpenSearch Service to query CloudTrail.

D.

Configure Amazon CloudWatch Container Insights in the existing EKS cluster. Use a CloudWatch dashboard to view the metrics and logs.

Questions 243

A company runs all its business applications in the AWS Cloud. The company uses AWS Organizations to manage multiple AWS accounts.

A solutions architect needs to review all permissions that are granted to IAM users to determine which IAM users have more permissions than required.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.

Use Network Access Analyzer to review all access permissions in the company's AWS accounts.

B.

Create an Amazon CloudWatch alarm that activates when an IAM user creates or modifies resources in an AWS account.

C.

Use AWS Identity and Access Management (IAM) Access Analyzer to review all the company's resources and accounts.

D.

Use Amazon Inspector to find vulnerabilities in existing IAM policies.

Questions 244

A company is building an application that needs to process real-time streaming data. The application must process and transform the data and then store the data for later analysis.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use Amazon Kinesis Data Streams to ingest streaming data. Configure Amazon EC2 instances to process and transform data records from the data streams. Configure the EC2 instances to store the processed and transformed data in an Amazon RDS for MySQL database.

B.

Send streaming data to an Amazon SQS queue. Configure AWS Lambda functions to process the data in the SQS queue. Store the processed data in an Amazon DynamoDB table.

C.

Use Amazon Kinesis Data Streams to ingest streaming data. Configure an AWS Lambda function to process and transform data records from the data streams. Configure the Lambda function to store the processed and transformed data in an Amazon DynamoDB table.

D.

Send streaming data to an Amazon SNS topic. Create an application to process the data on an Amazon EC2 instance. Store the processed data in an Amazon ElastiCache cache.
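The Lambda-based processing that option C describes receives Kinesis records base64-encoded inside the event payload. A minimal handler sketch, with hypothetical field names and the DynamoDB write indicated by a comment:

```python
import base64
import json

# Sketch of an AWS Lambda handler for a Kinesis event source: decode each
# record, transform it, and collect items shaped for a DynamoDB write.

def handler(event, context=None):
    items = []
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        items.append({
            "id": payload["id"],
            "amount_cents": int(round(payload["amount"] * 100)),  # transform
        })
    # for item in items: table.put_item(Item=item)   # DynamoDB write step
    return items

# Simulate one incoming record:
raw = base64.b64encode(json.dumps({"id": "e1", "amount": 12.5}).encode()).decode()
print(handler({"Records": [{"kinesis": {"data": raw}}]}))
# -> [{'id': 'e1', 'amount_cents': 1250}]
```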

Questions 245

A company wants to share data between applications that run in separate AWS accounts. The company wants to use Amazon API Gateway REST APIs to expose private APIs. The company wants to ensure that only authorized accounts can invoke the private APIs.

Which solution will meet this requirement?

Options:

A.

Use an API Gateway interface endpoint policy to grant access to specific accounts.

B.

Use an API Gateway resource policy to grant access to specific accounts.

C.

Use cross-account IAM policies to grant access to the private APIs.

D.

Use AWS Lambda authorizers to grant access to specific accounts.
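An API Gateway resource policy is an IAM-style JSON document attached to the API itself. The sketch below builds one that allows `execute-api:Invoke` only for named accounts; the account IDs and API ARN are placeholders.

```python
# Sketch of an API Gateway resource policy restricting invocation to
# specific AWS accounts. All identifiers are hypothetical.

def cross_account_policy(api_arn, account_ids):
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": [f"arn:aws:iam::{a}:root" for a in account_ids]},
            "Action": "execute-api:Invoke",
            "Resource": f"{api_arn}/*",  # all stages, methods, and paths
        }],
    }

policy = cross_account_policy(
    "arn:aws:execute-api:us-east-1:111111111111:abcdef123", ["222222222222"]
)
```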

Questions 246

A company is developing a SaaS solution for customers. The solution runs on Amazon EC2 instances that have Amazon Elastic Block Store (Amazon EBS) volumes attached.

Within the SaaS application, customers can request how much storage they need. The application needs to allocate the amount of block storage each customer requests.

A solutions architect must design an operationally efficient solution that meets the storage scaling requirement.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Migrate the data from the EBS volumes to an Amazon S3 bucket. Use the Amazon S3 Standard storage class.

B.

Migrate the data from the EBS volumes to an Amazon Elastic File System (Amazon EFS) file system. Use the EFS Standard storage class. Invoke an AWS Lambda function to increase the EFS volume capacity based on user input.

C.

Migrate the data from the EBS volumes to an Amazon FSx for Windows File Server file system. Invoke an AWS Lambda function to increase the capacity of the file system based on user input.

D.

Invoke an AWS Lambda function to increase the size of EBS volumes based on user input by using EBS Elastic Volumes.

Questions 247

A company has an application that runs only on Amazon EC2 Spot Instances. The instances run in an Amazon EC2 Auto Scaling group with scheduled scaling actions. However, the capacity does not always increase at the scheduled times, and instances terminate many times a day. A solutions architect must ensure that the instances launch on time and have fewer interruptions.

Which action will meet these requirements?

Options:

A.

Specify the capacity-optimized allocation strategy for Spot Instances. Add more instance types to the Auto Scaling group.

B.

Specify the capacity-optimized allocation strategy for Spot Instances. Increase the size of the instances in the Auto Scaling group.

C.

Specify the lowest-price allocation strategy for Spot Instances. Add more instance types to the Auto Scaling group.

D.

Specify the lowest-price allocation strategy for Spot Instances. Increase the size of the instances in the Auto Scaling group.

Questions 248

A company is designing a new Amazon Elastic Kubernetes Service (Amazon EKS) deployment to host multi-tenant applications that use a single cluster. The company wants to ensure that each pod has its own hosted environment. The environments must not share CPU, memory, storage, or elastic network interfaces.

Which solution will meet these requirements?

Options:

A.

Use Amazon EC2 instances to host self-managed Kubernetes clusters. Use taints and tolerations to enforce isolation boundaries.

B.

Use Amazon EKS with AWS Fargate. Use Fargate to manage resources and to enforce isolation boundaries.

C.

Use Amazon EKS and self-managed node groups. Use taints and tolerations to enforce isolation boundaries.

D.

Use Amazon EKS and managed node groups. Use taints and tolerations to enforce isolation boundaries.

Questions 249

A company uses AWS WAF to protect its web applications. A solutions architect configures a web ACL that uses several rules, including a rule that inspects the HTTP request body for malicious content.

The solutions architect notices that the web ACL is not inspecting large HTTP POST requests properly. As a result, suspicious activities are not being detected. Some large HTTP POST requests are more than 8 MB in size.

The solutions architect must ensure that the web ACL inspects the large HTTP POST requests properly.

Which solution will meet this requirement?

Options:

A.

Create two custom AWS WAF rules. Configure one rule to block all oversized requests. Configure the second rule with a higher priority to allow large requests from legitimate hosts.

B.

Enable AWS Shield Advanced. Reconfigure the web ACL to block oversized requests by using Shield Advanced.

C.

Verify that the Content-Type header is correctly set in the HTTP requests that AWS WAF rules inspect.

D.

Create an AWS Lambda function to preprocess the large requests before AWS rules inspect the requests.

Questions 250

An ecommerce company is launching a new marketing campaign. The company anticipates that the campaign will generate ten times the normal number of daily orders through the company's ecommerce application. The campaign will last 3 days.

The ecommerce application architecture is based on Amazon EC2 instances in an Auto Scaling group and an Amazon RDS for MySQL database. The application writes order transactions to an Amazon Elastic File System (Amazon EFS) file system before the application writes orders to the database. During normal operations, the application write operations peak at 5,000 IOPS.

A solutions architect needs to ensure that the application can handle the anticipated workload during the marketing campaign.

Which solution will meet this requirement?

Options:

A.

For the duration of the campaign, increase the provisioned IOPS for the RDS for MySQL database. Set the Amazon EFS throughput mode to Bursting throughput.

B.

For the duration of the campaign, increase the provisioned IOPS for the RDS for MySQL database. Set the Amazon EFS throughput mode to Elastic throughput.

C.

Convert the database to a Multi-AZ deployment. Set the Amazon EFS throughput mode to Elastic throughput for the duration of the campaign.

D.

Use AWS Database Migration Service (AWS DMS) to convert the database to RDS for PostgreSQL. Set the Amazon EFS throughput mode to Bursting throughput.

Questions 251

A company uses Amazon EC2 instances behind an Application Load Balancer (ALB) to serve content to users. The company uses Amazon Elastic Block Store (Amazon EBS) volumes to store data.

The company needs to encrypt data in transit and at rest.

Which combination of services will meet these requirements? (Select TWO.)

Options:

A.

Amazon GuardDuty

B.

AWS Shield

C.

AWS Certificate Manager (ACM)

D.

AWS Secrets Manager

E.

AWS Key Management Service (AWS KMS)
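The two encryption requirements map to concrete API parameters: AWS KMS keys encrypt EBS volumes at rest, and an ACM certificate on the ALB's HTTPS listener encrypts traffic in transit. A sketch with placeholder ARNs (the parameter shapes follow `ec2.create_volume` and `elbv2.create_listener`):

```python
# Sketch of the API parameters behind "encrypt at rest" and "encrypt in
# transit". All ARNs and names are hypothetical placeholders.

create_volume_params = {           # ec2.create_volume: at-rest via KMS
    "AvailabilityZone": "us-east-1a",
    "Size": 100,
    "Encrypted": True,
    "KmsKeyId": "arn:aws:kms:us-east-1:111111111111:key/EXAMPLE",
}

create_listener_params = {         # elbv2.create_listener: in-transit via ACM
    "Protocol": "HTTPS",
    "Port": 443,
    "Certificates": [{
        "CertificateArn":
            "arn:aws:acm:us-east-1:111111111111:certificate/EXAMPLE",
    }],
}
```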

Questions 252

A company wants to build a serverless application in which multiple microservices need to exchange messages. The company needs to ensure that messages that the microservices send to one another are processed exactly once, in the exact order the messages are sent.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.

Create an Amazon SQS FIFO queue. Configure the microservices to use the SQS queue to exchange messages.

B.

Use Amazon SNS topics to connect the microservices to one another. Subscribe the microservices to the SNS topics. Use the Amazon SNS API to send and receive notifications between microservices.

C.

Create an Amazon SQS standard queue. Connect the microservices to one another by using Amazon EventBridge events that the microservices exchange through the SQS queue.

D.

Use Amazon Managed Streaming for Apache Kafka (Amazon MSK) on Amazon EC2 instances to deploy the application.
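The ordering and exactly-once behavior discussed above rests on two FIFO-queue parameters of `sqs.send_message`: `MessageGroupId` preserves order within a group, and `MessageDeduplicationId` lets SQS drop retransmits within its deduplication window. A sketch with a hypothetical queue URL and message shape:

```python
import hashlib
import json

# Sketch of a FIFO-queue message as sqs.send_message would take it,
# using a content-derived deduplication ID so that an identical retry
# is recognized as a duplicate.

def fifo_message(queue_url, group_id, body):
    payload = json.dumps(body, sort_keys=True)  # canonical form for hashing
    return {
        "QueueUrl": queue_url,
        "MessageBody": payload,
        "MessageGroupId": group_id,
        "MessageDeduplicationId": hashlib.sha256(payload.encode()).hexdigest(),
    }

m1 = fifo_message("https://sqs.us-east-1.amazonaws.com/1/orders.fifo",
                  "orders", {"order_id": 7})
m2 = fifo_message("https://sqs.us-east-1.amazonaws.com/1/orders.fifo",
                  "orders", {"order_id": 7})
print(m1["MessageDeduplicationId"] == m2["MessageDeduplicationId"])  # -> True
```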

Questions 253

A company wants to re-architect an application to use Amazon SQS queues. The company must ensure that the application can handle sudden increases in traffic.

Which Amazon SQS feature will help meet this requirement?

Options:

A.

FIFO queues

B.

Visibility timeout

C.

Message batching

D.

Long polling

Questions 254

A company has established a new AWS account. The account is newly provisioned and no changes have been made to the default settings. The company is concerned about the security of the AWS account root user.

What should be done to secure the root user?

Options:

A.

Create IAM users for daily administrative tasks. Disable the root user.

B.

Create IAM users for daily administrative tasks. Enable multi-factor authentication on the root user.

C.

Generate an access key for the root user. Use the access key for daily administration tasks instead of the AWS Management Console.

D.

Provide the root user credentials to the most senior solutions architect. Have the solutions architect use the root user for daily administration tasks.

Questions 255

A company uses Amazon Redshift to store structured data and Amazon S3 to store unstructured data. The company wants to analyze the stored data and create business intelligence reports. The company needs a data visualization solution that is compatible with Amazon Redshift and Amazon S3.

Which solution will meet these requirements?

Options:

A.

Use Amazon Redshift query editor v2 to analyze data stored in Amazon Redshift. Use Amazon Athena to analyze data stored in Amazon S3. Use Amazon QuickSight to access Amazon Redshift and Athena, visualize the data analyses, and create business intelligence reports.

B.

Use Amazon Redshift Serverless to analyze data stored in Amazon Redshift. Use Amazon S3 Object Lambda to analyze data stored in Amazon S3. Use Amazon Managed Grafana to access Amazon Redshift and Object Lambda, visualize the data analyses, and create business intelligence reports.

C.

Use Amazon Redshift Spectrum to analyze data stored in Amazon Redshift. Use Amazon Athena to analyze data stored in Amazon S3. Use Amazon QuickSight to access Amazon Redshift and Athena, visualize the data analyses, and create business intelligence reports.

D.

Use Amazon OpenSearch Service to analyze data stored in Amazon Redshift and Amazon S3. Use Amazon Managed Grafana to access OpenSearch Service, visualize the data analyses, and create business intelligence reports.

Questions 256

A company wants to migrate from an on-premises data center to AWS. The data center hosts a storage server that stores data in an NFS-based file system. The storage server stores 200 GB of data. The company needs to migrate the data without interruption to existing services. Multiple resources in AWS must be able to access the data by using the NFS protocol.

Which combination of steps will meet these requirements MOST cost-effectively? (Select TWO.)

Options:

A.

Create an Amazon FSx for Lustre file system.

B.

Create an Amazon Elastic File System (Amazon EFS) file system.

C.

Create an Amazon S3 bucket to receive the data.

D.

Create an Amazon FSx for Windows File Server file system.

E.

Install an AWS DataSync agent in the on-premises data center. Use a DataSync task between the on-premises file system and the AWS file system.

Questions 257

A company has migrated a two-tier application from its on-premises data center to the AWS Cloud. The data tier is a Multi-AZ deployment of Amazon RDS for Oracle with 12 TiB of General Purpose SSD Amazon EBS storage. The application is designed to read and store documents in the database as binary large objects (BLOBs) with an average document size of 6 MB.

The database size has grown over time, reducing performance and increasing the cost of storage. The company must improve the database performance and needs a solution that is highly available and resilient.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Reduce the RDS DB instance size. Increase the storage capacity to 24 TiB. Change the storage type to Magnetic.

B.

Increase the RDS DB instance size. Increase the storage capacity to 24 TiB. Change the storage type to Provisioned IOPS.

C.

Create an Amazon S3 bucket. Update the application to store documents in the S3 bucket. Store the object metadata in the existing database.

D.

Create an Amazon DynamoDB table. Update the application to use DynamoDB. Use AWS DMS to migrate data from the Oracle database to DynamoDB.

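The pattern in option C (Questions 257) can be sketched without any AWS calls: store each BLOB in object storage under a key, and keep only a small metadata row in the relational database. Below, a dict stands in for the S3 bucket and a list of rows stands in for the metadata table; all names are illustrative, not the question's actual implementation.

```python
import hashlib
import json

# Stand-ins for the real stores: a dict plays the S3 bucket,
# a list of rows plays the relational metadata table.
bucket = {}
metadata_table = []

def store_document(doc_id: str, content: bytes) -> dict:
    """Upload the BLOB to object storage; record only metadata in the DB."""
    key = f"documents/{doc_id}.bin"
    bucket[key] = content                      # real code: s3.put_object(...)
    row = {
        "doc_id": doc_id,
        "s3_key": key,
        "size_bytes": len(content),
        "sha256": hashlib.sha256(content).hexdigest(),
    }
    metadata_table.append(row)                 # real code: INSERT INTO documents ...
    return row

def fetch_document(doc_id: str) -> bytes:
    """Look up the S3 key in the metadata table, then fetch the object."""
    row = next(r for r in metadata_table if r["doc_id"] == doc_id)
    return bucket[row["s3_key"]]

row = store_document("invoice-42", b"%PDF-1.7 example payload")
print(json.dumps({k: row[k] for k in ("doc_id", "s3_key", "size_bytes")}))
assert fetch_document("invoice-42") == b"%PDF-1.7 example payload"
```

The database then holds only kilobytes of metadata per document instead of 6 MB BLOBs, which is what relieves the storage growth and performance pressure described in the question.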
Questions 258

A company needs a solution to prevent photos with unwanted content from being uploaded to the company’s web application. The solution must not involve training a machine learning (ML) model.

Which solution will meet these requirements?

Options:

A.

Create and deploy a model by using Amazon SageMaker Autopilot. Create a real-time endpoint that the web application invokes when new photos are uploaded.

B.

Create an AWS Lambda function that uses Amazon Rekognition to detect unwanted content. Create a Lambda function URL that the web application invokes when new photos are uploaded.

C.

Create an Amazon CloudFront function that uses Amazon Comprehend to detect unwanted content. Associate the function with the web application.

D.

Create an AWS Lambda function that uses Amazon Rekognition Video to detect unwanted content. Create a Lambda function URL that the web application invokes when new photos are uploaded.

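The shape of option B in Questions 258, a Lambda function that passes an uploaded image to Amazon Rekognition image moderation, might look like the sketch below. `detect_moderation_labels` is the real Rekognition API operation; the client is injected so the handler can be exercised offline with a stub, and the event fields and bucket names are illustrative assumptions.

```python
# Sketch of a Lambda-style handler that rejects images Rekognition flags.
# The client is passed in so a stub can replace boto3.client("rekognition").

def make_handler(rekognition_client, min_confidence: float = 80.0):
    def handler(event, context=None):
        # Assumes the caller passes the uploaded object's bucket and key.
        resp = rekognition_client.detect_moderation_labels(
            Image={"S3Object": {"Bucket": event["bucket"], "Name": event["key"]}},
            MinConfidence=min_confidence,
        )
        labels = [l["Name"] for l in resp["ModerationLabels"]]
        return {"allowed": not labels, "labels": labels}
    return handler

class StubRekognition:
    """Offline stand-in returning a canned moderation response."""
    def detect_moderation_labels(self, Image, MinConfidence):
        flagged = Image["S3Object"]["Name"].startswith("bad/")
        labels = [{"Name": "Explicit", "Confidence": 99.0}] if flagged else []
        return {"ModerationLabels": labels}

handler = make_handler(StubRekognition())
print(handler({"bucket": "uploads", "key": "bad/photo.jpg"}))
print(handler({"bucket": "uploads", "key": "ok/photo.jpg"}))
```

Because Rekognition's moderation model is pre-trained, no ML training is involved, which is the constraint the question imposes.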
Questions 259

A company runs an application on Microsoft SQL Server databases in an on-premises data center. The company wants to migrate to AWS and optimize costs for its infrastructure on AWS.

Which solution will meet these requirements?

Options:

A.

Migrate the databases to Amazon EC2 instances that use SQL Server Amazon Machine Images (AMIs) provided by AWS.

B.

Migrate to Amazon Aurora PostgreSQL by using Babelfish for Aurora PostgreSQL.

C.

Migrate the databases to a PostgreSQL database that runs on Amazon EC2 instances.

D.

Migrate the databases to Amazon RDS for Microsoft SQL Server.

Questions 260

A company hosts customer data in an Amazon S3 bucket. The company wants to ensure that only specific applications that run on Amazon EC2 instances in a private subnet have access to the S3 bucket. The applications must not require long-term AWS access keys. The company needs to log all access to S3 objects for auditing purposes.

Which solution will meet these requirements?

Options:

A.

Create an S3 bucket policy that allows access only from the private subnet's IP range. Configure each EC2 instance to use access keys that are stored in AWS Systems Manager Parameter Store. Configure Amazon S3 server access logging.

B.

Create an IAM role that has access to the S3 bucket. Attach the IAM role to the EC2 instances. Update the bucket policy to allow access only for the role. Use AWS CloudTrail to log data events for the bucket.

C.

Create an IAM user, an access key, and a secret key. Store the keys in AWS Secrets Manager. Configure the EC2 instances to retrieve the keys. Use AWS CloudTrail management events to track bucket access.

D.

Create a gateway VPC endpoint for Amazon S3. Update the S3 bucket policy to allow access only through the endpoint. Attach an IAM role to the EC2 instances that has appropriate S3 permissions. Use VPC Flow Logs to track VPC endpoint activity.

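The bucket-policy piece of option D in Questions 260 can be written out concretely. `aws:SourceVpce` is the real condition key for restricting S3 access to a gateway VPC endpoint; the bucket name and endpoint ID below are illustrative.

```python
import json

def s3_policy_restricted_to_endpoint(bucket: str, vpce_id: str) -> dict:
    """Deny all S3 access to the bucket unless the request arrives
    through the given gateway VPC endpoint (aws:SourceVpce condition)."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyOutsideVpcEndpoint",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{bucket}",
                f"arn:aws:s3:::{bucket}/*",
            ],
            "Condition": {"StringNotEquals": {"aws:SourceVpce": vpce_id}},
        }],
    }

policy = s3_policy_restricted_to_endpoint("customer-data-bucket",
                                          "vpce-0123456789abcdef0")
print(json.dumps(policy, indent=2))
```

The instance-attached IAM role then supplies short-lived credentials automatically (no long-term access keys), and CloudTrail data events cover the per-object audit trail.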
Questions 261

An ecommerce company runs a transaction processing system within a large application on a set of Amazon EC2 instances behind an Application Load Balancer (ALB). The transaction process handles order creation, payment initiation, and inventory updates.

The company has observed performance issues in the transaction workflow as the volume of transactions has increased. The company wants to re-architect the transaction process to introduce horizontal scalability and to improve cost efficiency.

Which solution will meet these requirements?

Options:

A.

Decouple the transaction system into microservices that run on AWS Lambda functions. Expose the microservices through a central Amazon API Gateway REST API. Use Amazon SQS queues to decouple order creation and payment processing.

B.

Migrate the transaction system to an Amazon EKS cluster. Deploy the Kubernetes Vertical Pod Autoscaler to manage application scalability.

C.

Add caching layers to the transaction system by using an Amazon ElastiCache cluster. Scale the EC2 instances to the largest size available to handle the increased load.

D.

Decouple the transaction system into microservices. Deploy each microservice as a separate application to its own dedicated group of EC2 instances. Place each group of instances behind a separate ALB. Scale the application by launching larger EC2 instance sizes as needed.

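The decoupling idea in option A of Questions 261 (queues between order creation and payment processing) can be illustrated with in-memory queues standing in for SQS; the function and field names are illustrative, not the company's actual services.

```python
from collections import deque

# In-memory stand-ins for two SQS queues that decouple the steps:
order_queue = deque()     # order creation -> payment initiation
payment_queue = deque()   # payment initiation -> inventory update

def create_order(order_id: str, amount: float) -> None:
    """Producer: enqueue instead of calling the payment service directly."""
    order_queue.append({"order_id": order_id, "amount": amount})

def payment_worker() -> None:
    """Consumer: drains the order queue, emits payment events downstream."""
    while order_queue:
        order = order_queue.popleft()
        payment_queue.append({"order_id": order["order_id"], "status": "paid"})

create_order("o-1", 19.99)
create_order("o-2", 5.00)
payment_worker()
print([e["order_id"] for e in payment_queue])
```

Because producers and consumers share only a queue, each side can scale horizontally (more Lambda invocations draining the queue) without the synchronous coupling that limits the monolith.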
Questions 262

A company needs to provide a team of contractors with temporary access to the company's AWS resources for a short-term project. The contractors need different levels of access to AWS services. The company needs to revoke permissions for all the contractors when the project is finished.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use IAM to create a user account for each contractor. Attach policies that define access levels for the contractors to the user accounts. Manually deactivate the accounts when the project is finished.

B.

Use AWS STS to generate temporary credentials for the contractors. Provide the contractors access based on predefined roles. Set the access to automatically expire when the project is finished.

C.

Configure AWS Config rules to monitor the contractors' access patterns. Use AWS Config rules to automatically revoke permissions that are not in use or that are too permissive.

D.

Use AWS CloudTrail and custom Amazon EventBridge triggers to audit the contractors' actions. Adjust the permissions for each contractor based on activity logs.

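One way to make option B's expiration in Questions 262 concrete is a role policy whose `DateLessThan` condition on `aws:CurrentTime` (a real IAM global condition key) cuts off access at the project end date, on top of the automatic expiry of the STS credentials themselves. The actions, resource ARN, and date below are illustrative.

```python
import json
from datetime import datetime, timezone

def contractor_policy(allowed_actions, resource, project_end: datetime) -> dict:
    """IAM policy granting a contractor's access level, with a
    DateLessThan condition so permissions lapse at the project end date."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": allowed_actions,
            "Resource": resource,
            "Condition": {
                "DateLessThan": {
                    "aws:CurrentTime": project_end.strftime("%Y-%m-%dT%H:%M:%SZ")
                }
            },
        }],
    }

policy = contractor_policy(
    ["s3:GetObject", "s3:ListBucket"],
    "arn:aws:s3:::project-artifacts*",
    datetime(2026, 6, 30, tzinfo=timezone.utc),
)
print(json.dumps(policy, indent=2))
```

Different access levels are then just different role policies; no per-contractor IAM users need to be created or deactivated, which is the operational-overhead saving.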
Questions 263

A company processes large amounts of data by using Amazon EC2 instances in an Auto Scaling group. The data processing jobs run for up to 48 hours each week. The data processing jobs can handle interruptions. However, the company wants to minimize the interruptions.

The company wants to use the latest generation of Amazon EC2 instances each year.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.

Purchase Convertible Reserved Instances (RIs) on an All Upfront basis for a 3-year term for the instance types currently in use.

B.

Purchase Standard Reserved Instances (RIs) on an All Upfront basis for a 1-year term for the instance types in use.

C.

Purchase Spot Instances with a price-capacity-optimized allocation strategy. Override instance types in the Auto Scaling group.

D.

Purchase Spot Instances with a capacity-optimized allocation strategy. Override instance types in the Auto Scaling group.

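The difference between the two Spot allocation strategies in options C and D of Questions 263 can be shown with a toy ranking. AWS does not publish its scoring formula, so this is purely conceptual: capacity-optimized picks the deepest spare-capacity pool regardless of price, while price-capacity-optimized weighs capacity and price together (here approximated by a simple ratio). Pool names, prices, and scores are invented.

```python
# Conceptual illustration only: real EC2 Spot allocation scoring is not public.

pools = [
    {"type": "m6i.4xlarge", "spot_price": 0.31, "capacity_score": 9},
    {"type": "m5.4xlarge",  "spot_price": 0.28, "capacity_score": 6},
    {"type": "r6i.4xlarge", "spot_price": 0.40, "capacity_score": 10},
]

def capacity_optimized(pools):
    # Deepest spare-capacity pool wins, regardless of price.
    return max(pools, key=lambda p: p["capacity_score"])["type"]

def price_capacity_optimized(pools):
    # Toy joint ranking: favor deep capacity per dollar.
    return max(pools, key=lambda p: p["capacity_score"] / p["spot_price"])["type"]

print(capacity_optimized(pools))
print(price_capacity_optimized(pools))
```

With these invented numbers the two strategies pick different pools, which is the point: price-capacity-optimized keeps interruption risk low while also steering toward cheaper capacity, and overriding instance types in the Auto Scaling group gives it more pools to choose from each year.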
Exam Code: SAA-C03
Exam Name: AWS Certified Solutions Architect - Associate (SAA-C03)
Last Update: May 6, 2026
Questions: 879