More Test Questions - 1 Flashcards

1
Q

An application is being created that will use Amazon EC2 instances to generate and store data. Another set of EC2 instances will then analyze and modify the data. Storage requirements will be significant and will continue to grow over time. The application architects require a storage solution. Which action would meet these needs?

1: Store the data in an Amazon EBS volume. Mount the EBS volume on the application instances
2: Store the data in an Amazon EFS filesystem. Mount the file system on the application instances
3: Store the data in Amazon S3 Glacier. Update the vault policy to allow access to the application instances
4: Store the data in AWS Storage Gateway. Set up AWS Direct Connect between the Gateway appliance and the EC2 instances
A
1: Store the data in an Amazon EBS volume. Mount the EBS volume on the application instances
2: Store the data in an Amazon EFS filesystem. Mount the file system on the application instances
3: Store the data in Amazon S3 Glacier. Update the vault policy to allow access to the application instances
4: Store the data in AWS Storage Gateway. Set up AWS Direct Connect between the Gateway appliance and the EC2 instances
2
Q

A company hosts a multiplayer game on AWS. The application uses Amazon EC2 instances in a single Availability Zone and users connect over Layer 4. A solutions architect has been tasked with making the architecture highly available and also more cost-effective. How can the solutions architect best meet these requirements? (Select TWO)

1: Configure an Auto Scaling group to add or remove instances in the Availability Zone automatically
2: Increase the number of instances and use smaller EC2 instance types
3: Configure a Network Load Balancer in front of the EC2 instances
4: Configure an Application Load Balancer in front of the EC2 instances
5: Configure an Auto Scaling group to add or remove instances in multiple Availability Zones automatically
A
1: Configure an Auto Scaling group to add or remove instances in the Availability Zone automatically
2: Increase the number of instances and use smaller EC2 instance types
3: Configure a Network Load Balancer in front of the EC2 instances
4: Configure an Application Load Balancer in front of the EC2 instances
5: Configure an Auto Scaling group to add or remove instances in multiple Availability Zones automatically
3
Q

A company delivers content to subscribers distributed globally from an application running on AWS. The application uses a fleet of Amazon EC2 instances in a private subnet behind an Application Load Balancer (ALB). Due to an update in copyright restrictions, it is necessary to block access for specific countries. What is the EASIEST method to meet this requirement?

1: Modify the ALB security group to deny incoming traffic from blocked countries
2: Modify the security group for EC2 instances to deny incoming traffic from blocked countries
3: Use Amazon CloudFront to serve the application and deny access to blocked countries
4: Use a network ACL to block the IP address ranges associated with the specific countries
A
1: Modify the ALB security group to deny incoming traffic from blocked countries
2: Modify the security group for EC2 instances to deny incoming traffic from blocked countries
3: Use Amazon CloudFront to serve the application and deny access to blocked countries
4: Use a network ACL to block the IP address ranges associated with the specific countries
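CloudFront's built-in geo restriction blocks whole countries at the edge. As a sketch only, the relevant portion of a DistributionConfig might be built like this; the country codes and the helper function are illustrative, not taken from the question:

```python
# Illustrative sketch: the geo-restriction section of a CloudFront
# DistributionConfig. Country codes and helper name are assumptions.

def geo_restriction(blocked_countries):
    """Restrictions block that denies the listed ISO 3166-1 alpha-2 countries."""
    return {
        "GeoRestriction": {
            "RestrictionType": "blacklist",  # deny only the listed countries
            "Quantity": len(blocked_countries),
            "Items": list(blocked_countries),
        }
    }

restrictions = geo_restriction(["KP", "SY"])
```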
4
Q

A company stores important data in an Amazon S3 bucket. A solutions architect needs to ensure that data can be recovered in case of accidental deletion. Which action will accomplish this?

1: Enable Amazon S3 versioning
2: Enable Amazon S3 Intelligent-Tiering
3: Enable an Amazon S3 lifecycle policy
4: Enable Amazon S3 cross-Region replication
A
1: Enable Amazon S3 versioning
2: Enable Amazon S3 Intelligent-Tiering
3: Enable an Amazon S3 lifecycle policy
4: Enable Amazon S3 cross-Region replication
5
Q

A company is migrating from an on-premises infrastructure to the AWS Cloud. One of the company’s applications stores files on a Windows file server farm that uses Distributed File System Replication (DFSR) to keep data in sync. A solutions architect needs to replace the file server farm. Which service should the solutions architect use?

1: Amazon EFS
2: Amazon FSx
3: Amazon S3
4: AWS Storage Gateway
A
1: Amazon EFS
2: Amazon FSx
3: Amazon S3
4: AWS Storage Gateway
6
Q

A website runs on Amazon EC2 instances in an Auto Scaling group behind an Application Load Balancer (ALB) which serves as an origin for an Amazon CloudFront distribution. An AWS WAF is being used to protect against SQL injection attacks. A review of security logs revealed an external malicious IP that needs to be blocked from accessing the website. What should a solutions architect do to protect the application?

1: Modify the network ACL on the CloudFront distribution to add a deny rule for the malicious IP address
2: Modify the configuration of AWS WAF to add an IP match condition to block the malicious IP address
3: Modify the network ACL for the EC2 instances in the target groups behind the ALB to deny the malicious IP address
4: Modify the security groups for the EC2 instances in the target groups behind the ALB to deny the malicious IP address
A
1: Modify the network ACL on the CloudFront distribution to add a deny rule for the malicious IP address
2: Modify the configuration of AWS WAF to add an IP match condition to block the malicious IP address
3: Modify the network ACL for the EC2 instances in the target groups behind the ALB to deny the malicious IP address
4: Modify the security groups for the EC2 instances in the target groups behind the ALB to deny the malicious IP address
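Blocking a single malicious IP in AWS WAF is done with an IP set referenced from a block rule. A minimal sketch of such a rule statement, assuming WAFv2 and placeholder names/ARNs:

```python
# Hedged sketch: a WAFv2 WebACL rule that blocks requests whose source
# address matches a referenced IP set. Name, ARN, and priority are placeholders.

def block_ip_rule(ip_set_arn, priority=0):
    """Rule entry for a WebACL's Rules list that blocks the IP set members."""
    return {
        "Name": "BlockMaliciousIP",
        "Priority": priority,
        "Statement": {
            "IPSetReferenceStatement": {"ARN": ip_set_arn}
        },
        "Action": {"Block": {}},
        "VisibilityConfig": {
            "SampledRequestsEnabled": True,
            "CloudWatchMetricsEnabled": True,
            "MetricName": "BlockMaliciousIP",
        },
    }

rule = block_ip_rule(
    "arn:aws:wafv2:us-east-1:123456789012:global/ipset/bad-ips/abc123"
)
```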
7
Q

An ecommerce website runs on Amazon EC2 instances behind an Application Load Balancer (ALB). The application is stateless and elastic and scales from a minimum of 10 instances up to a maximum of 200 instances. At least 40 instances are required for at least 80% of the time. Which solution should be used to minimize costs?

1: Purchase Reserved Instances to cover 200 instances
2: Purchase Reserved Instances to cover 80 instances. Use Spot Instances to cover the remaining instances
3: Purchase On-Demand Instances to cover 40 instances. Use Spot Instances to cover the remaining instances
4: Purchase Reserved Instances to cover 40 instances. Use On-Demand and Spot Instances to cover the remaining instances
A
1: Purchase Reserved Instances to cover 200 instances
2: Purchase Reserved Instances to cover 80 instances. Use Spot Instances to cover the remaining instances
3: Purchase On-Demand Instances to cover 40 instances. Use Spot Instances to cover the remaining instances
4: Purchase Reserved Instances to cover 40 instances. Use On-Demand and Spot Instances to cover the remaining instances
8
Q

A solutions architect is creating a system that will run analytics on financial data for 4 hours a night, 5 days a week. The analysis is expected to run for the same duration and cannot be interrupted once it is started. The system will be required for a minimum of 1 year. Which type of Amazon EC2 instances should be used to reduce the cost of the system?

1: Spot Instances
2: On-Demand Instances
3: Standard Reserved Instances
4: Scheduled Reserved Instances
A
1: Spot Instances
2: On-Demand Instances
3: Standard Reserved Instances
4: Scheduled Reserved Instances
9
Q

A solutions architect needs to backup some application log files from an online ecommerce store to Amazon S3. It is unknown how often the logs will be accessed or which logs will be accessed the most. The solutions architect must keep costs as low as possible by using the appropriate S3 storage class. Which S3 storage class should be implemented to meet these requirements?

1: S3 Glacier
2: S3 Intelligent-Tiering
3: S3 Standard-Infrequent Access (S3 Standard-IA)
4: S3 One Zone-Infrequent Access (S3 One Zone-IA)
A
1: S3 Glacier
2: S3 Intelligent-Tiering
3: S3 Standard-Infrequent Access (S3 Standard-IA)
4: S3 One Zone-Infrequent Access (S3 One Zone-IA)
10
Q

A solutions architect is designing a new service that will use an Amazon API Gateway API on the frontend. The service will need to persist data in a backend database using key-value requests. Initially, the data requirements will be around 1 GB and future growth is unknown. Requests can range from 0 to over 800 requests per second. Which combination of AWS services would meet these requirements? (Select TWO)

1: AWS Fargate
2: AWS Lambda
3: Amazon DynamoDB
4: Amazon EC2 Auto Scaling
5: Amazon RDS
A
1: AWS Fargate
2: AWS Lambda
3: Amazon DynamoDB
4: Amazon EC2 Auto Scaling
5: Amazon RDS
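For key-value workloads with unknown growth and request rates swinging from 0 to 800+ per second, DynamoDB's on-demand mode avoids capacity planning entirely. A sketch of the create_table parameters, with assumed table and key names:

```python
# Sketch: a DynamoDB table in on-demand billing mode, which removes the need
# to size read/write capacity upfront. Table and key names are assumptions.

def table_spec(name):
    """Parameters for dynamodb.create_table using on-demand billing."""
    return {
        "TableName": name,
        "AttributeDefinitions": [{"AttributeName": "pk", "AttributeType": "S"}],
        "KeySchema": [{"AttributeName": "pk", "KeyType": "HASH"}],
        "BillingMode": "PAY_PER_REQUEST",  # no capacity decisions upfront
    }

spec = table_spec("service-data")
```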
11
Q

A company’s application is running on Amazon EC2 instances in a single Region. In the event of a disaster, a solutions architect needs to ensure that the resources can also be deployed to a second Region. Which combination of actions should the solutions architect take to accomplish this? (Select TWO)

1: Detach a volume on an EC2 instance and copy it to an Amazon S3 bucket in the second Region
2: Launch a new EC2 instance from an Amazon Machine Image (AMI) in the second Region
3: Launch a new EC2 instance in the second Region and copy a volume from Amazon S3 to the new instance
4: Copy an Amazon Machine Image (AMI) of an EC2 instance and specify the second Region for the destination
5: Copy an Amazon Elastic Block Store (Amazon EBS) volume from Amazon S3 and launch an EC2 instance in the second Region using that EBS volume
A
1: Detach a volume on an EC2 instance and copy it to an Amazon S3 bucket in the second Region
2: Launch a new EC2 instance from an Amazon Machine Image (AMI) in the second Region
3: Launch a new EC2 instance in the second Region and copy a volume from Amazon S3 to the new instance
4: Copy an Amazon Machine Image (AMI) of an EC2 instance and specify the second Region for the destination
5: Copy an Amazon Elastic Block Store (Amazon EBS) volume from Amazon S3 and launch an EC2 instance in the second Region using that EBS volume
12
Q

A solutions architect is creating a document submission application for a school. The application will use an Amazon S3 bucket for storage. The solution must prevent accidental deletion of the documents and ensure that all versions of the documents are available. Users must be able to upload and modify the documents. Which combination of actions should be taken to meet these requirements? (Select TWO)

1: Set read-only permissions on the bucket
2: Enable versioning on the bucket
3: Attach an IAM policy to the bucket
4: Enable MFA Delete on the bucket
5: Encrypt the bucket using AWS SSE-S3

A

1: Set read-only permissions on the bucket
2: Enable versioning on the bucket
3: Attach an IAM policy to the bucket
4: Enable MFA Delete on the bucket
5: Encrypt the bucket using AWS SSE-S3
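Versioning plus MFA Delete protects documents from accidental deletion while still allowing uploads and modifications. A sketch of the put_bucket_versioning payload; note that MFA Delete can only be enabled by the root user via the CLI/API with an MFA token, and the bucket/MFA values below are placeholders:

```python
# Sketch of the VersioningConfiguration payload for s3.put_bucket_versioning
# that turns on both versioning and MFA Delete.

def versioning_config(mfa_delete=True):
    """Payload enabling versioning, optionally with MFA Delete."""
    return {
        "Status": "Enabled",
        "MFADelete": "Enabled" if mfa_delete else "Disabled",
    }

config = versioning_config()
# Placeholder call (requires root credentials and a real MFA device):
# s3.put_bucket_versioning(Bucket="school-docs", MFA="<serial> <token>",
#                          VersioningConfiguration=config)
```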

13
Q

A solutions architect is designing an application on AWS. The compute layer will run in parallel across EC2 instances. The compute layer should scale based on the number of jobs to be processed. The compute layer is stateless. The solutions architect must ensure that the application is loosely coupled and the job items are durably stored. Which design should the solutions architect use?

1: Create an Amazon SNS topic to send the jobs that need to be processed. Create an Amazon EC2 Auto Scaling group for the compute application. Set the scaling policy for the Auto Scaling group to add and remove nodes based on CPU usage
2: Create an Amazon SQS queue to hold the jobs that need to be processed. Create an Amazon EC2 Auto Scaling group for the compute application. Set the scaling policy for the Auto Scaling group to add and remove nodes based on network usage
3: Create an Amazon SQS queue to hold the jobs that need to be processed. Create an Amazon EC2 Auto Scaling group for the compute application. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of items in the SQS queue
4: Create an Amazon SNS topic to send the jobs that need to be processed. Create an Amazon EC2 Auto Scaling group for the compute application. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of messages published to the SNS topic

A

1: Create an Amazon SNS topic to send the jobs that need to be processed. Create an Amazon EC2 Auto Scaling group for the compute application. Set the scaling policy for the Auto Scaling group to add and remove nodes based on CPU usage
2: Create an Amazon SQS queue to hold the jobs that need to be processed. Create an Amazon EC2 Auto Scaling group for the compute application. Set the scaling policy for the Auto Scaling group to add and remove nodes based on network usage
3: Create an Amazon SQS queue to hold the jobs that need to be processed. Create an Amazon EC2 Auto Scaling group for the compute application. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of items in the SQS queue
4: Create an Amazon SNS topic to send the jobs that need to be processed. Create an Amazon EC2 Auto Scaling group for the compute application. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of messages published to the SNS topic
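Scaling on queue depth is usually implemented as a "backlog per instance" calculation rather than raw queue length. A minimal sketch of that arithmetic, with an assumed per-instance processing rate:

```python
import math

# Sketch: size the worker fleet so each instance handles at most an assumed
# number of queued jobs ("backlog per instance"), clamped to group bounds.

def desired_capacity(queue_depth, msgs_per_instance, min_size, max_size):
    """Instances needed for the current SQS backlog."""
    needed = math.ceil(queue_depth / msgs_per_instance)
    return max(min_size, min(max_size, needed))
```

In practice the queue depth comes from the ApproximateNumberOfMessagesVisible metric and the result drives the Auto Scaling group's desired capacity.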

14
Q

A team is planning to run analytics jobs on log files each day and requires a storage solution. The size and number of logs are unknown and data will persist for 24 hours only. What is the MOST cost-effective solution?

1: Amazon S3 Glacier Deep Archive
2: Amazon S3 Standard
3: Amazon S3 Intelligent-Tiering
4: Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA)

A

1: Amazon S3 Glacier Deep Archive

2: Amazon S3 Standard

3: Amazon S3 Intelligent-Tiering
4: Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA)

15
Q

A company runs a web application that serves weather updates. The application runs on a fleet of Amazon EC2 instances in a Multi-AZ Auto Scaling group behind an Application Load Balancer (ALB). The instances store data in an Amazon Aurora database. A solutions architect needs to make the application more resilient to sporadic increases in request rates. Which architecture should the solutions architect implement? (Select TWO)

1: Add an AWS WAF in front of the ALB
2: Add Amazon Aurora Replicas
3: Add an AWS Transit Gateway to the Availability Zones
4: Add an AWS Global Accelerator endpoint
5: Add an Amazon CloudFront distribution in front of the ALB

A

1: Add an AWS WAF in front of the ALB

2: Add Amazon Aurora Replicas

3: Add an AWS Transit Gateway to the Availability Zones
4: Add an AWS Global Accelerator endpoint

5: Add an Amazon CloudFront distribution in front of the ALB

16
Q

An Amazon VPC contains several Amazon EC2 instances. The instances need to make API calls to Amazon DynamoDB. A solutions architect needs to ensure that the API calls do not traverse the internet. How can this be accomplished? (Select TWO)

1: Create a route table entry for the endpoint
2: Create a gateway endpoint for DynamoDB
3: Create a new DynamoDB table that uses the endpoint
4: Create an ENI for the endpoint in each of the subnets of the VPC
5: Create a VPC peering connection between the VPC and DynamoDB

A

1: Create a route table entry for the endpoint

2: Create a gateway endpoint for DynamoDB

3: Create a new DynamoDB table that uses the endpoint
4: Create an ENI for the endpoint in each of the subnets of the VPC
5: Create a VPC peering connection between the VPC and DynamoDB
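A gateway endpoint for DynamoDB is created with the endpoint type, the Region-specific service name, and the route tables that should receive the prefix-list route. A sketch of the create_vpc_endpoint parameters, with placeholder IDs:

```python
# Sketch of ec2.create_vpc_endpoint parameters for a DynamoDB gateway
# endpoint. Specifying route tables adds the prefix-list route for you.
# Region and resource IDs are placeholders.

def dynamodb_endpoint(region, vpc_id, route_table_ids):
    """Parameters for a gateway VPC endpoint to DynamoDB."""
    return {
        "VpcEndpointType": "Gateway",
        "VpcId": vpc_id,
        "ServiceName": f"com.amazonaws.{region}.dynamodb",
        "RouteTableIds": route_table_ids,
    }

params = dynamodb_endpoint("eu-west-1", "vpc-0abc1234", ["rtb-0def5678"])
```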

17
Q

A solutions architect is designing the infrastructure to run an application on Amazon EC2 instances. The application requires high availability and must dynamically scale based on demand to be cost efficient. What should the solutions architect do to meet these requirements?

1: Configure an Application Load Balancer in front of an Auto Scaling group to deploy instances to multiple Regions
2: Configure an Amazon CloudFront distribution in front of an Auto Scaling group to deploy instances to multiple Regions
3: Configure an Application Load Balancer in front of an Auto Scaling group to deploy instances to multiple Availability Zones
4: Configure an Amazon API Gateway API in front of an Auto Scaling group to deploy instances to multiple Availability Zones

A

1: Configure an Application Load Balancer in front of an Auto Scaling group to deploy instances to multiple Regions
2: Configure an Amazon CloudFront distribution in front of an Auto Scaling group to deploy instances to multiple Regions

3: Configure an Application Load Balancer in front of an Auto Scaling group to deploy instances to multiple Availability Zones

4: Configure an Amazon API Gateway API in front of an Auto Scaling group to deploy instances to multiple Availability Zones

18
Q

A retail company with many stores and warehouses is implementing IoT sensors to gather monitoring data from devices in each location. The data will be sent to AWS in real time. A solutions architect must provide a solution for ensuring events are received in order for each device and ensure that data is saved for future processing. Which solution would be MOST efficient?

1: Use Amazon Kinesis Data Streams for real-time events with a partition key for each device. Use Amazon Kinesis Data Firehose to save data to Amazon S3
2: Use Amazon Kinesis Data Streams for real-time events with a shard for each device. Use Amazon Kinesis Data Firehose to save data to Amazon EBS
3: Use an Amazon SQS FIFO queue for real-time events with one queue for each device. Trigger an AWS Lambda function for the SQS queue to save data to Amazon EFS
4: Use an Amazon SQS standard queue for real-time events with one queue for each device. Trigger an AWS Lambda function from the SQS queue to save data to Amazon S3

A

1: Use Amazon Kinesis Data Streams for real-time events with a partition key for each device. Use Amazon Kinesis Data Firehose to save data to Amazon S3

2: Use Amazon Kinesis Data Streams for real-time events with a shard for each device. Use Amazon Kinesis Data Firehose to save data to Amazon EBS
3: Use an Amazon SQS FIFO queue for real-time events with one queue for each device. Trigger an AWS Lambda function for the SQS queue to save data to Amazon EFS
4: Use an Amazon SQS standard queue for real-time events with one queue for each device. Trigger an AWS Lambda function from the SQS queue to save data to Amazon S3
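Kinesis preserves ordering per shard, and every record with the same partition key hashes to the same shard, which is why one partition key per device keeps each device's events in order. A sketch of the hashing idea (Kinesis maps an MD5 of the partition key onto a 128-bit range; the shard count here is an arbitrary example):

```python
import hashlib

# Sketch of Kinesis-style key-to-shard mapping: MD5 of the partition key on a
# 128-bit range, split evenly across shards. Same key -> same shard -> ordered.

def shard_for(partition_key, num_shards):
    """Deterministically map a partition key to a shard index."""
    h = int(hashlib.md5(partition_key.encode()).hexdigest(), 16)
    return min(num_shards - 1, h // ((2 ** 128) // num_shards))
```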

19
Q

An organization wants to share regular updates about their charitable work using static webpages. The pages are expected to generate a large number of views from around the world. The files are stored in an Amazon S3 bucket. A solutions architect has been asked to design an efficient and effective solution. Which action should the solutions architect take to accomplish this?

1: Generate presigned URLs for the files
2: Use cross-Region replication to all Regions
3: Use the geoproximity feature of Amazon Route 53
4: Use Amazon CloudFront with the S3 bucket as its origin

A

1: Generate presigned URLs for the files
2: Use cross-Region replication to all Regions
3: Use the geoproximity feature of Amazon Route 53
4: Use Amazon CloudFront with the S3 bucket as its origin

20
Q

An insurance company has a web application that serves users in the United Kingdom and Australia. The application includes a database tier using a MySQL database hosted in eu-west-2. The web tier runs from eu-west-2 and ap-southeast-2. Amazon Route 53 geoproximity routing is used to direct users to the closest web tier. It has been noted that Australian users receive slow response times to queries. Which changes should be made to the database tier to improve performance?

1: Migrate the database to Amazon RDS for MySQL. Configure Multi-AZ in the Australian Region
2: Migrate the database to Amazon DynamoDB. Use DynamoDB global tables to enable replication to additional Regions
3: Deploy MySQL instances in each Region. Deploy an Application Load Balancer in front of MySQL to reduce the load on the primary instance
4: Migrate the database to an Amazon Aurora global database in MySQL compatibility mode. Configure read replicas in ap-southeast-2

A

1: Migrate the database to Amazon RDS for MySQL. Configure Multi-AZ in the Australian Region
2: Migrate the database to Amazon DynamoDB. Use DynamoDB global tables to enable replication to additional Regions
3: Deploy MySQL instances in each Region. Deploy an Application Load Balancer in front of MySQL to reduce the load on the primary instance

4: Migrate the database to an Amazon Aurora global database in MySQL compatibility mode. Configure read replicas in ap-southeast-2

21
Q

A web application runs in public and private subnets. The application architecture consists of a web tier and database tier running on Amazon EC2 instances. Both tiers run in a single Availability Zone (AZ). Which combination of steps should a solutions architect take to provide high availability for this architecture? (Select TWO)

1: Create new public and private subnets in the same AZ for high availability
2: Create an Amazon EC2 Auto Scaling group and Application Load Balancer (ALB) spanning multiple AZs
3: Add the existing web application instances to an Auto Scaling group behind an Application Load Balancer (ALB)
4: Create new public and private subnets in a new AZ. Create a database using Amazon EC2 in one AZ
5: Create new public and private subnets in the same VPC, each in a new AZ. Migrate the database to an Amazon RDS multi-AZ deployment

A

1: Create new public and private subnets in the same AZ for high availability

2: Create an Amazon EC2 Auto Scaling group and Application Load Balancer (ALB) spanning multiple AZs

3: Add the existing web application instances to an Auto Scaling group behind an Application Load Balancer (ALB)
4: Create new public and private subnets in a new AZ. Create a database using Amazon EC2 in one AZ

5: Create new public and private subnets in the same VPC, each in a new AZ. Migrate the database to an Amazon RDS multi-AZ deployment

22
Q

An application running on an Amazon ECS container instance using the EC2 launch type needs permissions to write data to Amazon DynamoDB. How can you assign these permissions only to the specific ECS task that is running the application?

1: Create an IAM policy with permissions to DynamoDB and attach it to the container instance
2: Create an IAM policy with permissions to DynamoDB and assign it to a task using the taskRoleArn parameter
3: Use a security group to allow outbound connections to DynamoDB and assign it to the container instance
4: Modify the AmazonECSTaskExecutionRolePolicy policy to add permissions for DynamoDB

A

1: Create an IAM policy with permissions to DynamoDB and attach it to the container instance

2: Create an IAM policy with permissions to DynamoDB and assign it to a task using the taskRoleArn parameter

3: Use a security group to allow outbound connections to DynamoDB and assign it to the container instance
4: Modify the AmazonECSTaskExecutionRolePolicy policy to add permissions for DynamoDB
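In an ECS task definition, taskRoleArn is what grants the application inside the task its AWS permissions; it is distinct from the execution role the ECS agent uses to pull images and ship logs. A sketch of the relevant register_task_definition fragment, with placeholder names, image, and ARN:

```python
# Sketch of a register_task_definition payload fragment: the task role is
# scoped to this task, not to the whole container instance. All names,
# the image, and the ARN are placeholders.

def task_definition(task_role_arn):
    """Minimal task definition granting DynamoDB access via the task role."""
    return {
        "family": "ddb-writer",
        "taskRoleArn": task_role_arn,  # permissions for the app in the task
        "containerDefinitions": [
            {"name": "app", "image": "example/app:latest", "memory": 512}
        ],
    }

td = task_definition("arn:aws:iam::123456789012:role/ddb-write-role")
```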

23
Q

An organization has a large amount of data on Windows (SMB) file shares in their on-premises data center. The organization would like to move data into Amazon S3. They would like to automate the migration of data over their AWS Direct Connect link. Which AWS service can assist them?

1: AWS Database Migration Service (DMS)
2: AWS CloudFormation
3: AWS Snowball
4: AWS DataSync

A

1: AWS Database Migration Service (DMS)
2: AWS CloudFormation
3: AWS Snowball

4: AWS DataSync

24
Q

The database tier of a web application is running on a Windows server on-premises. The database is a Microsoft SQL Server database. The application owner would like to migrate the database to an Amazon RDS instance. How can the migration be executed with minimal administrative effort and downtime?

1: Use the AWS Server Migration Service (SMS) to migrate the server to Amazon EC2. Use AWS Database Migration Service (DMS) to migrate the database to RDS
2: Use the AWS Database Migration Service (DMS) to directly migrate the database to RDS
3: Use AWS DataSync to migrate the data from the database to Amazon S3. Use AWS Database Migration Service (DMS) to migrate the database to RDS
4: Use the AWS Database Migration Service (DMS) to directly migrate the database to RDS. Use the Schema Conversion Tool (SCT) to enable conversion from Microsoft SQL Server to Amazon RDS

A

1: Use the AWS Server Migration Service (SMS) to migrate the server to Amazon EC2. Use AWS Database Migration Service (DMS) to migrate the database to RDS

2: Use the AWS Database Migration Service (DMS) to directly migrate the database to RDS

3: Use AWS DataSync to migrate the data from the database to Amazon S3. Use AWS Database Migration Service (DMS) to migrate the database to RDS
4: Use the AWS Database Migration Service (DMS) to directly migrate the database to RDS. Use the Schema Conversion Tool (SCT) to enable conversion from Microsoft SQL Server to Amazon RDS

25
Q

A new application will run across multiple Amazon ECS tasks. Front-end application logic will process data and then pass that data to a back-end ECS task to perform further processing and write the data to a datastore. The architect would like to reduce interdependencies so failures do not impact other components. Which solution should the architect use?

1: Create an Amazon Kinesis Firehose delivery stream and configure the front-end to add data to the stream and the back-end to read data from the stream
2: Create an Amazon Kinesis Firehose delivery stream that delivers data to an Amazon S3 bucket, configure the front-end to write data to the stream and the back-end to read data from Amazon S3
3: Create an Amazon SQS queue that pushes messages to the back-end. Configure the front-end to add messages to the queue
4: Create an Amazon SQS queue and configure the front-end to add messages to the queue and the back-end to poll the queue for messages

A

1: Create an Amazon Kinesis Firehose delivery stream and configure the front-end to add data to the stream and the back-end to read data from the stream
2: Create an Amazon Kinesis Firehose delivery stream that delivers data to an Amazon S3 bucket, configure the front-end to write data to the stream and the back-end to read data from Amazon S3
3: Create an Amazon SQS queue that pushes messages to the back-end. Configure the front-end to add messages to the queue

4: Create an Amazon SQS queue and configure the front-end to add messages to the queue and the back-end to poll the queue for messages

26
Q

An application receives images uploaded by customers and stores them on Amazon S3. An AWS Lambda function then processes the images to add graphical elements. The processed images need to be available for users to download for 30 days, after which time they can be deleted. Processed images can be easily recreated from original images. The original images need to be immediately available for 30 days and be accessible within 24 hours for another 90 days. Which combination of Amazon S3 storage classes is most cost-effective for the original and processed images? (Select TWO)

1: Store the original images in STANDARD for 30 days, transition to GLACIER for 90 days, then expire the data
2: Store the original images in STANDARD_IA for 30 days and then transition to DEEP_ARCHIVE
3: Store the processed images in ONEZONE_IA and then expire the data after 30 days
4: Store the processed images in STANDARD and then transition to GLACIER after 30 days
5: Store the original images in STANDARD for 30 days, transition to DEEP_ARCHIVE for 90 days, then expire the data

A

1: Store the original images in STANDARD for 30 days, transition to GLACIER for 90 days, then expire the data

2: Store the original images in STANDARD_IA for 30 days and then transition to DEEP_ARCHIVE

3: Store the processed images in ONEZONE_IA and then expire the data after 30 days

4: Store the processed images in STANDARD and then transition to GLACIER after 30 days
5: Store the original images in STANDARD for 30 days, transition to DEEP_ARCHIVE for 90 days, then expire the data
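Tiering like this is expressed as S3 lifecycle rules. An illustrative configuration matching the pattern in the options above, with assumed prefixes and day counts:

```python
# Illustrative put_bucket_lifecycle_configuration payload: transition originals
# out of STANDARD after 30 days and expire them later; expire easily recreated
# processed images after 30 days. Prefixes and day counts are assumptions.

def lifecycle_rules():
    """Two lifecycle rules: archive-then-expire originals, expire processed."""
    return {
        "Rules": [
            {
                "ID": "originals",
                "Filter": {"Prefix": "original/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 30, "StorageClass": "DEEP_ARCHIVE"}],
                "Expiration": {"Days": 120},  # 30 days immediate + 90 archived
            },
            {
                "ID": "processed",
                "Filter": {"Prefix": "processed/"},
                "Status": "Enabled",
                "Expiration": {"Days": 30},
            },
        ]
    }

cfg = lifecycle_rules()
```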

27
Q

Amazon EC2 instances in a development environment run between 9am and 5pm Monday-Friday. Production instances run 24/7. Which pricing models should be used? (Select TWO)

1: Use Spot instances for the development environment
2: Use Reserved instances for the development environment
3: Use Scheduled Reserved Instances for the development environment
4: Use Reserved instances for the production environment
5: Use On-Demand instances for the production environment

A

1: Use Spot instances for the development environment
2: Use Reserved instances for the development environment

3: Use Scheduled Reserved Instances for the development environment

4: Use Reserved instances for the production environment

5: Use On-Demand instances for the production environment

28
Q

An application running on Amazon EC2 needs to asynchronously invoke an AWS Lambda function to perform data processing. The services should be decoupled. Which service can be used to decouple the compute services?

1: Amazon SQS
2: Amazon SNS
3: Amazon MQ
4: AWS Step Functions

A

1: Amazon SQS

2: Amazon SNS

3: Amazon MQ
4: AWS Step Functions

29
Q

A manual script that runs a few times a week and completes within 10 minutes needs to be replaced with an automated solution. Which of the following options should an Architect use?

1: Use a cron job on an Amazon EC2 instance
2: Use AWS Batch
3: Use AWS Lambda
4: Use AWS CloudFormation

A

1: Use a cron job on an Amazon EC2 instance
2: Use AWS Batch

3: Use AWS Lambda

4: Use AWS CloudFormation

30
Q

A company wishes to restrict access to their Amazon DynamoDB table to specific, private source IP addresses from their VPC. What should be done to secure access to the table?

1: Create an interface VPC endpoint in the VPC with an Elastic Network Interface (ENI)
2: Create a gateway VPC endpoint and add an entry to the route table
3: Create the Amazon DynamoDB table in the VPC
4: Create an AWS VPN connection to the Amazon DynamoDB endpoint

A

1: Create an interface VPC endpoint in the VPC with an Elastic Network Interface (ENI)

2: Create a gateway VPC endpoint and add an entry to the route table

3: Create the Amazon DynamoDB table in the VPC
4: Create an AWS VPN connection to the Amazon DynamoDB endpoint
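A gateway VPC endpoint keeps DynamoDB traffic on the AWS network and can carry a policy restricting what the VPC may reach. The sketch below models both pieces as plain Python dicts; the table ARN, prefix list ID, and endpoint ID are placeholders, not real resources.

```python
import json

# Illustrative gateway endpoint policy: limit the VPC to a handful of
# actions on one table. All identifiers below are placeholders.
endpoint_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",
        "Action": ["dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:Query"],
        "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/MyTable",
    }],
}

# Conceptually, the route table entry sends the DynamoDB prefix list to
# the gateway endpoint instead of an internet gateway.
route = {
    "destination": "pl-xxxxxxxx (com.amazonaws.us-east-1.dynamodb)",
    "target": "vpce-0abc123 (gateway endpoint)",
}

print(json.dumps(endpoint_policy, indent=2))
```

Because the traffic never leaves the VPC, the table is only reachable from private source IPs inside it, which is what the question asks for.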

31
Q

An AWS Organization has an OU with multiple member accounts in it. The company needs to restrict the ability to launch only specific Amazon EC2 instance types. How can this policy be applied across the accounts with the least effort?

1: Create an SCP with an allow rule that allows launching the specific instance types
2: Create an SCP with a deny rule that denies all but the specific instance types
3: Create an IAM policy to deny launching all but the specific instance types
4: Use AWS Resource Access Manager to control which launch types can be used

A

1: Create an SCP with an allow rule that allows launching the specific instance types

2: Create an SCP with a deny rule that denies all but the specific instance types

3: Create an IAM policy to deny launching all but the specific instance types
4: Use AWS Resource Access Manager to control which launch types can be used
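An SCP applied once at the OU covers every member account, which is why it is the least-effort option. A minimal sketch of the deny-all-but rule, built as a Python dict; the allowed instance types are placeholders:

```python
import json

# Sketch of an SCP that denies ec2:RunInstances for any instance type
# outside an approved list. Attached to the OU, it applies to all
# member accounts without per-account IAM work.
scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyUnapprovedInstanceTypes",
        "Effect": "Deny",
        "Action": "ec2:RunInstances",
        "Resource": "arn:aws:ec2:*:*:instance/*",
        "Condition": {
            # Placeholder list of approved types
            "StringNotEquals": {"ec2:InstanceType": ["t3.micro", "t3.small"]}
        },
    }],
}
print(json.dumps(scp, indent=2))
```

The deny-with-`StringNotEquals` shape means anything not on the list is blocked, while IAM policies inside each account continue to grant the actual permissions.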

32
Q

A new relational database is being deployed on AWS. The performance requirements are unknown. Which database service does not require you to make capacity decisions upfront?

1: Amazon DynamoDB
2: Amazon Aurora Serverless
3: Amazon ElastiCache
4: Amazon RDS

A

1: Amazon DynamoDB

2: Amazon Aurora Serverless

3: Amazon ElastiCache
4: Amazon RDS

33
Q

An Amazon RDS Read Replica is being deployed in a separate region. The master database is not encrypted but all data in the new region must be encrypted. How can this be achieved?

1: Enable encryption using Key Management Service (KMS) when creating the cross-region Read Replica
2: Encrypt a snapshot from the master DB instance, create an encrypted cross-region Read Replica from the snapshot
3: Enable encryption on the master DB instance, then create an encrypted cross-region Read Replica
4: Encrypt a snapshot from the master DB instance, create a new encrypted master DB instance, and then create an encrypted cross-region Read Replica

A

1: Enable encryption using Key Management Service (KMS) when creating the cross-region Read Replica
2: Encrypt a snapshot from the master DB instance, create an encrypted cross-region Read Replica from the snapshot
3: Enable encryption on the master DB instance, then create an encrypted cross-region Read Replica

4: Encrypt a snapshot from the master DB instance, create a new encrypted master DB instance, and then create an encrypted cross-region Read Replica

34
Q

A legacy tightly-coupled High Performance Computing (HPC) application will be migrated to AWS. Which network adapter type should be used?

1: Elastic Network Interface (ENI)
2: Elastic Network Adapter (ENA)
3: Elastic Fabric Adapter (EFA)
4: Elastic IP Address

A

1: Elastic Network Interface (ENI)
2: Elastic Network Adapter (ENA)

3: Elastic Fabric Adapter (EFA)

4: Elastic IP Address

35
Q

A new application is to be published in multiple regions around the world. The Architect needs to ensure only 2 IP addresses need to be whitelisted. The solution should intelligently route traffic for lowest latency and provide fast regional failover. How can this be achieved?

1: Launch EC2 instances into multiple regions behind an NLB with a static IP address
2: Launch EC2 instances into multiple regions behind an ALB and use a Route 53 failover routing policy
3: Launch EC2 instances into multiple regions behind an NLB and use AWS Global Accelerator
4: Launch EC2 instances into multiple regions behind an ALB and use Amazon CloudFront with a pair of static IP addresses

A

1: Launch EC2 instances into multiple regions behind an NLB with a static IP address
2: Launch EC2 instances into multiple regions behind an ALB and use a Route 53 failover routing policy

3: Launch EC2 instances into multiple regions behind an NLB and use AWS Global Accelerator

4: Launch EC2 instances into multiple regions behind an ALB and use Amazon CloudFront with a pair of static IP addresses

36
Q

A company is deploying a big data and analytics workload. The analytics will be run from a fleet of thousands of EC2 instances across multiple AZs. Data needs to be stored on a shared storage layer that can be mounted and accessed concurrently by all EC2 instances. Latency is not a concern however extremely high throughput is required. What storage layer would be most suitable for this requirement?

1: Amazon EFS in General Purpose mode
2: Amazon EFS in Max I/O mode
3: Amazon EBS PIOPS
4: Amazon S3

A

1: Amazon EFS in General Purpose mode

2: Amazon EFS in Max I/O mode

3: Amazon EBS PIOPS
4: Amazon S3

37
Q

A Solutions Architect is designing a highly-scalable system to track records. Records must remain available for immediate download for three months, and then the records must be deleted. What’s the most appropriate decision for this use case?

1: Store the files on Amazon EBS, and create a lifecycle policy to remove the files after three months
2: Store the files on Amazon Glacier, and create a lifecycle policy to remove the files after three months
3: Store the files on Amazon S3, and create a lifecycle policy to remove the files after three months
4: Store the files on Amazon EFS, and create a lifecycle policy to remove the files after three months

A

1: Store the files on Amazon EBS, and create a lifecycle policy to remove the files after three months
2: Store the files on Amazon Glacier, and create a lifecycle policy to remove the files after three months

3: Store the files on Amazon S3, and create a lifecycle policy to remove the files after three months

4: Store the files on Amazon EFS, and create a lifecycle policy to remove the files after three months

38
Q

You are a Solutions Architect at Digital Cloud Training. A large multi-national client has requested a design for a multi-region, multi-master database. The client has requested that the database be designed for fast, massively scaled applications for a global user base. The database should be a fully managed service including the replication. Which AWS service can deliver these requirements?

1: DynamoDB with Global Tables and Multi-Region Replication
2: EC2 instances with EBS replication
3: S3 with Cross Region Replication
4: RDS with Multi-AZ

A

1: DynamoDB with Global Tables and Multi-Region Replication

2: EC2 instances with EBS replication
3: S3 with Cross Region Replication
4: RDS with Multi-AZ

39
Q

Your company is starting to use AWS to host new web-based applications. A new two-tier application will be deployed that provides customers with access to data records. It is important that the application is highly responsive and retrieval times are optimized. You’re looking for a persistent data store that can provide the required performance. From the list below what AWS service would you recommend for this requirement?

1: RDS in a multi-AZ configuration
2: ElastiCache with the Redis engine
3: Kinesis Data Streams
4: ElastiCache with the Memcached engine

A

1: RDS in a multi-AZ configuration

2: ElastiCache with the Redis engine

3: Kinesis Data Streams
4: ElastiCache with the Memcached engine

40
Q

A Linux instance running in your VPC requires some configuration changes to be implemented locally and you need to run some commands.

Which of the following can be used to securely access the instance?

1: SSL/TLS certificate
2: Public key
3: Key Pairs
4: EC2 password

A

1: SSL/TLS certificate
2: Public key

3: Key Pairs

4: EC2 password

41
Q

A manufacturing company captures data from machines running at customer sites. Currently, thousands of machines send data every 5 minutes, and this is expected to grow to hundreds of thousands of machines in the near future. The data is logged with the intent to be analyzed in the future as needed. What is the SIMPLEST method to store this streaming data at scale?

1: Create an Amazon EC2 instance farm behind an ELB to store the data in Amazon EBS Cold HDD volumes
2: Create an Amazon SQS queue, and have the machines write to the queue
3: Create an Amazon Kinesis Firehose delivery stream to store the data in Amazon S3
4: Create an Auto Scaling Group of Amazon EC2 instances behind ELBs to write data into Amazon RDS

A

1: Create an Amazon EC2 instance farm behind an ELB to store the data in Amazon EBS Cold HDD volumes
2: Create an Amazon SQS queue, and have the machines write to the queue

3: Create an Amazon Kinesis Firehose delivery stream to store the data in Amazon S3

4: Create an Auto Scaling Group of Amazon EC2 instances behind ELBs to write data into Amazon RDS

42
Q

There is a temporary need to share some video files that are stored in a private S3 bucket. The consumers do not have AWS accounts and you need to ensure that only authorized consumers can access the files. What is the best way to enable this access?

1: Enable public read access for the S3 bucket
2: Use CloudFront to distribute the files using authorization hash tags
3: Generate a pre-signed URL and distribute it to the consumers
4: Configure an allow rule in the Security Group for the IP addresses of the consumers

A

1: Enable public read access for the S3 bucket
2: Use CloudFront to distribute the files using authorization hash tags

3: Generate a pre-signed URL and distribute it to the consumers

4: Configure an allow rule in the Security Group for the IP addresses of the consumers
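A pre-signed URL works by embedding an expiry time and a signature over the request, so anyone holding the URL can fetch the private object until it expires. Real S3 URLs use Signature Version 4 and are normally generated with an SDK (for example boto3's `generate_presigned_url`); the stdlib HMAC sketch below only illustrates the idea, with a placeholder secret and bucket:

```python
import hmac
import hashlib
import time
from urllib.parse import urlencode

SECRET = b"example-secret-key"  # placeholder, not a real AWS key

def presign(bucket: str, key: str, expires_in: int = 3600) -> str:
    """Simplified illustration of query-string signing (not real SigV4)."""
    expires = int(time.time()) + expires_in
    to_sign = f"GET\n{bucket}/{key}\n{expires}".encode()
    sig = hmac.new(SECRET, to_sign, hashlib.sha256).hexdigest()
    qs = urlencode({"Expires": expires, "Signature": sig})
    return f"https://{bucket}.s3.amazonaws.com/{key}?{qs}"

url = presign("hr-videos", "training/intro.mp4")
print(url)
```

Because the signature covers the expiry, tampering with either invalidates the URL, and no AWS account is needed on the consumer side.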

43
Q

A Solutions Architect needs to improve performance for a web application running on EC2 instances launched by an Auto Scaling group. The instances run behind an ELB Application Load Balancer. During heavy use periods the ASG doubles in size and analysis has shown that static content stored on the EC2 instances is being requested by users in a specific geographic location. How can the Solutions Architect reduce the need to scale and improve the application performance?

1: Store the contents on Amazon EFS instead of the EC2 root volume
2: Implement Amazon Redshift to create a repository of the content closer to the users
3: Create an Amazon CloudFront distribution for the site and redirect user traffic to the distribution
4: Re-deploy the application in a new VPC that is closer to the users making the requests

A

1: Store the contents on Amazon EFS instead of the EC2 root volume
2: Implement Amazon Redshift to create a repository of the content closer to the users

3: Create an Amazon CloudFront distribution for the site and redirect user traffic to the distribution

4: Re-deploy the application in a new VPC that is closer to the users making the requests

44
Q

A company needs to store data for 5 years. The company will need to have immediate and highly available access to the data at any point in time but will not require frequent access. Which lifecycle action should be taken to meet the requirements while reducing costs?

1: Transition objects from Amazon S3 Standard to the GLACIER storage class
2: Transition objects to expire after 5 years
3: Transition objects from Amazon S3 Standard to Amazon S3 Standard-Infrequent Access (S3 Standard-IA)
4: Transition objects from Amazon S3 Standard to Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA)

A

1: Transition objects from Amazon S3 Standard to the GLACIER storage class
2: Transition objects to expire after 5 years

3: Transition objects from Amazon S3 Standard to Amazon S3 Standard-Infrequent Access (S3 Standard-IA)

4: Transition objects from Amazon S3 Standard to Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA)
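The winning option maps to a two-part lifecycle rule: transition to STANDARD_IA (30 days is the minimum age before an IA transition) and expire after 5 years. A sketch of that configuration as a Python dict; the prefix is a placeholder:

```python
import json

# Lifecycle rule behind option 3: keep immediate, highly available
# access (Standard-IA is multi-AZ) while cutting storage cost for
# infrequently accessed data, then delete at the 5-year mark.
lifecycle = {
    "Rules": [{
        "ID": "ArchiveRecords",
        "Filter": {"Prefix": "records/"},   # placeholder prefix
        "Status": "Enabled",
        "Transitions": [{"Days": 30, "StorageClass": "STANDARD_IA"}],
        "Expiration": {"Days": 5 * 365},
    }]
}
print(json.dumps(lifecycle, indent=2))
```

One Zone-IA would be cheaper still, but it is not highly available, and Glacier fails the immediate-access requirement.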

45
Q

A retail organization is deploying a new application that will read and write data to a database. The company wants to deploy the application in three different AWS Regions in an active-active configuration. The databases need to replicate to keep information in sync. Which solution best meets these requirements?

1: AWS Database Migration Service with change data capture
2: Amazon DynamoDB with global tables
3: Amazon Athena with Amazon S3 cross-region replication
4: Amazon Aurora Global Database

A

1: AWS Database Migration Service with change data capture

2: Amazon DynamoDB with global tables

3: Amazon Athena with Amazon S3 cross-region replication
4: Amazon Aurora Global Database

46
Q

You are a Solutions Architect at Digital Cloud Training. One of your clients runs an application that writes data to a DynamoDB table. The client has asked how they can implement a function that runs code in response to item level changes that take place in the DynamoDB table. What would you suggest to the client?

1: Enable server access logging and create an event source mapping between AWS Lambda and the S3 bucket to which the logs are written
2: Enable DynamoDB Streams and create an event source mapping between AWS Lambda and the relevant stream
3: Create a local secondary index that records item level changes and write some custom code that responds to updates to the index
4: Use Kinesis Data Streams and configure DynamoDB as a producer

A

1: Enable server access logging and create an event source mapping between AWS Lambda and the S3 bucket to which the logs are written

2: Enable DynamoDB Streams and create an event source mapping between AWS Lambda and the relevant stream

3: Create a local secondary index that records item level changes and write some custom code that responds to updates to the index
4: Use Kinesis Data Streams and configure DynamoDB as a producer
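With an event source mapping, Lambda polls the DynamoDB stream and invokes the function with batches of item-level change records. A minimal handler sketch; the field names follow the DynamoDB Streams event shape, and the sample event is hand-built for illustration:

```python
# Minimal Lambda handler for a DynamoDB Streams event source mapping.
def handler(event, context=None):
    changes = []
    for record in event.get("Records", []):
        # eventName is INSERT, MODIFY or REMOVE
        changes.append((record["eventName"],
                        record["dynamodb"]["Keys"]["pk"]["S"]))
    return changes

# Hand-built sample event in the DynamoDB Streams shape (placeholder keys).
sample_event = {
    "Records": [
        {"eventName": "INSERT", "dynamodb": {"Keys": {"pk": {"S": "user#1"}}}},
        {"eventName": "MODIFY", "dynamodb": {"Keys": {"pk": {"S": "user#2"}}}},
    ]
}
print(handler(sample_event))  # [('INSERT', 'user#1'), ('MODIFY', 'user#2')]
```

The function runs automatically on every item-level change, which is exactly what the client asked for.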

47
Q

A recent security audit uncovered some poor deployment and configuration practices within your VPC. You need to ensure that applications are deployed in secure configurations. How can this be achieved in the most operationally efficient manner?

1: Remove the ability for staff to deploy applications
2: Use CloudFormation with securely configured templates
3: Manually check all application configurations before deployment
4: Use Amazon Inspector to apply secure configurations

A

1: Remove the ability for staff to deploy applications

2: Use CloudFormation with securely configured templates

3: Manually check all application configurations before deployment
4: Use Amazon Inspector to apply secure configurations

48
Q

A Solutions Architect needs to transform data that is being uploaded into S3. The uploads happen sporadically and the transformation should be triggered by an event. The transformed data should then be loaded into a target data store. What services would be used to deliver this solution in the MOST cost-effective manner? (Select TWO)

1: Configure a CloudWatch alarm to send a notification to CloudFormation when data is uploaded
2: Configure S3 event notifications to trigger a Lambda function when data is uploaded and use the Lambda function to trigger the ETL job
3: Configure CloudFormation to provision a Kinesis data stream to transform the data and load it into S3
4: Use AWS Glue to extract, transform and load the data into the target data store
5: Configure CloudFormation to provision AWS Data Pipeline to transform the data

A

1: Configure a CloudWatch alarm to send a notification to CloudFormation when data is uploaded

2: Configure S3 event notifications to trigger a Lambda function when data is uploaded and use the Lambda function to trigger the ETL job

3: Configure CloudFormation to provision a Kinesis data stream to transform the data and load it into S3

4: Use AWS Glue to extract, transform and load the data into the target data store

5: Configure CloudFormation to provision AWS Data Pipeline to transform the data

49
Q

An application you manage uses Auto Scaling and a fleet of EC2 instances. You recently noticed that Auto Scaling is scaling the number of instances up and down multiple times in the same hour. You need to implement a remediation to reduce the number of scaling events. The remediation must be cost-effective and preserve elasticity. What design changes would you implement? (Select TWO)

1: Modify the CloudWatch alarm period that triggers your Auto Scaling scale down policy
2: Modify the Auto Scaling group termination policy to terminate the newest instance first
3: Modify the Auto Scaling group termination policy to terminate the oldest instance first
4: Modify the Auto Scaling group cool-down timers
5: Modify the Auto Scaling policy to use scheduled scaling actions

A

1: Modify the CloudWatch alarm period that triggers your Auto Scaling scale down policy

2: Modify the Auto Scaling group termination policy to terminate the newest instance first
3: Modify the Auto Scaling group termination policy to terminate the oldest instance first

4: Modify the Auto Scaling group cool-down timers

5: Modify the Auto Scaling policy to use scheduled scaling actions

50
Q

An application runs on two EC2 instances in private subnets split between two AZs. The application needs to connect to a CRM SaaS application running on the Internet. The vendor of the SaaS application restricts authentication to a whitelist of source IP addresses and only 2 IP addresses can be configured per customer. What is the most appropriate and cost-effective solution to enable authentication to the SaaS application?

1: Use a Network Load Balancer and configure a static IP for each AZ
2: Use multiple Internet-facing Application Load Balancers with Elastic IP addresses
3: Configure redundant Internet Gateways and update the routing tables for each subnet
4: Configure a NAT Gateway for each AZ with an Elastic IP address

A

1: Use a Network Load Balancer and configure a static IP for each AZ
2: Use multiple Internet-facing Application Load Balancers with Elastic IP addresses
3: Configure redundant Internet Gateways and update the routing tables for each subnet

4: Configure a NAT Gateway for each AZ with an Elastic IP address

51
Q

An application tier of a multi-tier web application currently hosts two web services on the same set of instances. The web services each listen for traffic on different ports. Which AWS service should a Solutions Architect use to route traffic to the service based on the incoming request path?

1: Amazon Route 53
2: Amazon CloudFront
3: Application Load Balancer (ALB)
4: Classic Load Balancer (CLB)

A

1: Amazon Route 53
2: Amazon CloudFront

3: Application Load Balancer (ALB)

4: Classic Load Balancer (CLB)

52
Q

The data scientists in your company are looking for a service that can process and analyze real-time, streaming data. They would like to use standard SQL queries to query the streaming data. Which combination of AWS services would deliver these requirements?

1: DynamoDB and EMR
2: Kinesis Data Streams and Kinesis Data Analytics
3: ElastiCache and EMR
4: Kinesis Data Streams and Kinesis Firehose

A

1: DynamoDB and EMR

2: Kinesis Data Streams and Kinesis Data Analytics

3: ElastiCache and EMR
4: Kinesis Data Streams and Kinesis Firehose

53
Q

An e-commerce application is hosted in AWS. The last time a new product was launched, the application experienced a performance issue due to an enormous spike in traffic. Management decided that capacity must be doubled this week after the product is launched. What is the MOST efficient way for management to ensure that capacity requirements are met?

1: Add a Step Scaling policy
2: Add a Simple Scaling policy
3: Add a Scheduled Scaling action
4: Add Amazon EC2 Spot instances

A

1: Add a Step Scaling policy
2: Add a Simple Scaling policy

3: Add a Scheduled Scaling action

4: Add Amazon EC2 Spot instances

54
Q

You need to configure an application to retain information about each user session and have decided to implement a layer within the application architecture to store this information. Which of the options below could be used? (Select TWO)

1: Sticky sessions on an Elastic Load Balancer (ELB)
2: A block storage service such as Elastic Block Store (EBS)
3: A workflow service such as Amazon Simple Workflow Service (SWF)
4: A relational data store such as Amazon RDS
5: A key/value store such as ElastiCache Redis

A

1: Sticky sessions on an Elastic Load Balancer (ELB)

2: A block storage service such as Elastic Block Store (EBS)
3: A workflow service such as Amazon Simple Workflow Service (SWF)
4: A relational data store such as Amazon RDS

5: A key/value store such as ElastiCache Redis

55
Q

An application running on an external website is attempting to initiate a request to your company’s website using API calls to Amazon API Gateway. A problem has been reported in which the requests are failing with an error that includes the following text: “Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource” You have been asked to resolve the problem, what is the most likely solution?

1: The IAM policy does not allow access to the API
2: The ACL on the API needs to be updated
3: The request is not secured with SSL/TLS
4: Enable CORS on the API's resources using the selected methods under the API Gateway

A

1: The IAM policy does not allow access to the API
2: The ACL on the API needs to be updated
3: The request is not secured with SSL/TLS

4: Enable CORS on the API's resources using the selected methods under the API Gateway
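Enabling CORS makes API Gateway answer the browser's preflight OPTIONS request with headers naming the origins and methods it will accept. The sketch below models that check in plain Python; the partner origin is a placeholder:

```python
# Headers a CORS-enabled resource returns on preflight (placeholder origin).
CORS_HEADERS = {
    "Access-Control-Allow-Origin": "https://partner.example.com",
    "Access-Control-Allow-Methods": "GET,POST,OPTIONS",
    "Access-Control-Allow-Headers": "Content-Type,Authorization",
}

def preflight_allowed(origin: str, method: str) -> bool:
    """Model of the browser's check against the preflight response."""
    return (origin == CORS_HEADERS["Access-Control-Allow-Origin"]
            and method in CORS_HEADERS["Access-Control-Allow-Methods"].split(","))

print(preflight_allowed("https://partner.example.com", "GET"))  # True
print(preflight_allowed("https://evil.example.com", "GET"))     # False
```

Without these headers the browser enforces the same-origin policy and blocks the cross-origin call, producing exactly the error in the question.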

56
Q

A Solutions Architect is designing a new workload where an AWS Lambda function will access an Amazon DynamoDB table. What is the MOST secure means of granting the Lambda function access to the DynamoDB table?

1: Create an identity and access management (IAM) role with the necessary permissions to access the DynamoDB table, and assign the role to the Lambda function
2: Create a DynamoDB username and password and give them to the Developer to use in the Lambda function
3: Create an identity and access management (IAM) user and create access and secret keys for the user. Give the user the necessary permissions to access the DynamoDB table. Have the Developer use these keys to access the resources
4: Create an identity and access management (IAM) role allowing access from AWS Lambda and assign the role to the DynamoDB table

A

1: Create an identity and access management (IAM) role with the necessary permissions to access the DynamoDB table, and assign the role to the Lambda function

2: Create a DynamoDB username and password and give them to the Developer to use in the Lambda function
3: Create an identity and access management (IAM) user and create access and secret keys for the user. Give the user the necessary permissions to access the DynamoDB table. Have the Developer use these keys to access the resources
4: Create an identity and access management (IAM) role allowing access from AWS Lambda and assign the role to the DynamoDB table

57
Q

You are a Solutions Architect at a media company and you need to build an application stack that can receive customer comments from sporting events. The application is expected to receive significant load that could scale to millions of messages within a short space of time following high-profile matches. As you are unsure of the load required for the database layer what is the most cost-effective way to ensure that the messages are not dropped?

1: Use DynamoDB and provision enough write capacity to handle the highest expected load
2: Write the data to an S3 bucket, configure RDS to poll the bucket for new messages
3: Create an SQS queue and modify the application to write to the SQS queue. Launch another application instance that polls the queue and writes messages to the database
4: Use RDS Auto Scaling for the database layer which will automatically scale as required

A

1: Use DynamoDB and provision enough write capacity to handle the highest expected load
2: Write the data to an S3 bucket, configure RDS to poll the bucket for new messages

3: Create an SQS queue and modify the application to write to the SQS queue. Launch another application instance that polls the queue and writes messages to the database

4: Use RDS Auto Scaling for the database layer which will automatically scale as required
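The queue decouples ingestion from persistence: producers enqueue as fast as comments arrive, while a worker drains at the rate the database can sustain, so spikes are buffered rather than dropped. A toy model using a stdlib deque in place of SQS:

```python
from collections import deque

queue = deque()  # stands in for the SQS queue

def receive_comment(comment: str):
    queue.append(comment)  # app tier: cheap enqueue, never blocks

def db_writer(batch_size: int = 10):
    written = []
    for _ in range(min(batch_size, len(queue))):
        written.append(queue.popleft())  # worker tier: paced by the DB
    return written

for i in range(25):  # burst of messages after a high-profile match
    receive_comment(f"comment-{i}")
print(db_writer())   # first batch of 10 drained at the DB's pace
print(len(queue))    # 15 still safely buffered
```

The same pattern holds at millions of messages: SQS absorbs the burst, and the database layer is sized for sustained, not peak, throughput.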

58
Q

An organization in the health industry needs to create an application that will transmit protected health data to thousands of service consumers in different AWS accounts. The application servers run on EC2 instances in private VPC subnets. The routing for the application must be fault tolerant. What should be done to meet these requirements?

1: Create a virtual private gateway connection between each pair of service provider VPCs and service consumer VPCs
2: Create a proxy server in the service provider VPC to route requests from service consumers to the application servers
3: Create a VPC endpoint service and grant permissions to specific service consumers to create a connection
4: Create an internal Application Load Balancer in the service provider VPC and put application servers behind it

A

1: Create a virtual private gateway connection between each pair of service provider VPCs and service consumer VPCs
2: Create a proxy server in the service provider VPC to route requests from service consumers to the application servers

3: Create a VPC endpoint service and grant permissions to specific service consumers to create a connection

4: Create an internal Application Load Balancer in the service provider VPC and put application servers behind it

59
Q

A Solutions Architect is developing an encryption solution. The solution requires that data keys are encrypted using envelope protection before they are written to disk. Which solution option can assist with this requirement?

1: API Gateway with STS
2: IAM Access Key
3: AWS Certificate Manager
4: AWS KMS API

A

1: API Gateway with STS
2: IAM Access Key
3: AWS Certificate Manager

4: AWS KMS API
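Envelope encryption, which the KMS `GenerateDataKey` API supports, means encrypting data locally with a one-time data key, encrypting that data key under a master key, and storing only the ciphertext plus the encrypted data key. The stdlib sketch below shows the flow only; the XOR "cipher" is a deliberately toy stand-in, where real code would use KMS and an authenticated cipher such as AES-GCM:

```python
import os
import hashlib

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream 'cipher' for illustration only; not secure."""
    stream = hashlib.sha256(key).digest()
    stream = (stream * (len(data) // len(stream) + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, stream))

toy_decrypt = toy_encrypt  # an XOR stream cipher is its own inverse

master_key = os.urandom(32)  # stands in for the KMS master key (CMK)
data_key = os.urandom(32)    # GenerateDataKey: plaintext data key...
encrypted_data_key = toy_encrypt(master_key, data_key)  # ...and its ciphertext

ciphertext = toy_encrypt(data_key, b"protected health record")
stored = {"ciphertext": ciphertext, "encrypted_data_key": encrypted_data_key}

# Decryption path: recover the data key via the master key, then the data.
recovered_key = toy_decrypt(master_key, stored["encrypted_data_key"])
assert toy_decrypt(recovered_key, stored["ciphertext"]) == b"protected health record"
```

The plaintext data key never has to be persisted, which is the property the question's requirement ("data keys are encrypted using envelope protection before they are written to disk") is after.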

60
Q

A research company is developing a data lake solution in Amazon S3 to analyze huge datasets. The solution makes infrequent SQL queries only. In addition, the company wants to minimize infrastructure costs. Which AWS service should be used to meet these requirements?

1: Amazon Aurora
2: Amazon RDS for MySQL
3: Amazon Athena
4: Amazon Redshift Spectrum

A

1: Amazon Aurora
2: Amazon RDS for MySQL

3: Amazon Athena

4: Amazon Redshift Spectrum

61
Q

Your company shares some HR videos stored in an Amazon S3 bucket via CloudFront. You need to restrict access to the private content so users coming from specific IP addresses can access the videos and ensure direct access via the Amazon S3 bucket is not possible. How can this be achieved?

1: Configure CloudFront to require users to access the files using signed cookies, create an origin access identity (OAI) and instruct users to login with the OAI
2: Configure CloudFront to require users to access the files using a signed URL, create an origin access identity (OAI) and restrict access to the files in the Amazon S3 bucket to the OAI
3: Configure CloudFront to require users to access the files using signed cookies, and move the files to an encrypted EBS volume
4: Configure CloudFront to require users to access the files using a signed URL, and configure the S3 bucket as a website endpoint

A

1: Configure CloudFront to require users to access the files using signed cookies, create an origin access identity (OAI) and instruct users to login with the OAI

2: Configure CloudFront to require users to access the files using a signed URL, create an origin access identity (OAI) and restrict access to the files in the Amazon S3 bucket to the OAI

3: Configure CloudFront to require users to access the files using signed cookies, and move the files to an encrypted EBS volume
4: Configure CloudFront to require users to access the files using a signed URL, and configure the S3 bucket as a website endpoint

62
Q

The company you work for is currently transitioning their infrastructure and applications into the AWS cloud. You are planning to deploy an Elastic Load Balancer (ELB) that distributes traffic for a web application running on EC2 instances. You still have some application servers running on-premises and you would like to distribute application traffic across both your AWS and on-premises resources. How can this be achieved?

1: Provision a Direct Connect connection between your on-premises location and AWS and create a target group on an ALB to use IP based targets for both your EC2 instances and on-premises servers
2: Provision a Direct Connect connection between your on-premises location and AWS and create a target group on an ALB to use Instance ID based targets for both your EC2 instances and on-premises servers
3: Provision an IPSec VPN connection between your on-premises location and AWS and create a CLB that uses cross-zone load balancing to distribute traffic across EC2 instances and on-premises servers
4: This cannot be done, ELBs are an AWS service and can only distribute traffic within the AWS cloud

A

1: Provision a Direct Connect connection between your on-premises location and AWS and create a target group on an ALB to use IP based targets for both your EC2 instances and on-premises servers

2: Provision a Direct Connect connection between your on-premises location and AWS and create a target group on an ALB to use Instance ID based targets for both your EC2 instances and on-premises servers
3: Provision an IPSec VPN connection between your on-premises location and AWS and create a CLB that uses cross-zone load balancing to distribute traffic across EC2 instances and on-premises servers
4: This cannot be done, ELBs are an AWS service and can only distribute traffic within the AWS cloud

63
Q

An application you are designing receives and processes files. The files are typically around 4GB in size and the application extracts metadata from the files which typically takes a few seconds for each file. The pattern of updates is highly dynamic with times of little activity and then multiple uploads within a short period of time. What architecture will address this workload the most cost efficiently?

1: Use a Kinesis data stream to store the file, and use Lambda for processing
2: Store the file in an EBS volume which can then be accessed by another EC2 instance for processing
3: Upload files into an S3 bucket, and use the Amazon S3 event notification to invoke a Lambda function to extract the metadata
4: Place the files in an SQS queue, and use a fleet of EC2 instances to extract the metadata

A

1: Use a Kinesis data stream to store the file, and use Lambda for processing
2: Store the file in an EBS volume which can then be accessed by another EC2 instance for processing

3: Upload files into an S3 bucket, and use the Amazon S3 event notification to invoke a Lambda function to extract the metadata

4: Place the files in an SQS queue, and use a fleet of EC2 instances to extract the metadata
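With S3 event notifications, S3 invokes the Lambda function with one record per uploaded object, and the handler pulls the metadata it needs; nothing runs (or costs anything) between the sporadic bursts. A minimal handler sketch in the S3 event shape; the bucket and key names are placeholders:

```python
# Minimal Lambda handler for an S3 event notification.
def handler(event, context=None):
    out = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        out.append({"bucket": s3["bucket"]["name"],
                    "key": s3["object"]["key"],
                    "size": s3["object"]["size"]})
    return out

# Hand-built sample event in the S3 notification shape (placeholders).
sample_event = {"Records": [{"s3": {
    "bucket": {"name": "uploads"},
    "object": {"key": "videos/match-01.mp4", "size": 4 * 1024**3},
}}]}
print(handler(sample_event))
```

In practice the handler would read only the object's headers or a byte range for metadata extraction rather than downloading the full 4GB file.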

64
Q

The website for a new application receives around 50,000 requests each second and the company wants to use multiple applications to analyze the navigation patterns of the users on their website so they can personalize the user experience. What can a Solutions Architect use to collect page clicks for the website and process them sequentially for each user?

1: Amazon Kinesis Data Streams
2: Amazon SQS FIFO queue
3: AWS CloudTrail trail
4: Amazon SQS standard queue

A

1: Amazon Kinesis Data Streams

2: Amazon SQS FIFO queue
3: AWS CloudTrail trail
4: Amazon SQS standard queue

65
Q

You are building an application that will collect information about user behavior. The application will rapidly ingest large amounts of dynamic data and requires very low latency. The database must be scalable without incurring downtime. Which database would you recommend for this scenario?

1: RDS with MySQL
2: DynamoDB
3: Redshift
4: RDS with Microsoft SQL

A

1: RDS with MySQL

2: DynamoDB

3: Redshift
4: RDS with Microsoft SQL