sap full3 Flashcards
(396 cards)
A social media company has VPC Flow Logs enabled for its NAT gateway. The security team is seeing Action = ACCEPT for inbound traffic that comes from the public IP address 198.21.200.1 destined for a private EC2 instance. The team must determine whether the traffic represents unsolicited inbound connections from the internet. The first two octets of the VPC CIDR block are 205.1.
Which option can address this requirement?
Inspect the VPC Flow Logs in the CloudWatch console and select the log group that contains both the NAT gateway's ENI and the EC2 instance's ENI. Use a query filter with the destination address set to "like 205.1" and the source address set to "like 198.21.200.1". Run the stats command to sum the bytes transferred, grouped by the source address and the destination address
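The query described above can be sketched in CloudWatch Logs Insights syntax (the field names follow the default VPC Flow Logs format; the exact log group selection happens in the console before running the query):

```
fields @timestamp, srcAddr, dstAddr, action, bytes
| filter dstAddr like "205.1" and srcAddr like "198.21.200.1"
| stats sum(bytes) as bytesTransferred by srcAddr, dstAddr
```

If the summed inbound bytes from 198.21.200.1 are significant with no corresponding outbound traffic initiated from the instance, the traffic likely represents unsolicited inbound connections.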
The engineering team at a retail company wants to establish a dedicated, encrypted, low latency, and high throughput connection between its data center and AWS Cloud. The engineering team has set aside sufficient time to account for the operational overhead of establishing this connection.
Which option represents the MOST optimal solution, with the LEAST infrastructure setup required, for provisioning the end-to-end connection?
Use AWS Direct Connect along with a site-to-site VPN to establish a connection between the data center and AWS Cloud
A financial services company had a security incident recently and wants to review the security of its two-tier server architecture. The company wants to ensure that it follows the principle of least privilege while configuring the security groups for access between the EC2 instance-based app servers and RDS MySQL database servers. The security group for the EC2 instances as well as the security group for the MySQL database servers has no inbound and outbound rules configured currently.
As an AWS Certified Solutions Architect Professional, which TWO options would you recommend to adhere to the given requirements?
- Create an outbound rule in the security group for the EC2 instance app servers using TCP protocol on port 3306. Set the destination as the security group for the MySQL DB servers
- Create an inbound rule in the security group for the MySQL DB servers using TCP protocol on port 3306. Set the source as the security group for the EC2 instance app servers
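As a sketch, the two rules above could be declared as CloudFormation resources in JSON (the logical IDs AppServerSG and DBServerSG are hypothetical placeholders for the two security groups):

```json
{
  "AppToDbEgress": {
    "Type": "AWS::EC2::SecurityGroupEgress",
    "Properties": {
      "GroupId": {"Ref": "AppServerSG"},
      "IpProtocol": "tcp",
      "FromPort": 3306,
      "ToPort": 3306,
      "DestinationSecurityGroupId": {"Ref": "DBServerSG"}
    }
  },
  "DbFromAppIngress": {
    "Type": "AWS::EC2::SecurityGroupIngress",
    "Properties": {
      "GroupId": {"Ref": "DBServerSG"},
      "IpProtocol": "tcp",
      "FromPort": 3306,
      "ToPort": 3306,
      "SourceSecurityGroupId": {"Ref": "AppServerSG"}
    }
  }
}
```

Referencing security groups rather than CIDR ranges keeps the rules tight even as instances are replaced, which is what satisfies least privilege here.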
A bioinformatics company leverages multiple open source tools to manage data analysis workflows running on its on-premises servers to process biological data which is generated and stored on a Network Attached Storage (NAS). The existing workflow receives around 100 GB of input biological data for each job run and individual jobs can take several hours to process the data. The CTO at the company wants to re-architect its proprietary analytics workflow on AWS to meet the workload demands and reduce the turnaround time from months to days. The company has provisioned a high-speed AWS Direct Connect connection. The final result needs to be stored in Amazon S3. The company is expecting approximately 20 job requests each day.
Which of the following options would you recommend for the given use case?
Leverage AWS DataSync to transfer the biological data to Amazon S3. Use S3 events to trigger an AWS Lambda function that starts an AWS Step Functions workflow for orchestrating an AWS Batch job that processes the biological data
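The orchestration step can be sketched as a minimal Step Functions state machine in Amazon States Language (the job name, job definition, and job queue are hypothetical; .sync makes the workflow wait for the Batch job to finish):

```json
{
  "Comment": "Sketch: submit an AWS Batch job for one analysis run",
  "StartAt": "RunAnalysisJob",
  "States": {
    "RunAnalysisJob": {
      "Type": "Task",
      "Resource": "arn:aws:states:::batch:submitJob.sync",
      "Parameters": {
        "JobName": "genomics-analysis",
        "JobDefinition": "analysis-job-def",
        "JobQueue": "analysis-queue"
      },
      "End": true
    }
  }
}
```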
An e-commerce company is investigating user reports of errors in its Java-based web application on the day of the Thanksgiving sale. The development team recovered the logs created by the EC2 instance-hosted web servers and reviewed the Aurora DB cluster performance metrics. Some of the web servers were terminated before logs could be collected, and the Aurora metrics were inadequate for query performance analysis.
Which of the following steps would you recommend to make the monitoring process more reliable to troubleshoot any future events due to traffic spikes?
ERROR!
A retail company offers its services to the customers via APIs that leverage Amazon API Gateway and Lambda functions. The company also has a legacy API hosted on an Amazon EC2 instance that is used by the company’s supply chain partners. The security and audit team at the company has raised concerns over the use of these APIs and wants a solution to secure them all from any vulnerabilities, DDoS attacks, and malicious exploits.
What would you use to address the security requirements of the company?
Use AWS Web Application Firewall (WAF) as the first line of defense to protect the API Gateway APIs against malicious exploits and DDoS attacks. Use Amazon Inspector to check the EC2 instance for vulnerabilities. Configure Amazon GuardDuty to monitor for any malicious attempts to access the APIs illegally
An Amazon S3 bucket is shared by three different teams (managing their own separate AWS accounts) for document uploads. Initially, the S3 bucket settings were set to default. Later, the bucket sees the following updates:
After week 1, S3 Object Ownership bucket-level settings were used and all Access Control Lists (ACLs) were disabled. The three teams uploaded their documents to the shared bucket with this new setting.
After week 2, the S3 bucket-level settings were set back to default and the ACLs were enabled once more.
What is the outcome of these action(s) on the documents uploaded after week 1 and what are the key points of consideration for future S3 bucket configurations?
- You, as the bucket owner, still own any objects that were written to the bucket while the bucket owner enforced setting was applied. These objects are not owned by the object writer, even if you re-enable ACLs
- If you used object ACLs for permissions management before you applied the bucket owner enforced setting and you didn't migrate these object ACL permissions to your bucket policy, these permissions are restored after you re-enable ACLs
The development team at a company needs to implement a client-side encryption mechanism for objects that will be stored in a new Amazon S3 bucket. The team created a CMK that is stored in AWS Key Management Service (AWS KMS) for this purpose. The team created the following IAM policy and attached it to an IAM role:
{
  "Version": "2012-10-17",
  "Id": "key-policy-1",
  "Statement": [
    {
      "Sid": "GetPut",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject"
      ],
      "Resource": "arn:aws:s3:::ExampleBucket/*"
    },
    {
      "Sid": "KMS",
      "Effect": "Allow",
      "Action": [
        "kms:Decrypt",
        "kms:Encrypt"
      ],
      "Resource": "arn:aws:kms:us-west-1:111122223333:key/keyid-12345"
    }
  ]
}
The team was able to successfully get existing objects from the S3 bucket while testing. But any attempts to upload a new object resulted in an error. The error message stated that the action was forbidden.
Which IAM policy action should be added to the IAM policy to resolve the error?
kms:GenerateDataKey
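Client-side encryption with an AWS KMS key requires kms:GenerateDataKey to produce the data key that encrypts each object before upload. With that action added, the KMS statement from the policy above would read:

```json
{
  "Sid": "KMS",
  "Effect": "Allow",
  "Action": [
    "kms:Decrypt",
    "kms:Encrypt",
    "kms:GenerateDataKey"
  ],
  "Resource": "arn:aws:kms:us-west-1:111122223333:key/keyid-12345"
}
```

Downloads succeeded because decryption only needs kms:Decrypt, which the policy already allowed.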
A multi-national company operates hundreds of AWS accounts and the CTO wants to rationalize the operational costs. The CTO has mandated a centralized process for purchasing new Reserved Instances (RIs) or modifying existing RIs. Earlier, the business units (BUs) would purchase or modify RIs directly in their own AWS accounts. Now, all BUs must be denied the ability to purchase or modify RIs independently and must instead submit requests to a dedicated central team for purchasing RIs.
As an AWS Certified Solutions Architect Professional, which of the following solutions would you combine to enforce the new process most efficiently?
ERROR!
An Amazon Simple Storage Service (Amazon S3) bucket has been configured to host a static website. While using the S3 static website endpoint, the testing team has complained that they are receiving an access denied error for this website.
What are the key points to consider while configuring an S3 bucket as a static website?
- Objects can’t be encrypted by AWS Key Management Service (AWS KMS)
- The AWS account that owns the bucket must also own the object
AWS KMS doesn't support anonymous requests. As a result, the anonymous or public access settings on an Amazon S3 bucket do not apply to objects that are encrypted with AWS KMS
A financial services company wants to set up an AWS WAF-based solution to manage AWS WAF rules across multiple AWS accounts that are structured under different Organization Units (OUs) in AWS Organizations. The solution should automatically update and remediate noncompliant AWS WAF rules in all accounts. The solution should also facilitate adding or removing accounts or OUs from managed AWS WAF rule sets as needed.
Which of the following solutions is the most operationally efficient to address the given use case?
Create an AWS Organizations organization-wide AWS Config rule that mandates all resources in the selected OUs to be associated with the AWS WAF rules. Configure automated remediation actions by using AWS Systems Manager Automation documents to fix non-compliant resources. Set up AWS WAF rules by using an AWS CloudFormation stack set to target the same OUs where the AWS Config rule is applied
A data analytics company uses Amazon S3 as the data lake to store the input data that is ingested from the IoT field devices on an hourly basis. The ingested data has attributes such as the device type, ID of the device, the status of the device, the timestamp of the event, the source IP address, etc. The data runs into millions of records per day and the company wants to run complex analytical queries on this data daily for product improvements for each device type.
Which is the most optimal way to save this data to get the best performance from the millions of data points processed daily?
Store the data in Apache ORC format, partitioned by date and sorted by device type
A social learning platform allows students to connect with other students as well as experts and professionals from academic, research institutes and industry. The engineering team at the company manages 5 Amazon EC2 instances that make read-heavy database requests to the Amazon RDS for PostgreSQL DB cluster. As an AWS Certified Solutions Architect Professional, you have been asked to make the database cluster resilient from a disaster recovery perspective.
Which features will help you prepare for database disaster recovery?
ERROR!
An ed-tech company needs to deliver its video-on-demand (VOD) content to approximately 1 million users in a cost-effective way. The learning material is in the form of videos with a maximum size of 10 GB each. The videos are highly watched when initially uploaded but receive very few views after 6-8 months. While the old videos might not be accessed regularly, they need to be immediately available when requested. With trainers and material doubling every few months, the number of videos has exploded over the last few months, dramatically increasing the company's storage costs.
What is the most cost-effective way of storing these videos to address the given use case?
Use Amazon S3 Intelligent-Tiering storage class to store the video files. Configure this S3 bucket as the origin of an Amazon CloudFront distribution for delivering the contents to the customers
A social media company manages a multi-AZ VPC environment consisting of public subnets and private subnets. Each public subnet contains a NAT gateway, and the VPC has an internet gateway attached. Most of the company's applications are deployed in the private subnets and these applications read and write data to Kinesis Data Streams. The company has hired you as an AWS Certified Solutions Architect Professional to reduce costs and optimize the applications. Upon analysis in the AWS Cost Explorer, you notice that the cost in the EC2-Other category is consistently high due to the increasing NAT gateway data transfer charges.
What do you recommend to address this requirement?
Set up an interface VPC endpoint for Kinesis Data Streams in the VPC. Ensure that the VPC endpoint policy allows traffic from the applications
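A minimal VPC endpoint policy for this setup might look like the following (the account ID and stream name are hypothetical placeholders; a real policy would scope the principal and actions to what the applications actually need):

```json
{
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": [
        "kinesis:DescribeStream",
        "kinesis:GetShardIterator",
        "kinesis:GetRecords",
        "kinesis:PutRecord",
        "kinesis:PutRecords"
      ],
      "Resource": "arn:aws:kinesis:us-east-1:111122223333:stream/example-stream"
    }
  ]
}
```

With the interface endpoint in place, Kinesis traffic stays on the AWS network and no longer traverses the NAT gateway, eliminating those data transfer charges.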
A global biomedicine company has built a Genomics Solution on AWS Cloud. The company’s labs generate hundreds of terabytes of research data daily. To further accelerate the innovation process, the engineering team at the company wants to move most of the on-premises data into Amazon S3, Amazon EFS, and Amazon FSx for Windows File Server easily, quickly, and cost-effectively. The team would like to automate and accelerate online data transfers to these AWS storage services.
As a Solutions Architect Professional, which solution would you recommend as the BEST fit?
Use AWS DataSync to automate and accelerate online data transfers to the given AWS storage services
AWS DataSync is an online data transfer service that simplifies, automates, and accelerates copying large amounts of data to and from AWS storage services over the internet or AWS Direct Connect.
A retail company recently saw a huge spike in its monthly AWS spend. Upon further investigation, it was found that some developers had accidentally launched Amazon RDS instances in unexpected Regions. The company has hired you as an AWS Certified Solutions Architect Professional to establish best practices around least privileges for developers and control access to on-premises as well as AWS Cloud resources using Active Directory. The company has mandated you to institute a mechanism to control costs by restricting the level of access that developers have to the AWS Management Console without impacting their productivity. The company would also like to allow developers to launch RDS instances only in us-east-1 Region without limiting access to other services in any Region.
How can you help the company achieve the new security mandate while minimizing the operational burden on the DevOps team?
Configure SAML-based authentication tied to an IAM role that has the PowerUserAccess managed policy attached to it. Attach a customer-managed policy that denies access to RDS in any AWS Region except us-east-1
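A sketch of such a customer-managed deny policy, using the aws:RequestedRegion global condition key so that only RDS actions outside us-east-1 are blocked while all other services remain unaffected:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyRDSOutsideUsEast1",
      "Effect": "Deny",
      "Action": "rds:*",
      "Resource": "*",
      "Condition": {
        "StringNotEquals": {
          "aws:RequestedRegion": "us-east-1"
        }
      }
    }
  ]
}
```

Because an explicit Deny overrides the Allow in PowerUserAccess, developers keep broad productivity while RDS launches are confined to us-east-1.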
A leading video creation and distribution company has recently migrated to AWS Cloud for digitally transforming its movie business. The company wants to speed up its media distribution process and improve data security while also reducing costs and eliminating errors. The company wants to set up a Digital Cinema Network that would allow it to store content in Amazon S3 as well as to accelerate the online distribution of movies and advertising to theaters in 38 key media markets worldwide. The company also wants to do an accelerated online migration of hundreds of terabytes of files from their on-premises data center to Amazon S3 and then establish a mechanism for low-latency access of the migrated data for ongoing updates from the on-premises applications.
As a Solutions Architect Professional, what would you select as the MOST performant solution for the given use-case?
Use AWS DataSync to migrate existing data to Amazon S3 and then use File Gateway for low latency access to the migrated data for ongoing updates from the on-premises applications
A stock trading firm uses AWS Cloud for its IT infrastructure. The firm runs several trading-risk simulation applications, developing complex algorithms to simulate diverse scenarios in order to evaluate the financial health of its customers. The firm stores customers’ financial records on Amazon S3. The engineering team needs to implement an archival solution based on Amazon S3 Glacier to enforce regulatory and compliance controls on the archived data.
As a Solutions Architect Professional, which of the following solutions would you recommend?
Use S3 Glacier vault to store the sensitive archived data and then use a vault lock policy to enforce compliance controls
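For illustration, a vault lock policy along the lines of the example in the AWS documentation denies archive deletion until an archive is a year old (the account ID and vault name are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "deny-based-on-archive-age",
      "Principal": "*",
      "Effect": "Deny",
      "Action": "glacier:DeleteArchive",
      "Resource": "arn:aws:glacier:us-east-1:111122223333:vaults/examplevault",
      "Condition": {
        "NumericLessThan": {
          "glacier:ArchiveAgeInDays": "365"
        }
      }
    }
  ]
}
```

Once the vault lock is completed, the policy becomes immutable, which is what provides the regulatory and compliance guarantee.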
The DevOps team for a CRM SaaS company wants to implement a patching plan on AWS Cloud for a large mixed fleet of Windows and Linux servers. The patching plan has to be auditable and must be implemented securely to ensure compliance with the company’s business requirements.
As a Solutions Architect Professional, which option would you recommend to address these requirements with MINIMAL effort?
ERROR!
An e-commerce company wants to rollout and test a blue-green deployment for its global application in the next couple of days. Most of the customers use mobile phones which are prone to DNS caching. The company has only two days left before the big sale will be launched.
As a Solutions Architect Professional, which option would you suggest to test the deployment on as many users as possible in the given time frame?
Use AWS Global Accelerator to distribute a portion of traffic to a particular deployment
With AWS Global Accelerator, you can shift traffic gradually or all at once between the blue and the green environments and vice versa without being subject to DNS caching on client devices and internet resolvers. Changes to traffic dials and endpoint weights take effect within seconds.
An e-commerce company is planning to migrate its IT infrastructure from the on-premises data center to AWS Cloud to ramp up its capabilities well in time for the upcoming Holiday Sale season. The company’s CTO has hired you as an AWS Certified Solutions Architect Professional to design a distributed, highly available and loosely coupled order processing application. The application is responsible for receiving and processing orders before storing them in a DynamoDB table. The application has seen sporadic traffic spikes in the past and the CTO wants the application to be able to scale during marketing campaigns to process the orders with minimal disruption.
Which option would you recommend as the MOST reliable solution to address these requirements?
Ingest the orders in an SQS queue and trigger a Lambda function to process them
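As a sketch, the SQS-to-Lambda wiring could be declared as a CloudFormation resource in JSON (the queue ARN and function name are hypothetical placeholders):

```json
{
  "OrderQueueToProcessor": {
    "Type": "AWS::Lambda::EventSourceMapping",
    "Properties": {
      "EventSourceArn": "arn:aws:sqs:us-east-1:111122223333:order-queue",
      "FunctionName": "ProcessOrderFunction",
      "BatchSize": 10
    }
  }
}
```

The queue buffers traffic spikes so orders are never dropped, and Lambda scales consumers automatically, which is what makes this combination both loosely coupled and reliable.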
A company allows property owners and travelers to connect with each other for the purpose of renting unique vacation spaces around the world. The engineering team at the company uses an Amazon RDS for MySQL DB cluster because it simplifies much of the time-consuming administrative tasks typically associated with databases. The team uses a Multi-Availability Zone (Multi-AZ) deployment to further automate its database replication and augment data durability. The current cluster configuration also uses Read Replicas. An intern has joined the team and wants to understand the replication capabilities for Multi-AZ as well as Read Replicas for the given RDS cluster.
As a Solutions Architect Professional, which capability would you identify as correct for the given database?
Multi-AZ follows synchronous replication and spans at least two Availability Zones within a single region. Read Replicas follow asynchronous replication and can be within an Availability Zone, Cross-AZ, or Cross-Region
An e-commerce company runs a data archival workflow once a month for its on-premises data center, which is connected to the AWS Cloud over a minimally used 10-Gbps Direct Connect connection using a private virtual interface to its virtual private cloud (VPC). The company's internet connection is 200 Mbps, and the usual archive size is around 140 TB, created on the first Friday of each month. The archive must be transferred and available in Amazon S3 by the next Monday morning.
As a Solutions Architect Professional, which option would you recommend as the LEAST expensive way to address the given use-case?
Configure a public virtual interface on the 10-Gbps Direct Connect connection and then copy the data to S3 over the connection