AWS IAM and S3 Flashcards

1
Q

What is IAM?

A

Identity and Access Management:
- Core AWS service that helps you control access to RESOURCES (S3, Lambda, etc.)
- Users perform ACTIONS on resources (e.g. create a bucket in S3)
- Authorization to perform ACTIONS depends on POLICIES

2
Q

Is IAM universal?

A

Yes, it is a global service, not a regional one

3
Q

What is the root account?

A

The account created when you first set up your AWS account.

In an organization, the root account should be used for billing only; we shouldn't deploy anything in it

4
Q

Explain new users in relation to IAM.

A
  • When they are created, they have no access (no permissions by default).
  • They are assigned an Access Key ID and a Secret Access Key to be able to authenticate through API calls and the command line.
  • You can generate a federated sign-in URL (a "signed URL") to access the console using the Access Key ID and the Secret Access Key.
5
Q

S3 tips for the exam? (storage, bucket namespace, object-based type, restrictions, MFA, HTTP codes, etc.)

A

BUCKET NAMESPACE
- S3 bucket name is globally unique, and the namespace is shared by all AWS accounts. This means that after a bucket is created, the name of that bucket cannot be used by another AWS account in any AWS Region until the bucket is deleted.

OBJECT-BASED TYPE
- S3 is "object-based". Each object has these attributes:
- Key (the object's name)
- Value (the data of the file)
- Version ID
- Metadata (data about the data you are storing)

STORAGE
- Objects can be from 0 bytes to 5 terabytes (0 B - 5 TB)
- Unlimited total storage
- Files are stored in buckets

RESTRICTIONS
- Not suitable to install an OS on (object storage, not block storage)

HTTP CODE
- Successful uploads generate an HTTP 200 status code

MFA
- You can turn on MFA Delete (requires MFA to delete object versions or change the bucket's versioning state)

6
Q

S3: how to restrict Bucket access?

A
  • Bucket Policy: applies to the whole bucket
  • Object ACL: applies to individual files (objects)
  • IAM Policies attached to Users & Groups: apply to those users and groups
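The first option can be illustrated with a minimal bucket policy. This is a sketch in Python assuming a hypothetical bucket named example-bucket; it grants public read on every object.

```python
import json

# Sketch: a bucket policy granting public read on every object in a
# hypothetical bucket called "example-bucket".
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicRead",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::example-bucket/*",
        }
    ],
}

print(json.dumps(bucket_policy, indent=2))
```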
7
Q

What are the types (storage classes) of S3?

A
  • S3 Standard
  • S3 Intelligent-Tiering: optimizes cost by automatically moving data to the most cost-effective access tier (good if you don't have thousands and thousands of files).
  • S3 Infrequent Access (IA): for data that is accessed less frequently but needs to be retrieved quickly when requested.
  • S3 One Zone-IA: when you want the Infrequent Access pricing but without the multiple-Availability-Zone redundancy.
  • S3 Glacier: secure, durable, low-cost storage class for data archiving.
  • S3 Glacier Deep Archive: even lower cost than S3 Glacier, where a retrieval time of 12 hours is acceptable.
8
Q

Which S3 storage classes are the most expensive, and which are the cheapest?

A

Sorted by price (most expensive to cheapest):
S3 Standard
S3 - IA (Infrequent Access)
S3 Intelligent-Tiering
S3 One Zone - IA (Infrequent Access)
S3 Glacier
S3 Glacier Deep Archive

Try to avoid S3 Standard in favor of S3 Intelligent-Tiering, unless you have thousands and thousands of files (Intelligent-Tiering charges a monitoring/automation fee per object, so huge object counts add up).
If you don't have to worry about redundancy you can go with S3 One Zone-IA, but bear in mind that if that Availability Zone fails, you lose your data.

9
Q

Can S3 versioning be disabled?

A

After versioning is enabled on a bucket, it cannot be disabled; it can only be suspended. Things to note: suspending versioning stops any new versions of the objects from being created, but existing versions are kept.
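As a sketch, the versioning configuration sent to S3 (for example through boto3's put_bucket_versioning) only ever carries "Enabled" or "Suspended"; there is no "Disabled" value once versioning has been turned on:

```python
# Sketch: the VersioningConfiguration payloads for enabling and later
# suspending versioning on a bucket. Note there is no "Disabled" value.
enable = {"Status": "Enabled"}
suspend = {"Status": "Suspended"}

VALID_STATUSES = {"Enabled", "Suspended"}

def is_valid_versioning_status(status: str) -> bool:
    """Return True if S3 would accept this versioning status."""
    return status in VALID_STATUSES

print(is_valid_versioning_status("Suspended"))  # True
print(is_valid_versioning_status("Disabled"))   # False
```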

10
Q

What are the S3 types of encryptions?

A

ENCRYPTION IN TRANSIT (SSL/TLS via HTTPS)

SERVER SIDE ENCRYPTION (SSE)
- S3-Managed Keys (SSE-S3)
- AWS Key Management Service (SSE-KMS)
- Customer-Provided Keys (SSE-C)

CLIENT SIDE ENCRYPTION
- You encrypt the object yourself and then upload it yourself
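The three server-side modes map to different request parameters on a PUT Object call. A sketch of the shapes (boto3 kwarg names; the key ID and customer key below are hypothetical placeholders):

```python
# Sketch: request parameters for each server-side encryption mode on a
# PUT Object call (boto3 kwarg names shown; the key ID and customer key
# are hypothetical placeholders, not real credentials).
sse_s3 = {"ServerSideEncryption": "AES256"}  # SSE-S3: S3-managed keys

sse_kms = {
    "ServerSideEncryption": "aws:kms",       # SSE-KMS: KMS-managed key
    "SSEKMSKeyId": "example-kms-key-id",     # hypothetical key ID
}

sse_c = {                                    # SSE-C: you supply the key
    "SSECustomerAlgorithm": "AES256",
    "SSECustomerKey": b"0" * 32,             # hypothetical 256-bit key
}

print(sse_s3["ServerSideEncryption"])  # AES256
```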

11
Q

When you delete a file, are all of its versions deleted? And what happens when you modify the permissions of a file (make it public)?

A

No, you have to go in manually and delete each version of that file.

The same applies to making a file public: only the latest version becomes public. If you want previous versions to be public, you have to change them manually.

12
Q

What can you do with S3 lifecycle management?

A
  • Delete old versions after X amount of days
  • Change the storage tier (class) of objects, for example from S3 Standard to IA after some days without access

You create lifecycle rules for these kinds of actions
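A lifecycle rule combining both ideas can be sketched as the configuration shape used by put_bucket_lifecycle_configuration; the prefix and day counts are illustrative values, not recommendations:

```python
import json

# Sketch: a lifecycle configuration that transitions objects to cheaper
# tiers over time and expires old versions. The prefix and day counts
# are illustrative values only.
lifecycle = {
    "Rules": [
        {
            "ID": "archive-then-expire",
            "Filter": {"Prefix": "logs/"},  # hypothetical prefix
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
            # delete old versions after roughly a year
            "NoncurrentVersionExpiration": {"NoncurrentDays": 365},
        }
    ]
}

print(json.dumps(lifecycle, indent=2))
```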

13
Q

What is S3 Object Lock?

A

Store an object (file) using the "write once, read many" (WORM) model.

It can help prevent objects from being deleted or modified for X amount of time or indefinitely.

You can set it on individual objects or on a whole bucket

14
Q

What are the modes of S3 Object Lock?

A
  • Governance mode: only users with special permissions can edit or delete
  • Compliance mode: no user can edit or delete (not even the root user) until the retention period expires
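A sketch of the configuration shape (as accepted by put_object_lock_configuration); GOVERNANCE mode and the 30-day default retention are illustrative choices:

```python
# Sketch: an Object Lock configuration with a default retention rule.
# GOVERNANCE mode and 30 days are illustrative choices; COMPLIANCE mode
# would prevent even privileged users from deleting during retention.
object_lock = {
    "ObjectLockEnabled": "Enabled",
    "Rule": {
        "DefaultRetention": {
            "Mode": "GOVERNANCE",  # or "COMPLIANCE"
            "Days": 30,
        }
    },
}

print(object_lock["Rule"]["DefaultRetention"]["Mode"])  # GOVERNANCE
```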
15
Q

What is S3 Glacier Vault Lock?

A

Specify controls such as WORM in a Vault Lock policy and lock the policy against future edits. Once locked, the policy cannot be changed.

16
Q

What is a prefix in S3, and how can prefixes improve performance?

A

If a file has the path mybucket/folder/subfolder/file.png, the prefix is folder/subfolder.

S3 supports a fixed number of requests per second per prefix (3,500 PUT/COPY/POST/DELETE and 5,500 GET/HEAD).

If you spread objects across more prefixes, you can achieve more requests per second in total.

If you use SSE-KMS, additional limits apply, because every upload and download also calls the KMS API, which has its own request quota.
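Extracting the prefix from an object key can be sketched in a few lines of Python:

```python
def s3_prefix(key: str) -> str:
    """Return the prefix of an S3 object key: everything before the
    last '/' (an empty string if the key has no '/')."""
    head, _sep, _name = key.rpartition("/")
    return head

print(s3_prefix("folder/subfolder/file.png"))  # folder/subfolder
print(s3_prefix("file.png"))                   # (empty string)
```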

17
Q

What is S3 Select?

A

It is used to retrieve a subset of data from an object (file) by using simple SQL expressions.

Get data by rows and columns using simple SQL queries.

Save money on data transfer and increase speed.

For example, if you have a CSV file, you could query it without having to download it.
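S3 Select itself runs server-side (select_object_content with a SQL expression), but the row-and-column subsetting it performs can be sketched locally with Python's csv module:

```python
import csv
import io

# The real call is select_object_content with an expression such as
# "SELECT s.name FROM S3Object s WHERE CAST(s.age AS INT) > 30"; here
# the same filtering is done locally on an in-memory CSV.
data = io.StringIO("name,age\nana,34\nbob,28\ncarla,41\n")

names = [row["name"] for row in csv.DictReader(data) if int(row["age"]) > 30]
print(names)  # ['ana', 'carla']
```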

18
Q

Tips for AWS Account?

A
  • Multi-factor authentication on the root account
  • Strong and complex password on the root account
  • The paying account should be used for billing purposes only; do not deploy resources in it
  • Enable/disable AWS services using Service Control Policies (SCPs), either on OUs or on individual accounts
19
Q

3 different ways to share S3 buckets across accounts?

A
  • Using bucket policies and IAM (applies to the entire bucket)
  • Individual Object Access: Using ACLs and IAM. Programmatic Access Only
  • Cross Account IAM Roles. Programmatic AND console access.
20
Q

Cross-Region Replication for S3: what is it, and what are the tips?

A

Replication enables automatic, asynchronous copying of objects across Amazon S3 buckets.

  • Versioning must be enabled on both the source and destination buckets
  • Files already in an existing bucket are not automatically replicated; only objects added after replication is turned on are copied
  • All subsequently updated files will be replicated automatically
  • Delete markers and version deletions are not replicated
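A sketch of the replication configuration shape (as used by put_bucket_replication); the IAM role ARN and destination bucket ARN are hypothetical placeholders:

```python
# Sketch: a replication configuration; the IAM role ARN and destination
# bucket ARN are hypothetical placeholders.
replication = {
    "Role": "arn:aws:iam::111122223333:role/example-replication-role",
    "Rules": [
        {
            "ID": "replicate-everything",
            "Status": "Enabled",
            "Priority": 1,
            "Filter": {},  # empty filter = replicate all new objects
            "DeleteMarkerReplication": {"Status": "Disabled"},
            "Destination": {"Bucket": "arn:aws:s3:::example-destination-bucket"},
        }
    ],
}

print(replication["Rules"][0]["Status"])  # Enabled
```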
21
Q

When to use Cross-Region Replication

A

Meet compliance requirements – Although Amazon S3 stores your data across multiple geographically distant Availability Zones by default, compliance requirements might dictate that you store data at even greater distances. To satisfy these requirements, use Cross-Region Replication to replicate data between distant AWS Regions.

Minimize latency – If your customers are in two geographic locations, you can minimize latency in accessing objects by maintaining object copies in AWS Regions that are geographically closer to your users.

Increase operational efficiency – If you have compute clusters in two different AWS Regions that analyze the same set of objects, you might choose to maintain object copies in those Regions.

22
Q

When to use Same-Region Replication

A

Aggregate logs into a single bucket – If you store logs in multiple buckets or across multiple accounts, you can easily replicate logs into a single, in-Region bucket. Doing so allows for simpler processing of logs in a single location.

Configure live replication between production and test accounts – If you or your customers have production and test accounts that use the same data, you can replicate objects between those multiple accounts, while maintaining object metadata.

Abide by data sovereignty laws – You might be required to store multiple copies of your data in separate AWS accounts within a certain Region. Same-Region Replication can help you automatically replicate critical data when compliance regulations don’t allow the data to leave your country.

23
Q

What is S3 Transfer Acceleration?

A

Users upload files to a CloudFront edge location (you get a custom link for that), and from there the files are routed to the actual S3 bucket over the AWS network.

In that way, it accelerates uploads to S3
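Accelerated transfers use a distinct endpoint (bucketname.s3-accelerate.amazonaws.com) instead of the bucket's regional endpoint; building that URL can be sketched as:

```python
def accelerate_url(bucket: str, key: str) -> str:
    """Build the Transfer Acceleration URL for an object; accelerated
    uploads target the s3-accelerate endpoint rather than the bucket's
    regional endpoint."""
    return f"https://{bucket}.s3-accelerate.amazonaws.com/{key}"

print(accelerate_url("mybucket", "photos/cat.png"))
# https://mybucket.s3-accelerate.amazonaws.com/photos/cat.png
```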

24
Q

What is DataSync?

A
  • Used to move large amounts of data from on-premises to AWS
  • Used with NFS (Network File System) and SMB compatible file systems
  • Replication can be done hourly, daily, or weekly
  • Install the DataSync agent to start the replication
  • Can be used to replicate from EFS (Amazon Elastic File System) to another EFS
25
Q

related with cloudfront, explain “Edge Location”, “Origin” and “Distribution”, “Web Distribution and RTMP”

A

Edge Location: the location where the content will be cached

Origin: the origin of the files (for example an S3 bucket, an EC2 instance, or Route 53)

Distribution: the name given to the CDN, which is a collection of Edge Locations

A Web Distribution is used for websites; an RTMP distribution is used for media streaming

26
Q

Can you clear cache in cloudfront?

A

Yes, but you will be charged for invalidation requests

27
Q

IAM vs SCP (Service Control Policies)

A

SCPs are similar to AWS Identity and Access Management (IAM) permission policies and use almost the same syntax. However, an SCP never grants permissions. Instead, SCPs are JSON policies that specify the maximum permissions for the affected accounts.
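A minimal SCP sketch: like an IAM policy it is JSON with the same Effect/Action structure, but it only caps what the accounts may do and grants nothing by itself. The action below is an illustrative example that prevents member accounts from leaving the organization:

```python
import json

# Sketch: an SCP that denies member accounts the ability to leave the
# organization. Even with "Deny"/"Allow" statements, an SCP never
# grants access by itself; it only sets the permission ceiling.
scp = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Deny",
            "Action": "organizations:LeaveOrganization",
            "Resource": "*",
        }
    ],
}

print(json.dumps(scp))
```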