CloudAcademy Practice Exam Flashcards

1
Q

When configuring Azure Firewall network rules to allow connections to an application’s DNS server, which port number(s) should you select?

A. 53
B. 22
C. 25
D. 67 and 68

A

A. 53

Explanation:

SSH - 22

SSH, or 'Secure Shell', operates on TCP port 22 and is used to connect to a remote server or host. It lets you execute commands and transfer files remotely, and it is one of the most secure ways to do so: data is sent over the network in encrypted form, and only authorized users can log on remotely over port 22, which keeps information out of unauthorized hands. It can be used to move files within a network as well as securely between different networks. SSH operates at the Application Layer of the TCP/IP model and is considered one of the most secure and reliable ways to access files remotely.

DNS - 53

DNS, the 'Domain Name System', operates on port 53 over both TCP and UDP. DNS uses a distributed, hierarchical database to map the host names of computers or networks to their respective IP addresses. Port 53 listens for name-resolution queries from clients and resolvers. DNS operates at the Application Layer of the TCP/IP model.

DHCP - 67, 68
DHCP, or 'Dynamic Host Configuration Protocol', runs over UDP. Its basic purpose is to automatically assign IP address information to clients on a network; this information may include the IP address, subnet mask, default gateway, and so on. Many devices are configured to request an IP address via DHCP automatically when they join a network, making it a reliable way to hand out addresses to every device on the network. DHCP operates at the Application Layer of the TCP/IP model and uses two ports: port 67 (server) and port 68 (client).
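
For context, a quick way to confirm that traffic to a DNS server really does flow over port 53 is to query it directly. The sketch below uses the third-party dnspython library; the server IP address and hostname are placeholders, not values from the question.

```python
# pip install dnspython
# Minimal sketch: query a specific DNS server on UDP port 53.
# The IP address and hostname below are placeholders.
import dns.resolver

resolver = dns.resolver.Resolver(configure=False)
resolver.nameservers = ["10.0.0.4"]   # the application's DNS server (placeholder)
resolver.port = 53                    # the port an Azure Firewall network rule must allow

answer = resolver.resolve("app.internal.example.com", "A")
for record in answer:
    print(record.address)
```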

2
Q

You need to investigate unexpected errors caused by requests initiated by web applications hosted on Azure. You suspect errors were caused by several types of resources, including compute, storage, notifications, and key management.

The best course of action is to compile all the data and then create queries to analyze the data manually.

What type of log data should you review first, and what service should you use to review the data?

A. Review diagnostic logs with Log Analytics
B. Review activity logs with Azure Event Hub
C. Review application logs with Azure Queue Storage
D. Review boot diagnostic logs with Azure Table Storage

A

A. Review diagnostic logs with Log Analytics

Explanation:
To determine the right type of data to analyze, the key factor is that the errors likely occurred within requests to Azure services, which are actions tracked by diagnostic logs. Application logs would not be appropriate in this case, because the errors involve multiple types of resources, not just compute resources.

To determine the best service, the ability to create queries of log data is offered specifically by Log Analytics.
Learn more: https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-overview-of-diagnostic-logs?toc=/azure/azure-monitor/toc.json
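
If you later want to run those Log Analytics queries programmatically rather than only in the portal, the azure-monitor-query SDK can execute KQL against a workspace. This is a minimal sketch; the workspace ID, table name, and query are placeholders, not values from the question.

```python
# pip install azure-identity azure-monitor-query
# Sketch: run a KQL query against a Log Analytics workspace.
# The workspace ID and query below are placeholders.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = "AzureDiagnostics | where Level == 'Error' | take 20"
response = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",
    query=query,
    timespan=timedelta(days=1),
)

for table in response.tables:
    for row in table.rows:
        print(row)
```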

3
Q

Microsoft Defender for Cloud security policies can do all except which of the following?

A. Data collection from deployed resources
B. Security recommendations based on general best practices
C. Provide instructions on how to address existing security vulnerabilities
D. Enforce compliance with general security best practices

A

D. Enforce compliance with general security best practices

Explanation:
Microsoft Defender for Cloud is focused on monitoring your environment and alerting you to potential security threats. On its own, it does not enforce compliance - this is possible through Azure Policy.

4
Q

A company is building an application which is going to be hosted in Azure. They want the application to allow users to sign up by using their existing social accounts. Which of the following methods can help fulfill this requirement?

A. Set up a separate tenant. Use SAML to allow users to sign-up.
B. Set up a separate tenant. Use ws-federation to allow users to sign-up.
C. Create a separate Azure B2C directory. Register the application with the Azure Active Directory B2C directory.
D. Create a separate Azure directory. Register the application with the Azure Active Directory directory.

A

C. Create a separate Azure B2C directory. Register the application with the Azure Active Directory B2C directory.

Explanation:
Azure B2C is a separate directory that can make it easier for consumers that have social accounts to sign up for applications hosted in Azure. When they use Azure Active Directory B2C, the consumers can sign up for your applications by using their existing social media accounts (Facebook, Google, Amazon, LinkedIn) or by creating new credentials.

5
Q

You are the owner of a resource group that contains the following Azure resources:

VNet1, which contains Subnet1. Subnet1 is assigned a routing table and a network security group named NSG-1.
Subnet1 contains an ARM virtual machine named VM-Database1 with a private IP address only.

VM-Database1 needs to connect to an on-premises static IP address (216.3.128.12) to request software updates. You do not want to reveal the IP address of VM-Database1. All inbound traffic aside from the software updates should be blocked.

Which steps should you take to allow the database to connect successfully for updates while limiting threats? (Choose 2 answers.)

A. Deploy a private load balancer associated with the ARM virtual machine.
B. Deploy a NAT gateway associated with Subnet1.
C. Update NSG-1 to allow outbound traffic to and from 216.3.128.12 over port 443. Include no other rules allowing traffic.
D. Update NSG-1 to allow outbound traffic to 216.3.128.12 over port 443. Include no other rules allowing traffic.

A

B. Deploy a NAT gateway associated with Subnet1.
D. Update NSG-1 to allow outbound traffic to 216.3.128.12 over port 443. Include no other rules allowing traffic.

Explanation:
Network security group security rules are evaluated by priority using the 5-tuple information (source, source port, destination, destination port, and protocol) to allow or deny the traffic. A flow record is created for existing connections. Communication is allowed or denied based on the connection state of the flow record. The flow record allows a network security group to be stateful.

Deploy a Network Address Translation or NAT gateway to enable Source Network Address Translation (SNAT). As Microsoft explains in its documentation:

Source Network Address Translation (SNAT) rewrites the source of a flow to originate from a different IP address and/or port. Typically, SNAT is used when a private network needs to connect to a public host over the internet. SNAT allows multiple compute resources within the private VNet to use the same single Public IP address or set of IP addresses (prefix) to connect to the internet.
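
As a rough illustration of the NSG change in option D, the sketch below adds a single outbound allow rule for 216.3.128.12 over port 443 using the azure-mgmt-network SDK. The subscription ID and resource group name are placeholders, and the exact rule properties are an assumption based on the scenario, not a prescribed configuration.

```python
# pip install azure-identity azure-mgmt-network
# Sketch: add an outbound allow rule to NSG-1 for 216.3.128.12:443.
# Subscription ID and resource group name are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

network_client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = network_client.security_rules.begin_create_or_update(
    resource_group_name="rg-example",
    network_security_group_name="NSG-1",
    security_rule_name="Allow-Updates-Outbound",
    security_rule_parameters={
        "protocol": "Tcp",
        "direction": "Outbound",
        "access": "Allow",
        "priority": 100,
        "source_address_prefix": "*",
        "source_port_range": "*",
        "destination_address_prefix": "216.3.128.12",
        "destination_port_range": "443",
    },
)
print(poller.result().name)
```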

6
Q

What term describes a private, encrypted connection between an on-premises location and Azure, in which traffic technically passes over the internet?

A. A DNS zone
B. A Site-to-Site VPN connection
C. A Point-to-Site VPN connection
D. A VNet Peering connection

A

B. A Site-to-Site VPN connection

Explanation:
You want to be familiar with what is available in Azure for connecting different network sites, whether on-premises-to-Azure or Azure-to-Azure, and their limitations. For example, a Site-to-Site connection is your typical on-premises-to-Azure connection; although traffic technically passes over the internet, it is a private connection, with traffic encrypted and secured through IPsec tunnels. A Point-to-Site connection is for connecting individual client computers to an Azure Virtual Network. A VNet Peering connection is an Azure-to-Azure connection that does not have to use VPN gateways for connectivity across Azure VNets.
Learn more: https://docs.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-howto-site-to-site-resource-manager-portal

7
Q

You are configuring data security settings for separate Azure SQL databases. Database A stores social security numbers, which you want to prevent any users or applications from viewing. The social security numbers appear in one column within a single table of Database A.

Database B stores credit card information, including credit card numbers, which only privileged database administrators should be able to see. The credit card numbers appear in columns within several tables in Database B.

How should you configure the data encryption settings for these databases to meet these requirements?

A. Enable ‘Always Encrypted’ for Database A, and Dynamic Data Masking (DDM) for Database B.
B. Enable ‘Always Encrypted’ for Database A and Database B.
C. Enable Dynamic Data Masking (DDM) for Database A, and ‘Always Encrypted’ for Database B.
D. Enable Dynamic Data Masking (DDM) for Database A and Database B.

A

A. Enable ‘Always Encrypted’ for Database A, and Dynamic Data Masking (DDM) for Database B.

Explanation:
'Always Encrypted' prevents any users or applications from viewing or decrypting data, so in cases where data should be stored but never accessed by anyone except the customer, this feature should be enabled.

Dynamic Data Masking allows only privileged users to view specific data.

How often data appears within a database does not affect which encryption feature you enable, only how you apply it, which is not a factor in answering this question.
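
To make the Database B choice concrete, a Dynamic Data Masking rule is simply a column-level T-SQL change plus an UNMASK grant for privileged users. The sketch below issues that T-SQL from Python via pyodbc; the connection string, table, column, and user names are placeholders.

```python
# pip install pyodbc
# Sketch: apply a Dynamic Data Masking rule to a credit card column and
# grant UNMASK to a privileged DBA. Object names are placeholders.
import pyodbc

conn = pyodbc.connect("<azure-sql-odbc-connection-string>", autocommit=True)
cursor = conn.cursor()

# Mask everything except the last four digits for non-privileged users.
cursor.execute("""
    ALTER TABLE dbo.Payments
    ALTER COLUMN CreditCardNumber ADD MASKED WITH (FUNCTION = 'partial(0,"XXXX-XXXX-XXXX-",4)');
""")

# Privileged database administrators see the real values.
cursor.execute("GRANT UNMASK TO [dba_privileged];")
conn.close()
```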

8
Q

How does Microsoft Defender for Cloud ensure compliance with company and regulatory security requirements?

A. Customization by administrators
B. Pre-defined policies in the Azure subscription
C. Centralized Policy Management
D. By making recommendations to remediate security vulnerabilities

A

C. Centralized Policy Management

Explanation:
Through centralized policy management, compliance with company and regulatory security requirements is ensured.

9
Q

Azure Policy focuses on enforcing organizational standards on Azure ______________.

A. resources
B. users
C. groups
D. costs

A

A. resources

Explanation:
With policies, you can prevent users in your organization from breaking conventions that are needed to manage your organization’s resources. It is important to note that policies and RBAC work together. However, there are differences. RBAC focuses on the actions a user can perform at different scopes while policy focuses on resource actions at various scopes.

10
Q

You have a microservice application hosted on Azure App Service, named Azure Service Environment 1. The application communicates with on-premises database servers and data analysis applications. You need to find an effective monitoring solution to do the following:

Monitor performance of Azure Service Environment 1 and the on-premises database servers.
Provide alerts when communication between the on-premises database and Azure Service Environment 1 is disrupted.
Provide quantitative data regarding customer usage.

What Azure services or features within Azure App Service can meet all your requirements?

A. Azure Application Insights
B. Azure Monitor
C. Azure App Service Diagnostic Logs
D. Azure App Service Metrics

A

A. Azure Application Insights

Explanation:
Application Insights can collect data from applications running in Azure, on-premises, or in other clouds. The integration with Azure Web Apps makes it exceptionally easy to use in Azure.

11
Q

Which of the following statements regarding multi-factor authentication and Azure Privileged Identity Management, or PIM, is correct?

A. All PIM users are required to complete multi-factor authentication when logging in.
B. Multi-factor authentication for PIM users requires Azure MFA.
C. Existing on-premises Active Directory ID providers can manage MFA for PIM logins.
D. Existing third-party ID providers can perform MFA for PIM logins.

A

C. Existing on-premises Active Directory ID providers can manage MFA for PIM logins.

Explanation:
There are two options for validating MFA when a user activates a role.

The simplest option is to rely on Azure MFA for users who are activating a privileged role. To do this, first check that those users are licensed, if necessary, and have registered for Azure MFA. For more information about how to deploy Azure MFA, see Deploy cloud-based Azure Multi-Factor Authentication. It is recommended, but not required, that you configure Azure AD to enforce MFA for these users when they sign in. This is because the MFA checks will be made by PIM itself.

Alternatively, if users authenticate on-premises, you can have your identity provider be responsible for MFA. For example, if you have configured AD Federation Services to require smartcard-based authentication before accessing Azure AD, Securing cloud resources with Azure Multi-Factor Authentication and AD FS includes instructions for configuring AD FS to send claims to Azure AD. When a user tries to activate a role, PIM will accept that MFA has already been validated for the user once it receives the appropriate claims.

12
Q

You delete an existing B2C tenant, and re-create it with the same domain name. Now users are not able to sign-in. What does Microsoft recommend in this situation?

A. Create the B2C tenant with a different domain name.
B. Create the B2C tenant with an identical domain name.
C. Call support
D. Create the B2C tenant with a similar domain name and add a number to it which you will delete in the future.

A

A. Create the B2C tenant with a different domain name.

Explanation:
There are known issues when deleting an existing B2C tenant and recreating it with the same domain name. To resolve this issue, Microsoft recommends creating the new B2C tenant with a different domain name.

13
Q

A company hosts a web-based .NET application in Azure. They require that whenever abnormal activity occurs, such as a high page request rate, a custom application is notified so that the activity can be handled accordingly. Which option below meets this requirement?

A. Create an alert in the Azure dashboard and configure the email alert. Ensure the custom application consumes the email alerts.
B. Create a custom PowerShell utility to check the application request rate and then alert the custom application accordingly.
C. Create an alert and use the Webhook functionality to send the notification to the custom application.
D. Create a custom utility that monitors and checks the application request rate and then sends the alert to the custom application.

A

C. Create an alert and use the Webhook functionality to send the notification to the custom application.

Explanation:
Webhooks allow you to route an Azure alert notification to other systems for post-processing or custom actions. Many custom systems support webhooks, which makes this the ideal way to notify third-party systems of any irregularities flagged by Azure alerts.
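
As an illustration of the webhook approach, the custom application only needs an HTTP endpoint that accepts the alert's JSON POST. The sketch below uses Flask and assumes the alert's action group is configured to use the common alert schema; the route path and handling logic are placeholders.

```python
# pip install flask
# Sketch: a minimal endpoint that receives an Azure Monitor alert webhook.
# Assumes the action group uses the common alert schema; the route path
# and handling logic are placeholders.
from flask import Flask, request

app = Flask(__name__)

@app.route("/alerts/azure", methods=["POST"])
def handle_alert():
    payload = request.get_json(force=True)
    essentials = payload.get("data", {}).get("essentials", {})
    print("Alert rule:", essentials.get("alertRule"))
    print("Severity:  ", essentials.get("severity"))
    print("Condition: ", essentials.get("monitorCondition"))
    # Hand off to the custom application's own handling logic here.
    return "", 200

if __name__ == "__main__":
    app.run(port=8080)
```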

14
Q

Through a series of corporate acquisitions, your company recently acquired two new branch offices. You would like to sync your identity and access management systems using Azure Active Directory to allow shared access to training applications and other corporate resources. Key details are listed below.

Your office (Office 1) has an on-premises employee training application with single sign-on managed through Active Directory Federation Services.
Office 1 and Office 2 are different branch offices that share access to company resources within the same single Azure AD tenant.
Office 3 is a recently acquired office with separate resources secured by a proprietary identity and access management solution. Technically Office 3 is a different company owned by the same parent company that owns Offices 1 and 2.

Office 1 has enabled the necessary Azure Active Directory single sign-on solution to allow access to the on-premises training application through Azure and Office 365. Office 2 can now connect to and access the training application, but Office 3 cannot.

Which Azure AD feature can allow Office 3 employees to access the training resources with their existing IAM credentials?

A. Azure AD B2B
B. Azure AD B2C
C. Azure Hybrid Identities
D. Azure AD Connect

A

A. Azure AD B2B

Explanation:
Azure Active Directory Business-to-Business collaboration, also known as Azure AD B2B, allows an organization to securely share company applications and company services with guest users from other organizations while retaining control over company data. With Azure AD B2B, an organization can work with external partners, even if they don’t use Azure AD. The invitation and redemption process of Azure AD B2B allows users in a partner organization to use their own credentials to access a company’s resources. Because the partner organization uses its own identity management solution, the external administrative overhead for the sharing organization is essentially non-existent. There’s no requirement to manage external accounts or passwords, nor is there a need to synchronize accounts or manage account lifecycles.

15
Q

As your company’s database administrator and owner of a Cosmos DB account, you need to create a new Cosmos DB database to support an application currently being developed. You also need to grant access to a member of your IT staff who will be testing the new application. This staff member will need to create containers and modify the Cosmos DB database settings to fine-tune them.

Additionally, you will need to create the credentials that the application, which will be hosted on Azure App Service web apps, needs to connect with the database; the application will upload, modify, and read data to fulfill expected requests.

To simplify the testing process, you would like to create a set of application credentials that persists while the test resources themselves may be continuously created and deleted throughout the development process. You also want to provide access to the IT staff member following general security best practices.

How should you proceed?

A. Create the database using the primary or secondary read-write master key. Assign the IT staff member Account Contributor role through Azure Active Directory. Create a user-assigned managed identity for the application hosted on Azure App Service web apps.
B. Create the database using the primary read-write master key. Provide access to the IT staff member using the secondary read-write master key. Create a user-assigned managed identity for the application hosted on Azure App Service web apps.
C. Create the database using the primary or secondary read-write master key. Assign the IT staff member Account Contributor role through Azure Active Directory. Create a system-assigned managed identity for the application hosted on Azure App Service web apps.
D. Create the database using the primary or secondary read-write master key. Provide access to the IT staff member using the primary read-only master key. Create a system-assigned managed identity for the application hosted on Azure App Service web apps.

A

A. Create the database using the primary or secondary read-write master key. Assign the IT staff member Account Contributor role through Azure Active Directory. Create a user-assigned managed identity for the application hosted on Azure App Service web apps.

Explanation:

The master keys are essentially the root access keys for the Cosmos DB account owner, and can be used to create resources, but should not be shared. Assigning permissions via RBAC is the best course of action in this case, and creating a user-assigned managed identity means the credentials will persist and can be repeatedly assigned to new and different resources in the dev/test environment.
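
A rough sketch of how the web app could use the user-assigned managed identity, assuming the azure-cosmos SDK with Azure AD (token) authentication; the account URL, identity client ID, and database/container names are placeholders.

```python
# pip install azure-identity azure-cosmos
# Sketch: a web app authenticating to Cosmos DB with a user-assigned
# managed identity instead of the account master keys.
# (The identity also needs a Cosmos DB data-plane role assignment.)
# The account URL, identity client ID, and names below are placeholders.
from azure.identity import ManagedIdentityCredential
from azure.cosmos import CosmosClient

credential = ManagedIdentityCredential(client_id="<user-assigned-identity-client-id>")
client = CosmosClient("https://<cosmos-account>.documents.azure.com:443/", credential=credential)

container = client.get_database_client("app-db").get_container_client("orders")
container.upsert_item({"id": "order-1", "status": "test"})
```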

16
Q

Your organization is implementing an application that will be published through the Azure Active Directory (Azure AD) Application Proxy, which primarily enables access to on-premises applications. The application relies on a central on-premises directory such as Windows Server Active Directory. What statement describes how identity and access management occur?

A. Access to this application is enabled through an X.509 certificate and SSH key.
B. Access to this application is enabled through directory information and token issuance.
C. The access credential may be a federation token or user-name and password for an account that was previously provisioned in the application.
D. Access to this application is enabled by triggering the proxy to deliver the application content to the end user while honoring the on-premises sign-on requirement.

A

D. Access to this application is enabled by triggering the proxy to deliver the application content to the end user while honoring the on-premises sign-on requirement.

Explanation:
It is important to understand that the way authorization is enacted on the target application varies depending on how the application was integrated with Azure AD. On-premises applications are published through the Azure AD Application Proxy, which primarily enables access to on-premises applications. These applications rely on a central on-premises directory like Windows Server Active Directory. Access to these applications is enabled by triggering the proxy to deliver the application content to the end user while honoring the on-premises sign-on requirement.

17
Q

You work as a security manager for a company that builds and manages customer applications. As part of your service agreement, you allow customers to conduct spot checks of their applications in the production environment.

You typically apply resource locks that prevent any modifications to all resources being checked, to prevent any accidental changes during the review.

Your customer, Contoso, will be checking all resources within Resource Group 1 over the next five business days. Resource Group 1 contains 50 resources in total.

Resource Group 1 contains a critical database hosted on VM1 that needs extensive updates and patching over the next five business days.

What is the most efficient way to apply resource locks to the resources in Resource Group 1 to prevent accidental modifications, but still allow the database on VM1 to be modified?

A. Move VM1 to a new resource group and then apply a CanNotDelete resource lock to VM1. Then apply a ReadOnly resource lock to Resource Group 1.
B. Apply ReadOnly resource locks to all resources except VM1. Apply a CanNotDelete resource lock to VM1.
C. Apply a ReadOnly resource lock to Resource Group 1. Then apply a CanNotDelete resource lock to VM1.
D. Apply a ReadOnly resource lock to Resource Group 1, and apply that lock to the customer reviewer’s Azure AD guest user account.

A

A. Move VM1 to a new resource group and then apply a CanNotDelete resource lock to VM1. Then apply a ReadOnly resource lock to Resource Group 1.

Explanation:
Resource locks apply to all users and roles; they cannot be applied only to specific users or roles. When a resource lock is applied to a resource group, it is inherited by all the resources within it, and when multiple locks apply to a single resource, the most restrictive lock takes effect. Because a ReadOnly lock on Resource Group 1 would be inherited by VM1 and would block its updates, VM1 must first be moved to a separate resource group, where a CanNotDelete lock still protects it while allowing modifications.
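
For reference, the two locks from the correct answer can be scripted with the azure-mgmt-resource lock client. This is a sketch only: the subscription ID, resource group names, and lock names are placeholders, and the operation names shown are an assumption based on recent SDK versions.

```python
# pip install azure-identity azure-mgmt-resource
# Sketch: ReadOnly lock on Resource Group 1, CanNotDelete lock on VM1 after
# it has been moved to its own resource group. Names and IDs are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource.locks import ManagementLockClient

lock_client = ManagementLockClient(DefaultAzureCredential(), "<subscription-id>")

# Lock everything Contoso will review.
lock_client.management_locks.create_or_update_at_resource_group_level(
    resource_group_name="ResourceGroup1",
    lock_name="contoso-review-readonly",
    parameters={"level": "ReadOnly"},
)

# VM1 now lives in its own resource group, so only deletion is blocked.
lock_client.management_locks.create_or_update_at_resource_level(
    resource_group_name="ResourceGroupVM1",
    resource_provider_namespace="Microsoft.Compute",
    parent_resource_path="",
    resource_type="virtualMachines",
    resource_name="VM1",
    lock_name="vm1-cannotdelete",
    parameters={"level": "CanNotDelete"},
)
```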

18
Q

You are the senior Azure SQL Database architect for a Wall Street brokerage firm. Your firm is subject to numerous governmental regulations that require retention of the automatic full Azure SQL Database backups for 7 years, far beyond the 7-35 days offered by Azure SQL Database's automatic backup feature.

What does Microsoft recommend as the best way to meet this long-term backup retention requirement?

A. Use the Azure SQL Database Geo-replication tool as it can handle backups for that time period.
B. Use the Azure SQL Database Long-Term Backup Retention feature.
C. Run nightly tape backup jobs and store the tapes in a third-party vault offsite.
D. Use a third-party backup and recovery tool to store the databases for seven years on a rotating basis

A

B. Use the Azure SQL Database Long-Term Backup Retention feature.

Explanation:
The Long-Term Backup Retention feature enables you to store your Azure SQL Database backups in an Azure Recovery Services vault for up to 10 years. This feature can be used for applications that have regulatory, compliance, or other business purposes that require you to retain the automatic full database backups beyond the 7-35 days provided by SQL Database’s automatic backups.

19
Q

Of the following choices, which is the name of the most current Active Directory synchronization tool?

A. Azure AD Connect
B. Azure Active Directory Synchronization Services (DirSync)
C. Azure Active Directory Synchronization Services (AAD Sync)
D. Forefront Identity Manager (FIM)

A

A. Azure AD Connect

Explanation:
Azure AD Connect is the most current Active Directory synchronization tool. The other tools are considered legacy and/or deprecated but still functional.

20
Q

Azure AD applications can access a key vault within Azure Key Vault using which credentials?

A. SAS token
B. Multi-Factor Authentication
C. Username and Password
D. Client Id and Client Secret or Certificate

A

D. Client Id and Client Secret or Certificate

Explanation:
Azure AD applications use their Client ID and Client Secret or a Certificate to access Key Vault. SAS tokens are used to temporarily grant access to Azure Storage. Multi-Factor Authentication is used to authenticate user sign-in, not by applications accessing Key Vault. Username and password is not a valid option for applications accessing Key Vault.
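
A minimal sketch of that flow with the Python SDK, assuming the application is reading a secret; the tenant ID, client ID, client secret, vault URL, and secret name are all placeholders.

```python
# pip install azure-identity azure-keyvault-secrets
# Sketch: an Azure AD application reading a Key Vault secret with its
# client ID and client secret. All identifiers below are placeholders.
from azure.identity import ClientSecretCredential
from azure.keyvault.secrets import SecretClient

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<app-client-id>",
    client_secret="<app-client-secret>",
)

client = SecretClient(vault_url="https://<vault-name>.vault.azure.net/", credential=credential)
print(client.get_secret("database-connection-string").value)
```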

21
Q

Your organization wants to secure customer personal data stored within your Azure Virtual Machine (VM) environment. You suggest Azure Disk Encryption, which is an option available to both Linux and Windows VMs.

While the encryption process is actually pretty straightforward, and is as easy as deploying a VM extension in PowerShell, what is one caveat to the process that adds a level of complexity?

A. Bitlocker enabled and Azure Backup Service are mutually exclusive processes.
B. Bitlocker is ineffective at encrypting the operating system.
C. A mechanism must be in place to manage the encryption keys for the encrypted disk.
D. The process of creating the encryption keys is complex.

A

C. A mechanism must be in place to manage the encryption keys for the encrypted disk.

Explanation:
The one caveat to the BitLocker process that adds a level of complexity is managing the encryption keys that go along with encrypting your disk. After all, if you lock something away, someone has to keep track of the keys to reopen it. The good news is that Azure provides the Azure Key Vault service, which helps you manage and control your disk-encryption keys and the secrets used by cloud applications and services.

22
Q

Azure Update Management can manage operating system updates for which combination of operating systems and platforms?

A. Any Azure-hosted virtual machine running supported Windows or Linux operating systems
B. Any cloud-based virtual machine (Azure or non-Azure) running a supported Windows operating system
C. Any Azure-hosted or on-premises computer running supported Windows or Linux operating systems
D. Any cloud-based (Azure or non-Azure) or on-premises computer running supported Windows or Linux operating systems

A

D. Any cloud-based (Azure or non-Azure) or on-premises computer running supported Windows or Linux operating systems

Explanation:
You can use the Update Management solution in Azure Automation to manage operating system updates for your Windows and Linux computers in Azure, in on-premises environments, and in other cloud providers. You can quickly assess the status of available updates on all agent computers and manage the process of installing required updates for servers.

23
Q

Stuart is a contractor who needs read and write access to resources within two resource groups, Resource Group 1 and Resource Group 2. He will assist with updates to live applications within both resource groups.

The role assignment has the following requirements:

For security reasons, all hired contract employees must complete MFA for each login to Azure.
Due to the urgency of the project, Stuart should have immediate access to all resources upon assignment.

What choices below best meet these requirements? (Choose 2 answers)

A. Assign an active role type.
B. Assign an eligible role type.
C. Require MFA while on active assignment.
D. Require MFA upon role activation and active assignment.

A

A. Assign an active role type.
C. Require MFA while on active assignment.

Explanation:
There are two assignment types: active and eligible. Eligible assignments require an action to activate them, such as completing MFA or providing a justification for activation. Active roles are essentially pre-approved and require no further MFA or justification to activate.

For Stuart to have immediate access, he needs an active role, and with an active role, MFA upon role activation is not a valid option; only MFA while on active assignment meets the MFA requirement.

24
Q

You are configuring an Azure Firewall outbound network rule to allow connections to an IP address via port 53. Which protocol should you select?

A. UDP
B. POP
C. DHCP
D. HTTPS

A

A. UDP

Explanation:
To create the network rule, select the Network Rule Collection tab and choose the option to add a network rule collection; in this example it is named NetworkCollection, the priority is set to 200, and the action is set to allow the traffic. Then define the rule itself under the Rules section: name it AllowDNS and choose UDP for the protocol, since DNS is UDP traffic.

25
Q

What statement comparing Azure resource roles and Azure Privileged Identity Management (PIM) is correct?

A. Azure resource roles are not integrated into Azure PIM
B. By default, Global Administrators can perform read and write operations managed within Azure PIM.
C. Managing Azure resource roles and PIM settings both require Azure MFA.
D. Azure resource roles include a hierarchy, while PIM settings are resource specific.

A

D. Azure resource roles include a hierarchy, while PIM settings are resource specific.

Explanation:
The concept of a resource hierarchy is unique to Azure resource roles. This hierarchy enables the inheritance of role assignments from a parent resource object downward to all child resources within the parent container.

PIM settings are configured for each role of a resource. Unlike assignments, these settings are not inherited and apply strictly to the resource role.
Learn more: https://docs.microsoft.com/en-us/azure/active-directory/privileged-identity-management/pim-resource-roles-eligible-visibility#azure-resource-role-approval-workflow

26
Q

Reviewing your recently launched application, you notice several changes that could affect performance or security. You are concerned that the IT management team is not reviewing corporate requirements in these areas.

You want to automate a process to review Azure resources, including storage accounts and VMs, against the rules you specify. You want to be automatically notified of any noncompliance.

What steps can you take to accomplish this goal?

A. Create resource policies with Azure Policy.
B. Implement ‘Read-Only’ resource locks.
C. Create alerts using Azure Log Analytics.
D. Initialize change tracking with Azure Automation and Log Analytics.

A

A. Create resource policies with Azure Policy.

Explanation:
Azure Policy is the only option that allows you to specify requirements and review resources against those requirements.
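
As a rough example of the kind of rule Azure Policy evaluates, the sketch below defines and assigns an audit policy with the azure-mgmt-resource PolicyClient. The rule shown (flag storage accounts that allow non-HTTPS traffic) and all names are illustrative placeholders, not requirements from the question.

```python
# pip install azure-identity azure-mgmt-resource
# Sketch: define and assign a simple "audit" policy. The rule used here
# (flag storage accounts that allow HTTP) is illustrative only.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource.policy import PolicyClient

subscription_id = "<subscription-id>"
policy_client = PolicyClient(DefaultAzureCredential(), subscription_id)

definition = policy_client.policy_definitions.create_or_update(
    policy_definition_name="audit-storage-https",
    parameters={
        "display_name": "Audit storage accounts that allow HTTP",
        "mode": "All",
        "policy_rule": {
            "if": {
                "allOf": [
                    {"field": "type", "equals": "Microsoft.Storage/storageAccounts"},
                    {"field": "Microsoft.Storage/storageAccounts/supportsHttpsTrafficOnly", "equals": "false"},
                ]
            },
            "then": {"effect": "audit"},
        },
    },
)

policy_client.policy_assignments.create(
    scope=f"/subscriptions/{subscription_id}",
    policy_assignment_name="audit-storage-https-assignment",
    parameters={"policy_definition_id": definition.id},
)
```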

27
Q

Which of the following security validation methods simply tests the responsiveness of your website at regular intervals?

A. URL ping tests
B. Custom telemetry tests
C. playback of recorded web requests
D. custom attack surface reviews

A

A. URL ping tests

Explanation:
At the most basic level, there is the URL ping test, which, as the name implies, tests the basic responsiveness of your website at regular intervals and logs the results. The ping test can be configured through the Azure portal.

28
Q

When you configure key management for storage accounts, which of the following key vault configurations must you ensure?

A. The key vault is in the same region as the storage account
B. The key vault is in the same subscription as the storage account
C. The key vault is linked to a container within the storage account
D. Make sure all of these key configurations are correct

A

A. The key vault is in the same region as the storage account

Explanation:
Azure Key Vault is a multi-tenant service and uses a pool of Hardware Security Modules (HSMs) in each Azure location.

All HSMs at Azure locations in the same geographic region share the same cryptographic boundary (Thales Security World). For example, East US and West US share the same security world because they belong to the US geo location. Similarly, all Azure locations in Japan share the same security world and all Azure locations in Australia, India, and so on.

A backup taken of a key from a key vault in one Azure location can be restored to a key vault in another Azure location, as long as both of these conditions are true:

Both of the Azure locations belong to the same geographical location
Both of the key vaults belong to the same Azure subscription

For example, a backup taken by a given subscription of a key in a key vault in West India, can only be restored to another key vault in the same subscription and geolocation; West India, Central India or South India.
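
As a small illustration of the backup and restore behavior described above, the azure-keyvault-keys SDK exposes backup and restore operations. The vault URLs and key name below are placeholders, and the restore will only succeed if both vaults are in the same subscription and geography, as the explanation notes.

```python
# pip install azure-identity azure-keyvault-keys
# Sketch: back up a key from one vault and restore it to another vault in
# the same subscription and geography. Vault URLs and key name are placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient

credential = DefaultAzureCredential()
source = KeyClient(vault_url="https://<vault-west-india>.vault.azure.net/", credential=credential)
target = KeyClient(vault_url="https://<vault-south-india>.vault.azure.net/", credential=credential)

backup_bytes = source.backup_key("storage-encryption-key")
restored_key = target.restore_key_backup(backup_bytes)
print(restored_key.name)
```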

29
Q

Your company is being audited, and an external accountant needs access to review and download specific files from the blob storage and file storage services within one specific Azure storage account.

You currently use Azure Active Directory to control access to the Azure storage account in question. However, you have been told you need to provide the accountant with immediate access to the blob and file storage account without any further information.

How can you provide necessary access, but also limit it to the blobs in question?

A. Provide the accountant with read-only access to the specific Azure Blob and File storage services with a service-level shared access signature token. Allow all read requests but limit write requests to LIST and GET. Specify the HTTPS protocol is required to accept requests.
B. Assign the accountant a guest role in Azure Active Directory with read-only access to the specific Azure Blob and File services in the Azure Storage account.
C. Assign the accountant a contributor role access to the entire storage account using Azure AD role-based access control (RBAC).
D. Provide the accountant with read-only access to the specific Azure Blob and File storage services with an account-level shared access signature token. Allow all read requests but limit write requests to LIST and GET. Specify the HTTPS protocol is required to accept requests.

A

D. Provide the accountant with read-only access to the specific Azure Blob and File storage services with an account-level shared access signature token. Allow all read requests but limit write requests to LIST and GET. Specify the HTTPS protocol is required to accept requests.

Explanation:
In this case, an account-level SAS is required because the accountant needs access to two separate services in the account. You do not have the information needed to create a guest or contributor account to control the accountant’s access, but you can add controls that require requests to be sent over HTTPS and that restrict the specific read/write actions.
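
A sketch of generating that account-level SAS for the Blob service with azure-storage-blob (the File service would use the analogous helper in azure-storage-file-share). The account name, key, and expiry window are placeholders, and the protocol keyword is an assumption about the SDK's options.

```python
# pip install azure-storage-blob
# Sketch: account-level SAS scoped to read/list operations over HTTPS only.
# The account name, key, and expiry window are placeholders.
from datetime import datetime, timedelta, timezone

from azure.storage.blob import AccountSasPermissions, ResourceTypes, generate_account_sas

sas_token = generate_account_sas(
    account_name="<storage-account-name>",
    account_key="<storage-account-key>",
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=5),
    protocol="https",  # assumption: restricts the SAS to HTTPS requests
)
print(sas_token)
```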

30
Q

As a medical insurance provider, you are undergoing an annual insurance claims audit. You need to provide access to records for the prior calendar year to a claims reviewer for the next five business days. The records are organized by client and by year, and are distributed across multiple Azure storage accounts as your storage needs grow.

This reviewer does not currently have any system access via your Azure Active Directory tenant.

You need to limit access to records for the prior calendar year, and prevent the reviewer from performing any update or delete operations. You also want to limit access to the time period of five business days.

What choice provides the required access securely and with the greatest administrative efficiency?

A. Provide access to the files using shared access signatures (SAS) limiting access to the specific files created in the prior calendar year for the five business days.
B. Copy all relevant records to a separate storage account in Azure Storage. Create an Azure AD guest user through the Azure portal for the claims reviewer, and provide the reviewer with the access key for only that storage account. Remove the guest user account after five business days.
C. Create an Azure AD guest user through the Azure portal for the claims reviewer. Give the guest user permission to access the Azure storage accounts with relevant records via Azure Key Vault permissions. Remove the guest user account after five business days.
D. Create an Azure AD guest user through the Azure portal for the claims reviewer. Create a custom role using role-based access controls (RBAC) to grant authorization to read the specific files. Remove the guest user account after five business days.

A

A. Provide access to the files using shared access signatures (SAS) limiting access to the specific files created in the prior calendar year for the five business days.

Explanation:
This question requires a basic understanding of Azure AD, role-based access controls, Azure Key Vault, and Shared Access Signatures. All the options are relatively feasible, but the question asks which choice is the most secure and efficient.

Creating a permanent role with assigned access to files for a temporary issue is not best practice, and is not very efficient compared to the best option, which is using shared access signatures. Shared Access Signatures are available for Azure Storage, and allow you to limit access to a specific set of files, for a specific period of time, and with limited authorization to perform only approved actions.

Microsoft's documentation provides an overview of shared access signatures and Azure Key Vault, but to understand all the related services discussed in this question, you will need to review Cloud Academy’s Managing Role-Based Access Controls course and Designing for Azure Identity Management.
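
For comparison with the account-level SAS in the previous question, a service-level SAS for one specific blob can be generated as sketched below; the account, container, and blob names, the key, and the five-business-day expiry are placeholders.

```python
# pip install azure-storage-blob
# Sketch: read-only SAS for a single blob, valid for roughly five business days.
# Account, container, and blob names, plus the key, are placeholders.
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

sas_token = generate_blob_sas(
    account_name="<storage-account-name>",
    container_name="claims-2023",
    blob_name="client-a/claims.json",
    account_key="<storage-account-key>",
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=7),  # spans five business days
)

url = (
    "https://<storage-account-name>.blob.core.windows.net/"
    f"claims-2023/client-a/claims.json?{sas_token}"
)
print(url)
```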

31
Q

Which of these roles are able to make changes to the default security policy in Microsoft Defender for Cloud? (Choose 2 answers).

A. End User
B. Contributor
C. Subscriber
D. Owner

A

B. Contributor
D. Owner

Explanation:
The ability to make changes to the default security policy requires you to be an owner, contributor, or a security administrator of the Azure subscription.

32
Q

Which of the following validation methods can be implemented through the use of Visual Studio Enterprise?

A. custom telemetry tests
B. URL ping tests
C. playback of recorded web requests
D. custom attack surface reviews

A

C. playback of recorded web requests

Explanation:
Using Visual Studio, you can play back a recorded series of web requests against your website. These multi-step tests are created in Visual Studio Enterprise.

33
Q

You have deployed multiple AKS nodes with a Linux operating system and as a security best practice, you want to regularly review and update security settings for the operating system.

Which statements below describe how security patches are applied to AKS nodes with a Linux operating system? (Choose 2 answers)

A. All node operating systems are automatically upgraded nightly.
B. AKS automatically applies the latest updates.
C. AKS automatically reboots nodes as required to complete an update.
D. AKS automatically performs node pool upgrades nightly.

A

A. All node operating systems are automatically upgraded nightly.
B. AKS automatically applies the latest updates.

Explanation:
In the case of Linux nodes, Azure automatically applies the latest OS security patches on a nightly basis. However, it’s important to note that if a particular Linux OS update requires a host reboot, that reboot is NOT automatically performed. Instead, you can manually reboot the node when it’s convenient. You can also use Kured, which is an open-source reboot daemon for Kubernetes.

34
Q

The CIS Microsoft Azure Foundations Security Benchmark provides several recommended best practices related to identity and access management.

Each of the following is a best practice except for which one?

A. Avoid unnecessary guest user accounts in Azure Active Directory
B. Enable Azure Multi-Factor Authentication (MFA)
C. Establish intervals for reviewing user authentication methods
D. Enable Self-Service Group Management

A

D. Enable Self-Service Group Management

Explanation:
CIS bases its recommendations on two different implementation levels, Level 1 and Level 2, and groups them into several categories.

Level 1 recommendations are the minimum recommended security settings that should be configured on ALL systems. They typically cause little or no interruption of services, nor do they usually result in reduced functionality.

Level 2 recommendations are designed for highly secure environments. That being the case, they can sometimes result in reduced functionality of the systems they are implemented on.

Each of these choices is a valid recommendation except enabling Self-Service Group Management. With this feature, you can enable users to create and manage their own security groups or Office 365 groups in Azure Active Directory (Azure AD). The owner of the group can approve or deny membership requests, and can delegate control of group membership.

A level 2 recommendation is to disable this feature, but this is intended for highly secure environments and disabling this could arguably compromise efficiency for the sake of security.

35
Q

In comparison to Azure Kubernetes Service (AKS) clusters, how does the Azure Container Instances (ACI) service offer an increased level of security better suited for multi-tenant environments?

A. ACI offers fine-grained permission controls through Azure Active Directory that is currently not supported by AKS.
B. ACI can be deployed into VNets while AKS clusters cannot.
C. ACI offers greater application isolation with the use of a hypervisor.
D. ACI integrates with Azure Security Center while AKS clusters do not.

A

C. ACI offers greater application isolation with the use of a hypervisor.

Explanation:
Historically, containers have offered application dependency isolation and resource governance but have not been considered sufficiently hardened for hostile multi-tenant usage. Azure Container Instances guarantees your application is as isolated in a container as it would be in a VM.

36
Q

Your team is spending too much time recovering from unplanned events, specifically when small resource updates occur that disrupt service operations, or noncompliant resources are created.

You want to automate a process to review log data related to resource updates. You also need to design specific queries and potentially alerts related to these kinds of noncompliant resource updates.

What type of logs would you analyze, and with what Azure service?

A. Analyze activity logs with Log Analytics
B. Analyze diagnostic logs with Event Grid
C. Analyze application logs with Stream Analytics
D. Analyze diagnostic logs with Event Hub

A

A. Analyze activity logs with Log Analytics

Explanation:
There are three types of logs to be aware of: activity logs, diagnostic logs, and application logs (guest OS logs). It helps to consider where these logs exist within an Azure subscription in relation to the resources they monitor. A non-compute resource, such as a network security group, is tightly integrated and delivered through Azure providers. A compute resource is a virtual machine with a guest OS, like Windows or Linux, with an application installed such as IIS or Apache.

Activity logs provide a record of operations executed against a resource at the subscription level. For example, when administrative tasks are performed on the resource, like creating a resource or updating the properties of an existing resource, an event is generated in the activity log. Diagnostic logs are collected within a subscription at the Azure resource level for services like VPN gateways or network security groups. Not all Azure services have an option for diagnostic logging, and the level of detail you can capture varies; you can view a full list of resources that support diagnostic logging on the Microsoft Azure website. Application logs are generated by applications or services within a guest OS and are collected from within the operating system through an agent. They can be collected from core services, like Windows Event logs, or from applications like IIS. Diagnostic logging can be enabled in a couple of ways: using the Azure portal, PowerShell, the Azure CLI, or the REST API via Azure Resource Manager.

37
Q

Which Azure domain service is based in Azure rather than on-premises, and is designed to help migrate on-premises applications that need Active Directory Domain Services authentication to the cloud?

A. Do-It-Yourself Active Directory Domain Services
B. Azure AD Standalone
C. Azure Active Directory Hybrid ID Solution
D. Azure Active Directory Domain Services Solution

A

D. Azure Active Directory Domain Services Solution

Explanation:
The Azure AD Domain Services solution is a cloud-based, lightweight option for meeting on-premises identity requirements for network application development and testing. It isn’t meant to replace your on-premises identity solution, but rather to act as a mechanism to help migrate on-premises applications that require AD DS authentication methods to the cloud.

38
Q

When configuring Azure Firewall, which type of rule is specific to Azure Firewall and allows it to access fully qualified domain names from a subnet?

A. Application rules
B. Network rules
C. Network Security Group rules
D. Application Security Group rules

A

A. Application rules

Explanation:
Azure Firewall supports rules and rule collections. A rule collection is a set of rules that share the same order and priority. Rule collections are executed in order of their priority. Network rule collections are higher priority than application rule collections, and all rules are terminating.

There are three types of rule collections:

Application rules: Configure fully qualified domain names (FQDNs) that can be accessed from a subnet.
Network rules: Configure rules that contain source addresses, protocols, destination ports, and destination addresses.
NAT rules: Configure DNAT rules to allow incoming connections.

39
Q

How many default data classification labels are there when you enable Azure Information Protection?

A. 3
B. 4
C. 5
D. 6

A

B. 4

Explanation:
The default Azure Information Protection classification labels are:

Personal
General
Confidential
Highly Confidential

40
Q

You are configuring security settings for your Azure Data Lake, and want to integrate a Data Lake service endpoint within an existing VNet. Which steps should you implement to configure this? (Choose 2 answers)

A. Configure your Azure Data Lake in the same resource group as your VNet
B. Configure a Microsoft Azure Active Directory Service endpoint
C. Deploy the endpoint in your selected VNET
D. Disable connectivity from Azure services outside of the selected VNET

A

B. Configure a Microsoft Azure Active Directory Service endpoint
C. Deploy the endpoint in your selected VNET

Explanation:
To use virtual network integration with data lake storage gen1, you must create a virtual network in the same region as your data lake storage account. You need to configure a service endpoint with the Microsoft Azure Active Directory as the service. After creating your virtual network in the same region as your data lake, you need to go to your data lake and click on Firewall and virtual networks. Choose the Selected network radio button and then Add existing virtual network. In the Add networks blade, select your virtual network and the subnet and click Add. Below the firewall section under exceptions, you can enable connectivity from Azure services outside of your selected network.