test1 Flashcards
(75 cards)
Case Study
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.
To start the case study
To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.
Overview
Contoso, Ltd. is a manufacturing company that has 15,000 employees.
The company uses SAP for sales and manufacturing.
Contoso has sales offices in New York and London and manufacturing facilities in Boston and Seattle.
Existing Environment
Active Directory
The network contains an on-premises Active Directory domain named ad.contoso.com. User email addresses use a domain name of contoso.com.
SAP Environment
The current SAP environment contains the following components:
- SAP Solution Manager
- SAP ERP Central Component (SAP ECC)
- SAP Supply Chain Management (SAP SCM)
- SAP application servers that run Windows Server 2008 R2
- SAP HANA database servers that run SUSE Linux Enterprise Server 12 (SLES 12)
Problem Statements
Contoso identifies the following issues in its current environment:
- The SAP HANA environment lacks adequate resources.
- The Windows servers are nearing the end of support.
- The datacenters are at maximum capacity.
Requirements
Planned Changes
Contoso identifies the following planned changes:
- Deploy Azure Virtual WAN.
- Migrate the application servers to Windows Server 2016.
- Deploy ExpressRoute connections to all of the offices and manufacturing facilities.
- Deploy SAP landscapes to Azure for development, quality assurance, and production.
All resources for the production landscape will be in a resource group named SAP Production.
Business Goals
Contoso identifies the following business goals:
- Minimize costs whenever possible.
- Migrate SAP to Azure without causing downtime.
- Ensure that all SAP deployments to Azure are supported by SAP.
- Ensure that all the production databases can withstand the failure of an Azure region.
- Ensure that all the production application servers can restore daily backups from the last 21 days.
Technical Requirements
Contoso identifies the following technical requirements:
- Inspect all web queries.
- Deploy an SAP HANA cluster to two datacenters.
- Minimize the bandwidth used for database synchronization.
- Use Active Directory accounts to administer Azure resources.
- Ensure that each production application server has four 1-TB data disks.
- Ensure that an application server can be restored from a backup created during the last five days within 15 minutes.
- Implement an approval process to ensure that an SAP administrator is notified before another administrator attempts to make changes to the Azure virtual machines that host SAP.
It is estimated that during the migration, the bandwidth required between Azure and the New York office will be 1 Gbps. After the migration, a traffic burst of up to 3 Gbps will occur.
Proposed Backup Policy
An Azure administrator proposes the backup policy shown in the following exhibit.
Policy name:
✅ SapPolicy
Backup schedule
Frequency: Daily
Time: 3:30 AM
Timezone: (UTC) Coordinated Universal Time
Instant Restore
Retain instant recovery snapshot(s) for 5 Day(s)
Retention range
✅ Retention of daily backup point
At: 3:30 AM
For: 14 Day(s)
✅ Retention of weekly backup point
On: Sunday
At: 3:30 AM
For: 8 Week(s)
✅ Retention of monthly backup point
Week Based - Day Based
On: First Sunday
At: 3:30 AM
For: 12 Month(s)
✅ Retention of yearly backup point
Week Based - Day Based
In: January
On: First Sunday
At: 3:30 AM
For: 7 Year(s)
An Azure administrator provides you with the Azure Resource Manager template that will be used to provision the production application servers.
{
  "apiVersion": "2017-03-30",
  "type": "Microsoft.Compute/virtualMachines",
  "name": "[parameters('vmname')]",
  "location": "EastUS",
  "dependsOn": [
    "[resourceId('Microsoft.Network/networkInterfaces/', parameters('vmname'))]"
  ],
  "properties": {
    "hardwareProfile": {
      "vmSize": "[parameters('vmSize')]"
    },
    "osProfile": {
      "computerName": "[parameters('vmname')]",
      "adminUsername": "[parameters('adminUsername')]",
      "adminPassword": "[parameters('adminPassword')]"
    },
    "storageProfile": {
      "ImageReference": {
        "publisher": "MicrosoftWindowsServer",
        "offer": "WindowsServer",
        "sku": "2016-datacenter",
        "version": "latest"
      },
      "osDisk": {
        "name": "[concat(parameters('vmname'), '-OS')]",
        "caching": "ReadWrite",
        "createOption": "FromImage",
        "diskSizeGB": 128,
        "managedDisk": {
          "storageAccountType": "[parameters('storageAccountType')]"
        }
      },
      "copy": [
        {
          "name": "DataDisks",
          "count": "[parameters('diskCount')]",
          "input": {
            "caching": "None",
            "diskSizeGB": 1024,
            "lun": "[copyIndex('datadisks')]"
          }
        }
      ]
    }
  }
}
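The data-disk copy loop expands once per diskCount, each iteration adding a 1,024-GB disk at the LUN returned by copyIndex. A hedged deployment sketch, assuming the full template containing this resource is saved as sapapp.json and the matching network interface already exists; the file name, VM size, and credentials are illustrative, and diskCount=4 is what would satisfy the case study requirement of four 1-TB data disks per production application server:

# Deploy the template into the production resource group named in the case study
az deployment group create \
  --resource-group '<production resource group>' \
  --template-file sapapp.json \
  --parameters vmname=sapapp01 vmSize=Standard_E16s_v3 \
               adminUsername=sapadmin adminPassword='<password>' \
               storageAccountType=Premium_LRS diskCount=4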
Topic 3, Misc. Questions
You plan to migrate an SAP HANA instance to Azure.
You need to gather CPU metrics from the last 24 hours from the instance.
Solution: You query views from SAP HANA Studio.
Does this meet the goal?
Yes
No
Correct Answer: Yes
Why It’s Correct:
SAP HANA Studio is a supported tool for monitoring and managing SAP HANA instances, including gathering performance metrics like CPU usage.
System views in SAP HANA (e.g., M_HOST_RESOURCE_UTILIZATION) typically store resource utilization data, including CPU metrics, for at least 24 hours in a standard configuration, which meets the goal of gathering this data for migration planning.
The AZ-120 exam tests knowledge of SAP HANA administration and Azure migration preparation. Using SAP HANA Studio to query performance metrics is a practical and SAP-supported approach for an on-premises environment, making it the closest correct answer.
No Azure-specific tools are required at this stage since the instance is still on-premises, and the question focuses on gathering metrics from the existing SAP HANA instance.
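For illustration, the same monitoring views that HANA Studio reads can also be queried from the command line with hdbsql. A rough sketch, assuming a single-container instance 03 (SQL port 30315) and a hypothetical monitoring user; verify view and column names against your HANA revision:

# Current host-level CPU figures (the view mentioned above)
hdbsql -n hanahost:30315 -u MONITOR -p '<password>' \
  "SELECT HOST, TOTAL_CPU_USER_TIME, TOTAL_CPU_SYSTEM_TIME, TOTAL_CPU_IDLE_TIME FROM SYS.M_HOST_RESOURCE_UTILIZATION"

# Load history for the last 24 hours (the data behind HANA Studio's Load graph)
hdbsql -n hanahost:30315 -u MONITOR -p '<password>' \
  "SELECT * FROM SYS.M_LOAD_HISTORY_HOST WHERE TIME > ADD_DAYS(CURRENT_TIMESTAMP, -1) ORDER BY TIME"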
You plan to migrate an SAP HANA instance to Azure.
You need to gather CPU metrics from the last 24 hours from the instance.
Solution: You run SAP HANA Quick Sizer.
Does this meet the goal?
Yes
No
Correct Answer: No
Why It’s Correct:
SAP HANA Quick Sizer is designed to calculate resource requirements (e.g., CPU, memory) for an SAP HANA deployment based on business workload inputs, not to gather or analyze historical performance data like CPU metrics from the last 24 hours.
The goal requires actual CPU usage data from the running SAP HANA instance, which Quick Sizer cannot provide. Tools like SAP HANA Studio, SAP HANA Cockpit, or OS-level monitoring are needed instead.
For the AZ-120 exam, understanding the distinction between sizing tools (like Quick Sizer) and monitoring tools (like SAP HANA Studio) is key. Since the solution does not align with the requirement, No is the correct answer.
You have an on-premises SAP environment hosted on VMware vSphere that uses Microsoft SQL Server as the database platform. You plan to migrate the environment to Azure. The database platform will remain the same. You need to gather information to size the target Azure environment for the migration.
What should you use?
Azure Monitor
the SAP HANA sizing report
the SAP EarlyWatch Alert report
Azure Advisor
Correct Answer: The SAP EarlyWatch Alert report
Why It’s Correct:
The SAP EarlyWatch Alert report provides detailed performance and resource utilization data (e.g., CPU, memory, database IOPS) from the existing on-premises SAP environment, including the Microsoft SQL Server database. This data is critical for sizing the target Azure environment, such as selecting appropriate Azure VM types (e.g., E-series for SQL Server) and storage configurations (e.g., Premium SSD).
Unlike Azure Monitor and Azure Advisor, which are Azure-specific and require the workload to be in Azure, EarlyWatch Alert works with the on-premises system. The SAP HANA sizing report is irrelevant because the database is SQL Server, not HANA.
For the AZ-120 exam, the EarlyWatch Alert report is a recognized tool for SAP migration planning, making it the closest and most correct answer for gathering sizing information.
You plan to migrate an SAP HANA instance to Azure.
You need to gather CPU metrics from the last 24 hours from the instance.
Solution: You use Monitoring from the SAP HANA Cockpit.
Does this meet the goal?
Yes
No
Correct Answer: Yes
Why It’s Correct:
SAP HANA Cockpit’s Monitoring feature allows administrators to view historical CPU metrics, typically covering at least the last 24 hours in a standard configuration, meeting the goal of gathering this data for migration planning.
It is a native SAP HANA tool, well-suited for monitoring an on-premises instance, and does not require Azure-specific tools since the system has not yet been migrated.
For the AZ-120 exam, understanding how to use SAP HANA Cockpit for performance monitoring is relevant to migration preparation, as it provides the data needed to size Azure VMs (e.g., M-series for SAP HANA). Thus, Yes is the correct answer.
HOTSPOT
For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.
Statements Yes No
Oracle Real Application Clusters (RAC) can be used to provide high availability of SAP databases on Azure.
You can host SAP databases on Azure by using Oracle on a virtual machine that runs Windows Server 2016.
You can host SAP databases on Azure by using Oracle on a virtual machine that runs SUSE Linux Enterprise Server 12 (SLES 12).
Summary of Answers
Statement 1: No (Oracle RAC is not supported for SAP on Azure)
Statement 2: Yes (Oracle on Windows Server 2016 is supported)
Statement 3: Yes (Oracle on SLES 12 is supported)
Why These Are Correct
Statement 1 (No): Oracle RAC’s lack of support on Azure for SAP workloads is a key distinction in the AZ-120 exam. The preference for Data Guard aligns with Azure’s architecture and SAP’s certification requirements.
Statement 2 (Yes): Windows Server 2016 is a fully supported OS for Oracle databases in Azure SAP deployments, reflecting flexibility in OS choices for SAP customers.
Statement 3 (Yes): SLES 12 is widely used and certified for SAP-on-Oracle deployments, making it a standard option in Azure.
A company named Contoso, Ltd. has users across the globe. Contoso is evaluating whether to migrate SAP to Azure.
The SAP environment runs on SUSE Linux Enterprise Server (SLES) servers and SAP HANA databases.
The Suite on HANA database is 4 TB.
You need to recommend a migration solution to migrate SAP application servers and the SAP HANA databases. The solution must minimize downtime.
Which migration solutions should you recommend? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Answer Area
SAP application servers:
⏷ AzCopy
⏷ Azure Site Recovery
⏷ SAP HANA system replication
⏷ System Copy for SAP Systems
SAP HANA databases:
⏷ AzCopy
⏷ Azure Site Recovery
⏷ SAP HANA system replication
⏷ System Copy for SAP Systems
Correct Answer:
SAP application servers: Azure Site Recovery
SAP HANA databases: SAP HANA system replication
Recommended Solutions:
SAP Application Servers: Azure Site Recovery
Why Correct:
ASR supports migrating SLES-based VMs (SAP application servers) from on-premises to Azure with continuous replication and a planned failover, minimizing downtime to a brief cutover window.
It’s widely used for SAP application server migrations in Azure and aligns with AZ-120 exam objectives for VM-based migrations.
Unlike AzCopy or System Copy, ASR avoids lengthy offline data transfers, and SAP HANA system replication is irrelevant for application servers.
SAP HANA Databases: SAP HANA system replication
Why Correct:
SAP HANA system replication enables near-zero downtime migration by replicating the 4 TB database to an Azure-based HANA instance (e.g., on an M-series VM). Once replication is complete, a quick switchover minimizes downtime.
It’s SAP-certified, supports large databases, and is recommended by SAP and Microsoft for HANA migrations to Azure with minimal disruption.
ASR isn’t suitable for database consistency, AzCopy requires significant downtime, and System Copy involves prolonged outages, making HANA system replication the best choice.
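For context, the replication pair itself is set up with hdbnsutil on each side. A minimal sketch, assuming hypothetical site names and host names, instance number 03, and asynchronous replication for the initial copy over the WAN (which also keeps synchronization bandwidth low), followed by a takeover at cutover:

# On the on-premises primary, as the <sid>adm user: enable system replication
hdbnsutil -sr_enable --name=SITE_ONPREM

# On the Azure secondary (HANA installed, system PKI/SSFS keys copied from the primary): register against the primary
hdbnsutil -sr_register --remoteHost=onprem-hana01 --remoteInstance=03 \
  --replicationMode=async --operationMode=logreplay --name=SITE_AZURE

# At the cutover window, promote the Azure side and repoint the application servers
hdbnsutil -sr_takeover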
Your company has an on-premises SAP environment.
Recently, the company split into two companies named Litware, Inc. and Contoso, Ltd. Litware retained the SAP environment.
Litware plans to export data that is relevant only to Contoso. The export will be 1.5 TB.
Contoso builds a new SAP environment on Azure.
You need to recommend a solution for Litware to make the data available to Contoso in Azure.
The solution must meet the following requirements:
- Minimize the impact on the network.
- Minimize the administrative effort for Litware.
What should you include in the recommendation?
Azure Migrate
Azure Data Box
Azure Site Recovery
Azure Import/Export service
Recommended Solution: Azure Data Box
Correct Answer: Azure Data Box
Reasoning:
Minimizes Network Impact: The 1.5 TB of data is transferred offline via a physical device, avoiding the need for a high-bandwidth internet transfer that could strain Litware’s network.
Minimizes Administrative Effort: Microsoft ships the Data Box to Litware, who only needs to export the SAP data to the device using standard tools (e.g., rsync or SMB). Once shipped back, Azure uploads the data to a storage account, requiring no further management from Litware. This is simpler than the Import/Export Service, which involves managing disks.
SAP Context: For SAP environments, exporting data (e.g., database dumps or file extracts) to a device like Data Box is a practical approach, especially for a one-time transfer of 1.5 TB. Contoso can then access the data from Azure storage and import it into their SAP system.
You are building an SAP environment by using Azure Resource Manager templates. The SAP environment will use Linux virtual machines.
You need to correlate the LUN of the data disks in the template to the volume of the virtual machines.
Which command should you run?
ls /dev/disk/azure/root
ls /dev/disk/azure/scsil
tree /dev/disk/azure/root
tree /dev/disk/azure/resource
Correct Answer: ls /dev/disk/azure/scsil (interpreted as ls /dev/disk/azure/scsi1)
Why It’s Correct:
Assuming “scsil” is a typo for scsi1, the command ls /dev/disk/azure/scsi1 lists the symbolic links for data disks (e.g., lun0 → /dev/sdc), allowing you to correlate the LUNs specified in the ARM template (e.g., lun: 0) to the device names on the Linux VM. This is the standard method on Azure Linux VMs for disk identification.
The other options either target the wrong path (root), use an invalid path (resource), or rely on a non-standard tool (tree) that doesn’t align with typical SAP-on-Azure administration.
For the AZ-120 exam, this aligns with the need to manage disk configurations for SAP workloads, making it the closest correct answer despite the apparent typo. If “scsil” is not a typo and no correction is intended, none of the options are perfectly valid, but scsil remains the closest match to the intended solution.
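For reference, a sketch of what that correlation looks like on an Azure Linux VM; the output shown in the comments is illustrative only:

ls -l /dev/disk/azure/scsi1
# Illustrative output: each symlink is named after the LUN from the ARM template
# and points to the block device that backs that volume, e.g.
#   lun0 -> ../../../sdc
#   lun1 -> ../../../sdd
#   lun2 -> ../../../sde
#   lun3 -> ../../../sdf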
This question requires that you evaluate the underlined text to determine if it is correct.
You have an SAP environment on Azure that uses Microsoft SQL server as the RDBMS.
You plan to migrate to an SAP HANA database.
To calculate the amount of memory and disk space required for the database, you can use SAP Quick Sizer.
Instructions: Review the underlined text. If it makes the statement correct, select "No change is needed." If the statement is incorrect, select the answer choice that makes the statement correct.
No change is needed.
Azure Migrate
/SDF/HDB_SIZING
SQL Server Management Studio (SSMS)
Final Answer
Selection: No change is needed
Reason: SAP Quick Sizer is a valid and correct tool for calculating memory and disk space requirements for an SAP HANA database in this Azure migration scenario, aligning with AZ-120 exam objectives.
Correct Answer: No change is needed
Why Correct: The bolded text, SAP Quick Sizer, makes the statement correct. SAP Quick Sizer is a widely recognized SAP tool for sizing SAP HANA systems, including during migrations, and is appropriate for estimating memory and disk space based on workload inputs. The AZ-120 exam often emphasizes SAP planning tools like Quick Sizer for infrastructure sizing on Azure, and the statement aligns with this context. While /SDF/HDB_SIZING is a more precise tool for analyzing an existing database during a migration, the question’s phrasing (“you can use”) doesn’t mandate the most specific tool—Quick Sizer is sufficient and correct.
Why Others Are Incorrect:
Azure Migrate: Not relevant for database sizing or SAP HANA.
/SDF/HDB_SIZING: More specific but not required by the statement’s broad wording.
SSMS: Lacks SAP HANA sizing capabilities.
You are deploying an SAP production landscape to Azure.
Your company’s chief information security officer (CISO) requires that the SAP deployment complies with ISO 27001.
You need to generate a compliance report for ISO 27001.
What should you use?
Azure Security Center
Azure Log Analytics
Azure Active Directory (Azure AD)
Azure Monitor
Correct Answer: Azure Security Center
Reasoning:
ISO 27001 Compliance: ISO 27001 is an international standard for information security management systems (ISMS). It requires organizations to assess and manage risks, implement controls, and demonstrate compliance through auditable reports. Azure Security Center (Microsoft Defender for Cloud) provides a Regulatory Compliance Dashboard that specifically supports tracking and reporting compliance with standards like ISO 27001.
Functionality: Azure Security Center offers built-in tools to assess the security posture of your Azure resources, map them to ISO 27001 controls, and generate compliance reports. It includes features like continuous monitoring, security recommendations, and the ability to export compliance data, which are essential for meeting the CISO’s requirement to generate a report.
SAP on Azure Context: For an SAP deployment, which involves virtual machines, databases (e.g., SAP HANA), and networking components, Azure Security Center can evaluate the security and compliance of these resources against ISO 27001 standards. This is critical for a production landscape where security and compliance are paramount.
Exam Relevance (AZ-120): The AZ-120 exam (“Planning and Administering Microsoft Azure for SAP Workloads”) focuses on managing SAP workloads on Azure, including security and compliance. Azure Security Center is a key tool covered in this context for ensuring compliance with standards like ISO 27001.
A customer plans to migrate an enterprise SAP environment to Azure. The environment uses servers that run Windows Server 2016 and Microsoft SQL Server.
The environment is critical and requires a comprehensive business continuity and disaster recovery (BCDR) strategy that minimizes the recovery point objective (RPO) and the recovery time objective (RTO).
The customer wants a resilient environment that has a secondary site that is at least 250 kilometers away. You need to recommend a solution for the customer.
Which two solutions should you recommend? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
an internal load balancer to route Internet traffic
warm standby virtual machines in Azure Availability Zones
warm standby virtual machines in paired regions
warm standby virtual machines in an Azure Availability Set that uses geo-redundant storage (GRS)
Azure Traffic Manager to route incoming traffic.
Final Answer:
Warm standby virtual machines in paired regions: Provides the DR foundation with a secondary site 250 km away, minimizing RPO and RTO through pre-deployed VMs.
Azure Traffic Manager to route incoming traffic: Ensures business continuity by seamlessly redirecting traffic to the active site (primary or secondary), complementing the DR setup.
Azure Traffic Manager to route incoming traffic
Why Correct?
Traffic Routing: Azure Traffic Manager is a DNS-based traffic routing service that can direct user traffic to the primary site under normal conditions and failover to the secondary site (in the paired region) during a disaster. This ensures seamless redirection, minimizing RTO.
Global Resilience: It supports SAP environments by providing a single endpoint for clients, routing them to the active site (primary or secondary), which is critical for maintaining business continuity.
Complementary to Warm Standby: When paired with warm standby VMs in a secondary region, Traffic Manager ensures that once failover occurs, users are automatically directed to the secondary site without manual intervention.
SAP Relevance: For SAP applications, which often have front-end components (e.g., SAP GUI or web interfaces), Traffic Manager ensures availability of these services across regions.
Why It Fits: This addresses the business continuity aspect by ensuring users can access the SAP environment even after a failover, complementing the DR setup in paired regions.
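As a sketch only (profile name, DNS prefix, and endpoint targets are hypothetical), a priority-routed Traffic Manager profile that prefers the primary region and fails over to the paired region could be created like this:

az network traffic-manager profile create \
  --resource-group sap-dr-rg --name sap-tm \
  --routing-method Priority --unique-dns-name contoso-sap \
  --ttl 30 --protocol HTTPS --port 443 --path /

# Primary-region endpoint (priority 1) and paired-region endpoint (priority 2);
# the targets would typically be the public IPs or gateways in front of the SAP front ends.
az network traffic-manager endpoint create \
  --resource-group sap-dr-rg --profile-name sap-tm --name primary \
  --type azureEndpoints --priority 1 \
  --target-resource-id "<resource ID of the primary region front end>"

az network traffic-manager endpoint create \
  --resource-group sap-dr-rg --profile-name sap-tm --name secondary \
  --type azureEndpoints --priority 2 \
  --target-resource-id "<resource ID of the paired region front end>"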
Warm standby virtual machine in an Azure Availability Set that uses geo-redundant storage (GRS)
Why Incorrect:
Availability Set Limitation: An Availability Set ensures VMs are spread across fault domains and update domains within a single region, not across regions. It doesn’t support the 250 km separation.
GRS Limitation: Geo-redundant storage (GRS) replicates data to a secondary region (meeting the distance requirement), but it’s a storage-level solution, not a VM-level solution. Warm standby VMs require compute readiness, not just storage replication, and GRS alone doesn’t ensure VM failover.
Ambiguity: The phrasing suggests a single-region solution with GRS, which doesn’t fully align with a comprehensive DR strategy for SAP VMs.
Context: This might contribute to data resilience, but it doesn’t provide a full DR solution with warm VMs in a secondary site.
HOTSPOT
You have SAP ERP on Azure.
For SAP high availability, you plan to deploy ASCS/ERS instances across Azure Availability Zones and to use failover clusters.
For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.
Statements Yes No
To create a failover solution, you can use an Azure Basic Load Balancer for Azure virtual machines deployed across the Azure Availability Zones. ⭘ ⭘
You can deploy Azure Availability Sets within an Azure Availability Zone. ⭘ ⭘
The solution must use Azure managed disks. ⭘ ⭘
Final Answers:
To create a failover solution, you can use an Azure Basic Load Balancer for Azure virtual machines deployed across the Azure Availability Zones.
No (Requires Azure Standard Load Balancer for cross-zone failover support.)
You can deploy Azure Availability Sets within an Azure Availability Zone.
No (Availability Sets and Zones are distinct HA mechanisms; this scenario uses Zones.)
The solution must use Azure managed disks.
Yes (Managed disks are required for zonal deployments and SAP HA.)
Statement 1: “To create a failover solution, you can use an Azure Basic Load Balancer for Azure virtual machines deployed across the Azure Availability Zones.”
Answer: No
Reasoning: For failover cluster solutions involving SAP ASCS/ERS instances across Azure Availability Zones, you cannot use an Azure Basic Load Balancer. The Basic Load Balancer does not support cross-zone load balancing or the health probe functionality required for failover clusters (e.g., Windows Server Failover Clustering or Linux Pacemaker). Instead, the Azure Standard Load Balancer is required because it supports Availability Zones, provides health probes to detect active cluster nodes, and ensures proper routing in a high-availability (HA) setup. This is a key requirement for SAP HA deployments on Azure, as the Standard SKU is explicitly recommended for such scenarios.
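To make the point concrete, a hedged sketch of the Standard internal load balancer used in such a cluster; the resource names, subnet, and probe port are hypothetical, and the probe port must match what the cluster resource agent answers on:

# Zone-redundant Standard internal load balancer for the ASCS/ERS cluster
az network lb create --resource-group sap-prod-rg --name ascs-ilb --sku Standard \
  --vnet-name sap-vnet --subnet sap-app-subnet \
  --frontend-ip-name ascs-frontend --backend-pool-name ascs-backend

# Health probe used by the cluster to signal the active node (port is illustrative)
az network lb probe create --resource-group sap-prod-rg --lb-name ascs-ilb \
  --name ascs-probe --protocol Tcp --port 62000

# HA-ports rule with floating IP, so the ASCS virtual IP follows the active cluster node
az network lb rule create --resource-group sap-prod-rg --lb-name ascs-ilb \
  --name ascs-rule --protocol All --frontend-port 0 --backend-port 0 \
  --frontend-ip-name ascs-frontend --backend-pool-name ascs-backend \
  --probe-name ascs-probe --floating-ip true --idle-timeout 30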
Statement 2: "You can deploy Azure Availability Sets within an Azure Availability Zone."
Answer: No
Reasoning: A VM is deployed either into an Availability Set or into a specific Availability Zone, not both, so an Availability Set cannot be placed inside an Availability Zone. This matches the final answer above; the zonal deployment in this scenario relies on Availability Zones alone.
Statement 3: “The solution must use Azure managed disks.”
Answer: Yes
Reasoning: When deploying VMs across Azure Availability Zones for SAP workloads, Azure managed disks are mandatory. Unmanaged disks (e.g., those using storage accounts) are not supported for zonal deployments because they lack the flexibility and resilience required for HA configurations across zones. Managed disks provide features like zone-redundant storage (ZRS) or local redundancy within a zone, which align with SAP HA requirements. For failover clusters (e.g., using Azure shared disks for Windows or NFS for Linux), managed disks ensure consistent performance and availability, making them a requirement for this solution.
You deploy an SAP environment on Azure.
Your company has a Service Level Agreement (SLA) of 99.99% for SAP.
You implement Azure Availability Zones that have the following components:
- Redundant SAP application servers
- ASCS/ERS instances that use a failover cluster
- Database high availability that has a primary instance and a secondary instance
You need to validate the load distribution to the application servers.
What should you use?
SAP Solution Manager
Azure Monitor
SAPControl
SAP Web Dispatcher
Final Answer:
SAP Web Dispatcher
Why SAP Web Dispatcher is the closest to the correct answer:
The question asks for a tool to “validate the load distribution to the application servers.” In an SAP environment deployed on Azure with redundant application servers, the SAP Web Dispatcher is the component responsible for distributing the load. By examining its configuration and logs, you can validate how traffic is being allocated across the servers. This aligns with the requirements of the Azure AZ-120 exam, which tests knowledge of SAP workload management on Azure, including high-availability configurations and load balancing.
HOTSPOT
For each of the following statements, select yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.
Statements Yes No
You can use NIPING to examine network latency between an SAP HANA database server and an SAP application server hosted on Azure. ⭘ ⭘
You can use LoadRunner to generate traffic between a client and an SAP application server hosted on Azure. ⭘ ⭘
You can use the SAP HANA HW Configuration Check Tool (HWCCT) to examine network latency between an SAP HANA database server and an SAP application server hosted on Azure. ⭘ ⭘
Final Answers:
- You can use NIPING to examine network latency between an SAP HANA database server and an SAP application server hosted on Azure: Yes
- You can use LoadRunner to generate traffic between a client and an SAP application server hosted on Azure: Yes
- You can use the SAP HANA HW Configuration Check Tool (HWCCT) to examine network latency between an SAP HANA database server and an SAP application server hosted on Azure: No
Summary of Reasoning:
NIPING (Yes): A dedicated SAP tool for measuring network latency between SAP components, applicable in Azure for HANA and application server communication.
LoadRunner (Yes): A performance testing tool that can simulate client traffic to an SAP application server, widely used for SAP workload testing on Azure.
HWCCT (No): Focused on validating SAP HANA hardware performance, not specifically designed for measuring latency between HANA and application servers.
HOTSPOT
You have an SAP environment on Azure that contains a single-tenant SAP HANA server at instance 03.
You need to monitor the network throughput from an SAP application server to the SAP HANA server.
How should you complete the script? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
$HANA = [dropdown 1] -Name HANAP01-NIC -ResourceGroupName Production
Dropdown 1 options:
⏷ Get-AzNetworkInterface
⏷ Get-AzNetworkUsage
⏷ Get-AzNetworkWatcher
⏷ Get-AzVM
$APP = Get-AzVM -Name AppP01 -ResourceGroupName Production
New-AzNetworkWatcherConnectionMonitor -NetworkWatcher (Get-AzNetworkWatcher)
-Name HANA -DestinationAddress (($HANA).IpConfigurations.PrivateIPAddress)
-DestinationPort [dropdown 2]
Dropdown 2 options:
⏷ 1433
⏷ 1434
⏷ 30115
⏷ 30315
-SourceResourceId $APP.Id
Final Answers:
$HANA = [dropdown 1] -Name HANAP01-NIC -ResourceGroupName Production
Answer: Get-AzNetworkInterface
Why Correct: This cmdlet retrieves the network interface object, providing access to the private IP address required for the connection monitor.
-DestinationPort [dropdown 2]
Answer: 30315
Why Correct: This is the SQL port for SAP HANA instance 03, matching the scenario’s requirement for monitoring HANA communication.
Why These Are Correct for AZ-120:
Network Monitoring: The AZ-120 exam tests knowledge of Azure tools like Network Watcher for monitoring SAP workloads. New-AzNetworkWatcherConnectionMonitor is the correct cmdlet for monitoring network performance (e.g., throughput) between VMs, requiring the source VM ID, destination IP, and port.
SAP HANA Specifics: Understanding HANA’s port numbering (e.g., 3<instance number>15, so 30315 for instance 03) is critical for SAP-on-Azure deployments, a key focus of the exam.
Azure Resources: Using Get-AzNetworkInterface aligns with Azure’s resource model, where NICs are separate objects that hold IP configurations, a concept tested in AZ-120.
HOTSPOT
You are deploying an SAP environment across Azure Availability Zones.
The environment has the following components:
✑ ASCS/ERS instances that use a failover cluster
✑ SAP application servers across the Azure Availability Zones
✑ Database high availability by using a native database solution
For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.
Statements Yes No
Network latency is a limiting factor when deploying DBMS instances that use synchronous replication across the Azure Availability Zones. ⭘ ⭘
The performance of SAP systems can be validated by using ABAPMeter. ⭘ ⭘
To help identify the best Azure Availability Zones for deploying the SAP components, you can use NIPING to verify network latency between the zones. ⭘ ⭘
Final Answers:
- Network latency is a limiting factor when deploying DBMS instances that use synchronous replication across the Azure Availability Zones: Yes
- The performance of SAP systems can be validated by using ABAPMeter: Yes
- To help identify the best Azure Availability Zones for deploying the SAP components, you can use NIPING to verify network latency between the zones: Yes
Statement 1:
“Network latency is a limiting factor when deploying DBMS instances that use synchronous replication across the Azure Availability Zones.”
Answer: Yes
Why it’s correct:
Synchronous replication in a database management system (DBMS) requires that data written to the primary instance is replicated to the secondary instance before the transaction is considered complete. When deploying across Azure Availability Zones (which are physically separated data centers within the same region), network latency between zones can impact the performance of synchronous replication. Azure documentation and SAP best practices emphasize that low latency (typically less than 2-5 milliseconds) is critical for synchronous replication to avoid performance degradation. Since Availability Zones are geographically distinct, network latency is indeed a limiting factor that must be considered when designing high-availability DBMS solutions for SAP. This aligns with AZ-120 exam topics on planning SAP workloads with high availability.
Statement 2:
“The performance of SAP systems can be validated by using ABAPMeter.”
Answer: Yes
Why it’s correct:
ABAPMeter (transaction code STAD or related tools in SAP) is a performance analysis tool within the SAP ABAP environment. It allows administrators to measure and validate the performance of SAP systems by analyzing metrics such as response times, database calls, and CPU usage for ABAP-based transactions. In the context of an SAP environment on Azure, ABAPMeter can be used to assess the performance of SAP application servers and ensure they meet the required Service Level Agreements (SLAs). This is a valid tool for performance validation in SAP systems, making “Yes” the correct choice for this statement. This aligns with AZ-120’s focus on monitoring and optimizing SAP workloads.
Statement 3:
“To help identify the best Azure Availability Zones for deploying the SAP components, you can use NIPING to verify network latency between the zones.”
Answer: Yes
Why it’s correct:
NIPING is a network testing tool provided by SAP to measure network latency and bandwidth between systems. When planning an SAP deployment across Azure Availability Zones, verifying network latency is critical, especially for components like ASCS/ERS (failover clustering) and DBMS instances (synchronous replication), which are sensitive to delays. By running NIPING between virtual machines in different Availability Zones, you can collect latency data to determine if the zones meet SAP’s stringent network requirements (e.g., low latency for high-availability setups). This is a recommended practice in SAP-on-Azure deployments and is relevant to the AZ-120 exam’s emphasis on network planning and optimization.
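A minimal NIPING sketch, assuming the SAP kernel (which includes the niping executable) is present on a test VM in each zone and that the host name vm-zone1 resolves from the client VM; the packet size and loop count are illustrative:

# On the VM in the first Availability Zone: start niping in server mode (idle timeout disabled)
niping -s -I 0

# On the VM in the second Availability Zone: run a latency test against the server VM
# (small 10-byte packets repeated 100 times, so the result reflects round-trip latency)
niping -c -H vm-zone1 -B 10 -L 100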
HOTSPOT
For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.
Statements Yes No
When configuring an Azure virtual machine, the Azure Enhanced Monitoring features are required to monitor SAP application performance. ⭘ ⭘
To successfully start an Azure virtual machine that contains SAP, you must have Azure Enhanced Monitoring installed. ⭘ ⭘
If you deploy SAP by using the Azure Resource Manager templates for SAP, Azure Enhanced Monitoring is installed automatically. ⭘ ⭘
Final Answers:
- When configuring an Azure virtual machine, the Azure Enhanced Monitoring features are required to monitor SAP application performance: No
- To successfully start an Azure virtual machine that contains SAP, you must have Azure Enhanced Monitoring installed: No
- If you deploy SAP by using the Azure Resource Manager templates for SAP, Azure Enhanced Monitoring is installed automatically: Yes
Summary of Reasoning:
Monitoring Requirement (No): Azure Enhanced Monitoring enhances SAP performance monitoring but isn’t strictly required, as basic monitoring or SAP tools can still function without it.
VM Startup (No): Starting a VM with SAP doesn’t depend on the monitoring extension; it’s an operational add-on, not a startup necessity.
ARM Templates (Yes): Official Azure ARM templates for SAP automate the installation of Azure Enhanced Monitoring, aligning with best practices for SAP deployments.
HOTSPOT
For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.
Statements Yes No
The Azure Enhanced Monitoring Extension for SAP stores performance data in an Azure Storage account. ⭘ ⭘
You can enable the Azure Enhanced Monitoring Extension for SAP on a SUSE Linux Enterprise Server 12 (SLES 12) server by running the Set-AzVMAEMExtension cmdlet. ⭘ ⭘
You can enable the Azure Enhanced Monitoring Extension for SAP on a server that runs Windows Server 2016 by running the Set-AzVMAEMExtension cmdlet. ⭘ ⭘
Final Answers:
- The Azure Enhanced Monitoring Extension for SAP stores performance data in an Azure Storage account: No
- You can enable the Azure Enhanced Monitoring Extension for SAP on a SUSE Linux Enterprise Server 12 (SLES 12) server by running the Set-AzVMAEMExtension cmdlet: Yes
- You can enable the Azure Enhanced Monitoring Extension for SAP on a server that runs Windows Server 2016 by running the Set-AzVMAEMExtension cmdlet: Yes
Summary of Reasoning:
Storage Account (No): The extension collects and provides real-time data to SAP systems but doesn’t store it in an Azure Storage account; storage is handled by other services like Azure Monitor.
SLES 12 (Yes): The Set-AzVMAEMExtension cmdlet enables the extension on SLES 12, a supported OS for SAP, aligning with Azure’s monitoring capabilities for Linux-based SAP workloads.
Windows Server 2016 (Yes): The same cmdlet enables the extension on Windows Server 2016, another supported OS, ensuring monitoring integration for Windows-based SAP deployments.
HOTSPOT
You are integrating SAP HANA and Azure Active Directory (Azure AD).
For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.
Statements Yes No
SAP HANA supports SAML authentication for single-sign on (SSO). ⭘ ⭘
SAP HANA supports OAuth2 authentication for single-sign on (SSO). ⭘ ⭘
You can use Azure role-based access control (RBAC) to provide users with the ability to sign in to SAP HANA. ⭘ ⭘
Final Answer
SAP HANA supports SAML authentication for single-sign on (SSO): Yes
SAP HANA supports OAuth2 authentication for single-sign on (SSO): No
You can use Azure role-based access control (RBAC) to provide users with the ability to sign in to SAP HANA: No
- SAP HANA supports SAML authentication for single-sign on (SSO)
Answer: Yes
Why it’s correct:
SAP HANA supports SAML (Security Assertion Markup Language) for single sign-on (SSO). SAML is a standard protocol for exchanging authentication and authorization data between an identity provider (IdP) like Azure AD and a service provider (SP) like SAP HANA.
SAP HANA Context: Since SAP HANA 1.0 SPS 10 (and later versions), it has built-in support for SAML-based SSO. This allows users to authenticate to SAP HANA using credentials managed by an external IdP, such as Azure AD, without needing to re-enter credentials.
Azure AD Integration: Azure AD supports SAML 2.0, and you can configure it as the IdP for SAP HANA by setting up an enterprise application in Azure AD, exchanging metadata, and configuring SAP HANA’s SAML settings (e.g., via HANA Cockpit or XS Admin).
AZ-120 Relevance: SAML SSO is a common integration method for SAP HANA on Azure, making this statement true.
- SAP HANA supports OAuth2 authentication for single-sign on (SSO)
Answer: No
Why it’s incorrect:
SAP HANA does not natively support OAuth2 for SSO in the traditional sense of user authentication to the database. OAuth2 is an authorization framework commonly used for API access and token-based authentication, not for direct user SSO to the HANA database.
SAP HANA Context: While SAP HANA supports OAuth for specific scenarios (e.g., securing XS applications or REST APIs in SAP HANA Extended Application Services), this is for application-level access, not for SSO to the HANA database itself (e.g., via HANA Studio or JDBC/ODBC clients). SSO to SAP HANA relies on SAML, Kerberos, or X.509 certificates, not OAuth2.
Azure AD Comparison: Azure AD supports OAuth2 for many applications, but SAP HANA’s SSO integration with Azure AD uses SAML, not OAuth2.
AZ-120 Relevance: OAuth2 isn’t a standard SSO method for SAP HANA, making this statement false.
- You can use Azure role-based access control (RBAC) to provide users with the ability to sign in to SAP HANA
Answer: No
Why it’s incorrect:
Azure RBAC (Role-Based Access Control) is an Azure-specific authorization system that manages permissions to Azure resources (e.g., VMs, storage accounts) at the management plane. It does not directly control authentication or access to applications like SAP HANA running on those resources.
SAP HANA Context: Signing in to SAP HANA requires database-level authentication (e.g., via SAML, Kerberos, or username/password), managed within HANA’s own user management system. Azure RBAC can grant users permissions to manage the Azure VM hosting SAP HANA (e.g., start/stop), but it doesn’t provide sign-in capabilities to the HANA database itself.
Azure AD vs. RBAC: Azure AD handles identity and authentication (e.g., SSO to SAP HANA via SAML), while RBAC handles resource permissions. These are distinct mechanisms, and RBAC doesn’t integrate with SAP HANA’s authentication.
HOTSPOT
Your on-premises network contains SAP and non-SAP applications. ABAP-based SAP systems are integrated with LDAP and use user name/password-based authentication for logon. You plan to migrate the SAP applications to Azure.
For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.
Statements Yes No
Azure Active Directory (Azure AD) pass-through authentication enables users to connect to the ABAP-based SAP systems on Azure by using their on-premises user name/password. ⭘ ⭘
Azure Active Directory (Azure AD) password hash synchronization enables users to connect to the ABAP-based SAP systems on Azure by using their on-premises user name/password. ⭘ ⭘
Active Directory Federation Services (AD FS) supports authentication between on-premises Active Directory and Azure systems that use different domains. ⭘ ⭘
Final Answers:
- Azure Active Directory (Azure AD) pass-through authentication enables users to connect to the ABAP-based SAP systems on Azure by using their on-premises user name/password: No
- Azure Active Directory (Azure AD) password hash synchronization enables users to connect to the ABAP-based SAP systems on Azure by using their on-premises user name/password: No
- Active Directory Federation Services (AD FS) supports authentication between on-premises Active Directory and Azure systems that use different domains: Yes
Summary:
Statement 1: No – ABAP-based SAP systems don’t natively support Azure AD PTA for logon without additional integration.
Statement 2: No – PHS also doesn’t directly enable ABAP logon without further SSO configuration.
Statement 3: Yes – AD FS supports federation across different domains, a standard capability in hybrid Azure setups.
You deploy an SAP environment on Azure.
You need to monitor the performance of the SAP NetWeaver environment by using the Azure Enhanced Monitoring Extension for SAP.
What should you do first?
From Azure CLI, install the Linux Diagnostic Extension.
From the Azure portal, enable the Azure Network Watcher Agent.
From the Azure portal, enable the Custom Script Extension.
From Azure CLI, run the az vm aem set command.
Final Answer:
From Azure CLI, run the az vm aem set command.
Why Correct?
Direct Action: The az vm aem set command specifically installs and configures the Azure Enhanced Monitoring Extension for SAP, which is the tool required to monitor SAP NetWeaver performance. It’s a single, actionable step that meets the question’s objective.
First Step: In the context of enabling the extension, running this command is the initial action to deploy it to the VM. While prerequisites like deploying the VM and installing SAP software are assumed (since the environment is already deployed), this is the first step specific to enabling the monitoring extension.
OS Agnostic: The command works for both Windows and Linux VMs, aligning with SAP NetWeaver’s flexibility (e.g., Windows Server or SLES/RHEL), and the question doesn’t specify an OS, making it broadly applicable.
AZ-120 Alignment: The exam tests knowledge of SAP-specific tools and extensions on Azure. The az vm aem set command is a documented method for enabling the Azure Enhanced Monitoring Extension, as seen in Azure’s SAP workload documentation.
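With the syntax corrected, a minimal sketch; the resource group and VM name are placeholders:

# Install and configure the Azure Enhanced Monitoring Extension for SAP on the VM
az vm aem set --resource-group sap-prod-rg --name sapnw01

# Optionally check the extension configuration afterwards
az vm aem verify --resource-group sap-prod-rg --name sapnw01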
HOTSPOT
You have an SAP development landscape on Azure.
For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.
Statements Yes No
You can use SAP Landscape Management (LaMa) to automate stopping, starting, and deallocating SAP virtual machines. ⭘ ⭘
You can use SAP Solution Manager to automate stopping, starting, and deallocating SAP virtual machines. ⭘ ⭘
You can use SAP HANA Cockpit to automate stopping, starting, and deallocating SAP virtual machines. ⭘ ⭘
Correct Answers:
You can use SAP Landscape Management (LaMa) to automate stopping, starting, and deallocating SAP virtual machines: Yes
You can use SAP Solution Manager to automate stopping, starting, and deallocating SAP virtual machines: No
You can use SAP HANA Cockpit to automate stopping, starting, and deallocating SAP virtual machines: No
Why These Are Correct:
SAP LaMa (Yes):
LaMa is purpose-built for SAP system orchestration and, with the Azure Adapter, can automate VM operations like starting, stopping, and deallocating in Azure. This aligns with AZ-120’s focus on managing SAP landscapes in the cloud.
SAP Solution Manager (No):
SolMan is a monitoring and management tool, not an automation platform for Azure VM operations. It lacks native VM control capabilities, making it unsuitable for this task in the context of the exam.
SAP HANA Cockpit (No):
HANA Cockpit is a database-specific tool and cannot manage Azure VMs. It’s limited to HANA database operations, not infrastructure automation, as required by the AZ-120 objectives.
You migrate an SAP environment to Azure.
You need to inspect all the outbound traffic from the SAP application servers to the Internet.
Which two Azure resources should you use? Each correct answer presents part of the solution.
Network Performance Monitor
Azure Firewall
Azure Traffic Manager
Azure Load Balancer NAT rules
Azure user-defined routes
a web application firewall (WAF) for Azure Application Gateway
Final Answer:
Azure Firewall
Azure user-defined routes
Why These Are Correct:
Azure Firewall:
Purpose: Azure Firewall provides deep packet inspection, filtering, and logging for outbound traffic. It can analyze traffic from SAP application servers to the Internet, enforce security policies (e.g., allow only specific SAP-related outbound connections), and log details for auditing.
AZ-120 Relevance: For SAP on Azure, Azure Firewall is a recommended security solution to monitor and secure traffic, aligning with the exam’s focus on network security and compliance.
Fit: Directly meets the goal of inspecting outbound traffic.
Azure user-defined routes (UDR):
Purpose: UDRs ensure all outbound traffic from the SAP application servers’ subnet is routed through Azure Firewall (e.g., by setting the next hop to the firewall’s IP). Without UDRs, traffic might bypass the firewall via Azure’s default Internet routing, preventing full inspection.
AZ-120 Relevance: The exam emphasizes network design for SAP, including forcing traffic through security appliances. UDRs are a standard practice in Azure hub-and-spoke architectures for SAP deployments.
Fit: Complements Azure Firewall by directing traffic for inspection, forming a complete solution.
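A sketch of the user-defined route half, assuming the Azure Firewall already exists with private IP 10.0.1.4 and that the SAP application servers sit in sap-app-subnet of sap-vnet (all names and addresses are hypothetical):

# Route table that forces all Internet-bound traffic through the firewall
az network route-table create --resource-group sap-prod-rg --name sap-app-rt

az network route-table route create --resource-group sap-prod-rg \
  --route-table-name sap-app-rt --name default-to-firewall \
  --address-prefix 0.0.0.0/0 --next-hop-type VirtualAppliance \
  --next-hop-ip-address 10.0.1.4

# Associate the route table with the SAP application server subnet
az network vnet subnet update --resource-group sap-prod-rg \
  --vnet-name sap-vnet --name sap-app-subnet --route-table sap-app-rt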
DRAG DROP
You have an on-premises SAP environment that runs on SUSE Linux Enterprise Server (SLES) servers and Oracle. The version of the SAP ERP system is 6.06 and the version of the portal is SAP NetWeaver 7.3.
You need to recommend a migration strategy to migrate the SAP ERP system and the portal to Azure.
The solution must be hosted on SAP HANA.
What should you recommend? To answer, drag the appropriate tools to the correct components. Each tool may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.
Tools
SAP heterogeneous system copy
Software Update Manager (SUM) Database Migration Option (DMO) with System Update
Software Update Manager (SUM) Database Migration Option (DMO) with System Move
Software Update Manager (SUM) Database Migration Option (DMO) without System Update
Answer Area
To migrate the SAP ERP system: [________]
To migrate the portal: [________]
Final Answers:
Answer Area:
To migrate the SAP ERP system: Software Update Manager (SUM) Database Migration Option (DMO) with System Move
To migrate the portal: SAP heterogeneous system copy
Why These Are Correct:
SAP ERP (ECC 6.06) - SUM DMO with System Move:
Reason: ECC is an ABAP system, and SUM DMO is optimized for ABAP migrations to HANA. “System Move” supports the platform shift to Azure while converting the database from Oracle to HANA. It’s efficient and aligns with SAP’s modern migration tools, per SAP Note 1813548 and Azure documentation.
AZ-120 Fit: The exam emphasizes DMO for ABAP systems moving to HANA on Azure.
Portal (NetWeaver 7.3) - SAP heterogeneous system copy:
Reason: NetWeaver 7.3 Java doesn’t support SUM DMO (DMO is ABAP-only). A heterogeneous system copy is the SAP-certified approach for Java stacks, involving database export/import (Oracle to HANA) and platform migration to Azure, per SAP’s migration guides.
AZ-120 Fit: Tests knowledge of distinct migration strategies for ABAP vs. Java SAP systems.
Validation:
ECC: SUM DMO with System Move keeps the process streamlined, avoiding unnecessary updates unless required (not specified).
NetWeaver: Heterogeneous system copy is the only viable option from the list for Java systems, ensuring HANA compatibility.