Pocket Prep 2 Flashcards

1
Q

A recent breach has caused a company to take a hard look at its Platform as a Service (PaaS) database and its setup. The company discovered that one crucial setting had been left at its default value. By not changing that value, they had effectively left an open door into their database. What security risk have they experienced?

A. Insecure design
B. Security misconfiguration
C. Broken access control
D. Cryptographic failures

A

B. Security misconfiguration

Explanation:
Security misconfiguration occurs when security settings are not set to proper values, or because of a variety of other problems, such as out-of-date software, unpatched systems, overly informative error messages, default accounts with default passwords left on the system, unnecessary features enabled, and so on.
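As a sketch of how such checks might be automated, here is a toy Python function that flags settings still at known insecure defaults. The setting names and default values are hypothetical, not from any real product.

```python
# Hypothetical map of settings to known insecure default values
# (the kind of misconfiguration described above).
KNOWN_INSECURE_DEFAULTS = {
    "admin_password": "admin",  # default credential left in place
    "public_access": True,      # database reachable from the internet
    "verbose_errors": True,     # overly informative error messages
}

def find_misconfigurations(settings: dict) -> list[str]:
    """Return the names of settings still at a known insecure default."""
    return [key for key, bad in KNOWN_INSECURE_DEFAULTS.items()
            if settings.get(key) == bad]
```

For example, `find_misconfigurations({"admin_password": "admin", "public_access": False, "verbose_errors": False})` would report only `"admin_password"`.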

Insecure design is focused on design flaws. We need to use threat modeling and secure design patterns and principles to avoid this risk.

Cryptographic failures leave data exposed either because it was not encrypted or it was improperly encrypted.

Broken access control involves issues such as not following the principle of least privilege, not having access controls on the APIs, elevation of privilege available to users, etc.

2
Q

Onyx has just been notified by the Security Operations Center (SOC) that an incident has occurred and he is needed. He specializes in digital forensics, especially in the cloud. The incident involved a bad actor accessing a Structured Query Language (SQL) database in their Platform as a Service (PaaS) deployment. He knows to take proper measures to protect any evidence he collects from contamination and alteration.

What type of activity is this?

A. Chain of custody
B. Due care
C. Due diligence
D. Digital forensics

A

A. Chain of custody

Explanation:
The chain of custody is crucial in maintaining the integrity and reliability of evidence or items of importance. It ensures that the evidence is handled, stored, and transferred in a secure and accountable manner, enabling confidence in its authenticity and validity for legal or investigatory purposes.
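Evidence integrity along the chain of custody is commonly backed by cryptographic hashing: a digest is recorded when evidence is collected and re-checked at each hand-off. A minimal illustrative sketch using Python's standard library:

```python
import hashlib

def evidence_fingerprint(data: bytes) -> str:
    """SHA-256 digest recorded when evidence enters the chain of custody."""
    return hashlib.sha256(data).hexdigest()

def verify_integrity(data: bytes, recorded: str) -> bool:
    """Re-hash at each hand-off; a mismatch indicates possible alteration."""
    return evidence_fingerprint(data) == recorded
```

Any single-bit change to the evidence produces a completely different digest, which is why hashing underpins claims that evidence was not altered in transit.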

Due diligence is a comprehensive and systematic process of research, investigation, and analysis conducted by individuals, organizations, or entities to assess and evaluate the potential risks, opportunities, and legal, financial, or operational implications associated with a specific transaction, investment, or business relationship. It is typically performed before making important decisions or entering into agreements to ensure informed decision-making and risk mitigation. The corporation/management should have done their due diligence in hiring Onyx and adding him to the Incident Response team in a forensic investigator capacity.

Due care refers to the level of caution, prudence, and diligence that a reasonable person or organization exercises to prevent harm or minimize risks. It is a legal and ethical concept that applies to various domains, including business operations, information security, healthcare, and professional practice.

Digital forensics is what Onyx is doing in the scenario, but the question asks specifically about protecting evidence from alteration, and that is chain of custody.

3
Q

Malicious code is sent to a user’s browser from a trusted website that steals a session token for the bad actor. What type of attack is being described?

A. Structured Query Language (SQL) injection
B. Cross Site Request Forgery (CSRF)
C. eXtensible Markup Language (XML) External Entities (XXE)
D. Cross-site scripting (XSS)

A

D. Cross-site scripting (XSS)

Explanation:
Cross-site scripting (XSS) is a type of injection attack. These attacks occur when an attacker is able to send malicious code to a user’s browser without having to go through any validation process because it is coming from a trusted website that the user willingly connected to. Essentially, the victim visits a website or web application that delivers the malicious code to the user’s browser, where it executes.
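Output encoding is a standard XSS defense: user-supplied input is HTML-escaped before being written into a page, so a script payload renders as inert text. A minimal sketch using Python's standard library (the helper name is hypothetical):

```python
import html

def render_comment(comment: str) -> str:
    """Hypothetical template helper: HTML-encode user input before it
    is sent to the browser, so injected tags cannot execute."""
    return "<p>" + html.escape(comment) + "</p>"
```

With this helper, input such as `<script>steal(document.cookie)</script>` is emitted as `&lt;script&gt;...` and displayed literally instead of running.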

CSRF is an attack where the user is tricked into sending a request to a website that executes something on that website. This could include transferring funds, changing email addresses, etc.

SQL is a language used to query relational databases. A user should not be able to enter an SQL request from the user interface; trusting the user’s input allows a bad actor to inject SQL commands through an interface of some sort.
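The standard defense against SQL injection is parameterized queries, where the driver treats user input strictly as data rather than as part of the query. A minimal sketch using Python's built-in sqlite3 module (the table and function names are made up for illustration):

```python
import sqlite3

def find_user(conn: sqlite3.Connection, username: str):
    """Look up a user with a parameterized query. The `?` placeholder
    means input like "x' OR '1'='1" cannot change the query's structure."""
    cur = conn.execute("SELECT id FROM users WHERE name = ?", (username,))
    return cur.fetchall()
```

Contrast this with string concatenation (`"... WHERE name = '" + username + "'"`), where the same input would rewrite the query and return every row.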

XXE is an attack where an application that parses XML content receives a request that references an external entity. This could lead to the disclosure of sensitive information or cause a denial of service, among other impacts.

4
Q

Which of the following organizations published the Egregious Eleven to describe the most common cloud vulnerabilities?

A. NIST
B. CSA
C. OWASP
D. SANS

A

B. CSA

Explanation:
Several organizations provide resources designed to teach about the most common vulnerabilities in different environments. Some examples include:

Cloud Security Alliance Top Threats to Cloud Computing: These lists name the most common threats, such as the Egregious 11. According to the CSA, the top cloud security threats include data breaches; misconfiguration and inadequate change control; lack of cloud security architecture and strategy; and insufficient identity, credential, access, and key management.
OWASP Top 10: The Open Web Application Security Project (OWASP) maintains multiple top 10 lists, but its web application list is the most famous and is updated every few years. The top four threats in the 2021 list were broken access control, cryptographic failures, injection, and insecure design.
SANS/CWE Top 25: The Common Weakness Enumeration (CWE), maintained by MITRE in collaboration with SANS, describes common security errors. Its Top 25 list highlights the most dangerous and impactful weaknesses each year. In 2021, the top four were out-of-bounds write, improper neutralization of input during web page generation (cross-site scripting), out-of-bounds read, and improper input validation.
5
Q

The American Institute of Certified Public Accountants (AICPA) published the Privacy Management Framework (PMF) as an update to the former Generally Accepted Privacy Principles (GAPP). By the time of the update, there had been significant changes in technology and in privacy laws around the world.

The driving law that had the biggest influence on the PMF was which of the following?

A. Children’s Online Privacy Protection Act (COPPA)
B. Federal Act on Data Protection
C. General Data Protection Regulation (GDPR)
D. Privacy Act of 1988

A

C. General Data Protection Regulation (GDPR)

Explanation:
The AICPA says that the GDPR had an impact on the update to the PMF from GAPP. The GDPR was adopted in 2016, and enforcement began in May of 2018. It requires a higher level of protection for personal data than earlier regulations and gives the natural person the right to control their personal data.

COPPA is a U.S. regulation on the topic of protecting children’s privacy. It is in the process of being updated. The update is not complete as of this writing, and it therefore did not have an impact on the PMF update.

The Privacy Act of 1988 is Australian and too old to have had much of an impact.

The Federal Act on Data Protection is from Switzerland and could have been an influential law, but the AICPA says it was the GDPR.

6
Q

Organizations such as the Cloud Security Alliance (CSA) and the Open Web Application Security Project (OWASP) publish information about cloud threats and risks. Who is responsible for mitigating these risks in an organization?

A. Security professionals
B. Executive management
C. Cloud Service Provider (CSP)
D. Database administrators

A

A. Security professionals

Explanation:
It is the security professionals’ responsibility to protect their organizations from the threats to cloud computing and mitigate the risks where feasible. Security professionals work at both the Cloud Customer (CC) and the Cloud Service Provider (CSP).

The CSP certainly has a responsibility to protect its business and its customers, but not all customer issues are the CSP’s problem, nor is the CSP their cause. So, the more generic answer of security professionals works better here.

Database administrators are responsible for the database(s) they are working on—but, only those parts. The security professionals could be in any department, on any project, so again, the more generic answer is better.

Executive management is accountable, not responsible. Accountability cannot be delegated; responsibility can be.

7
Q

Which form of auditor is accountable for evaluating the effectiveness of a provider’s service and detecting control flaws between the Cloud Service Customer (CSC) and Cloud Service Provider (CSP)?

A. External auditor
B. Third-party auditor
C. Cloud auditor
D. Internal auditor

A

C. Cloud auditor

Explanation:
A cloud auditor is uniquely tasked with the responsibility of auditing cloud systems and cloud-based applications. The cloud auditor is responsible for evaluating the effectiveness of the cloud service and finding control gaps between the cloud customer and the cloud service provider. This term is defined in ISO/IEC 17788.

Arguably, this auditor is external to both the CSC and the CSP. Counting the entities, the auditor is the third party: the first is the customer, the second is the cloud provider, and the third is the auditor (a fourth would be the contractors of the auditor company). However, the most appropriate term among these choices is cloud auditor, simply because this is a cloud exam and that is how the auditor role is defined in ISO/IEC 17788.

An internal auditor would work for the company that they are auditing. This is not the best choice when talking about auditing a CSP. Independence is a better idea to ensure impartiality in the audit and ensure trust in the audit results.

8
Q

Which of the following is used to identify vulnerabilities in an application’s third-party dependencies?

A. DAST
B. IAST
C. SCA
D. SAST

A

C. SCA

Explanation:
Static Application Security Testing (SAST): SAST tools inspect the source code of an application for vulnerable code patterns. It can be performed early in the software development lifecycle but can’t catch some vulnerabilities, such as those visible only at runtime.
Dynamic Application Security Testing (DAST): DAST bombards a running application with anomalous inputs or attempted exploits for known vulnerabilities. It has no knowledge of the application’s internals, so it can miss vulnerabilities. However, it is capable of detecting runtime vulnerabilities and configuration errors (unlike SAST).
Interactive Application Security Testing (IAST): IAST places an agent inside an application and monitors its internal state while it is running. This enables it to identify unknown vulnerabilities based on their effects on the application.
Software Composition Analysis (SCA): SCA is used to identify the third-party dependencies included in an application and may generate a software bill of materials (SBOM). This enables the developer to identify vulnerabilities that exist in this third-party code.
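The SCA idea can be illustrated as a toy dependency check: compare an application's declared dependencies against an advisory database of known-vulnerable versions. The package names and advisory data below are entirely made up.

```python
# Fictional advisory database: package name -> set of vulnerable versions.
ADVISORIES = {
    "examplelib": {"1.0.0", "1.0.1"},  # made-up vulnerable releases
}

def scan_dependencies(requirements: dict[str, str]) -> list[str]:
    """Return 'name==version' for each declared dependency that
    matches a known advisory, mimicking what an SCA tool reports."""
    return [f"{name}=={ver}" for name, ver in requirements.items()
            if ver in ADVISORIES.get(name, set())]
```

Real SCA tools build the dependency inventory automatically (often as an SBOM) and query curated vulnerability feeds rather than a hard-coded dictionary.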

9
Q

What type of system is a systematic approach to information security comprised of processes, technology, and people designed to assist in the protection and management of an organization’s information?

A. Privacy Management Framework (PMF)
B. International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) 27042
C. Information Security Management System (ISMS)
D. Configuration Management DataBase (CMDB)

A

C. Information Security Management System (ISMS)

Explanation:
An Information Security Management System (ISMS) is intended to protect the confidentiality, availability, and integrity of an organization’s data. The most effective ISMSs are those that are aligned with the organization’s standards and include specific information on compliance requirements. ISO/IEC 27001/2 is often used to build a corporation’s ISMS.

PMF, formerly known as the Generally Accepted Privacy Principles (GAPP), aids a business in protecting personal information, a.k.a. Personally Identifiable Information (PII). It is very similar in its details to the European Union’s General Data Protection Regulation (EU GDPR).

A CMDB is used by a business to track the configurations, or baselines, for the company’s hardware and software assets.

ISO/IEC 27042 provides guidance for the analysis and technical interpretation of electronic evidence in support of incident response.

10
Q

Nathan is the information security manager for a large pharmaceutical company. They have moved to an Infrastructure as a Service (IaaS) cloud environment. Their software developers build almost all the software that is used within the business. They also sell access to their application to other companies like them as a Software as a Service (SaaS). To ensure that their systems will work at all times, they perform a variety of different types of tests.

In which security test does the tester try to actively attempt to attack or compromise a live system using the same types of tools that an actual attacker would use to simulate a real-life scenario?

A. Vulnerability scan
B. Penetration test
C. Static Application Security Test (SAST)
D. Static code analysis

A

B. Penetration test

Explanation:
During a penetration test, the tester is trying to actively break into the live systems. This is meant to simulate a real-life scenario and, therefore, the tester will use the same type of tools that an actual attacker would use to compromise a system.

During Static Application Security Testing (SAST), the tester has knowledge of and access to the source code, and all testing is done in an offline manner. A static code analysis is another name for SAST.

Vulnerability scans are usually done by an organization against their own systems to ensure that their systems are hardened against known vulnerabilities.

11
Q

Which of the following roles collects data and has overall responsibility for it?

A. Data Steward
B. Data Processor
C. Data Owner
D. Data Custodian

A

C. Data Owner

Explanation:
There are several roles and responsibilities related to data ownership, including:

Data Owner: The data owner creates or collects the data and is responsible for it.
Data Custodian: A data custodian is responsible for maintaining or administrating the data. This includes securing the data based on instructions from the data owner.
Data Steward: The data steward ensures that the data’s context and meaning are understood and that it is used properly.
Data Processor: A data processor uses the data, including manipulating, storing, or moving it. Cloud providers are data processors.
12
Q

Select the correct order of the cloud data lifecycle.

A. Create, share, use, store, archive, destroy
B. Create, store, use, share, archive, destroy
C. Create, use, archive, store, share, destroy
D. Create, use, store, share, archive, destroy

A

B. Create, store, use, share, archive, destroy

Explanation:
The phases of the cloud data lifecycle are as follows:

Create: Any time data is considered new (this can be brand new data, data migrated from another system, or existing data which is modified), it is in the create phase.
Store: Data is stored immediately after it is created. Storing methods include files residing on a file server, remote object storage, and data written to a database.
Use: When data is consumed or processed by an application or user, it is in the use phase.
Share: When data is made available for use outside of the system it was created in, this is known as the share phase.
Archive: Data is moved to long-term storage and is no longer considered active.
Destroy: As the name suggests, the destroy phase is where data is removed completely. In cloud environments, this is done using methods such as overwriting and cryptographic erasure.
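The ordering above can be sketched as a simple check (illustrative only; in practice data may revisit phases such as use and share):

```python
# The six phases of the cloud data lifecycle, in order.
LIFECYCLE = ["create", "store", "use", "share", "archive", "destroy"]

def is_valid_order(phases: list[str]) -> bool:
    """True if `phases` lists lifecycle stages in ascending
    lifecycle order (create before store, store before use, ...)."""
    positions = [LIFECYCLE.index(p) for p in phases]
    return positions == sorted(positions)
```

For instance, `["create", "store", "use"]` is in order, while `["create", "use", "store"]` is not, which is the distinction the exam question tests.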
13
Q

Allison is working with the team that is planning and building their capabilities to recover when disaster strikes. Which is the LEAST useful metric for business requirements and capabilities that needs to be understood for business continuity and disaster recovery in the cloud?

A. How long are you down?
B. How much capacity for data?
C. Computing power for systems?
D. How much data may you lose?

A

B. How much capacity for data?

Explanation:
Data storage capacity is not a good indicator of business requirements and capabilities for continuity and disaster recovery in the cloud.

Three metrics are used to assess business capabilities: the Recovery Time Objective (RTO), which indicates how long systems may be down; the Recovery Point Objective (RPO), which indicates how much data may be lost; and the Recovery Service Level (RSL), which indicates how much processing power is required to maintain systems following a disaster.
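A small worked example of the RTO/RPO arithmetic (all times are made up): the actual downtime is measured from outage to restoration, and the actual data loss window runs from the last good backup to the start of the outage.

```python
from datetime import datetime, timedelta

# Made-up incident timeline: the outage begins at 02:00, service is
# restored at 05:30, and the last good backup was taken at 01:00.
outage_start = datetime(2024, 1, 1, 2, 0)
restored_at  = datetime(2024, 1, 1, 5, 30)
last_backup  = datetime(2024, 1, 1, 1, 0)

downtime  = restored_at - outage_start   # 3.5 hours; compare to the RTO
data_loss = outage_start - last_backup   # 1 hour; compare to the RPO
```

If the organization's RTO were 4 hours and its RPO were 2 hours, this incident would fall within both objectives.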

14
Q

An application development team is working to build a new cloud-based application. The team is currently debating the Application Programming Interface (API) that they will use. Their application will use the File Transfer Protocol (FTP). Which of the following options do they have to choose from?

A. Representational State Transfer (REST) can be used in combination with SOAP over FTP
B. Only SOAP can be used over FTP
C. SOAP and Representational State Transfer (REST) can both be implemented over FTP
D. Only Representational State Transfer (REST) can be used over FTP

A

C. SOAP and Representational State Transfer (REST) can both be implemented over FTP

Explanation:
While SOAP and REST most commonly use the HTTP protocol for transmission, it is possible for them to use the FTP protocol and other communication protocols as well.

15
Q

Shadow IT is a security threat associated with which of the following?

A. Data Corruption
B. Unauthorized Provisioning
C. Unauthorized Access
D. Malware

A

B. Unauthorized Provisioning

Explanation:
Data storage in the cloud faces various potential threats, including:

Unauthorized Access: Cloud customers should implement access controls to prevent unauthorized users from accessing data. Also, a cloud service provider (CSP) should implement controls to prevent data leakage in multitenant environments.
Unauthorized Provisioning: The ease of setting up cloud data storage may lead to shadow IT, where cloud resources are provisioned outside of the oversight of the IT department. This can incur additional costs to the organization and creates security and compliance challenges since the security team can’t secure data that they don’t know exists.
Regulatory Non-Compliance: Various regulations mandate security controls and other requirements for certain types of data. A failure to comply with these requirements — by failing to protect data or allowing it to flow outside of jurisdictional boundaries — could result in fines, legal action, or a suspension of the business’s ability to operate.
Jurisdictional Issues: Different jurisdictions have different laws and regulations regarding data security, usage, and transfer. Many CSPs have locations around the world, which can violate these laws if data is improperly protected or stored in an unauthorized location.
Denial of Service: Cloud environments are largely accessible via the public Internet. This creates the risk of Denial of Service attacks if the CSP does not have adequate protections in place.
Data Corruption or Destruction: Data stored in the cloud can be corrupted or destroyed by accident, malicious intent, or natural disasters.
Theft or Media Loss: CSPs are responsible for the physical security of their data centers. If these security controls fail, an attacker may be able to steal the physical media storing an organization’s data.
Malware: Ransomware and other malware increasingly target cloud environments as well as local storage. Access controls, secure backups, and anti-malware solutions are essential to protecting cloud data against theft or corruption.
Improper Disposal: The CSP is responsible for ensuring that physical media is disposed of correctly at the end of life. Cloud customers can also protect their data by using encryption to make the data stored on a drive unreadable.
16
Q

Gunner is a cloud data architect working with a pharmaceutical company. They are planning the deployment of data storage into the public cloud that they are using. The data that he is currently discussing with the Chief Information Security Officer (CISO), Gia, is going to be used for analysis. They are trying to predict the success of certain pharmaceutical drugs that they are researching. One of the challenges that they have is the large amount of information that they are sorting through. They have patient records, test results, X-Rays, etc. to analyze for success.

What can they use for this?

A. Data mart
B. Data warehouse
C. Big data
D. Data lake

A

D. Data lake

Explanation:
A data lake is a repository that allows the centralization of data in large volumes. This allows for analysis of the data to drive insights and predictions for the business. It is similar to what data warehousing provided in the structured data world. Data lakes allow for the collection of video, audio, logs, texts, sensor data, documents, social media posts, etc. Therefore, it is unstructured in nature.

A data warehouse is the centralized collection of data for the purpose of analysis to drive insights and predictions. Most of the data in a data warehouse comes from databases. This is a structured format.

A data mart is actually a small specialized collection of data from the databases. This is also structured.

Big data is a very large collection of data characterized by the Vs: Volume, Velocity, Variety, Variability, and Veracity.

17
Q

A cloud administrator, Jocelyn, needs to create a new golden image. This image is for a computer server. There is a current version in use in production. This new image requires updates from the vendor as well as configuration changes to match corporate policy changes. What is the FIRST step this administrator needs to take to create a baseline image?

A. Connect to the instance
B. Run a copy of the current image
C. Customize the instance
D. Stop the instance and create a new image

A

B. Run a copy of the current image

Explanation:
The first step in creating a new golden image is to run a copy of the current image someplace where it can be safely modified. Once it is running as a virtual machine, the administrator can connect to it and update, patch, and/or alter that image any way it needs to be done. Once the image is updated as necessary, it should be stopped. Once it is stopped, a new golden image file can be created. The word baselining can be used for this process as well.

18
Q

When developing a business continuity and disaster recovery (BC/DR) policy, which of the following aspects of data retention policies is the MOST important to consider?

A. Regulatory Requirements
B. Retention Periods
C. Archiving and Retrieval Procedures and Mechanisms
D. Data Classification

A

C. Archiving and Retrieval Procedures and Mechanisms

Explanation:
Data retention policies define how long an organization stores particular types of data. Some of the key considerations for data retention policies include:

Retention Periods: Defines how long data should be stored. This usually refers to archived data rather than data in active use.
Regulatory Requirements: Various regulations have rules regarding data retention. These may mandate that data only be retained for a certain period of time or the minimum time that data should be saved. Typically, the first refers to personal data, while the second is business and financial data or security records.
Data Classification: The classification level of data may impact its retention period or the means by which the data should be stored and secured.
Retention Requirements: In some cases, specific requirements may exist for how data should be stored. For example, sensitive data should be encrypted at rest. Data retention may also be impacted by legal holds.
Archiving and Retrieval Procedures and Mechanisms: Different types of data may have different requirements for storage and retrieval. For example, data used as backups as part of a BC/DR policy may need to be more readily accessible than long-term records.
Monitoring, Maintenance, and Enforcement: Data retention policies should have rules regarding when and how the policies will be reviewed, updated, audited, and enforced.
19
Q

A small business was unhappy with its cloud provider’s services. For this reason, the company decided to remove all data and applications from its cloud provider’s environment. It was then able to move to another provider without any major impact on its production and operations. What BEST describes the ability to do this?

A. Reversibility to get all artifacts back from the first provider and portability to move all their data without having to reenter it
B. Interoperability to get all artifacts back from the first provider and move all their data without having to reenter it
C. Portability to get all artifacts back from the first provider and interoperability to move all their data without having to reenter it
D. Interoperability between the two cloud providers allows the movement of their data and portability for the data

A

A. Reversibility to get all artifacts back from the first provider and portability to move all their data without having to reenter it

Explanation:
Reversibility is the ability of a cloud customer to quickly remove all data, applications, artifacts, and anything else that may reside in the cloud provider’s environment. Portability is the ability to move data (or, in the case of software portability, applications) from one provider to another without having to reenter it.

Interoperability is the ability to use a piece of data on two different systems.

These definitions can be found in ISO/IEC 17788.

20
Q

An information security manager suspects that attackers have been targeting her organization’s servers. She wants to put a system in place, isolated from all production systems, to trick attackers into thinking it is a legitimate server. This will allow her to monitor the attackers’ behavior and see what they are trying to do on her network.

What is this isolated system called?

A. Intrusion Detection System (IDS)
B. Jumpbox
C. Demilitarized zone (DMZ)
D. Honeypot

A

D. Honeypot

Explanation:
A honeypot is a system used to trick attackers into thinking it is an actual production system. The honeypot is kept separated and isolated from all other systems on the network. When an attacker gains access to a honeypot, administrators can monitor the attacker’s behavior and see what they are trying to accomplish on the network.
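A minimal low-interaction honeypot can be sketched in a few lines of Python (illustrative only, not production-grade): it listens on a port, records the source address of anyone who connects, and replies with a fake service banner so the attacker believes a real server answered.

```python
import socket
import threading

def start_honeypot(log: list, banner: bytes = b"220 FTP service ready\r\n"):
    """Accept one connection, record the source address in `log`,
    and send a fake banner. Returns (port, serving thread)."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]

    def serve():
        conn, addr = srv.accept()
        log.append(addr[0])      # evidence of the visitor for later analysis
        conn.sendall(banner)     # pretend to be a real service
        conn.close()
        srv.close()

    t = threading.Thread(target=serve, daemon=True)
    t.start()
    return port, t
```

A real honeypot would log full session transcripts, timestamps, and payloads, and would be deployed well away from production systems, as the explanation above describes.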

A jumpbox is a server that administrators use to log in to other systems. It is placed at an entry point, typically where traffic moves between lower-security and higher-security systems.

A DMZ is a space, or subnet, constructed between the external untrusted network (e.g., the internet) and the internal trusted network (e.g., the LAN and data center). The DMZ is a great place to install a honeypot.

An IDS monitors traffic and is placed either in the network (NIDS) or on the end system, the host (HIDS). It is a passive monitor of traffic; it is not a server, and the bad actor should not even know it is there.