Cert Flashcards

(65 cards)

1
Q

What is the primary purpose of logging systems?

A

Logging systems collect, store, and analyze logs from various sources to help with debugging, monitoring, and security.

2
Q

Name one example of a logging system.

A

Datadog

3
Q

What are the four main benefits of logging systems?

A

Troubleshooting, Monitoring, Security, and Compliance

4
Q

How do logs help with troubleshooting?

A

Logs help identify and resolve issues by providing insight into what happened, when, and why.

5
Q

How do logging systems support monitoring?

A

Logging systems enable real-time monitoring of system health and performance, allowing proactive identification of potential problems.

6
Q

How do logs enhance security?

A

Logs can be used to detect and investigate security incidents such as unauthorized access or malicious activity.

7
Q

How do logging systems help with compliance?

A

Logging systems can help organizations meet regulatory requirements by providing audit trails and evidence of system activity.

8
Q

What are the five main types of log sources?

A

Server, Container, Cloud, Client, and Other existing logging services

9
Q

What are Datadog server integrations?

A

Datadog offers several integrations to forward logs from a server to Datadog, enabled with a logs configuration block in the integration's conf.yaml file.
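As a sketch, a logs configuration block for a server integration such as Nginx might look like the following (the path, service, and source values are illustrative):

```yaml
# conf.d/nginx.d/conf.yaml
logs:
  - type: file
    path: /var/log/nginx/access.log
    service: nginx
    source: nginx
```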

10
Q

How are logs collected from containers?

A

It depends on where the Agent is deployed or run and how logs are routed (e.g. a Docker host Agent, a Docker container Agent, or a Kubernetes DaemonSet).

11
Q

How are logs collected from cloud sources?

A

By subscribing to logs on cloud provider services (e.g. AWS CloudWatch) and forwarding them through streams such as AWS Kinesis.

12
Q

How can logs be collected from clients?

A

Through SDKs or libraries such as the datadog-logs SDK for JavaScript clients.

13
Q

What are the three main types of Datadog integrations?

A

Agent-based, Authentication (crawler) based, and Library integrations

14
Q

What are Agent-based integrations?

A

Integrations installed with the Datadog Agent.

15
Q

What are Authentication-based integrations?

A

Integrations where credentials are provided so Datadog can obtain data via an API (e.g. the AWS integration).

16
Q

What are Library integrations?

A

Integrations that use the Datadog API to send data from applications based on the language they are written in.

17
Q

What is the recommended log format for Datadog?

A

JSON format is always recommended.
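For example, a JSON-formatted log line like the one below is automatically parsed into attributes on ingestion (the field names and values here are illustrative):

```json
{
  "timestamp": "2023-01-01T12:00:00Z",
  "status": "error",
  "service": "checkout",
  "message": "payment failed",
  "http": { "status_code": 500 }
}
```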

18
Q

How is log collection enabled in Docker?

A

Using the DD_LOGS_ENABLED=true environment variable.
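A minimal sketch of a containerized Agent with log collection enabled, written as a Docker Compose fragment (the image tag and the container-collect-all option are shown as common companions, not requirements):

```yaml
services:
  datadog-agent:
    image: gcr.io/datadoghq/agent:7
    environment:
      - DD_API_KEY=<YOUR_API_KEY>
      - DD_LOGS_ENABLED=true
      - DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL=true
    volumes:
      # lets the Agent discover and tail container logs
      - /var/run/docker.sock:/var/run/docker.sock:ro
```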

19
Q

How is log collection enabled in Kubernetes using the Datadog operator?

A

By setting logCollection.enabled: true in the datadog-agent.yaml manifest.
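A minimal datadog-agent.yaml sketch for the Operator (assuming the v2alpha1 DatadogAgent schema, where the setting lives under spec.features):

```yaml
apiVersion: datadoghq.com/v2alpha1
kind: DatadogAgent
metadata:
  name: datadog
spec:
  features:
    logCollection:
      enabled: true
```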

20
Q

How is log collection enabled on a host?

A

By changing logs_enabled from false to true in the datadog.yaml configuration file.
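In datadog.yaml this is a single top-level key:

```yaml
# /etc/datadog-agent/datadog.yaml
logs_enabled: true
```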

21
Q

What are the two options for log filtering?

A

exclude_at_match (exclude logs containing a pattern) and include_at_match (include only logs containing a pattern)
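Both options are declared as log_processing_rules on a log source. A sketch that drops DEBUG lines (the rule name, path, and pattern are illustrative):

```yaml
logs:
  - type: file
    path: /var/log/app.log
    service: app
    source: custom
    log_processing_rules:
      - type: exclude_at_match
        name: exclude_debug_lines
        # drop any log whose message matches this pattern
        pattern: DEBUG
```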

22
Q

What does the mask_sequences option do for log obfuscation?

A

It replaces all matched groups with the value of the replace_placeholder parameter.
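A mask_sequences sketch that redacts card-like numbers (the rule name and regex are illustrative):

```yaml
log_processing_rules:
  - type: mask_sequences
    name: mask_card_numbers
    replace_placeholder: "[masked_card]"
    # matched sequences are replaced with the placeholder above
    pattern: (?:4[0-9]{12}(?:[0-9]{3})?|5[1-5][0-9]{14})
```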

23
Q

What is the flow that logs follow when being ingested by Datadog?

A

1. Logs are ingested
2. JSON-structured logs are preprocessed
3. Logs are filtered through pipelines
4. Standard Attributes are applied
24
Q

What is a pipeline in Datadog log processing?

A

A pipeline takes a subset of ingested logs and applies a list of sequential Processors.

25
Q

Name three types of processors in Datadog.

A

Grok Parser, Remapper, URL Parser, User-Agent Parser, Category Processor, Status Remapper, etc.

26
Q

What does the Grok Parser do?

A

Extracts attributes from semi-structured text log messages without the need to learn regular expressions.

27
Q

What does a Remapper do?

A

Remaps an attribute to another name.

28
Q

What does the URL Parser do?

A

Extracts all values from a URL attribute.

29
Q

What does the User-Agent Parser do?

A

Extracts all values from a user-agent attribute.

30
Q

What does the Category Processor do?

A

Adds a new attribute to logs matching a provided search query.

31
Q

What does the Status Remapper do?

A

Remaps a specified attribute as the official log status.

32
Q

What are Standard Attributes in Datadog?

A

A set of attributes that act as the backbone of the naming convention for an organization.

33
Q

What is the difference between Log Explorer and Live Trail?

A

Log Explorer is for log troubleshooting and exploration, while Live Trail shows incoming log events in near real-time.
34
Q

How would you search for logs with a specific tag in Datadog?

A

Using tag:value format, e.g. env:prod

35
Q

How would you search for logs with a specific attribute in Datadog?

A

Using @attribute:value format, e.g. @url:www.petco.com

36
Q

How would you search for logs containing a specific word in Datadog?

A

Just type the word, e.g. hello

37
Q

How would you search for logs containing a specific phrase in Datadog?

A

Use double quotes around the phrase, e.g. "hello world"

38
Q

What are the three Boolean operators available for log searching?

A

AND (intersection), OR (union), and - (exclusion)
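Putting the search syntax from the cards above together, composite queries might look like this (all values are illustrative):

```
service:web-store AND @http.status_code:500
env:prod -status:info
"connection timed out" OR "connection refused"
```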
39
Q

What does the wildcard search *:prod find?

A

All log attributes or tags with the exact value prod

40
Q

What does the wildcard search *:prod* find?

A

All log attributes or tags whose string values start with prod

41
Q

What is the difference between Facets and Measures in log analysis?

A

Facets are qualitative and categorical, while Measures are quantitative, continuous, and numerical

42
Q

What are the three ways logs can be aggregated in Datadog?

A

By Fields, by Patterns, and by Transactions
43
Q

What happens when aggregating logs by Fields?

A

All logs matching the query filter are aggregated into groups based on the query search values.

44
Q

What is the purpose of Log Patterns?

A

Log Patterns cluster logs with similar values for the message field and group results by Status and Service.

45
Q

What do Transactions aggregate?

A

Transactions aggregate indexed logs according to instances of a sequence of events.

46
Q

Name three visualization types available in Datadog log analysis.

A

Lists, Timeseries, Top List, Nested Tables, Pie Chart, Tree Map

47
Q

What are the three types of log monitors in Datadog?

A

Monitor over a log count, monitor over a facet or attribute, monitor over a measure

48
Q

How do log samples enhance monitor notifications?

A

Log samples can be added to the message of the notification to provide context.

49
Q

Why would you generate metrics from logs?

A

To retain visibility into logs that are excluded from indexes while managing costs.

50
Q

What elements need to be defined when creating a log-based metric?

A

A query input to filter logs, the field to track, dimensions (group by), optional percentile aggregations, and the metric name
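As one hedged sketch of those elements expressed through Datadog's v2 Logs Metrics API, a request body might look roughly like this (the metric name, query, and paths are illustrative; check the current API reference for exact field names):

```json
{
  "data": {
    "type": "logs_metrics",
    "id": "checkout.request.duration",
    "attributes": {
      "filter": { "query": "service:checkout" },
      "compute": {
        "aggregation_type": "distribution",
        "path": "@duration",
        "include_percentiles": true
      },
      "group_by": [
        { "path": "@http.method", "tag_name": "method" }
      ]
    }
  }
}
```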
51
Q

What export options are available from log searches?

A

Saved View, Dashboard widget, Monitor, Metric, CSV, Share

52
Q

What three elements does a saved view keep track of?

A

A search query with time range, a custom default visualization, and a selected subset of facets

53
Q

List three common Agent issues to check when troubleshooting.

A

Hostname detection issues, internet connectivity, proxy configuration, API key configuration, site configuration, multiple Agents running, and failing to restart after config changes
54
Q

What are three common log ingestion issues?

A

Logs daily quota reached, timestamp outside the ingestion window, unable to parse the timestamp, truncated logs

55
Q

What is the time limit for log ingestion regarding timestamps?

A

Logs with timestamps older than 18 hours may not be ingested.

56
Q

What is the size limit for log messages in Datadog?

A

75KB for message fields and 25KB for non-message fields.

57
Q

What is the overall size limit for a log in Datadog?

A

1MB

58
Q

What should you do if your Agent container stops right after starting?

A

Check for hostname detection issues.

59
Q

When is an indexed log required for Datadog features?

A

For monitors, aggregations, and many analysis features

60
Q

What configuration file is used for the Datadog Agent's main configuration?

A

datadog.yaml

61
Q

What is the file path for the Datadog Agent configuration on Linux?

A

/etc/datadog-agent/

62
Q

How are logs filtered through pipelines?

A

Based on log attributes; e.g. a log with source:nginx will be processed by the nginx pipeline.

63
Q

What happens to logs that don't match any pipeline and aren't in JSON format?

A

They will not be parsed.

64
Q

How do you check if your Agent is properly configured for a proxy?

A

Verify the proxy settings in the Agent configuration file.

65
Q

What should you do after editing a YAML configuration file for the Datadog Agent?

A

Restart the Datadog Agent.