Vocabulary Bank Flashcards

(31 cards)

1
Q

Backpropagation through time (BPTT)

A

A gradient-based technique for training RNNs by unfolding them in time and applying backpropagation to update all of the network's parameters
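
For example, a minimal sketch in plain Python (an illustration, not any library's implementation) that unrolls a scalar RNN h_t = w * h_{t-1} + x_t and accumulates the gradient of the loss with respect to the shared weight w at every time step:

    xs, w = [0.5, -0.1, 0.3], 0.8       # toy input sequence and shared weight
    hs = [0.0]                          # hidden states, starting from h_0 = 0
    for x in xs:                        # forward pass: unfold the RNN in time
        hs.append(w * hs[-1] + x)
    loss = 0.5 * (hs[-1] - 1.0) ** 2    # squared error against a target of 1.0

    grad_w, grad_h = 0.0, hs[-1] - 1.0  # dL/dh at the final time step
    for t in reversed(range(len(xs))):  # backward pass through time
        grad_w += grad_h * hs[t]        # contribution of step t to dL/dw
        grad_h *= w                     # propagate the gradient to h_{t-1}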

2
Q

Batch size

A

The number of training examples processed in one forward/backward pass through the network before the loss, and subsequently the gradients, are calculated
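
For instance, a minimal Python sketch of a training loop that processes a toy dataset in mini-batches of 16 examples:

    dataset = list(range(100))                 # 100 toy training examples
    batch_size = 16
    for i in range(0, len(dataset), batch_size):
        batch = dataset[i:i + batch_size]      # up to 16 examples per forward/backward pass
        # forward pass on `batch`, compute the loss, backpropagate, update parameters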

3
Q

Bag-of-words

A

A text representation method in NLP where a document is represented as a
vector of word frequencies, ignoring grammar and word order.
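
For example, a minimal sketch in plain Python that turns each document into a vector of word counts over a shared vocabulary:

    from collections import Counter

    docs = ["the cat sat", "the cat sat on the mat"]
    vocab = sorted({word for doc in docs for word in doc.split()})
    counts = [Counter(doc.split()) for doc in docs]
    vectors = [[c[word] for word in vocab] for c in counts]
    # vocab   -> ['cat', 'mat', 'on', 'sat', 'the']
    # vectors -> [[1, 0, 0, 1, 1], [1, 1, 1, 1, 2]]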

4
Q

Biases

A

Systematic errors in a dataset that can lead to unfair outcomes in a model

5
Q

Types of Biases

A

Confirmation
Historical
Labeling
Linguistic
Sampling
Selection

6
Q

Dataset

A

A collection of data used for training or evaluating machine learning models

7
Q

Deep learning

A

A subset of machine learning involving neural networks with many layers that
can learn representations of data

8
Q

Graphics processing unit (GPU)

A

A specialized hardware component designed to handle and
accelerate parallel processing tasks, particularly effective for rendering graphics and training
deep learning models by performing simultaneous computations across multiple cores

9
Q

Hyperparameter tuning

A

The process of optimizing the configuration settings that govern the training of machine learning models, such as the learning rate or batch size, to improve performance
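
For example, a minimal grid-search sketch in plain Python; train_and_evaluate is a hypothetical stand-in for a real training run that returns a validation score:

    from itertools import product

    def train_and_evaluate(lr, batch_size):
        # hypothetical scoring function standing in for training a model
        return -(lr - 0.05) ** 2 - (batch_size - 24) ** 2 / 1000

    grid = product([0.01, 0.03, 0.1], [16, 32, 64])
    best = max(grid, key=lambda hp: train_and_evaluate(*hp))
    # best -> the (learning rate, batch size) pair with the highest score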

10
Q

Large language model (LLM)

A

A type of AI model trained on vast amounts of text data to
understand and generate human-like text

11
Q

Latency

A

The delay between the input to a system and the corresponding output.

12
Q

Learning rate

A

Controls the size of the steps the model takes when updating its parameters during training; if the learning rate is increased, the weights and biases of the network are updated more significantly in each iteration
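
For example, with the standard gradient-descent update w <- w - lr * grad, a larger learning rate takes a proportionally larger step (values purely illustrative):

    weight, grad = 2.0, 0.5
    for lr in (0.01, 0.1):
        print(weight - lr * grad)   # 1.995 with lr = 0.01, 1.95 with lr = 0.1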

13
Q

Long short-term memory (LSTM)

A

A type of RNN designed to remember information for long
periods and mitigate the vanishing gradient problem

14
Q

Long-term dependency

A

The challenge in sequence models, such as recurrent neural networks (RNNs), of capturing and utilizing information from earlier in the input sequence to make accurate predictions at later time steps

15
Q

Loss function

A

A function that measures the difference between the predicted output and the
actual output, guiding model training.
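
For instance, mean squared error, one common loss function, computed in plain Python:

    predicted = [2.5, 0.0, 2.1]
    actual = [3.0, -0.5, 2.0]
    mse = sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual)
    # (0.25 + 0.25 + 0.01) / 3 = 0.17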

16
Q

Memory cell state

A

In LSTM networks, the cell state carries long-term memory through the
network, allowing it to retain information across time steps

17
Q

Natural language processing

A

The field of AI focused on the interaction between computers
and human language

18
Q

Discourse integration

A

Understanding and maintaining coherence across multiple
sentences or turns in conversation

19
Q

Lexical analysis

A

The process of examining the structure of words.

20
Q

Pragmatic analysis

A

Understanding language in context, including the intended
meaning and implications

21
Q

Semantic analysis

A

The process of understanding the meaning of words and sentences

22
Q

Syntactical analysis (parsing)

A

Analyzing the grammatical structure of sentences

23
Q

Natural language understanding (NLU)

A

A modular set of systems that sequentially process text input to better represent its meaning before it is fed into a neural network such as a transformer NN or an LSTM

24
Q

Pre-processing

A

The process of cleaning and preparing raw data for analysis or model training
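
For example, a minimal text pre-processing sketch in plain Python (lowercasing, stripping punctuation, tokenizing):

    import string

    raw = "  The CAT sat on the mat!! "
    cleaned = raw.strip().lower().translate(str.maketrans("", "", string.punctuation))
    tokens = cleaned.split()   # ['the', 'cat', 'sat', 'on', 'the', 'mat']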

25

Q

Recurrent neural network (RNN)

A

A type of neural network designed to handle sequential data by maintaining a hidden state that captures information from previous time steps

26

Q

Self-attention mechanism

A

A technique in neural networks where each element of the input sequence attends to every other element and weighs its relevance, improving the model's ability to capture dependencies and relationships within the sequence

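For example, a minimal NumPy sketch of scaled dot-product self-attention; the identity projection matrices are used purely for illustration:

    import numpy as np

    def self_attention(X, Wq, Wk, Wv):
        Q, K, V = X @ Wq, X @ Wk, X @ Wv                # queries, keys, values
        scores = Q @ K.T / np.sqrt(K.shape[-1])         # pairwise relevance scores
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the sequence
        return weights @ V                              # relevance-weighted mix of values

    X = np.random.randn(4, 8)   # a sequence of 4 tokens, each of dimension 8
    out = self_attention(X, np.eye(8), np.eye(8), np.eye(8))   # shape (4, 8)
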
27

Q

Synthetic data

A

Data that is artificially generated rather than obtained by direct measurement.

28

Q

Tensor processing unit (TPU)

A

A type of hardware accelerator specifically designed by Google to speed up machine learning workloads

29

Q

Transformer neural network (transformer NN)

A

A type of neural network architecture that relies on self-attention mechanisms to process input data in parallel, rather than sequentially like RNNs

30

Q

Vanishing gradient

A

A problem in training deep neural networks where gradients diminish exponentially as they are backpropagated through the network, impeding learning

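For instance, because backpropagation multiplies the gradient by a local derivative at every layer, any derivative below 1 shrinks the gradient exponentially with depth:

    local_derivative = 0.25
    for depth in (10, 50):
        print(local_derivative ** depth)   # ~9.5e-07 at depth 10, ~7.9e-31 at depth 50
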
31

Q

Weights

A

The parameters in a neural network that are adjusted during training to minimize the loss function.