Class Ten Flashcards

1
Q

What is TensorFlow?

A

TensorFlow is an open-source machine learning framework developed by Google. It provides a flexible and efficient platform for building and deploying various machine learning models, including neural networks.
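A minimal sketch of what working with TensorFlow looks like, using the Keras API that ships with it; the layer sizes and input shape here are illustrative only.

```python
import tensorflow as tf

# Define and compile a small feed-forward model with the Keras API
# bundled in TensorFlow. All sizes are illustrative.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),                   # 10 input features
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```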

2
Q

What is a Recurrent Neural Network (RNN)?

A

A Recurrent Neural Network is a neural network architecture designed to process sequential data. Its recurrent connections allow information to persist across time steps, enabling it to model temporal dependencies.
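A minimal Keras sketch, assuming batches of 30-step sequences with 8 features per step (both numbers illustrative); the SimpleRNN layer carries a hidden state from one time step to the next.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(30, 8)),   # (time steps, features per step)
    tf.keras.layers.SimpleRNN(32),   # hidden state persists across steps
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
```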

3
Q

What are the advantages of using Recurrent Neural Networks?

A

Advantages of using RNNs include their ability to handle variable-length sequences, capture temporal dependencies, and generate predictions or new sequences conditioned on earlier context.
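One way to see the variable-length advantage in practice is Keras masking: sequences are padded to a common length, and mask_zero tells the RNN to skip the padding. The token IDs and layer sizes below are made up.

```python
import tensorflow as tf

# Three sequences of different lengths, zero-padded to a common length.
padded = tf.keras.preprocessing.sequence.pad_sequences(
    [[1, 2, 3], [4, 5], [6]], padding="post")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(None,)),                    # any sequence length
    tf.keras.layers.Embedding(input_dim=10, output_dim=4,
                              mask_zero=True),        # ignore padded steps
    tf.keras.layers.SimpleRNN(8),
    tf.keras.layers.Dense(1),
])
print(model(padded).shape)  # (3, 1): one prediction per sequence
```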

4
Q

What is the vanishing gradient problem in Recurrent Neural Networks?

A

The vanishing gradient problem is the tendency of gradients to shrink exponentially as they are backpropagated through many time steps when training an RNN, making it difficult for the network to learn long-term dependencies.
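A back-of-the-envelope illustration: backpropagation through T time steps multiplies the gradient by the recurrent Jacobian at each step, so a per-step factor even slightly below 1 wipes out the signal from early steps (the factor 0.9 is illustrative).

```python
grad = 1.0
factor = 0.9                    # illustrative per-step gradient factor < 1
for _ in range(100):            # backpropagate through 100 time steps
    grad *= factor
print(f"gradient after 100 steps: {grad:.2e}")  # ~2.7e-05, nearly zero
```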

5
Q

What is the exploding gradient problem in Recurrent Neural Networks?

A

The exploding gradient problem occurs when the gradients in an RNN grow extremely large during training, causing numerical instability (huge updates or NaNs) and making convergence difficult.
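The mirror image of the vanishing-gradient demonstration above: with a per-step factor even slightly above 1, the gradient grows exponentially (again, the factor is illustrative).

```python
grad = 1.0
factor = 1.1                    # illustrative per-step gradient factor > 1
for _ in range(100):            # backpropagate through 100 time steps
    grad *= factor
print(f"gradient after 100 steps: {grad:.2e}")  # ~1.4e+04 and climbing
```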

6
Q

How can the vanishing gradient problem be mitigated in Recurrent Neural Networks?

A

The vanishing gradient problem in RNNs can be mitigated by using gated cells such as LSTM (Long Short-Term Memory) or GRU (Gated Recurrent Unit), activation functions like ReLU, careful initialization of network weights, and gradient clipping (though clipping chiefly addresses exploding gradients).
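A sketch combining two of these mitigations in Keras: an LSTM layer in place of a plain RNN, and gradient clipping via the optimizer's clipnorm argument (the shapes and clip value are illustrative).

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(30, 8)),
    tf.keras.layers.LSTM(32),            # gated cell instead of SimpleRNN
    tf.keras.layers.Dense(1),
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(clipnorm=1.0),  # clip gradient norm
    loss="mse",
)
```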

7
Q

What is Distributed Deep Learning?

A

Distributed Deep Learning refers to the training of deep learning models using multiple computing resources or devices, such as multiple machines or GPUs, to speed up the training process and handle larger datasets.
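A minimal sketch using TensorFlow's MirroredStrategy, which performs synchronous data-parallel training across all GPUs visible on one machine; the model itself is a placeholder.

```python
import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()   # all local GPUs by default
print("replicas:", strategy.num_replicas_in_sync)

with strategy.scope():                        # variables are mirrored
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(10,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
# model.fit(...) then splits each batch across the replicas.
```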

8
Q

What are the advantages of Distributed Deep Learning?

A

Advantages of Distributed Deep Learning include reduced training time, the ability to train larger and more complex models, improved scalability, and more efficient utilization of available hardware.

9
Q

What are the challenges in Distributed Deep Learning?

A

Challenges in Distributed Deep Learning include communication overhead between devices, data synchronization, load balancing, fault tolerance, and efficient parallelization of computations.

10
Q

How can model parallelism and data parallelism be utilized in Distributed Deep Learning?

A

Model parallelism involves distributing different parts of a neural network across multiple devices, while data parallelism involves replicating the model on each device and splitting the data for parallel training. Both techniques enable distributed training and can be combined based on the requirements of the model.
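A toy sketch of model parallelism using tf.device to pin successive layers to different devices; the device names assume two visible GPUs, and data parallelism is what the MirroredStrategy example above demonstrates.

```python
import tensorflow as tf

# Model parallelism in miniature: each stage of the network lives on its
# own device. Device names assume two GPUs are available.
inputs = tf.keras.Input(shape=(128,))
with tf.device("/GPU:0"):
    hidden = tf.keras.layers.Dense(256, activation="relu")(inputs)
with tf.device("/GPU:1"):
    outputs = tf.keras.layers.Dense(10)(hidden)
model = tf.keras.Model(inputs, outputs)
```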

11
Q

What is parameter server architecture in Distributed Deep Learning?

A

Parameter server architecture is a distributed computing approach where a central parameter server stores and updates the model parameters, while multiple worker nodes perform computations and communicate with the parameter server.
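A hedged sketch of TensorFlow's ParameterServerStrategy: variables live on "ps" tasks, "worker" tasks run the training steps, and the cluster layout is normally supplied through the TF_CONFIG environment variable (not shown here).

```python
import tensorflow as tf

# Assumes TF_CONFIG describes a cluster with "ps" and "worker" tasks.
resolver = tf.distribute.cluster_resolver.TFConfigClusterResolver()
strategy = tf.distribute.ParameterServerStrategy(resolver)

with strategy.scope():                 # variables placed on the ps tasks
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(10,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
```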

12
Q

What are some techniques for improving the performance of Distributed Deep Learning?

A

Techniques for improving the performance of Distributed Deep Learning include optimizing network communication, overlapping communication with computation, efficient data loading and preprocessing, and distributed optimization algorithms.
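One of these techniques in concrete form: a tf.data input pipeline that parallelizes preprocessing and prefetches batches so data loading overlaps with computation (the toy data and transformation are made up).

```python
import tensorflow as tf

AUTOTUNE = tf.data.AUTOTUNE
features = tf.random.normal([1000, 10])      # toy data, illustrative only
labels = tf.random.normal([1000, 1])

dataset = (
    tf.data.Dataset.from_tensor_slices((features, labels))
    .shuffle(1000)
    .map(lambda x, y: (x * 2.0, y),          # stand-in preprocessing step
         num_parallel_calls=AUTOTUNE)        # parallel preprocessing
    .batch(32)
    .prefetch(AUTOTUNE)                      # overlap loading with training
)
```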
