ch4 - basic Flashcards
(43 cards)
Threads run within applications?
True
thread creation is (heavyweight)/(lightweight)
lightweight
while process creation is heavyweight
draw a single threaded process
+-----------------------------------+
|   code    |   data    |   files   |
+-----------------------------------+
|             registers             |
+-----------------------------------+
|                PC                 |
+-----------------------------------+
|               stack               |
+-----------------------------------+
|           thread → ~~~            |
+-----------------------------------+
draw a multithreaded process
+-----------------------------------+
|   code    |   data    |   files   |
+-----------------------------------+
| registers | registers | registers |
+-----------+-----------+-----------+
|   stack   |   stack   |   stack   |
+-----------+-----------+-----------+
|    PC     |    PC     |    PC     |
+-----------+-----------+-----------+
| thread ~~~| thread ~~~| thread ~~~|
+-----------+-----------+-----------+
What is the basic flow of a multithreaded server when handling a client request?
1. Client sends a request to the server.
2. Server creates a new thread to handle the request.
3. Server immediately resumes listening for additional client requests.
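The flow above can be sketched in Python, with a plain list standing in for the network; names like `handle_request` are illustrative, not a real server API. The main loop spawns a handler thread per request and immediately goes back to accepting the next one.

```python
# Sketch of the thread-per-request server flow (no real sockets).
import threading

handled = []
lock = threading.Lock()

def handle_request(request):
    # Work for this client runs in its own thread.
    with lock:
        handled.append(f"done: {request}")

workers = []
for request in ["req-1", "req-2", "req-3"]:   # stand-in for accepting clients
    t = threading.Thread(target=handle_request, args=(request,))
    t.start()            # hand the request off to a new thread ...
    workers.append(t)    # ... and resume "listening" immediately

for t in workers:
    t.join()
print(sorted(handled))  # ['done: req-1', 'done: req-2', 'done: req-3']
```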
what are the benefits of Multithreaded Server Architecture?
1. Responsiveness – may allow continued execution if part of the process is blocked; especially important for user interfaces.
2. Resource Sharing – threads share the resources of their process, which is easier than shared memory or message passing.
3. Economy – cheaper than process creation; thread switching has lower overhead than context switching.
4. Scalability – a process can take advantage of multicore architectures.
What is the “dividing activities” challenge in multicore systems?
Breaking a program into independent tasks that can run in parallel without interfering with each other.
What is the “balance” challenge in multicore systems?
Ensuring that all processing cores are kept busy and no core is overloaded or idle too long.
What is the “data splitting” challenge in multicore systems?
Dividing data into parts so that different threads can work on them independently without conflicts.
What is the “data dependency” challenge in multicore systems?
Managing situations where tasks depend on each other’s results, which can cause delays or require careful synchronization.
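A minimal sketch of such a dependency in Python: the consumer needs the producer's result, so a `threading.Event` makes it wait until the value is published.

```python
# Sketch: one task depends on another's result, so the threads synchronize.
import threading

result = {}
ready = threading.Event()

def producer():
    result["value"] = 21 * 2   # the value the consumer depends on
    ready.set()                # signal that it is available

def consumer(out):
    ready.wait()               # block until the producer signals
    out.append(result["value"] + 1)

out = []
c = threading.Thread(target=consumer, args=(out,))
p = threading.Thread(target=producer)
c.start()
p.start()
c.join()
p.join()
print(out)  # [43]
```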
What is the “testing and debugging” challenge in multicore systems?
It’s harder to test and debug parallel programs because bugs may only appear under specific, rare timing conditions (heisenbugs).
What does parallelism mean in operating systems?
Parallelism means the system can truly perform more than one task at the exact same time.
What does concurrency mean in operating systems?
Concurrency means the system supports making progress on multiple tasks at once, even if not truly executing them simultaneously.
Can a single processor provide concurrency? How?
Yes, by using a scheduler that rapidly switches between tasks to give the illusion that they are running at the same time.
Can a single processor provide parallelism? How?
No. A single processor can execute only one instruction stream at any instant, so it can interleave tasks (concurrency) but cannot run them truly simultaneously.
What’s the difference between concurrency and parallelism?
Concurrency = making progress on multiple tasks (may or may not be truly simultaneous).
Parallelism = multiple tasks executing exactly at the same time.
Draw concurrent execution on single-core system
|T1|T2|T3|T4|…|T2|
Draw parallelism on a multi-core system
core1:|T1|T3|T1|T3|…|T1|
core2:|T2|T4|T2|T4|…|T2|
how many types of parallelism?
2
What are the types of parallelism?
Data parallelism – distributes subsets of the same data across multiple cores, same operation on each
Task parallelism – distributing threads across cores, each thread performing unique operation
What is Data parallelism?
Data parallelism involves splitting a large dataset into smaller chunks, which are processed simultaneously across multiple processors. Each processor works on a part of the data using the same operation, often seen in tasks like vector or matrix computations.
What is task parallelism?
Task parallelism divides a program into distinct tasks, where each task performs a different operation on the data. These tasks run concurrently, and each task may have its own specific computation.
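By contrast, a task-parallelism sketch runs two different operations concurrently on the same data, each in its own thread:

```python
# Sketch of task parallelism: distinct operations, one thread each.
import threading

data = [5, 3, 8, 1]
results = {}

def total_task():
    results["total"] = sum(data)   # one task: summing

def max_task():
    results["max"] = max(data)     # a different task: finding the maximum

threads = [threading.Thread(target=total_task),
           threading.Thread(target=max_task)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results["total"], results["max"])  # 17 8
```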
compare data parallelism and task parallelism
Both data parallelism and task parallelism involve executing multiple operations simultaneously to improve performance.
Both techniques aim to optimize computational efficiency by leveraging multiple processors or cores.
contrast data parallelism and task parallelism
Data Parallelism: Focuses on splitting large datasets into smaller chunks and performing the same operation on each chunk.
Task Parallelism: Divides a program into distinct tasks, each performing different operations, and runs them concurrently.
Data Parallelism: Typically requires less coordination between tasks as they perform the same operation on different data.
Task Parallelism: Often involves more complex communication and synchronization between tasks, as they may depend on one another.