Programming Resources
For Fun and Learning
Charles Cusack
Computer Science
Hope College
Reading Questions from A Sophomoric Introduction to Shared-Memory Parallelism and Concurrency (SIPC), Grossman.

  • Chapter 2-3.3:
    1. Define parallelism and concurrency
    2. If you have a Thread T, what is the difference between calling T.run() and T.start()?
    3. What does it mean that multi-threaded programs are nondeterministic?
    4. What does it mean for a call to a method (e.g. join) to block?
    5. Explain what fork and join mean in the context of parallel programs.
  • Chapter 3.4-3.6:
    1. What is the difference between RecursiveAction and RecursiveTask? When should each be used?
    2. How many ForkJoinPools should your program contain? How many calls to invoke should your program have?
    3. What is a reduction? Give an example of a computation that is a reduction.
    4. What is a map? Give an example of a computation that is a map.
    5. What is a downside of having the number of threads be about the same as the number of processors?
    6. What is the downside of having too many threads?
  • Chapter 4-5.2:
    1. Define each of the following, giving the appropriate notation and/or formula:
      • Work
      • Span
      • Speedup
      • Parallelism
    2. In a sentence or two, what does Amdahl's Law say?
    3. If half of an algorithm cannot be parallelized, what is the best speedup you can attain no matter how many processors you have?
  • Chapter 5:
    1. What is the work and span of the parallel-prefix sum algorithm?
    2. What is the work and span of the parallel pack algorithm?
    3. What is the work and span of the parallel quicksort algorithm?
    4. What is the work and span of the parallel mergesort algorithm?
    5. What other parallel algorithm(s) is/are used to implement parallel quicksort?
  • Chapter 6:
    1. What does it mean for calls to interleave?
    2. Why can't you prevent bad interleavings by using a boolean variable that only allows one thread to access a segment of code at a time? (Hint: I should probably have used the phrase "attempt to" in the previous sentence.)
    3. Instead of a boolean variable, something called a lock should be used. Why do locks work when boolean variables do not?
    4. If a method is synchronized, what object is used as the lock?
    5. What is a reentrant lock and why is it important?
  • Chapter 7:
    1. Define race condition.
    2. Define bad interleaving.
    3. Define data race.
    4. What must be true of intermediate states for a concurrent program to be correct?
    5. What do compilers potentially have to do with data races?
  • Chapter 8-9:
    1. What are two things you can do (regarding data) to make concurrent programming easier?
    2. Give the 5 synchronization guidelines, including a sentence or two that explains each in a little more detail.
    3. Why didn't we need to worry about synchronization when we discussed algorithms like prefix sum and pack?
    4. Define deadlock.
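As a study aid for question 2 of the Chapter 2-3.3 block, the following sketch shows the difference between T.run() and T.start(): run() is an ordinary method call that executes the thread's body in the current thread, while start() asks the JVM to execute it concurrently in a new thread. The class and method names here are illustrative, not from SIPC.

```java
public class RunVsStart {
    // Returns the name of the thread that actually executed the task.
    static String execute(boolean useStart) throws InterruptedException {
        final String[] name = new String[1];
        Thread t = new Thread(() -> name[0] = Thread.currentThread().getName());
        if (useStart) {
            t.start();  // new thread runs the body concurrently
            t.join();   // join blocks the caller until t finishes (question 4)
        } else {
            t.run();    // ordinary method call: body runs in the *current* thread
        }
        return name[0];
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("run():   " + execute(false)); // the current thread's name
        System.out.println("start(): " + execute(true));  // a new thread's name
    }
}
```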
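For the Chapter 3.4-3.6 questions, here is a minimal sketch of a reduction (summing an array) in Java's fork-join framework. Because each subtask returns a value, it extends RecursiveTask; a RecursiveAction would be used if subtasks returned nothing. The single ForkJoinPool and single invoke call reflect question 2; the cutoff value is an illustrative choice, not prescribed by SIPC.

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

public class ArraySum extends RecursiveTask<Long> {
    static final int CUTOFF = 1000;                      // below this, sum sequentially
    static final ForkJoinPool POOL = new ForkJoinPool(); // one pool for the whole program

    final int[] arr;
    final int lo, hi;
    ArraySum(int[] arr, int lo, int hi) { this.arr = arr; this.lo = lo; this.hi = hi; }

    @Override protected Long compute() {
        if (hi - lo <= CUTOFF) {
            long sum = 0;
            for (int i = lo; i < hi; i++) sum += arr[i];
            return sum;
        }
        int mid = (lo + hi) / 2;
        ArraySum left = new ArraySum(arr, lo, mid);
        ArraySum right = new ArraySum(arr, mid, hi);
        left.fork();               // fork: start the left half in parallel
        long r = right.compute();  // do the right half in this thread
        long l = left.join();      // join: wait for the forked subtask's result
        return l + r;
    }

    static long sum(int[] arr) {
        return POOL.invoke(new ArraySum(arr, 0, arr.length)); // one invoke call
    }
}
```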
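For the Chapter 4-5.2 questions on Amdahl's Law, this small sketch evaluates the usual formula: if a fraction S of the work is sequential, the speedup on P processors is at most 1 / (S + (1 - S) / P). The method name is illustrative.

```java
public class Amdahl {
    // Maximum speedup on `processors` processors when `sequentialFraction`
    // of the algorithm cannot be parallelized (Amdahl's Law).
    static double speedup(double sequentialFraction, int processors) {
        return 1.0 / (sequentialFraction + (1.0 - sequentialFraction) / processors);
    }

    public static void main(String[] args) {
        // If half the algorithm is sequential (question 3), even a huge
        // number of processors cannot push the speedup past 2.
        System.out.println(speedup(0.5, 1_000_000)); // approaches 2.0
    }
}
```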
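Relating to the Chapter 6 questions: a boolean "busy" flag fails because checking the flag and setting it are two separate steps, so two threads can both observe it as false and both enter the critical section. Locks make acquisition atomic. The sketch below uses Java's built-in synchronized; for a synchronized instance method the lock is the receiver object, this (question 4). The counter is an illustrative example, not taken from SIPC.

```java
public class Counter {
    private int count = 0;

    // synchronized instance method: the lock used is `this`
    public synchronized void increment() { count++; }
    public synchronized int get() { return count; }

    public static void main(String[] args) throws InterruptedException {
        Counter c = new Counter();
        Thread[] ts = new Thread[4];
        for (int i = 0; i < ts.length; i++) {
            ts[i] = new Thread(() -> {
                for (int j = 0; j < 100_000; j++) c.increment();
            });
            ts[i].start();
        }
        for (Thread t : ts) t.join();
        // With synchronization the increments never interleave badly,
        // so the total is always 4 * 100000.
        System.out.println(c.get());
    }
}
```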
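For the deadlock question in the Chapter 8-9 block: deadlock arises when two threads each hold one lock while waiting for the other's. A standard remedy, sketched here with a hypothetical bank-account transfer (the class and fields are assumptions, not from SIPC), is to acquire locks in a fixed global order, here by account id, so a cycle of waiting threads can never form.

```java
public class Account {
    final int id;
    private long balance;
    Account(int id, long balance) { this.id = id; this.balance = balance; }

    long balance() { synchronized (this) { return balance; } }

    static void transfer(Account from, Account to, long amount) {
        // Always lock the smaller-id account first. If every transfer follows
        // the same order, no two threads can hold each other's lock.
        Account first  = from.id < to.id ? from : to;
        Account second = from.id < to.id ? to : from;
        synchronized (first) {
            synchronized (second) {
                from.balance -= amount;
                to.balance += amount;
            }
        }
    }
}
```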