Speedup
Sp(n) = Ts(n)/Tp(n)
Cost
Cp(n) = p x Tp(n)
Efficiency
Captures the fraction of time for which a processor is usefully employed
Ep(n) = Sp(n)/p = (Ts(n)/Tp(n))/p = Ts(n)/(p x Tp(n)) = Ts(n)/Cp(n)
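The three metrics above can be checked numerically. A minimal sketch with hypothetical timings (Ts = 100 s serial, Tp = 30 s on p = 4 processors; the numbers are illustrative, not from the notes):

```python
# Hypothetical timings for a problem of fixed size n.
Ts = 100.0   # serial execution time Ts(n)
Tp = 30.0    # parallel execution time Tp(n) on p processors
p = 4

speedup = Ts / Tp         # Sp(n) = Ts(n) / Tp(n)
cost = p * Tp             # Cp(n) = p x Tp(n)
efficiency = speedup / p  # Ep(n) = Sp(n) / p = Ts(n) / Cp(n)

print(speedup)     # about 3.33
print(cost)        # 120.0
print(efficiency)  # about 0.83
```

Note that efficiency is just speedup normalised by the number of processors, so it always lies in (0, 1] when speedup does not exceed p.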
Effect of overhead
Ideal
Tp(n) = Ts(n) / p
Typical
Tp(n) = Ts(n)/p + Toverhead
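The effect of a fixed overhead can be seen by comparing the resulting speedup against the ideal Sp = p. A sketch assuming Ts = 100 and a constant overhead of 2 (hypothetical values):

```python
Ts = 100.0
overhead = 2.0  # fixed parallel overhead, independent of p

def Tp(p):
    # Typical case: ideal division of work plus a constant overhead.
    return Ts / p + overhead

for p in (1, 4, 16, 64):
    print(p, Ts / Tp(p))  # speedup stays below p and below Ts/overhead
```

As p grows, Tp(p) approaches the overhead term, so the speedup flattens out at Ts / overhead instead of growing linearly.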
Amdahl’s Law
Sp(n) = Ts(n) / (r x Ts(n) + ((1 - r) x Ts(n)) / p) = 1 / (r + (1 - r)/p)
r = serial (non-parallelisable) fraction of the program
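Amdahl's Law can be sketched directly from the simplified form Sp = 1 / (r + (1 - r)/p). With a serial fraction r = 0.1 (an assumed value for illustration), the speedup can never exceed 1/r = 10:

```python
def amdahl_speedup(r, p):
    # r: serial fraction of the program; (1 - r): parallelisable fraction
    return 1.0 / (r + (1.0 - r) / p)

print(amdahl_speedup(0.1, 10))    # about 5.26
print(amdahl_speedup(0.1, 1000))  # about 9.91, approaching the limit 1/r = 10
```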
Limitations of Amdahl’s Law
Amdahl's Law assumes a fixed problem size and a fixed serial fraction r. In practice, the problem size often grows with the number of processors, and the serial part then accounts for a smaller share of the total work; this observation motivates Gustafson's Law.
Gustafson’s Law
Sp(n) = (C + tp(n, 1)) / (C + tp(n, p))
C = Constant execution time of the sequential part of the program
tp(n, p) = Execution time of the parallelisable part of the program
tp(n, 1) = Ts(n) - C (if the parallel part is perfectly parallelisable)
tp(n, p) = (Ts(n) - C) / p
Sp(n) = (C + Ts(n) - C) / (C + (Ts(n) - C) / p) = (C / (Ts(n) - C) + 1) / (C / (Ts(n) - C) + 1/p)
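A numeric sketch of Gustafson's Law, using the form before the simplification and assumed values Ts = 100 and C = 10:

```python
def gustafson_speedup(Ts, C, p):
    # C: constant time of the sequential part
    # Ts - C: perfectly parallelisable part, divided evenly over p processors
    return (C + (Ts - C)) / (C + (Ts - C) / p)

print(gustafson_speedup(100.0, 10.0, 10))  # 100 / (10 + 90/10) = 100/19, about 5.26
```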
Scalability
A problem is scalable if it can handle ever increasing problem sizes
Ts = n = problem size
Tp = n/p + 1; the program has a fixed overhead of 1, independent of the problem size
Then
E = Ts / (p x Tp) = n / (p x (n/p + 1)) = n / (n + p)
Strongly Scalable
If the efficiency stays more or less the same when we increase the number of processes/threads
without increasing the problem size, the program is said to be strongly scalable.
Weakly Scalable
If the efficiency stays more or less the same when we increase the problem size at the same rate as
the number of processes/threads, then the program is said to be weakly scalable.
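Both notions can be demonstrated with the efficiency formula E = n / (n + p) derived above (processor counts and problem sizes below are assumed for illustration):

```python
def efficiency(n, p):
    # E = Ts / (p x Tp) = n / (p x (n/p + 1)) = n / (n + p)
    return n / (n + p)

# Fixed problem size: efficiency falls as p grows, so this program
# is not strongly scalable.
print([round(efficiency(1000, p), 3) for p in (10, 100, 1000)])

# Grow n at the same rate as p (here n = 10 * p): efficiency stays
# constant, so the program is weakly scalable.
print([round(efficiency(10 * p, p), 3) for p in (10, 100, 1000)])
```

In the second run every efficiency equals 10p / (10p + p) = 10/11, independent of p, which is exactly the weak-scalability condition.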