I was reading a book which remarks:
Assume the probability of misprediction is p, the time to execute the code without misprediction is T_OK, and the misprediction penalty is T_MP. Then the average time to execute the code as a function of p is:
T_avg(p) = (1 − p)T_OK + p(T_OK + T_MP)
I'm a little bit confused; shouldn't it be:
T_avg(p) = (1 − p)T_OK + pT_MP
For example, say p is 0.5, the CPU takes 10 clock cycles when the branch prediction is correct, and it takes 20 clock cycles when the prediction is incorrect. Isn't the average then 0.5(10 + 20) = 15 clock cycles?
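To make the comparison concrete, here is a small sketch of both formulas with my example numbers (assuming T_OK = 10 and treating 20 either as an extra penalty, as the book does, or as the total misprediction time, as I did):

```python
def t_avg_book(p, t_ok, t_mp):
    # Book's formula: T_MP is a *penalty* paid on top of T_OK
    # whenever the branch is mispredicted.
    return (1 - p) * t_ok + p * (t_ok + t_mp)

def t_avg_mine(p, t_ok, t_miss_total):
    # My reading: the second number is the *total* time taken
    # on a misprediction, not an extra penalty.
    return (1 - p) * t_ok + p * t_miss_total

print(t_avg_book(0.5, 10, 20))  # 20.0 cycles if 20 is a penalty
print(t_avg_mine(0.5, 10, 20))  # 15.0 cycles if 20 is the total time
```

So the two formulas only disagree on whether the 20 cycles already include the 10-cycle base time.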