When something is "2x faster", does that mean it is 200% faster? I have seen "x" and "%" used in ways where the answer would be yes, and in ways where it would be no. My impression from working in computer architecture and hardware acceleration is that there is a general rule people follow, but since the terminology is somewhat ambiguous, Mac and I formalized how we use these terms some time back.
1) We have a formal understanding of what it means to be Y% faster: it means the new performance is 100% + Y% of the original. Slower works the same way but with subtraction, so Y% slower means the new performance is 100% - Y%. Therefore, if A is 50% faster than B, B is 33% slower than A.
2) Higher-performance x's work similarly, but without adding the 100%: 3x faster means the final performance is 300% of the original. Lower-performance x's work quite differently, turning the x into a division sign "/": 5x slower means the new performance is 1/5, or 20%, of the original. Therefore, if A is 30x faster than B, B is 30x slower than A.
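Rule (2) can be sketched the same way; again the function names are my own:

```python
# Rule (2): "Nx faster" multiplies the baseline by N (no added 100%),
# while "Nx slower" divides it by N.

def x_faster(n):
    """Performance multiplier implied by 'Nx faster'."""
    return n            # 3x faster -> 300% of the original

def x_slower(n):
    """Performance multiplier implied by 'Nx slower'."""
    return 1 / n        # 5x slower -> 20% of the original

print(x_faster(3))      # 3, i.e. 300% of the original
print(x_slower(5))      # 0.2, i.e. 20% of the original

# Unlike rule (1), this one is symmetric: if A is 30x faster than B,
# then B is exactly 30x slower than A.
print(1 / x_faster(30) == x_slower(30))  # True
```

It also makes the connection between the two rules explicit: "3x faster" is 300% of the baseline, which under rule (1) would be phrased as "200% faster".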
Here's an example of Mozilla using the same definition for higher-performance x's. The reference concerns the performance of Firefox's new beta; one nice detail is that there is an instance where they round 2.94x up to 3x, and another where they round 3.49x down to 3x. In both cases they are rounding to one significant digit. At Cognitive we typically use two or three significant digits and round down, or in whichever direction gives the more conservative estimate of Cognitive's performance.
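The two rounding policies above can be sketched as follows. Both helpers are my own illustrations, not code from either Mozilla or Cognitive:

```python
import math

def round_sig(x, digits=1):
    """Round x to the given number of significant digits (nearest)."""
    if x == 0:
        return 0.0
    exp = math.floor(math.log10(abs(x)))
    return round(x, digits - 1 - exp)

def floor_sig(x, digits=2):
    """Round a positive speedup DOWN to `digits` significant digits,
    so the quoted figure never overstates the measured speedup."""
    exp = math.floor(math.log10(x))
    scale = 10 ** (digits - 1 - exp)
    return math.floor(x * scale) / scale

print(round_sig(2.94))  # 3.0  (rounds up, as in the Mozilla numbers)
print(round_sig(3.49))  # 3.0  (rounds down)
print(floor_sig(3.49))  # 3.4  (conservative two-significant-digit figure)
```

Rounding down is the conservative direction for a speedup you are claiming for yourself; for a slowdown, the conservative direction would be up.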