For clarification, the graphs aren't comparing JIT vs. interpreted execution; they're comparing whether the JIT tier is allowed at all.
Even with JIT enabled, most functions are still just interpreted, which is why the vast majority of tests come out equal: that code runs interpreted either way. The JS engine only starts to JIT code once it thinks it can realize performance gains on hot functions. You can see this behavior in the "Average improvement and regression" graph, where the JIT starts trading the other stats for performance gains.
Knowing this, the results on the top "daily browsing" sites are about exactly what you'd expect: the JIT is tuned to leave the vast majority of the code on those sites interpreted, since much of it is only called a handful of times or less, so there's little difference. You see a bit of the JIT's tiering kick in where it picks up a few hot pieces of code, at a trade-off on the other stats.
Look beyond "daily browsing" sites into web apps and the like, and that's where the JIT is actually focused and where you'll see the biggest gains. It intentionally stays out of lightweight pages because getting involved there makes no sense: no matter how much time and money is invested in the JIT, the best-performing strategy for some one-pass JS that sets up a page layout will always be to just interpret it.