Can we compare the performance of smartphones and laptops? This is a delicate exercise.
Recently, UL Benchmarks added the new Wild Life benchmark to the graphics tests offered by its famous 3DMark. You can run it on your smartphone (Android and iOS) or your PC (Windows 10); you will find the download links here. Indeed, Wild Life is one of UL Benchmarks' first cross-platform benchmarks.
Wild Life measures the performance of your GPU by rendering a 3D game scene in real time: the faster the scene renders, the higher your benchmark score. 3DMark Wild Life thus makes it possible to evaluate advanced rendering techniques and post-processing effects on many devices, including smartphones. Wild Life uses the Vulkan API on Android devices and Windows PCs; on iOS devices, it uses the Metal API.
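The general principle is simple, even if 3DMark's actual scoring formula is not public: render many frames, time each one, and derive an average frame rate, which the score scales with. A minimal sketch of that idea, with a dummy CPU workload standing in for the real GPU scene render:

```python
# Illustrative sketch only: this is NOT 3DMark's actual method or formula.
# It shows the underlying principle a GPU benchmark relies on: time each
# rendered frame, then report the average frames per second.
import time

def run_benchmark(render_frame, num_frames=100):
    """Time a frame-rendering callable and return its average fps."""
    frame_times = []
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()  # stand-in for rendering one frame of the 3D scene
        frame_times.append(time.perf_counter() - start)
    avg_frame_time = sum(frame_times) / len(frame_times)
    return 1.0 / avg_frame_time  # average frames per second

# Dummy workload standing in for GPU rendering:
fps = run_benchmark(lambda: sum(i * i for i in range(1000)))
```

A real benchmark like Wild Life then maps the measured frame rate onto a points scale so that devices can be ranked with a single number.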
Which is the most powerful?
If we can run the same benchmark on a smartphone and a PC, can we compare the scores? The exercise is easy to reproduce. We ran Wild Life on an iPhone 11 Pro (7,159 points) and an iPhone 12 Pro (6,835 points), but also on a Dell XPS 13 2-in-1 7310 equipped with an Intel Core i7-1065G7 CPU and its Iris Plus iGPU (6,599 points). The results are similar with an ultrabook equipped with an AMD processor and its Radeon iGPU. It would be tempting to conclude that the iPhone is more powerful than recent ultrabooks, but that would be too simple.
Intel, for its part, has already responded. We tested the Dell XPS 13 9310 equipped with the recent Intel Core (Tiger Lake) processor and its Xe graphics architecture. On Wild Life, it gets 12,049 points (50 fps on average), which puts it well above the latest iPhones. And if we run Wild Life on a desktop PC equipped with an Nvidia GeForce RTX 2080 Ti… the score is 70,661 points (400 fps on average).
So why can’t we simply compare smartphones and PCs? Because they are different platforms: just run Fortnite on an iPhone 12 Pro and on an ultrabook running Windows 10 to see that you are playing under totally different conditions. These benchmarks mainly allow you to compare two smartphones with each other, such as a Pixel 5 (1,138 points) and a Galaxy S20 FE 5G (3,982 points). These results show that Apple, with its A13 and A14 chips, is ahead of its Android competition.
Remember that the score generated by these benchmarks does not accurately reflect a user’s experience. Simply put, the numbers do not directly correlate with how a device actually feels in use, and it is easy to go wrong with these benchmarks. This happens even among experts in new technologies, and it is rarely an intentional attempt to mislead readers: it is usually a lack of understanding of these tools. These benchmarks often measure theoretical performance, regardless of battery life, operating systems, apps and games, or real use cases.
You may be wondering, “why should I care?” First of all, if you look at the price history of processors and SoCs, you will find a direct correlation between perceived performance and price. Do not even think of invoking the “Apple rule”: for years the company has dominated not only the smartphone market but also its selling prices.
Our opinion is that benchmarks are supposed to lend credibility to your experience and explain performance differences between devices, whether good or bad. Here, we see that the Wild Life 3D demo appears fluid above 30 frames per second (fps). While the iPhone 12 Pro scores lower than the iPhone 11 Pro (see visual above), the iPhone 12 Pro is the more stable of the two. That is ultimately what should be remembered from this benchmark.