> Lots of people are saying that Apple chips are good only because they have the node advantage. Let's see what they can do within the same power envelope now.
Depending on what you're doing, modern AMD chips are very close to M series chips under load (specifically in efficiency). The biggest issue is that idle power is still higher. My AMD Framework (7840U) draws between 5-10 watts at idle unless I use something like the XTU tuner.
> Single core performance is only useful for artificial benchmarks
That is nonsense that none of the CPU vendors would agree with. In most applications, single-core performance matters very much. Not every algorithm can be multithreaded, and those that can carry unavoidable parallelization overhead; usually only some parts of some applications can be multithreaded at all.
For example, a 20-core 500 MHz CPU is much less capable and responsive in real-world usage than a 5-core 2 GHz CPU, even though both offer the same aggregate cycles per second (and hence, at equal IPC, the same theoretical instruction throughput).
A 100-core 100 MHz CPU would take forever to boot and would feel unusably slow.
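A rough sketch of that point using Amdahl's law, with a made-up workload that's 70% parallelizable (the exact fraction is an assumption, not a measurement):

```python
# Amdahl's-law sketch: relative throughput vs. a 1-core 1 GHz baseline,
# ignoring IPC differences and assuming a 70% parallelizable workload.
def relative_speed(cores: int, ghz: float, parallel_fraction: float = 0.7) -> float:
    serial = 1.0 - parallel_fraction
    # The serial part runs on one core; the parallel part spreads over all cores.
    time = serial / ghz + parallel_fraction / (ghz * cores)
    return 1.0 / time

print(relative_speed(cores=20, ghz=0.5))  # ~1.5x the baseline
print(relative_speed(cores=5, ghz=2.0))   # ~4.5x the baseline, same 10 GHz aggregate
```

Both configurations have the same 10 GHz of aggregate clock, but the serial portion only ever sees one core's clock, so the faster cores win by a wide margin.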
I just showed you that Apple is equal or better in terms of single core performance. This thread is a bunch of childish fanboy nonsense, attaching egos to some brand of CPU manufacturer and ignoring actual benchmarks.
Personally I don't care about $20 price differences. On a developer salary, who gives a shit about price? I own Apple, Intel, and AMD CPUs. They're all good.
>I just showed you that Apple is equal or better in terms of single core performance. This thread is a bunch of childish fanboy nonsense, attaching egos to some brand of CPU manufacturer and ignoring actual benchmarks.
So, just because you used one metric, I shouldn't look at the other metrics?
You said "perf and price". Benchmarks show they are not beating Apple hard at performance. For price there is no straightforward way to compare since you can't buy standalone Apple CPUs.
At that level it's competing on efficiency and is capped by power consumption. It can't reach 5 GHz with only ~80 watts for all cores, so it runs at around 3.5 GHz. The Intel and AMD CPUs draw hundreds of watts and reach 5 GHz+. It's a tradeoff for efficiency: different design decisions, different tradeoffs, not exactly competing in the same market segments.
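To make that tradeoff concrete: dynamic power scales roughly as P ≈ C·V²·f, and near the top of the frequency curve voltage has to rise roughly in step with frequency, so per-core power grows close to the cube of the clock. A back-of-the-envelope sketch (the voltage-tracks-frequency assumption is a simplification, not a measured curve):

```python
# Rough dynamic-power scaling: P ~ C * V^2 * f.
# Assumption: near the top of the V/f curve, voltage rises roughly
# proportionally with frequency, so power scales ~ f^3.
def relative_power(f_target_ghz: float, f_base_ghz: float) -> float:
    scale = f_target_ghz / f_base_ghz
    return scale * scale**2  # one factor of f, two factors of V

print(relative_power(5.0, 3.5))  # ~2.9x the per-core power of the 3.5 GHz design point
```

Which is roughly why an ~80 W all-core budget lands around 3.5 GHz while a 5 GHz+ all-core push lands in the hundreds of watts.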
Mac Pro is often used for video editing. The M2 Ultra has hardware acceleration for video encoding/decoding that would need a separate accelerator card on Intel or AMD to match: "M2 Ultra can support up to 22 streams of 8K ProRes 422 video playback"
Yeah, that's a multi-core rating. Apple doesn't lead on that currently, but for some reason that list doesn't include the M2 Ultra, which would be their highest multi-core result.
Apple beats them on single-core ratings. You can see the single-core rating if you click on one of those results.
Price cannot be compared because we do not know the price of an Apple processor. In fact, Passmark does not include Apple processors in their "best value" listings [0].
>Price cannot be compared because we do not know the price of an Apple processor
It cannot be compared perfectly, but you can try to estimate its perf/$
I'm not saying this will be easy, but imagine if the whole laptop were, e.g., $10k instead of $4-5k; you'd instantly feel that something is expensive.
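One crude way is to compare at the whole-system level: benchmark score divided by system price. A minimal sketch, where both the prices and the scores are placeholders rather than real quotes or results:

```python
# Hypothetical whole-system perf-per-dollar comparison.
# Prices and benchmark scores below are placeholders, not real data.
systems = {
    "MacBook Pro (M-series)": {"price_usd": 4500, "score": 21000},
    "x86 workstation laptop": {"price_usd": 3800, "score": 23000},
}

for name, s in systems.items():
    print(f"{name}: {s['score'] / s['price_usd']:.2f} points per dollar")
```

It folds in the margin on RAM, storage, display, and so on, but it at least puts a number on whether the whole package feels expensive for what it does.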
> It cannot be compared perfectly, but you can try to estimate its perf/$
Hardly.
I don't see how one would be able to identify and normalize all the required variables, e.g. median life expectancy, average performance across metrics per watt, average power usage, residual value, etc.
For instance, I can sell my 2017 MacBook Pro for roughly twice as much as my 2017 Thinkpad, which has better specs. How do you factor that?
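The naive way would be to subtract expected resale from the purchase price and compare against that effective cost; a sketch with placeholder numbers, which still leaves all the other variables unaccounted for:

```python
# Naive "effective cost of ownership" sketch: purchase price minus expected
# resale value. All figures below are placeholders, not market prices.
def effective_cost(purchase_usd: float, expected_resale_usd: float) -> float:
    return purchase_usd - expected_resale_usd

mbp_2017 = effective_cost(purchase_usd=2800, expected_resale_usd=900)
tp_2017 = effective_cost(purchase_usd=2400, expected_resale_usd=450)
print(mbp_2017, tp_2017)  # compare perf/$ against these instead of sticker prices
```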
I wouldn't, because it makes no sense. The occurrence of great deals isn't relevant here, imo. Why would I care that customers do crazy stuff on the second-hand market?
>I don't see how one would be able to identify and normalize all the required variables, e.g. median life expectancy, average performance across metrics per watt, average power usage, residual value, etc.
How about building a system for a similar price to the MacBook and comparing their performance?
It of course won't be ideal or perfectly fair, but ain't that what gamers do? They find PC configurations and check how games run on them.
Exactly. There are so many dimensions across which to evaluate it. What I care about most is 1) single-thread performance (my personal workload is inherently single-threaded), and 2) Rust compilation (multi-threaded compile, single-threaded link).
For 1), my fastest iron is the i9-13700KS and the Apple M2. They are very close. My Zen 3 is great and notably more power efficient, but I'll evaluate the 14700KS, Zen 5, and M3 when possible.
ADD: because of winter I'm loving my i9-13700KS (not kidding, my office would be freezing without it), but come summer I'll care about efficiency.
It's for the entire laptop, but it fluctuates rapidly and is realistically closer to 6-8 W. Disclaimer: I have a USB charger with voltage/current readouts, so these values come from observing that when fully charged, though AMD Adrenalin reports very similar results.
I see similar numbers with a ThinkPad P14s with the 7840U. If you can, try powering off the screen and observing the power draw. If I haven't confused myself, mine drops to more like 2-3 watts in that state.
As far as I can tell, it is not enabling LCD panel self-refresh. That may be where the extra idle power is going with the screen on? If you think about it, constantly reading framebuffer content out of system RAM at 60 Hz is a pretty expensive behavior.
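If you want to sample this without an external meter, here's a minimal sketch that reads the battery discharge rate from sysfs on Linux, assuming the firmware exposes power_now in microwatts (the node name, BAT0 vs BAT1, and the exposed fields vary by machine, and it only means anything while running on battery):

```python
# Sample battery discharge power from Linux sysfs every few seconds.
# Assumes /sys/class/power_supply/BAT0/power_now exists and reports microwatts;
# some firmware only exposes current_now/voltage_now instead.
import time
from pathlib import Path

POWER_NOW = Path("/sys/class/power_supply/BAT0/power_now")

for _ in range(10):
    microwatts = int(POWER_NOW.read_text().strip())
    print(f"{microwatts / 1_000_000:.2f} W")
    time.sleep(5)  # idle draw fluctuates, so average over several samples
```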
I left a longer reply on the other response to your comment, but in terms of battery life, 13th-gen and older Intels are worse than Zen 4 (7x4x) U-series CPUs; however, the Intel HS/HX versions are apparently slightly more efficient than their AMD counterparts.
I highly recommend going through some YouTube battery life videos and looking up NotebookCheck reviews for whatever you're planning to buy/compare.
Recent Intel systems unfortunately often push higher power to "beat" AMD on performance metrics. Single-core Intel perf is still higher and likely will stay that way, but Zen 4's multi-core performance and efficiency are generally similar to, if not better than, Intel 13th gen. 14th gen helps efficiency but is barely available. Oh, and the AMD iGPU is quite performant, much better than the 13th-gen-and-older Intel ones.
Btw, as someone with a Skylake laptop that also used to sip power, I suspect there's been a mild across-the-board power increase, especially as newer chips clock much higher. My Ryzen 7 iirc boosts to 5.1 GHz and is noticeably faster (I'm at 392 tabs in Edge right now) than my Skylake. I suspect your older laptop wouldn't clock so high, and a 3 GHz-limited Intel/AMD chip would have great battery life.
Colder and hotter are temperatures, not measures of power consumption. A soldering iron can put out 75 W at 800 °F; a CPU can put out 200 W and top out at 175 °F.
In the modern era, AMD chips are actually known for running hotter for quite a number of reasons (the much thicker IHS on AM5, stacked V-Cache on X3D parts, boost algorithms deliberately saturating thermal headroom, etc.), even though the Intel chips pull more power.
Akktually, according to the second law of thermodynamics you cannot get hotter with lower energy consumption, given equivalent heatsinking (i.e. the same thermal resistance, measured in K per watt), at idle. Idle is a state of thermal equilibrium (and at a rather low sustained power of 3-5 W, well within the heatsink's ability to dissipate), where none of your reasons are applicable.
> according to the second law of thermodynamics you cannot get hotter with lower energy consumption, given equivalent heatsinking (i.e. the same thermal resistance, measured in K per watt)
CPUs are not ideal thermal systems; they have their own internal thermal resistance. A 7800X3D runs hotter than a 7700X at an equivalent (limited) PPT, which in turn runs hotter than a 13900K at an equivalent (limited) PPT, because the internal thermal resistance is higher. These are objectively measurable things!
Also, surface area matters: take the same heat flux and spread it over more area and you get a lower temperature. A Threadripper putting out 250 W does so at a lower power density than a 7800X3D putting out 250 W, and will run at a lower temperature too.
Like, yes, you are correctly identifying the variables in which these CPUs differ, but then making the incorrect leap from "in a spherical-cow world they would be equal" to these CPUs being equivalent in those metrics in real life, which they are not. Different CPUs have different thermal resistances, and AMD's is generally higher right now because of the decision to go with a thicker IHS (to maintain cooler compatibility) and the move toward stacking (more silicon in the way = more thermal resistance).
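A spherical-cow sketch of that point, with made-up resistance values (not measured figures for any SKU):

```python
# Die temperature at equilibrium: T_junction = T_ambient + P * (R_jc + R_ca),
# where R_jc is the internal junction-to-case resistance and R_ca is the
# cooler's case-to-ambient resistance. Values below are illustrative only.
def junction_temp_c(power_w: float, r_jc: float, r_ca: float, t_ambient_c: float = 25.0) -> float:
    return t_ambient_c + power_w * (r_jc + r_ca)

# Same 120 W dissipated, same cooler (r_ca), different internal resistance:
print(junction_temp_c(120, r_jc=0.15, r_ca=0.25))  # 73.0 C
print(junction_temp_c(120, r_jc=0.35, r_ca=0.25))  # 97.0 C -- hotter at identical power
```

Same watts in, same watts out, but the die behind the higher internal resistance sits at a higher temperature.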
And again, don't pretend this is some absurd or unknown concept; we literally spent years and years with AMD fans making memes about "Intel toothpaste"... Thermals and wattage dissipated are not the same thing. You can have a great, efficient product with terrible thermal resistance; there have been a number of them!
It's just that AMD isn't on top this time, so everyone pretends not to get it... or volunteers a bunch of theoretical reasons it doesn't matter... or...
Just like "thermal watts aren't the same thing as electrical watts!", etc.