Can you name those laptop Xeon CPUs that beat the pants off the M1 Max?
[Spoiler, because I don't think I'll get a response -- there are none. Even when you get into the "luggable" category of workstation that is ostensibly portable but really needs to be plugged in, there is no competition right now. The upcoming Alder Lake should significantly improve Intel's entrant in this category, and hopefully bring some real competition.]
The upcoming Thinkpad X1 Extreme is going to give it some stiff competition. It's wielding the insurmountable RTX 3080, and it's priced very competitively.
But I'm just going to tell you now so we don't make the same mistake we have for the past 10 years of computer hardware discussions: specs don't matter. You could tell 90% of the people buying PCs with dGPUs about your 5nm GPU and next-gen power efficiency, but they won't care. They're buying them as gaming devices, general-purpose machines and game development laptops. I'd argue the market for Mac users and PC users has not radically shifted, just the hardware they're using. If we were here to talk smack about hardware superiority, this website would have been insufferable for the past decade, because there was quite literally a complete lack of professional dGPU Macs. Now that the tables have turned slightly, I don't see why Mac users feel the need to crawl out of the woodwork and declare the game as changed, now that Prometheus gave them the gift of a laptop that doesn't throttle to hell.
People will still buy all sorts of computers. Lots of options will appeal to different people. Intel is finally being forced to actually compete. It's all good. My M1 Mac (not even M1 Pro/Max) is quite easily the best computer I've ever owned, but I have zero need to proselytize and simply do not care what you or anyone else uses.
That doesn't change the fact that the above claim about "laptop Xeon chips" beating the pants off the M1 Max is delusional nonsense.
I have to comment on the RTX 3080 bit: I have used many PC laptops over my career, and currently have a Lenovo with a fat, barnburner Nvidia dGPU. The GPU is literally never used, because the moment it engages my battery life falls to cartoonish levels (somewhere in the range of 40 minutes), the laptop becomes a space heater, and the fans turn into jet engines. This is the sort of "spec chasing" the industry is addicted to: providing absurd, completely unreasonable solutions just so someone can boast. One of the things about Apple, quite contrary to your claim, is that they don't do that. When they provide something, it is meaningfully usable and useful 100% of the time.
It's using DDR4 versus DDR5 in the Mac, and a fraction of the memory bandwidth, so it will be interesting to see its performance on heavier tasks. It also looks like it has at most half the battery life, if not far less, when put under heavy load.
I do love this review though:
"It is a nice laptop but extremely noisy. Even when idle the fans are on all the time."
Memory bandwidth is nothing for most workloads. Besides a scant few Geekbench figures, I have genuinely never encountered a workload that was bottlenecked by my ability to transfer assets to the GPU. Is 100gbps of PCIe bandwidth not enough for your needs?
That's not true if you're actually using the RAM, though.
I remember memory bandwidth being the bottleneck when running large(ish) datasets for game worlds. It mattered so much that we put a lot of pressure on Google Cloud, because we worried they wouldn't be able to compete with bare metal (since bandwidth isn't usually measured or reported, and can be non-guaranteed when you have neighbours).
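For anyone who wants to sanity-check this on their own machine: a rough way to find your sustained memory bandwidth ceiling is a synthetic read/write pass, and then comparing that against what your real workload achieves. A minimal sketch using sysbench (assuming it's installed; the block and total sizes here are arbitrary choices):

    # Sustained sequential read bandwidth; sysbench prints MiB/sec at the end.
    sysbench memory --memory-block-size=1M --memory-total-size=32G \
        --memory-oper=read run

    # Same thing for writes.
    sysbench memory --memory-block-size=1M --memory-total-size=32G \
        --memory-oper=write run

If your workload's effective throughput sits far below these numbers, it's compute-bound, not bandwidth-bound.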
Xeon-branded workstation laptops aren't using Ice Lake-SP or similar server chips; they are using Tiger Lake, Ice Lake, or Skylake client chips.
They are what would previously have been branded as "Xeon E3" series chips - on the desktop platform they used to share a socket with consumer desktop chips and be drop-in upgrades for them, because they're basically the same chips with "enterprise" features like ECC turned on.
An example would be Xeon E3-1285 v3 - which is basically the same thing as an i7 4770.
These products are nowhere, nowhere near the M1 Max. They are consumer laptop chips with ECC and vPro turned on.
Yes, your Xeon-from-2013 example would be nowhere near the 2021 CPU from Apple. The outdated Xeon Apple had in their old laptops - a year behind everyone else's - is also slower. The Xeon W-11955M, however, makes the M1 look like a kid's toy. In fact, if you remove 2 cores from that Xeon, you'll have my 6-core Xeon. Which also smokes that 10-core M1 in a bong.
I'm also not sure why you're sarcastic about ECC RAM. I have 128GB of RAM in my laptop. If it wasn't ECC, I'd have crashes in my VMs and errors in my calculations. When you go 32GB+ and actually use the RAM, anything that doesn't support ECC cannot be taken seriously for professional use. Like the M1 Max.
The point is that a laptop Xeon from 2021 is also going to be the same as an 1185G7 or something similar - because they’re the same silicon. It’s not like you’re getting more silicon because it’s a Xeon, it’s not a server chip, it’s just a laptop chip with the enterprise features enabled.
So, really no need to test them specifically. Go get an 1185G7 or something and you know what “Mobile Xeon” benches will look like. Anandtech already did those benches.
Just to humor you I looked it up - your Xeon W-11955M is the same chip as a consumer 11980HK, which is one of the processors in Anandtech’s benchmark. Same cores, same cache, same clocks, slightly lower TDP limit - the consumer one has a 65W boost configuration. It is the same TGL-H die with (very slight) variations in what feature fuses are blown.
So based on Anandtech’s benchmark you can pretty much extrapolate how that is going to go - slightly lower performance and slightly higher efficiency due to TDP limiting clocks a bit, but same cache, same core configuration, etc mean that it’ll be identical to if you went into bios on a 11980HK and set a lower power limit.
By the way the 11980HK is specifically the chip they called out the M1 Max as being a factor-of-6 more power efficient than. And in fact they said exactly what I just said - that limiting TDP will reduce performance on the 11955M and the perf/W gap will close a bit, but the performance gap will get even wider.
> In multi-threaded tests, the 11980HK is clearly allowed to go to much higher power levels than the M1 Max, reaching package power levels of 80W, for 105-110W active wall power, significantly more than what the MacBook Pro here is drawing. The performance levels of the M1 Max are significantly higher than the Intel chip here, due to the much better scalability of the cores. The perf/W differences here are 4-6x in favour of the M1 Max, all whilst posting significantly better performance, meaning the perf/W at ISO-perf would be even higher than this.
> The m1 max TDP is projected to be 90W. The Xeon is 45W.
That article says that the M1 Max CPU TDP will be ~30W. The 90W figure is for the whole chip, which includes the GPU. The Xeon is just the CPU.
> here's that Xeon beating that 11980HK.
The CPUmark figure is 24092 for the Xeon, versus 23549 for the i9. That's a difference of about 2%. I suspect it's mostly down to statistical error. (Maybe it's also true that the Xeons tend to be put in systems whose other hardware is faster; I can see that going either way.)
Looking at the data in your second link, the performances differ by less than 3%. Looks like paulmd’s assertion that both are the same cpu core is fairly well supported by the data.
We've banned the other account, but you broke the site guidelines badly yourself in this thread, by feeding the flamewar and generally ignoring the rules. It's not ok to do that, regardless of how badly anyone else is behaving. Please read https://news.ycombinator.com/newsguidelines.html and stick to the rules in the future. Note this one:
"Don't feed egregious comments by replying; flag them instead."
And a Porsche is the same as a Kia, just with a few racing features.
>You don’t understand what you’re talking about
So you look at the graph I linked that shows the Xeon model is faster than the consumer i9 chip, and your brain says "the Xeon is slower." Then you tell me I don't know what I'm talking about.
Tell me, do you ever stand in a lake and tell people you're dry? Are you in the lake to hide the pee? Do you tell people about the wonders of horse dewormer?
See, the only way your argument works is to compare the M1 to the i9. And despite the Xeon being faster than the i9, as literally and plainly shown in the tests I linked, you keep saying it's slower. ... But then you keep saying they're the same. Reality: it's faster - literally look at the numbers. Your mental gymnastics are amazing. I am not hostile. I have been getting great laughs and relaxation out of you. Yes, that comes at your expense.
> person who tried to explain it to you
By saying wet is dry. I get your explanation. I just see what my eyes tell me. You see blue, your brain says "nah, it's orange."
Just being pedantic: DDR5's implementation of on-die ECC is still not equivalent to the implementation on CPUs with full ECC support. It only covers the DRAM array itself, not errors that occur in transit or during processing, and there's still a pretty significant chance of corruption in the L1-L3 caches.
One of my laptops is a Dell Precision 7760. Xeon W-11855M, NVIDIA RTX, 128GB ECC RAM. I don't need too much storage, but you can get it with 14TB if you want. Note mine is a 6-core. There is an 8-core available w/ the Xeon W-11955M, which is faster than mine.
It's only a little thicker than the MacBook Pro. Its keyboard doesn't break, and the product line has had a 4K screen for 5 years now, with a 120Hz refresh rate. It has a very large power brick - 240W - and it gets hot and loud, with a huge fan exhaust. It's thick metal, about 7-9lb - you can run over it with a car. It only gets 9 hours of battery with regular usage, and about 3 hours of "fan on time." Keep in mind, with that large and loud fan on, it can stay at 5GHz. This is what's called a pro laptop - a workstation. When I travel, I bring a 65W PSU, and it runs fine on that, just w/o turbo boost.
No, don't point to the lower Geekbench score for this laptop - that's not a pure CPU test. GPU performance is a large part of the result, and they run the test on the default GPU. The M1 only has a single GPU; the Precision's default is the low-power integrated graphics, not the discrete GPU. If you have a test where they assign the discrete GPU, please feel free to point it out.
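For what it's worth, you can verify which GPU a run actually lands on. On a Linux Optimus laptop, for example, NVIDIA's PRIME render offload makes the default-vs-discrete split explicit (the environment variables are NVIDIA's documented ones; glxinfo ships in mesa-utils):

    # Default path: the integrated GPU answers
    glxinfo | grep "OpenGL renderer"

    # Explicitly offload this one process to the NVIDIA dGPU
    __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia \
        glxinfo | grep "OpenGL renderer"

On Windows the same choice is made per-application in the NVIDIA control panel or the Windows graphics settings, which is why benchmark results depend on which GPU the tester assigned.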
As I've discussed here before, I have a shell script that runs in parallel along with a bunch of VMs. My coworker's Air (yes, I know it's not the Max) runs it in 8-10 hours overnight. I run it over lunch. It loads several gigs of ASCII performance data, does calculations on it, and creates graphs from it.
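A minimal sketch of that kind of fan-out, in case anyone wants the shape of it - the directory layout and the crunch_one.sh worker are hypothetical stand-ins, but xargs -P is the standard way to keep every core busy from a shell script:

    # Hypothetical: process each log file in parallel, one worker per core.
    # ./crunch_one.sh stands in for the real parse-and-calculate step.
    find perfdata/ -name '*.log' -print0 \
        | xargs -0 -P "$(nproc)" -n 1 ./crunch_one.sh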
What does compare in performance to the M1 Air is my Latitude with the i7 in it. The M1 "Max" is "max for Apple," but it competes with mid-tier laptops from everyone else. And that's ignoring the fact that it can't run pretty much any of the useful industry tools or games without recompiling x86 to ARM on the fly.
Yes, I think my Dell cost the company $7-8k, without support. That's why it's called a pro laptop.
I feel like you have the setup necessary to produce an interesting, compelling argument here -- your benchmark seems, at least to some degree, less synthetic than some do, you have an uncommonly fast workstation which a lot of people doing comparison tests wouldn't be able to do...
... but the way you present it undermines your case a bit.
It seems like what you want to say is that if money is no object, weight is no object, heat is no object, battery life is no object, and portability is no object, but the comparison must absolutely be laptop to laptop, then there exists a laptop PC configuration that beats M1. If this is your point, then probably you want to compare an M1 Max to your PC, not your coworker's Macbook Air (which is a fanless laptop...)
I think this is a pretty unusual use case and there aren't too many people who are looking for this exact market segment. I definitely think it's fair to admit that Apple isn't intending to operate in this market segment, for better or worse.
You'd probably also want to drop the part about game support, since anyone who wants to play games can spend 1/4 of what your workstation costs and get a shitkicking fast small form factor PC. But also, like, recompilation isn't what you should highlight - what you should highlight is performance. If the recompilation is fast enough for users not to notice, then it doesn't matter, and if it's not, then the reason it matters is performance, not recompilation.
Anyway, again, I don't think you're necessarily wrong or whatever here, but you're just presenting your point in a way that I think it's extremely unlikely anyone will care or be convinced.
“A little thicker” is a bit of an understatement. At its thickest point, it’s apparently 71% thicker than the thickest point of the 16-inch MacBook Pro (2021), and 55% thicker at its thinnest point compared to the thickest point of the Mac. That’s a huge difference for portable electronics.
It's also a "sale" price of $6,700 with a 2TB SSD and just a 6-core CPU. Going with the 8-core Xeon bumps it up to $6,900, and moving to 8TB of SSD takes it to $9,200 ($8,000 for the 4TB version).
The 2TB M1 MacBook 16" is $4,300, and the 8TB, max-specced version is $6,100.
I know Apple has a bad rap for high prices, but that machine's prices make Apple's look bargain basement. You could almost buy TWO M1 Macs for the price of one 4TB Dell.
Now, as far as big little countries go, this one seems to have a population (non-resident) of 2k. That is huge for a micronation, although by that definition of the word, Hong Kong would be the largest micronation. That's because size does not define the term - only that the residents claim they're a country and most other countries don't recognize them as one. Another notable micronation is, of course, the famous Sealand, hosting the world's child porn from an abandoned oil platform.
Let me introduce you to old soviet trains, from my memories of a couple of cross country trips. Cozy, you line up by the window in the hallway as it pulls away while your parents unpack and put the suitcases in the high-up compartment in the little room - 2 bunk beds. You get the top one where you hang out and later fall asleep watching the window and listening to the train. There are 4 spots, so you have a guest joining you for the trip, and he tells you interesting stories and plays chess w/ you while drinking his tea. Magic.
Trains when I went back to Russia for a couple of years as an adult: a mix of Soviet and modern cars. In the Soviet ones, the bedding is still from those Soviet times. It's itchy, you're not sure whether it's clean, and it smells a little funny. The toilets are disgusting, and you'll want to wash your hands and face from a water bottle - just trust me on that. You've got some bunk mates in the 4-bed room. They're annoying, they're idiots, and they pretend to classily drink tea while mostly just drinking cheap vodka. They step out by the car entrance to smoke every ten minutes, and they reek of booze, BO, and smoke - so the whole car reeks too.
The new cars are clean, double the price, nice toilets, everything is plastic, everything smells like plastic. It feels like you're on an airplane. There's no magic.
Spent about a month in Kazakhstan, but flew in there from Moscow. Honestly - egh. Nothing special, just a smaller city with fairly friendly people trying to get through the day.
I had a similar experience in Egypt. I was on an organized tour booked through a tour office, and part of the trip was a long overnight train from Aswan to Cairo. The train looked terrible from the outside, very dirty. The inside was horribly cramped, there was no running water, and the toilets were unusable. But the beds were surprisingly comfy and we managed to sleep all the way to Cairo.
Unfortunately, as I later learned, the trains over there are very unsafe too. Just two weeks after my trip a train crash happened with multiple fatalities, and months later it happened again.
I think the common thread here is not giving up without immediate success, and that is a very good idea sometimes. But Homer Simpson built a goofy car, people were exposed to it, he got negative feedback, and he rightly abandoned it. Some projects are a bad idea; you don't know they are, and once you find out, you absolutely should not keep wasting resources out of sunk cost and wishful thinking.
His common issue is he expected to invest time to make money. You need to invest money to make money. This is why I sometimes laugh at a guy who can't get a good job, has no savings, is borrowing cash from friends to pay the rent and maxing out credit cards for food. Sometimes that guy's solution to being poor is "I'm going to start my own business." When you build on a foundation of nothing, your house collapses.
He ran an ad campaign - it clearly worked. He needed to run it more, and bigger, and yes, eat the cost of it from personal savings. In a business, you first spend money, then you make money. There are no freebies just because you have an idea. Stories of that happening are like self-learning guitar because you plan on being a rockstar.
The reason people don't want to pay for a new project from a new person? Because there's a high risk of it being dead within a year - like all of this guy's projects. It's the same reason I don't start watching any new shows till they've had a few seasons out. I don't want to invest hours into the plot just to have it cancelled and left without closure after two seasons.
It's also a reason no one joined things like Google Plus or used any of their other now-dead projects. They proudly declare they try many things to see what sticks and kill the rest. Well, I'm not willing to give my time for free to their unpaid focus group.
So what he needed to do was save up, spend those savings on letting people know about his product, take the risk of losing that cash, and give the product away at first. Then when the paid product comes, there is a huge user base he paid for, and people see the risk of it being killed as minimal.
I didn't read the whole document linked in the tweet, but no.
This is quite simply equivalent to someone putting out an inferior product - for example, a phone that needs to scan a government-issued photo ID so the phone maker - not your carrier - can identify you. There are people who don't care, and they are the target market. There are people who laugh at it and just buy something else. They are not the target market.
They did not try to change the web into anything. They tried, and to some extent succeeded, to change their browser into an inferior product, and their ad network into an inferior product. Websites placing ads and ad exchanges are free not to use Google's products. Web users are free not to use their products, like their browser. I use Google Translate - that's about it. Not even Maps, since HERE Maps is and always was simply superior.
They also tried to change regulations to suit them better. But they are allowed. Trying to change here simply means making their opinion known to the people who make the rules. They didn't bribe anyone. They emailed pdf files describing their ridiculous point of view.
The takeaway is they are a corporation with inferior products, who will do what's legal to make more money. Most people don't care. Some people are free to choose other products.
Azure and AWS, for example, completely destroy GCP. This is because Google treats the people who do care about Google's inferior points like a phone user who doesn't care if they're tracked. So Google can't get any market share there for their inferior products, despite GCP being superior from a purely technical aspect. When it's packaged with the worst customer service available, and the risk of losing all your data and backups permanently without notification because you mistakenly hired a guy who is known to Google as a hacker? They're not even a player for any customer that plans to be successful long-term.
It's a choice: pick what you want to do and who you want to do it with. Google is not hiding who they are, or doing anything wrong. They just make inferior products. But then again, VHS won the format battle.
> They also tried to change regulations to suit them better. But they are allowed.
They might be allowed. But just because it is legal does not mean that it is right, particularly as they themselves are "making their opinion known to the people who make the rules" and can exert economic pressure on lawmakers. And while their products might indeed be inferior - if they place their products as defaults on their own systems (Android) and buy themselves default placements in other products (Safari, Firefox…), many customers who do not try out all the options will only use Google's products. And I can't blame them for not trying out all the alternatives.
It absolutely is right, and a right. Corps aren't people, but there are people submitting the documents to the legislators. There is zero difference between this and a study on climate change being submitted to the EPA.
Law is not just for right and wrong. Most laws and regulations are to further people's interests, like the google people submitting the documents. This is called freedom of speech. Legislators are free to not read them.
> exert economic pressure on lawmakers
I don't think this means what you think it means. To exert economic pressure on a lawmaker (a person) means to make them personally lose money in some way. This is a pretty grave accusation - it means "if you don't agree with us, we are going to damage you financially." Google has not been caught attacking people financially, or threatening them, or even paying off lawmakers thus far.
Google is also allowed to place their products in their own systems by default (not Android, which is open source). If you buy a Samsung phone, Samsung picks the defaults - some are Google's, some are Samsung's, some may come from the carrier you bought the phone from. Of course if Google makes the Pixel, it's going to have Google's defaults. Are you surprised that when you buy an LG TV, LG also makes the remote, despite plenty of other brands of remotes being available for that TV and others?
As far as buying themselves default search placement within other products? Have you been to the grocery store? Did you know every brand you see there pays for better shelf space and placement?
There is literally zero wrong with any of what is going on here. There are plenty of bad things about Google; this ain't one of them. This is them purposely damaging their product for some of the possible market, which is their choice. They are not "gimping the web" - they are simply making their own products worse. If Google's search wasn't a good default choice in Firefox (it is), Firefox would take Google's payoff, decline in use, lose market share, and in turn lose Google's money, since it wouldn't have the market share anymore. And Firefox is free to do that.
The reason he compares it to desktops is that the only thing in the performance test that's comparable is the desktops. The M1 system tested had double the RAM of the Intel systems, and for processing video, RAM makes a huge difference: if your editor can store all the decompressed clips and cache in RAM, your performance increases dramatically. In addition, the "live playback" score is weighted heavily for some reason, when that's an easy test all systems perform fine on, and it is not timed.
It's not clickbait, because the article literally tells you both the positives and negatives, as well as titling it "mixed bag" instead of "apple loses." We call this a balanced review.
But I get it. Anything that's not "m1 is revolutionary" is clickbait to the apple crowd.
"While performance of the new M1 Max-based MacBook Pro in Adobe Premiere Pro looks very good compared to the previous-generation MBP with discrete graphics, it doesn't look that good compared to x86 workstation platforms with standalone graphics processors."
This is the big takeaway. All of Apple's comparisons are to their own previous hardware, which was about a year behind the competition. In addition, if you look at Geekbench scores for the M1 Max, those include a lot of tests that use the GPU. When the tests are run against the competition, the competition's numbers are artificially lowered by running the test on the default low-power integrated graphics, instead of specifying the discrete GPU.
In reality, the M1 is not a pro laptop as the industry defines "pro." It is only a pro laptop compared to Apple's other offerings, and competes with mid-tier laptops from everyone else.
It shouldn't be a surprise that a CPU that can't run most of the software out there (because that software is x86), and that ditched all compatibility and started its design from scratch, can beat last-gen CPUs from competitors - for specific apps and workloads, for which it has accelerators.
But here I am, with a pretty thin and very durable laptop that has a 6 core Xeon in it. It gets hot, it has huge fans, and it completely obliterates any M1 laptop. I don't mean it's twice as fast. I mean things run at 5x or faster vs an M1.
Now, this is a new version of the M1, but it's an incremental 1-year improvement. It'll be very slightly faster than the old gen. By ditching Intel, what Apple did was make sure their pro line - which is about power, not mobility - is no longer a competitor, and never will be. Because when you want a super fast chip, you don't design up from a freaking cell phone CPU. You design down from a server CPU. You know, to get actual work done, professionally. But yeah, I do see their pro battery is 11 hours while mine usually dies at 9. Interesting how I have my computer plugged in most of the time, though...
>Because when you want a super fast chip, you don't design up from a freaking cell phone CPU. You design down from a server CPU.
Is that really true? I don't have any intricate chip knowledge, but it rings false. Whether ARM comes from a phone background or the Xeon from a server background, what matters in the end is the actual chip being used. Maybe phone-derived chips even have an advantage, because they are designed to conserve power, whereas server chips are designed to harvest every last ounce of performance. I don't know a lot about power states in server chips, but it would make sense if they aren't as adapted to rapidly stepping down power use as a phone chip.
Now, you might be happy with a hot leaf-blower, and that's fine. But I would say the market is elsewhere - silent, long-running, light notebooks that can throw around performance if need be - and you strike me as an outlier.
Pro laptops should have a beefy CPU, a great screen, a really fast SSD, long battery life, and lots of RAM, which (presumably) your notebook features - but the new M-somethings seemingly do as well. In the end, though, people buy laptops so they can use them on their lap occasionally. And I know my HP gets uncomfortably hot; the same was said about the Intel laptops from Apple, I think.
Apple doesn't need to have the one fastest laptop out there, they need a credible claim to punching in the upper performance echelon - and I think with their M* family, they are there.
You actually have it correct. When you start with an instruction set designed to conserve power, you don't get "max power." The server chips were designed with zero power considerations in mind - the solution to "too much power" is simply "slap a house-sized heatsink on it."
>Apple doesn't need to have the one fastest laptop out there
Correct. My complaint, which I have reiterated about 50 times to shiny-iPhone idiots on here who don't do any real number crunching for work, is that when the industry calls "mid tier" something that Apple calls "pro," Apple is deceiving the consumer with marketing. The new laptops are competition for Dell's Latitude and XPS lines, not their pro lines. Those pro laptops weigh 7lb and have a huge, loud fan exhaust on the back so they can clock at 5GHz for an hour. They have 128GB of RAM - ECC RAM, because if you have that much RAM without ECC, you have a high chance of bit errors.
There are many things you can do to speed up your stuff if you're willing to waste electricity. The issue is not that Apple doesn't make a good laptop. It's that they're lying to the consumer. As always. Do you remember when they marketed their little acrylic cube mini-desktop? It was "a supercomputer." They do this as a permanent tactic - sell overpriced, underperforming things, and lie with marketing. Like using industry-standard terms to describe things not up to that standard.
I'll happily take my quiet, small, and cool MacBook and number crunch in a data center infinitely more powerful than your laptop. Guess that makes me a shiny iPhone idiot.
Relax, no one is forcing you to use Apple products.
I love how you added "laptop" to make your statement... still false. There is a program running on macOS that literally recompiles x86 binaries to ARM, and then the M1 executes the ARM code. The M1 does not execute x86 binaries. Period. It only runs ARM binaries.
No, the parent comment isn't false, even if the wording could be more precise. It is true that M1 CPUs do not execute x86 instructions, but the machines do, in effect, execute x86 binaries. Also, the M1 does have a hardware TSO memory-ordering mode to improve the performance of translated code.
Hipster graphic designers make upwards of $150,000 a year in my area. The "professional" in "pro" never actually meant "software engineer." It meant anyone who can hang up their signboard and work on their own: lawyers, doctors, architects, and yes… graphic designers.
Personally, I think software engineers don’t need fast laptops either. We need mainframes and fast local networks. Nothing beats compiling at the speed of 192 cores at once.
Which reminds me, laptops plus render farms are exactly the technique those hipster graphic designers you talked about are using, so they aren't missing out on any power.
Which is the top of their salary ceiling, and it's not a high number, like, at all. The top number for software devs is about $700k. In my field, people make $200k+. But we're not talking about "pro" people; we're talking about a "pro" laptop. It's the best thing that Apple makes - that doesn't make it "pro." It's got the performance of the midrange systems from everyone else.
>I think software engineers don’t need fast laptops either.
Yeah, when I run a script to read a few gigs of performance data and do a bunch of calculations on it, I need a fast laptop. Until that's done, I'm sitting there, CPU maxed out, not able to do anything else. With an M1, I have to arrange my schedule to process the dataset overnight. With a Dell, I run it over lunch. Case closed.
>We need mainframes and fast local networks. Nothing beats compiling at the speed of 192 cores at once.
I'm not a software engineer anymore. When I was, no, I did not usually compile on the server; I compiled on my workstation. Because you're not always on a fast local network. You're at an airport for 4 hours, or on a plane for 5 hours, or on a Comcast connection at your house.
Rosetta 2 kicks in, performs a JIT/AOT translation of the x86 instructions to ARM instructions, executes those, and caches the resulting ARM binary for later use.
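You can watch this happen from a shell on an Apple Silicon Mac; macOS exposes a sysctl that reports whether the current process is being translated:

    # 1 = this process is x86_64 code translated by Rosetta 2,
    # 0 = native arm64 (the key doesn't exist on Intel Macs)
    sysctl -n sysctl.proc_translated
    uname -m    # arm64 when native

    # Launch an x86_64 shell through Rosetta and ask again:
    arch -x86_64 /bin/zsh -c 'sysctl -n sysctl.proc_translated; uname -m'
    # prints 1, then x86_64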
Please stop being so hostile to other users. It really doesn't add anything. You have made some factually questionable comments yourself, and I say this as someone who has worked on JIT aarch64 translation very similar to Rosetta 2.
Right? I'm curious what pros do "at the Indy 500".
Most devs where I work use 15" Macs and probably a blend of apps from the JetBrains Toolbox - mostly connected to power outlets, to be fair.
So we're talking local installs of Spring Boot Java app servers, a front-end web server, and an IDE to work on one of those - because opening a second IDE on an Intel Mac will either run the dev out of RAM, or the heat will cause a shutdown.
The thing is, the corporate Dell Windows machines available were largely unsuitable for dev work due to their trashy interfaces (low-resolution screens, bad trackpads, battery life so bad you can't make it through a 2-hour meeting undocked). The Windows laptops available really failed hard when they needed to be laptops.
It's fine to work on battery sometimes. Except that after 5 hours the marginal utility decreases, and by 8 hours it goes to zero. Why would I need more than one day?
Because you're a "dev" who makes web pages for a company that can't afford an Oracle license, and your office is a Starbucks. But you want to call your little toy a pro, because to non-programmers, you make missile guidance systems. Well, not you. But the other few people on this thread.
Yes, using a laptop for over 10 hours on battery is not for people who do any serious work needing a pro laptop - what is in professional circles called a workstation. Glad you understand. Note Apple's stated hours: 11 hours while browsing the internet, and 17 hours for watching videos. If this is your use case, you are not the target market for a workstation. Apple sells "pro" laptops like Kia sells racing cars.
> But here I am, with a pretty thin and very durable laptop that has a 6 core Xeon in it. It gets hot, it has huge fans, and it completely obliterates any M1 laptop. I don't mean it's twice as fast. I mean things run at 5x or faster vs an M1.
Probably not faster than an M1 Pro and definitely not faster than the M1 Max.
Your machine doesn't have a 512-bit wide memory interface running at over 400GB/s.
Does the Xeon processor in your laptop have 192KB of instruction cache and 24MB of L2 cache?
Every ARM instruction is the same size, enabling many instructions to be in flight at once, unlike the x86-64 architecture, where instructions vary in size and you can't have nearly as many instructions in flight.
Apples-to-apples: at the same chip frequency, an M1 has higher throughput than a Xeon and most any other x86 chip. This is basic RISC vs. CISC stuff that's been true forever. It's especially true now, as increases in clock speed have dramatically slowed and the only way to get significantly more performance is by adding more cores.
On just raw performance, I'd take the 8 high-performance cores in an M1 Pro vs. the 6 cores in your Xeon any day of the week and twice on Sunday.
And of course, when it comes to performance per watt, there's no comparison and that's really the story here.
> Now, this is a new version of the M1, but it's an incremental 1-year improvement.
If you read AnandTech [1] on this, you'll see this is not the case—there have been huge jumps in several areas.
Incremental would have resulted in the same memory bandwidth with faster cores. And 6 high-performance cores vs. the 4 in the original M1.
Except Apple didn't do that - they doubled the number to 8 high-performance cores and doubled the memory width, etc. There were 8 GPU cores on the original M1, and now you can get up to 32!
Apple stated the Pro and the Max have 1.7x the CPU performance of Intel's 8-core Core i7-11800H with 70% lower power consumption. There's nothing incremental about that.
> By ditching Intel, what Apple did was make sure their pro line - which is about power, not mobility - is no longer a competitor, and never will be.
Pro can mean different things to different people. For professional content creators, these new laptops are super professional. Someone could take off from NYC and fly all the way to LA while working on a 16-inch MacBook Pro with a 120Hz mini-LED, 7.7-million-pixel screen that can display a billion colors, editing 4K or 8K video - on battery alone.
If you were on the same flight working on the same content, you'd be out of power long before you crossed the Mississippi, while the Mac guy is still working - on a machine half the weight of your laptop, with a dramatically better display and better performance when it comes to video editing and rendering multiple streams of HDR video.
The 16-inch model has 21 hours of video playback which probably comes in handy in many use cases.
Here's a video of a person using the first generation, 8 GB RAM M1 Mac to edit 8K video; the new machines are much more capable: https://youtu.be/HxH3RabNWfE.
Only if you define efficiency as compute per watt. I don't give a flying crap about watts. Efficiency is measured as the amount of work done per hour, because I get paid for the work, and then I pay the two dollars a week for the watts. I don't care if it's five dollars a week for the watts or two. I do care if it's two hours of waiting time versus five.
Lol, no. The $20/month cost of electricity is a rounding error next to my $6k laptop and the $50k of software licenses on it. It's even less of a rounding error for the datacenter, where a $500k ESX farm with several million dollars of software on it uses $5k of electricity per month, including cooling.
Have you noticed almost no one uses ARM? There's a reason for that, including software being licensed per core - so fewer, faster, hotter cores win.
I also have a Xeon laptop (a 45W-TDP E3-1505M v6 in a Dell Precision).
Xeons are not magically faster than their i7/i9 counterparts (mine is no faster than an i7-7820HQ, its contemporary flagship high-performance mobile CPU). In fact they can be slower, because the emphasis is on correctness and multi-core, not usually single-thread performance.
Xeons are also slower than modern AMD chips, which can have more cores as well.
5x is a performance figure that doesn't match up. Unless you have a desktop-class 145W/165W CPU - in which case it's not going to get 9hrs of battery unless you're not actually touching the CPU. More like 30 minutes of heavy use on the largest battery legally allowed in laptops.
Edit: I just took a quick snoop on Geekbench and found a modern Xeon W in the largest Dell Precision laptop available:
Synthetic scores aren't everything, but I'm hard-pressed to see how you can get 5x the performance out of a chip that scores almost exactly half. Even with hardware acceleration like AVX-512 (which causes a CPU to get so hot it throttles below baseline, even on servers with more than adequate cooling).
My experience as well. I, too, have a Dell Precision with an 8-core Xeon part, and while it looks decent, it's heavy and not noticeably faster than the M1 I replaced it with when that came out. The Xeon would get hot and noisy when running Teams or Hangouts. It has sat in my drawer for the last year or so.
The M1 does not. Code compiles about as fast. The battery lasts a 3-day business trip or a hackathon without charging. I have never heard its fan. I don't care much about brands, but the lightweight, fast-enough, and well-built M1 is praiseworthy. I am not getting the Pro or Max, as the benefits for me as a software dev are probably not worth the extra weight and power consumption.
Citation please on "the Xeon Dell smokes the M1 Air" - Geekbench says the M1 Air can be twice as fast.
All other things being equal: your statement is simply not true.
I just checked, and I can't find a mobile Xeon with a TDP greater than 45W, so you're stuck with that Geekbench score, because that's essentially as good as it gets for a modern mobile Xeon.
Xeons, FWIW, are just higher-binned i7s and i9s with extra features left enabled. The reason they can be slower than i7s and i9s is that the memory controller has to do more work, and the way Intel does multi-core (essentially a ring bus) doesn't always scale gracefully.
All things are not equal, though. Geekbench includes many things that run on the GPU - video encoding and playback, rendering web pages - heck, even your window manager mostly uses the GPU. The Dell has a low-power, low-performance GPU as the default; to use the second one - an NVIDIA RTX, which is literally the fastest thing you can put in a laptop - you have to explicitly tell your OS to use that GPU for a program.
In summary, you are full of crap if an untuned, blind Geekbench score is what you're going by - an aggregation of a whole bunch of tests, using defaults. My statement is true: I kick off the same data-processing script on my laptop and it finishes over lunch, while my coworker kicks it off overnight.
> Xeon with a TDP greater than 45W
Yes, the Xeon W-11955M in the Dell is 45W. Now add the RTX GPU - which, coincidentally, will be doing most of the work. Unless you're running the Geekbench test you're referring to, which purposely gimps the results: that Intel integrated graphics chip uses almost no power.
Go process a large dataset and do some calculations on it. Run a bunch of VMs while you're doing it, too - let's say 3. Give each one 32GB of memory. It had better be ECC memory too, or your data won't be reliable. Maybe in about 5 years, when Apple catches up to the current pro laptops, you'll be able to. This is why all the M1 comparisons they do are to previous-generation Intel chips in their old laptops, which have always been slow. Apple has always used outdated hardware, in everything they've ever made.
I guarantee you, an M1 is about as fast as your "6 core xeon" laptop. M1 Pro/Max will steamroll it. You can look at Cinebench, LLVM compile, Java benchmarks, etc. You're completely delusional claiming your laptop is 5x faster in CPU. Mocking a "phone CPU" when the A15 is actually faster than a 5950X in single core performance shows you don't know what you're talking about.
You probably wouldn’t have gotten downvoted as much if people hadn’t (ironically?) read the part you said they wouldn’t read :p And I want to mention that I agree with you about apple not selling to pros very well over the past 6 years.
Your laptop is definitely very capable. But it’s barely a laptop. Why not build out a proper desktop? This precision would be a pain in the ass to carry around for most people, especially travel. Dell made sacrifices to get that kind of power: size, cooling, and battery life. Those are actually meaningful things when it comes to a laptop for most people, even pros.
I think the fact that you're even mentioning a MacBook *Air* in the same sentence is very good for the Air. The M1 hadn't really been marketed to the pro market until the recent release.
Also, 5x faster at what? The M1 is about the same performance as a W-11855M at single core cinebench, and only 25% slower at multicore. So comparison to the M1 pro/max is not very promising for the Xeon chip.
Engineers in the field, on oil rigs, doing CAD, simulations, etc., need huge, heavy desktop-replacement laptops. The licensing cost for the software they use is usually above $100,000, and it's only certified for RHEL or Windows.
Even at $8k, the laptop is often below 5% of the budget.
It's because you're using Geekbench for your numbers, which is a combined number made of misleading stats. It includes things like "encoding video" and "rendering HTML" - things the M1 has specific optimizations for, which in the real world are done by the NVIDIA RTX on my Dell, with the CPU sitting at under 1% utilization. Yes, if you take these tasks, which in the real world don't use the CPU at all, and run them on a CPU with special accelerators for them, the CPU designed specifically to game Geekbench metrics will win. In the real world, I've got a multi-gig dataset I need to process and do calculations on. Go put a database on the M1 and see if it beats a Xeon. Or, for an easier test, just load up Excel with lots of functions and lots of calculated rows and columns. Run that while also running a couple of VMs for virtual appliances on your laptop (128GB of ECC RAM helps with that).
You're literally here saying the M1 is going to replace a server chip. Newsflash: the M1 doesn't even run any of the needed code, because it's ARM code - a small niche.
In fact, I'm not sure you can even order a Xeon Precision without ECC RAM. But I'm not here to do your research for you on things you can look up in a minute. You're the one who claimed a Xeon Precision doesn't come with ECC RAM, without even googling it.
My laptop is a 7760. I also have a 5560 with an i7, and a Latitude 7410 with an i7. It's what work gives me for work - and yes, I use all 3 since we went full remote. For "pro" work. The M1 laptop is comparable to my 7410 - which I use to play online games and chromecast videos, not for any real work. It's a kid's toy compared to my 7760.
Since you seem to be lost here on this tech site, instead of hanging out on Reddit with your peers: all Xeon CPUs support ECC RAM. If you want, you can go on eBay, buy ECC RAM, and put it in any Xeon system. Or, you know, just order it with ECC RAM from Dell.
>It is common for Dell Precisions to ship with Xeons and not ECC ram.
Correct - because most come with 16 or 32GB of RAM and are the low end of Dell's pro line. Once you get a lot of RAM, like 64 or 128GB, and you're crunching numbers and running VMs, your chances of a memory error go up dramatically. Which is why you need ECC RAM. Which has zero to do with your post that I was replying to, claiming Precisions don't have ECC RAM. Now find me an M1 laptop with 128GB of ECC RAM. Because you're right - enough strawman astroturfing from you.
The M1 is a Latitude competitor, not a Precision competitor. Apple's "pro" line is considered mid-tier by other vendors' standards. Their tests showing it beats Xeons compare against Xeons released 2 years ago, which for some reason they used in their Apple laptops. Because Apple has always used outdated CPUs compared to everyone else.
> since you seem to be lost here on this tech site, instead of hanging out on reddit with your peers
This kind of behavior makes you seem much more lost here on HN than the guy you're replying to. And looking at your downvoted and flagged posts all over the place, HN seems to agree.
I don’t need to google it when I own such a system.
My precision did not ship with ECC ram.
ECC also needs to be supported by the motherboard; all AMD Ryzen chips have ECC enabled, but due to limited motherboard support, many are not able to effectively use it.
If you have the time, could you share the output of `sudo dmidecode --type 17`?
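For reference, here's roughly what to look for - illustrative output, exact fields vary by machine; 72 total bits vs. 64 data bits is the ECC giveaway:

    sudo dmidecode --type 17 | grep -E "Total Width|Data Width"
    #   Total Width: 72 bits
    #   Data Width: 64 bits

    # The memory array entry states the ECC mode directly:
    sudo dmidecode --type 16 | grep "Error Correction Type"
    #   Error Correction Type: Single-bit ECC   ("None" on non-ECC systems)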
You're looking at Geekbench scores that do a bunch of GPU-offloaded stuff, and they used the low-power integrated graphics, not the NVIDIA RTX, in their tests - you have to explicitly select the discrete GPU for a process to use it, as everything defaults to the low-power one. So this is something that doesn't even look like a mistake of taking the defaults - it looks like deliberately lying to game the numbers.
Yup. I, too, call BS on that. I own a Precision laptop with an 8-core Xeon, and this thing is heavy, noisy, and can't work on battery for more than 2hrs under a normal workload.
Just for fun, I looked up the most expensive Dell Precision Xeon laptop, which seems to be the Precision 7760 with a Xeon W-11955M.
With a GeekBench score of 1647 ST/9650 MT, this $5377.88 machine is just a bit faster than the passively cooled $999 MacBook Air with 1744 ST/7600 MT. The MacBook Pro 14" 10 core is better than the Dell Xeon in about every way, performance, price, performance per watt, etc.
This is the Xeon that's in the laptop. The Geekbench score is based on running tests on the low-power integrated graphics instead of the discrete NVIDIA RTX GPU. The numbers you idiots keep quoting are completely bogus. Anyways, losers, enjoy your Apples - I've got real work to do.
I run about $50k worth of software licenses on the laptop and generate millions in revenue per quarter with it. That's why it's called a pro laptop, and I'm sure my company paid about double your number after you add in Dell's Pro Support Plus. A pro laptop for pro work. You're a kid who wants toys but wants to say you're using professional equipment. I've got something that's like the pro Mac laptop - work gave me that too, for secondary tasks. It's called a Dell Latitude. It runs the latest i7 with no ECC memory. Great for chromecasting porn and playing games in the browser, with 13 hours of battery to the Precision's 9, and much lighter. They just don't call it a pro.
I think there's also a point everyone who talks about other "discoveries" of America purposely ignores: this is not about its discovery for Europe the continent. It's about its discovery for what were the civilized nations of Europe at the time. So if we look at Iceland - a place that's essentially a standalone island already halfway to America, where a bunch of Vikings lived who at the time weren't hanging out with people from places like Spain or Italy or France - it's an apples-to-oranges comparison.
First, Iceland is only halfway to North America if you count Greenland (which is kind of odd, since both Iceland and Greenland are islands between the two continents). The distance between Iceland and Labrador is twice the distance between Iceland and Norway. And that double distance comes on top of the much, much rougher seas of the Labrador Sea compared to the North Atlantic. So for small sailboats, Europe is definitely close while North America isn't.
Second, people traveled a lot both to Iceland and from Iceland in the centuries after the voyages mentioned in Grænlendinga Saga. Ships went to Iceland to trade or fish, and people went from Iceland to continental Europe for pilgrimage, trade, etc. These people definitely talked to each other and told each other stories of their ancestors. I wouldn't be surprised if some Portuguese fishermen were told Grænlendinga Saga while wintering in Iceland sometime in the 14th century after their trip home was delayed for some reason. Or if a pilgrim from Iceland told a fellow Spanish Christian in broken Latin about Leifur Eiríksson on their way to Rome.
Third: Flateyjarbók (which contains written stories about the Norse settlements in North America) was written down in the mid-13th century. The Icelandic sagas were coveted by Scandinavian royalty, and I bet royalty in both Norway and Denmark knew about its existence, and might even have heard Grænlendinga Saga recited.
Now it probably wasn’t common knowledge that there were lands west of the Atlantic which people once tried to settle, but it probably wasn’t unknown either.
It is not hard to imagine an alternative scenario where, by some freak luck, Christopher Columbus happens to talk to a person whose great-grandfather told a story about an Icelander they walked part of the way to Rome with. “Curious folks, those Icelanders,” they say. “In the old times they used to sail all around the world. Even going west of Greenland.”
“Greenland? You mean the icy land way north in the Atlantic where they get those walrus tusks?” Columbus replies.
“Yes, there! Apparently there are some much more favorable lands southwest of there. I wonder how much further south it reaches - maybe as far south as Africa?”
Or maybe a scenario where a common crewman on Columbus’ voyage knew about these stories from a Basque fisherman, who in turn heard them while on a fishing trip to Iceland. “This isn’t Japan,” he claims. “An old friend of mine heard stories about lands as far west as this - albeit further north as well. Maybe these islands are part of the same island chain which lies between Europe and Asia.” This crewman is promptly laughed at. “Of course this is Japan - our captain says so,” they say, and the crewman never mentions it again.
One doesn't have to dream. Flashing a ROM on Google-branded phones is so simple that a non-tech person can follow a 5-minute YouTube video to do it. The Nexus 6 from 2014 can have the latest Android running on it - not just security updates. And unlike an iPhone, it has a build that disables some of the eye candy, which keeps it actually usable and fast. As I understand it, if you run the last supported iOS on an iPhone 5s, with all the patches, you can take a nap while waiting for the answer slider to draw when you get a call. Is that the dreaming you're talking about? During the nap?
I'm a tech guy, though. I had a Nexus 6, and now I've got a Pixel 2 - all custom ROMs, completely de-Googled. In addition to phone tasks, I use the phone for solitaire, basic web reading, and email. I charge once per week. Both phones are extremely easy to flash. No hacking or exploits required.
That’s all great - but don’t lure people too much into a false sense of security. While your Nexus 6 may run a shiny new version of Android, underneath it runs a crusty old 2017 kernel full of holes of different sizes. The community is great, but vendor support remains important. LineageOS and other projects can’t fix things in kernels they can’t compile - they can only provide security updates for open source components.
That makes Google’s promise here so key. 5 years of updates is 5 years of kernel level fixes. After that, it’s probably left up to the community.
I really don't recommend that people go out and buy abandoned Android phones to flash software onto. LineageOS and other community projects are a blessing in many, many ways, but they don't make your phone completely up to date. And that's something one should make an informed decision about (buying an iPhone - I decided against that).
> underneath it runs a crusty old 2017 kernel full of holes of different sizes
> LineageOS and other projects can’t fix things in kernels they can’t compile
I think you're wrong on this - unless you used the term "kernel" above too liberally, referring to all the software running on a device. AFAIK, alternative Android images such as LineageOS include relevant - and quite up-to-date! - AOSP common kernels (aka Android common kernels, or ACKs; https://source.android.com/devices/architecture/kernel/andro...), which are open source, plus some manufacturer-specific proprietary binary drivers and firmware (though there is a related, but slow-moving, project, Replicant, focused on creating and maintaining a fully open - i.e., kernel + drivers + firmware - Android distribution: https://replicant.us).
No, I'm talking about the Linux kernel. You can check this for yourself: take LineageOS as the example project, look at the ROMs it distributes, and see if they include kernels that are up to date in any way. For older phones outside of vendor support, those kernels will always be out of date.
Some diligent LineageOS projects are known to incorporate some open source kernel fixes, or to grab newer blobs from other devices. But there's only so much they can do. In general, it's true to say that older devices with community Android support are not completely up to date - the kernels are old, and the vendor drivers are not getting updated. Outside of making big usability concessions in projects like Replicant, the community can't do much here.
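This is easy to verify for any specific device, assuming adb and USB debugging are set up - compare the Android release the ROM advertises with the kernel it actually ships:

    # The Android version the ROM presents:
    adb shell getprop ro.build.version.release   # e.g. 11

    # The Linux kernel actually running underneath:
    adb shell uname -r                           # e.g. 3.0.101-... on an old S3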
Good points, though I'm a bit confused by your reply. Are you saying that LineageOS folks do not always, or at least mostly, use the latest AOSP common kernels for their ROMs (as opposed to just "some open source kernel fixes")?
I don't know. I'm saying that custom ROMs use kernels that make your phone work. In the best case that involves shipping 1) the driver and firmware blobs the vendor provided while it supported the phone, and 2) a kernel that is binary compatible with those blobs. Because of how Linux works, in the best case (2) is an old kernel of the same major version the vendor shipped with the phone, with maybe some security fixes that made it into the mainline kernel or the Android kernel. But if your stock ROM has security bugs in e.g. the wifi driver, graphics driver, or baseband firmware, your custom ROM has those exact same bugs - even if the custom ROM is years newer than the latest vendor update.
Just ran across this relevant nice little article, which I found quite interesting: https://arstechnica.com/gadgets/2021/09/android-to-take-an-u.... I hope that people who interacted with me in this sub-thread (and other folks here) will enjoy reading it as well.
So would you please help me find a ROM with an up-to-date Android Common Kernel for my i9300 Samsung Galaxy S3?
AFAIK, the only way to run it with working drivers for all hardware components is via ROMs which use the rusty 3.0.101 Linux kernel from back in the day, and I think that is what DCKing is referring to. If you want to create a new ROM, you either have to use the old kernel and accept an upper limit of Android 7.x (in this case), or you have to accept that not all components are supported (e.g. no GPS).
I would be glad if the situation were different. Maybe it is different for phones you buy today?
Obviously, not all devices have up-to-date kernels. It depends on whether they are supported by relevant Android distributions. That's why I used the phrase "quite up-to-date" instead of just "up-to-date". Unfortunately for you, LineageOS has stopped supporting i9300 Samsung Galaxy S3 with the latest official release being 14.1, which is based on Nougat (Android 7.1.2).
Having said that, I ran across the following post that describes a successful installation of a LineageOS 18.1 (Android 11) ROM on a Samsung Galaxy S3 i9300: https://devsjournal.com/install-lineage-os-in-galaxy-s3-i930.... Just FYI - if you understand the relevant risks and feel adventurous, you can try to install it on your device. Disclaimer: I'm neither affiliated with the author of the post, nor responsible for any damage that might be associated with following the advice contained in it.
Thank you for looking up that ROM; I might want to try it out. However, you are also proving my point: even that ROM with Android 11 is still running the old 3.0.101 Linux kernel. You can see it in the last row of the video.
So congratulations to the guy who made it possible to run Android 11 with that ancient Linux kernel, even when Android officially doesn't support it. And to illustrate what I mean by ancient: Linux 3.0 was released in 2011 and got support updates until 2013 [1]. So even when CyanogenMod/LineageOS supported the Samsung Galaxy S3, the included Linux kernels were old as crap. You can't blame them for it, as they had little choice, given that a few crucial drivers are not open source and thus not included in the upstream Linux kernel.
I just wonder if anything has changed for modern devices?
A backported 4.2, which includes some of the 4.3 changes as well, supports Lineage. 4.1 is a version Google supports till 2024, so I'm assuming 4.2/4.3 will be supported even later. So you've got a phone from 2011 that's going to run a modern kernel and the latest Android till after 2024.
> And to illustrate what I mean by ancient
Yes. I would love to see an iPhone from 2011 that's going to be running the latest iOS and Apple kernel after 2024.
Given that the kernel still identifies as 3.0.101, my guess is that they just backported some features from 4.x and applied them to the ancient kernel ;-) I am not so sure that qualifies as a 4.1 in terms of Android support.
I think the discussion about which devices live longer is simple to answer: Apple (iPhone) and Google (Nexus/Pixel) probably do the best job of supporting their devices for a while from a manufacturer's point of view (in comparison to Samsung, Xiaomi, LG, Huawei, Sony, etc.). However, if you want to spend some time and flash alternative ROMs yourself, you are better off with Android due to the large modder community, though it also depends a bit on the device you bought.
My biggest issue, on the other hand, is that if the manufacturers open-sourced the drivers, they could be included in the Linux kernel and we would not be having this discussion, because one could simply use an up-to-date kernel, as you can with every PC.
And how does the kernel affect you in any way? Most of the internet runs on old kernels, because servers use long-term stable kernels anyway. If the UX is good, the kernel shouldn't be a problem for you.
1. UX: most of the time kernel updates don't affect the user experience. However, from time to time there are scheduler updates which can have positive effects.
2. Security: Being able to run the kernel with the latest security updates is evidently very important for keeping your system from being vulnerable to newly discovered exploits.
3. Dependencies: As discussed already, some software components, like Android itself, require certain kernel features and therefore certain versions, so an old kernel can stop you from running the latest versions of the software.
Btw, even LTS kernels are only supported for six years or so.
My biggest problem with the situation is that 99% of the software is open source (Android incl. the Linux kernel), and just a few vendor-specific drivers make it very hard to upgrade the kernel and therefore the system.
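To make points 2 and 3 concrete, here is a rough sketch that flags a kernel series as past upstream end-of-life, seeded only with the dates mentioned in this thread (3.0 until 2013, 3.10 until the end of 2017); extend the table from kernel.org for other series:

    # Illustrative EOL table: kernel series -> last year of upstream support.
    # The 3.0 and 3.10 dates are the ones discussed in this thread; add more
    # entries from kernel.org as needed.
    EOL_YEAR = {"3.0": 2013, "3.10": 2017}

    def is_past_eol(kernel_release: str, current_year: int = 2021) -> bool:
        # kernel_release is the "uname -r" string, e.g. "3.0.101-g1234567".
        series = ".".join(kernel_release.split(".")[:2])
        eol = EOL_YEAR.get(series)
        return eol is not None and current_year > eol

    print(is_past_eol("3.0.101"))   # True - the Galaxy S3 kernel discussed above
    print(is_past_eol("3.10.108"))  # True - the Nexus 6 kernel discussed below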
It is different for phones made by the people who also make Android: Google. Which is why I was specifically talking about the Pixel and Nexus phones sold by Google. For example, kernel version 4.9.3 - the latest one (yes, originally released in November of 2017) - supports up to the latest Android. In fact, even 4.1 supports the latest Android, and will till June 2024 according to Google. I'm going to go out on a limb here and, given the current timeline, project that 4.9.3 is going to be supported for probably whatever Android is released in 2026.
So, a Nexus 6 released in 2014 will be able to run the latest Android, fully security patched including the kernel (which is not that important), till about 2026.
Now let's keep in mind that I replied to a guy who said how great it is that iOS has more longevity.
> So, a Nexus 6 released in 2014 will be able to run the latest Android, fully security patched including the kernel (which is not that important), till about 2026.
This is getting into borderline misinformation here. Sorry to have made you dig into this position, but please don't call this fully patched. Qualcomm abandoned the Snapdragon 805 in the Nexus 6 in 2017 (maybe even 2016), and no updates to that platform's kernel drivers or other proprietary components exist. You can patch up the open source pieces - those are important too - but that doesn't count as "fully security patched". Kernel drivers are a very important attack vector on any system, on Android especially so.
This is why e.g. CalyxOS has these EoL notices for Google devices much newer than the Nexus 6 here: https://calyxos.org/install/
They're honest that not everything can be updated!
If you choose to run your devices this way, more power to you. It's a legit way of extending a phone's life with some tradeoffs. But please inform others about the actual limitations.
> For example, kernel version 4.9.3 - the latest one (yes, originally released in November of 2017) - supports up to the latest Android.
I couldn't find anything online about Nexus 6 kernels that are not some version of Linux 3.10, which, despite being an LTS release, was EoLed by the Linux kernel developers at the end of 2017. I would be curious to see any sources for the claim that the Nexus 6 has modern-ish kernels available.
It's a rare feat that Android devices get a new major kernel version, _even with_ vendor support.
It's not the kernel security updates that are important with regard to this 5-year promise; those are all open source and can be applied to any device a ROM (such as CalyxOS) supports. It's the proprietary firmware blobs that are the big deal, and what this 5-year promise from Google means is that those blobs, required for certain hardware on the device, will receive 5 years of security updates. And that's good, because those are the security vulnerabilities that e.g. the CalyxOS team cannot patch themselves (no source code).
This is why CalyxOS now makes it clear which devices they support are still getting full security updates (kernel + firmware blobs) and which are getting just kernel updates. I believe the most recent CalyxOS patch added the ability for the user to see in settings the month and year of the last firmware security update for their device vs. their current kernel security update.
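If you want to eyeball that gap on your own device, here is a minimal sketch, assuming adb; ro.vendor.build.security_patch is reported on most Treble-era (Android 8+) devices and may simply be empty on older ones:

    import subprocess

    def getprop(name: str) -> str:
        out = subprocess.run(["adb", "shell", "getprop", name],
                             capture_output=True, text=True, check=True)
        return out.stdout.strip()

    os_patch = getprop("ro.build.version.security_patch")
    vendor_patch = getprop("ro.vendor.build.security_patch")
    print("OS security patch:    ", os_patch or "(not reported)")
    print("Vendor security patch:", vendor_patch or "(not reported)")
    # Patch levels are ISO dates (YYYY-MM-DD), so string comparison works.
    if os_patch and vendor_patch and vendor_patch < os_patch:
        print("Vendor blobs lag behind the OS patch level.")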
Alright - I'll bite. This is a smartphone, not a Windows PC with a bunch of services. There is zero listening on any port. There is no attack surface for any kernel - the only thing there would be is a bug in MMS. Please share your source for kernel attacks, on any Android version, that's not an attack on an app but on the kernel. No, this is not a Google Play attack, or an attack on an outdated app - those get updated fine.
In addition, I'm unsure why you think you can't update the kernel on a phone. In fact, updating the kernel is standard procedure in pretty much all instructions for flashing a custom ROM. I had my Nexus 6 on kernel 4.9.3. There are literally new phones, right now, selling with that kernel version and earlier, with Android 11.
This is like saying Windows Server 2016 has a kernel that's outdated, or that Windows 10, which came out in 2015, is outdated.
I think you are extremely confused.
>I really don’t recommend
Which is a good thing, because you should not be making recommendations about things you do not understand on even a basic level.
>After that, it’s probably left up to the community.
Right - the entire point of my post. You can load stuff from the community, which includes the community around things like Lineage - a big official community that's an LLC, a corporation like Red Hat.
A phone is not a server. It is not a security risk to run an outdated kernel. There are no services running that a hacker can connect to. You don't connect to a kernel over the internet. And it's a kernel which is by no means out of date, and is currently running in many datacenters.
Smartphones aren’t servers, but they run tons of services that interact with the surrounding world.
Bluetooth, WiFi, etc…
The kernel also still plays a vital and security-meaningful role in processing calls from applications.
Running an out-of-date kernel could mean strangers ransoming your data, or could mean an attack becomes persistent across reboots and keeps logging and uploading.
Running an out-of-date kernel often does not lead to that, and higher-level security matters first.
However, the kernel does have an attack surface through those higher levels, and pwning the kernel still means something.
Those datacenters are running LTS kernels with the minor versions kept updated, or have security patches backported, or have far more limited connections to the world than your phone - only one protocol, one port, one service, for example.
We are not talking about datacenter servers - we are talking about smartphones. You can run a 4.9 kernel with all security patches applied, just like you can run Windows 10 with all security patches applied. You can update Bluetooth and wifi modems without going to a later kernel version. We call those drivers, not kernels.
The issue you note is only exploitable via a bug if you have an outdated version of the Chrome browser. You don't need to update the kernel in order to update an application.
Seriously, I feel like I'm talking to my wife here, who is not a tech person. Why are you and the other couple of people being purposely dense, and purposely ignoring the content of your own links that doesn't fit your viewpoint?
BTW, after you said smartphones aren't servers, you went on to talk about why an older kernel is bad on servers.
But since you asked: the latest 4.9.3 kernel running on that Nexus 6 from 2014 appears to have been compiled at the end of 2019.
Good luck finding drivers for phone wifi, Bluetooth, etc. That's the fking problem - Linux doesn't have a stable driver API, so binary blob drivers stop people from upgrading across major Linux kernel versions.
If everyone around you is stupid, then maybe you don’t understand the topic at hand?
> There is no attack surface for any kernel - the only thing there would be is a bug in MMS. Please share your source for kernel attacks, on any Android version
There are various kernel-level vulnerabilities listed. Some weaken privacy over TCP connections; others are locally exploitable via a malicious app such as Pegasus.
I don't understand why you call him confused. Perhaps you can approach with curiosity instead.
I'll start by saying I spent a full 5 minutes reading through those and gave up. I asked for an example; you pasted twenty pages of random garbage and said "here, maybe you'll find something in this dump I took - why don't you spend some time and maybe I'll prove you wrong."
In those five minutes of looking through your garbage dump, I found zero vulnerabilities that do not need either you installing a virus, which then gets root (the vulnerability), or a bug in an out-of-date application running as root, which then of course gives the application's attacker root. None of those are valid examples, and I'm now bored of digging through random garbage.
Any hack, in any application, will give the attacker root - we're running rooted phones (for the extra functionality).
If you want to make a point, cite an actual listed bug that does not need a compromised application. You installing a virus, and then the virus getting root, does not count. The thread is about a kernel bug giving a remote attacker control of your phone. Applications and drivers like your modem can be updated without you updating the kernel. The latest N6 kernel is 4.9.3, with updates from the end of 2019.
Do you also run all your programs as root on desktop? Wtf.
Also, regarding your previous post: modern Android and iOS are light-years ahead of any desktop OS out there in security, for good reason (the majority of people interact with their phones, and store much more sensitive data there).
>Do you also run all your programs as root on desktop? Wtf.
Yes, always have. Same on Windows, where I also don't use antivirus. And this is what most tech people do for their personal equipment. Because in my 30+ years of using computers, and 20+ years of doing it professionally as a dev, sysadmin, and storage admin, I only once got a virus.
I'll tell you a little secret too. Yes, it's wtf to people who don't know what they're doing and need the safeguard for when they screw up. I know enough to not screw up. Now go pipe a bash script from a webpage to sh to install something, because that's what the installation manual for your game said to do.
Anyone saying they know enough not to screw up most definitely knows hardly anything. Also, screwing up is not about knowing enough; it's about being human, and humans make mistakes.
Running anything under root is just insanely stupid.
For those times when I run that terminal application on my phone - which is already rooted, so it doesn't need a kernel bug to get root; it can just run.
I'm not sure I'm the one confused here. Not really willing to get combative on what security priorities one should have, but I'll stick to mine.
> I had my Nexus 6 on kernel 4.9.3.
I find this very hard to believe, as I can find no evidence of Nexus 6 kernels other than the 3.10 Google originally shipped. Even PostmarketOS, which looks to update kernels, links to LineageOS's fork of the 3.10 kernel on its page for shamu/Nexus 6.
Unless you mean a custom kernel from "some guy on XDA" that names itself 4.9.3 like this one - which is just kernel 3.10 with some branding on it. It says so right in its description: https://forum.xda-developers.com/t/kernel-sm-4-9-3-o3-graphi... . Kernel 4.9.3 is a weirdly specific point release to be on in modern times anyway - there's kernel 4.9.0 all the way up to 4.9.287 - so it'd definitely be oddly specific if that's what you had.
Outside of valiant community efforts like Replicant and PostmarketOS, who have an extremely hard time getting working or feature-complete kernels running, Android devices getting new kernels is almost unheard of, even with vendor support. Community ROMs have to stick with what the vendor gave them to have a functional device.
I think you're terribly naive if you think a phone kernel has no attack surface. It is absolutely a security risk to run an outdated kernel. It has nothing to do with whether there are services running for a hacker to connect to; it's about whether it's possible for an attacker to trigger buggy behavior somehow, whether that's sending malformed packets or Bluetooth frames or invoking patterns of syscalls that cause bad things to happen. Heck, here's an obscure bug in Linux, which Android is based on, on the front page of HN right now: https://googleprojectzero.blogspot.com/2021/10/how-simple-li... Also, I know GP was specifically talking about upgrading the kernel, but keeping drivers patched is much harder without vendor support, and there's likely to be more attack surface there.
Your phone is not a Linux server. Yes, if you install a virus or an outdated app, someone can daisy-chain a privilege escalation using a kernel bug. No need for that though - my phone is already rooted.
Your car has pieces that run Linux too. Guess an attacker can make you crash.
> drivers
Since this is about an iPhone vs. Android comparison: guess what has those same driver blobs from those same exact manufacturers. Apple doesn't make their own Bluetooth chips. Oh, btw, the drivers get updated just fine, since they're part of the kernel and OS, which all get updated just fine.
Google supports kernel 4.1 till 2024 for Android 11. The Nexus from 2014 runs 4.9. So probably a 2026 kernel and Android, fully patched - 12 years.
Oh, sorry, did you forget this thread started with a guy claiming iOS is great because you can put later versions of the OS on there? Where's that iPhone from 12 years ago running the latest version of iOS, and still performing fast? Because that's what this thread is about.
I really don't get why you're so hung up on this server thing. Yes, a phone is not a server. But it still runs a lot of complicated software. Software has bugs. We haven't found all the bugs yet. Hence, it's important to keep all of the software as up-to-date as possible for when people find some of the bugs.
> Your car has pieces that run Linux too. Guess an attacker can make you crash.
> the drivers get updated just fine, since they're part of the kernel and OS, which all get updated just fine.
Just because the kernel is getting updated does not mean the drivers and firmware are also getting updated. Drivers are specific to hardware, and if a vendor stops shipping updates for some chip that is no longer used in newer phones, then you aren't going to get updates for that chip.
> since this is about an iPhone vs. Android comparison
This isn't about an iPhone vs. Android comparison, not for me. You made naive claims about kernels not having attack surface and the unimportance of staying updated, and I am responding to those claims.
I love the openness of android, and the explicit permission to root your phone.
But iPhones have amazing longevity. The iPhone 5s you mentioned came out in 2013 - 8 years ago now. Back then Obama had just started his second term. Maybe it is way too slow to handle the most recent version of iOS, but I'd rather have a phone vendor that releases operating system updates for 8 years than a vendor who releases updates for only 2 years (like you get with certain Android vendors).
Last year I replaced my iPhone 6s with an iPhone 12. The thing that astonishes me is that I didn't need to. After a battery replacement, my five-year-old iPhone was still running fine. It still runs the latest OS, and it ran every app I threw at it with aplomb. I really only upgraded as a personal indulgence. It's still in use by a friend.
I'm absolutely on board with complaints about Apple's lock-in. I'm disgusted by some of the documents that came out in the Epic court case, and I wish you could easily root iPhones. But it feels like a stretch to complain about their longevity.
I love the iPhone. I get my wife the latest and greatest every two years, and I forget about it. If she had an Android anything, I'd be spending at least an hour per week on tech support. It's absolutely worth the inflated price for me, and the fact that it's extremely limited in possible features is a bonus. Just like I used to love stick, but now an automatic transmission is great, as my enjoyment is the destination, not the trip.
Now, as far as the iPhone 6s being usable - that's my point. It is usable, on the old OS it was designed for. Because you can't load your own OS on it, it will never run the latest. While the Nexus does run the latest, and is completely usable. I do remember when my brother loaded some latest iOS on his iPhone 5s, and it literally became too slow to answer a phone call.
Apple's lock-in is in my opinion a feature for its target market. That's why they get like $1200 from me every two years. Me, my concern was battery life. For that I needed to not have crap that keeps phoning home and waking up the phone. Imagine charging once per week. While not an issue now, I used to travel a lot - country-hopping trips. Yes, you can charge at the airport, tied to a charging pole for an hour. Yes, you can charge while sleeping on the plane and have a USB cable hanging six inches in front of your face, getting in the way. Or... you can literally not worry about it for a week.
There are of course other things - I want to chromecast my screen or cast a movie from a pirate streaming site (not the YouTube app). I want toggles on my lock screen and home screen to turn off data/wifi/bluetooth. I want to turn on the flashlight by pressing both power buttons when the phone screen is off. More importantly, I need a filesystem that I can store OVAs on, to take to customer sites for demos - why would I carry a USB stick when my phone is always with me? I want a web server running on it, and my laptop to dump a backup of itself onto the phone daily. This means the phone software needs to recognize that the phone hardware is a computer, not a toy for 5yo kids. My wife, on the other hand, needs it to be a toy, because if it wasn't, she'd do everything possible to get viruses, delete everything, and screw something up. So I got an Android; she has an iPhone.
Now, you think I'm complaining about longevity. Let's see the reality though.
The post I'm replying to touts the iPhone's longevity compared to Android. I point out that Android has much, much longer longevity and he has it backwards. You then declare I'm complaining about the iPhone's longevity.
Now, normally I would unload on you with all kinds of funny (for me) things at this point, because you now fit into a certain category of people, but this isn't the place.
> get my wife the latest and greatest every two years, and I forget about it. If she had an Android anything, I'd be spending at least an hour per week on tech support.
That's... an odd thing to say. I'm not sure what you're saying about your wife, but I've never had anyone, young or old, have a problem with an Android phone that would require anywhere near that amount of time.
I've got to hard disagree with many of the points here based on my own experiences.
My whole family has Android phones from different makers, except two people with iPhones, and they don't need hours of tech support. Your experience may be different, but I think most people using Android phones would agree that, for the most part, it just works.
As for battery life and the latest iOS: once you upgrade your iPhone to a later version, it is hard to go back - you need hacker chops to do that, if it is even possible. Later versions of iOS do often reduce performance and battery life.
On top of that, iPhones have smaller batteries, so even with a tightly integrated OS, the battery level drops precipitously under active use. Sure, they last ages when not touched, but what's the point of that when a video call drops the battery by 50%?
Most people stuck to power banks these days are people using iPhones, especially the smaller iPhones. Androids have taken care of the battery issue by going with 4000 mAh+ batteries.
>If she had an Android anything, I'd be spending at least an hour per week on tech support.
I highly, highly, highly doubt that.
Considering how static phones honestly are after initial setup - once you've installed the apps you need and configured the few things that need configuring, you never touch anything that's not an app.
Try installing 5 random apps per week from the Google Play store and report back to us with the results. Make sure to change your phone to Mandarin and look for apps from China. Repeat in Japanese and Cantonese. This is what my wife does; she's a language teacher and translator.
> If she had an Android anything, I'd be spending at least an hour per week on tech support
First: I hope your wife doesn't read this ;-)
Second: I can believe this to be true if-and-only-if you tried to run it on a language setting you can't read!
Third: One gets comfortable with whatever phone+ecosystem they're familiar with or use the most, and that is not a basis to claim one is superior to the other. For that person, yes, it might be a superior experience, but it cannot be extrapolated to the general population.
In my little circle, if anything, I hear complaints and "how do I get this done on my phone" requests way more from iPhone-owning family members and friends than from the ones with Android phones. But this doesn't make Android a superior OS over iOS, because it is just one data point.
While Apple does provide updates for older devices, the devices are barely usable. Foremost, you most certainly will need a battery swap on anything older than 3 years (not a cheap proposition), and their devices seem to get progressively slower in my (albeit limited) experience.
Apple and Android ecosystems and user-bases are wildly different so a true apples-to-apples comparison (pun-intended!) is not trivially possible.
> Now, as far as the iPhone 6s being usable - that's my point. It is usable, on the old OS it was designed for.
That phone was running the latest OS when I gave it to my friend last year. I think it might even have been running faster, thanks to iOS 13 (or whichever version improved performance). I believe you when you say your brother's iPhone 5s became unusable with subsequent updates. But my 6s kept chugging along just fine, updates and all.
I'm delighted there are solutions for Android phones like what you're talking about. This sort of thing is really important - I mean, they're fully fledged computers capable of way more than we're able to use today. It's crazy that people throw them out after a few years. My iPhone 12 is faster than my 2016 MacBook Pro, and I still occasionally code on that laptop. If I could run OSX on my phone and use my laptop as a terminal for it, that would be really sweet. But I can't, because Apple doesn't care, and I'm locked out of making changes like that on my own hardware. Using old phones as web / file servers would be fantastic.
Companies like Apple are actively incentivized by the market to make their old products feel worse over time. And for that reason I'm always impressed when occasionally they release an OS update that improves performance across the board.
I guess my take is: Android phones have an awful history of dropping official support for recent devices. I'm delighted the hacker community can and has stepped in to clean up Android's mess. It's a shame they have to, but such is life.
I'm sad you can't do that on Apple devices, but one saving grace is that, the 5s aside, Apple seems to do a much better job of official software longevity than Android. I'm expecting my iPhone 12 to last 5-10 years. I do wish the battery lasted all week though - that sounds phenomenal.
I'm not convinced companies are out to "make their old products feel worse over time". It's just the inevitable consequence of the steady march of technological progress. 10-20% per-year performance improvements and new radio/camera hardware add up over time and mean that your old phone is worse than a new one. And that's before you take into account any degradation in things like NAND and battery, which they try to make fail gracefully (even if the PR messaging occasionally goes very wrong on that front).
There is of course also a degree of investing more time in writing for the new hardware than the old, and of cutting features that don't fit older hardware due to lack of processing power or lack of underlying tech, but that's not something that being able to throw a different OS on seems likely to fix?
> I'm not convinced companies are out to "make their old products feel worse over time"
I don't think they're trying to make their old products worse over time. But I also don't think companies generally care that much about making old products work better over time. One of the parent commenters noted how well modern Android runs on really old Nexus phones if you strip out the "modern" animations and useless features. There's nothing stopping Google doing this. People would love it. So it's notable that they don't. Apple got a lot of goodwill from me a few years ago when they focused on performance in iOS 13 (or was it 12?). That OS release made my phone feel new again. After that update I think it ran faster than it did when I bought it.
Another way to think about it is that when you buy a product, your incentives and the company's incentives are aligned. You want the best phone. The company wants your money, and knows they need to deliver a good product to get it. After you've bought a product, the company's motivations aren't as well aligned with yours.
Arguably a company sells more phones in the long run when it has a good reputation for delivering on quality and supporting its products. E.g., a few years ago some of my friends would buy every single Blizzard game simply off the back of their reputation.
But most companies don't take advantage of this, and mistakenly focus on short term sales even if it harms their reputation. And, in turn, their long term profits.
As someone much wiser than me said, service and support is a form of marketing to repeat customers.
People aren't comparing their old phone to the new hotness, but to how it was when it was new. I think it is reasonable to assume that planned obsolescence is a thing and that OEMs make their products slower on purpose so people buy new ones.
I completely agree with you. If you want simple, if you want OTA updates from the people who made your phone, if you don't want to worry about it - the iPhone is perfect; I buy them for my wife.
But the discussion in this thread was specifically about the claim that iPhones, unlike Android, have a long life of updates. That's like saying "my Dell from 2010 came with Windows Vista, Windows Vista is not supported, the computer has a short support life." Umm, no - you put Windows 10 or Linux on it, can probably put Windows 11 on it, and in 30 years you can still put the newest Linux on it.
This sounds brilliant. If you were to start fresh today, what device would you use? My aged iPhone is on its last legs, and I'm looking for something on the small side (preferably Nexus 5 sized at most) that I can degoogle and use for Telegram, HN, Reddit, music, and podcasts. It seems like every older phone has a gotcha, like nonfunctional cameras or missing wireless bands. 7 days of battery sounds magnificent.
For 5 years of real support you will need to just get the Pixel 6, once the ROMs you are interested in are ready. Graphene, Calyx, and Lineage will support it, but it might take a few months for their teams to get up and running. They are all very fast though.
Graphene and Calyx only support devices as long as Google is putting out the security updates, so all the phones before the 6 will only get the ~2-3 years that Qualcomm limits updates to. I am not sure how Lineage is able to support devices for so long after vendors stop supporting them themselves. They are a super dedicated community of volunteers, though. Here is where Graphene talks about why they drop support after vendors no longer officially support a device: https://grapheneos.org/faq#legacy-devices
Of these projects, Calyx and Graphene are the easiest to install. For Graphene you only need a Chromium browser and to allow unlocking your bootloader in the developer settings; the whole wipe, install, and flashing of their key (so you can re-lock the bootloader) happens over WebUSB. Calyx has a script you download to do the same. Lineage is a hair more involved.
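For reference, the manual route those installers automate starts the same way everywhere; a minimal sketch (unlocking factory-wipes the device; modern Pixels take "fastboot flashing unlock", while some older devices use "fastboot oem unlock" instead):

    import subprocess

    def run(*cmd: str) -> None:
        # Echo and run a command, stopping on failure.
        print("$", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # WARNING: unlocking the bootloader factory-wipes the device.
    run("adb", "reboot", "bootloader")     # drop into the bootloader
    run("fastboot", "flashing", "unlock")  # confirm on the device's screen
    # From here, follow the ROM's own instructions (web installer,
    # flash-all script, or adb sideload, depending on the project).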
The Samsung S10 and S10+ look to me like the place to be, flashed with a custom degoogled ROM: Nokia Maps (HERE WeGo), Open Camera, something like Aptoide, K-9 Mail.
I did a lot of research earlier, because I don't use a case, dropped my Pixel 2 XL, and the glass on the corner cracked (I ended up just putting a dab of epoxy on it instead, though). I use the carbonOS ROM on the Pixel, which is only for Pixels, I think. You do have to go through a lot of system services and turn off the unneeded ones, though - lots of useless stuff like "carrier services" and "sprint dm" and a bunch of other crap; just google them one by one. An app like Fibers is great too - I use it to do things like display percentages instead of icons and, when I need to, turn off half the screen's pixels. You can do that in low-brightness situations, like reading this site in bed with the lights off, and you can't tell it's half the resolution. AdGuard is great too - blocking ads at the DNS level saves quite a bit of battery when online.
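If you'd rather script that cleanup than tap through settings, here is a rough sketch; the package names below are placeholders (list the real ones with "adb shell pm list packages" and substitute), and "pm disable-user" only disables a package for your user, which can typically be undone with "pm enable":

    import subprocess

    # Placeholder package names - look up the real ones on your device.
    UNWANTED = [
        "com.example.carrierservices",
        "com.example.sprintdm",
    ]

    for pkg in UNWANTED:
        # Disable for user 0 only; reversible via "pm enable <package>".
        subprocess.run(["adb", "shell", "pm", "disable-user",
                        "--user", "0", pkg], check=True)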
Now here's the main thing - I do spend a couple of hours per day using it, either for email or reading sites. I get about 4 days from 90% to 25%. I never go below 25% or above 90%, so my battery is like new 3+ years later. Another feature an iPhone can't have: an app with access to set your max charge limit. I'm just guessing that 100%-0% would be about 7 days, so I can't fully promise that.
Also, a couple of banking apps don't work. The Uber app doesn't work either - you have to use the website versions.
Sadly it's just too big for me. The 4a is about as large as I'd go. And I'm even skeptical of that because as networks phase out 3G and even some 4G bands in favor of 5G, phones that don't support 5G will become increasingly hard to use. Talk about planned obsolescence!
You're ok with a Pixel 5 size (6" screen) but not with the S10 size (6.1", and it has a headphone jack)? And the actual phone size of the S10 is smaller than the Nexus 5 you mentioned.
The Pixel 5 is larger than the Nexus 5, but it's almost in the range of reasonableness. The S10 is well past that size range - there's no way I can reach the top of the screen.
Admittedly, it is nice that the S10 includes a headphone jack. But phone size is even more important than that to me. Guess I'll keep using my 2016 iPhone SE for a couple more years!
> Flashing a rom on google-branded phones is so simple, a non-tech person can follow a 5min youtube video to do it
Lol, this reeks of how little you understand what a non-tech person is capable of. The vast majority of Android users won't bother, or won't know how, to flash a ROM.
They throttled performance depending on battery capacity, because the alternative was that the battery could not keep up with a sudden power spike, and the phone would power off completely.
The problem was that they didn't notify users, and after people in France won a lawsuit, they now have it opt-in, I think? Nonetheless, it was a feature made in good faith.
The optics of that were absolutely terrible. They could have announced it at WWDC and made people aware of it. Instead they raised a shitstorm for no good reason.
Good intent or not, the management execution of it was completely botched. They could even have come out on top, projecting that they care about older devices' usability, if they had announced the feature; it was a no-brainer, and any PR/marketing person worth their salt would have told them that.
Which makes me question the bit: "good intent on their part" :-|
It's not a custom-ROM thing. They check for root and sometimes Google Play. I have to turn off root if I run a banking app. Uber won't work - I use their website version. The Vanguard app runs perfectly fine on my custom ROM, downloaded from the Aptoide store.
I just flashed my Pixel 2 to Lineage 18 now that it's not receiving updates, and I was disappointed to discover that, because of SafetyNet, I can't use Google Pay in stores or install the Netflix app. So to me it's still less than ideal.
It does not matter if it is a 5-minute YouTube video; that just won't happen for the majority of the population. Most people can't even change the battery in their car.
It's definitely not for the iPhone crowd though. Also, it has the crappiest LineageOS support ever - but there are lots of other ROMs.
I got the Panda version. After 2 months in my side pocket, the white paint - which for some reason the Google geniuses decided should cover the already-white plastic - started peeling off, from rubbing against my leather wallet. Completely irrelevant to the functionality, which is the good part about it. I searched, and everyone without a case is having the paint-peeling issue. But... it's a much better issue than the entire phone, front and back, being made of glass that breaks and costs $100+ to replace, and falls frequently because it's too slippery to hold in one hand.