Is the M5 Max the first laptop with significantly more memory bandwidth than the M1 Max? Looks like about a 20% jump… might finally be time to re-benchmark CFD workloads.
> I have never once told my manager “it would be really nice to have a few junior developers. It would really help us get this project done on time”. They do “negative work”.
I have. A good junior can do in a week what a senior with domain knowledge can do in a half day, with only an hour of mentoring along the way. This isn’t a great exchange rate per dollar (juniors are cheaper than seniors, but not that much cheaper) — but seniors with domain knowledge are a finite resource, you can’t get more of them for love or money, while juniors are fresh-minted every semester. The cheapest way to shipping may not go through juniors, but the fastest way usually does; and that’s completely ignoring the HUGE side benefit of building seniors “the hard way,” which is still easier than hiring.
And as a senior+ with domain knowledge, with AI I can do the work of two juniors without the communication overhead + do all of the project management, dealing with stakeholders, etc.
But you don’t build seniors, you build capable mid level ticket takers who jump for more money at the first opportunity.
How is that true? As an architect when I was working at product companies, the director/CxO wasn't going to call the junior developer on the carpet for not getting a task done, or even the mid-level ticket taker. Hell, they don't even know what the individual tasks are. I'm going to be the one ultimately responsible either way for project success.
Even when I was working at AWS as a mid level/L5 in the Professional Services division (lower title/lot more money), I was the one who was responsible for my “workstream” on larger projects. I couldn’t say “that’s not my fault. Blame the new L4 junior consultant who just got out of the internal boot camp for new grads”.
Now that I have moved back up to a more senior position in consulting, if a project I’m leading goes sideways, I can’t tell the customer that it’s not my fault, it’s the fault of the workstream leads and the workstream leads can’t tell me, it’s the fault of the more junior consultants who work under them. They will never talk to the client. The people under the workstream leads may not even speak English.
And that’s not meant to be an insult. The workstream leads have to be able to speak passable English, and they can work with people under them who only speak Spanish.
But using a browser (or a VM) buys into the fallacy that your customers across different platforms (Windows, Mac, etc) want the same product. They’re already distinguished by choosing a different platform! They have different aesthetics, different usability expectations, different priorities around accessibility and discoverability. You can produce an application (or web app) that is mediocre for all of them, but to provide a good product requires taking advantage of these distinctions — a good application will be different for different platforms, whether or not the toolkit is different.
Just because you don’t want it doesn’t mean <checks notes…> a billion or so people don’t want an iPhone. Or rather, a phone they don’t have to dick with straight out of the box.
OTOH, I don’t really even know what you’re on about. Android is a nightmare because…it’s like iOS, which is “take phone out of box, restore from backup, sorted”? That doesn’t even make any sense, especially in light of what TFA describes.
For many people around my area, iPhones are a status symbol choice.
People in your area are very forthcoming. Not once have I ever heard someone vocalize that they bought an iPhone as a status symbol. “Easy to use”, “it’s what my friends use, and they like it”, but never “it makes me appear higher in the social strata”. They might think it, and I’m sure some do, but it’s not said out loud. Or maybe that’s not why the majority buy iPhones, dunno.
My area is Eastern Europe, where an iPhone costs something like 3-4 minimum-wage salaries. People take loans to buy iPhones! And you could probably also correlate them with luxury-item buyers around here.
Good thing the US message color thing is isolated over there and the peer pressure on Gen Alpha hasn't reached us.
But yes, I stick to my claim. You don't have to press those people hard to get them to tell you that they don't use phones "for poor people". The idiom is local and used both ironically and literally.
It's just a "high-end" good here in the States. Elsewhere it's a luxury good, on par with a Rolex. I use an iPhone because of the smooth UI, the integration with my Mac, and the less-evil company that spies on me less. But let's not kid ourselves, these things are spendy, and conspicuous consumption is still a thing.
>Not once have I ever heard someone vocalize that they bought an iPhone as a status symbol.
This might come as a shock to you, but people don't vocalize and share their desires and impulses on why they buy or do certain things, why they dress a certain way, why they sleep with certain people, etc. Apple's entire brand was built on being different and desirable at the lizard brain level.
In many parts of the world, people even take bank loans to buy iPhones simply because it's the device that all rich people, politicians, athletes, celebrities, influencers use. They don't buy based on the specs and reviews, they buy on what their lizard brain tells them, and no tech company does that better than Apple.
Paying 2000 USD for a phone absolutely is a status symbol. And nobody actually admits that a status symbol is one: nobody says look at my Rolex watch, I paid 50000 USD for it. Nobody needs a 2000 USD phone and nobody needs a 50000 USD watch.
The problem with buying a $2000 iPhone as a status symbol is that no one knows whether you bought the $1100 256GB model or the $2000 1TB model unless you tell them.
But someone that cares about watches knows whether you paid $5000 or $50000 for your Rolex just by looking.
The iPhone is a status symbol in more places than it is “normal” to pay $1k+ for a phone (that much is a year’s salary or more in most of the world). Gotta come down from the ivory tower.
> Android is becoming more and more like iOS: anything that the user used to be able to do... they can no longer do
The article shows this is not true, if you know the similar process for iOS.
The article could be compared to the iPhone setup process. There are some preferences to uncheck, but there is no third-party spying software on an iPhone when it arrives. Contrast that with Samsung.
That type of rhetoric won’t get you what you want. Don’t dismiss something just because you don’t like it.
iOS devices are not toys, and even if they were there is value in toys, and even if there weren’t it is provably false that “nobody wants those”.
Furthermore, if Google dropped Android it is misguided to believe “the FOSS community” would handle it and everything would be roses. What you’d have then are a couple of hardware vendors (like Samsung) publishing their own forks and dozens of different incompatible open-source versions that would get no traction.
iOS devices are. My iPad is the most useless piece of technology I own, calling it a "computer" is an insult to the actual computers I own. It's a toy, and not even a fun toy compared to my Nintendo Switch.
Android handles serious workloads fine, macOS takes software seriously. iOS is the only operating system that treats gachapon as the pinnacle of high-performance workloads.
Hell, I'll double-down if you really disagree with me. ChromeOS, the operating system/spyware installed on e-waste like Chromebooks, has a more serious OS than iOS. It is more functional and capable, and undeniably the better professional OS. I say that with no love for ChromeOS.
iOS exists in a class of its own, functionality-wise. A class much closer to game consoles than to anything resembling a computer.
> Hell, I'll double-down if you really disagree with me.
No wonder the world is in its current state, if when faced with disagreement the reaction is “I’ll plug my ears and dig my heels in deeper” instead of “I wonder if I’m missing something”.
> ChromeOS (…) has a more serious OS than iOS. It is (…) the better professional OS.
For starters, there are professionals (as in, people who get paid to do a job) who do their work on iOS. Not programmers, but writers, illustrators, animators, video editors, photographers, filmmakers… Maybe you can’t do your work on an iOS device (or refuse to even try?)—I certainly choose not to—but that in no way means no one does.
But all of that is irrelevant when you consider the very true fact of life that not everything is about work. Many people want something else, and not making all one’s computing decisions around work is healthy.
> there are professionals (as in, people who get paid to do a job) who do their work on iOS
I don't doubt it. There are people who get paid to do their work on a web browser, if iOS wasn't capable of that it would be a travesty. The flexibility of iOS pales in comparison to the absolute worst desktop OSes, like Windows and ChromeOS. The DAW, IDE and NLE software on iOS outright cannot compete with the offerings on Windows, macOS and Linux.
> Many people want something else, and not making all one’s computing decisions around work is healthy.
You've conceded the original point, then. I can do "real work" with an Xbox, toy shovel or Lego bricks, but it's still a toy at the end of the day. The real tragedy is that iPad and iPhone hardware doesn't have to be limited by toyetic software. It's entirely Apple's choice to restrict my iPad from supporting WINE, having Linux containers and running actual IDEs that aren't arbitrarily gimped by distribution terms.
I don't think my comment is controversial among most iPhone owners, it's only the hardcore ecosystem enthusiasts that debate it. Most people really do treat their iPhone and iPad like a set-top box or games console; it's the minority who rely on it for work. A passionate minority, certainly, but nowhere near the market share Windows and ChromeOS carved out. iOS and iPadOS compete from the sidelines, still struggling to displace (or match) Windows.
You have been refuted. Repeatedly. But as you yourself have said, you double down on disagreements. So I understand why you have been called a troll.
> it's only the hardcore ecosystem enthusiasts that debate it.
That’s not true at all. Case in point, I don’t care for phones. What I did care for was your exaggerated rhetoric. As someone who is critical of Tim Cook and modern-day Apple (especially around the state of their software), I’d rather criticisms remain grounded in things the people at Apple can understand and fix, not made up ramblings that make them dismiss critics as lunatics to ignore.
Your tone changed drastically from the original post. You went from derogatory terms and claiming “nobody wants” iOS devices to them having a “passionate” user base and recognising they can be used for work.
My tone has been consistent the entire time; people want iOS devices because they're toys.
In the overwhelming majority of all professional niches that exist, iOS does not compete for any market share. It is not obsoleting anything, anecdotally or statistically.
Any questions? Or do you have proof that macOS and Windows are genuinely threatened by iOS?
My personal experience is that the setup procedure wildly depends on the phone's vendor.
The biggest difference between setting up a Pixel and an iPhone I experienced was that Google asked for certain settings beforehand that I had to turn off in the settings after setup on iOS. Both would've been a lot faster if I hadn't tried to disable optional account stuff.
Contrast that to Samsung, especially their non-flagship models, where the setup wizard took forever because of the crap Samsung added to the process.
That said, I do appreciate some "tutorial" parts of the setup process on Android. When I first set up an iPhone, I got the distinct impression that Apple assumed I already knew how to do everything. Their interface isn't exactly intuitive if you haven't used iOS before, no matter what online forums may claim. It took me several tries and a Google search to figure out how to remove apps, for instance. Perhaps one might find it an annoying extra step you're going to skip as a power user who's used to the platform, but it felt strange to be dropped into a strange, new operating environment with no instructions.
Samsung phones are the most bloated pieces of shit I have ever seen. Mine came with an app just to view the msn.com web site and it couldn't be uninstalled. You have google and samsung in a tug of war over every single thing the phone does. I thought my home button was breaking because of how unresponsive it was, but it turned out it just waits for a sequence to trigger such and such unwanted AI assistant or whatever the hell. It's a crapshoot whether you accidentally launch google assistant or bixby or some other crap service you never even heard of.
There are touchwiz sound effects that I hear in public and to this day it sends a chill down my spine because it brings me back to so much miserable time spent with that abomination of electronics.
On Samsung phones you can skip making a Samsung account. All the Google bits still work and it's basically the same as having a Pixel, except you'll have a few unused apps, a different camera and phone app and a very slightly different UI.
I had to guess arcane adb permission commands to stop a 2025 Samsung tablet from nagging the user about creating a Samsung account. It just kept showing up multiple times a day. But nice enough hardware with the promise of long updates at a reasonable price.
Heck, I own a Z Flip 6 right now, and the ONLY hard requirement out of everything in the article I went through was the Google account. And I know of a way to get rid of that too; it would just break a couple of things.
I had Mi phones before, and they had only one hard requirement: needing a Xiaomi account for developer mode. I circumvented it.
The article isn't just a pathetic piece of writing in general; it's also operating on top-tier redditor-brained US-defaultism where "Android = Pixel/Samsung". I have no idea why the crowd here is letting it stick around.
To be fair, they are doing this with a Samsung phone, and Samsung is the Apple of Android (big marketing budget, mid quality if we're being generous).
Samsung as a company is a universal No Buy. The fact OP bought Samsung makes me raise an eyebrow.
Credit to Apple where credit is due. When I unboxed my first iphone, I was happy to give Apple all my personal information, birthday, emails, ssn.... It was bizarre, I'm usually apprehensive to give this stuff away, but Apple made it fun. Within a few days, I was disappointed by a lack of widgets, slow transitions between screens, and a buggy podcast app. But the damage was done, my company was out $600 and Apple had my contact info.
Samsung's UI and software behaviour may be shitty in general, but they're one of the few manufacturers reliably offering timely long-term security updates. When you go beyond Samsung, you quickly end up with brands promising "quarterly updates" or having months-long delays fixing CVEs.
Plus, when they do something novel, they do it quite well. Their flagship phones have great price/performance if you buy them a month or two after launch (often for three quarters of the launch price + free earbuds/smartwatch + cashback), their software suite is quite complete and generally well-localised, and they have non-English support channels available.
I do wish they'd fix some of their terrible software design crimes and stop the endless race to the bottom shoving product placement into their apps, but it's hardly a no-buy to me.
Pixel is much cleaner and ships security updates monthly like clockwork. Plus you can install GrapheneOS and you get security updates multiple times per month, no AI nonsense and sandboxed Google Play Services if you need it.
It's pick-your-poison. iPhone setup is eight hundred screens, half of which are upsells for Apple services, but at least it's only Apple services. Android setup, if you're not on a Pixel, is an invitation for the vendor's dozens of "partners" to all get your money and all your data.
There’s enough satellites in Sun-synchronous orbit (97-ish degree inclination) that polar coverage should be pretty good by now, I’d imagine. The gap from the big guns (GEO and MEO) is more than made up by LEO.
I’m pretty sure that the state of the art right now is firing the pastries on a ballistic arc in hard vacuum and hitting them mid-trajectory with a laser pulse to cook them through.
Pretty much everyone is on 300 mm wafers for everything now, and has been for a while. Are you perhaps reading this as a 300 nm process (which would usually be called 0.3 micron)?
But in the context of what we are talking about it's still true that nobody in the EU is making cutting edge CPU/GPU/DRAM and there are no plans to do so either (including that Infineon fab).
I think a challenge for me in typing assembly, unless you’re doing old-school C-style minimally-useful types, is that assembly types tend to be both more ad hoc and more transient than types in higher-level languages, because these types come from the intersection of the problem domain and the way of expressing the solution, instead of just from the problem domain. In C++ I might have a type for “aircraft velocity in mm/s”, but in assembly I might have that type on one line, and then go to velocity in 2x mm/s the next line to save a renormalization; or have types for various state flags, but have them pack differently into a word in different places in the code. This is all expressible, but I think it would make me favor a more implicit typing with a heavier emphasis on deduction, just to minimize the description of types that exist but are not in themselves interesting.
Just thinking about an aircraft's velocity as a specific type, rather than a vector with three floats, has my mind whirling. I can imagine a lot of terrifying things I wish I didn't think could be added later to that struct in some avionics system. What would you need a type for that for? Am I thinking too high level, where this type might include its own getters and function calls?
Think of types more as physical units to check your calculation. The position on a chess board and on a checker board are both 2d integer vectors but you might or might not want them able to be summed together, the same way that 5 liters and 5 grams are both real numbers but should not be summed.
So if your algorithm counts apples and counts pears, those wouldn't both have the type "integer". Far from it. They would have the types "number of apples" and "number of pears".
See the other replies — think physical units. An aircraft velocity is a three-vector, but not all three vectors are aircraft velocity. There are probably many different aircraft velocity types, but taking a typical one (NED alignment, mm/s scaling, some particular precision), the type is the set of three vectors that can, for example, be meaningfully added to each other. It makes sense to add two aircraft velocities; it does not make sense to add an aircraft velocity and a pixel color (another three vector), so they are observably different types.
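The liters-vs-grams point can be made concrete with a tiny runtime check. A minimal sketch in Python; the `Quantity` class and unit names are illustrative, not from any library:

```python
class Quantity:
    """A number tagged with a unit; addition requires matching units."""

    def __init__(self, value, unit):
        self.value = value
        self.unit = unit

    def __add__(self, other):
        # Same underlying representation (a number), but mixing units is an error.
        if self.unit != other.unit:
            raise TypeError(f"cannot add {self.unit} to {other.unit}")
        return Quantity(self.value + other.value, self.unit)


liters = Quantity(5, "liter")
grams = Quantity(5, "gram")

print((liters + Quantity(2, "liter")).value)  # 7

try:
    liters + grams  # both are "just numbers", but the types refuse the sum
except TypeError as e:
    print(e)  # cannot add liter to gram
```

In a statically typed language you would get the same guarantee at compile time by making `liter` and `gram` distinct wrapper types, with no runtime cost.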
Anything more than a 3x3 vec (position 3, velocity 3, rotation 3), i.e. a class specific to an aircraft, would be something much more. It would have position and velocity, and then it would have functions (getters, setters, or prediction functions). Something about the way OP said this was "a type" makes me very suspicious, because those raw values ought to be what you get when you poll the xvxrx ... of the type. If you're getting some sort of interpolation from that, it would obviously need to come from something which extends the type of an object in a certain position and attitude.
The reason I'm saying this is that this is exactly where terrible feedback loops occur, when a type may return an altered version of its basic data when trying to access the basic data.
If it's a type which has an underlying 3x3 vec and this type is specific to the aircraft so it has a bunch of overriding functions to operate on that data, that's fine, that's called a computer program. Not a specific type extending a vector.