I would like to highlight just how much Apple is focusing on their customers and use cases right now. It seems that they're targeting products to what their professional customers actually want. And in this case, it's a 3.7-inch-tall little thing that can process 18 streams of 8K video (fully specced out). That's kinda crazy, and they're doing it at a price point that's competitive with everyone else out there.
Bravo Apple. I'd love to see what they have in store for designers and programmers next.
It's insane how boldly (for Apple) they focused on ports and cables. I can't remember a previous demo video that showed so many ugly cables hanging out the back of the machine. They've finally realized it's a selling point!
They did better with the limited space they have — they included a bunch of multi-functional USB-C ports that can be used with dongles for additional HDMI ports!
USB-C (or Thunderbolt) can also carry a DisplayPort link natively, and you don't even need a dongle for that. Workstations don't really care about HDMI, that's for TVs.
Not only that, but with the Ultra variant all of the USB-C ports are Thunderbolt 4 ports at 40 Gb/s each; that's a massive amount of I/O.
(note that despite the number of ports - only 4 of them can be running displays, plus the HDMI port gets its own channel)
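For a sense of scale, assuming Thunderbolt 4's standard 40 Gb/s per port and six Thunderbolt ports on the Ultra (my tally of the port count, not Apple's marketing figure):

    # aggregate Thunderbolt bandwidth on the Ultra (assumed: 6 ports x 40 Gb/s each)
    ports = 6
    gbps_per_port = 40
    total_gbps = ports * gbps_per_port            # 240 Gb/s across all ports
    print(total_gbps, "Gb/s =", total_gbps / 8, "GB/s")   # 240 Gb/s = 30.0 GB/s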
On the Max it's only 4 Thunderbolt ports (and I would assume 2 displays + 1 HDMI since it's half of an Ultra), but still, that's about 2x anything else on the market.
> Workstations don't really care about HDMI, that's for TVs.
And people who only have HDMI cables! I was using display port on my old MacBook, but I'm using HDMI on the new one as I already had an HDMI cable, but not a USB-C to DisplayPort one.
It's still useful to have at least one HDMI port. For things like presenting at other people's offices, where you don't want to be carrying a cable or dongle around with you.
USB-C to DisplayPort cables are like $14 on Amazon. I'm responding to you using one. I have a MacBook Air in clamshell mode with 1 USB-C going to the display and a $10 hub that does Power Delivery and 3 USB-A ports for speakers, Yubikey, and external SSD for backup.
I had a M1 Mac Mini but I do occasionally want to take a Mac with me so I replaced it with the Air. I only slightly miss the extra ports on the Mini.
Tangent: it's a personal pet peeve that Model M buckling-spring keyboards often pull too much power for modern PCs. This is particularly apparent with the PS/2-to-USB adapters but also happens with some PCs using "native" PS/2 connection. And then I got a fancy Unicomp Model M with native USB connection and... that does it too.
That was on a Dell desktop I had at a previous job. Not sure if it was just kinda crappy, or whether the inrush current on those Model Ms is just that big.
I haven't found a reliable HDMI setup for 4K at 60 FPS yet, while off-the-shelf DisplayPort has been fine. TBQH, at this price point and in this market segment that's the minimum requirement, so I can understand why they don't ship with HDMI.
For standard lengths up to 10 m, any HDMI 2.0 cable from a decent manufacturer like AmazonBasics or Monoprice will be fine. For longer lengths it gets tricky and you have to be quite selective, but that's a niche use case.
This is Google with their new battery-powered Nest Doorbell explaining that it can't record 24/7 video _even when it's plugged in_ because "it's too thin and the thermals presented problems."
MAKE IT THICKER. IT'S A GODDAMN DOORBELL. YOU CONTROL THE THICKNESS! THERE IS LITERALLY NO REQUIREMENT IT BE THAT THIN!
Because nobody outside hardcore computer nerds wants a beige box tower with 14 drive bays sitting on their desk. People want quiet little boxes that get work done without needing a dedicated room for loud machines.
The advantage of a huge beige box (mine is white, actually) is that easy airflow and acoustic padding make them pretty quiet. But yes, huge. Mine is 24" x 20" x 10". And very heavy. Not on my desk.
So your opinion is that you want it. My opinion is I don't. HDMI is much more limiting.
Say that you're a manufacturer trying to keep an eye on what's new vs supporting what's old: what decisions do you make to support new and retro without increasing the costs? At some point, having backseat decision makers giving input results in the car designed by Homer.
Of all of the things I've personally railed on Apple about in the past, as well as others, their most recent HW offerings have been a great compromise. They've brought back some USB-A, the SD card reader, HDMI, etc. They realized they'd gone too far in certain areas, but somehow they had a come-to-Jeebus meeting that reined them back in.
If there were two HDMI ports, how would that impact your one-HDMI world? It's clear from the comments others would find it useful. So that's a valid compromise. It's on the back of the unit so you wouldn't have to stare at that second unused port and shudder in disgust.
> trying to keep an eye on what's new vs supporting what's old
No one is suggesting they trade anything away. Apple has settled on USB-C. But there's plenty of extra room on the rear panel. Your argument assumes benevolence on Apple's part. For all you know the chips support dual HDMI, they removed it to save money which ends up being pure profit because they didn't reduce the product price at all. Do you think adding a second HDMI would increase the arbitrary price from $2k to $2k plus $20?
For the record, I would LOVE it if the Mac startup chime played "La Cucaracha"
Sure, but I don't think any Apple Silicon Mac has dual HDMI. I guess it's possible that they spent the money to design dual HDMI support into their own chips and then said "but now that we've done this, we can save twenty bucks by not actually making the physical ports," but doesn't it seem likely that they also saved the money by…not putting it on the chip?
> Do you think adding a second HDMI would increase the arbitrary price from $2k to $2k plus $20?
Do you think anyone contemplating buying a computer that starts at $2K is going to suddenly stop, slap their forehead, and scream, "To connect two HDMI monitors, I'm going to have to buy a $15 USB-C to HDMI cable instead of a $6 HDMI cable! That extra $9 blows my budget! CURSE YOU, TIM COOK!"
I mean, I get what you're saying, but this just seems like a weirdly tiny hill to die on. :)
> For the record, I would LOVE it if the Mac startup chime played "La Cucaracha"
You want a 2nd HDMI port, probably because you already own HDMI cables by the drawer full. That's a valid something.
However, it's not like you have to replace your monitors because of only one HDMI port. There are cables that go from TB4 to HDMI without dongles. Yes, you would need a new something instead of the something you already have.
Do you really know what the cost of adding a 2nd HDMI port to a system is? I do not. I do know it will be much more than the cost of the actual port connector itself. There's a new hole to be added to the case. There are more robotically soldered points to connect the new port to the system. Given the way the system is laid out on the board, is there enough bandwidth to supply signals to all of the ports? I have no idea the level of complexity involved in deciding what ports are available with the bandwidth of the system, but I do not for a second think that I could do it better.
For people with big clunky PC boxes, they stick it under a desk or on the floor because that's where it fits and it's out of sight.
Because of the "sleek" design and smaller size, Apple has people convinced that the computer should be like children in the '50s--seen but not heard.
If you offer someone a computer in the shape of the mac mini or this new studio design OR they could have a typical PC style box, I'm guessing the majority of people will choose the smaller box. Even if there are no Apple logos on the smaller box. Nobody wants a 30+lb box any more. As an editor, you can take this bad boy on the road with you without incurring overweight luggage fees at the airport.
The current model Mac Pro at least supports add-in PCIe cards. So you are only one sound card away from having as many headphone and other audio jacks as you want.
Which display are you using and does it wake up properly when you wake your mac mini? I've had a problem with displays that wake up normally with an intel based mac but not with an M1 mac.
I'm also using a 144hz monitor (Samsung 28" G70A) with a DP-USB-C cable. Wakes up normally and VRR works both with my new MBP (M1 Pro) and with the previous MBA (M1).
Like others have mentioned, I've used DisplayPort over USB-C for high refresh rate at high resolution. Atm I'm using that for a low-refresh-rate 30" screen. Though I believe LG and Dell screens do have USB-C directly.
For cases when you have, let's say, a gaming PC and a MacBook connected to the same screen. You want a USB-C port so the MacBook will get everything from a single USB-C port (power included), but you also want your gaming PC to benefit from a high-performance monitor with high refresh rates on a DisplayPort/HDMI connection.
Isn't this what a dock is for? Are you treating your monitor itself as the dock? I still don't see why the monitor itself needs USB-C support.
I use a Lenovo Thunderbolt dock for both my M1 16" MBP and a Lenovo laptop running Linux. It runs 2 x DisplayPort monitors, mouse, keyboard, and webcam.
> Isn't this what a dock is for? Are you treating your monitor itself as the dock
Yep. There are monitors with Type-C, power delivery, and a USB hub, so they save you the need for a dock. Usually they're priced at around monitor + fancy dock, so it doesn't even cost more for more practicality.
And yet none on the front. So your shoebox full of thumb drives is useless and you have to plug them in from behind.
Just to be clear yes I know USB-C thumb drives exist but they are next to useless because their whole purpose is to shuttle files between devices. Devices that likely don't have USB-C yet.
Apple is one of the world's biggest proliferators of e-waste through indirect means. No charger with the new phone, huh? Well, its USB-C cable won't fit your old charger or computer either.
USB-A is the de-facto standard for thumb drives. Are we all just supposed to throw them away overnight because Apple blessed us with computers without USB-A ports? Or, they'll sell you a dongle for $19 which can be had on Aliexpress for $0.75. Would it have killed them to throw an extra USB-A 3.x port on the nearly empty front panel? It carries nearly the same signals as USB-C minus power delivery. No, this was a purely aesthetic decision. Apple could design a brick PC with no ports besides power and people would still pleasure themselves into a coma at the thought of spending $2k on such a grotesque monstrosity.
>Apple is one of the world's biggest proliferators of e-waste through indirect means.
This seems like a very disingenuous way of saying that Apple is one of the largest companies on the planet and sells hundreds of millions of devices a year.
Given that Apple regularly supports devices almost a decade old with firmware updates, has one of the most aggressive environmental agendas in the industry, and has invested billions of dollars in recycling programs, I would hazard a guess and say that on an adjusted basis they probably compare very favourably to competitors like Samsung and other OEMs.
>USB-A is the de-facto standard for thumb drives. Are we all just supposed to throw them away overnight because Apple blessed us with computers without USB-A ports?
You're upset about having to reach seven inches further to plug in a USB drive. If that's enough to cause you to throw all of your thumb drives away then I'd say that reflects more on you than on Apple.
>Or, they'll sell you a dongle for $19 which can be had on Aliexpress for $0.75.
There's a considerable engineering (and therefore cost) difference between a dongle that carries all the required electrical certifications, and a 75c Chinese firestarter that stands as much chance of damaging your computer as enhancing it, but that's moot anyway since nobody is forcing you to buy the Apple-made adapter in the first place.
>Would it have killed them to throw an extra USB-A 3.x port on the nearly empty front panel?
Given the form factor, quite possibly. It isn't as simple as just cutting out a slot and sticking a connector in; you need to route that connector internally and find space on the logic board for the supporting circuitry, which has all sorts of difficult implications for the board layout.
And for what? How many thumb drives do you think the average person needs to plug in at any given moment? Why do you assume that your extremely narrow use-case is so important that it must be given the highest possible priority for Apple's engineers to meet?
The overwhelmingly vast majority of consumers, even pro consumers, will do just fine plugging their thumb drives into the back of the very small box sitting in front of them, or plugging in a USB hub to one of the USB-C ports on the front. Why add complexity and cost to solve a non-problem for a tiny subset of users who, if they're anything like you, were never going to buy an Apple product irrespective of what ports it has on the front?
Okay, so don't give them your business. I'm sure your protest with your wallet will have Apple contemplating their "0 thought applied to their decision" for the next go-round.
If you are willing to buy a $0.75 anything from Alibaba and trust it with your precious data, you do you. As I've already suggested, you more than likely already have everything you need in a drawer, on your desktop, etc that will allow multiple USB-A devices to connect through a single port.
On that kind of device it doesn't cost them much and those features are more appreciated. It's not much of a war. On the laptops it becomes a bit more of a design conundrum. I'd imagine they have different leadership teams, and Jobs + Ive are both gone. So there's that.
This is very interesting, because PC workstations have been stagnating for quite a while! It'd be cool to be able to bring medium-sized tasks back from clouds to personal computers.
The CPU performance is impressive but I wonder how well it'd fare against say an AMD ThreadRipper or a dual socket system with an equivalent price? It'd be also interesting to see a deep learning benchmark against Nvidia. Branding and RAM limits suggest this is geared towards video processing, but might also be useful for some other domains.
The 3990x runs a bit faster on the initial compile stage but the linking is single threaded and the M1 Max catches up at that point. I expect the M1 Ultra to crush the 3990x on compile time.
Even if they stagnate, you can at least pull out a GPU and replace it with another (when they're available for purchase), even if you don't change the RAM/CPU, which you could also replace when something faster comes along. You can even change storage options.
None of that swapability is available on any of the Apple systems, so when they stagnate there's nothing that can be done.
Practically this doesn't often happen, especially on work machines. All the parts are loosely linked by generation. It's not often you want to jump forward 5 years in GPU tech but not want an upgrade in every other field. Unless you are swapping parts out yearly for the absolute latest bit, there is not a whole lot of benefit over selling your existing device and getting the new one where you will see improvements across the board.
Okay, what happens if something fails/dies? At least it's replaceable by the user. It's very common to upgrade the RAM capacity rather than buying it all at once. Many CPUs have stuck around long enough to see GPUs replaced. Hard drives get swapped out for faster/more space.
I love the idea of component machines, but at this time of my life, I would be just fine to never have to build another PC. With that, these systems where you plug in the cables and go to work are appealing. Having said that, it still rubs me the wrong way how non-upgradeable these things are.
Depending on your use case. Assuming no video editing (since this is HN), for equivalent performance I guess they can't beat their rivals on price. Different story for power tho.
Why does HN mean no editing? I'm a dev and video editor and get a lot of work done with dev tools that would take an ungodly amount of time to do as a human.
If you packed those specs in to something twice the size of the mac studio it would sound like a loaded up server rack on your desk. Yes you can get absolutely massive coolers which are quiet but nothing nearly as portable as this.
Do you have a reference on how quiet these are? I mean it’s fantastic if true but half the reason my MBP gathers dust is the noise under load compared to my (almost) silent 5950x under load. But yes it’s a big box that I have no intention of moving.
Don’t feed the “but I can build a pc for less” crowd. They can’t see the apples to oranges comparison they’re making: a Mac is not a PC. They have different use-cases and run different software.
End users don't care about the reasons, the internal debates at Apple that led to them making a computer that's slower and more expensive, or what marketing spin they put on it.
Having a clever reason to justify how it was intentionally designed to be slower doesn't change the fact that it's slower.
Apple is great at other things and you do get other benefits when you buy an Apple computer, the performance/dollar just isn't one of them for desktop. Even for laptop, the M1 was the first time they were competitive on that metric.
> Apple is great at other things and you do get other benefits when you buy an Apple computer, the performance/dollar just isn't one of them for desktop. Even for laptop, the M1 was the first time they were competitive on that metric.
That's a narrative which I think has been proven to be not true for quite some time? If you look at the prices of Windows ultrabooks from brand names, they have been at the same level as Mac laptops for many, many years.
Likely mostly running the same workloads such as web browsing.
> ... and run different software.
Maybe if Apple wrote it. Otherwise you can run the same Chrome/Firefox, Photoshop and Office on a Windows or even Linux Machine. Most software is shared between platforms nowadays with posix/windows being the lowest common denominator in terms of "cross platform".
I've seen (more than once) Apple Store employees push people away from higher-end devices, or from buying devices altogether (happened with me) when they didn't think the bigger sticker price would serve the customer well, so at least Apple is happy to give that same advice to their own customers.
I have seen this too. A woman wanted to buy an iMac Pro and after the sales guy heard her use case he kept saying the regular iMac was more than enough. She ended up getting that instead.
Pretty sure Apple lives and dies by their iPhone sales. Sure, they sell a ton of computers, but it's nothing compared to their phone revenue. I wouldn't be surprised if there were activist investors who think they should drop the personal computer/laptop product lineup entirely.
Professionals are highly paid because of the tools they know. A Final Cut Pro pro isn’t going to switch to an off-the-shelf Linux or Windows workstation and learn a whole new paradigm just because it’s cheaper. What I’m saying is the Mac platform is to be seen as a whole, since some popular and useful Mac software like Final Cut isn’t cross-platform.
Same goes if you prefer Pages to Word. Or software from Panic vs something else. The thing is, if you’re most comfortable and productive on macOS, then unless you build a Hackintosh you need to buy a Mac and pay the Mac tax, but you get a whole integrated thing.
I dabble in video editing and it seems most people work with Adobe Premiere Pro or Davinci Resolve. FCP has not been getting the love from Apple that it used to get. Same with Aperture vs Lightroom. I could be wrong though, but many people seem to be using cross-platform software on Macs, much more than a decade or two ago when it really was two different worlds.
You dabble. Pros are beholden to their tools. Some are experts at Premiere and others at FCP. Some even migrate to and fro but the mac has enough inertia here that they’ll still have the market enough to justify these insane chips and computers even if they are not pushing the software as hard as they are the hardware.
Right, but the link was for 2TB at the same speed, and it's a $600 upgrade to 2TB from 512, which means literally $800 for 2TB in total, plus tax! To be fair, it is likely higher quality, more efficient maybe, but I just think it's quite a reach to charge $800 USD for 2TB of storage that's not uniquely fast or anything. If you think that's a good value, it certainly might be depending on your work, but it would be twice as good a value if it wasn't twice the price of off-the-shelf.
That gumstick can be replaced in two minutes with a single screw if and when it goes bad. As opposed to spending your day at the Genius Bar. I like Apple stuff but let's not make specious comparisons. Unless you think Apple SSDs are magic and can never go bad just because they are soldered in.
A terabyte of dead-slow and flaky SSD costs a hundred bucks, sure. But it doesn't matter much because, in the kinds of professional workflows this machine is targeting, local storage is more like swap space than anything. It needs to be fast more than it needs to be big.
You're forgetting Apple sources their SSDs from the same merchant that sold Jack the magic beanstalk beans. Or they just get generic chips from Hynix. One of those.
I think you would struggle, TBH. The M1 Ultra will likely have Geekbench scores equal to the 3990X ($5k). The GPU (based on double the performance of the Max) should be something similar to an RTX 3070 ($1k). 128GB of RAM is going to set you back another $1k. 8TB of storage will be another $1k.
Then you have to add on motherboard ($1k), case ($200), cooling ($200), psu ($200).
Even if you cut a few corners, you are still over $8k for equivalent performance, and a substantially larger form factor. Yeah, you can upgrade, etc, but that's not really the point of the exercise.
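Just tallying the rough numbers above (these are the same ballpark estimates already quoted, nothing firmer):

    # ballpark build cost using the estimates above (USD, approximate)
    parts = {
        "CPU (3990X)":   5000,
        "GPU (~3070)":   1000,
        "128GB RAM":     1000,
        "8TB storage":   1000,
        "motherboard":   1000,
        "case":           200,
        "cooling":        200,
        "PSU":            200,
    }
    print(sum(parts.values()))   # 9600; trim a few corners and you're still north of $8k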
That said, if you can think of speccing it another way, I'd be interested, as I'm looking for a decent workstation build that is x86.
Also, the impressive things for me are that it's actually really cheap, as in bargain territory, and you don't have to piss around with HP sales drones to actually get one, like with the Z series.
While I could be wrong, I think the Mac Studio may be their "desktop aimed at designers and programmers" -- get the M1 Max instead of Ultra, bump the SSD up to 1TB and the RAM to 64G and it's $2600. By no means cheap, but not the wheeze-inducing iMac Pro prices or the second-mortgage Mac Pro ones.
The caveat is whether they're going to bring out an Apple Silicon-based "big iMac" that's a more direct replacement for the iMac I currently have. I'm not really optimistic about that possibility now that they've shipped the xMac, er, Mac Studio, but never say never.
I got the impression that this Mac Studio and their new 27" Studio Display form a replacement for the 27" iMac and iMac Pro.
Overall I like the idea, because years after buying an iMac I still have a nice monitor, permanently attached to a (now) slow computer that I can't upgrade. With this approach I only need to replace the Mac itself as its age or my needs catch up with it.
They are doing a great job. A 180-degree turnaround from the company that wanted their products to look clean above being usable.
Only downside to me is that in such relatively expensive hardware they should have doubled all storage options. Starting at 512GB for the entry spec and 1TB for the high end spec is rather low.
They still look clean though, even after making them usable.
I'm really glad to see that the mea culpa they pulled when admitting the trashcan was a really bad idea wasn't just a bunch of platitudes. They've actually released good products after that. That last Mac Pro was interesting, but it still had custom dedicated GPUs that again had no replacements.
I suspect storage is one of their high margin profit knobs.
They do a great job of pricing it just a little bit more than their target market would prefer to pay, but they grin and bear it, especially if it is not upgradable later.
I sprung for the 2TB M1 Max...damn did I resent the premium I had to pay for 2TB, but I paid it in the end.
I have a 10-core CPU with a performance advantage of 23% over an M1 Max in Cinebench R23 (AMD 5800x) and a 1,280 CUDA core GPU (GTX 1060 6GB) that cost a little over $1,000 US total.
UPDATE - Sorry, I originally listed this as a 5900x by accident. I need more sleep.
Cinebench R23 is a worst case scenario for M1 because it doesn't have high core utilization and sits in L2 cache. If you look at a broader set of benchmarks (SPEC) then M1 Max in laptop form is competitive anywhere from a 5800x to a 5950x.
I've never talked to someone who was worried about 100 watts give or take in a desktop. The 100-watt difference, if you ran it 24/7, is about 26 cents a day. A laptop would be a different story, of course, but these aren't laptops.
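Rough math behind that, assuming electricity at about $0.11/kWh (that rate is my assumption; plug in your local one):

    # cost of an extra 100 W running 24/7
    extra_watts = 100
    price_per_kwh = 0.11                       # USD per kWh, assumed
    kwh_per_day = extra_watts * 24 / 1000      # 2.4 kWh per day
    print(round(kwh_per_day * price_per_kwh, 2))   # ~0.26 USD per day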
As for the footprint, it's usually not enough of a concern at the desktop to double or triple the price to go from a standard mini ATX down to a double height Mac Mini, but I'm sure there are some locations where that would be a concern.
No one outside of SF, Tokyo, & NYC cares about saving power or space for desktops which is why people panned the old garbage can pro. This is a slightly better direction, but I’m still waiting for a slightly more affordable bigger box Mac that isn’t an exorbitant Mac Pro. The old refrigerator models were reasonable. I don’t want an improved Mac garbage can model
I work from home in a 1 bedroom in NYC and a normal sized desktop works just fine under my standing desk with dual monitors here. Even if my PC were Mac Studio sized, I'd have it under my desk to maximize desk space.
It's the 5800x and 8-core. The stats are correct, I just mixed up the core count. It's my "new" desktop sitting behind me that I haven't finished moving onto yet, currently sitting without its video card in it.
I have an existing GTX 1060 6GB I already had from before insane prices. And I bought an HP Omen 30L 5800x that someone had purchased to pull the video card from.
If you're an Intel/Windows gamer, or a Linux/Nvidia GPU server user, no. But this is a tailored product aimed at a distinct professional class, which spends heavily on the latest hardware that boosts their productivity.
This. He brought along the kind of design philosophy that was very much needed in devices. But that philosophy became the new normal, and pushing it to its purest level did not end up being what everyone wants.
Products and needs evolve and change. Saying bad things about the man is nasty, especially considering his gentle nature. Especially since we really don’t know who exactly is responsible for what in that world.
"Jony" was reportedly there forever and still influences the joint and previously had designed bathroom fixtures. To his credit, I think he did okay until the transition to magnesium and "al-u-min-ium" (lol) All he had to do was drill some fucking holes in his designs and they wouldn't have become component ovens. But also Apple was a bit of a struggling hardware company when that happened and this was an effective means of generating revenue from the loyal UX devotes, so I'm not sure this was entirely unintentional, at least until they got burned by trying to hold nVidia to the actual thermal specs.
I would also never buy such a machine. Still, in 25 years of owning my own computing resources, I have never actually expanded the internal storage of the 6-8 systems I've built.
IME, expandable storage in Macs was mainly used as a way around the Apple Tax on their storage options - buy the base model and upgrade the storage yourself for 1/3rd the price.
I always thought it was a fair trade-off. Pro users who care more about stability/support can pay the Apple Tax, and enthusiasts on a budget can DIY it and get a better machine for less. It feels like greed to cut off that path.
It was also way more important in the day of spinning disks - so many machines from that era got a new lease on life by having the HDD replaced with an SSD. Imagine if HDDs had somehow been soldered in...
For a prosumer there’s only so much you can offload to iCloud or NAS and eventually these machines, which are typically owned for a long time, do run out of space.
Apple added back expandable internal storage for the Mac Pro and I hoped they would relent here too
I think, overall, the designers that focused on how the products looked have lost the tight grip they had over them. For years, look was more important than function. It was probably a good thing, since it helped make the company become what it is today. But giving engineering a bit more control is giving us some very functional products. Very cool.
As a programmer, data scientist and macOS user, I have to say I'm just annoyed by Apple these years. They've built a preset list of users who may use their products; on that list are "creative producers", etc. I'm not saying creative works aren't important - I personally enjoy poems, movies and photography very much. But just imagine how many MacBook Pros and other products Apple sells each year - does the world need that many creative producers, so we can indulge ourselves in TikTok and other things? This is crazy. And I also feel excluded from their pool of potential customers.
Everyone in my data science team is using MacBook Pros right now. At least in a corporate environment, I don't know why you would even want to train models locally. We either do it in the cloud (Databricks / SageMaker) or remote into on-premise workstations for research & experiments.
idk, like having no consideration for data integrity? Because a single 1-bit error in a 2-hour movie file just doesn't matter, though it might for, like, molecular biology research. idk tho.
Apple aren't just selling a tool, they're selling a lifestyle. Apple customers like to feel part of that artist/architect/film producer vibe, with the expensive weird glasses and green trousers. That's the marketing pitch and has been for two decades. The current Apple product engineers grew up on that marketing diet and came to believe that that was really who they should design the product for.
I said this because the design of new MacBooks and Mac Studios are completely following the need of creative producers. I would never use a SD card reader, and really don't need that slot to appear on my laptop. The return of those utilities is an obvious bias of the design.
This “Studio” mac seems pretty ideal for programming to me.
I don’t really see how Macs generally are limited to “creative producers”. The computer doesn’t care if you’re compiling, running containers, or doing whatever it is that people do to create tiktok videos.
That has always been their target market since forever, even developer tools on Mac OS classic were eventually outsourced to Code Warrior and PowerPlant C++ framework.
I don’t care if a laptop is a tad thicker, or whether the aesthetics would allow the device to sit in a museum - Ive helped make them desirable objects, the iPhone provided the scale and momentum, and Apple silicon in Macs at last allows for focus on actual usability. This set of kit will pave the way for a revolution in how we think about computing.
They just (November) released the best MacBook Pro since 2015.
Original keyboard back. Escape key back. Utility keys back. Mag-Safe back. HDMI back. SD card back. Headphone jack back. And all with a brand new M1 Pro chip.
I really wish they focused on bigger displays too. For music/video production having at least 2x27'' screen size is crucial. But two is kind of weird because you can't centre any app. And there is kind of too much...
I've never understood why 2x27" is supposedly "crucial" for music/video production. I had two large monitors for a while; eventually I went back to one slightly larger monitor because I found it was more comfortable and ergonomic to be staring at a single surface perpendicular to my eyes.
Note that I mentioned "2x27'' screen size", not "2x27''". I would very much prefer if they just glued two 27'' Studio Displays together, effectively making it a single 32:9 10K 49'' screen.
Why are they doing this? Trickle down effects for their mass market products? There is no way that the prosumer market is big enough to justify this distraction for Apple even if it used to be their core business line.
Apple's market share has always been a small slice of the overall PC market. In a way, they never did "mass market". While the prosumer market may be a small proportion of the overall market, it could be very significant, for Apple, relative to the market Apple addresses.
Plus, I think a lot of people who would not normally call themselves a "prosumer" will want, and purchase, these.
> Apple's market share has always been a small slice of the overall PC market.
Small is doing a lot of work in that sentence.
Apple sold nearly $11 billion worth of Macs last quarter. Once you get out of the HN echo chamber and enterprise IT circles, Macs are quite popular.
> In a way, they never did "mass market".
Having an Apple Store within a 20 minute drive of 80% of the American public counts as mass market [1]. Haven't been lately because pandemic but my local Apple Stores were always packed with people. And of course there's a Best Buy, Micro Center and other regional retailers that sell Macs in places with no Apple Stores.
It's not just prosumers; it's normies who just want a good computer made by a company they've heard of and trust vs. a cheap plastic 3rd tier PC from a manufacturer they're vaguely familiar with. I've been involved in user groups since the 80's; trust me, most Mac users are just regular people—not music producers and cinematographers.
An M1 Mac mini, which certainly outperforms most PCs in its price class. The retail price starts at $699 but is available for significantly less via 3rd parties like Amazon.
If you think of the market segment as "non-plastic computers that don't suck", Apple is doing quite well. And now that Apple Silicon performance continues to outpace the industry as a whole, this will continue.
The other segment is the "I like nice things" crowd. They aren't price sensitive; they just like nice things and Macs have that in spades compared to the vast majority of PCs.
> It's not just prosumers; it's normies who just want a good computer made by a company they've heard of and trust vs. a cheap plastic 3rd tier PC from a manufacturer they're vaguely familiar with
It's like when people say their iPhone is better quality than a $200 Xiaomi phone. Well, duh??? Why are you even comparing them? If you look at higher tier Xiaomi phones, or in your example, laptops from reputable companies such as Dell, HP, Lenovo, Asus, they're much closer in build quality, have a choice of tradeoffs (you can choose if you want small, light, long battery life, type of screen, ports, performance, etc. and not have Apple choose), and were still quite a bit cheaper than equivalent-spec MacBooks. That's no longer quite as valid for laptops due to the M1s, but is still valid for phones.
"Prosumer" is also a somewhat vague term. It's probably supposed to mean consumers with lots of disposable income, but if you're an engineer, developer, video editor, or studio musician it just means you think of Apple hardware as a business expense.
Yep, this is the requisite "hey Hollywood I know we haven't thought about your studio needs in a while, here's a bone that reminds you why Apple is the industry standard" play.
So, you'd think, and that's certainly what I was thinking when watching the thing... but they teased a future Mac Pro announcement at the end. This is a mid-level machine, apparently (similar to the old iMac Pro, I suppose).
There is a big market of creatives/artists who basically own an apple product as a decent chunk of their personality, for good or bad. Ergo, they'll sell.
The indie professional market is more than big enough to justify this. Small films have been able to roll some pretty impressive vfx on desktop computers recently, and North American creative types tend to love Macs.
I look at it from the other side - why aren’t Dell and Lenovo and HP and Microsoft doing this? Have they given up? Is assembling commodity parts into the same computer everybody else makes the only thing they can do?
> I'd love to see what they have in store for designers and programmers next.
We're not really their target audience. Back when they had a rack server, they had something that looked good to us, but we're not really the ones that will buy a maxxed-out Studio.
It's an interesting machine. It definitely is aimed at folks that would get Pros, beforehand. With the hint dropped at the end of the Studio presentation, I suspect that they may announce some crazy Pro, in the coming months.
Yes, but not so many that choose Apple, as their platform. Of those that do, they like laptops (like me). They are the target for the MacBook Pros.
Creatives, on the other hand, probably have a 50% or more Apple platform user base, and also like fixed workstations.
I used to run a dev shop for a photographic equipment manufacturer, and the Apple platform (even relying on the users having beefy workstations), was quite important.
Beautiful high resolution and crisp text rendering. I want looking at a dense page of code to look as beautiful and comfortable as reading a magazine. A comfortable keyboard and trackpad are a must too. Give me as much battery life as possible--at least a day or more. Performance, memory, and storage are less of a bottleneck these days and today's higher end specs are generally good enough.
> I want looking at a dense page of code to look as beautiful and comfortable as reading a magazine
I am trying to imagine what a page of code laid out by a professional designer would look like and I can hear a million programmers screaming bloody murder in my head about the thoughtful use of whitespace, proportional-width fonts (even though they are ones chosen to clearly differentiate between confusing characters like 1/I and 0/O), and the occasional change in font size for... what is the source code equivalent of a subhead, anyway? Comments?
> I am trying to imagine what a page of code laid out by a professional designer would look like and I can hear a million programmers screaming bloody murder in my head about the thoughtful use
There are beautiful programmer fonts nowadays that look amazing on Apple's high resolution screens.
There are several GUIs for Vim/Neovim that take advantage of the GPU and the text rendering abilities of modern computers in general and Macs in particular [1].
And once you get used to seeing your code this way, it's hard to go back.
Nicolas Rougier has also implemented a few of those ideas in a series of packages for Emacs called "NANO Emacs"; the actual implementation goes beyond what's discussed in the paper and is worth checking out.
"It is as if typography recommendations had been frozen sometime during the eighties and nothing has ever changed since then." hahahhaah YEP
I love the exploratory screenshot of a function reformatted with the comments floating on the left of a hairline instead of next to the source, with the function name nice and big and bold right next to the version control info.
I agree, that's a really great example. It would be even more natural for tags in org-mode, which already supports drawing tags away from the text itself to a far-right column, but they really could be moved in the other direction to the left of a hairline next to the bullets, and they'd be more readable.
BTW, Nicolas has a lot of interesting packages exploring use of graphical design elements in Emacs apart from the Nano Emacs stuff; for example, there's https://github.com/rougier/svg-tag-mode. I love how he chooses to represent dates and dates with times attached, and to distinguish active vs. inactive dates with shading rather than "[" and "<". That actually makes it easier to spot a common org-mode mistake without having to call up the agenda.
I'm annoyed at how narrow their concept of "use cases" is.
I mean, are there really so many "pros" that need 18 streams of 8k video? Seriously? That's where all the money is? How does the market support so many "creative pros"?
Meanwhile, I know that Fusion 360 is going to suck as it always does, with its crappy slow graphics and clunky UI, and my docker containers will run dog slow.
It feels like they're catching up; there was a long lull where they seemed to focus entirely on the iPhone, with the occasional MacBook because they needed devices to make apps for the iPhone on.
I mean their bread and butter will still be the iPhone and co, but I'm glad they seem to have moved away from min/maxing for whatever brings in the most money.
You could buy a $900 laptop with a better CPU than any laptop out there. 20 hours of battery life (almost 3x any other laptop out there). Silent (no fans). With a great screen, a great keyboard, lightweight, well built, etc.
Basically a machine that was less than half the price of competing laptops at $2,500 or more, yet had more than 2x of everything.
... if you can live with machines that just come with the bare minimum RAM and SSD. If you get the Macs with the same amount of storage that the PCs come with, the $900 laptop quickly becomes a $2000 laptop.
I wish I could get back all the time I've wasted helping relatives deal with their Photos libraries and backups just because they got the entry level machine with skimpy storage...
I've had an 8GB MacBook Air M1, and it's honestly never been an issue even for dev stuff since my workloads just wouldn't run on my local machine anyway. It's much, much less of an issue than on my Windows laptop with 8GB of RAM, too.
As for storage, yeah, 240GB is probably not ideal for most people, but since I bought the Air mostly as a lightweight device I can carry anywhere instead of a workstation (even though it's insanely powerful for what it is), it does not really matter in my case.
(This is my first Mac so I was very worried of the pretty limited ram since I had no idea how macOS deals with memory, but if it's fine for me I'd say it's fine for most normal/casual users)
8GB RAM is fine as long as you don't try running multiple VMs or lots of docker containers. macOS is surprisingly good at dealing with limited RAM thanks to memory compression.
The small SSD is the bigger issue. If you use it as your main machine you will fill it up quickly and people then start doing stupid things like putting their Photos library on a USB stick or on an SD card, and that's just asking for trouble.
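If you're curious how hard the compressor is working on your own Mac, vm_stat shows it. A minimal sketch in Python (the field names are taken from recent macOS output and may differ on older versions):

    # peek at macOS memory compression via vm_stat (macOS only)
    import re, subprocess

    out = subprocess.run(["vm_stat"], capture_output=True, text=True).stdout
    page_size = int(re.search(r"page size of (\d+) bytes", out).group(1))

    def pages(label):
        m = re.search(label + r":\s+(\d+)", out)
        return int(m.group(1)) if m else 0

    stored = pages("Pages stored in compressor")    # logical pages held in compressed form
    used = pages("Pages occupied by compressor")    # physical pages the compressor occupies
    print(f"{stored * page_size / 2**30:.2f} GiB compressed into "
          f"{used * page_size / 2**30:.2f} GiB of RAM")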
Why not just run your VMs / containers on a beefy, cheap, off-the-shelf Linux server that you connect to over WiFi 6? You could also keep all your photos there (and back them up with cheap cloud storage).
Keep your daily driver lean and cheap (and easy to replace).
Because most people don't (and probably can't) do this. The best option for most people is to simply have more local storage so that they never run out.
That's a somewhat recent addition, and it does help somewhat with the Photo library problem. It sucks if you don't have a fast internet connection, though. It also sucks if there's ever a problem with the photo library, because then you don't have a backup anymore.
Maybe 240GB are enough for light usage if you store photos in the cloud. I can only say that in my experience 1TB is the bare minimum if I don't want to spend half my time copying files around.
Yeah, as I said, a somewhat recent addition. The problem with skimpy storage on base model Macs has been an issue for decades, and it has gotten worse since you can no longer upgrade with aftermarket parts.
"Optimize Storage" is a bandaid that helps make your Mac with too little storage usable. But it's not a great solution. You'll be looking at spinners a lot. It's just not a nice experience when every video you want to look at takes a few minutes to load.
Apple’s been fairly cost competitive for a while now. That’s an old trope, lots of their new stuff is similarly priced or in some cases cheaper than their competitors.
Given the longevity and resale value, they make great machines.
They have been off-and-on. They had a few notable price cuts back in the twenty-teens that made them damned competitive. At times they've been a rip-off but most of the "LOL look how expensive Apple is" stuff achieves such large gaps by comparing them to significantly worse hardware and calling it "equivalent". Plus theirs is the only consumer phone and computer hardware with a healthy used market, so you don't take as big a hit on recoverable value as soon as you "drive it off the lot".
Apple has consistently been cost-competitive since at least the Intel era, if you compare to mid- and high-end PC hardware. Apple never has made entry-level systems. Also, comparing to home-built doesn't really count because most IT departments won't support custom PCs.
Apple's stuff has been cost-competitive since the M1 came out... although it's partly because there isn't much else in the same performance envelope, so there isn't much to compare to.
uh, performance/power-consumption/volume combined, perhaps. But there are plenty of systems that can easily beat the apple M1 systems on a purely performance basis, both before they came out, and since. Especially true if you're working on CPUs not GPUs.
You could have stopped at video content. Or just content. I don't see how the platform that serves and broadcasts said content is relevant to the value or performance of the device it's made on.
Almost. The hardware looks fantastic all-around. But the hostility to running an actually-usable OS like Linux is a huge stumbling block. Their users having to rely on the herculean efforts of the Asahi project, et al., is shameful for the world's largest company.
They're never going to try to make it easy to avoid their ecosystem and that's not shameful for their company. If it's shameful for anyone, it's the US regulators who have allowed the largest companies in the world to continue vertically integrating.
If they were actually listening, they would have a more affordable, configurable Mac Pro like the old refrigerator models. I don’t care about compactness for a desktop model. I also don’t like components being soldered like it was an appliance instead of a computer.
If it wasn’t for macOS, and the fact that the alternatives suck more, I would have been gone last generation.
I don't know, "18 streams of 8k video" doesn't seem terribly customer-oriented? Is it something anything anyone really needs or understands other than "that's very fast?" How much video can you watch at a time?
I also don't see how the size of a desktop computer matters, other than as a fashion statement.
It's not a consumer product; video editors can utilize multiple streams of video in many use cases. The iMac and the MacBook Air are the consumer products, with enough performance for 99% of users.
This generally isn't aimed at consumers; the much cheaper Mac Mini and iMac would be more appropriate there. It's for creative professionals (and to some degree programmers, though I really wish they'd make a similar-sized chip with less silicon budget spent on GPUs and more on CPUs...)
More and more I see the M1 chips and I wish the Mac worked seriously well for gaming. Would love to see something like Proton but for Mac (I've given up hope for native support).
I hate that I have my gaming PC and then my Mac for everything else.
I have to wonder, though, what their plan is for the M2. Are they laying the groundwork so that when the M2 comes out, all of these variants will be ready at the same time? Or a gradual upgrade, but through the same series (Normal, Pro, Max, and then Ultra) for each?
Having worked with Apple on games, I can tell you that as an engineer they work fine. Metal isn't bad. The problem isn't technical. The problem is how Apple approves apps and thinks about the gaming experience. For example, a game I worked on got rejected from the App AND Mac store because (in Apple's opinion) the texture download was too large. There's no analogue for this in other game ecosystems. As a developer, if your secondary ecosystem is demanding a lot of changes your primary ecosystem doesn't require, you'll decide it isn't worth the effort.
It's why the games on apple products are fundamentally different. Apple may say it's about protecting the integrity of the user experience, but the best and the brightest would rather learn directly from users themselves and not use Apple as a go-between.
Thing is, as a former engine and graphics programmer, I have to say I agree with them: we got gaming software wrong. It took me years to actually see it, but we do it wrong. A game, like all software, should be incredibly light and fast. A game that exemplifies that is World of Warcraft, which is incredibly light for a game of its stature.
It took me years to understand that. You can picture a performance red line, and instead of approaching it from above (that is, optimising a laggy game) we should approach it from below. I haven't worked for a single game developer which uses that approach, and yet I'm now convinced it's the right one. Approaching the line from above almost always gives you bloated, heavy, erring-on-the-wrong-side software.
If that was really a consideration that correlated strongly to success, the market would've shown that a while ago and the most successful projects would sail through Apple approvals. However, this is not reality, and when our philosophy fails to describe reality, only one of those two things is capable of changing.
"the market says otherwise" its a lazy and thought-terminating refrain and we need to retire it. it teaches us nothing to say "they market says otherwise", universally we need to go deeper and actually understand why things are wrong. your god "the market" also hasn't run A/B tests on GPs hypothesis, so this is just bad science.
It's bad science, but it's great shorthand for an HN comment. Otherwise you get into a quagmire of "how do you define quality games" and I didn't want to type out the text of Zen and the Art of Motorcycle Maintenance in a comment thread.
Mobile games have been bigger than PC or console for awhile, and a few of the titles are massively popular and profitable. Though App Store review policies differ in permissiveness, the approach from below has been the industry model both on iOS and Android. Remind me what you were saying needs to change.
> Mobile games have been bigger than PC or console for awhile
Of course mobile games are "bigger" than PC games. It's because today 18yos use mobile phones to "type" their homework. It's because they have never actually used or owned a computer/mac keyboard. It's because +80% of visitors of all web pages come from mobile.
It's because everyone has a mobile phone, and relatively fewer people use or own a classic computer. It says nothing about the scale of mobile gaming. It says everything about the scale of mobile.
It's not just about reach, though it wasn't actually inevitable that popular small communication devices would become a major gaming platform. The quantity of games and number of standouts (whether measured by quality/profitability/talent of creators/license/something else) is also high.
parent did not suggest that good and correct correlated with market success. we all know a market only demands profitable, which is a distinctly different objective.
Seriously. If Apple supported gaming more I'd be able to drop this PC.
Could be the sort of thing that could cause an avalanche of switchers.
You'd also pull over the entire games industry into the Apple space. Right now we code on PCs. As Apple silicon gets better, it's gonna be more and more painful to not be able to make use of that.
Not sure about game developers, but gamers are a low-margin, high-touch, low-satisfaction-score, garish-aesthetic, speeds-and-feeds market. Pretty much antithetical to Apple's core focus and I think they rightly avoid them. Despite personally wanting more games on Mac :)
Also I feel like everyone is missing their AR/VR push right around the corner. VR headsets with these chips will be dramatically more powerful than other devices on the market. Combine that power with Apple product marketing, and devs will be there regardless of how much the Metal APIs suck.
VR is happening on PC. Even the Quest 2 has a PC link that expands its possibilities many-fold, and, you guessed it, it works only on PC.
Apple could come in with its magic to take the field, but it’s been a while since we saw them succeed in a market where a serious competitor is already expanding (voice assistants or the Apple TV come to mind). I’d love to be proven wrong, but I won’t be holding my breath.
There is no “serious competitor” in VR/AR, because there is no market yet. The number of devices sold is minuscule. When AR/VR actually takes off, it’s going to look very, very different from what we have today.
Games on Apple will always be treated as 2nd class citizens. I don’t feel it will be any different on their new XR platform which is both good and bad because it’ll mean that we’ll have an XR device that the masses will finally accept if the price is right.
It's because they skew towards being children more. Apple already has a large children market via the iPad. "Serious games" on daddy's macbook pro and their iPads & apple TVs would further that stronghold. Games are their biggest money maker on the app store as it is, a real serious gaming department with 100-200 employees will pay for itself multiple times over. Apple arcade is another example. Apple Game Studios with first party games like nintendo would probably do better than Apple TV+ does today. They're alllllmost there.
How's your internet connection? I'm playing on an RTX 3080 thanks to GeForce Now, but I admit it's only possible due to my living in a big city with a very low-ping connection to Nvidia's data centres.
But the economics make total sense. It's like 15 euro a month. An RTX 3080 would cost me like 1200 euro to buy at the moment, without the rest of the PC to go with it. And I'd need to run Windows, which I've not done since '98.
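Quick break-even math on those numbers (ignoring the rest of the PC you'd also need to buy):

    # months of GeForce Now you get for the price of just the GPU
    gpu_price_eur = 1200
    monthly_eur = 15
    months = gpu_price_eur / monthly_eur
    print(months, "months, i.e. about", round(months / 12, 1), "years")   # 80 months, ~6.7 years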
Outsourcing hardware, it seems, does work for gaming, if you've got the 'net for it :}
What I don't like about this model is that you are not allowed to run whatever you want, you can't even run all your library from Steam or Epic but select games only.
And Cyberpunk 2077 or Far Cry 6 on full-on ultra settings makes me almost want to cry, it's so pretty. I'll walk past a puddle and just... walk back and forward. It feels like what your teenage overclock made Crysis look like in the early 2000's. I can recommend it.
As someone who hasn't really gamed since Crysis, looking at https://www.youtube.com/watch?v=L5rYjHuZsM0, I have to say I feel.... not impressed at all? Maybe that video is a poor example, but the plants for instance look absolutely awful, especially up close.
The core gameplay mechanics of Far Cry have not changed significantly since Crysis, either. (Okay, more accurately since Far Cry 3, released a decade ago, or so.)
As far as a mercenary safari goes, Far Cry 2 still rules supreme in my book with its minimalist UI and serious tone, and could really use a modern remaster.
Fortunately it's gotten better. I didn't play it much during the first months, as it was such a terrible shitshow of bugs. Seems they're getting on top of it now though. I feel deeply sorry for the devs, having been made to release it in such a condition. We've all been there I guess!
100% this: cloud gaming tech is solid but the business model is still in infancy. By the time Apple caught up in gaming I expect cloud gaming will have taken root.
I mean I kinda agree with you, but for 15 bucks a month is it really such a big deal? If the service went away tomorrow I wouldn't really care at all... I enjoy it, and it enables me to game on a level that would normally require thousands of euros of equipment, so .... Yeah, I don't care about gaming freedom it seems.. I still want OSS for my OS though ;)
You do own them in GeForce Now, not sure in Stadia/Luna. You're renting the gaming PC, but you can install those Steam games on whatever machine you choose. Though perhaps Steam is the GAAS.
You mean if gaming supported Apple more, right? The industry's reliance on DirectX and the frameworks that build on it are largely what keep gaming so PC-centric.
If we saw AAA studios embrace macOS-compatible (or better yet Open Source) graphics architecture, the need to have separate gaming PCs (and the insanely expensive GPUs associated therewith) would evaporate.
> As of June 3, 2021, there is no native support for Vulkan API provided by Apple devices.
> Apple deprecated OpenGL in iOS 12 and macOS 10.14 Mojave in favor of Metal, but it is still available as of macOS 11 Big Sur (including Apple silicon devices). The latest version supported for OpenGL is 4.1 from 2011.
OSX accounts for 2.62% of Steam users.
Implementing another graphics backend is prohibitively expensive for small studios whereas you could implement either OpenGL or Vulkan and get the other 97.38%.
And that's without factoring in the required investment in single-use hardware, people, and skills.
DirectX is another safe choice but I try to stay away from anything proprietary or Microsoft-related.
Not really. Small studios don't even bother writing their own rendering path most of the time; they're using alternative engines (Unity) or middleware (SDL, bgfx/libgdx/webgpu/whatever) to solve the problem entirely, because they have better problems to actually solve.
Honestly, the real issue is that you probably wouldn't get many users or make a whole ton of money. Unless you target iOS or iPads, but that can add some significant constraints to the game. (But sometimes it's OK; Divinity Original Sin 2 basically works fine on an M1 Mac or an M1 iPad, as long as you have an Xbox controller to go with it.)
I think as programmers we desperately want to believe it's just a matter of programming and that we're on the critical path here, but we really aren't. Most of that stuff is solved for a lot of cases. And if it were about technological constraints, honestly, there's a whole host of actual technological concerns, all of them really boring and crappy and not "cool" like rendering (how do you CI and QA the damn thing, what impact might it have on your build system, do all the 3rd-party things you depend on work here, etc.). Rendering is just "cool" to talk about, but it's not actually as important as those; by the same token you never see giant threads on here about OS-specific sound APIs or whatever...
There just aren't many Mac gamers and the hardware is kind of expensive to buy and test on for most purposes in that case.
>> You mean if gaming supported Apple more, right?
I can tell you from long experience -- more than 20 years ago I produced a Mac game with Apple's financing, and as recently as five years ago I produced a game that sold over $100m on an Apple device -- Apple has been extremely consistent in their attitude. They are much less supportive of gaming than Microsoft or Sony. Part of it is that they see games as a lesser use of their devices compared to other use cases. Another part is that they consistently promote the kind of experiences they wish their customers wanted over the experiences their customers actually choose.
Why would you spend a huge chunk of your R&D money supporting a platform whose users don't game, built by a company that won't spend a tiny amount of its own R&D budget supporting industry-standard technology that's been established for years?
AFAIK Unreal and Unity both support Metal. It's a chicken-and-egg thing. There is no game market on macOS because no gamers are on macOS because there are no games on macOS, so there are no gamers on macOS...
Of course, consumer-level Macs have never had GPUs that could run AAA games. I don't know what an M1 MacBook Air's perf is. If it's able to run AAA games at a reasonable framerate then maybe a market would build? But if you're limiting gamers to M1 Pro/M1 Max/M1 Ultra users only, then it's probably a pretty small market.
There is no chicken and egg. Apple has tens of billions of dollars. If they wanted to support games it could be done really quickly. They straight up are anti-gaming unless it is of the exploitative micro-transaction mobile kind where they can reap that store tax.
Yeah, it's pretty sad to see them pimping the likes of Genshin Impact. Even among the people who do play and enjoy it, I've yet to meet anyone who will defend it as anything other than a Breath of the Wild ripoff with a slot-machine-shaped tumor attached to its hip.
Apple and the gaming industry have a long and complicated relationship, which I think was best surfaced by the Epic vs Apple trial.
Apple makes a huge part of its revenue from mobile gaming while ferociously antagonizing game makers and the main players of the industry (remember their beef with Nvidia, for instance), which makes it a bad bet to embrace Apple platforms outside of the most dominant ones.
Right, it's the studios' fault, not because Metal is a dogshit API with dogshit performance. And instead of paying $900 for an RTX 3090, you could pay $4,000 for something with the equivalent power of a 3060 Ti that you can't upgrade without buying an entire new SoC.
But go on, blame DirectX for being the one API actually moving things forward. Not Apple's shitty behavior.
Is gaming bad on Mac? Honest question because I have no need for high end performance for gaming. If it can run Path of Exile well enough to burn an hour or two here and there, it would be enough.
Gaming isn't terrible on Mac. I've recently played Factorio, AI War 2, This War of Mine, Sayonara, The Pathless, and some other random Apple Arcade and Steam games without any issue on my 13" M1 MBP. I also played Sniper Elite 3 on it via Stadia, and the experience was quite good.
I don't feel the need to play cutting-edge games anymore. Playing Windows games in Parallels on my M1 is pretty good. I was shocked to see how well it worked. It's unlikely to run Elden Ring well, but the vast majority of games are fine.
Technically, I think this violates the license terms (Microsoft doesn't license Windows on ARM for hardware that isn't Qualcomm-based). So for the pirating community this might not be an issue, but for the long-term survivability of the industry that would need to change.
Blizzard games are, or were, probably the outlier. They supported Macs even back in the PowerPC days. As a kid with a PowerBook G4, Blizzard games (Warcraft 3, WoW, Diablo) were the only good games I was able to play. I have no idea if this is still the case for their newer games, though.
Same, I want to run AAA games on this crazy Mac hardware. A more interesting question: how hard would it be to implement something like Proton for the Mac? Are there any prior projects?
I don't think something like Proton will pan out. What will likely happen is that the larger studios making AAA games will want iOS ports. At that point, it will be trivial to tweak them for the M1 chips. Games like Elden Ring and GTA are perfectly playable on an iPad with a controller. M1 Apple TVs are definitely in the pipeline. iOS makes it worthwhile for AAA studios to port their hits over to the Apple ecosystem.
It exists as a proprietary product. Final Fantasy 14's Mac "port" is using Crossover, which slaps some magic on Wine. Definitely not a AAA graphics title but you could run it today. On a 16" MBP with the top spec M1 Max, it will get about 30-50fps with medium detail and I think I tested it at 1680x1050 - which feels like a miracle, almost, but gets trounced by an Intel i9 + AMD Vega 56(?) MBP running Windows from a few years ago. If the right APIs were available and an aarch64 binary were published instead, it would probably be a different story.
I know there is Crossover, Wine technically works on Mac from what I can tell also. So not sure if there is a technical reason Proton couldn't work on Mac other than just not building it.
Apple does talk a lot about gaming now on the iPhone and iPad, I would love to see them talk about it more on Mac (especially with how powerful these are) and maybe work with Proton like Steam is for the Steam Deck.
Proton also depends on DXVK, an additional translation layer that specifically bridges the Windows graphics APIs to Vulkan (as opposed to WINE handling its system calls in a general sense).
Meanwhile, newer versions of macOS (and the M1 that runs the ARM builds in particular) don't use Vulkan at all, opting for Metal instead.
The comparison chart for the high spec M1 Ultra was using a 3090.
So even if it "only" reaches 3060 levels of performance in games, I think many people would be okay with that. Reality will likely place it somewhere between a 3060 and the 3090.
Honestly, performance per watt is probably the least important consideration for a desktop computer.
Running a farm of these things? Maybe perf per watt makes sense to save some money. Mobile device running on battery? I totally get it.
But surely, the difference in cost for one person running it in their house, or a few hundred people in an office... like who cares? The cost savings won't even be noticeable on the power bill.
So 3090 price for 3060 performance doesn't sound very promising. I wish we could exploit those GPUs more easily for compute, but compute with Metal is simply not there, at least for now.
It's not priced like a 3090 though? There's an entire computer there and the CPU performance is no slouch, even if the GPU may be lacking for what gamers want today.
I'm pretty sure the gamers who're always rocking high-end video cards are a small minority of PC gamers, anyway, even if we only count "serious" gamers (not just Candy Crush or whatever) to remove that potential objection.
A lot more probably find a best-bang-for-the-buck midrange card and hang on to it for ~3 years, before upgrading to another midrange card.
I would like to see better gaming on Mac just so I wouldn't need two computers, or compromise just to get a machine that can be a halfway decent gaming setup.
There is a lot of support for M1. See these lists. Please note, though, that "playable" does not mean "on par with other platforms", i.e., Windows and Xbox/PlayStation.
Civ V and Civ VI - each shows one record and seems it ran for that person (30-60 fps - for comparison, both run 144 fps on my $1000 Lenovo laptop.)
It Takes Two - not in database
That's all anecdotal, i.e. the games this one person likes to play are not supported, or don't play very well, and some of them are 5+ years old.
Overall the point being, it's hard to shell out all that money for a device that won't play your games. (If this sort of gaming is part of your computer use case). So to that original comment's point "I wish Mac worked seriously well for gaming."
In reality though would you really want your games to be installed on the same machine as your daily driver?
Game companies and their parent companies tend to install shady spyware on your machine in the name of "anti-cheat software" (i.e., installed in ring 0 / kernel space). I am kind of relieved that I have 2 separate machines.
Although if they develop virtualized environments to sandbox games from the host machine (similar to QEMU) then I suppose that might work.
> In reality though would you really want your games to be installed on the same machine as your daily driver?
Yes. The vast majority of people that I'm aware of don't buy multiple desktops. If I spend $1000+ for a gaming machine, I don't want to spend hundreds more to have multiple machines to jump between and to manage connections to and to take up space in my office.
Apple doesn't get a cut of software that's sold outside the App Store on MacOS.
And the most performant MacOS APIs for graphics rendering aren't on Linux AFAIK. Besides, game developers generally eschew compatibility layers because they harm performance.
Maybe short term. Helping a company grow that is known to lock down their products and force developers to pay for access might not be the best long-term plan overall.
Apple Arcade actually has some great stuff. Even all the rebadged "+" games are much improved since Apple forbids IAP in Arcade. The game play loops have been redesigned to actually be fun. Thumper+, Winding Worlds, and What the Golf are some of my favorites.
What I'm hoping for is that with fiber/5G, things like GeForce Now become more popular. I used it recently thanks to 6 free months from AT&T, when I had an unscheduled layover due to a missed flight, and it was actually playable with 7 Days to Die (an FPS). Granted, it wasn't as good as my home machine, but this was also over hotel wifi, so not bad.
Additionally, all the creators, musicians, artists, etc. who are making assets for and working with the game industry have to use PCs to interface with the game.
Yeah, but in game dev the tooling is increasingly integrated into the engine, so making assets on a Mac and then bringing them over to a PC to put them into the game is an incredibly unnecessary and annoying friction point.
It's increasingly rare to find an artist who only works in Photoshop and isn't working in the engine, directly putting their work into the game.
And it makes total sense! Computers are slowly transitioning back to a professional/enthusiast/educational focus as casual internet browsing continues to move onto mobile devices. You don't have people buying laptops just to browse the internet on their couch anymore.
It's important to note that Apple cut off the Proton devs; not the other way around. It would indeed be pretty cool to see that sort of thing running on Apple Silicon, but the plethora of architectural changes that came along with Catalina stopped Mac support from being a viable target for Valve.
Not very surprising though; Mac native games don't really work that well either. If it relies on 32-bit libraries, it won't launch. If it uses outdated OpenGL, you can expect a plethora of errors to accompany you to an instantaneous crash. Apple has their work cut out for them, I just doubt they'll have the "courage" to bring back the features they so courageously threw away.
The last 32-bit-only Mac shipped a decade prior to Catalina, and 32-bit support was dropped a decade after those last 32-bit-only Macs. Apple Silicon is 64-bit only on both iOS and the Mac; on iOS it has been for years.
Despite that, macOS supports everything required for Proton -- the mechanisms that allow Wine to run 32-bit x86 apps, both on Intel and under Rosetta, exist pretty much purely for the sake of things like Wine (and so Proton).
So I'm not sure where you get "Apple cut off the Proton devs" from that work.
I was trying to find the original justification for why macOS was removed from Proton, but could not find any official word apart from some comments here https://github.com/ValveSoftware/Proton/issues/1344.
It was quite odd because the original herald of Steam Play had macOS front and center, even with exclusive cosmetics if you launched the game on macOS.
I do remember a large kerfuffle over the very poor state of OpenGL on macOS, which if I remember correctly was stuck on 3.2 for a long while, before being completely deprecated.
I agree that just M1 MacBook Pro is not great for gaming. However my M1 iPad Pro is very nice for some types of games. Lots to enjoy in Apple Arcade that takes advantage of motion detection in the tablet, developer support for VR/AR, etc.
I worked on Nintendo video games about 22 years ago, but I am not much of a gamer now because my iPad and Quest just about do it for me.
> I hate that I have my gaming PC and then my Mac for everything else.
Why do you hate it?
I also have this setup: I just play games, watch movies occasionally, and play music on it unattended, and I don't mind that it's running Windows as I have no other use for it.
Looks like all the people saying "just start fusing those M1 CPUs into bigger ones" were right; that's basically what they did for the new top-of-the-line M1 CPU (fused two M1 Maxes together).
And since the presenter mentioned the Mac Pro would come on another day, I wonder if they'll just do 4x M1 Max for that.
I think the Mac Pro is almost like a phantom product at this point. Or a riff. Like they're going to say "the new Mac Pro is there is no more Mac Pro".
What can they really do, in a single machine, that is more "Pro" than the Studio? How are they going to bring developers along when most aren't developing for twenty-core machines?
I still think something else is coming. Maybe something unconventionally stackable. Something for render farms.
Whatever it is, it feels like it's going to need novel OS support.
Right, but the keynote is marketing. And they've been saying that for, what, nearly a decade now? [0] And in that time they have chipped away at that segment from different directions, with the iMac Pro and the Mac Studio. The latter of which is a beast already.
What's left? Render farms, scaleable computing.
I guess I think there will be some sort of new machine in that segment. But I figure it's not going to be what people expect from a single, unitary Mac Pro desktop.
Because once you're up at that level of performance requirement, you start to want modular and scaleable things.
What is the pitch for the part of the market that is used to using commodity hardware in scaleable configurations?
I guess. I mean, it's a forgettable stopgap machine (I forgot it ;-) It's not really what people wanted, is it? It's just something they had to put on sale to shut people up. A conventional, Intel-era thing.
They've been saying "something really great is coming", all that. That Mac Pro was not that.
Does Apple still care about the market that wants to plug in internal GPU cards? Does that make much sense even in the context of hardware the speed of the Mac Studio?
What? It's a whole new tower with PCIe slots and everything, and was super well received. Were you still under the impression they were selling the trashcan Mac Pro?
I simply forgot that they made it. I forgot a lot of stuff in 2020 and 2021, like most people.
But by extension I certainly don't remember it being "super well received", or I'd probably have remembered it at all.
Either way: look at it. It's a tower PC. Do you think the current Apple trajectory has any meaningful room for that? What are people going to really be putting in it except disks?
It's an amazing machine.
It's an incredibly expensive machine, but still amazing.
> Do you think the current Apple trajectory has any meaningful room for that?
Are you upset that you think Apple is going to stop making something that you forgot they made, except there is absolutely 0 evidence that they will stop making it?
We know for a fact that they are going to release a new Apple Silicon based Mac Pro.
We knew the Mac Pro would be the last thing they'd move over to Apple Silicon.
Apple even gave a timeline when they first launched M1, and so far it seems to be on track.
Apple has been extremely pragmatic lately, backtracking on objectively bad decisions around everything from keyboards and ports to form factors.
I didn't phrase it that way, but thanks, I guess. Whatever.
> Are you upset that you think Apple is going to stop making something that you forgot they made, except there is absolutely 0 evidence that they will stop making it?
I didn't say they'd stop making it, quite.
So much as that I think the product designation is a phantom, in a way. And the more alternatives they add to the Mac Pro, the less the market needs it.
How many people who bought the forgettable machine put an expansion card in it, do you think? How many of those people thus simply do not need anything more than the Studio?
And also: how many third party manufacturers are going to rush out of the gate to port their drivers to some complex new multiprocessing M1 machine?
I agree they have been pragmatic. But I think they've also sliced and diced this segment to the point where it doesn't make the sense it did.
If all you want is disks then the Mac Studio seems to be exactly what you'd want. I would assume people that want a tower want exactly what you were asking for above, which is GPUs or other cards like audio engineers need.
I didn't see a single negative review of the tower at all, so I'm really not sure where that comes from. And they said during the keynote today a new one is coming.
If their GPU can do workloads replaceable ones can’t then it’s worth it. If you need that amount of GPU, you’re doing high end shit with a high end budget, and can afford to just replace the machine in 2 years (if you even need it - many people stick with their Mac Pro 5-10 years).
Tons of people in pro industries bought it and love it. Pre-COVID it was sold out for almost a year. Every professional music producer I know bought one, and I’m sure they’ll get the M1 Mac Pro.
Funny story. Max Martin partnered with Creative to make a new audio rack to enable the audio for the first Backstreet Boys album. That got miniaturized into audio cards for gaming, which further shrank down to a DAC, and that led to the iPod, which led to the iPhone, which led to Apple Silicon, which brought us here today. Music producers are huge fucking tech nerds and literally move technology forward. Apple frequently works with producers when developing their highest-end products, which eventually trickle down to consumer grade.
Looking at the Mac Studio... what would an M1-based, drop-in Mac Pro replacement offer on top? That sounds like a tail-end product.
It's not like 2022 Apple is the kind of business that announces deep partnerships with graphics card makers.
It feels more to me like we're going to see either something that is half-rack-half-Mac, or something with major developments in neural engine hardware, or something.
A new segment -- something that is going to need significant new OS work.
CGI, CAD, rendering, driving huge screens, running live sporting events, scientific workloads, massive simultaneous automated app testing, etc. Just because you can’t imagine why someone would need that much compute doesn’t mean the use cases don’t exist.
I did not say -- at all -- that I can't imagine that kind of power requirement.
Not at all.
What I said if you read the rest of my comments is that I find it difficult to believe there is a specifically single-desktop-machine use case for that kind of power over and above the Mac Studio.
The reason I mention this is that high end (TV and cinema) CGI just is not the domain of single machines anymore; it's the domain of render farms. The people who do that kind of work for TV and cinema now expect to run farms of commodity hardware that can be swapped out and replaced. And that technology is available to everyone in a way that can be constructed more pragmatically.
Scientific workloads, similarly: most of that market is not going to spend a bucketload on a single Mac when they can spread their risk with cluster computing.
App testing: again, an application for clustering, and low cost hardware spreads risk.
So my point, again: given the existence of the iMac Pro line, and the M1 Ultra Mac Studio (with its evident astonishing GPU performance), given that I imagine most Mac Pro users never put an expansion card in their machines (which is the grand theme of Apple -- people don't upgrade or expand, usually), and given that cluster hardware is commonly in use and well-supported, is the niche for single mega-expensive desktops really big enough?
I don't think it is -- you think it is. But I think you can disagree with me without imagining me stupid, as I disagree with you without doing the same.
The real-time rendering demos of the Studio were incredible. If the pro is 4x that it would be insanely cool. If I were still doing music videos I’d kill to have one of those on set. Or what we did for Kanye’s Yeezus tour (live motion tracking of the dancers with some kinects and putting them into models projected on the screen) - could’ve been so much cooler with this much compute in a small box. You’re talking about this as if you have knowledge from working in an industry that you clearly haven’t. $100,000 fragile rack or $8,000 shoe box you can drop and it’ll still work fine.
The visual art this thing is going to enable is going to be amazing, and we haven't even seen the Pro yet. One former client has been texting me all day about ideas from 10 years ago that weren't feasible but now are with this lil thing. I keep telling him to wait for the Pro, then we can take some REALLY neat ideas off the shelf.
> I keep telling him to wait for the Pro then we can take some REALLY neat ideas off the shelf.
Aside from the fact that you're making my point for me -- you think the Mac Studio could do the job you imagine a Mac Pro doing -- there is this:
You tell your former customers to wait for an unreleased, as yet unscheduled update to a product to which historically Apple has displayed considerable, time-insensitive indifference, rather than order maybe multiples of the product that has been announced, or try to make it work somehow with kit that exists?
I'll note down your prediction about $8000. That, I guess, would be interesting. But if it's that cheap it's going to be after the chip shortage is over, surely.
They showed it being used for pro audio, but that's a niche that still uses expansion cards a lot. Keeping the current mac pro form factor enables a lot of niche use cases.
One question is whether the M1 chips have enough connectivity to efficiently handle as many expansion cards as the current pro. Even the Max version of the CPU is limited to 4 thunderbolt ports.
Connecting additional devices was the second thing talked about in the introduction, after power & performance. If anything, the focus on upgradability is growing.
And I bet the Studio is a way to release what they've done so far to serve a subset of the Pro market that has simpler expansion needs. The Pro likely requires a ton more work.
There's USB-C MADI/Dante stuff already, and it handles a lot of bandwidth with similar latency to those PCIe cards. That guy's setup is a niche within the niche.
A friend told me last night that he thinks there's really a lot of generic resistance to the idea of using thunderbolt in this way, and that is what makes the Mac Pro necessary.
Reading more replies I think it's also clear that people still question Apple's GPU claims, and maybe there is more of a market for whatever-PCI-bus GPU cards.
Moreover I think if you look at the Mac Studio, it is difficult to imagine Apple thinking that a straightforward M1/M2 cheese grater is a remotely satisfactory product design.
But I've been told enough times that I am wrong, (albeit by people who were blown away by a keynote -- am I just old?) that I'm cheerfully willing to admit that I probably will be.
It has been a phantom product, because Apple has needed to fill major gaps in the desktop space first.
I suspect Apple is set to debut a powerful revision to the Mac Pro focused on AI/ML at WWDC.
I wrote about this 10 months back[0], and it still applies following Studio:
> It may not be obvious, but Apple has repair work to do in the pro community. Four years ago this month, Apple unusually disclosed that it was "completely rethinking the Mac Pro." [1]
> The current Mac Pro design wasn't announced until June of 2019 and didn't hit the market until December 10th of 2019. That's just six months prior to the Apple Silicon announcement.
> It seems unlikely Apple spent 2017-2019 designing a Mac Pro that they would not carry forward with Apple Silicon hardware.
> The current, Gen 3 2019 Mac Pro design has the Mac Pro Expansion Module (MPX). This is intended to be a plug-and-play system for graphics and storage upgrades. [2]
> While the Apple Silicon SoC can handle some GPU tasks, it doesn't seem to make sense for the type of work that big discrete cards have generally been deployed for.
> There is already a living example of a custom Apple-designed external graphics card. Apple designed and released Afterburner, a custom "accelerator" card targeted at video editing with the gen 3 Mac Pro in 2019.
> Afterburner has attributes of the new Apple Silicon design in that it is proprietary to Apple and fanless. [3]
> It seems implausible Apple created the Afterburner product for a single release without plans to continue to upgrade and extend the product concept using Apple Silicon.
Given "UltraFusion" it seems plausible that M2 would be able to connect multiple custom SoCs that are littered with GPU cores as stackable accelerator cards.
> So, I think the question isn't if discrete Apple Silicon GPUs will be supported but how many types and in and what configurations.
This prediction has not come to pass:
> I think the Mac Mini will remain its shape and size, and that alongside internal discrete GPUs for the Pro, Apple may release something akin to the Blackmagic eGPU products they collaborated on for the RX580 and Vega 56.
But at least the mid market display now exists.
> While possibly not big sellers, Apple Silicon eGPUs would serve generations of new AS notebooks and minis. This creates a whole additional use case. The biggest problem I see with this being a cohesive ecosystem is the lack of a mid-market Apple display. [4]
"There is already a living example of a custom Apple-designed external graphics card. Apple designed and released Afterburner, a custom "accelerator" card targeted at video editing with the gen 3 Mac Pro in 2019.
Afterburner has attributes of the new Apple Silicon design in that it is proprietary to Apple and fanless. [3]
It seems implausible Apple created the Afterburner product for a single release without plans to continue to upgrade and extend the product concept using Apple Silicon."
---
You don't understand what Afterburner is, and as a result have read far too much significance into it, and haven't noticed that they've already moved on even before the new Mac Pro is out.
Afterburner isn't a "graphics card". It's just a FPGA on a PCIe x16 expansion card. Although in principle you could download an arbitrary bitstream to the FPGA and use it for a really wide variety of purposes, in practice Apple uses it for exactly one function: accelerating ProRes. That's Apple's proprietary low-loss video codec designed for use in video editing software, most notably their own Final Cut package.
They needed Afterburner in the Intel Mac Pro because they didn't want to burn CPU cores on a software ProRes codec, and none of the silicon available in the Intel/AMD/Nvidia ecosystem offers a hardware ProRes codec.
They don't need Afterburner in Apple Silicon Macs because they simply integrated a ProRes codec into M1 Pro/Max/Ultra. I wouldn't be surprised if they were able to share a lot of source code between the two implementations. The M1 version is lots faster than the Afterburner version, which isn't surprising when you know about the penalties FPGAs pay for being reprogrammable in the field.
I suspect the M1 Pro/Max/Ultra project came first, and the Afterburner port was a quick side project. To me, it's not just plausible that Afterburner is a dead end; that's just about all it can be.
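If you want to poke at this yourself, VideoToolbox exposes a hardware-codec query; here's a minimal sketch (macOS only -- that the integrated media engine is what answers this query on M1 Pro/Max/Ultra is my assumption, not something the docs state in those terms):

    /* Check whether this machine reports hardware decode support for ProRes 422.
       Build on macOS with:
         clang check_prores.c -framework VideoToolbox -framework CoreMedia */
    #include <stdio.h>
    #include <CoreMedia/CMFormatDescription.h>
    #include <VideoToolbox/VideoToolbox.h>

    int main(void) {
        /* I'd expect true on M1 Pro/Max/Ultra thanks to the integrated ProRes
           engines, and false on most Intel Macs without Afterburner. */
        Boolean hw = VTIsHardwareDecodeSupported(kCMVideoCodecType_AppleProRes422);
        printf("Hardware ProRes 422 decode: %s\n", hw ? "yes" : "no");
        return 0;
    }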
The dialog around unveiling the Ultra had a “one final member of M1” vibe to it. I think that is the end for M1. The next cpu bumps will be M2s.
Also, I wonder about fabrication nodes. The iPhone SE is going to take a lot of dies and keep that fab very busy. M1 is on something else, but it isn't going to free up the A15 fab for other customers.
When handing over to the M1 Ultra part of the keynote he literally said something along the lines of "we're adding this last M1 chip to our lineup". So maybe an upcoming Mac Pro will have the next generation of Apple Silicon (M2).
I am curious to see what that device will look like. Surprisingly, with the last Mac Pro the modularity of the device and the ability to user-upgrade the GPU, RAM, or storage was actually a huge selling point that Apple emphasized. I can't imagine how this would work with Apple Silicon.
I’m surprised they’ve introduced four M1 variants already, and while that makes a fifth variant seem more likely… I still think they’ll want to find another way to differentiate the new Mac Pro. What seems more likely to me: the M2 MBA is released either in tandem with a M2 Ultra MP, or a few months ahead of it, and the other higher spec models will lag 6-12 months behind from then on. Especially given the significant delays on M1 Max, it makes sense to me they’ll want to roll out the next gen in a lower volume/higher margin while they ramp up production.
Re: four M1 variants, the clever thing is that from a certain perspective, they've only designed two. It's just that one of those two (M1 Max) was designed such that it could be scaled down to M1 Pro and up to M1 Ultra.
On scaling down, they simply designed and laid out M1 Max such that half the GPU cores, memory controllers, and media encoders were divisible from the rest of the chip, with the interface between the two a clean, straight cut line. To make M1 Pro masks, they likely just took the M1 Max mask artwork, cropped it, and did minor cleanup on the cropped edge to terminate all the dangling connections.
To make M1 Ultra, they just "glue" two M1 Max die together with advanced packaging and interconnect technology. Every M1 Max ships with an unused die-to-die interconnect block along one edge of the chip.
This is just about the CPU, but I really wish they'd go back to simpler naming for macOS.
The cat names were good and memorable, but now I don't even remember what I'm using or which version is newer. Even the spellings are hard for foreigners.
This is actually really cheap. Maxed out at £7,999, it's less than half the price of an HP Z configured with similar numbers, and the HP probably can't even get near the Mac.
My workstation build with Epyc is ~$5,000 and has more cores (24 core), more memory (256 GB), faster GPU (3090), a 2TB pcie 4 SSD and I suspect will perform better on standard benchmarks. Definitely not as compact as the Studio though but lot more extendable.
> I suspect will perform better on standard benchmarks.
I wouldn't be so sure about that. These M1 chips have some crazy benchmark performance.
What they don't have is a standard x86 ISA, which means that a lot of applications run emulated (and still tend to beat many Intel specs anyway). I keep reading people complaining that "It's faster, but not that much faster. What's the big deal?" when they are talking about an x86 application running on Rosetta 2.
So given that the Epyc has 24 cores versus 16 for the M1 Ultra, the Epyc has roughly 75% of the single-core performance of an M1 Ultra and 115% of the multi-core performance.
I feel that the M chips are awesome at saving power while providing performance which makes them perfect for laptops and portables. I’m not seeing their value proposition for desktops though. I don’t care about something that saves space. The only time I care about that is for portables.
Remember when people were talking about "ARM rack servers"? The idea was to have dozens of daughterboards, jammed into a 2U chassis, providing massive parallelism. The low power draw would make this possible.
It wouldn't surprise me if Apple is thinking about doing something like this, maybe with a new API/SDK focused on it. They would probably sell it as an ML machine.
The issue for that is that it’s not cheap, commodity hardware. It’s a risk to build a server farm with Apple given their historical penchant to arbitrarily cancel entire product lines that were doing ok. Apple is best as a workstation or personal machine.
IMO this is just an improved version of the garbage-can Mac Pro.
They refuse to build what a sizable number of people want, yet we're stuck with Apple because the alternatives are even worse. That's what pg got wrong with one of his predictions. He didn't realize that no one can do better than Apple's worst effort
$8100 for a 20-core, 128GB, 8TB, 2x10ge, 2xTB3 BTO config of a HP Z6 G4. You get more expandability but fewer and slower TB ports, random graphics, and a much slower CPU, also a different operating system. Z workstations won't be competitive for people who can choose macOS until they refresh them with a newer Xeon. On the other hand, Mac won't be competitive for people who require ECC or Windows or Linux.
That's why hard disks are still made and sold: they're an inexpensive way to achieve high capacity. I wouldn't expect seek latency to be problematic for video editing, and if it is, using the SSD as cache would likely be sufficient.
That 32 GB is not a typical stick. It's unified memory on the SoC, like in smartphones, and it affects performance significantly. You can read more about how it works and how it's different here:
"unspecified frequency" is LPDDR5-6400.
Better than anything else you can get for a reasonable price, and well into the experimental range with a 12th-gen Intel CPU.
>> 32 GB of unspecified frequency ram? 512 GB of unspecified speed and type ssd
Yes and it is all fun and games as long as it works. If something breaks, it is either terribly expensive to repair or unrepairable completely. Then it may not be as cheap long-term as it may look now... but the situation is probably worse with macbooks and their custom LCDs and keyboards than with this Studio.
Is it even possible to boot from USB on these M1 macs after the internal SSD fails?
This is such a random thing to complain about but I hate these Mac product pages with the animations as you scroll. I guess they are designed for mobile but I navigate them on a desktop using a mouse wheel and they always look super clunky.
It's like going to the library but someone has hired dancing cheerleaders to grab the book you're trying to read and wave it around so that you're more excited about reading it.
Apple does them FAR better than anyone else. I can’t remember if they did it first, or just popularized it, but it works well for what it is.
That said, while it looks cool, after the first time you’ve used it it’s just a pain to navigate. I kind of wish it was a separate “intro” page or something.
(They’re annoying on mobile too, but work better than desktop)
Yeah, I don't get it. They are such a UX nightmare. My company was trying to implement something like that in our latest redesign of our site and I put my foot down about it. We are in the financial B2B space, and I had to explain to the marketing team that people come to our site for information... not to be shown cute animations. Also (I haven't tested Apple's take on it), they are often ADA nightmares.
Mouse wheels mostly just don't have a good input resolution. It's one of the main reasons I stopped using a mouse and switched to an apple trackpad full time. I can't stand static pages that scroll in chunks either.
Good. None of these releases matter to me if all those machines can run is macOS. I'm not a Linux freak, but I like machines that last a long time. Long-term support is the weak spot of any proprietary design. Given the incredible hardware capabilities of these new boxes, the officially expected 5- or 6-year lifecycle is waaay too short.
Latest MacOS is supported by Mac Pro from 9 years ago, that's pretty good. Macbook Pro from 7 years ago. These M1 machines are the new baseline, I wouldn't be surprised if they're supported a decade from now, it's not like MacOS is going to need 10x the CPU/RAM to run in 2032 knock on wood
I don’t buy Apple products for philosophical reasons, but the Mac Mini, and now Mac Studio, are everything I wish Intel NUCs were. I have an Intel Skull Canyon system from 2016 as my gaming PC and it fits in my wife’s purse. Since then, the gaming NUC variants have gotten larger and larger so that they’re nearly mini-ATX form factor again.
You should get into the small form factor (SFF) Mini ITX gaming scene.
Brands like Velkase, Custom MOD (in Ukraine, cannot currently conduct business), Phanteks, DAN Case, NZXT H1 V2, and many others are really interesting options.
I've built systems that use either SFX PSUs or Flex ATX PSUs (the latter are usually modified with Noctua fans to quiet them down, so using a case that requires an SFX power supply is more noob-friendly and will have more generous power limits)
The Optimum Tech channel on YouTube is great as a general small-form-factor resource. Here's a case roundup:
LOL. If Apple actually cared about the environment they would make their products repairable and not fight the people who repair them.
It doesn't matter what you say if people are forced to trash the entire computer because of a single part failure. Most computers will not be recycled, and a lot of them end up burned in toxic fires and stripped for metals.
Linking to a marketing page of one of the absolute worst offenders and claiming that they are the most responsible company ever is delusional.
Apple pushes promotions of their services into my face all the time. Apple Music, Apple Arcade, Apple Whatever. I hate it. At least with Windows I can disable it once and forget about it.
Microsoft puts ads in the file explorer, pre-loads its OS with bloatware and unremovable (other than by using third-party hacks) system tray icons for software nobody wants or cares about, spies on absolutely everything all of the time, and sometimes surreptitiously re-enables all of the above when you've gone out of your way to disable them - and it's Apple you think is pushing their services into your face all the time?
I feel like Windows users have become hostages without realising it, trapped in an endless cycle of abuse designed to mentally beat them into submission until they just stop noticing the horror of what Windows has become. Even the UI is an insane hodge-podge of bizarre interaction paradigms and half-finished flows that make it seem like everyone on the design team suffers from split personality disorder.
If it's what you have to use because you need Windows-only tools or you have games that won't run under Proton or WINE or whatever then I get it, but I will never understand people who consciously choose to use such a hateful, abusive OS.
And for what it's worth, I can't think of a time that macOS has ever mentioned Apple Music or Apple Arcade or any other Apple service other than iCloud to me, and even then it's only been when I've run out of cloud storage space and needed to upgrade anyway.
But I didn't know you prefer it when Microsoft force-installs Candy Crush Saga, TikTok, and Adobe in your Start menu. Can you remove them? No you can't, and even if you manage to, they'll come back with the next update ;)
To each their own, after all; different standards of quality and respect, I guess.
The amount of compute paired with the shared memory and fast bandwidth would make this an awesome fit for machine learning. But for PyTorch, the framework everybody is using now at least in computer vision, there seems to be no support at all for the GPU or the neural cores (which the presentation went on and on about).
I guess I'd better not hold my breath about support there, given Apple's historical stance on supporting third-party APIs/frameworks?
Apple said that the Max's CPU is up to 2.5x faster and the Ultra's CPU is up to 3.8x faster than whatever Intel CPU is in the iMac Pro, so you're getting about 52% more CPU performance (3.8 / 2.5 ≈ 1.52) from the Ultra's doubling of CPU cores vs the Max -- so definitely feeling some linear-scaling limitations with the interconnect.
I don't work in the relevant space, but what makes coding for multi-cpu substantially harder than programming for multiple cores? Is it just having to manage separate memory for each CPU?
When you work with multiple cores, they'll likely share the same L2 or at least L3 cache. With multiple CPUs, you often pay the cost of copying that L3 cache over, or you need an L4, or in the worst case you go back to system RAM.
Each level you go out further can drastically reduce performance, so you need to try to stay on nearby cores where possible.
Most devs don't account for that, since dual-CPU machines are in the vast minority.
Yes, as in all NUMA machines, one CPU can access all memory, both local (to the CPU) and global (through the interconnect). The problem is that there is a significant latency cost when a CPU accesses non-local memory (limitations of the interconnect). So the HPC people writing their algorithms make sure this happens as little as possible, by ensuring that the data each CPU uses is allocated as locally as possible (e.g., by using the affinity controls provided by libnuma).
I was just curious if these kinds of optimizations are possible in the M1 Ultra.
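Not Apple-specific, but for the curious, this is roughly what that libnuma-style affinity control looks like on a conventional dual-socket Linux box (a minimal sketch assuming Linux with libnuma installed; none of it applies directly to macOS or the M1):

    /* Minimal sketch of NUMA-aware allocation with libnuma (Linux only).
       Compile with: gcc numa_sketch.c -lnuma */
    #include <numa.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        if (numa_available() < 0) {
            fprintf(stderr, "NUMA is not available on this system\n");
            return 1;
        }

        int node = 0;                      /* pin work and data to node 0 */
        size_t len = 64 * 1024 * 1024;     /* 64 MiB working set */

        /* Run this thread on CPUs belonging to node 0... */
        numa_run_on_node(node);
        /* ...and allocate its working set from node 0's local memory,
           so loads/stores stay off the cross-socket interconnect. */
        double *buf = numa_alloc_onnode(len, node);
        if (!buf) { perror("numa_alloc_onnode"); return 1; }

        memset(buf, 0, len);               /* touch the pages locally */

        numa_free(buf, len);
        return 0;
    }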
The way Apple presented it sounded more like the chips talked at a lower layer, much like if it was all built as one physical chip, than when you have two normal chips with an interconnect fabric.
Someone will figure it out with benchmarks or something.
I am genuinely crestfallen that there was no update to Mac Mini. That thing has the right balance for most computer science folks who want a Mac. Not everyone needs/runs a silicon Godzilla on their desk after all.
As of the current lineup, Mini is still stuck with 16 GB RAM and a disappointing number of ports.
That's a pretty big gap. Basically it means you have to shell out $2000+ if you want to do video editing on any M1 Mac. Buying a brand-new computer for video editing in 2022 that has only 16GB of RAM is unjustifiable.
$2k+ is too much for most people especially young "creatives" that are making entertainment (videos). Lots of people here make hundreds of thousands a year but none of those people make that money doing video editing or working in entertainment.
Maybe it's too much to expect a $1K computer to do heavy video work? Even then, I've been using an M1 macbook air myself for computational work, and haven't run into any memory issues and am surprised by how much power it provides for the price, and without a fan. The only issue I've had with the M1s is the display support, i.e. the mini only supports two displays and the macbooks only support a single external display.
That's sounding more like being an Apple apologist. And there are certainly other uses of graphics that don't involve PS or InDesign, e.g. Meshlab 3D renders, computer graphics & raytracing, where more RAM definitely counts. Upselling the Studio definitely hits the academic bracket (mine) hard, when it was perfectly possible to just introduce higher-end M1 variants of the Mini, with their advantage of more ports & more RAM. That sort of upgrade would theoretically have cost ~$1500 given Apple's component pricing, which is still 25% cheaper than the baseline Mac Studio.
(For students looking to upgrade, $400 is a month's rent saved)
I suppose the M1 pro is such a variant with more ports and more memory? For the M1 itself, it seems there are interesting aspects of the chip design that limit the total memory:
I also make no apologies for saying nice things about the current lineup, it's a clear improvement over 2016-2020 which I think was a step back in several ways.
> Basically it means you have to shell out $2000+ if you want to do video editing on any M1 mac.
16GB is plenty, and in fact the conversation when the M1 Macs were introduced was whether having 16GB vs. 8GB of RAM made a difference. (Mostly, it doesn't.)
MacOS on ARM handles memory differently, try it you might like it.
They can only print so many M1 Max chips at a time, and they want to sell them in a high-margin machine. Give 'em a year and the M1 Max will trickle down to the Mini.
"MacOS on ARM handles memory differently, try it you might like it."
The only sense in which this claim is true is that macOS on Arm uses a 16KiB page size, while macOS on Intel uses a 4KiB page size. This is actually less space efficient (more wasted RAM due to partially filled pages), although it has performance benefits (N CPU TLB entries cover 4x as much RAM).
If you needed 32GB RAM to avoid swapping on Intel hardware, you still do. There isn't anything which profoundly reduces memory use. Arm Mac hardware does tend to perform lots better while swapping than Intel, which fools some into thinking it is using less memory, but it isn't.
(The reasons for the better performance: one, Apple put in a hardware accelerator block for the memory compression feature which allows macOS to sometimes avoid swapping to disk, so light swapping which only invokes the compression-based pager is faster. And two, the SSD controllers integrated into Apple Silicon are sometimes much faster than the SSDs in the Intel Macs they replaced.)
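If you're curious, the page-size difference is easy to check yourself; a minimal sketch using plain POSIX (my understanding is that an x86-64 binary running under Rosetta will still report 4096):

    /* Print the VM page size: 4096 on Intel Macs, 16384 on Apple Silicon. */
    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        long page = sysconf(_SC_PAGESIZE);
        printf("page size: %ld bytes\n", page);
        return 0;
    }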
> MacOS on ARM handles memory differently, try it you might like it.
I already use the 16 GB 13". And while your comment is correct for most consumer applications regarding memory compression, it may not work that well for numerical computing in Python or C++, e.g. where a lot of RNGs are used or 2D/3D arrays are juggled. Just clarifying: your statement is mostly true, but not always. Having a higher-RAM machine could definitely help.
WWDC is a few months away. There might be a refresh of the Mini there? If we don't see a refresh of the mini this year, I can't see them not upgrading it for M2 next year.
I initially thought the Mac Studio was the "pro" version of the Mac Mini. But no -- they're still selling the i5/i7 mini. This makes me wonder whether they're going to update it at all. Maybe they'll just quietly drop it at some point down the road?
I'd buy one in a second if only it supported running VMware Fusion -- I still have to have one foot in the Windows world, and not being able to spin up an old Windows VM is a deal breaker for me. (M1 chips won't support it.)
Sure hope my latest MacBook Pro with an Intel chip lasts a while; I fear it may be one of the last ones they make.
For now, do not attempt to run Win11 on it though. It kinda sorta works a little after doing tons of workarounds, including some really crazy ones (networking through kernel-debugger stuff or something; manually installing x64 Store packages if you want the Store, etc...), the graphics are crap, and remote-desktopping into it is not very good. Seems fine for Linux VMs though (then use them over ssh).
If you want Win11 use Parallels, it works very very very well.
You can even run Fusion tech preview and Parallels at the same time :)
Yes, but AFAIK the M1s will not support emulation of x86-based VMs regardless of software vendor -- and unfortunately, I still have to run some old VMs several times a week, including a very old Windows XP environment, and I don't think anyone's in any rush to support that anytime soon.
Who knows, maybe I'll get a Mac Studio for my Mac work and keep my MacBook Pro for when I need to run non-ARM VMs.
They may be referring to non-ARM Windows. Let me tell you, Parallels running Windows on ARM is the snappiest Windows experience I have had in years... only problem is that it's Windows 11 :/
Really wish there was some way to easily run Windows 7 or 8 on this thing. It would be bliss. Now that the Windows XP source has been leaked, maybe we can come together as a community, try to harden the OS and make it forward-compatible with ARM. Bring back the Windows that people actually somewhat liked! If only that weren't a monumental task :/
Why would you overload this machine for that need, though? Just have a laptop sitting on a shelf for when you need to spin up those old VMs, do what you need to do, and put it back on the shelf.
It's just the inconvenience of having to lug around multiple machines, and back up multiple machines -- it's nice when everything you need fits under your arm. But you're right, I could offload my old VMs, and may need to do that at some point.
The 2TB option is an extra $400. That's $200/TB.
The 4TB option is $250/TB.
The 8TB option is $275/TB.
That looks like planned obsolescence to me. A customer with 1TB will likely want to upgrade sooner than a customer with 8TB, so this pricing strategy discourages people from buying the more future-proof options. Other SSDs on the market tend to be cheaper per TB as you go up in size, but Apple's seem to be completely backwards (and obscenely overpriced, of course).
Or am I being too cynical here, and this is just that famous "luxury tax"?
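To make the arithmetic behind those per-TB figures explicit, here's the calculation I read them as: upgrade price over the base 1TB configuration, divided by total capacity. Only the $400 figure is stated above; the 4TB and 8TB upgrade prices are my own back-calculation from the quoted $/TB, not taken from Apple's configurator.

    /* Re-derive the quoted $/TB figures: upgrade price / total capacity. */
    #include <stdio.h>

    int main(void) {
        struct { int tb; int upgrade_usd; } opt[] = {
            { 2,  400 },   /* stated above                       */
            { 4, 1000 },   /* assumed: implied by $250/TB * 4 TB */
            { 8, 2200 },   /* assumed: implied by $275/TB * 8 TB */
        };
        for (int i = 0; i < 3; i++)
            printf("%dTB: $%d upgrade -> $%d/TB\n",
                   opt[i].tb, opt[i].upgrade_usd, opt[i].upgrade_usd / opt[i].tb);
        return 0;
    }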
"Options" are a popular marketing scheme to have your customers with deep pockets subsidize a lower starting price at the bottom end while keeping the same overall margin. Doesn't matter if you're Works the same way whether you're buying a Mac, buying SaaS software, or buying a car.
Apple does use very good quality SSDs, so I would expect them to be premium priced regardless.
I've never heard of an accusation of planned obsolescence based on price before.
> A customer with 1TB will likely want to upgrade sooner
Though I have no data to back it up, I think this is your mistaken assumption. I think it's much more likely that people upgrade because of CPU/RAM than disk space.
A basic 8TB SSD is about $500, but it's also more than 10x slower than what's in the Mac. A fast 8TB SSD from Sabrent, which is still slower than what's in the Mac, is about $1800.
Of course not everyone needs a blazing fast SSD for all their stuff. But that's why it has USB ports in the back.
A lot of professionals use external drives for their work, and Apple can get away with charging beyond market value per TB of storage because the storage is integrated, so you'll find that most people will buy on-the-curve internal storage and rely on high-performance commodity drives for day-to-day artifact and asset work.
I'm really happy they're not going with an iMac Pro form factor for these. Having a separate display and computer is great for avoiding e-waste. There are a ton of iMacs out there with obsolete hardware but a screen that still works great.
This was initially kinda justifiable; when the first 5K iMac came out there was no commonly available connection that could do 5K (the iMac used a somewhat overclocked DP link internally, I think), so their only option would have been two DP cables (a configuration that then also had very limited support).
That excuse really went away with TB3 and newer iterations of DP, though, and it's pretty disappointing that they didn't bring it back at that point.
I've seen projects online about converting 5k iMacs into external displays by essentially ripping out the insides and putting in a driver board instead - I wonder what the relative cost (/value proposition) of that is compared to these new displays if you don't need a webcam?
If anyone thinks Apple is not very serious about AR/VR consumer products, their renewed focus on creative professionals, intense workloads, and GPU performance seems to suggest otherwise.
Whatever the marketing out of Microsoft Surface, it's the Mac that has always enabled creative workflows. I'm genuinely more excited about the Mac than any other product, which I haven't felt or said for many years.
> Whatever the marketing out of Microsoft Surface, it's the Mac that has always enabled creative workflows. I'm genuinely more excited about the Mac than any other product, which I haven't felt or said for many years.
A professional will work with whatever tools he needs, Mac or otherwise. It's pretty narrow-minded to think that it's the domain of one OS.
Apple finally releases a powerful, affordable desktop computer. Now I am worried it's the start of WW3 or otherwise the end of the world. Going outside now to look for flying pigs.
What? $2000 for the Studio plus $1600 for their 5K display means Apple's new entry level workstation now costs $3600. That's a LOT more expensive than the old iMac 27 i7 at about $2400.
To sum up, Apple just hiked their entry level workstation's price by $1200. AND they convinced you they performed magic.
No, I think that flying pig you saw was proof that Jobs' reality distortion field is still alive and well.
Except you don't have to buy their display, that's part of why the Studio is so nice. You can buy any generic display you want that doesn't cost 1600 bucks. For what you get, the 2k Studio is an absolute steal in terms of compute per watt, I/O, form factor, and longevity (Macs last ages). Plus, having 64 gigs of VRAM is absolutely amazing at that price point.
But if it works, well, it's better than Jony Ive's "design is everything, function is last" approach.
- Like the Mac Pro (2013), which was thermally limited even in its launch configuration and could not be refreshed, because more powerful parts would just mean more throttling
- The Magic Mouse 2, which, well, you could not use while charging
- The MacBook Pro Touch Bar, which is there because there was nothing else to "innovate"
- The MacBook Pro keyboard, which is so thin and good-looking that the owner has to replace it every 6 months
I agree with some of this, but will note that actually using a Magic Mouse 2 cured me of joking about the design (I was entirely on the "LOL how dumb" train before that). It wasn't an issue, in practice, and did keep me from just using it plugged in all the time (which is what I tend to do with other wireless things at my desk that have integrated rechargeable batteries).
I think the main issue with the magic mouse 2 is that over time, as the batteries wear out, the effective life of the mouse risks dropping so much that you may eventually be unable to use it for a full work-day, whereas the previous solution of AA batteries had 'infinite' longevity - while it wouldn't be too much of an issue when new, it harms the resale value
That makes sense as a legit problem. Reminds me of people going "LOL WTF do you need 12 hours of battery life for?" about the new M1 laptops. Well, for one, more battery life is always nice, and for another, it'll be really nice to still have 8 hours of battery life when the laptop's seen five heavy years of use without a battery replacement. I could see a few-years-old Magic Mouse 2 getting to be kinda shitty, sure.
I always felt like the Touch Bar was needed so people could "see" something to compensate for the cost of Touch ID plus whatever the MacBook's version of the Secure Enclave chip is called? (I know it's the T2 in the iMac)
Great, they avoided $1000 stand memes by offering a more reasonable $400 stand.
To be more serious, it looks like a pretty good product that fits well into offices. I stopped working in IT in 2018, but back then SFF was all the rage, and a powerful workstation in a smaller form factor will probably be attractive to a lot of customers.
Perfect. Recently went to a 40" ultrawide display to share between work laptop (during work hours) and personal desktop (after hours) to simplify my desk, and felt the Mini was probably just a little too limiting/low-end.
How is this going to scale up to a Mac Pro, especially related to RAM?
The Ultra caps at 128 GB of RAM (which isn't much for video editing, especially given that the GPU uses the system RAM). Today's Mac Pro goes up to 1.5TB (and has dedicated video RAM above this).
If the Mac Pro is, say, 4 Ultras stacked together, that means the new Mac Pro will be capped at 512GB of RAM.
Would Apple stack 12 Ultras together to get to 1.5TB of RAM? Seems unlikely.
Hector Martin, who's developing Asahi Linux for Apple Silicon, recently tweeted about this. The M1 Pro, M1 Max and the at-the-time-unannounced "Double M1 Max" have a completely different series identifier (T600x) from the forthcoming Mac Pro chip (T6500), so no, it won't just be a doubling/quadrupling of an existing design.
https://twitter.com/marcan42/status/1498317101245034502?s=21
Without having a teardown available, and more importantly, without having actually manually manipulated it, I am ready to say the Studio is poor design. I could be completely wrong, but it appears from the images of the back that the design is simply this: a Mac Mini on the bottom with storage, I/O & PSU, below a processor layer with a massive heatsink, which is likely the only thing taking up any space behind the top grill.
The problem is that the M1 Ultra version weighs 8 lbs., and most of that weight is going to be in the bottom, making the Studio unwieldy. I expect placement of this little bottom-heavy box, which weighs nearly as much as a gallon of water, is going to be the source of a number of wrist injuries, and it is probably safe to say there will also be some blunt force injuries to hands and fingers, some with ugly lacerations.
Most of this could have been avoided with some new cutting-edge technology colloquially known as putting a doggone handle on the top, solidly anchored into the more massive layer on the bottom. The design ethic here is similar to strapping a 350 to a go-kart. Putting a handle on something that weighs 8 lbs. is so ordinary and necessary and obvious, and because the Studio lacks it, I honestly do not care what's inside it. Even if manipulation is only going to occur once or a small handful of times, physically it is going to function like junk, and that just bugs me too much. I have too much junk as it is. I'd take 2 budget M1 Minis and an Intel Mini over a maxed-out Studio every time. I'm absolutely serious, a handle would have changed everything.
Could/would this machine make sense as a development machine? I use a MacBook Pro (M1) and it's always docked... I basically don't need it to be transportable. I could also use more power (I'm regularly running a bunch of Docker containers + PyCharm + DataGrip + Android Studio).
I don't think any M1 chip makes sense at all if you need to work with a lot of docker containers. Orders of magnitude faster to do that on a Linux machine.
A test suite that takes under 2m to run under Linux docker takes over 4m when I run it on the M1. Yet the M1 is marginally faster than this i7-1185g7. For regular development reloading apps etc it's similarly sluggish. "Just fine" sure, but that time adds up.
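One thing worth checking when Docker feels that much slower on an M1 (a hedged guess, not necessarily the cause in the case above): amd64-only images run under QEMU emulation inside the Docker Desktop VM on Apple silicon, which is dramatically slower than native arm64. A minimal sketch to see which architecture your containers actually run as, assuming Docker Desktop is installed and the `alpine` image is available for both platforms:

```python
import subprocess

# Run a throwaway alpine container for each platform and ask it what
# architecture it is actually running as. On Apple silicon, the amd64
# case goes through QEMU emulation and runs noticeably slower.
for platform in ("linux/arm64", "linux/amd64"):
    result = subprocess.run(
        ["docker", "run", "--rm", "--platform", platform, "alpine", "uname", "-m"],
        capture_output=True, text=True, check=True,
    )
    print(f"{platform}: container reports {result.stdout.strip()}")
# Expected on an M1: linux/arm64 -> aarch64 (native),
# linux/amd64 -> x86_64 (emulated). Prefer arm64 base images where possible.
```

Even fully native arm64 containers still pay for the VM's virtualized filesystem, so I/O-heavy test suites can lag behind a bare-metal Linux box regardless.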
The M1 is an interesting CPU, but OSX is less popular than Linux for software development. Apple working to support Linux on the M1 would increase sales, though realistically with the other high margin product sales they do not really _need_ extra sales.
Mac is perfectly fine for heavy Docker usage in my experience. Also OSX is extremely popular for software development. Be interested to see data that corroborates your assertions.
I actually hadn't compared the specs till you mentioned it. I think you're right, yes. The Ultra should be enough. Do you know if these new chips are only available in the Mac Studio, or whether the Ultra and Max will show up in MBPs soon as well?
The complete lack of any mention of which HDMI version the port supports is indeed pretty bizarre, but why would it come with a keyboard and mouse? The people who buy this already have those, or know which ones they need far better than Apple does, and will buy the exact models they want on their own.
> The complete lack of any mention of which HDMI version the port supports is indeed pretty bizarre
Isn't "Support for one display with up to 4K resolution at 60Hz" as stated on their specs page[1] pretty clear? It's an HDMI 2.0 port and I think mentioning the supported resolution and refresh rate is more helpful to customers than the HDMI spec version.
Side note: the HDMI Forum pulled a USB and fucked their standard so forum members didn't have to upgrade their hardware. In the same way that USB 3.0 became "USB 3.1 Gen 1", you can take HDMI 2.0 and essentially market it as HDMI 2.1 directly - it doesn't have to support the 120 Hz / 48 Gbps mode, it doesn't have to support VRR, etc. If it passes the HDMI 2.1 compliance testing, HDMI 2.0 hardware can be marketed as HDMI 2.1 without change.
So it could be HDMI 2.1 compliant and still not support the 48 Gbps mode.
(Which isn't quite the same thing as not supporting "120 Hz" - it could probably do 120 Hz over DisplayPort/Thunderbolt, or at a lower resolution, or with chroma subsampling - just not 4K120 4:4:4, because it doesn't have enough bandwidth.)
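For a rough sense of why the 48 Gbps mode matters, here's a back-of-the-envelope bandwidth estimate (active pixels only; real links also carry blanking intervals and encoding overhead, so the true requirement is higher still):

```python
# Uncompressed bandwidth for 4K @ 120 Hz with full 4:4:4 chroma,
# assuming 10 bits per channel.
width, height, refresh_hz = 3840, 2160, 120
bits_per_pixel = 3 * 10  # R, G, B at 10 bits each

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"~{raw_gbps:.1f} Gbit/s")  # ~29.9 Gbit/s of active video alone

# HDMI 2.0 tops out at 18 Gbit/s on the wire (~14.4 Gbit/s usable after
# 8b/10b encoding), so 4K120 4:4:4 needs HDMI 2.1's 48 Gbit/s FRL link.
```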
Biggest missing thing for me is no internal M.2 SSD slots where you can swap storage. Getting external enclosures is annoying; I really wish it came with 2 or 4 M.2 slots accessible from the top or bottom. Cables break, wiggle loose, or get bumped during a data transfer and cause it to get corrupted or fail, and more.
I generally like Apple designs, but I think the Studio is ugly and uninspired. It reminds me of a cheesy HiFi component. If they couldn't come up with a better original design, they should have abandoned pretty design and put a handle on the top to make it easier to move around. Also, I think it is ridiculously expensive, should have maxed out closer to $4K, not $10K. Also, since it is thick enough, they could have jammed in a little wide-screen right on the front as a backup or aux monitor. For $10K, I'd like a little wide-screen display on the front.
I see that the official Apple specs mention "support for 5 displays" and the tech specs state:
"Support for up to four Pro Display XDRs (6K resolution at 60Hz and over a billion colors) over USB-C and one 4K display (4K resolution at 60Hz and over a billion colors) over HDMI"
... so I am very pleasantly surprised there ... better-than-4K (6K) resolution on each of the four Thunderbolt ports, and then a bonus 4K display via HDMI.
The same question arises as with the Mac mini: why can't the additional (leftover) ports be used for even more displays? In this case, there are two more ports on the front (either USB-C or TB4) that can handle a display - why is that not possible?
I have to assume at some point you simply hit the limit of how many pixels it can push out. Remember, you're talking about running FOUR displays at a resolution that was unheard of on the desktop two years ago AND another display at 4k simply as a "bonus".
At the moment the PC market is upside down, too. If you're building a professional workstation, you'll find with current prices that a business XPS desktop is cheaper by a significant margin over building your own workstation with the highest end parts available today.
The "overview" is just a graphic with links to an event page and video content. That isn't an overview. An overview is a blurb of text summarizes the subject matter.
I am really, really sick of these gated "watch the video" information vehicles. I read much faster than video plays: stick to text if you want my attention for things like this.
As a comparison, a quick build on PCPartPicker: Xeon E5 22-core, Radeon RX 6900 XT, case, power supply, 64 GB of RAM, motherboard, 1TB SSD and CPU cooler. Comes in at just shy of $5,000.
That CPU isn't as powerful as the one the M1 Ultra beat in their specs, but it should be about the same GPU as they compared and beat. If the benchmarks are to be believed... the $4,000 Mac Studio will be an absolute powerhouse in the price/performance/power market for quite some time.
Normally I'd make some snarky remark about Apple Tax, but in this case they look to have the PC hardware equivalent very well and truly beat on cost. For now.
> As a comparison, a quick build on PCPartPicker: Xeon E5 22-core, Radeon RX 6900 XT, case, power supply, 64 GB of RAM, motherboard, 1TB SSD and CPU cooler. Comes in at just shy of $5,000.
And that's just the parts. Don't forget the M1 Ultra will use less electrical power too.
If anything this just reminds me how frustrating their product line was pre-M1 for so many years. I'm still dealing with the legacy of having to work with people with 16GB as the ceiling and projects that simply do not fit in them.
I'm not talking about 8GB as an end-user problem, but as a developer problem. Most of my development machines are 32GB+, but I have to create successful workflows for MacBooks that only have 16GB.
As a dev, I think this is neat - but unfortunately, to this very day, the most important workloads are usually single-threaded, and even the ones that are multi-threaded rarely scale to more than 8 cores.
This usually means that the advanced M1 variants don't really offer an advantage over the base offering, although more RAM is always nice.
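To put a number on that ceiling, here's a minimal Amdahl's-law sketch; the 90% parallel fraction is an illustrative assumption, not a measurement of any particular workload:

```python
def speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: ideal speedup when only part of the work parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (2, 4, 8, 16, 20):
    print(f"{cores:>2} cores: {speedup(0.9, cores):.2f}x")
# 2 cores: 1.82x, 4 cores: 3.08x, 8 cores: 4.71x,
# 16 cores: 6.40x, 20 cores: 6.90x -- past 8 cores the gains flatten out.
```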
It's a little funny looking, proportionally speaking, but I am so happy to see some ports on the front. It can be incredibly awkward to try and find ports on the back of a machine. At first glance it seems a little expensive, but the value based on performance is actually fantastic. Well done, Apple.
I'm kind of disappointed: Apple makes such a big thing of being green but keeps making machines that aren't upgradeable in these tiny form factors. I was really hoping for a more affordable Mac Pro with easily accessible memory/RAM and some expansion slots.
It's interesting that they are demoing a 3D app on the screen. Does this mean that 3D on a Mac is finally not underpowered? The sticking point always used to be the graphics card, but will Apple's new chip be equal to the task?
I was kind of disappointed as I expected this to be the keyboard thing that was leaked to the news a bit ago; seemed to fit just right with their arguments of portability, connectivity, modularity.
This is pretty squarely marketed towards current users of 27" iMacs and iMac Pros, so from that perspective the "modularity" being spoken of is the ability to use the display with things that aren't the Mac, as well as the ability to upgrade the display. Thunderbolt 4 also makes it better than the previous TB3 iMacs in terms of external expansion.
> I really hope apple gets the "Mac Pro" form factor right this time.
Tbh, they did mention at the end that their Apple Silicon Mac Pro is coming later down the line, which I'm guessing will focus more on upgradability and modularity
I think from a design perspective, this is the most disappointing Apple release so far. It's just way too tall, imo. There's a lot of empty space on the front which just seems odd.
I'm not exactly sure what they could've done to keep the Mac Mini footprint, but this ain't it.
Fake grills have existed for years. More airflow is usually, but not always, better.
All I'm saying is the current design doesn't look good to me, and some more work could've been done to not just make it look like a weird, tall Mac Mini.
Yes, that's called an opinion. I think the design doesn't look good, I think it's the most disappointed I have been with hardware design from Apple, and made my disappointment known. It even starts with "I think".
What, exactly, is your issue with someone posting their opinion?
I'm convinced the design is a tradeoff to make this work in the datacenter. With the exact same footprint, racks of Mac Mini servers can be converted to fit these with very little effort.
Unfortunately, the height (3.7") is just above a 2U (3.5") rack. It's also not a multiple of a Mac Mini height (1.4") which means you'll have some leftover space if you're replacing 3 Minis with one of these.
But most rack mounted Mac minis seem to be placed on their sides with space between each individual machine. Perhaps it still works in that orientation, just fewer per rack?
I was thinking that as well, but I believe they still use some sort of support to keep them propped in place, so not being an exact multiple means modifying them would be harder than, for instance, just cutting some of the vertical supports off to fit a 3xMini.
But why would anyone do that? IO is very limited, memory on the M1 Mini doesn't support ECC as far as I know, and repairs/parts/redundancy are terrible.
I can understand it if it's just for fun, but otherwise it seems like a really strange idea.
It's one of the only sane form factors available if you need off-site compute power running on MacOS. The rack mount Mac Pro helped to fill that niche somewhat, but they're just insanely expensive.
Amazon has dedicated Mac hardware in AWS as well and the pricing is also crazy expensive.
For large app devs though, the cost is worth it to build their apps in the cloud without bogging down their local machines constantly. If you have rebuild targets for a half dozen OS versions, it can add up!
Datacenters are the solution and the Mac Mini just so happened to be the form factor that was available and could be made to work with less hassle than accommodating laptops.
I did that with a Mac Mini, but at some point it seems to have entered a permanently throttled state for some reason. Disk IO and CPU performance are atrocious, and idk why. Even tried installing Linux on it, but it's still incredibly slow. Maybe some sensor broke?
Whatever it is, I've become much less trusting of Apple hardware that is that old.
Well yes and no. M1 Ultra is obviously faster, but that actually is two M1 Maxes glued together. (Of course it's much more complicated than that) In that sense it's not a completely new processor so it's not exactly breaking the "Max" promise... or am I playing devil's advocate here?
Nice machine, but for all the lip service to sustainability and environmental friendliness, the insistence on soldering everything in makes for a disposable device. In particular, once the SSD goes, it's just a very expensive brick.
Using the 27" iMac with lots of stuff plugged in is a bit annoying. I occasionally have cables coming loose when I adjust the display. And it's impossible to get additional displays that match the iMacs look.
That didn't match my experience with my 27" now-dead iMac, which I'm eager to replace with an Apple silicon model.
As for the "additional displays that match" comment, I did say above that it looks like the Studio Display could be the form factor for a new 27" iMac.
It's got an A15 chip in it already, as well as multiple ports on the back so they probably could get something less powerful than an M1 Ultra in it soon.
The whole GPU thing is really funny. The M1 Ultra is supposedly more powerful now than an NVIDIA Quadro RTX 8000, but I'm guessing that only holds for professional work.
If you try to play games on it or develop games for it, you're going to find it's less powerful than the standard AMD GPU in a 5K iMac from 2015, unless the software you run (or build) uses Metal directly.
Both Wine and Parallels usage today provide poor graphics performance.
Don't a lot of people want a computer in the $1000-2000 range? The average person who wants to do video editing on an M1 Mac just wants >=32GB of RAM. The only way to get that is to shell out $2000+.
M1 Mac Mini with 16GB RAM and 512GB SSD for $1100
M1 Max Mac Studio with 32GB RAM and 512GB SSD for $2000
Yeah because nobody ever buys a computer that costs between $1000 and $2000... ?
A 512GB SSD in a $2k computer? People who deal with HD video are going to want at least 1TB right? So it basically starts at $2200?
Is Apple going to release anything anytime soon for the young/not-rich kids who want to do real video editing (need >=32GB RAM)?