Wednesday November 18, 2020: $118.03 -($1.36) -(1.14%)
Post by dmiller on Nov 19, 2020 7:08:45 GMT -8
So basically:
It’s “game over” for Intel.
That’s the post.
Questions? 🙂
I wanted to sleep on this before answering. What led me to make such an expansive and possibly "rash" statement yesterday, in the context of all things Apple?
Was it "just" the fact that Apple has done the impossible? Or was there more subliminal goodness behind what intuitively felt like the right thing to say?
(I was going to write something much longer, but I think I can summarize more quickly this way):
This is the first time that Apple has had "the complete package" - processor/architecture, in addition to the rest of the hardware and software stack, that significantly outperforms the competition. And I don't see that lead going away; but instead, expanding.
All of the early reviews of the M1 Macs confirm this. For the things that "matter" in terms of performance, and for creative professionals - video editors, photographers, anyone whose work was, on yesterday's Intel hardware: time consuming (time is money), expensive (higher-config laptops or desktops mean more money - a LOT more money), heat generating (systems get hot, fans blow), and ultimately mostly "stuck" with Intel, with small performance gains year to year, at great angst and expense.
This has been true for both Intel Macs and PCs. The processor and directly related architecture haven't been a differentiator. Overall system design (and of course, operating system) has been an important differentiator, but everyone is running on the same set of ("yesterday's") Intel processors. Year over year, Intel has dragged. Clock speeds have increased only in small increments; next generations of chips have been delayed; the move from 10nm to 7nm has been delayed.
"Game over", to me, means this. Apple has moved on by leveraging a decade+ of developing their own processor technology. What's in that first M1 chip blows away everything else on the market, and only Apple has it. Where does Intel go from here? By the time Intel finally gets to 7nm, and only for the processor (remember, the M1 is already at 5nm, with many other custom technologies on the same chip: custom GPU, machine learning hardware, and unified memory), Apple will have stopped making any Intel Macs and will probably be at a 4nm process. Intel isn't going to catch up.
For people who are web browsing, doing social media, creating small documents, writing emails? The pure raw speed may not change everything, but we're also getting almost double the battery life on laptops - with either no fans at all (the Air), the system barely even getting warm under load, or "active cooling" (silent fans) on the Pro to sustain peak performance, still with nothing more than slight warmth. No fans kicking on with too many tabs open in Safari - which, when you think about it, is ridiculous.
But for some of the "best" customers (not the largest number, but Apple has never cared the most about the "largest number") - the creative professionals who need SPEED and have been stuck in Intel-land for years now - these systems change everything.
You can take a base model Air for $999 and edit/render 4K video in FCP, with no stuttering during playback, with "live" effects applied - and it handily beats out $3K Intel MBP systems, and even a $15K Mac Pro.
That base model Air can edit and render 4K video as if it was, effectively, standard def video from 10 years ago. It's that ridiculous. Nobody would have believed this would be possible.
That's just one early and obvious example. When Adobe puts out M1 optimized versions of Lightroom Classic, Lightroom, and Photoshop, expect similar performance gains, when working on large image files, particularly RAW; or large stitched RAW panoramas. Photographers (including me) have been complaining about editing speed in Lightroom for years, even after they've started using the GPU for assistance. It's never fast enough. I'm expecting wizardry with M1 versions.
How about price/performance? A $1K system with no fans and almost zero thermal footprint outperforms (totally clobbers) $3K-and-up systems for tasks like this. Even if I buy up from the base, I'm going to spend many $hundreds less than I would have, a few weeks ago, for a larger/more $$$$/heavier/hotter system that won't even match the performance.
To reiterate: when you compare what an M1 system can do now vs. the previous Intel version, it's like:
- I could go out today, and run a 1 minute mile.
- Someone could run a marathon in half an hour.
- My car could get 300 mpg while driving across state at 180 mph.
- Someone starts throwing pitches at 250 mph.
(That's the kind of scale difference that's being shown.) It should have been impossible; until, all of a sudden, it wasn't.
Windows users won't get any of this. None. Never will. Where would it come from?
And this is only the first chip.
/edit /added: I've been through every Apple architecture change. I started with an Apple II+ in college (6502); then had the first Mac (68000); the "next" Mac (PPC); then the first Intel Mac; and now I should have an M1 laptop next week. I've written code in Applesoft, 6502 assembler, Pascal, 68000 assembler, C/C++, and other languages for professional development on all of them. I've used Photoshop since the first "Barneyscan" version, and Lightroom Classic with massive RAW files for digital photography; and I've edited tons of video, in iMovie at first and then FCPX. There's never been an "inflection point" in the desktop platform like this until now. (Or: a ripple in the Force.)