Intel can’t grow profits in a global chip shortage (theconversation.com)
83 points by mellosouls on May 17, 2022 | hide | past | favorite | 125 comments


I swear HN's discussions on anything Intel related are always bottom of the barrel. They've been very clear about their messaging over the past few years. They are currently spending large amounts of money on capital expenditures to ensure future competitiveness and that eats into profits. The lack of increased profits during this current chip shortage is not a surprise to anyone paying attention nor is it indicative of anything particularly catastrophic for Intel.

From Reuters

>Intel Corp expects its profit margin to drop this year and then be steady for several years as it invests in new technologies and factories to meet rising chip demand, but added it forecasts climbs from 2025.


The reason this comes up is that Intel, historically, built its market dominance on its manufacturing processes. That in turn led to profitability.

I'll use Intel's advertised sizes for processes below, but keep in mind that a smaller number just represents a new manufacturing process, not the actual feature size.

Intel switched to a FinFET process with its "22nm node" in 2011, then a few years later in 2014 switched to a "14nm node" that is also FinFET. That was effectively the last profitable manufacturing innovation for Intel. A "10nm node" certainly exists, but is practically a black hole of money when you consider the time invested on it and just how far behind they are at this point.

The only concrete thing Intel did to address this was to rename all their process nodes to be smaller. That's basically it. A few years from now will mark a full decade that Intel has spent without actually introducing a new profitable manufacturing process. You can make all the press releases you want about how it'll be profitable "one day"; it doesn't change the fact that Intel can't really seem to design new manufacturing processes.

Intel effectively watched the manufacturing technology move past them and shrugged. The proposed solution to this is to spend several more years doing basically the same thing, but this time they'll hope they get it right. This is magical thinking.


And they renamed 10nm to Intel 7, but it is not EUV. Both TSMC and Samsung have moved to EUV, and until Intel does as well they are going to remain behind. All that money spent is nice, but until they start taking delivery of tons of equipment from ASML it's not going to help them at the leading edge.


Leading edge processes are not where the shortage exists -- it's trailing process nodes that are used heavily for support chips and automotive. Support chips like voltage regulators sure as hell don't need a 5nm or 7nm process, but they are critical components to any piece of electronics being able to ship. Older processes are ideal for these needs because they are very well characterized / understood, making them excellent for chips with an analog component. And older processes were cheap until demand exceeded supply.

Edit: and Intel doesn't and won't play into the market for support chips. There's no way in hell that they can support Intel's required profit margins.


To add on: Older processes also have lower static current consumption due to less leakage from bigger features. They are quantitatively better in many applications for low power or high precision.
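That tradeoff can be made concrete with a first-order CMOS power model, P ≈ α·C·V²·f + V·I_leak: dynamic switching power scales with voltage, capacitance, and clock, while static leakage power is paid even at idle. A sketch with entirely made-up numbers (none of these are measured values for any real process):

```python
# First-order CMOS power model: dynamic switching power plus static leakage.
# All numbers below are invented for illustration only.

def chip_power(c_eff, v, f, i_leak, activity=0.1):
    """Return (dynamic, static) power in watts."""
    dynamic = activity * c_eff * v**2 * f  # alpha * C * V^2 * f
    static = v * i_leak                    # burned even when the chip idles
    return dynamic, static

# "Old" node: bigger features, tiny leakage, modest clock.
old_dyn, old_static = chip_power(c_eff=1e-9, v=3.3, f=50e6, i_leak=1e-6)
# "New" node: smaller features, far leakier, much faster clock.
new_dyn, new_static = chip_power(c_eff=5e-10, v=0.9, f=2e9, i_leak=5e-3)

print(f"old node static power: {old_static * 1e6:.1f} uW")
print(f"new node static power: {new_static * 1e3:.1f} mW")
```

For an always-on sensor that sleeps most of the time, the static term dominates, which is the quantitative sense in which the older process wins.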


EUV is a tool, not a process. TSMC N5 isn't 100 percent EUV either. Intel can hit similar feature sizes with Intel 7 as TSMC can but it requires more steps, hitting their margins and lowering their yields. The chips that come out of the other end are equivalent though.
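The steps-vs-yield point can be illustrated with a toy compounded-yield model: each extra critical patterning pass multiplies in its own imperfect yield, so the flow needing more passes ends up with fewer good dies. The per-step yield and step counts below are invented for illustration, not actual Intel or TSMC figures:

```python
# Toy model: overall yield compounds across critical patterning steps.
# Per-step yield and step counts are made up for illustration.

def line_yield(per_step_yield: float, critical_steps: int) -> float:
    return per_step_yield ** critical_steps

# Hypothetical single-exposure EUV flow vs. the same features built with
# extra multi-patterning passes (more masks, more etch/clean cycles).
fewer_steps = line_yield(0.99, critical_steps=10)
more_steps = line_yield(0.99, critical_steps=25)

print(f"fewer-step flow yield: {fewer_steps:.1%}")
print(f"more-step flow yield:  {more_steps:.1%}")
```

Even with an identical per-step yield, the longer flow loses more dies and burns more tool time per wafer, which is where the margin hit comes from.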


> EUV is a tool, not a process.

It's kinda both. To be clear, EUV directly refers to a specific type of lithography equipment, one using an extreme-UV light source. However, adopting it comes with a lot of changes to the larger process flow, all of which are inter-related.

The change to EUV means smaller lithography lines, and it also means needing photoresist, etching, cleaning, and many other process steps that support that size change. Add in that, with EUV lithography, you are operating the lithography under vacuum, and it's a huge challenge. Suddenly you have to have vacuum-ready materials.

One of Intel's mid-2010s wins was getting immersion litho to work better than its peers: a similar thing, just wet wafers rather than vacuum wafers.

EUV is a tool/technology, one used only on critical layers where that precision matters (e.g., gate patterning). But the larger (entire) manufacturing process has to be designed to enable the use of that tool/technology.


EUV is coming with the Intel 4 process node. Intel is also the lead customer for ASML's High-NA EUV, which is supposed to be the next generation EUV.


I wouldn't say they shrugged, they tried but this stuff is hard. It was an engineering failure. Though engineering failures are often caused by organizational failures.


Not just engineering. I listened to an earnings call once. The CFO, or someone in that area, was claiming that "14nm was the most profitable process ever!" My thought was: well yes, it's also the longest-lived process. I'd hope you haven't been operating at a loss for all these years.


I don't think the average user or customer cares all that much about what node they are actually on. Yes, of course battery life and efficiency matter. But if Intel can make some decent incremental improvements, I just don't see them completely going away.

AMD and Intel have been trading the throne for years and every single time everyone says one of them is about to die. Yet here we are.


Via still exists and they never had the throne so I don't think either AMD or Intel are likely to die any time soon. The more interesting bet is if either will still be valued at ~$200 billion in 5 years or if these recent performance challengers end up making a large portion of the existing revenue generating x86 products look like Via CPUs did before they manage to get stronger grips into other product markets like GPUs and networking.


Also, it's really not like Intel is doing badly profit-wise. They made $20 billion last year. That is 125% of AMD's entire revenue and 75% of Nvidia's. All this talk of growth opportunity for Nvidia/AMD seems to ignore that pretty much the best-case scenario for them is to become what Intel already is, profitability-wise.


They also only have a P/E of 7.3 with a 3.3% dividend yield.


That's stock market language for low expected growth in a company that is doing OK now but doesn't look good in the future.


Of course. Everything is a stochastic process and the Markov property holds true in all instances; your brain is useless. Markets have it all figured out.

My post was in terms of relative valuation compared to AMD and NVDA, but I didn't feel like spelling it out. I mean, can one really be an order of magnitude more certain of profits vs AMD and NVDA either?

Of course this is not a democracy. It has to be that the minority game takes hold. https://arxiv.org/search/advanced?advanced=&terms-0-operator...



> nor is it indicative of anything particularly catastrophic for Intel.

Well, it's indicative of the slow-moving catastrophe that was their 10nm farce. Intel is spending a lot of money to recover from that, because they need to make drastic changes to regain their dominant position. If they weren't sacrificing profits in a big way right now, the most reasonable expectation for their future would be an IBM-style slow decline from relevance.


Agreed.

> They are currently spending large amounts of money on capital expenditures to ensure future competitiveness and that eats into profits.

Which is precisely why understanding more than just an Income Statement (i.e. Earnings) is important to understanding how much companies "make".

Hilariously, this article was published in January. Intel just (in April) recorded its highest quarterly earnings ($8B).

https://www.intc.com/news-events/press-releases/detail/1541/...

Financial media is such a trash-heap.


> They've been very clear about their messaging over the past few years

They were very clear about "finally" shipping 10nm every year for half a decade. Their messaging doesn't mean squat. I'm not sure why there are still people who believe everything Intel says.


Their “messaging” can be anything they want. By missing mobile, they are stuck just competing in the PC space, which is minuscule compared to mobile.

There are about 280 million PCs sold in a year.

https://www.gartner.com/en/newsroom/press-releases/2021-01-1...

Apple alone will be selling that many ARM chips in a year between phones, tablets, computers, set top boxes, monitors (the latest displays have the same processors that are in the iPhone 11), etc.

Apple is only 15% of the phone market. All of the major cloud providers are trying to move to ARM wherever possible.

The world is moving toward more power efficient chips. Something Intel can’t do.


Not seeing the bottom of the barrel discussion you’re referring to, in fact someone else already pointed this out. However, I really feel like this viewpoint is too charitable still. Sure, they are definitely investing in R&D to stay competitive, and that has eaten into profits. What exactly does that say about their corporate strategy before recently? If you really put two and two together, it’s hard to come up with anything but the most cynical answer: they cut R&D too short and maximized short term profits like crazy.

I don’t feel like it’s any more reasonable to take Intel’s own rhetoric at face value any more than NVIDIA or AMDs. They’re all trying to paint the best picture for investors.

Plus, none of this really discounts the valid points from the article either…


>What exactly does that say about their corporate strategy before recently?

I get the impression that people think that Intel's stagnation after 14nm was due to complacency and the desire to avoid taking risks. That is literally the exact opposite of the truth, because almost all of their problems with the 10nm node stemmed from it being far too ambitious. It incorporated a large number of extremely cutting-edge technologies which were leaps ahead of anything else being attempted at the time. This over-ambition led to nearly half a decade being wasted because, as it turns out, it was a bridge too far. If Intel had been able to pull off what they originally intended with 10nm, when they intended it, they would have been absurdly far ahead of all of their competition, but unfortunately that choice actually resulted in them falling behind.


The first half of their post-14nm stagnation was because they were overly ambitious for 10nm. But the second half of that era was because of their hubris, denial that things weren't going to plan, and inability to adjust plans. They eventually admitted defeat (to some extent) and back-ported one of their 10nm designs to 14nm as a stopgap, but they started that effort at least two years later than they should have.


I’m not denying most of that, but it feels extremely one-sided. I feel like I’m reading Intel investor relations right now. I am not suggesting Intel was not working on cutting edge technology, but so were other companies, and some of them succeeded where Intel failed. Meanwhile, Intel had multiple business units with similar issues, especially their modem business, and I doubt the postmortem said “Oops, we were too ambitious!” Bullshit. They had organizational issues and complacency. I’m sure 10nm was plagued with many problems that weren’t complacency too. That’s part of the table stakes when you are in the business of defying physics.

The financials tell a different story.


Intel’s process technology lead was the core of their strategy and explains 95% of what’s gone wrong. The 5G modem design was an irrelevant sideshow that had no impact on 10nm.


I’m not suggesting the same exact thing happened, but it’s hard to believe there weren’t some overarching management issues going on, and it seems likely they contributed to problems further down the line. A couple blunders could’ve been a fluke, but I think we’re past the “it could’ve been a fluke” stage.

Of course, this is all pure conjecture, but nobody can really be all that sure. I’m sure even people at Intel have differing opinions about what was wrong.


Word on the street is also that the 14nm death march resulted in a lot of engineers taking early retirement which left them in a worse position to pursue 10nm.


From what I've heard, that was Intel's doing. They intentionally pushed older engineers into early retirement to save money.


"These geese lay nice, gold eggs but cost too much to feed. Let's get rid of them."


Well the old guard were the ones who screwed up 14nm, so were they really the geese?


People forget that around the 14nm timeline Intel was completely DOMINATING in terms of performance. And this was not THAT long ago. Will be interesting to see where everything is in 2 years


In this case, we'd expect them to have gotten some R&D out of those overambitious failures, right? So we should hope to see some really snappy progress in post-10nm nodes?


There is a balance a business needs to keep. Underspend on your future (e.g., R&D) and you'll have higher profits, until you don't have something great to sell in a few years. That's short-term thinking. Overspend on the future and you won't have money for your shareholders, which they won't like.

Intel is expanding what they do (i.e., fabbing chips for others). This is a business opportunity that wasn't overly visible a few years ago. Global economics have changed, and that presents a new opportunity they are positioned for and jumping on.

They also under invested in R&D. Many companies do and go through cycles of it. They are playing some catch up. But, they seem to be serious about it which is good in many respects.

If you're a short-term shareholder you might not like it, because there are lower short-term profits. If you're a long-term shareholder you might like it, because they'll have a more diverse portfolio and better future product (and sales) potential.


> "...fabbing chips for others). This is a business opportunity that wasn't overly visible a few years ago. Global economics have changed and that presents a new opportunity they are positioned for and jumping on."

That's not really true, and it's not even the first time Intel has offered foundry services. There was Intel Custom Foundry a decade ago.

https://newsroom.intel.com/news-releases/altera-and-intel-ex...


Fabbing for others has been visible for decades. Apple tried to get Intel to fab its ARM processors in 2007, but Intel thought the margins were too small.


>I swear HN's discussions

Hardware discussions on HN have been trending toward mainstream-media quality since forever, or arguably 2016/17. At least this being upvoted to the top suggests there is still a silent majority who know what is going on and didn't bother commenting.


Hackernews and reddit discourse seems to desperately want to believe the Intel vs. AMD, blue versus red, style narrative.

Maybe Intel deserve it, I don't know, but it's silly to think these companies are your team.


Intel stumbled pretty hard, but they seem to have righted themselves. The fact that their stumble didn't really cost them all that much is a testament to the company.


It is amazing how much incorrect analysis this global chip shortage has brought about. The situation with Intel is complex, but I will point out the simplest error in this analysis.

Chips are not like oil, or steel or wheat. A chip shortage is a very different beast than an oil or wheat or iron ore or copper shortage.

Let's say that there are 500 different chips in a device. This is not an exaggeration. If you are short of chip #203, you cannot throw in an extra couple of pieces of chip #205 instead. These chips are different, and the exact ones are needed. Furthermore, many of these chips are made using different processes in different fabs, so you cannot slow down manufacture of chip #205 to make some extra units of chip #203.
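In other words, shippable devices are capped by the scarcest part on the bill of materials, a simple min over components. A sketch with invented part numbers and stock levels:

```python
# Devices you can build = min over required chips of (stock // qty per device).
# Part numbers and quantities are invented for illustration.
bom = {"chip_203": 2, "chip_205": 1, "chip_317": 4}          # units per device
stock = {"chip_203": 1_000, "chip_205": 50_000, "chip_317": 40_000}

buildable = min(stock[part] // qty for part, qty in bom.items())
print(buildable)  # 500 -- chip_203 is the bottleneck; surplus #205 is useless
```

No amount of extra chip #205 inventory raises that number; only more of the bottleneck part does.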

So that is the first thing to keep in mind in terms of the chip shortage. Currently the chips that are in shortage are mostly things Intel does not make. To the best of my knowledge, the chips in shortage are power electronics, power discretes, power management ICs, networking ICs, and industrial/auto/medical grade microcontrollers. If someone has updates to this list, please comment below.

The thing about the chip shortage is that if there is a shortage of something you do not make, it is not necessarily a good thing. It may mean that your client will make fewer devices because they do not have that chip #203, and you will be able to sell them fewer of your chip #205.

Of course Intel has some very serious issues, but this chip-shortage argument is just wrong.


An analogy that might work for most people:

Cars?

Intel is one of the manufacturers of engines, but they only make the literal engine block, maybe some pistons too. However cars have all those other tubes, wires, and side bits in the engine compartment and in the hidden parts of the body.

It's not the engine, but the specially sized and specially rated tubes around the engine. Those parts that everyone thought were dirt cheap, so people stopped making enough of them.


> Those parts that everyone thought were dirt cheap, so people stopped making enough of them.

Not only that, but because (in this analogy) tube making machines are very expensive, low-volume machines made by only a couple of companies in the world, take a long time to make, make only single very specific kind of tube each and take a long time to install and commission, you can't just up and start making more tubes.

And on top of that, the machines cost so much and tube technology moves along so that if you're not careful, you may never turn a profit on the tubes at all if, by the time you're tooled up, you've missed the window of opportunity and no one wants that many old-style tubes anyway.


This is pretty good and helped me understand.


And FPGAs. You may be literally looking at 100x pricing and higher, if you can even buy them at all.

Not a big component in many consumer items, but when you need them, you need them.


The chips used in Raspberry Pis.


The great Raspberry Pi shortage is probably caused by shortages of power electronics and networking chips.


If that were the case, the compute modules would be unaffected. You can't get even those right now either.


The compute module only has a 5V input, which it must convert to the voltages used by the CPU, RAM, peripherals, etc. using the aforementioned power ICs. A modern SoC like the BCM2711 usually has several different voltage domains that must be supplied by the PCB designer, ranging from 1.2V to 3.3V.
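For context, those power ICs are typically buck converters stepping the 5V rail down, and for an ideal (lossless) buck the duty cycle is simply D = Vout / Vin. A sketch with example rails (illustrative values, not the BCM2711's actual rail spec):

```python
# Ideal buck converter: duty cycle D = Vout / Vin (ignores switching losses).
# Rail voltages are examples only, not a real board's requirements.
VIN = 5.0
rails = {"core": 1.2, "io": 3.3}

duty = {name: vout / VIN for name, vout in rails.items()}
for name, d in duty.items():
    print(f"{name}: {rails[name]} V rail -> {d:.0%} duty cycle")
```

One regulator per voltage domain is needed, which is why a shortage of these cheap parts stalls the whole board.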


There is a fab space shortage. The cost to manufacture goes up and so only the highest volume, highest margin chips get made.


These analyses always focus on M&A, outsourcing, and that kind of business stuff while completely ignoring the decades of disgruntled engineers that left Intel to its present fate. Intel engineering was mismanaged for years and now they are paying the price.


> ignoring the decades of disgruntled engineers that left Intel to its present fate

Not to mention the layoffs of thousands of the most senior and experienced staff. Talk about cutting off your nose to spite your face.


A lot of people look at R&D and investment in things like fabs and process as the way forward but I feel like what you mentioned is the main problem.

Intel has been leaking and practically ejecting their best and most experienced talent to "maintain profitability". The brain drain will be their biggest problem and courting the best talent takes a lot of time and isn't something you can necessarily throw a bunch of money at.


Anecdotally they are deepening their bench right now, but it's hard to make up for the mistakes of the past decade.

Recruiting however is great for a company like Intel today, because of the ongoing tech crash it's getting really hard to get good people to join riskier companies. Intel is big, stable, rich, and too big to fail.


Intel also has a reputation for translating its success in good times to layoffs for individual engineers... so what does that imply about your risks at Intel in bad times?


The largest of those layoffs happened during Intel's years in the wilderness when Finance guys ran the company. Engineers are back in the executive leadership now, so hopefully they understand the folly of those past actions.


> The largest of those layoffs happened during Intel's years in the wilderness when Finance guys ran the company

Actually, it happened while Brian Krzanich was the CEO - he's a semiconductor/manufacturing guy who made his career at Intel.

The narrative that beancounters screwed up the company is pretty flawed.


That occasionally gets mentioned here [0]

[0] https://www.thelayoff.com/intel


Because that is what matters when it comes down to money, and Intel has plenty of it to burn, and patents too, regardless of what ARM and AMD groupies wish for.


Intel got MBA'ed just like most American companies. Will be interesting if/when it will happen to Apple.


Intel’s woes were caused by poor engineering decisions. Apple has been run by MBAs for 10 years and is more successful than ever.


No, Apple's top leadership are all operations people or engineers, plus the general counsel. The fact that some spent two years in an MBA program doesn't make them forget their four years of industrial engineering/operations/supply chain undergrad.

Way different from most people who get MBAs.


42% of the most recent HBS class have science or engineering undergraduate degrees https://www.hbs.edu/mba/admissions/class-profile/Pages/defau...


If HBS only manages 42%, I imagine it's even less elsewhere. And that number includes life sciences, etc., which is a very different thing.


Isn't this because Intel under-invested in R&D over the last several years and Gelsinger is (rightly, it seems) saying he needs to invest a bunch of money to catch up?


It's not just R&D. They are opening their chip making facilities to the designs of others and increasing their ability to produce chips. All of that takes capital investments and it's a new line of business. Much of this isn't R&D but about building out capacity and processes for this new area.


I don't think they under-invested with sheer magnitude of money. The money is plentiful at Intel. It seems like they need better leadership, which hopefully they have in Gelsinger.


It wasn't under-investment, it was lack of focus. They spent their time chasing new growth opportunities at the cost of their core business. They wanted to do IoT (low power, which they're not particularly good at, and where the margins are awful), 5G (they had some success here), and autonomous driving (some success, but still comparatively tiny). Meanwhile their core business started to fail.


I was bullish on Intel all of last year but I think it’s effectively dead money until 2024-25. Between AMD and Apple, Intel is so far behind. Their GPU launch is also an indication that they are very disorganized.


Profits are still decently high and they have the capital to basically subsidize their own chips (as they have done in the past) to keep their market up.


>Their GPU launch is also an indication that they are very disorganized

Maybe it's something not Intel-related but, cough, the head of their GPU unit.


Top three reply chains are transparently Intel damage control. They slide the topic but don't add anything to the discussion. Fact is, Intel flopped on the tick-tock cycle and is going the TSMC route to try to move to a smaller process; this is essentially what's been going on with them for the past few years. Moreover, we learned through Spectre and Meltdown that Intel took a lot of shortcuts for decades; Theo de Raadt famously called out the Core microarchitecture as being dramatically flawed, and he ended up being correct[1].

Profits are important as a publicly held company but Intel needs a new from-scratch improved microarchitecture which isn't based on the venerable Pentium Pro, which is basically what we're all running a riced up version of even today.

[1]https://linuxreviews.org/Theo_de_Raadt_on_the_Intel_Core_2_J...


>Profits are important as a publicly held company but Intel needs a new from-scratch improved microarchitecture which isn't based on the venerable Pentium Pro, which is basically what we're all running a riced up version of even today.

Indeed. In the one market Intel dominates, what saved the company 15 years ago amid Itanic and Pentium 4 was 1) AMD Thunderbird being just as inefficient as Pentium 4 and, more importantly, 2) an Intel Israel skunk works project to improve on the Pentium 3. There is no such out-of-the-blue miracle this time.

Meanwhile, Intel has failed, or at least failed to distinguish itself, in every single market it has entered since IBM chose the 8088 forty years ago: every non-x86 instruction set CPU, flash memory, antivirus (still perhaps the most mystifying move in Intel history), servers/motherboards, and GPUs come to mind. The most successful non-x86 business for Intel is ... Ethernet cards? Meanwhile, it's embarrassed itself with discrete GPUs at least three separate times. A few years ago, buying Nvidia would have been a savvy move; now it can no longer do so, and it's not impossible to imagine Nvidia buying Intel in the future.


Intel's made a bet on RISC-V, so let's see how that turns out. It's a bit of a signal that maybe x86/x64 is dying. It might only be apparent in a couple of years, but at this point it was enough to make me prick up my ears.


They're also strong in Wi-Fi chipsets and FPGAs.


The big question in semiconductors is what will happen to the China-Taiwan situation. China has repeatedly stated that they intend to reunify with Taiwan by 2049, the 100th anniversary of the People's Republic. I think they will probably try. If there's fighting, there's a good chance that TSMC ends up destroyed. Refugees can flee; physical plants can be destroyed.

So if TSMC is destroyed, what happens to everyone downstream who depends on semiconductors? We probably end up both delaying products for years and giving Intel boatloads of money thinking "I sure hope they can scale up their only-three-years-behind technology quickly."

So we should subsidize Intel to keep improving their currently-uncompetitive technology. This isn't the same as most "we should make things in America" complaints, it's a contingency plan for a specific, reasonably likely intense disruption to the world of technology.


Isn't tsmc building fabs in Arizona and Japan?

I don't think China will invade Taiwan but what do I know. Is it really worth the risk of being cut off from international trade and banking the way Russia has been? If China invades and there's an international embargo of Chinese goods, semiconductors are but one small part of the havoc that would ensue.


>Isn't tsmc building fabs in Arizona and Japan?

Both are planned fabs, and the projected growth in fab capacity in both locations combined wouldn't equal a single mega-fab in TSMC Taiwan.


> This isn't the same as most "we should make things in America" complaints

How is it any different? There is a strategic argument for onshoring pretty much anything that isn’t utterly frivolous like fidget spinners and such. Look how effectively Europe’s need for energy has undermined the Russia sanctions. The Ruble is stronger than it was before the invasion!

It’s nice not to be vulnerable to foreign embargoes of food or energy, to name a couple of strategically important things besides semiconductors.


> There is a strategic argument for onshoring pretty much anything that isn’t utterly frivolous

I would probably adjust that to "onshore anything that takes longer than <insert number of years> to spin up".

You can spin up furniture factories in a year or less. You probably don't need them onshored. A steel mill probably takes 3-5 years; you probably should have a local alternative. You can't spin up an ASML in less than 10 years; there should definitely be a local ASML alternative.


> anything that isn’t utterly frivolous like fidget spinners and such.

This frivolity is exactly what countries need to do if they want to manufacture at the level China is. Fidget spinners were built as a way of scaling up the manufacturing of very high-tolerance bearings, and selling them in the USA for $5 helped ensure that every American involved in manufacturing understood just how adept China had become.


> We probably end up both delaying products for years and giving Intel boatloads of money thinking "I sure hope they can scale up their only-three-years-behind technology quickly."

It’s weird, five years ago they were two years ahead of everybody.


Haven't checked Intel's stock in quite some time. I am familiar with their RISC-V moves and big future plays.

Market Cap 180 billion sounds great.

PE ratio of 7.3...

WOW. People are heavily predicting a major crash for Intel. Very nice dividend as well... so why aren't they attracting people?

15 billion/year into R&D is pretty comfy.

Gain on sale of securities: $6,823 million.

Good and increasing EPS, consistently beating estimates.

Total assets are going up nicely.

Total debt is going up and is a bit high.

Their financials look good. I don't see why their PE ratio would be so low.

0.06% of shares held by all insiders.

Oh. Intel is probably just skittering by, and it's really the everything bubble that's the problem? Might make sense.

I think the bigger reality? They list their assets at $168 billion while their market cap is $180 billion. Can we talk about bankruptcy?


It is the same reason companies predicted to have large growth have higher P/E multiples. A P/E of ~7 indicates that the market does not believe Intel will be able to keep this level of earnings in the future.

Here are some of the factors:

   * Getting beaten in manufacturing by TSMC
   * Competition in x86 from AMD
   * Centralization of the lucrative server market into a few very large players that are fabricating their own ARM-based chips (e.g., AWS Graviton)
   * x86 no longer having the same moat, as ARM's mobile domination has made many major libraries, languages, and frameworks work just as well on ARM
   * A recent track record of failing to respond to such challenges in a competitive way
Of course they could turn this around and keep making insane profits, but that is what the market is thinking at the moment.
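To make the multiple concrete: a P/E inverts into an earnings yield (1/PE), and price = EPS × PE, so a low multiple means the market pays little per dollar of current earnings because it doubts those earnings will persist. A quick sketch with illustrative numbers, not live market data:

```python
# P/E arithmetic: earnings yield = 1 / PE, and price = EPS * PE.
# All figures below are illustrative, not actual quotes.

def earnings_yield(pe: float) -> float:
    return 1.0 / pe

def implied_eps(price: float, pe: float) -> float:
    return price / pe

print(f"P/E 7  -> earnings yield {earnings_yield(7):.1%}")   # value pricing
print(f"P/E 40 -> earnings yield {earnings_yield(40):.1%}")  # growth pricing
print(f"$45 share at P/E 7 implies EPS of ${implied_eps(45, 7):.2f}")
```

A ~14% earnings yield only looks cheap if you believe the earnings hold up; the market is pricing in that they won't.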


Intel isn't successful in the fields that "the market" thinks are the future.

ARM is slowly chiseling away at their market share on the general-purpose end of the spectrum. It's pretty clear that Apple intends to move completely away from Intel at some point in the near future.

Nvidia has heavily outpaced them in the AI computing space. Data centers are filling up with racks of five-figure Nvidia cards. Intel was never competitive in the gaming sector, so there's no evidence to suggest they could ever catch up.

Even in their home turf of legacy PC chips, there is strong competition from AMD.

I can see why people aren't expecting huge growth from Intel in the future. There isn't an obvious growth market that they are primed to take.


Part of the reason was their lower-than-expected margin in their DC/enterprise sector, where they are facing competition from AMD EPYC. This somehow only showed up in recent quarters, which is much later than most expected, as EPYC has been on the market for quite some time.


Question: why is it that Intel can’t just poach a bunch of the best TSMC engineers and catch up that way?

Is it a lack of knowledge how to do this manufacturing? Or is it more like a skills shortage of engineers to do the actual manufacturing?

Where exactly in the process can't they compete, and why is it so hard to catch up, given their virtually infinite R&D budget?


These questions tend to come from the software side, especially from web developers, where deployment and distribution lead times are practically zero.

Even if Intel knew the exact method to catch up, consider the time required to build fabs. Even if everything were perfectly planned both internally (staffing, operations, recruitment) and externally, with land, water, electricity supply, and equipment all readily available, it would still take 8 months to build a fab and a year to bring it into operation. That is the fastest record to date, set by Apple and TSMC with lots of help from government (in terms of regulation and infrastructure approvals). Include the planning prior to that and it is more like 2 years. And that is already considered an astonishing achievement, because most other players (including TSMC themselves) need 3-5 years.

And right now, ASML has a backlog of orders to fill. Assuming they can't increase their rate of production, both because of the complexity involved and because of the chip shortage itself, they are talking about 2024 if not 2025 orders. That is ignoring other parts of the supply chain, like Tokyo Electron, which are also running extremely tight. (Or Applied Materials, if you want to look it up.)

Again, all of that is assuming you knew exactly what to do to catch up to leading node.


Hey thanks for the reply (and yeah, software dev here!)

Follow-up question: does anyone know what it is they don't know? Because while that does sound like a long time, they've been faffing around for years now trying to make 10nm work. Even on that time scale they should have been able to build a new fab; Intel should really know how to do that anyway.

If there is something they don't know, why can't they just buy it in? That's the thing I don't get. They have so much money still.


This is strategic. Making money out of it is optional (like defence).


It's arguably strategic to have domestic capacity, sure. It doesn't have to be Intel.

Giving a company even the vague belief, let alone the certainty, that it won't be allowed to fail is a stunningly bad idea.


Who else can it be besides Intel? I don't think it even needs to be "domestic", per se, it just needs to be in a country that isn't extremely likely to be invaded by China.


Hmm. I think it depends on the application. Defense isn't HPC, right? They can probably get by on 22nm or so for most applications, so GlobalFoundries might be able to handle it.

Or, heck, Texas Instruments might be able to handle some chips (obviously they aren't cutting-edge on the digital logic process node side, but they do all the funky analog stuff and they'll keep making a design for, like, a bazillion years, which may be appealing to the military).


IBM had bleeding edge fabs not too long ago and is still involved in the research if not the mass production side.


They should ask Nvidia how they did it. Or they could just peek; they got the data, after all.


Intel is done. Gelsinger can do his best to rally the troops but it's too late to catch up with Apple Silicon. The M1 came out in 2020 and Intel is still unable to match it. The M2 is coming this year, leaving Intel ever further behind.


It's extremely possible for a product to be strictly dominated by another on technical merits and still be successful.

Saying "Intel is done" because Apple produced a great chip focuses on only one small (albeit very important!) part of the puzzle.


I'm so old I've seen many of these cycles where some new CPU blew everything else away and everyone said the competitors were done. Most of these "competitor-killer" CPUs are not even remembered today (HP PA-7000 anyone?)


Intel even did it to themselves. Itanium was supposed to be an x86 killer iirc.


Itanium didn't work but not only because of hardware. They didn't invest nearly enough in developer relations, compiler tech, contributions to other compilers and working with Microsoft to ensure a good landing spot for their tech.

That and huge die sizes which in turn resulted in high prices just didn't make it competitive considering all the other stuff.


PA-RISC will be remembered forever in the hearts of many:

https://youtu.be/VLTh4uVJduI


The difference is their competitors had neither the volume nor the margins to stay competitive. Mobile is much larger than desktop/server.


then again, not everyone runs on a mac, and won't be anytime soon.


Doesn’t matter.


Right, because the millions of computers in corporate America running Windows are all just going to be thrown out and converted to M1/M2 Macs.

I prefer Macs, and use them every day, but predictions of the death of Intel are premature. Intel doesn't need to be faster than the M1 or M2; they just need to be at least as fast as what people need to surf the web, create spreadsheets and PowerPoints, and run all that legacy enterprise software that only runs on Windows.

Macs are better, but they are much more expensive on average. Where I work (a mega-corp), I can order a new Windows machine whenever I want; if I need to get or upgrade my Mac, it needs to be justified and approved two levels up from me, and really only developers and some graphics folks ever get approval. My company probably buys/replaces 30K-40K machines every year, 99% of which are running Windows on Intel.


I hate Apple, but every time I run something resource-intensive on my laptop and it starts to sound like a plane about to take off, I go to Apple.com and hover over the Check Out button. Once there is a reliable way to run Linux on the M1/M2, I'll switch.


If you run MS Office apps on a Mac, you'll be burning CPU like nobody's business. I have a 2019 MacBook Pro 16; the fans rev up regularly as Excel (or PowerPoint, or Word...) uses 100% CPU. The Mac window manager process is a hog too. Honestly, I switched to Mac for work assuming "it just works", but unfortunately it does not. The M1 is likely better, but I'm not overly impressed on the software side.


Why do you think that Linux would run as fast or efficiently on an M1 Mac as an operating system that was designed from the ground up to run well on it?


For devs, macOS is still a pain. Asahi is still far behind. I just don't get why Apple won't embrace Linux. Who's stopping that from happening now that Jobs is gone?


What benefit does Apple get by “embracing Linux”?

And more "devs" are using Macs than Linux, so it must not be too bad. That's not even to mention that Android developers are saying Macs are faster for development than x86 PCs, and of course iOS developers are using Macs.

What benefit is there in “embracing Linux” for Apple? Better software? Better hardware support? Popularity?


I don't think it's about running faster, clearly OP wants to use Linux over MacOS, same for me. For dev work Linux is still king, and I happen to also personally prefer the KDE UX over MacOS.

So while Mac laptops are great hardware wise, it doesn't run the software I want, so that's why I'm still buying laptops made for Windows.


Except when dev work means anything related to graphics programming; then the king is naked.


Mac laptops have bad GPUs generally though, that's why people I know who work on games have Windows laptops most of the time, basically beefy gaming laptops. Or to be honest most of them don't even use laptops and develop on a desktop for that same reason.


I am lost. So for graphics developers, is Linux still king for dev work or not?


For graphics I don't know, probably depends what kind of graphics you're targeting. By king I didn't imply most popular, I doubt Linux is the most popular at anything honestly. I meant that I find it still excels at development because of the way the OS and userland tools are setup. If I needed to do some graphics programming and I could get away with using Linux for it I probably would still choose it.

But for graphics, unless we're talking 2D or simple stuff, I'd imagine you'd want some beefy GPU, and that means you're buying a PC which gives you the choice of Windows or Linux.

My complaint is that Mac laptops don't let you install Linux and MacOS didn't embrace Linux with something like WSL for example either, and that holds me back, because otherwise the laptops are very enticing.


See, that is what I took issue with, because devs != UNIX CLI, as seems to be the cargo cult in some circles.


I mean, it's my personal preference, I find having a good command line and package manager to be quite nice for development personally. And I prefer the KDE UX as well. I also like Unix as a whole, even the OS configuration is just code inside files.

And I think I associate the userland to be part of the OS. So for example, instead of thinking, oh I wish Windows had a better command line and package manager and got rid of the registry and used files instead as the main abstraction, I'm much more likely to wish that Nvidia released high quality drivers for Linux and that Unity had prime support for it.


> graphics programming

You mean as in Vulkan and games, or rendering? Linux is your best choice unless you are using some Windows-first framework like Unity. (AFAIK there is no OS X-first one.)


That answer only reveals a lack of knowledge of the state of the art in GUI and 3D graphics tooling in general.

For your information, all relevant middleware supports Metal, including Unity and Unreal.

Isn't it great that the "best choice" needs to rely on workarounds like Proton and Electron apps, and gets zero ports from Android/Linux.


I'm not so sure about that. I work in graphics and we're 99% Linux.


Welcome to the 1% of the desktop market in games and graphics, the market worthy of a king.


Hum... You know you don't need to program on the same platform that your game will run on, right? (I still don't know what you mean by "graphics", since it's not games.)

Otherwise creating a mobile game would be pretty insane.


Anything related to GPGPU programming with sane tooling, like what Nvidia Nsight, Instruments, and PIX are capable of.

Has RenderDoc finally started to support shader debugging, with watchpoints and everything else that one expects?

Engines like Unreal and Unity.

Visualization tools like Maya, AutoCAD, Catia, 3D Painter, OctaneRender, InDesign, Photoshop, Illustrator,...

CMYK and other typesetting colour workflows tooling.

Yes, you will probably point out that one or two of those have a Linux version with a feature subset, which is OK, I guess, when the kingdom is actually a principality.


Games aren't everything. I work in the VFX space.


If it is king for dev work, it should win at it across the board, regardless of what work the dev does.

Game devs are also devs.

And in VFX, UNIX was already dominant thanks to SGI, not Linux.

Which tends to be used for render farms in most cases, while graphics-oriented people stick with macOS and Windows workstations for their daily workflows.


[flagged]


Apparently not benchmarks



