
My understanding is that a 40nm fab is only economically viable if it's spent the first several years of its life producing high-margin chips.

In other words, the life cycle of a 40nm fab is:

  1997: start building fab
  2000: fab goes online and starts producing CPUs 
  2006: fab upgraded
  2012: fab switches from CPUs to video and memory controller chipsets
  2018: fab switches to USB controllers and embedded chips
  2019: fab offline for 2 months because an antiquated but critical part is broken and is only brought back online because another similarly old fab went offline and sold off their parts
  2020: fab shut off because of covid
  2021: fab found to be a write-off because too many things broke while fab was offline.
So if you skip straight past the profitable phase, you end up spending billions of dollars to make a fab that makes $0.30 parts, and it'll never be profitable unless those parts are $10 each, which in turn makes the product they're in unprofitable.
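
A rough back-of-envelope sketch of that amortization argument (every number below is a made-up assumption for illustration, not real fab economics):

  # Hypothetical back-of-envelope: amortizing fab construction over its output.
  # Every figure is an assumption for illustration, not real fab data.
  capex = 3e9                # assumed cost to build a new fab, in dollars
  lifetime_years = 10        # assumed depreciation window
  wafers_per_month = 10_000  # assumed wafer starts for a modest legacy fab
  dies_per_wafer = 2_000     # assumed die count for a small commodity part

  total_dies = wafers_per_month * 12 * lifetime_years * dies_per_wafer
  print(f"capex alone per die: ${capex / total_dies:.2f}")  # ~$1.25 with these numbers

  # Construction alone already costs more per die than the $0.30 the part sells
  # for, before wafers, chemicals, labor, or yield loss enter the picture.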


You are correct. Building fabs today only for fabbing much older nodes will not be profitable. You have to target 22nm and below; otherwise you can't afford to jump into the semi-fab ring.


TSMC is building a lot of new 28nm production capacity, with plans to shut down all their older nodes and move everyone over in the next few years.

GlobalFoundries (formerly AMD's fabs) created a brand-new 22nm planar process specifically for older chips, as an upgrade to other companies' 28nm processes.

Profits seem possible if you approach it the right way.


We're talking about different things here. I was talking about building new fabs for 28nm nodes and you're talking about TSMC upgrading existing fabs from older nodes to 28nm production.

Of course upgrading an existing older "sunk-cost" fab to 28nm production will be profitable, but not building a new one from scratch just for that same older node.


But now the subsidies angle makes more sense: you subsidize initial construction, and then the domestic plant remains online indefinitely because the construction is a sunk cost and the incremental cost of upgrades over time is sustainable.


The math works out a lot better when you’re upgrading pre-EUV fabs or expanding an existing facility. A lot of the gear and setup is mostly the same (wafer cleaning, HVAC and isolation, etc.), and the local challenges of setup and labor have already been figured out.


Whatever the best they can get out of a DUV planar process is, it will be used for decades to come.


"But I've got a product that's certified with this part that's running on a 40nm process that has these specifications that are deeply tied to features of that 40nm process; things like voltage ranges and temperature tolerances! If you force me to switch to a comparable but not identical part at 22nm I'll have to re-certify my widget with 18 different regulatory agencies!"


If those are your needs, you order all the parts you need over your product’s lifetime up-front, or get (= pay for) a contract with the manufacturer that makes them promise to sell you the parts for X years. (They probably wouldn’t keep producing old parts, but would stockpile enough of them to be able to deliver working ones years later.)

(Or you prepare for having to go to eBay for working parts. https://www.nytimes.com/2002/05/12/us/for-parts-nasa-boldly-...)


There are companies (I've used Rochester Electronics) that both stockpile and manufacture legacy chips specifically for these long-tail support situations.


You will always have pure analog electronics and other bespoke things that basically don't benefit from anything finer than these nodes. Even for digital chips, it makes no sense to use leading edge nodes for very simple logic where a lot of the area is just contact pads.


MEMS (micro-electromechanical systems) are relevant to mention in this context; they use much older process nodes. Be it digital micromirror devices¹, gyros², or photo/laser diodes.

Given the physical limitations, as well as the problems we have with codebase security, it might be time to aim for cheaper production of something in the region of 180nm instead.

Looking at how old much of the standard weaponry used today is (TOW: 50 years, with an actual physical gyroscope; Javelin: still 25 years³), the demand from the military alone should cover the initial cost. Especially if you look at the ludicrous prices western countries paid for even dumb artillery shells.

¹ Texas Instruments DMD from a DLP projector from @AppliedScience https://youtu.be/9nb8mM3uEIc?t=428

² Explanation of MPU-6050 from @BreakingTaps https://youtu.be/9X4frIQo7x0?t=664

³ Teardown of both from @lelabodemichel5162 https://www.youtube.com/watch?v=s7-6hgX7-zQ

Sorry for the late edits.


It's not about what you can do or can't do. It is about what you can do profitably, and that's a completely different thing.


I have to wonder if the ability to profit depends entirely on the established cartel of semiconductor manufacturers. They determine the current prices of chips in the marketplace.

If entering that marketplace requires competing with them, then I am not sure anyone that is not already in the market can ever win. The margins are too low and the startup costs are too high.

Government intervention seems to be the only possible solution, and that option hardly sounds viable when considering that cartel’s collective lobbying power.


I don't think this is a "cartel of semiconductor manufacturers" so much as it's been a "shambolic cluster of organizations running crappy old fabs into the ground producing cheap chips that were subsidized by a prior decade's worth of very expensive products."

I can afford to sell gazillions of chips at $0.08 per chip if I'm running a fab I didn't pay to build. I'm only (barely) paying for the inputs. When Stan, the last guy who understands how to run the widget verifier, or Elaine, the last lady who understands how to run the polishing machine, retires, I'll have to close up shop.

Those $0.08 per chip devices have been absurdly subsidized in that a replacement infrastructure to make them would require that they cost $10 per device, and the ecosystem of things built on $0.08 chips isn't viable in a $10 per chip world.

In order to have a fab make $0.03 per unit devices, you first have to have the fab spend 10 years making $300 per unit devices, regardless of the underlying node size of those $300 per unit devices.

Likely you couldn't even go back and make a fab that makes large volumes of 60nm-90nm node sizes at all, for any amount of money, because the equipment to do this (new) hasn't been made in 2 decades and no company is willing to invest the money to make new crappy old equipment.

It's not a nefarious oligopoly as much as a synchronized "run the asset to failure" lifecycle of the infrastructure.

How much does it cost to make a 300 year old tree?


>Likely you couldn't even go back and make a fab that makes large volumes of 60nm-90nm node sizes at all, for any amount of money, because the equipment to do this (new) hasn't been made in 2 decades and no company is willing to invest the money to make new crappy old equipment.

I believe your argument assumes that there is a fixed cost to produce even 180nm or 350nm ICs that hasn't changed since the first one was produced.

We still need 300 years for a 300 year old tree, but 25 year old technology might now be relatively easy to build if we start from scratch.

What was high tech then might be relatively easy to solve now. One example might be https://github.com/circuitvalley/USB_C_Industrial_Camera_FPG... being open source instead of a multi-year, multi-million-dollar project.


Yes, my argument is that producing even chunky nodes at industrial scale requires enormous capital expenditures and may be impossible without rebuilding large chunks of an antiquated and abandoned supply chain.

Even if it is 10% of the cost of making each of the individual components involved in making a relatively simple 90nm chip, you're still looking at vast costs.

If you're talking about making 30 chips in a university fab, sure, I'll concede that it is "possible" but if you're talking about propping up an industry built on products that require a herd of standardized "$0.30" parts made on legacy 90nm fabs, that ship has sailed.

Update your BOM and recertify or raise your costs by an order of magnitude.


First off, you are definitely making a very solid point: the costs of getting mass production right are a killer once the institutional knowledge is gone. For example, it's very visible in the field of battery technologies, if I am not mistaken. Going from lead to lithium was a gigantic task, and the inertia going forward hasn't reduced enough at this point.

But realistically, isn't this a matter of going back far enough to lower the cost far enough? 10% is a good start, but to stick to the topic: physical gyroscopes from decades ago are now replaced with MEMS ICs, where the reduction in cost is magnitudes more than down to 10%. At a certain point the reduced cost makes it viable. The question is just: has it been long enough?

While we won't get 90nm cheap enough, the question is what can we do on a hobby level (vs. academia)? Because going from there (negligible cost and technological requirements) to mass production will at some point be cheaper than the cost of setting up reproducible tooling for older high-tech systems.

I am likely still off with 180nm, but there should be a level at which this makes economic sense. A level that gets cheaper to reach with technological progress / time.


The problem I see with this argument is that there are plenty of fabs making trailing-edge devices, some of which aren't even that old. It even seems to be part of the established path for countries and locations more generally that seek to bootstrap a semiconductor industry of their own. They get started with the simplest and coarsest nodes, then go finer step by step. Even TSMC got their start that way. So it seems like a pretty robust industry to me, I'm not seeing the argument for a crisis.


Personally I can't follow this line of reasoning. In the end this is an economic argument, as they still buy machines from the same manufacturer. At that point it's a matter of being able to deliver and create a market for ICs with the given machines. Which is often achieved through political will and subsidies to get to that point.

My initial argument is that while you can't compete with ASML products in 2023, you will be able to economically compete with some of their older products once you go back far enough.


> How much does it cost to make a 300 year old tree?

Aside from your main point, I found this an interesting thought exercise: thinking about the cost of air, sunlight, soil, water, and then 300 years of security.


I imagine if you're going to grow one 300 year old tree, then your best bet is obscurity. Find a stable very-rural area that's not prone to bushfires, plant one tree and make sure it's doing well for a few years, come back 300 years later, you're done.

If you're not going the obscurity path then you'd really want to scale it up - there's not much difference between security for one tree and security for 100 trees.


The capital expense on a new fab is crazy. There may be a cartel factor but that usually would work to the advantage of the manufacturers, so that doesn't seem to be the case here.


There's no real cartel for older nodes. It's not even really possible considering how many fabs exist and how many players are operating those older fabs.


The number of producers of these fabs is still quite limited, though.


But you can only really make those profitably for a few industries (military, medical, seismic come to mind). The EU does have the chip fabs for those industries, of course...


Power electronics are a mainstream, very profitable market.


>> But you can only really make those profitably for a few industries

I think it's more like they're only profitable if the equipment is already paid for. And even then the margins may be low.


There might be an argument then that it would be worth it for the state to take the hit. If shit hits the fan and you have zero semi-manufacturing, then you are going to be pretty screwed.


> If shit hits the fan and you have zero semi-manufacturing, then you are going to be pretty screwed.

I don't really understand this claim at all. Chips are not exactly fungible; unless you force your local companies to use your "state-sponsored chips" in their products, just being able to produce "chips" wouldn't be that useful. What are you going to do with them?


Guide munitions if needed.


So the cost of building a fab hasn't come down in the last decades, huh? Genuinely asking, is there some^W^W^W what is the "uncompressible" cost in fab-fabbing? I'd totally guess that staff and the building itself are not it?


What confuses me is that there seems to be a bathtub curve on the fab market.

We've got the race to 7-5-3-2nm for high-end parts, and then a couple generations back on support chips and lower-performance cheap parts.

Then you've got the dead zone. Is there a meaningful active use case for 1-micron processes anymore?

But then you get to perennial embedded-market stuff (Z80, 65C02, 80186) originally built on 40-year-old processes. And going even further, you have stuff that's a handful of transistors on a die that was probably drawn with markers for the original lithography; I'm not sure we'd even use numbers to describe that process. How does that stuff get made anymore? It can't all be draining down old stock.

I can't imagine it's cost effective to use modern-fab capacity to manufacture Z80s, 555s, or 74LS04s at their original die sizes. And if you shrunk them to put a million on a wafer, even assuming that's feasible, you'd change their performance characteristics in ways that might break long-established specs and contracts.
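
Just to put rough numbers on the die-shrink point, a quick dies-per-wafer sketch (the die areas are assumed for illustration, not datasheet values):

  import math

  # Rough dies-per-wafer estimate, ignoring edge loss and scribe lanes.
  # Die areas are assumptions for illustration only.
  wafer_area_mm2 = math.pi * (300 / 2) ** 2  # 300 mm wafer, ~70,700 mm^2

  for name, die_area_mm2 in [("original-size Z80 (assumed ~20 mm^2)", 20.0),
                             ("aggressively shrunk logic (assumed ~0.07 mm^2)", 0.07)]:
      print(f"{name}: ~{int(wafer_area_mm2 / die_area_mm2):,} dies per wafer")

  # Roughly 3,500 dies at the original size versus about a million after a huge
  # shrink, but a shrink like that changes voltages, drive strength, and timing,
  # which is exactly what breaks old specs and certifications.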


Not all ventures need to be profitable. The EU may decide to take a loss on this solely for strategic reasons.


> strategic reasons

Such as? I can't really think of any benefit besides providing jobs and funding for contractors (so kickbacks etc.)

Then again it's not particularly surprising; the EU is well known for wasting massive amounts of money on all sorts of nonsense while ignoring things that actually matter.


There's both supply-chain and runtime security.


Don't forget the MBAs willing to burn it all down to juice the Q2 profits.


Have you looked at a Pentagon budget lately? It's entirely welfare for defense contractors.


> Have you looked at a Pentagon budget lately?

No, but I trust you. However, the US tech industry is doing just fine without too much government funding and intervention, so it's not such a huge issue (in this specific area of course, not in general).


Sounds like there is a need for investment in innovation beyond just building the next-generation fab for $2^x billion. Bringing the cost of a new less-advanced fab down from $2 billion to $100 million, and then building 20 of them, could also be profitable (though less exciting). There is a national economy that's actually been growing quite well for a few decades now by applying that general idea to other industries.


But if you were a country or an alliance that wanted to be 1000% sure you always had access to a component (drone parts) you might be willing to pony up billions to make sure you could not be blockaded or embargoed. I don't know if that makes sense but given what is going on in Ukraine and the Mid East, people have to be thinking about that.


How does an entire semiconductor factory become FUBAR from being offline for a year?


The example is hypothetical, but complex machines can be complex to keep running, and often suffer catastrophically when shut down.

If the fab was barely profitable before shutting down, it doesn't take much to total it. Fabs are full of machines that cost tens of millions of dollars when they were new, and there are simply no spare parts or vendor support for them now, and you can't just swap in a modern replacement. Fabs are full of extremely sensitive environments (no dust here, acid that will kill you if you touch it there, constant temperatures, no humidity, etc). If any of that is compromised, it's now just a toxic waste dump.

Again, I have no specific knowledge in this domain, but I imagine most of the time the owner's happy enough just to walk away from the headache.


There's also the brain drain aspect. All the process engineers and techs that understood all the various "recipes", quirks, etc, of the various machines moved on to other work.

A new crew will eventually work it out, but there's a lot of trial and error getting to the right bake times/temps, spin RPMs, etc. Yield suffers and rework piles up while they do that.


Dust is the simplest example.

Once you shut off the dust extraction, you may end up with so much dust collected in the equipment that it's rendered utterly useless.


Not an expert, but there are additional start-up costs that have to be paid to bring a fab back online. With any significant downtime, those could eat up any possible profit unless it's a newest-technology fab.



