How we get to 1 nanometer chips and beyond
Submitted 2 days ago by AfterNova@lemmy.world [bot] to technology@lemmy.world
https://research.ibm.com/blog/1nm-chips-vtfet-ruthenium
Comments
urshilikai@lemmy.world 2 days ago
can we please socially murder the sales team that rebranded the unit in nodes from something physically meaningful to a random countdown detached from reality? (a "1 nm" node has no bearing on the critical dimension or size of the circuits)
jj4211@lemmy.world 1 day ago
To be fair, the industry spent decades quoting an actual distance, so when they started building features that merely had equivalent effects, the easiest way for people to understand was to quote something akin to an equivalent size.
Of course, then we have things like Intel releasing their "10 nm" process, and after TSMC's 7 nm process was doing well and Intel's fabs hit some bumps, declaring their 10 to be more like a 7 after all… it's firmly a marketing number at this point.
The problem is that no one is suggesting a more objective measure.
certified_expert@lemmy.world 1 day ago
Eli12?
JohnEdwa@sopuli.xyz 1 day ago
Open any Wikipedia article about an “x nm process” and one of the first paragraphs will be something like this:
The term “2 nanometer”, or alternatively “20 angstrom” (a term used by Intel), has no relation to any actual physical feature (such as gate length, metal pitch or gate pitch) of the transistors. According to the projections contained in the 2021 update of the International Roadmap for Devices and Systems published by the Institute of Electrical and Electronics Engineers (IEEE), a “2.1 nm node range label” is expected to have a contacted gate pitch of 45 nanometers and a tightest metal pitch of 20 nanometers.[1]
It used to be that the “60nm process” was called that simply because the transistor gate was 60nm.
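To put numbers on how detached the label is, here is a trivial calculation using nothing but the IRDS figures quoted above (the quoted pitches divided by the node label):

```python
# Node label vs. actual physical pitches, using the IRDS 2021 figures
# quoted above (all values in nanometers).
node_label_nm = 2.1
contacted_gate_pitch_nm = 45.0
tightest_metal_pitch_nm = 20.0

print(contacted_gate_pitch_nm / node_label_nm)   # ~21x the label
print(tightest_metal_pitch_nm / node_label_nm)   # ~9.5x the label
```

So even the tightest real feature on a "2.1 nm" node is nearly ten times larger than the name suggests.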
SaveTheTuaHawk@lemmy.ca 1 day ago
The best thing you can do to understand this is watch the latest Veritasium video about ASML.
ASML figured out how to make ultraviolet light very close to X-Ray wavelength using some incredible physics and engineering.
lemmydividebyzero@reddthat.com 1 day ago
We should call it nm…
Calling it nanometers does not make sense.
umbrella@lemmy.ml 23 hours ago
let’s focus on letting people have what we can already make.
what’s the point of -100nm chips if only ai techbros have them?
Buffalox@lemmy.world 1 day ago
Well, AFAIK the smallest usable atom is about 120 picometers across, and the smallest number of atoms you could theoretically make a transistor from is 3, so there is (probably) no way to go below 360 picometers. There is probably also no way to actually achieve 360 picometers, which is the same as 0.36 nanometers.
So the idea that they are currently going below 2 nm is of course untrue, but IDK what the real measure is?
What they are doing at the leading chip manufacturing factories is amazing, so amazing it’s kind of insane. But it’s not 2nm.
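A back-of-the-envelope version of that floor, taking the numbers above at face value (the 120 pm atom size and the three-atom minimum are rough assumptions, not measured values):

```python
# Rough arithmetic only; both inputs are assumptions, not measured values.
atom_diameter_pm = 120        # assumed size of the smallest usable atom
min_atoms_per_transistor = 3  # assumed minimum number of atoms per transistor

floor_pm = atom_diameter_pm * min_atoms_per_transistor
print(floor_pm, "pm =", floor_pm / 1000, "nm")  # 360 pm = 0.36 nm
```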
jj4211@lemmy.world 1 day ago
For a while now the “nm” has been a bit of a marketing label aiming for what the size would be if you extrapolated the way things used to scale out to today. The industry spent so long measuring a real dimension that when the measurement broke down, they just kind of had to fudge it to keep the basis of comparison going, for lack of a better idea. If we had some fully volumetric approach, building these things up equally in three dimensions, we’d probably have a “less than 100 pm” process easily, despite that being absurd.
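A rough sketch of that extrapolation idea (the scaling rule and the reference numbers are illustrative assumptions, not any industry formula): if density had kept scaling as the inverse square of the label, you could back out an "equivalent node" from transistor density alone.

```python
import math

def equivalent_node_nm(density_mtr_per_mm2: float,
                       ref_node_nm: float = 16.0,
                       ref_density_mtr_per_mm2: float = 29.0) -> float:
    """Back out a node label from transistor density, assuming density
    scales as 1 / label^2. The reference values are rough, illustrative
    figures for a "16 nm"-class process, not vendor-official numbers."""
    return ref_node_nm * math.sqrt(ref_density_mtr_per_mm2 / density_mtr_per_mm2)

# A hypothetical process at ~300 million transistors per mm^2 would come out
# around a "5 nm" label under this (purely illustrative) extrapolation.
print(round(equivalent_node_nm(300.0), 1))
```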
Buffalox@lemmy.world 1 day ago
There is no way less than 100 pm can make sense.
chonomaiwokurae@sopuli.xyz 23 hours ago
The whole idea of somehow representing different nodes and their development with one number is a bit silly. That being said, it looks like future channel materials could be 0.7 nm in thickness (monolayer WX2).
Hadriscus@jlai.lu 1 day ago
That’s super cool. I’m asking as a total layman: what’s preventing the use of subatomic particles as transistors?
Buffalox@lemmy.world 1 day ago
Well, in the future light may be possible; light is just photons, and photons can basically follow the same path in both directions simultaneously without colliding.
So, without in any way being an expert, I would think that if light can somehow be controlled precisely enough, that would be a way to go well below what any atom allows, even if the paths need to be directed by atoms.
But AFAIK there is not a practical working model for that yet, although research on it has been going on for decades.
spicehoarder@lemmy.zip 1 day ago
Heisenberg uncertainty is probably a big barrier.
theparadox@lemmy.world 1 day ago
Not an expert, but… typical computers do what they do by transmitting (primarily) electrical signals between components. Is there electricity or isn’t there? That’s the “bit” with two states: on or off, 1 or 0. Electricity is the flow of electrons between atoms. Basically, we take atoms that aren’t very attached to some of their electrons and manipulate them so that they pass those electrons along when we want them to. I don’t know if there is a way to conduct and process electrical signals without using an atom’s relationship with its electrons.
Quantum computing is the suspected next way to get to “better” computing. I don’t know much about the technical side of that, beyond that it uses quantum physics to expand the bit into something like a qubit, which exploits superposition (quantum particles existing in multiple states simultaneously until measured, like the Schrödinger’s cat metaphor) and entanglement (if two quantum particles’ states are related to or dependent on each other, determining the state of one particle also determines the state of the other) to transmit/process more than just a simple 1 or 0 per qubit. A lot more information can be transmitted and processed simultaneously with a more complex bit. As I understand it, progress in quantum computing has been very slow going.
That’s my shitty explanation. I’m sure someone will come along and correct my inaccurate simplification of how it all works and list all that I missed, like fiberoptic transmission of signals.
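If it helps make the "superposition until measured" part concrete, here is a toy numerical sketch (purely an illustration, nothing to do with how real quantum hardware works): a single qubit is just two complex amplitudes, and measuring it picks 0 or 1 with probabilities given by those amplitudes.

```python
import numpy as np

rng = np.random.default_rng(0)

# |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)   # an equal superposition
state = np.array([alpha, beta], dtype=complex)

def measure(psi: np.ndarray) -> int:
    """Collapse the state: return 0 or 1 with probability |amplitude|^2."""
    probs = np.abs(psi) ** 2
    return int(rng.choice([0, 1], p=probs))

# Repeated measurements of identically prepared qubits give ~50/50 outcomes.
samples = [measure(state) for _ in range(1000)]
print(sum(samples) / len(samples))   # roughly 0.5
```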