Comment on Scientists say quantum tech has reached its transistor moment
RedWeasel@lemmy.world 2 months ago
So, around 1947. It took about 14 years to get from the transistor to putting them into chips. So another decade and a half?
funkajunk@lemmy.world 2 months ago
Seeing as we now have a multitude of tools available to us that we didn’t have in 1947, I imagine it would be faster.
Gsus4@mander.xyz 2 months ago
And an already existing consumer base with expectations that were only for hobbyists before…maybe that’s a bad thing, because it will constrain QC to evolve in ways that it would be better to explore rather than try to fit modern use cases.
kutt@lemmy.world 2 months ago
I don’t think it will ever reach consumer households, since it requires extremely complex and expensive materials, tools, and physical conditions. A major breakthrough could change that, but it’s highly unlikely.
Also, we don’t really have a use for them, at least for regular users. They won’t replace classical computers.
But you can already access some QCs online; IBM has a paid remote API, for instance.
baggachipz@sh.itjust.works 2 months ago
requires extremely complex and expensive materials, tools and physical conditions.
Counterpoint: they said the same thing when a computer was made of vacuum tubes and took up an entire room to add two digits.
kutt@lemmy.world 2 months ago
Yeah, but you have to consider one other thing. Before we built classical computers, we had already theorized them; we had algorithms, etc. We knew why we were creating them.
For QC, the pace of hardware development is outrunning our ability to create algorithms. It’s very similar to what’s happening with the AI bubble currently: we’re investing heavily in a new technology because it looks cool to investors, but we don’t even have enough algorithms to run on it. It’s just a shit ton of marketing…
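To make the "not enough algorithms" point concrete: a quantum program operates on a statevector whose size doubles with every qubit added, which is also why classical simulation stops scaling. Here's a minimal sketch of the simplest interesting circuit (a two-qubit Bell state) simulated in plain Python — no quantum hardware or libraries involved, and the indexing convention (index = q1·2 + q0) is just my choice for illustration:

```python
import math

def h_on_q0(state):
    """Apply a Hadamard gate to qubit 0 of a 2-qubit statevector."""
    r = 1 / math.sqrt(2)
    out = state[:]
    for i0 in (0, 2):           # indices where qubit 0 is |0>
        i1 = i0 | 1             # paired index where qubit 0 is |1>
        a, b = state[i0], state[i1]
        out[i0], out[i1] = r * (a + b), r * (a - b)
    return out

def cnot_q0_q1(state):
    """CNOT: qubit 0 controls, qubit 1 is flipped when the control is |1>."""
    out = state[:]
    out[1], out[3] = state[3], state[1]
    return out

state = [1.0, 0.0, 0.0, 0.0]        # start in |00>
state = cnot_q0_q1(h_on_q0(state))  # Bell state (|00> + |11>) / sqrt(2)
probs = [a * a for a in state]
print(probs)  # ≈ [0.5, 0.0, 0.0, 0.5]: measure 00 or 11, each half the time
```

Two qubits need a length-4 list; 50 qubits would need about 10^15 amplitudes, which is the regime where simulation gives out and you'd actually want the hardware — but only if you have an algorithm worth running on it.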
baggachipz@sh.itjust.works 2 months ago
Yeah, understood. I was just saying: because it doesn’t seem technically possible now, don’t discount that it could be in the future. Whether it would be useful is another debate, and I have a hard time believing it has practical uses. If it does, though, the innovation will be as rapid as the shift to silicon transistors (assuming it’s even possible at all).
RedWeasel@lemmy.world 2 months ago
I can currently only see them being used as accelerators of some type. They could potentially be used for GPUs, but I suspect some form of general compute first. GenAI anyone? SkyNET? But that’s only if they can be made portable for laptops or phones, which is still a major issue needing to be addressed.
I don’t expect them to replace traditional chips in my lifetime, if ever.
a_non_monotonic_function@lemmy.world 2 months ago
Could see them used potentially for GPU
Like, used as GPUs, or like GPUs? The latter, certainly; the former, not so much. They aren’t a replacement for current tech: they accelerate completely different things (and right now they do nothing your average consumer would be interested in anyway).
kutt@lemmy.world 2 months ago
Yes, they will probably never replace them, because they’re actually slower than classical computers at simple calculations.
Quantum ML is actively being researched, though I’m not well informed about progress in that field specifically.
But the good news is that they don’t need to be portable; we can use them just as we do right now, via remote access!
photonic_sorcerer@lemmy.dbzer0.com 2 months ago
From the byline:
So pretty much, yeah.
Corkyskog@sh.itjust.works 2 months ago
Well, “years” could be 3 years or 300 years, so that doesn’t really confirm OP’s guess.
sorghum@sh.itjust.works 2 months ago
In this case it’s probably both until observed.