lotide

China scientists develop flash memory 10,000× faster than current tech

476 likes

Submitted 3 weeks ago by Ninjazzon@infosec.pub to technology@lemmy.world

https://interestingengineering.com/innovation/china-worlds-fastest-flash-memory-device


Comments

  • CouncilOfFriends@slrpnk.net ⁨3⁩ ⁨weeks⁩ ago

    By tuning the “Gaussian length” of the channel, the team achieved two‑dimensional super‑injection, which is an effectively limitless charge surge into the storage layer that bypasses the classical injection bottleneck.

    • the_tab_key@lemmy.world ⁨3⁩ ⁨weeks⁩ ago

      They’re just copying the description of the turbo encabulator.

    • psoul@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

      They finally stole the French édriseur technology I see

  • boonhet@lemm.ee ⁨3⁩ ⁨weeks⁩ ago

    AI AI AI AI

    Yawn

    • Zip2@feddit.uk ⁨3⁩ ⁨weeks⁩ ago

      normal person’s server.

      I’m pretty sure I speak for the majority of normal people, but we don’t have servers.

      • peteyestee@feddit.org ⁨3⁩ ⁨weeks⁩ ago

        Ikr…Dude thinks we’re restaurants or something.

      • umbraroze@slrpnk.net ⁨3⁩ ⁨weeks⁩ ago

        Yeah, when you’re a technology enthusiast, it’s easy to forget that your average user doesn’t have a home server - perhaps they just have a NAS or two.

        (Kidding aside, I wish more people had NAS boxes. It’s pretty disheartening to help someone find old media and have them show you a giant box of USB sticks and hard drives. On a good day. I do have a USB floppy drive and a DVD drive just in case.)

      • notabot@lemm.ee ⁨3⁩ ⁨weeks⁩ ago

        You… you don’t? Surely there’s some mistake, have you checked down the back of your cupboard? Sometimes they fall down there. Where else do you keep your internet?

        Apologies, I’m tired and that made more sense in my head.

      • fmstrat@lemmy.nowsci.com ⁨2⁩ ⁨weeks⁩ ago

        “Normal person” is a modifier of “server”. It does not state any expectation that every normal person has a server. Instead, it sets the expectation that they are talking about servers owned by normal people. I have a server. I am norm… crap.

    • gravitas_deficiency@sh.itjust.works ⁨3⁩ ⁨weeks⁩ ago

      You can get a Coral TPU for 40 bucks or so.

      You can get an AMD APU with a NN-inference-optimized tile for under 200.

      Training can be done with any relatively modern GPU, with varying efficiency and capacity depending on how much you want to spend.

      What price point are you trying to hit?

      • boonhet@lemm.ee ⁨3⁩ ⁨weeks⁩ ago

        What price point are you trying to hit?

        With regards to AI? None, tbh.

        With this super fast storage I have other cool ideas but I don’t think I can get enough bandwidth to saturate it.

      • WorldsDumbestMan@lemmy.today ⁨2⁩ ⁨weeks⁩ ago

        I just use pre-made AIs and write some detailed instructions for them, and then watch them churn out basic documents over hours… I need a better laptop.

  • minoscopede@lemmy.world ⁨3⁩ ⁨weeks⁩ ago

    Link to the actual paper: www.nature.com/articles/s41586-025-08839-w

    • primemagnus@lemmy.ca ⁨3⁩ ⁨weeks⁩ ago

      Damn. I just pulled all my stock out of quantum computing and threw it all into this…

      • anonApril2025@lemmy.zip ⁨3⁩ ⁨weeks⁩ ago

        Easy when you have zero

    • Jolteon@lemmy.zip ⁨2⁩ ⁨weeks⁩ ago

      Speaking of, did you hear there’s a new room-temperature superconductor?

  • conditional_soup@lemm.ee ⁨2⁩ ⁨weeks⁩ ago

    This article appeared in my feed just above another article about how China has the world’s first operational thorium reactor. Meanwhile, the US is about to fight a civil war over whether vaccination causes measles, and is stripping away the last of our social programs in order to get our wealthiest people another 2% subsidy.

    • SoftestSapphic@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

      China and Russia worked very hard to get these rich stupid people in power.

      It really started in 2016 when US security agencies released a joint report showing Russia was spreading misinformation to help Trump win the election.

      Surprisingly, the “liberal tears compilations” and “something about an email server” (which people didn’t understand, and which wasn’t actually illegal) actually worked and drowned out the warnings from our security agencies.

      I don’t think China will be any better of a world leader tbh.

      I see humanity’s future as a boot stepping on a human face forever, unless humanity globally rejects kings, oligarchs, and dictators.

      • Netux@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

        Don’t forget the genius DNC folk, including HRC, thought a pied-piper strategy of boosting the circus peanut was a good idea.

        If the Russians and Chinese did anything, it was just capitalizing on an unforced error born of centrist hubris. Once again, Bernie would have won, but that was more distasteful to the ruling class than fascism.

      • WorldsDumbestMan@lemmy.today ⁨2⁩ ⁨weeks⁩ ago

        Not my future, I will try to die in a way that even an omnipotent AI can’t bring back.

      • eleitl@lemm.ee ⁨2⁩ ⁨weeks⁩ ago

        You rely on professional fabrications of misinformation to tell you the truth about who is producing misinformation? Don’t fall for crude propaganda. When empires end they do some self-destructive things. It’s normal.

      • Knock_Knock_Lemmy_In@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

        It really started in 2016 when US security agencies released a joint report showing Russia was spreading misinformation to help Trump win the election.

        Compare Russia to the British and consider who is the bigger villain.

    • MedicPigBabySaver@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

      Fuck the idiotic Americans that won’t bother to immunize, never mind understanding science as a whole.

      • morriscox@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

        No. We don’t want them to breed…

  • MuskyMelon@lemmy.world ⁨3⁩ ⁨weeks⁩ ago

    Too bad the US can’t import any of it.

    • umbrella@lemmy.ml ⁨3⁩ ⁨weeks⁩ ago

      they can if they pay 6382538% tariffs.

      or was it 29403696%?

      • errer@lemmy.world ⁨3⁩ ⁨weeks⁩ ago

        “These chips are 10,000 times faster, therefore we will increase our tariffs to 10,100%!”

      • jaxxed@lemmy.ml ⁨2⁩ ⁨weeks⁩ ago

        That was yesterday. It doubled since then IIRC

  • InFerNo@lemmy.ml ⁨2⁩ ⁨weeks⁩ ago

    China scientists

    So, Chinese scientists?

    • gwilikers@lemmy.ml ⁨2⁩ ⁨weeks⁩ ago

      Chientists

      • FourWaveforms@lemm.ee ⁨2⁩ ⁨weeks⁩ ago

        (acts confused in French)

    • Mooseford@lemmy.today ⁨2⁩ ⁨weeks⁩ ago

      No they are people who study the China Science.

    • realitista@lemm.ee ⁨2⁩ ⁨weeks⁩ ago

      I think it’s a slightly different connotation. “China scientists” implies scientists residing in China without presuming their ethnicity, while “Chinese scientists” implies their ethnicity but not their location.

      • essteeyou@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

        You literally never hear “America scientists” even if some of them might be from another country. Same with every single other country I can think of, except China.

    • pycorax@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

      Probably because “Chinese” is both an ethnicity and a nationality. There are ethnic Chinese people all over the world, and a few countries and regions have ethnic-Chinese majorities but are not related to China. Calling them the same thing is playing into the PRC’s “all ethnic Chinese pledge their allegiance to China” nonsense.

      • Netux@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

        Israel likes that game of pretend. They believe anti-Zionist Jews are traitors.

      • saimen@feddit.org ⁨2⁩ ⁨weeks⁩ ago

        Isn’t that true for every (older) country though?

      • pHr34kY@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

        It’s a reasonable assumption that someone in China is Chinese.

    • liquidparasyte@pawb.social ⁨2⁩ ⁨weeks⁩ ago

      Real talk, why is discussion around people and subjects in China so fucking weird?

      Whether it’s referring to the entire population when something only applies to the government or a subset of it, using a global “the Chinese”, or doing silly shit like “China scientists”, everyone’s grammatical skills suddenly tank when broaching any topic even tangential to the PRC.

    • Lemminary@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

      Me stutter? No think so!

      • muhyb@programming.dev ⁨2⁩ ⁨weeks⁩ ago

        Him legend.

    • Etienne_Dahu@jlai.lu ⁨2⁩ ⁨weeks⁩ ago

      No, it’s people who study fine tableware.

    • DasSkelett@discuss.tchncs.de ⁨2⁩ ⁨weeks⁩ ago

      Seriously, to me a “China scientist” is someone doing research on China, the way a space scientist does research on space. But I’m not a native English speaker, so, idk

      • simplejack@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

        Someone doing research on China is a chiologist.

        Same as someone doing research on biology is a biologist.

      • realitista@lemm.ee ⁨2⁩ ⁨weeks⁩ ago

        The wording of the headline would be different if it were trying to convey that.

  • AI_toothbrush@lemmy.zip ⁨3⁩ ⁨weeks⁩ ago

    Brother, have you heard of buses? Even INSIDE CPUs/SoCs, bus speeds are a limitation. Also, I fucking hate how the first thing people mention now is how AI could benefit from a jump in computing power.

    • xthexder@l.sw0.com ⁨2⁩ ⁨weeks⁩ ago

      That’s pretty much my understanding. Most of the advancements in memory speeds have come from the physical proximity of the memory and more efficient transmission/decoding.

      GDDR7 chips for example are packed as close as physically possible to the GPU die, and have insane read speeds of 28 Gbps/pin (and a 5090 has a 512-bit bus). Most of the limitation is the connection between GPU and RAM, so speeding up the chips internally 1000x won’t have a noticeable impact without also improving the memory bus.
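      The quoted figures make for a quick back-of-envelope check. This sketch just multiplies the per-pin rate by the bus width from the comment above (it assumes those two numbers; it is not a measured value):

      ```python
      # Rough GDDR7 aggregate bandwidth from the figures quoted above:
      # 28 Gbps per pin across a 512-bit bus, divided by 8 bits per byte.
      pin_rate_gbps = 28       # gigabits per second, per pin
      bus_width_bits = 512     # width of the memory bus in bits (pins)

      total_gigabits = pin_rate_gbps * bus_width_bits  # aggregate gigabits/s
      bandwidth_gbytes = total_gigabits / 8            # gigabytes/s

      print(f"{bandwidth_gbytes:.0f} GB/s")  # → 1792 GB/s, i.e. ~1.8 TB/s
      ```

      That ~1.8 TB/s ceiling is why making the cells internally faster doesn’t help once the bus is saturated.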

  • phoenixz@lemmy.ca ⁨3⁩ ⁨weeks⁩ ago

    Clickbait article with some half-truths. A discovery was made, but it has little to do with AI, and real-world applications will be much, MUCH more limited than what’s being talked about here. It will also likely still take years to come out.

    • Emmie@lemm.ee ⁨3⁩ ⁨weeks⁩ ago

      The key word is China, let us not kid ourselves

      • phoenixz@lemmy.ca ⁨2⁩ ⁨weeks⁩ ago

        Heh?

  • 800XL@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    You just fucking wait. Trump is bringing manufacturing to the US. And when that plant opens someday you’ll be so sorry you doubted.

    • BobSentMe@lemm.ee ⁨2⁩ ⁨weeks⁩ ago

      I’m sure the Foxconn plant in Wisconsin will fire up ANY DAY NOW! drums fingers

      • 800XL@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

        I talked to like 50 people today and all of the people said they were starting manufacturing plants tomorrow and they’ll be fully functional Tuesday around 3:15.

        I started mine earlier and I’ve already done manufacturing 3 times today. It’s really easy. By this time tomorrow I’ll have a couple more and they’ll all be winning manufacturing.

        Tariffs gave me the ability to finally believe in myself. Tariffs have increased my stamina in bed, given me a full head of hair again, and since I started my manufacturing plant yesterday I’ve dropped 50 pounds.

  • MTK@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    Yeah… At best click baity as fuck, at worst a complete scam.

    Any time there is a 10x or more in a headline you are 10x or more likely to be right by calling it BS.

  • KulunkelBoom@lemm.ee ⁨2⁩ ⁨weeks⁩ ago

    Yeah, but endurance. and accuracy. and longevity. How about those?

    • AceBonobo@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

      And price, and maybe writing more than a single bit.

  • WorldsDumbestMan@lemmy.today ⁨2⁩ ⁨weeks⁩ ago

    Whenever they say X-times whatever, I doubt it right away, because they always interpret the statistics in the dumbest ways possible. You have a solar panel that is 28% efficient. There is no way it can be 20x as efficient; that’s just clickbait.
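    The sanity check behind that point is simple: an efficiency can’t exceed 100%, which caps any plausible multiplier. A minimal sketch, using the 28% figure from the comment:

    ```python
    def max_efficiency_gain(current_efficiency: float) -> float:
        """Largest possible improvement factor for an efficiency figure,
        since efficiency is capped at 100% (i.e. 1.0)."""
        return 1.0 / current_efficiency

    # A 28%-efficient panel can improve at most ~3.6x before hitting the
    # 100% ceiling, so a claimed "20x more efficient" is physically impossible.
    print(round(max_efficiency_gain(0.28), 2))  # → 3.57
    ```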

    • amon@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

      trustworthiness = 1/(claimed improvement)

  • muntedcrocodile@lemm.ee ⁨3⁩ ⁨weeks⁩ ago

    Is that fast enough to put an LLM in swap and have decent performance?

    • jj4211@lemmy.world ⁨3⁩ ⁨weeks⁩ ago

      Note that this, in theory, speaks to the performance of a non-volatile memory. It does not speak to cost.

      We already have faster-than-NAND non-volatile storage in phase-change memory. It failed due to expense.

      If this thing is significantly more expensive than even RAM, it may fail even if it is everything it claims to be. If it is at least as cheap as RAM, it’ll be huge, since it is faster than RAM and non-volatile.

      Using it as swap is indicated by cost, not by its non-volatile characteristics.
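      The cost logic can be made concrete with a tiny sketch. The dollar figures below are purely hypothetical illustrations (not from the article or real price lists); only the decision rule matters:

      ```python
      def viable_as_swap(cost_per_gb: float, dram_cost_per_gb: float) -> bool:
          """Swap is a capacity tier: it only makes sense if the medium is
          cheaper than simply buying more RAM of the same size."""
          return cost_per_gb < dram_cost_per_gb

      # Hypothetical $/GB values, purely for illustration:
      dram = 3.00   # assumed DRAM price
      nand = 0.05   # assumed NAND SSD price -- cheap, so swap makes sense
      pcm  = 6.00   # assumed phase-change price -- fast, but priced itself out

      print(viable_as_swap(nand, dram))  # → True
      print(viable_as_swap(pcm, dram))   # → False
      ```

      This is why a new non-volatile memory priced above RAM would not be used as swap no matter how fast it is.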

  • tetris11@lemmy.ml ⁨3⁩ ⁨weeks⁩ ago

    Wow, finally graphene has been cracked. Exciting times for portable low-energy computing

  • LodeMike@lemmy.today ⁨2⁩ ⁨weeks⁩ ago

    Not possible.

    • WereCat@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

      Why? If they looked at how current tech works then they could easily develop the same tech 10000x faster

      • FourWaveforms@lemm.ee ⁨2⁩ ⁨weeks⁩ ago

        How

    • LuckyPierre@lemm.ee ⁨2⁩ ⁨weeks⁩ ago

      No? Oh, that’s a shame. I was hoping for some improvement in the world, but a random person on the internet said it wasn’t possible without giving any reasons at all. Oh well.

      • LodeMike@lemmy.today ⁨2⁩ ⁨weeks⁩ ago

        No, it’s literally impossible without bypassing the speed of light and/or the size of atoms.

  • fullsquare@awful.systems ⁨3⁩ ⁨weeks⁩ ago

    This sounds like that material would be more useful in high performance radars, not as flash memory

    • CosmoNova@lemmy.world ⁨3⁩ ⁨weeks⁩ ago

      It’s likely BS anyway. Maybe it’s just me, but reading about another crazy breakthrough from China every single day during this trade war smells fishy. I’ve seen the exact same propaganda strategy during the pandemic, when relations between China and the rest of the world weren’t exactly the best. A lot of those headlines coming from there are just claims about flashy topics with very little substance or second-guessing.

      • LadyAutumn@lemmy.blahaj.zone ⁨3⁩ ⁨weeks⁩ ago

        It’s definitely possible they’re amplifying these developments to maintain confidence in the Chinese market, but I doubt they’re outright lying about the discoveries. I think it’s also likely that some of what they’ve been talking about has been in development for a while and that China is choosing now to make big reveals about them.

  • altphoto@lemmy.today ⁨3⁩ ⁨weeks⁩ ago

    Hopefully they can use it against covid.

  • GoodOleAmerika@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    It’s like Temu. 100x discount.

  • bassomitron@lemmy.world ⁨3⁩ ⁨weeks⁩ ago

    Does flash, like solid-state drives, have the same lifespan in terms of writes? If so, it feels like this would most certainly not be useful for AI, as that use case would involve doing billions/trillions of writes in a very short span of time.

    • schema@lemmy.world ⁨3⁩ ⁨weeks⁩ ago

      I don’t think it would make much difference if it lasted longer. I could be wrong, but AFAIK, running the actual transformer for AI is done in VRAM, and staging and preprocessing are done in RAM. Anything else wouldn’t really make sense speed- and bandwidth-wise.

      • bassomitron@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

        Oh I agree, but the speeds in the article are much faster than any current volatile memory. So it could theoretically be used to vastly expand memory availability for accelerators/TPUs/etc for their onboard memory.

        I guess if they can replicate these speeds in volatile memory and increase the buses to handle it, then they’d be really onto something here for numerous use cases.
