
Can you think of any now?

1797 likes

Submitted 1 month ago by LadyButterfly@piefed.blahaj.zone to science_memes@mander.xyz

https://piefed.cdn.blahaj.zone/posts/2l/Vt/2lVtS7OeYhBiPfn.jpg


Comments

  • deaf_fish@midwest.social 1 month ago

    Your work improves the lives of others more than it will improve your own. Which others is determined by politics. Best to spread the improvement around so you can get more of it back from more people.
  • hayashifty@lemmy.world 1 month ago

    1992: Bumblebees defy aerodynamics!
  • degoogler@lemmy.zip 1 month ago

    In an atom, the electrons orbit around the nucleus in the same manner as the planets orbit around the sun.

    That’s been debunked for many decades, but middle school still teaches this model. At least I wasn’t told back then how misleading and wrong it is; only in high school, right before graduation, did the physics teacher emphasize this misconception. I remember how mad she was about it lol. I have no clue how it’s taught elsewhere.
    • Adalast@lemmy.world 1 month ago

      The Bohr model is at least a useful simplification of the atomic structure. What needs to be taught is that everything you learn before college and intensive, narrowly topical courses is simplified to the point of being incorrect, in the hope that you gain enough of an intrinsic understanding of the concept that the less simplified explanation you get next will make sense. I say this because that one will still be simplified to the point of being wrong, but it will be a step closer to the truth. This is the essence of education.

      Elementary/middle school: ice is water that has frozen solid.
      HS: ice is water that has lost enough energy that the molecules form a crystalline lattice.
      College: there are actually 19 or 20 kinds of water ice that have been verified, but as many as 74,963 might exist.
      Post-collegiate: there may be 74,963 kinds of ice, but I know one ICE we should definitely eliminate from this world.
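The "wrong but useful" status of the Bohr model can be made concrete: despite being superseded by quantum mechanics, it still predicts hydrogen's energy levels correctly. A minimal sketch in plain Python, assuming only the 13.6 eV Rydberg energy (the function name is ours, for illustration):

```python
# Bohr model of hydrogen: the energy of the n-th orbit is E_n = -13.6 eV / n^2.
RYDBERG_EV = 13.6  # ionization energy of hydrogen, in electronvolts

def bohr_energy(n: int) -> float:
    """Energy of the n-th Bohr orbit of hydrogen, in eV (negative = bound)."""
    return -RYDBERG_EV / n**2

for n in range(1, 4):
    print(f"n={n}: {bohr_energy(n):+.2f} eV")
# n=1: -13.60 eV, n=2: -3.40 eV, n=3: -1.51 eV
```

The levels match spectroscopy even though the "planets around a sun" picture behind them is wrong, which is exactly the pedagogical point above.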
  • Soktopraegaeawayok@lemmy.world 1 month ago
    [deleted]
    • SpongyAneurysm@feddit.org 1 month ago

      So straight up timeless facts only?
  • missfrizzle@discuss.tchncs.de 1 month ago

    I was taught that serious academics favored Support Vector Machines over Neural Networks, which industry only loved because they didn’t have proper education.

    oops…
    • bluemellophone@lemmy.world 1 month ago

      Before LeNet and AlexNet, SVMs were the best algorithms around. People used HOG+SVM, SIFT, SURF, ORB, older Haar / Viola-Jones features, template matching, random forests, Hough transforms, sliding windows, deformable parts models… so many techniques that were made obsolete once the first deep networks became viable.

      The problem is that your schooling was correct at the time, but the march of research progress eventually saw 1) the creation of large, million-scale supervised datasets (ImageNet) and 2) larger / faster GPUs with more on-card memory.
      • missfrizzle@discuss.tchncs.de 1 month ago

        HOG and Hough transforms bring me back. honestly glad that I don’t have to mess with them anymore though.

        I always found SVMs a little shady because you had to pick a kernel. we spent time talking about the different kernels you could pick, but they were all pretty small and/or contrived. I guess with NNs you pick the architecture/activation functions, but there didn’t seem to be an analogue in SVM land for “stack more layers and fatten the embeddings.” though I was only an undergrad.

        do you really think NNs won purely because of large datasets and GPU acceleration? I feel like those could have applied to SVMs too. I thought the real win was solving vanishing gradients with ReLU and expanding the number of layers, rather than throwing everything into a 3- or 5-layer MLP, preventing overfitting, making the gradient landscape less prone to bad local minima, and enabling hierarchical feature extraction to be learned organically.
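The vanishing-gradient point can be sketched in a few lines of plain Python (a toy illustration, not anyone's actual training code): the sigmoid's derivative never exceeds 0.25, so backprop through a deep chain of sigmoid layers multiplies the gradient toward zero, while a chain of active ReLU units passes it through at full strength.

```python
import math

def sigmoid_grad(x: float) -> float:
    """Derivative of the logistic sigmoid: s(x) * (1 - s(x)), at most 0.25."""
    s = 1.0 / (1.0 + math.exp(-x))
    return s * (1.0 - s)

def relu_grad(x: float) -> float:
    """Derivative of ReLU: 1 for positive inputs, 0 otherwise."""
    return 1.0 if x > 0 else 0.0

# Backprop multiplies one local derivative per layer. With sigmoids the
# factor is at most 0.25, so a 20-layer chain shrinks the gradient by at
# least 0.25**20 (about 9e-13); a chain of active ReLUs keeps it at 1.0.
depth = 20
sigmoid_chain = 1.0
relu_chain = 1.0
for _ in range(depth):
    sigmoid_chain *= sigmoid_grad(0.0)  # 0.25, the sigmoid's maximum slope
    relu_chain *= relu_grad(1.0)        # 1.0 for an active ReLU unit

print(f"sigmoid chain: {sigmoid_chain:.3e}")  # vanishingly small
print(f"relu chain: {relu_chain}")            # stays 1.0
```

This is the best case for the sigmoid (maximum slope at every layer); away from zero its derivative is even smaller, so real deep sigmoid nets vanished faster still.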
  • Sam_Bass@lemmy.world 1 month ago

    I didn’t graduate high school though
  • TacoButtPlug@sh.itjust.works 1 month ago

    I did too many drugs in high school. I don’t remember a lot.
  • dethedrus@lemmy.dbzer0.com 1 month ago

    Or history that was not covered…
  • Etterra@discuss.online 1 month ago

    People believe enough random bullshit to tickle their memories with their classics list.
  • P1k1e@lemmy.world 1 month ago

    Nero linguistics programming
    • exasperation@lemmy.dbzer0.com 1 month ago

      Nero

      The dude who fiddled as Rome burned?
  • Randomgal@lemmy.ca 1 month ago

    That website is called ChatGPT lmao
    • Speiser0@feddit.org 1 month ago

      No. It is called duckduckgo.com.
  • homura1650@lemmy.world 1 month ago

    China is the most populous country.
    • the_crotch@sh.itjust.works 1 month ago

      tbf when I was in school that was true
      • Jarix@lemmy.world 1 month ago

        That is what the post is about. Not just facts that were always wrong, but ones that are no longer true
  • Zerush@lemmy.ml 1 month ago

    The reality is that you create a website with Google and it automatically fills out your complete curriculum vitae from birth to now.