Dell says the quiet part out loud: Consumers don't actually care about AI PCs — "AI probably confuses them more than it helps them"

1055 likes

Submitted 1 day ago by throws_lemy@reddthat.com to technology@lemmy.world

https://www.windowscentral.com/artificial-intelligence/dell-says-the-quiet-part-out-loud-consumers-dont-actually-care-about-ai-pcs-ai-probably-confuses-them-more-than-it-helps-them


Comments

  • TheBat@lemmy.world 1 day ago

    Image

    Stolen from BSKY

    • Holytimes@sh.itjust.works 1 day ago

      Weirdly, Dell always seems to understand what normal users want.

      The problem is normal users have beyond-low expectations, no standards, and are ignorant of most everything tech related.

      They want cheap, easy-to-use computers that require no service, and if there is a problem, a simple phone number to call for help.

      Dell has optimized for that. So hate 'em or not, while their goods have gone to shit quality-wise, they understand their market and have done extremely well in servicing it.

      Thus I am not surprised at all that Dell understood this. If anything, I would have been more surprised if they didn't.

      • artyom@piefed.social 1 day ago

        I think they all understand what we want (broadly), they just don’t care, because what they want is more important, and they know consumers will tolerate it.

      • Zagorath@quokk.au 1 day ago

        What companies actually make decent mid-range laptops these days?

      • anomnom@sh.itjust.works 1 day ago

        And yet just before looking at Lemmy I got an ad for the Dell AI laptop on YouTube (on my TV; I still need to get a Pi-hole up and running).

    • lemmy_get_my_coat@lemmy.world 1 day ago

      That is gold

    • edgemaster72@lemmy.world 22 hours ago

      This is extra funny to me since I just re-watched this episode the other day

  • Clent@lemmy.dbzer0.com 12 hours ago

    What a trash clickbait headline. That's not how the expression "saying the quiet part out loud" works. This isn't a secret, it's not unspoken, and it certainly doesn't reveal some underlying motive.

  • ZILtoid1991@lemmy.world 3 hours ago
    > be me
    > installed VScode to test whether the language server is just unfriendly with KATE
    > get bombarded with "try our AI!" type BS
    > vomit.jpg
    > managed to test it, but the AI turns me off
    > immediately uninstalled this piece of glorified webpage from my ThinkPad

    It seems I'll be doing more of my work in KATE. (Does the LSP plugin for KATE handle stuff differently from the standard in some known way?)

  • bytepursuits@programming.dev 9 hours ago

    They said they're still adding all of it. They are adding AI, just not talking about it. Which is probably correct 😂

  • mctoasterson@reddthat.com 1 day ago

    What people don’t want is blackbox AI agents installed system-wide that use the carrot of “integration and efficiency” to justify bulk data collection, that the end user implicitly agrees to by logging into the OS.

    God forbid people want the compute they are paying for to actually do what they want, and not work at cross purposes for the company and its various data sales clients.

    • Gsus4@mander.xyz 1 day ago

      Unveiling: the APU!!! (ad processing unit)

      • Samsy@lemmy.ml 1 day ago

        Just there to create ads based on your usage.

    • Aceticon@lemmy.dbzer0.com 1 day ago

      > God forbid people want the compute they are paying for to actually do what they want, and not work at cross purposes for the company and its various data sales clients.

      I think that way of thinking is still pretty niche.

      I hope it's becoming more widespread, but in my experience most people don't actually concern themselves with "my device does some stuff in the background that goes beyond what I want it for"; in their ignorance of technology, they just assume it's something that's necessary.

      I think where people have problems is mainly at the level of "this device is slower at doing what I want it to do than the older one" (i.e. AI makes it slower), "this device costs more than the other one without doing what I want it to do any better" (i.e. unwilling to pay more for AI), or "this device does what I want it to do worse than before/that one" (i.e. AI forced on users, actually making the experience of using that device worse).

  • EndlessNightmare@reddthat.com 19 hours ago

    I actually do care about AI PCs. I care in the sense that it is something I want to actively avoid.

  • stoy@lemmy.zip 1 day ago

    I'd much rather have a more powerful generic CPU than a less powerful generic CPU with an added NPU.

    There are very few people who would benefit from an added NPU. OK, I hear you say, what about local AI?

    Ok, what about it?

    Would you trust a commercial local AI tool to not be sharing data?

    Would your grandmother be able to install an open source AI tool?

    What about having enough RAM for the AI tool to run?

    Look at the average computer user, if you are on lemmy, chances are very high that you are far more advanced than the average computer user.

    I am talking about those users who don't run an adblocker, don't notice the YT ad skip button, and who in the past would have installed a minimum of five toolbars in IE, yet wouldn't have noticed the reduced view of the actual page.

    These people are closer to the average users than any of us.

    Why do they need local AI?

    • unexposedhazard@discuss.tchncs.de 1 day ago

      Just offer NPUs as PCIe extension cards. That's how computers used to be and should be: modular and versatile.

      • Meron35@lemmy.world 1 day ago

        These have already existed for half a decade.

        Google Coral is probably the most famous and is mainly suited for small IoT devices, e.g. speeding up image recognition for security cameras. They come in all shapes and sizes though.

        M.2 Accelerator A+E key | Coral - www.coral.ai/products/m2-accelerator-ae
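        A minimal sketch of what using one of these looks like, based on the pycoral examples (the model and image paths are placeholders, and a real Edge TPU plus its runtime are assumed):

        ```python
        # a rough sketch of Edge TPU image classification with pycoral; this is
        # what "speeding up image recognition" looks like on a Coral accelerator
        from pycoral.utils.edgetpu import make_interpreter
        from pycoral.adapters import common, classify
        from PIL import Image

        interpreter = make_interpreter("mobilenet_v2_edgetpu.tflite")  # placeholder model
        interpreter.allocate_tensors()

        image = Image.open("frame.jpg").resize(common.input_size(interpreter))
        common.set_input(interpreter, image)
        interpreter.invoke()  # runs on the Coral accelerator, not the CPU

        for c in classify.get_classes(interpreter, top_k=1):
            print(c.id, c.score)
        ```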

      • stoy@lemmy.zip 1 day ago

        Exactly!

        I could even see the cards having RAM slots, so you could add dedicated RAM to the NPU and remove the need to share RAM with the system.

    • tal@lemmy.today 1 day ago

      My understanding from a very brief skim of what Microsoft was doing with Copilot is that it constantly takes screenshots, runs image recognition on them, and then makes them searchable as text, with the ability to go back and view those screenshots in a timeline. Basically, adding more search without requiring application-level support.

      They may also have other things that they want to do, but that was at least one.

      • Spuddlesv2@lemmy.ca 1 day ago

        Do you mean Copilot the local indexer and search tool, or Copilot the web-based AI chatbot, or Copilot the rebranded Office suite, or do you mean … etc.

        Seriously, talk about watering down a brand name. Microsoft's marketing team are all massive, massive fuck knuckles.

    • biggerbogboy@sh.itjust.works 1 day ago

      There's also the fact that many NPUs are pretty much useless unless used for a very specific model built for the hardware, so there's no real point in having them.

    • Endmaker@ani.social 1 day ago

      > an added NPU

      cmiiw but I don’t think NPUs are meant to be used on general-purpose personal computers. A GPU makes more sense.

      NPUs are meant for specialised equipment e.g. object detection in a camera (not the personal-use kind)

      • vithigar@lemmy.ca 1 day ago

        They are in general-purpose PCs though. Intel has them taking up die space in a bunch of their recent Core Ultra processors.

      • altkey@lemmy.dbzer0.com 1 day ago

        Probably not even general-purpose GPUs, although we sucked it up when RT and Tensor cores were put on our plate whether we liked it or not. Those at least provided something to the consumer, unlike NPUs.

  • UsoSaito@feddit.uk 1 day ago

    It doesn't confuse us… it annoys us with blatantly wrong information, e.g. glue as a pizza ingredient.

    • Electricd@lemmybefree.net 19 hours ago

      That's when you use 3-year-old models

      • AngryRobot@lemmy.world 19 hours ago

        Are you trying to make us believe that AI doesn’t hallucinate?

  • Sam_Bass@lemmy.world 1 day ago

    Doesn't confuse me, just pisses me off trying to do things I don't need or want done. It creates problems to find solutions to.

    • Gsus4@mander.xyz 1 day ago

      Can the NPU at least stand in as a GPU in case you need it?

      • UsoSaito@feddit.uk 1 day ago

        No, as it doesn't compute graphical information and is solely for running computations for "AI stuff".

      • Sam_Bass@lemmy.world 1 day ago

        Nope. Don’t need it

  • Blackmist@feddit.uk 12 hours ago

    Yeah, I’m not sure what the point of a cheap NPU is.

    If you don’t like AI, you don’t want it.

    If you do like AI, you want a big GPU or to run it on somebody else’s much bigger hardware via the internet.

    • rumba@lemmy.zip 7 hours ago

      A cheap NPU could have some uses. If you have a background process that runs continuously, offloading the work to a low-cost NPU can save you both power and processing. Camera-based authentication: if you get up, it locks; if you sit down, it unlocks. No reason to burn a core or GPU for that. Security/nanny camera recognition. Driver-monitoring systems that detect a driver losing consciousness and pull over. We can accomplish all this now with CPUs/GPUs, but purpose-built systems that don't drain other resources aren't a bad thing.

      Of course, there's always the downside that they use that chip for Recall. Or malware gets hold of it for recall or ID theft. There's a whole lot of bad you can do with a low-cost NPU too :)

  • Electricd@lemmybefree.net 19 hours ago

    I want to run LLMs locally, or things like TTS or STT, so it's nice, but there's no real support rn
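    The STT side is already fairly usable on a plain CPU; a minimal sketch, assuming faster-whisper is installed and "meeting.wav" is a placeholder file:

    ```python
    # a hedged sketch of local speech-to-text with faster-whisper;
    # int8 compute keeps the memory footprint laptop-friendly
    from faster_whisper import WhisperModel

    model = WhisperModel("small", device="cpu", compute_type="int8")

    segments, info = model.transcribe("meeting.wav")  # placeholder audio path
    print("detected language:", info.language)
    for segment in segments:
        print(f"[{segment.start:6.1f}s -> {segment.end:6.1f}s] {segment.text}")
    ```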

  • manxu@piefed.social 1 day ago

    > Dell is the first Windows OEM to openly admit that the AI PC push has failed. Customers seem uninterested in buying a laptop because of its AI capabilities, likely prioritizing other aspects such as battery life, performance, and display above AI.

    Silicon Valley has always had the annoying habit of pushing technology-first products without much consideration of how they would solve real-world problems, and it's becoming increasingly bad. When Zuck unveiled the Metaverse it was already starting to be ludicrous, but with the AI laptop wave it turned into Onion territory.

    • Lucidlethargy@sh.itjust.works 20 hours ago

      What do you mean? Do you even have ANY foundation for this accusation?

      Hold on, I need to turn off my heater. 22211123222234663fffvsnbvcsdfvxdxsssdfgvvgfgg

      There it is. The off button. Touch controls are so cool guys.

      • manxu@piefed.social 19 hours ago

        Ha! Enjoy your off button while they still make them. Once our AI Overlords have won the War, you can only politely ask your laptop to please temporarily quiet itself, please and thank you, if it's not asking too much.

  • InFerNo@lemmy.ml 1 day ago

    "Recall was met with serious backlash." Meanwhile, I was looking for a simple setting for the power button on my wife's phone and stumbled upon a setting, enabled by default, that has Gemini scanning the screen and using it for whatever it is that it does; my wife doesn't use any AI features on her device. Correct me if I'm wrong, but isn't this basically the same as Recall? Google was just smart enough to roll it out silently.

    • Xanvial@lemmy.world 1 day ago

      Isn't this only triggered when the user uses Gemini (and Google Assistant before it), for something like Circle to Search? I'm rather sure this already existed before the AI craze.

      • Rooster326@programming.dev 12 hours ago

        That is the assumption, but unless that is explicitly spelled out somewhere, I'm not sure you can trust it.

      • arararagi@ani.social 22 hours ago

        Yeah, Google Assistant was able to read your screen and take screenshots when asked years ago.

  • tal@lemmy.today 1 day ago

    Not the position Dell is taking, but I've been skeptical that building AI hardware directly into laptops specifically is a great idea unless people have a very concrete goal, like text-to-speech, and existing models to run on it, probably specialized ones. This is not to diminish AI compute elsewhere.

    Several reasons.

    • Models for many useful things have been getting larger, and you have a bounded amount of memory in those laptops, which, at the moment, generally can't be upgraded (though maybe CAMM2 will improve the situation and move us back away from soldered memory). Historically, most users did not upgrade memory in their laptop even if they could. Just throwing the compute hardware in there in the expectation that models will come is a bet on the size of the models.

    • Heat and power. The laptop form factor exists to be portable. They are not great at dissipating heat, and unless they’re plugged into wall power, they have sharp constraints on how much power they can usefully use.

    • The parallel compute field is rapidly evolving. People are probably not going to throw out and replace their laptops on a regular basis to keep up with AI stuff (much as laptop vendors might be enthusiastic about this).

    I think that a more-likely outcome, if people want local, generalized AI stuff on laptops, is that someone sells an eGPU-like box that plugs into power and into a USB port or via some wireless protocol to the laptop, and the laptop uses it as an AI accelerator.

    When I do generative AI stuff on my laptop, for the applications I use, the bandwidth that I need to the compute box is very low, and latency requirements are very relaxed. I presently remotely use a Framework Desktop as a compute box, and can happily generate images or text or whatever over the cell network without problems. If I really wanted disconnected operation, I’d haul the box along with me.

    • cmnybo@discuss.tchncs.de 1 day ago

      There are a number of NPUs that plug into an M.2 slot. If those aren't powerful enough, you can just use an eGPU.
      I would rather not have to pay for an NPU that I'm probably not going to use.

    • Nighed@feddit.uk 1 day ago

      I think part of the idea is: build it and they will come… If 10% of users have NPUs, then apps will find ‘useful’ ways to use them.

      Part of it is actually battery life: if you assume that over the life of the laptop it will be doing AI tasks (unlikely currently), an NPU will be wayyyy more efficient than running them on a CPU, or even a GPU.

      Mostly, though, it's an excuse to charge more for the laptop. If all the high-end players add NPUs, then customers have no choice but to shell out more. Most customers won't realise that when they use ChatGPT or Copilot on one of these laptops, it's still not running on their device.

    • Goodeye8@piefed.social 1 day ago

      I'm not that concerned with the hardware limitations. Nobody is going to run a full-blown LLM on their laptop; running one on a desktop would already require building a PC with AI in mind. What you're going to see being used locally are smaller models (something like 7B at INT8 or INT4). Factor in the efficiency of an NPU and you could get by with 16GB of memory (especially if the models are run at INT4) with little extra power draw and heat. The only hardware concern would be the speed of technological advancement in NPUs, but just don't be an early adopter and you'll probably be fine.

      But this is where Dell's point comes in. Why should the consumer care? What benefits do consumers get by running a model locally? Outside of privacy and security reasons, you're simply going to get a better result by using one of the online AI services, because you'd be using a proper model instead of the cheap one that runs on limited hardware. And even the privacy- and security-minded can just build their own AI server (maybe not today, but when hardware prices get back to normal) that they run from home and then expose to their laptop or smartphone. For consumers to desire running a local model (actually locally, and not in a selfhosting kind of way) there would have to be some problem that the local model solves that the over-the-internet solution can't. Such a problem doesn't exist today, and there doesn't seem to be a suitable one on the horizon either.

      Dell is keeping a foot in the door by still putting NPUs in their laptops, so if by some miracle a problem is found that local AI solves, they're ready. But they realize that NPUs are not something they can actually use as a selling point, because as it stands NPUs solve no problems and there's no benefit to running small models locally.
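      For scale, "actually locally" today looks something like this; a minimal sketch, assuming llama-cpp-python and a 4-bit GGUF file downloaded yourself (the filename is a placeholder):

      ```python
      # a hedged sketch of running a 7B model quantized to ~4 bits; Q4_K_M weights
      # for a 7B model occupy roughly 4-5 GB, so 16 GB of system RAM is plenty
      from llama_cpp import Llama

      llm = Llama(model_path="mistral-7b-instruct.Q4_K_M.gguf", n_ctx=2048)

      out = llm(
          "Q: Name three reasons to run an LLM locally instead of in the cloud. A:",
          max_tokens=128,
          stop=["Q:"],
      )
      print(out["choices"][0]["text"])
      ```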

      • jj4211@lemmy.world 1 day ago

        More to the point, the casual consumer isn’t going to dig into the nitty gritty of running models locally and not a single major player is eager to help them do it (they all want to lock the users into their datacenters and subscription opportunities).

        As for Dell keeping NPUs in their laptops: they don't really have much of a choice if they want modern processors; Intel and AMD are still all-in on it.

    • Euphoma@lemmy.ml 1 day ago

      Phones have already come with AI processors for a long time, specifically for speech recognition and camera features.

    • MonkderVierte@lemmy.zip 1 day ago
      [deleted]
      • porous_grey_matter@lemmy.ml 1 day ago

        Where in their comment does it say “exactly zero users”? Oh right, it doesn’t

  • scarabic@lemmy.world 1 day ago

    As time goes by I’m finding a place for AI.

    1. I use it for information searches, but only in cases where I know the information exists and there is an actual answer. Like history questions or asking for nuanced definitions of words and concepts.

    2. I use it to manipulate documents. I have a personal pet peeve about the format of most recipes, for example. Recipes always list the ingredient amounts in a table at the top, but then down in the steps they just say "add the salt" or "mix in the flour." Then I have to look up at the chart and find the amount of salt/flour, and then I lose my place in the steps and have to find it again. I just have AI throw out the chart and integrate the amounts into the steps. I can have it shorten the instructions too and break them into easier-to-read bullet points. I also ask it to make ingredient substitutions and other modifications. The other day I gave it a bread recipe and asked it to introduce a cold-proofing step and reformat everything the way I like. It did great. (There's a sketch of this kind of request at the end of this comment.)

    3. Learning interactively. When I need to absorb a new skill or topic I sometimes do it conversationally with AI. Yes, I can find articles and videos, but then I am stuck with the information they lay out and the pace and order in which they do it. With AI you can stop and ask clarifying questions, or have it skip over the parts you already know. I find this is way faster than laborious googling. However, I only trust it for very straightforward topics. Like "explain the different kinds of welding and what they are for." I wouldn't trust it for more nuanced topics where perspective and opinion come into it. And I've learned that it isn't great at topics where there isn't enough information out there. Like very niche questions about the meta of a certain video game that's only been out a month.

    4. Speech to text and summarization. AI records all my Zoom meetings for work and gives summaries of what was discussed and next steps. This is always better than nothing. I’m also impressed with how it seems to understand how to discard idle chit chat and only record actual work content. At most it says “the meeting began with coworkers exchanging details from their respective weekends.”

    This kind of hard-and-fast summarization and manipulation of factual text is much easier with AI. Doing my job for me? No. Hovering over my entire computer? No. Writing my emails for me? Fuck off.

    The takeaway is that specific tools I can go to when I need them, for point-specific needs, are all I want. I don't need or want a hovering AI around all the time, and I don't want whatever tripe Dell can come up with when I can get the best latest models direct from the leading players.
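    A minimal sketch of the recipe reformat from point 2, assuming an OpenAI-style client (the model name and file path are illustrative, not an endorsement of any provider):

    ```python
    # a hedged sketch: ask a chat model to fold a recipe's ingredient table into
    # its steps; assumes OPENAI_API_KEY is set in the environment
    from openai import OpenAI

    client = OpenAI()
    recipe = open("bread_recipe.txt").read()  # placeholder path

    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{
            "role": "user",
            "content": "Drop the ingredient table and fold each amount into the "
                       "step that uses it, as short bullet points:\n\n" + recipe,
        }],
    )
    print(resp.choices[0].message.content)
    ```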

    • phil@lymme.dynv6.net 7 hours ago

      Assuming you keep a critical eye on the results, AI can surely be used for some meaningful things like the ways you found; thanks for sharing them. But I could bet that most people will be stuck at the BS-generator level, with its poisonous effects on them and on society at large.

      • scarabic@lemmy.world 3 hours ago

        I agree. I share my use cases mostly to put the critical thinking behind them on display. I'm sure the crowd here is very savvy. But in the general public, I agree that many if not most people would be completely seduced by the obsequious, confident tone of the robot. It can do so many things that it becomes tempting to rely on it. You wish it worked better than it does, and if you let yourself get lazy, you can easily slip into trusting it too much.

    • Lfrith@lemmy.ca 1 day ago

      The extent of my comfort with AI is through a website, with interaction limited to copy-paste or uploads. Not capabilities running at a system level.

      But when it comes to actually running on my hardware and being able to do things by reading what is on the screen or hearing what is said, I don't trust AI to be secure or privacy-respecting. For that type of functionality I'll only trust something I've compiled myself to run locally, as opposed to something provided by corporations that are largely in the business of data collection.

  • SethTaylor@lemmy.world 1 day ago

    Holy crap that Recall app that “works by taking screenshots” sounds like such a waste of resources. How often would you even need that?

    • Buddahriffic@lemmy.world 1 day ago

      It's such a stupid approach to the stated problem that I just assumed it was actually meant for something else and the stated problem was there to justify it. And I made the decision to never use Win 11 on a personal machine based on this "feature".

    • tal@lemmy.today 1 day ago

      So, it’s not really a problem I’ve run into, but I’ve met a lot of people who have difficulty on Windows understanding where they’ve saved something, but do remember that they’ve worked on or looked at it at some point in the past.

      My own suspicion is that part of this problem stems from the fact that, back in the day, DOS had a filesystem layout that was not incredibly aimed at non-technical users, and Windows tried to avoid this by hiding it and stacking an increasing number of "virtual" interfaces on top that didn't just show the filesystem, whether the Start menu or Windows Explorer and file dialogs having a variety of things other than just the filesystem to navigate around. The result is that Microsoft has been banging away for much of the lifetime of Windows at adding more ways to access files, most of which make it harder to fully understand what is going on through the extra layers. But regardless of why, some users do have trouble with it.

      So if you can just provide a search that can summon up that document where they were working on that had a picture of giraffes by typing “giraffe” into some search field, maybe that’ll do it.
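      A hedged sketch of that kind of search, using sentence-transformers over whatever text you can extract from documents (the file names and strings here are placeholders):

      ```python
      # embed document text and a query, then rank by cosine similarity;
      # "all-MiniLM-L6-v2" is a small model that runs fine on a laptop CPU
      from sentence_transformers import SentenceTransformer, util

      model = SentenceTransformer("all-MiniLM-L6-v2")

      docs = {
          "zoo_report.docx": "quarterly zoo attendance, with a photo of giraffes",
          "groceries.txt": "milk, eggs, bread",
      }  # placeholder corpus: path -> extracted text

      doc_emb = model.encode(list(docs.values()), convert_to_tensor=True)
      query_emb = model.encode("giraffe", convert_to_tensor=True)

      scores = util.cos_sim(query_emb, doc_emb)[0]
      best = int(scores.argmax())
      print(list(docs.keys())[best])  # -> zoo_report.docx
      ```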

  • Darkness343@lemmy.world 12 hours ago

    The world is healing

    • musubibreakfast@lemmy.world 10 hours ago

      I'm readying myself for some new bullshit. I just hope it's not tech related.

      • samus12345@sh.itjust.works 8 hours ago

        Does a third world war count as tech related? It certainly uses a lot of tech!

  • torubrx@piefed.social 1 day ago

    Why not just leave it alone inside a browser tab? If I want AI, and I use it quite a lot, I will go to their website. Don't force it system-wide; it just sucks.

    • AstralPath@lemmy.ca 1 day ago

      They want their greasy tendrils all up in your PC’s guts. Every bit of info flowing in your system can be monetized. All they care about is money and dominance and their “AI” in everyone’s devices is their wet dream.

      Cancer is preferable to tech bros, as cancer doesn't know it's killing the host. Tech bros know full well their actions are killing the planet and its inhabitants. Their actions are willfully vile and toxic; completely at odds with the needs of humanity.

      Don’t expect them to ever do the right thing for anyone but themselves.

    • Aceticon@lemmy.dbzer0.com 1 day ago

      This is pretty much "all tech companies have to jump on the AI hype train" pressure on publicly traded companies and those that need lots of investor money, with little if any customer pressure.

  • MutantTailThing@lemmy.world 1 day ago

    For me at least, AI reminds me too much of that thrice cursed MS Word paperclip. I did not want it then and I do not want it now.

    Also, adding ‘personality’ to pieces of software is cringy at best and downright creepy at worst.

    • justsomeguy@lemmy.world 1 day ago

      Forget about the personality for a minute. They have a different thing in common. Uselessness. I tried AI for a bunch of general use cases and it almost always fails to satisfy. Either it just can’t do the task in the first place or it makes mistakes that then cost too much time to fix.

      There are exceptions and specialized models will have their use but it’s not the Swiss army knife tool AI companies are promising.

      • The_v@lemmy.world 1 day ago

        AI hardware is a sales pitch without a clear product. Consumers have no clue why they would want to buy something with AI on it.

        For most consumers, AI is a webpage that kids cheat on homework with, or that adults attempt to cheat at work with. It makes ugly fake pictures with all sorts of weird errors. It's also the annoying-as-fuck answering services that you have to yell at 4 or 5 times to get to a real person.

        Why would an AI PC be desirable?

  • Eternal192@anarchist.nexus 1 day ago

    We should have been given a choice whether we want to use it or not. Them trying to force it on us is why they're getting so much pushback. Let those who want to use it use it, and give those who don't the option to turn it off. It's not rocket science, but they are constantly going:

    Tech CEOs - this is our AI you have to use it!
    Consumers - but i don’t want to!
    Tech CEOs - FUCKING USE IT!!!

    and then they are whining “WAAAHHHHH PEOPLE ARE MEANIES THAT DON’T LIKE OUR AI THAT DOES NOTHING TO IMPROVE THEIR LIVES AND WILL MAKE US MORE MONEY BY LETTING US PUT TARGETED ADS INTO THEIR EYEBALLS WWWWAAAAAAAAAAHHHHHHHH!!!!

    • kiagam@lemmy.world 1 day ago

      But think about the shareholders, how are they going to pay off their trillion dollar debts building data centers? You need to use it, replace all aspects of your life with AI, then they can squeeze you!

      • Eternal192@anarchist.nexus 1 day ago

        There was a scene in a kinda shitty movie, The Scorpion King, starring The Rock, where he was buried with just his head sticking out and huge red ants were coming to eat his face. I mean, I would care, in that they would scream too much and scare the ants (poor things are hungry), but they are so disgusting that the ants would probably walk past them thinking it's camel shit…

  • SabinStargem@lemmy.today 1 day ago

    It is going to take at least five years before local AI is user-friendly enough, and performant hardware common enough, that ordinary folks would consider buying an AI machine.

    I have a top-end DDR4 gaming rig. It takes a long time for a 100B-sized AI to give some roleplaying output: at least forty minutes for my settings via KoboldCPP with a GGUF. I don't think a typical person would want to wait more than 2 minutes for a good response. So we will need at least DDR6-era devices before it is practical for everyday people.
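    That lines up roughly with a back-of-the-envelope calculation: token generation is approximately memory-bandwidth-bound, so tokens/sec is about bandwidth divided by the size of the quantized weights (all the numbers below are assumptions):

    ```python
    # rough estimate of LLM generation speed on system RAM; each generated token
    # requires streaming (approximately) all of the weights through the memory bus
    model_params = 100e9        # ~100B parameters, as in the comment above
    bytes_per_param = 0.5       # ~4-bit GGUF quantization
    weights_gb = model_params * bytes_per_param / 1e9   # ~50 GB of weights

    for name, bw_gbs in [("dual-channel DDR4", 50), ("dual-channel DDR5", 90)]:
        print(f"{name}: ~{bw_gbs / weights_gb:.1f} tokens/sec")
    ```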

    • lmuel@sopuli.xyz 1 day ago

      A local LLM is still an LLM… I don’t think it’s gonna be terribly useful no matter how good your hardware is

      • maus@sh.itjust.works 1 day ago

        I have great success with local LLMs in some of my workflows and automation.

        I use one for line completion and basic functions/asks while developing that I don't want to waste tokens on.

        I also use it in automation. I run my own media server for a few dozen people, with an automated request system ("jellyseerr") that adds content. I have automation that leverages a local LLM to look at recent media requests and automatically request similar content.
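        The glue for that kind of automation can be tiny; a hypothetical sketch, assuming an Ollama server on localhost (the model name and prompt are illustrative, and this is not jellyseerr's actual API):

        ```python
        # ask a local model for recommendations based on recent requests;
        # Ollama's /api/generate returns {"response": "..."} when stream is False
        import json, urllib.request

        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=json.dumps({
                "model": "llama3.1:8b",  # placeholder local model
                "prompt": "Recently requested: Severance, Dark. "
                          "Suggest 3 similar shows, titles only.",
                "stream": False,
            }).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            print(json.loads(resp.read())["response"])
        ```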

      • luridness@lemmy.ml 1 day ago

        Local AI can be useful. But I would rather see nice implementations that use small but brilliantly tuned models for… let's say, better predictive text. It's already somewhat AI-based; I just would like it to be better.

      • xthexder@l.sw0.com 1 day ago

        The diminishing returns are kind of insane if you compare the performance and hardware requirements of a 7B and a 100B model. In some cases the smaller model can even perform better, because it's more focused and won't be as subtle about its hallucinations.
        Something is going to have to fundamentally change before we see any big improvements, because I don't see scaling it up further ever producing AGI, or even solving the hallucinations/logic errors it makes.

        In some ways it’s a bit like the Crypto blockchain speculators saying it’s going to change the world. But in reality the vast majority of applications proposed would have been better implemented with a simple centralized database.

  • AmbiguousProps@lemmy.today 1 day ago

    Confuses them?

    • _edge@discuss.tchncs.de 1 day ago

      Have you recently vibe-edited a Microsoft Copilot Excel sheet on your AI PC (but actually in the cloud)?

      • AmbiguousProps@lemmy.today 1 day ago

        🤮

    • nyankas@lemmy.world 1 day ago

      I think it's quite possible to become confused if you're used to software that, bugs aside, behaves pretty much completely predictably, and then get a feature marketed as "intelligence" which suddenly gives you unpredictable and sometimes incorrect results. I'd definitely be confused if the reliable tools I do my work with suddenly developed a mind of their own.

      • AmbiguousProps@lemmy.today 1 day ago

        Well, that certainly would confuse you, yes.

  • WraithGear@lemmy.world 1 day ago

    If "I" were the one confused, then AI would actually be USEFUL.

  • NEILSON_MANDALA@lemmy.world 1 day ago

    I don't see AI as confusing, tbh. If anything it is WAY too easy to use, which means it is WAY too easy to come up with stupid shit no one wants. If only people weren't morons.

  • JensSpahnpasta@feddit.org 1 day ago

    It kind of makes sense to produce computers that are able to run local AI. People here hate AI, but there are a lot of tasks that make a lot of sense even on a laptop. It's great for producing tags and descriptions for your images. Google Photos has this: search for "horse" and you will get back all the pictures of horses you have taken. And it totally makes sense to run stuff like this locally on your own computer, without sending every picture you've taken to Trumpland.

    But I really do not trust Microsoft to build something like that. It would totally go against everything they have been doing for the last decade, and would also devalue the billions they've put into OpenAI.

  • KoalaUnknown@lemmy.world 1 day ago

    Disappointing that the new XPS doesn’t use CAMM2
