• CarbonIceDragon@pawb.social · 6 days ago

    I mean, the “allow non-verbal people to speak” thing has some merit though. Not LLMs per se, but the types of machine learning used to decode people’s brainwaves so they can communicate while physically unable to speak are usually lumped into the general category of “AI”, from what I’ve seen.

    • zqps@sh.itjust.works · 5 days ago

      Yeah, that’s not what they mean. They mean feeding a person’s recorded texts and speeches into an LLM, then instructing it to pretend to be that person. Like that AI murder victim “testimony” that was permitted to be shown in court as “evidence” some time ago.

    • Swedneck@discuss.tchncs.de · 5 days ago

      i mean, i’m pretty sure we can enable people to communicate if they’re at all conscious and mentally able to communicate. Stephen Hawking was able to write despite only being able to reliably move a few muscles. So long as a person can intentionally move one muscle, we can rig something up to interpret it as Morse code.

      is it great? no, these methods fucking suck, but they do work, and we don’t need AI to do it.

    • Peppycito@sh.itjust.works · 6 days ago

      I’ve just had experiences with AI help chats where, when I started typing, the AI would try to finish my sentence and jump the cursor around, making it absolutely unusable. I had to type in Notepad and copy it into the chat. Staggeringly useless. So if this “mind reading” AI is anything like that, I don’t predict good results.

      Also, fuck you quickbooks.

      • CarbonIceDragon@pawb.social · 6 days ago

        I mean, any technology can be stupid if it is utilized stupidly, which I would think taking over someone’s keyboard while they’re typing would qualify as. But why would a company deploying a technology in a stupid manner mean that someone else’s research into a different but related technology is guaranteed to produce equally poor results?