• LeeeroooyJeeenkiiins [none/use name]@hexbear.net
    1 month ago

    neither does the computer!!!

    I think chatgpt is basically the computer equivalent of figuring out language processing to an alright degree, which is pretty cool, and I guess enough to trick people into thinking the mechanical turk has an agenda, but yeah, still not thinking

    • plinky [he/him]@hexbear.net
      1 month ago

      i guess my issue is that neural networks as they exist now can’t produce emergent properties; they are fitting to data to predict the next word in the best way possible, or the most probable one in an unseen sentence. That’s not how anybody learns, not mice, not humans.

      Something akin to experiments with free-floating robot arms with bolted-on computer vision seems like a much more viable approach, but there the problem is they don’t have the right architecture to feed it into, at least i don’t think they do, and even then it will probably stall out for a while at animal level.
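      For what it’s worth, the “fitting to data to predict the next word” picture above can be sketched in a few lines. This is a toy bigram counter, nothing like a real transformer, and the corpus and names here are invented just to show what “most probable next word” means mechanically:

      ```python
      # Toy sketch of next-word prediction: count word pairs in a tiny
      # made-up corpus, then always emit the most frequent follower.
      from collections import Counter, defaultdict

      corpus = "the cat sat on the mat the cat ate the fish".split()

      # bigram counts: how often each word follows each other word
      follows = defaultdict(Counter)
      for prev, nxt in zip(corpus, corpus[1:]):
          follows[prev][nxt] += 1

      def predict_next(word):
          # return the follower seen most often after `word` in training data
          return follows[word].most_common(1)[0][0]

      print(predict_next("the"))  # "cat", since it follows "the" most often here
      ```

      No understanding anywhere, just frequencies; scaling that idea up with a much bigger model and corpus is the basic move the comment above is describing.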

      • LeeeroooyJeeenkiiins [none/use name]@hexbear.net
        1 month ago

        my problem is that at some point they’re gonna smoosh chatgpt and that sort of stuff and other shit together, and it might be approximating consciousness, but nerds will be like "it’s just math! soypoint-2 " and it’ll make commander Data sad disgost and they won’t even care

        • plinky [he/him]@hexbear.net
          1 month ago

          well of course they could, a flawless imitation of consciousness is, after all, the same as consciousness (aside from morality, which will be unknowable), just not here at the moment