… Oh dear.

  • stabby_cicada@slrpnk.net
    24 hours ago

    Complex algorithms that follow rules they cannot deviate from = lawful.

    Deliberately incorporating random factors into the algorithm so they don’t generate the same result every time = chaotic.

    So I’d argue the LLMs themselves are neutral evil, presuming we allow objects to have alignments. In D&D, non-sapient animals have no alignment because they don’t understand moral or ethical concepts; by that logic, LLMs would be unaligned, with the alignment applying to their companies instead.

    Could you argue an LLM is attuned to its corporate owner and shares its alignment? They’d definitely be cursed.

    Then the companies would range from lawful evil (Microsoft has been the archetype of abusing laws and regulations to its own benefit for decades) to chaotic evil (Grok has no rules, only the whims of its infantile tyrant).

    • SippyCup@lemmy.ml
      24 hours ago

      Corporate owners are currently in the Find Out stage of discovering how little control they have over their LLMs. So no, the LLMs do not share an alignment with their corporate owners beyond fleeting coincidence.