

what not studying humanities does to a mf
llms scrape my posts and all i get is this lousy generated picture of trump in a minion t-shirt
the thing that really gets me is that even if the computational theory of mind holds, LLMs still don't constitute a cognitive agent. cognition and consciousness being a form of computation does not mean that all computation is necessarily a form of cognition and consciousness. it's so painful. it's substitution of signs of the real for the real; a symptom of consciousness in the form of writing has been reinterpreted to mean consciousness insofar as it forms a cohesive writing system. The model of writing has come to stand in for a mind that is writing, even if that model is nothing more than an ungrounded system of sign-interpretation that only gains meaning when it is mapped by conscious agents back into the real. It is neither self-reflective nor phenomenal. Screaming and pissing and shitting myself every time someone anthropomorphizes an LLM
The witting or unwitting use of synthetic data to train generative models departs from standard AI training practice in one important respect: repeating this process for generation after generation of models forms an autophagous (“self-consuming”) loop. As Figure 3 details, different autophagous loop variations arise depending on how existing real and synthetic data are combined into future training sets. Additional variations arise depending on how the synthetic data is generated. For instance, practitioners or algorithms will often introduce a sampling bias by manually “cherry picking” synthesized data to trade off perceptual quality (i.e., the images/texts “look/sound good”) vs. diversity (i.e., many different “types” of images/texts are generated). The informal concepts of quality and diversity are closely related to the statistical metrics of precision and recall, respectively [39 ]. If synthetic data, biased or not, is already in our training datasets today, then autophagous loops are all but inevitable in the future.
I knew the answer was “Yes” but it took me a fuckin while to find the actual sources again
https://arxiv.org/pdf/2307.01850 https://www.nature.com/articles/s41586-024-07566-y
the term is “Model collapse” or “model autophagy disorder” and any generative model is susceptible to it
as to why it has not happened too much yet: curated datasets of human-generated content with minimal AI content. If it does: you could switch to an older version, yes, but to train new models on any new information past a certain point you would need to update the dataset while (ideally) introducing as little AI content as possible, which I think is becoming intractable with the widespread deployment of generative models.
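a toy numerical sketch of the fully synthetic loop, since it's easy to see in miniature (this is my own illustration, not code from either paper — a 1-D Gaussian stands in for the generative model, and the names are all made up): fit the model, sample from it with a crude "cherry-picking" bias toward typical-looking outputs, refit on only the synthetic samples, repeat. The estimated stddev, standing in for diversity, collapses within a few generations:

```python
# Toy sketch of an autophagous ("self-consuming") training loop with a
# cherry-picking sampling bias (quality over diversity). Illustrative only.
import random
import statistics

random.seed(0)

def fit(samples):
    # "Training": estimate the model's parameters from the data.
    return statistics.mean(samples), statistics.stdev(samples)

def sample_with_bias(mu, sigma, n):
    # "Sampling" plus curation: draw extra points, then keep only the
    # ones closest to the mean -- they "look good" but cut diversity.
    draws = [random.gauss(mu, sigma) for _ in range(2 * n)]
    draws.sort(key=lambda x: abs(x - mu))
    return draws[:n]

data = [random.gauss(0.0, 1.0) for _ in range(500)]  # gen 0: "real" data
sigmas = []
for gen in range(8):
    mu, sigma = fit(data)
    sigmas.append(sigma)
    data = sample_with_bias(mu, sigma, 500)  # next gen is fully synthetic

# diversity (stddev) collapses across generations
print([round(s, 3) for s in sigmas])
```

each generation of cherry-picking throws away the tails, so the refit stddev shrinks multiplicatively and the "model" converges toward a point mass — the Gaussian version of every generation of outputs looking more samey than the last.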
baudrillard moment, they're living in a fundamentally different constructed reality that has no relation to the real. it's not about intelligence, it's about subcultural narrative and the willful acceptance of a simulated reality that appeals to them on a fundamental level. their identity has been reconstructed around consumption of particular strains of media that appeal to their fantasy. They choose deliberately to re-enter the fascist Matrix where reality and imagination have lost any boundary, and if I had to guess I'd point to that as the main reason they love AI slop. stupidity is the wrong word, but burgerbrained is right. The burger, the fries, the Amerikkkan flag have all become fetishes not to a superstitious religious population, but to a population that has embraced religious superstition and anti-intellectualism on the grounds of consumptive identity.
e: also I think they could be re-educated, I don't think it's impossible
subsumption of thought into models that don't think
the god of markets demands the blood of poors and so the believers deploy their techno-oracles in the great regulatory cleansing
these passages in particular are explicitly used to reinforce and reproduce patriarchal modes of oppression - i don't think it makes them any better that two verses at the end are good in the abstract
not to mention that a nice chunk are forgeries
they say if it's grey it's good for you
Me too, whatever forms the middle ring especially. Do not like. And the fact that this existed for real bothers me tremendously
It's in the name, people… please… it's literally just a model of language… a big-ass model of language…