Thanks, idk if op needed this but I did
Engineer/Mathematician/Student. I’m not insane unless I’m in a schizoposting or distressing memes mood; I promise.
Not always. If the headline is “How do we stop (insert capitalism-caused problem)?” then the answer is revolution.
I’m guessing the answer is ✨capitalism✨
It’s clearly just saying that the surfaces on which the ends of the cylinder lie are metric spaces with distances defined using Chebyshev or taxicab metrics based on pentagonal tilings of the hyperbolic plane, so the ratio of a circle’s circumference to its diameter is 5.
Since it’s a cylinder, we assume the vertical dimension is Euclidean, and voilà, the math checks out geometrically.
Your username is purple. Thank you for developing the Voyager app lol
I have had the same thought before. Unfortunately, conservation of energy is not enough to ensure that entropy is monotonically increasing.
Say you created a tiny universe with the same average entropy as ours and then connected it to the edge of our universe. Energy is not conserved, because you just added some, but entropy is, because you didn’t create an entropy potential.
Say you had a warmer object and a colder object and you took all the heat energy from the cold object and added it to the warm object. The energy of your system was conserved, but its entropy decreased, violating the second law.
You can use violations of the second law to violate the other laws because entropy naturally wants to increase due to probability (which cannot be violated without destroying math and logic etc.).
In the scenario above, if you put some fluid between the two objects, you could harness convection via a turbine to harvest energy. Even though moving the energy around didn’t create or destroy any, it created a sort of entropic potential energy. It’s like teleporting an object to a higher elevation: that doesn’t add any energy to the universe, since mass and kinetic energy are conserved, but you’ve increased the object’s potential energy, which becomes kinetic energy as it falls back down. Repeat the cycle and you can harvest infinite energy.
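To put rough numbers on the warm/cold object example (the temperatures and heat amount below are made up for illustration): moving heat Q out of a cold object at T_cold and into a hot object at T_hot changes total entropy by ΔS = −Q/T_cold + Q/T_hot, which is negative whenever T_hot > T_cold.

```python
# Entropy change when "magically" moving heat Q from a cold object
# to a hot one. All values are arbitrary illustration numbers.
Q = 1000.0      # joules of heat moved
T_cold = 280.0  # kelvin, the colder object
T_hot = 320.0   # kelvin, the warmer object

# Cold object loses Q at T_cold; hot object gains Q at T_hot.
dS = -Q / T_cold + Q / T_hot
print(dS)  # negative: total entropy decreased, violating the second law
```

Energy is perfectly conserved in this bookkeeping; only the entropy balance goes negative, which is exactly the violation being described.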
In order for one to move energy around via magic without violating entropy, one has to increase the entropy of the universe by at least the same amount it would take to move that energy without magic.
The solution I thought of was just that magic accelerates the expansion of the universe. Technically this still allows some locally “impossible” stuff, like a perpetual motion machine or free-energy generator that will eventually die but seems infinite on the timescale of human lives.
Magic would get weaker with use over time as the universe nears its equilibrium temperature, and you would be shortening the lifespan of the universe every time magic is used. But even if you used it excessively, you probably wouldn’t shorten the universe’s lifespan by much unless you were using magic to, like, move black holes around or rearrange galactic clusters.
That still is a violation of entropy because you’ve increased the “order” of energy in the universe as a whole, which is not possible.
If one can violate entropy, one can create a better-than-perfect Carnot engine (or, in general, a heat engine with efficiency greater than 1), which would allow generating an infinite amount of energy in the form of mechanical motion.
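For reference, the second law caps any heat engine between two reservoirs at the Carnot efficiency, 1 − T_cold/T_hot, which is always below 1 for any positive cold-side temperature. A quick sketch (reservoir temperatures are arbitrary example values):

```python
# Carnot efficiency: the hard ceiling the second law places on any heat
# engine running between a hot and a cold reservoir (temperatures in K).
def carnot_efficiency(T_hot, T_cold):
    return 1.0 - T_cold / T_hot

print(carnot_efficiency(600.0, 300.0))  # 0.5
# Even with an absurdly hot reservoir, efficiency stays under 1:
print(carnot_efficiency(1_000_000.0, 300.0))  # just below 1, never 1 or more
```

Getting efficiency ≥ 1 would require T_cold ≤ 0 K or an entropy violation, which is the point being made.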
Unless creating/gaining “mana” accelerates the entropic decay of the universe as a whole by at least the amount of entropy reversed locally (e.g., spells must produce heat and be inefficient at converting mana into work), magic would violate thermodynamics and allow infinite energy creation.
Fuck the square cube law. If there is any magic that can freeze things / make things cold, the second law of thermodynamics is void, and by extension the other two are as well.
Perpetual motion machines? Hell yeah. Infinite energy? Hell yeah. Being able to create negative energy by decreasing entropy thus being able to create antigravity and simulate negative mass? Hell yeah
一 二 三 四 五 六 七 八 (one through eight)
8 is 八, “bā”
Yes, I am American. My grammar is not very good.
Hate those bandits, and that general area with all of them; their blades are good at killing demons tho
Yes there is demand for art, but art is produced by people.
AI is only able to do what it can by mimicking the art of others. By plagiarizing that work, it prevents artists from getting paid to create art and discourages people from creating and sharing art on the internet in the first place. You may not care about them, or value creativity, but image generation relies on creative people putting new artwork on the net.
What are your bots going to create when they have nothing to feed on but themselves?
That’s the fun upside to the internet becoming filled and killed with AI slop: AI companies are literally poisoning their own models. (Data poisoning that is)
Predictive models of any kind produce error, and when you train on predicted data you compound that error.
Unless AI scrapers can differentiate AI generated “art” from human generated art (which would mean that AI art never truly becomes indistinguishable-from or as-good-as human art, something techbros and idiots would be upset about), generative AI will eat its own tail in an oddly literal sense.
The more the web fills with slop, the more AI will train on it, and the worse the models will get at generating good-looking images, which lowers the quality of the images they produce (and inevitably train on), hastening the cycle of their own degradation.
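As a toy sketch of that compounding (the 1% per-generation error is a made-up number, not a measured one): if each model generation reproduces its training data with even a small systematic loss of fidelity, the degradation multiplies across generations rather than adding.

```python
# Toy model-collapse illustration: each "generation" trains on the previous
# generation's output and reproduces it with a small systematic error.
# The 1% loss per generation is an arbitrary illustration value.
quality = 1.0              # fidelity of the original human-made data
loss_per_generation = 0.01

for generation in range(50):
    quality *= (1.0 - loss_per_generation)  # error compounds multiplicatively

print(round(quality, 3))  # 0.605 -- ~40% of the signal gone in 50 generations
```

The exact numbers are fiction, but the shape of the curve (geometric decay once models feed on their own output) is the point.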
Valid point, though I’m surprised that Cyc was used for non-AI purposes since, in my very, very limited knowledge of the project, I thought the whole thing was built around the ability to reason and infer from an encyclopedic data set.
Regardless, I suppose the original topic of this discussion is heading towards a prescriptivist vs descriptivist debate:
Should the term Artificial Intelligence have the more literal meaning it held when it first was discussed, like by Turing or in the sci-fi of Isaac Asimov?
OR
Should society’s use of the term, whether for advances in problem-solving tech in general or, in its most prevalent use, for any neural network or learning algorithm, be the definition of Artificial Intelligence?
Should we shift the definition of a term to match popular use, regardless of its original intended meaning, or should we try to keep the meaning of the phrase specific/direct/literal and fight the natural shift in language?
Personally, I prefer the latter: keeping the meaning as close to literal as possible increases the clarity of the words, and the term AI is thrown around so often these days as a buzzword for clicks or money, typically by people pushing lies about the capabilities or functionality of the systems they’re describing.
I also don’t view fondly the lumping together of models trained by scientists to solve novel problems with models using the energy of a small country to plagiarize artwork; I’ve seen people assume the two are one and the same, despite the fact that one has redeeming qualities and the other is mostly bullshit.
However, it seems that many others are fine with or in support of a descriptivist definition where words have the meaning they are used for even if that meaning goes beyond their original intent or definitions.
To each their own, I suppose. These preferences are opinions, so there really isn’t an objectively right or wrong answer to this debate.
The term “artificial intelligence” is supposed to refer to a computer simulating the actions/behavior of a human.
LLMs can mimic human communication and therefore fit the AI definition.
Generative AI for images is a much looser fit, but it still fulfills a purpose that until recently most people thought only humans could do, so some people think it counts as AI.
However, some of the earliest AIs in computer programs were just NPCs in video games, looong before deep learning became a widespread thing.
Enemies in video games (typically referring to the algorithms used for their pathfinding) are AI whether they use neural networks or not.
Deep learning neural networks are predictive mathematical models that can be tuned from data, like in linear regression. This, in itself, is not AI.
Transformers are a special structure that can be implemented in a neural network to attend to certain inputs. (This is how ChatGPT can act like it has object permanence or any sort of memory when it doesn’t.) Again, this kind of predictive model is no more AI than using Simpson’s Rule to calculate a missing coordinate in a dataset would be.
Neural networks can be used to mimic human actions, and when they do, that fits the definition. But the techniques and math behind the models are not AI.
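To make the “tuned from data like linear regression” point concrete, here’s the simplest possible predictive model fit by least squares (the data points are made up); a neural network is the same recipe, minimizing prediction error over parameters, just with vastly more parameters and nonlinearities.

```python
# Fitting y = w*x + b by ordinary least squares on made-up data.
# Nothing "intelligent" happens here -- parameters are tuned to the data,
# which is the same basic idea a neural network scales up.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.1, 4.9, 7.0, 9.1]  # roughly y = 2x + 1 with a little noise

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form least-squares slope and intercept.
w = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_y - w * mean_x

print(round(w, 2), round(b, 2))  # close to the underlying slope 2, intercept 1
```

The model “learned” the slope and intercept from data, but nobody would call that intelligence; the argument is that bigger tuned predictors are different in degree, not kind.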
The only people who refer to non-AI things as AI are people who don’t know what they’re talking about, or people who are using it as a buzzword for financial gain (in the case of most corporate executives and tech-bros it is both)
Nope, gen z, and I haven’t actually read any of the Harry Potter books myself.
But you’re on the right track; I think it was reading The Hobbit that did me in lol
When I am talking about fibrous material, like individual strands of carbon in a composite, I naturally type “fibre” but when I talk about nutrition or the internet it’s “fiber”
I also tend to spell “armor” as “armour” and “color” as “colour” despite being American.
Oh and I write grey instead of gray.
I also catch myself writing units like metre and litre instead of meter and liter sometimes.
It really all depends on if there’s a spellchecker turned on that will tell me I’m spelling things wrong.
I grew up in a small Utah town. The only four adults I ever remember hearing admit they were wrong, especially when it came to politics, science, or religion, were my father and three of my high school teachers.
All the rest would literally tell me that the research papers and encyclopedias I tried to cite as evidence were made up by either Satan or some deep-state government conspiracy. Or they’d say we can “agree to disagree” about shit like whether animals feel pain and the flaws in eugenics (I wish I were joking).
Yes, they have always been this stupid. Learning requires accepting when you’re wrong and the vast majority of people I knew growing up saw that as weakness.
I thought it would be different when I got out of that place, and while living in a larger city is better, it’s not better by all that much.
This is why you can taste/smell saline when it’s injected: trace amounts of dissolved substances in the saline (which taste like plastic and metal) are able to pass through the alveoli in the lungs and evaporate into your breath.
Oddly, I think it’s a similar thing with my ADHD meds because about an hour or two after taking them my breath smells/tastes weird.
Could you expand on what you mean by modular web technologies? Also when would you say the shift over from interoperable web technologies to one-stop-shop happened?
I’m relatively young and wasn’t really allowed on the internet, but from what I remember (trying to build websites on the old family computer in the basement lol) there were lots of issues with browsers not working with the same CSS properties circa 2016. Then again I had no idea what I was doing at the time so maybe it wasn’t so bad.
YouTube seemed to start going downhill shortly after that, followed swiftly by other apps and sites.
Basically, as soon as I got consistent internet access, it seemed like the internet was getting worse, but it also seemed like a lot of interoperability/compatibility issues were resolved even as the quality of the net’s content declined.
Again, I didn’t have much experience with the old net so I want to know your perspective
While fitting, there are many more things in life besides the internet that could be called “capitalism fucked up everything.” Better to call it “capitalism fucked over the net”; otherwise people could call healthcare, climate change, fascism, most wars, pollution, etc. “Web 3.0,” which might be confusing lol
Okay, so I’m definitely not the most knowledgeable hacker, but the issue with an active AI hunter, one that hunts and kills instead of setting tarpits, is that you’d have to actually create an AI capable of hacking the scraper.
This would mean tracing it back to the actual source and then hacking that source to destroy the scraper, and I’d bet that’s not an easy task even for a human.
But yeah, honestly, creating an AI capable of hacking and fucking up certain systems and then setting it loose on the net really could cause a DataKrash-like event if it can replicate itself like a virus on the hardware it infects.
Even better if you could find some way to have it mutate as it goes, but that’s pretty far-fetched even for this already far-fetched hypothetical.