You are right man
“Engineer of Information”, please 😎
If you want to know how computers work, do electrical engineering. If you want to know how electricity works, do physics. If you want to know how physics works, do mathematics. If you want to know how mathematics works, too bad, best you can do is think about the fact it works in philosophy.
all roads lead to philosophy
If you want to know how philosophy works, do sociology…
It’s kind of like a horseshoe with philosophy and math at the ends.
If you want to no longer want to know how anything works, do biochemistry
Too real
A horseshoe capped off by Computer Science 😉
looks weird without the cleavage
tbf all good programmers are good at math. Not classic arithmetic necessarily, but at the very least applied calculus. It’s a crime how many people use a mathematical discipline every day but don’t think they’re “good at math”, because of how laser-focused the world is on algebra, geometry, and trig being all that “math” is.
Serious question; how does Calculus apply to programming? I’ve never understood.
PID control is the classic example, but at a far enough abstraction any looping algorithm can be argued to be an implementation of the concepts underpinning calculus. If you’re ever doing any statistical analysis, or anything in game design having to do with motion, those are both calculus too. Data science is pure calculus, ground up and injected into your eyeballs, and any string manipulation or regex is going to be built on lambda calculus (though a very correct argument can be made that literally all computer science is built on lambda calculus, so that might be cheating to include it)
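To make the PID example concrete, here’s a minimal discrete PID loop sketched in Python (names like `make_pid` and the gains are purely illustrative, not from any real control library). The integral term is a running sum, i.e. a discrete integral of the error, and the derivative term is a finite difference, i.e. a discrete derivative:

```python
def make_pid(kp, ki, kd, dt):
    """Closure-based discrete PID controller (illustrative sketch)."""
    state = {"integral": 0.0, "prev_error": 0.0}

    def step(setpoint, measured):
        error = setpoint - measured
        state["integral"] += error * dt                   # running sum ~ integral of e(t) dt
        derivative = (error - state["prev_error"]) / dt   # finite difference ~ de/dt
        state["prev_error"] = error
        return kp * error + ki * state["integral"] + kd * derivative

    return step

pid = make_pid(kp=2.0, ki=0.5, kd=0.0, dt=0.1)
pid(1.0, 0.0)  # 2.05: proportional term 2.0 plus integral term 0.05
```

Swap in real gains and a real sensor reading and that’s the whole idea; the calculus is hiding in those two commented lines.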
Does it apply to interpolation for animation and motion?
Motion yes, but I have no idea about the mathematics of animation (sorry)
Lambda calculus has no relation to calculus calculus, though.
Data science is pure calculus, ground up and injected into your eyeballs
Lol, I like that. I mean, there’s more calculus-y things, but it’s kind of unusual in that you can’t really interpret the non-calculus aspects of a neural net.
Lambda calculus has no relation to calculus calculus
I wanna fight your math teachers. No seriously, what did they tell you calculus is if it’s got nothing in common with lambda calculus?
Is there some connection I’ve just been missing? It’s a pretty straight rewriting system, it seems Newton wouldn’t have had much use for it.
Lots of things get called “calculus”. Originally, calculus calculus was “the infinitesimal calculus” IIRC.
I think the issue here might be the overloading of terms - lambda calculus is both the system of notation and the common name for the conceptual underpinnings of computational theory. While there is little to no similarity between the abstracted study of change over a domain and a notational system, the idea of function composition or continuous function theory (or even just computation as a concept) are all closely related with basic concepts from “calculus calculus” like limit theory and integral progression.
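The “system of notation / rewriting system” point can be sketched with Python lambdas standing in for lambda terms: Church numerals encode numbers purely as function composition (illustrative only; actual lambda calculus has no Python underneath it, just substitution rules):

```python
# Church numerals: a number n is "apply f, n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting applications of f."""
    return n(lambda k: k + 1)(0)

three = succ(succ(succ(zero)))
to_int(three)  # 3
```

Everything here is function composition, which is the thread back to “calculus calculus”: both traditions are, at bottom, formal systems for manipulating functions.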
edit: clarity
I’m pretty sure the term was coined in the interwar era, so it’s kind of interesting if people are just calling the concept of functions “lambda calculus” now. Obviously they’re much older than that.
Graphics programming is the most obvious one and it uses it plenty, but really any application that can be modeled as a series of discrete changes will most likely be using calculus.
Time series data is the most common form of this, where derivatives are the rate of change from one time step to the next and integrals are summing the changes across a range of time.
But it can even be more abstract than that. For example, there’s a recent-ish paper on applying signal processing techniques (which use calculus themselves, btw) to databases for the purposes of achieving efficient incremental view maintenance: https://arxiv.org/abs/2203.16684
The idea is that a database is a sequence of transactions that apply a set of changes to said database. Integrating gets you the current state of the database by applying all of the changes.
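That “integrate the transaction stream to get the state” idea can be sketched in a few lines (illustrative only, not the actual DBSP implementation from the linked paper). Each transaction is a delta: +1 inserts a row, -1 deletes it, and folding the deltas together recovers the current database:

```python
from collections import Counter

def integrate(deltas):
    """Fold a stream of deltas (signed multiset changes) into the current state."""
    state = Counter()
    for delta in deltas:
        state.update(delta)  # applying one delta = one integration step
    return +state            # unary + drops rows whose count fell to zero

# Each transaction is a set of signed row changes.
txns = [
    {"alice": +1, "bob": +1},  # insert alice, bob
    {"bob": -1},               # delete bob
    {"carol": +1},             # insert carol
]
integrate(txns)  # Counter({'alice': 1, 'carol': 1})
```

Incremental view maintenance then amounts to pushing the deltas through the view’s query instead of recomputing the whole integral each time.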
that can’t be right. maybe they meant lambda calculus? programmers are definitely good at applied logic, graph theory, certain kinds of discrete math etc. but you’re not whipping out integrals to write a backend.
Any function that relies on change over a domain is reliant on concepts that are fundamentally calculus. Control systems, statistical analysis, data science, absolutely everything in networking that doesn’t involve calling people on the phone to convince them to give you their password, that is all calculus.
Many things that work with time series data use calculus all the time. Both derivatives and integrals are very useful in that context: derivatives being the rate of change at some particular time step, and integrals being the sum of the changes across a range of time steps.
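In plain Python, no libraries, that discrete derivative/integral pair looks like this (function names are just illustrative). The “derivative” is the change between consecutive samples and the “integral” is the running sum of those changes, and the two undo each other given the initial value:

```python
def derivative(series):
    """Change from each sample to the next (finite differences)."""
    return [b - a for a, b in zip(series, series[1:])]

def integral(changes, initial=0):
    """Running sum of changes, starting from an initial value."""
    total, out = initial, []
    for c in changes:
        total += c
        out.append(total)
    return out

prices = [100, 102, 101, 105]
deltas = derivative(prices)                    # [2, -1, 4]
rebuilt = integral(deltas, initial=prices[0])  # [102, 101, 105]
```

The same pattern shows up as `diff` and `cumsum` in most numerical libraries.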
There’s a pretty wide range of applications.
Lotta infinite sums in loops
Computers are just big calculators so to program them you need calculus.
good physics/graphics engines require calculus
How?
Again, legit question.
If you write them yourself, then you actually need a bit of math.
But claiming that you need math skills as a programmer because some kinds of programs need you to know maths is like claiming every programmer needs to know a lot about logistics because some people write software for warehouses.
A senior firmware engineer said to the group that we just have to integrate the acceleration of an IMU to get velocity. I said “plus a constant.” I was fired for it.
That sounds like it might be a gift in disguise.
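The “plus a constant” quip is literal, and it shows up numerically too: integrating an accelerometer trace only recovers velocity up to the unknown initial velocity v0 (sketch below; names and sample values are made up):

```python
def integrate_accel(accels, dt, v0=0.0):
    """Euler-integrate acceleration samples into a final velocity."""
    v = v0
    for a in accels:
        v += a * dt  # one integration step
    return v

accels = [1.0, 1.0, 1.0, 1.0]  # constant 1 m/s^2 over 4 samples
dt = 0.5

# Same acceleration trace, different answers depending on the constant:
integrate_accel(accels, dt)           # 2.0 (assumes starting from rest)
integrate_accel(accels, dt, v0=3.0)   # 5.0
```

An IMU alone can never tell you v0, which is why real systems fuse in GPS or another absolute reference, and why the correction deserved better than a firing.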
Had a graduate Dev who did not have a fucking clue about anything computer related. How tf he passed his degree I have no idea.
Basic programming principles? No clue. Data structures? Nope.
We were once having a discussion about the limitations of transistors and dude’s like “what’s a transistor?” ~_~#
Tbh, as a dev knowledge of transistors is about as essential as knowledge about screws for a car driver.
It’s common knowledge and in general maybe a little shameful to not know, but it’s really not in any way relevant for the task at hand.
Maybe for dev knowledge, but computer science? The science of computers?
What kind of cs degree did you get where you learned about electrical circuits. The closest to hardware I’ve learned is logic circuit diagrams and verilog.
I learned about transistors in Informatics class in high school. Everything from the bottom up: from the material that makes a transistor possible, to basic logic circuits (SR flip-flops, AND, OR, XOR, addition), to the von Neumann architecture, a basic microprocessor, and machine code and assembly.
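That bottom-up progression can be shown in miniature: basic gates composed into a one-bit full adder, the same construction that scales up into the adders inside a real ALU (Python is used here purely as a gate simulator, not how hardware is actually described):

```python
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """One-bit full adder built only from basic gates."""
    s1 = XOR(a, b)
    total = XOR(s1, carry_in)
    carry_out = OR(AND(a, b), AND(s1, carry_in))
    return total, carry_out

full_adder(1, 1, 1)  # (1, 1): 1 + 1 + 1 = 3 = binary 11
```

Chain the carry from one adder into the next and you have multi-bit addition, which is roughly where the curriculum above was heading.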
I mean, I graduated over 20 years ago now, but I had to take a number of EE courses for my CS major. Guess that isn’t a thing now, or in a lot of places? Just assumed some level of EE knowledge was required for a CS degree this whole time.
I got my BS in CSci about 15 years ago and it was 100% about programming in Java. We didn’t learn a fucking thing about hardware, and my roommate was an EE major and we had none of the same classes except for calculus.
By the time I graduated, Java was basically dead. Thanks, state college.
My CS program had virtually no programming outside a couple of courses where C was used to implement concepts. Had one applications type course where mostly Java was used.
CS is and should be a specialized math curriculum IMO. Teaching specific programming languages is time that would be better spent teaching theory that can’t be taught by dev docs or code bootcamps, as exemplified by your anecdote. Unfortunately nowadays people tend to see degrees as glorified job training programs.
Yeah, EE and CS had a lot of cross over where I went. At least in undergrad, grad school saw them diverge a lot more, but they still never disentangled, parts of each were important to both. Hell we had stuff like A+ labs, and shit.
Java isn’t dead, though
In my uni they kinda just teach Java. There is one mandatory class that’s in C and one that’s in MIPS assembly tho.
Everyone used AI when I took those classes. By the end of the year they were still having trouble on groupchat with syntax stuff.
In my own uni’s coursework the closest we get are some labs where students breadboard some simple adder circuits, which we do just to save them from embarrassing gaps in their knowledge (like happened in the initial comment). It doesn’t add much beyond a slightly better understanding of how things can be implemented, if we’re being honest.
I don’t have a degree
Well, computer science is not the science of computers, is it? It’s about using computers (in the sense of programming them), not about making computers. Making computers is electrical engineering.
We all know how great we IT people are at naming things ;)
Computational theory would be a better name, but it overlaps with a more specific subset of what is normally called CS.
My BS in CS took its roots down to CMOS composition of logic gates and basic EE on the hardware side, and down to deriving numbers and arithmetic from Boolean logic / predicate calculus on the philosophy side. Then it tied those up together through the theoretical underpinnings of computation and problem solving, like a trunk, and branched back out into the various mainstream technologies that derived from all that. It obviously all depends on the program at the school of choice, I suppose, and I’m sure it’s evolved over the years. But it still seems important to have at least some courses that pull back the wizard’s curtain, to ensure students really see how it’s all just an increasingly elaborate, high-tech version of conceptually simple (in function) machinery carrying out fundamental building blocks of logic.
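“Deriving arithmetic from Boolean logic” fits in a dozen lines: a ripple-carry adder that computes integer addition using nothing but bitwise logic (an illustrative sketch, not how any real ALU is laid out):

```python
def add(a, b, width=8):
    """Add two non-negative ints using only AND/OR/XOR, bit by bit."""
    carry, result = 0, 0
    for i in range(width):
        x = (a >> i) & 1
        y = (b >> i) & 1
        s = x ^ y ^ carry                    # sum bit of a full adder
        carry = (x & y) | ((x ^ y) & carry)  # carry bit, rippled to the next position
        result |= s << i
    return result

add(19, 23)  # 42, computed one bit at a time
```

Note that at 8 bits wide it even reproduces hardware overflow: `add(255, 1)` wraps around to 0, exactly like a real fixed-width register.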
Anyway, I’m going to go sniff my own cinnamon roll scented farts while gazing in the mirror, now.
We did the same thing, going so far as to “build” a simple imaginary CPU. It was interesting but ultimately dead knowledge.
I built an emulator for that CPU, which the university course took over and used for a few years for the course. But after that I never did anything with logic gates or anything like that.
I got into DIY electronics later on as a hobby, but even then I never used logic gates and instead just slapped a cheap microcontroller on to handle all my logic needs.
I do use transistors sometimes e.g. for amplification, but we didn’t learn anything about that in university.
In the end it feels like learning how to theoretically mine sand when studying to become an architect. Interesting, but also ultimately pointless.
Informatics is a much better name imo
I see there’s a fellow German speaker ;)
I do agree though!
Is that not the difference between a computer science and a computer engineering degree?
If you want someone to know about the physical properties of transistors, find an electrical engineer.
Ok, but he didn’t know what a transistor is. Like I get not knowing the mechanics or chemistry of it, but to literally not know it or how it applies to a computer boggles my mind.
I’ve met people like that too.
It’s called cheating, lots of people do it.
Most worthless dev I’ve met was a graduate of comp sci who couldn’t hold a candle to a guy that did a dev boot camp.
The best dev I’ve met so far didn’t even have any credentials whatsoever, second next best did 2yr associates.
Tie for 3rd best with associate’s and 4yr degree.
I was partnered with that guy for one class in grad school. We were working on a master’s degree in software engineering, and the assignment was analysis and changes to an actual code base, and this mofo was asking questions and/or blanking on things like what you mention. I can’t remember the specifics but it was some basic building block kind of stuff. Like what’s an array, or what’s a function, or how do we send another number into this function. I think the neurons storing that info got pruned to save me the frustrating memories.
I just remember my internal emotional reaction. It was sort of “are you fucking kidding me” but not in the sense that somebody blew off the assignment, was rude, or was wrong about some basic fact. I have ADHD and years ago I went through some pretty bad periods with that and overall mental & physical health. I know the panic of being asked to turn in an assignment you never knew existed, or being asked about some project at work and just have no idea whatsoever how to respond.
This was none of those. This was “holy shit, this guy has never done anything, how did he even end up here?”
Could be a case of bad memory. Solved the exams and forgot everything in the next hour.
Depends on the context. When my company proposes me to a client for work I am, but oddly during my yearly performance review I am just some schmuck who programs.
I’m something of a scientist myself
I literally have no idea what this picture means, and at this point I’m too afraid to ask.
The typical holder of a four-year degree from a decent university, whether it’s in “computer science”, “datalogy”, “data science”, or “informatics”, learns about 3-5 programming languages at an introductory level and knows about programs, algorithms, data structures, and software engineering. Degrees usually require a bit of discrete maths too: sets, graphs, groups, and basic number theory. They do not necessarily know about computability theory: models & limits of computation; information theory: thresholds, tolerances, entropy, compression, machine learning; foundations for graphics, parsing, cryptography, or other essentials for the modern desktop.
For a taste of the difference, consider English WP’s take on computability vs my recent rewrite of the esoteric-languages page, computable. Or compare WP’s page on Conway’s law to the nLab page which I wrote on Conway’s law; it’s kind of jaw-dropping that WP has the wrong quote for the law itself and gets the consequences wrong.
I’d honestly be interested where you’re from and how it is in other parts of the world. In my country (or at least at my university), we have to learn most of what you described during our bachelor’s. For us there is not much focus on programming languages though, and more on concepts. If you want to learn programming, you are mostly on your own. The theories we learned are a good base though.
I meant the guy in the picture, but thanks anyway
I mean, nowadays you need to be very smart and educated to google efficiently and avoid all the AI traps, misinformation, stackoverflow mods tripping, reading reddit threads on an issue with half the comments deleted because of the APIcalypse etc… sooo you could argue that you’re somewhat of a scientist yourself
Had a discussion with my 8yo niece the other day… turned out the lesson was, sometimes it can be worse to know the wrong thing than to know nothing at all.
If a C- is enough to pass Analysis of Algorithms, then a Computer Science degree can make me a Computer Scientist. :P
You need C++ for computer science, though!
Be me, a computer scientist who still struggles with XOR.
Wait til you see XNAND
My favorite was always XANEX
what the fuck does that one do
Turns all your zeros into ones.
I have been coding since I was 10 years old. I have a CS degree and have been in professional IT for like 30 years. Started as a developer but I’m primarily hardware and architecture now. I have never ever said I was a computer scientist. That just sounds weird.
Yeah you’d really only say it on the theoretical side of things, I’ve definitely heard it in research and academia but even then people usually point to the particulars of their work first
I mean, I am applying various kinds of science, but I’m not actually doing any science, so I don’t think of myself as a scientist. What I do is solve problems - I’m an engineer.