Everything changed when they removed ham from the menu
“Don’t mention the war”
Ok, I take it back. I just got some peach Jinro soju, it’s 13% ABV and doesn’t taste like alcohol at all. Even less than the Sake. Definitely try it.
Soju tastes kinda like vodka. I think sake is a good balance of “high enough alcohol” and “doesn’t taste like paint thinner.” Not a connoisseur by any means but a bottle of Kurosawa is cheap and tastes pleasant and fruity, and has around 14% ABV.
1000 or so moves
Ok, this nerd-snipes me a bit. The real point of the Towers of Hanoi problem, at least if you’re doing it in a discrete math class, is to count exactly the number of moves for n discs. Call this number f(n).
To make this post easier to read, we will label the discs from 1 to n, with 1 being the bottom disc and n being the top. We will also label the pegs A, B, and C, with A being the starting peg and C being the target peg. This lets us describe moves in a compact notation: 3: A -> C means disc 3 moves from A to C.
For n=0, we vacuously need 0 moves to win.
For n=1, we need just 1 move: 1:A->C.
For n=2, we can do it in 3 moves, so f(2) = 3. The sequence is:
2: A -> B
1: A -> C
2: B -> C
Now suppose we know f(n) for n between 1 and N, with N >= 2. For n=N+1, we can move N+1: A -> C and attempt to use our strategy for moving the remaining N discs from A to C, shuffling disc N+1 around as needed. Since it’s the smallest disc, we can always move it to any peg, and any time we want to move something to the peg it’s on, it must be moved first. Let’s see how that plays out for N=2:
3: A -> C
2: A -> B
3: C -> B
1: A -> C
3: B -> A
2: B -> C
3: A -> C
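This recursive strategy is easy to mechanize. A minimal sketch (the function name is my own), using the labeling convention above, with disc 1 the bottom (largest) disc:

```python
def hanoi_moves(m, n, src, dst, aux, moves):
    """Move discs m..n (m largest, n smallest) from src to dst via aux."""
    if m > n:
        return
    hanoi_moves(m + 1, n, src, aux, dst, moves)  # clear the smaller discs out of the way
    moves.append(f"{m}: {src} -> {dst}")         # move the largest disc of this group
    hanoi_moves(m + 1, n, aux, dst, src, moves)  # bring the smaller discs back on top

moves = []
hanoi_moves(1, 3, "A", "C", "B", moves)
print(len(moves))  # 7
print(moves)
```

Running it for 3 discs reproduces the 7-move sequence, and the move count for any n matches the f(n) we are about to compute.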
We hazard a guess that every move of the N-disc strategy is preceded by a move of disc N+1, plus one final move of disc N+1 at the end. In other words, f(N+1) = 2f(N) + 1. Some careful analysis will justify this guess.
Now we have a recurrence relation for f(n), but how to solve it? A classical technique is “guess the formula magically, then prove it by induction.” It’s certainly doable here if you compute a few values by hand:
f(2) = 3
f(3) = 2(3) + 1 = 7
f(4) = 2(7) + 1 = 15
f(5) = 2(15) + 1 = 31
f(6) = 2(31) + 1 = 63
You’ll probably see it right away: f(n) = 2^n - 1. Indeed, we can prove this by induction: the base case n=2 holds as f(2) = 3 = 2^(2) - 1, and if f(n) = 2^n - 1 then f(n+1) = 2f(n) + 1 = 2(2^n - 1) + 1 = (2^(n+1) - 2) + 1 = 2^(n+1) - 1 as desired.
We may conclude f(12) = 2^(12) - 1 = 4095. In other words, with 12 discs, exactly 4095 moves are required.
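A throwaway sketch that checks the recurrence against the closed form up through n=12:

```python
f = 1  # f(1) = 1
for n in range(2, 13):
    f = 2 * f + 1          # the recurrence f(n) = 2f(n-1) + 1
    assert f == 2**n - 1   # the closed form
print(f)  # 4095
```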
–
Bonus: an alternative to the “guess and prove” technique that is generalizable to a broad class of recurrence relations. The technique is called “generating functions.” Given the sequence f(n), we consider the formal power series
F(x) = f(0) + f(1)x + f(2)x^(2) + … + f(n)x^(n) + …
This F(x) is the “ordinary generating function” for the sequence f(n). Strictly speaking, it may not be a well-defined function of x since we haven’t made any assumptions about convergence, but we will assume (for now, and prove later) that the set of such formal power series behaves algebraically much like we expect. Namely, given the recurrence relation above, we can write:
F(x) - f(0) - f(1)x = f(2)x^(2) + f(3)x^(3) + f(4)x^(4) + … + f(n)x^n + …
= (2f(1) + 1)x^(2) + (2f(2) + 1)x^(3) + … + (2f(n-1) + 1)x^(n) + …
= 2x(f(1)x + f(2)x^(2) + … + f(n)x^(n) + …) + (x^(2) + x^(3) + … + x^(n) + …)
= 2x(F(x) - f(0)) + x^(2)/(1-x)
In our case, we have f(0) = 0, f(1) = 1, f(2) = 3 so we can write more succinctly:
F(x) - x = 2xF(x) + x^(2)/(1-x)
Solving for F,
F(x)(1 - 2x) = x + x^(2)/(1-x)
= x(1 + x/(1-x))
F(x) = x(1 + x/(1-x))/(1 - 2x)
= x/(2x^(2) - 3x + 1)
Ok, great. We’ve found that our generating function, convergence notwithstanding, is that rational function. We can use partial fraction decomposition to write it as
F(x) = 1/(1 - 2x) - 1/(1-x)
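As a sanity check on the decomposition, we can compare the two expressions at a few rational points (away from the poles x = 1 and x = 1/2) using exact arithmetic. A throwaway sketch:

```python
from fractions import Fraction

for p in (Fraction(1, 3), Fraction(1, 5), Fraction(-2, 7)):
    lhs = p / (2 * p**2 - 3 * p + 1)      # x/(2x^2 - 3x + 1)
    rhs = 1 / (1 - 2 * p) - 1 / (1 - p)   # partial fraction form
    assert lhs == rhs
print("decomposition checks out")
```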
which has the advantage of telling us exactly how to compute the coefficients of the Taylor series for F(x). Namely,
1/(1-x) = 1 + x + x^(2) + … + x^(n) + …
1/(1 - 2x) = 1 + 2x + 4x^(2) + … + 2^(n) x^(n) + …
So F(x) = (1-1) + (2-1)x + (4-1)x^(2) + … + (2^(n)-1)x^(n) + …
The nth coefficient of the Taylor series for F about 0 is 2^(n)-1, and by the definition of F as the ordinary generating function for f(n), we have f(n) = 2^(n) - 1. (The rigorous justification for ignoring convergence here still needs to be done; for now, this can be seen as a useful magic trick.)
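The coefficients can also be read off F(x) = x/(2x^(2) - 3x + 1) directly, with no partial fractions: multiplying through by the denominator gives a(n) - 3a(n-1) + 2a(n-2) = 0 for n >= 2, with a(0) = 0 and a(1) = 1 coming from the numerator x. A minimal sketch:

```python
a = [0, 1]  # a(0) = 0, a(1) = 1, from (2x^2 - 3x + 1) F(x) = x
for n in range(2, 13):
    a.append(3 * a[n - 1] - 2 * a[n - 2])  # coefficient recurrence
assert all(a[n] == 2**n - 1 for n in range(13))
print(a)  # [0, 1, 3, 7, 15, ..., 4095]
```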
Not necessarily do logic, but mimic it, the way it can mimic coherent writing and basic conversation despite being only a statistical token muncher. The hope is that there’s sufficient information in the syntax to model the semantics, in which case a sufficiently complex and well-trained model of the syntax is also an effective model of the semantics. This apparently holds up well for general language tasks, meaning “what we mean” is well modeled by “how we say it.” It’s plausible, at face value, that rigorous argumentation is also a good candidate, which would give language models some way of mimicking logic by talking through a problem. It’s just not very good in practice right now. Maybe a better language model could do better; maybe not at a reasonable cost.
Yeah, it’s a matter of convention rather than opinion really, but among US academia the convention is to exclude 0 from the naturals. I think in France they include it.
Seasons 1 and 3 are also missing 1 or 2 episodes each because a guest star’s royalty agreement renegotiation fell through
Well now we have to find you good Miku songs just in case you are trans.
If “AI art is kinda bad” is part of the argument against AI art, it will only be used against us when AI art isn’t that bad anymore.
It’s a different situation; as a dev, I’d happily bet my life on this assumption.
Dropping support for that stuff means breaking 95% of the websites people currently use. It’s a non-starter, it cannot ever happen, even if you think it would be for the best.
“Put the excess energy into batteries” is an idea, and is already pretty much what is done, but the large-scale implementation still requires a lot of time, effort, and expense.
The standard .NET C# compiler and CLI run on and build for Windows, macOS, and Linux. You can run your ASP.NET webapps in a Linux docker container, or write console apps and run them on Linux; it doesn’t matter anymore. As a .NET dev I have literally no reason to ever touch Windows, unless I’m touching legacy code from before .NET Core or building a Windows-exclusive app using a Windows app framework.
Ok, there’s no such thing as native Windows apps for Linux, but there are cross-platform GUI frameworks like Avalonia and Uno that can produce apps with a polished, identical experience across all platforms, no Electron needed.
I dunno if you’re joking, but yeah, there are IDE plugins that do this. GitHub Copilot grabs context from files in your edit history, and you can tell it to edit, refactor, “fix,” etc. selections. The more complex the action, the less likely it is to succeed, though.