• 1 Post
  • 188 Comments
Joined 2 years ago
Cake day: June 13th, 2023


  • I’d say don’t be hesitant to try to get her into things. Don’t push it repeatedly, but if she’s genuinely never heard of, for example, South Park, just show her an episode. If she doesn’t like it, that’s that; it’s not your fault or anything, and it sounds like she’s at least willing to give things a shot for you.

    Then, of course, try to find things you’ll both like. But do it together, because it’s more fun that way and it sucks to feel like you’re the only one trying.

    But also, maybe you don’t have a ton of interests to share and you just enjoy each other’s company, and that’s fine 🤷




  • Having tried simple bidets in warm, cold, and neutral-ish climates, I find that cold-water bidets seem to stiffen the poo bits and make it hard to actually get them off your butt, especially since they stick to the hairs. You and I might be talking about different levels of cold, though.



  • You should give Claude Code a shot if you have a Claude subscription. I’d say this is where AI actually does a decent job: picking up human slack, under supervision, not replacing humans at anything. AI tools won’t suddenly be productive enough to be employed on their own, but as a professional I can use them to accelerate my own workflow. That’s also where the real risk of them taking jobs lies: instead of 10 support people, for example, you can have 2 who just supervise the responses of an AI.

    But of course, the Devil’s in the details. The only reason this is cost-effective is that VC money subsidizes and hides the real cost of running these models.




  • Compilation is CPU bound and, depending on the language, mostly single-core per compilation unit (i.e. in LLVM-based toolchains that’s roughly per source file). Incremental compilations will probably only touch a file or two at a time, so the biggest benefit comes from higher single-core clock speed, not higher core count. So you want to focus on CPUs with higher clock speeds; there’s a rough timing sketch below.

    Also, high-speed disks (NVMe, or at least a regular SSD) give you performance gains for larger codebases.
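    A rough way to see this on your own machine, assuming a Make-based project on a Linux-ish setup with GNU make and nproc available (the src/main.c path below is made up, not from the original comment): time a clean parallel build against an incremental rebuild after touching a single file.

    ```python
    import subprocess
    import time

    def timed(cmd):
        """Run a shell command and return how long it took in seconds."""
        start = time.perf_counter()
        subprocess.run(cmd, shell=True, check=True)
        return time.perf_counter() - start

    # Clean build: many compilation units compile in parallel, so core count helps.
    subprocess.run("make clean", shell=True, check=True)
    full_build = timed("make -j$(nproc)")

    # Incremental rebuild: touching one file recompiles roughly one compilation unit,
    # which runs on a single core, so clock speed dominates here.
    subprocess.run("touch src/main.c", shell=True, check=True)  # hypothetical path
    incremental = timed("make -j$(nproc)")

    print(f"clean build: {full_build:.1f}s, incremental rebuild: {incremental:.1f}s")
    ```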




  • I think the main barriers are context length and the data just not really existing. By context length I mean useful context: GPT-4o has “128k context”, but it’s mostly sensitive to the beginning and end of the context and blurry in the middle, which is consistent with other LLMs. And how many large-scale, well-written, well-maintained projects are really out there? Orders of magnitude fewer than there are examples of “how to split a string in bash” or “how to set up validation in Spring Boot”. We might “get there”, but it’ll take a whole lot of well-written projects first, written by real humans, maybe with the help of AI here and there. Unless, that is, we build it with the ability to somehow learn and understand faster than humans.




  • jcg@halubilo.social to Memes@lemmy.ml · "They" are on it 24/7

    I don’t mind a “whoops, somebody fucked right up” error message if you let me click a button for more details. Or at the very least, give me a reference number I can tell somebody about. Some “software companies” don’t even properly log things on their end, so nobody can solve shit.
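    To make that concrete, here is a minimal sketch (Python, all names made up, not from the original comment) of the reference-number idea: log the full details server-side under a generated reference, and show the user only the friendly message plus that reference so support can actually find the failure later.

    ```python
    import logging
    import traceback
    import uuid

    logging.basicConfig(filename="app.log", level=logging.ERROR)

    def handle_request(do_work):
        """Run a request handler; on failure, log details and return a friendly message."""
        try:
            return do_work()
        except Exception:
            ref = uuid.uuid4().hex[:8]  # short reference number to show the user
            # Full stack trace goes to the server-side log, keyed by the reference number.
            logging.error("ref=%s\n%s", ref, traceback.format_exc())
            # The user still gets the friendly version, but with something actionable.
            return f"Whoops, somebody fucked right up. Reference: {ref}"
    ```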