Owner and admin of blimps.xyz

I’m a dorky inflatable latex coyote! Linux nerd, baker, some 3D things as I learn. Also love latex. The material, not the typography thing.

KeyOxide: openpgp4fpr:ef9328927969d342939bbb2718817244ed315340

  • 0 Posts
  • 25 Comments
Joined 2 years ago
Cake day: July 14th, 2023

  • I think the problem is that you think you’re talking like a time traveler heralding the wonders of sliced bread, when really it’s more like telling a small Victorian child about the wonders of Applebee’s, and then, in the impossible chance they survive long enough to see it, finding everything is a lukewarm, microwaved, pale imitation of just buying the real thing at Aldi and cooking it yourself in less time, for far tastier results at a fraction of the cost.


  • If you want to argue in favor of your slop machine, you’re going to have to stop making false equivalences, or at least understand why they’re false. You can’t gain ground on points that are only tangential.

    A computer in 1980 was still a computer, not a chess machine. It did general-purpose processing, following whatever you guided it to do. Neural models don’t do that, though; they’re each highly specialized and take a long time to train. And the issue isn’t with neural models in general.

    The issue is neural models being purported to do things they functionally cannot, because that’s not how models work. Computing is complex, code is complex, and adding new functionality that operates off fixed inputs alone is hard. And now we’re supposed to buy that something that builds word-relationship vector maps (there’s a toy sketch of the idea at the end of this comment) is going to create something new?

    For code generation, it’s the equivalent of copying and pasting from Stack Overflow with a find/replace, or just copying multiple projects together. It isn’t something new, it’s kitbashing at best, and that’s assuming it all works flawlessly.

    With art, it’s taking creation away from people, and taking their jobs. I like that you ignored literally every point raised except for the one you could dance around with a tangent. But all these CEOs are saying “no one likes creating art or music”. No: THEY just don’t want to spend time creating themselves or pay someone who does enjoy it.

    I love playing with 3D modeling and learning how to make the changes I want consistently. I like learning more about painting when texturing models and taking the time to create intentional masks. I like taking time when I’m baking to learn and create; otherwise I could just buy a Duncan Hines box mix and end up with something that’s fine, but not something I made by taking the time to learn.

    And I love learning guitar. I love feeling that slow growth of skill as I find I can play cleaner the more I do. And when I can close my eyes and strum a song, there’s a tremendous feeling from making this beautiful instrument sing like that.
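    To make the word-relationship-vector-map bit concrete, here’s a minimal toy sketch in plain Python (the three-number vectors are made up for illustration, not any real model’s embeddings): relatedness falls out of comparing vector directions, and nowhere in that is there any notion of meaning or intent.

    ```python
    import math

    # Hypothetical toy embeddings. Real models learn vectors with thousands
    # of dimensions from their training data, but the principle is the same.
    vectors = {
        "bread":  [0.9, 0.1, 0.3],
        "baking": [0.8, 0.2, 0.4],
        "guitar": [0.1, 0.9, 0.2],
    }

    def cosine(a, b):
        # Similarity of direction: close to 1.0 means "shows up in similar
        # contexts". It says nothing about understanding or correctness.
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(y * y for y in b))
        return dot / (norm_a * norm_b)

    print(cosine(vectors["bread"], vectors["baking"]))  # high: related words
    print(cosine(vectors["bread"], vectors["guitar"]))  # low: unrelated words
    ```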







  • It does! I have my bedroom one controlled through it, and it even shows up as a play target for Spotify Connect. The speakers I used to plug into my phone (and briefly into a Raspi) to play music are now plugged into the 3.5mm jack on that one.

    My kitchen one I just leave as-is. I DID modify the ESPHome firmware on each, extending it to add an OLED (I think) clock display that also shows the remaining time on timers numerically. I do really like the built-in LED ring animation for timers though; it’s pretty slick!


  • I ended up picking up two of the Home Assistant Voice PE devices and I’ve been fairly happy with them. I even extended their firmware so each has a clock display, with one doubling as my bedroom alarm clock. But even the out-of-the-box functionality is totally solid, as long as you can either run faster-whisper on Home Assistant (or another box) or don’t mind their lighter device-control-only route.

    Plus music streaming to them (with an external speaker attached via the 3.5mm jack) is pretty good!



  • There’s a voice-control mode that’s even friendly to a Raspi 4 or 5, but it’s very simplistic: basically a super lightweight speech-to-text engine trained only on device names and aliases. Think the speech-to-text in late-2000s through early-2010s non-smart phones.

    Small models for faster-whisper will run even on the little Dell Micro i5-6500T that I have Home Assistant running on; it’s a little slow, but it absolutely works at a usable speed (there’s a minimal sketch of calling faster-whisper directly at the end of this comment). I currently run a larger model offloaded to my server, which has an RTX 2070 Super in it, but that’s only to make it perform more like Google did a long time ago, and it’s unused power most of the time.

    They’re trying to make it as accessible as possible, for sure. There are even options to use cloud STT and TTS (it’s included in the optional Home Assistant Cloud feature), but it’s definitely cool as hell to be able to talk to an open-source-design speaker, get a reply, and control any switches or lights, or even my thermostat and robo vacuum, without needing the Internet to work. As long as my Wi-Fi and HA box are up, I’ve got options!
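    For anyone curious, this is roughly what calling faster-whisper directly looks like in Python. It’s only a sketch: the model size, device, and audio file name are placeholders, and in practice Home Assistant’s Wyoming-based integration handles all of this for you.

    ```python
    from faster_whisper import WhisperModel

    # "small" with int8 quantization is comfortable on a modest CPU like an
    # i5-6500T; a larger model with device="cuda" offloads to a GPU such as
    # an RTX 2070 Super.
    model = WhisperModel("small", device="cpu", compute_type="int8")

    # Transcribe a recorded voice command (the path is a placeholder).
    segments, info = model.transcribe("voice_command.wav")

    print(f"Detected language: {info.language} ({info.language_probability:.2f})")
    for segment in segments:
        print(f"[{segment.start:.1f}s -> {segment.end:.1f}s] {segment.text}")
    ```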


  • I don’t think the point here was to say that the user did anything wrong. I think it’s simply pointing out how frustrating it is that Microsoft’s insistence on various things, as part of their EEE strategy, created this situation to begin with, and that it wouldn’t even have broken if not for that.

    I’m pretty sure the person you replied to was really just lamenting that this is what broke it, and that, fundamentally, Microsoft is getting exactly what they wanted as a result. It’s just frustrating.




  • “whether it’s telling the truth”

    “whether the output is correct or a mishmash”

    “Truth” implies an understanding that these models don’t have. Because of the underlying method they use, generating plausible-looking responses from patterns in training data, there is no “truth” or “lying”; they don’t actually “know” any of it. (There’s a toy sketch of the idea at the end of this comment.)

    I know this probably comes off as super pedantic, and it definitely is at least a little, but the anthropomorphism shown towards these things is half the reason they’re trusted.

    That and how much ChatGPT flatters people.
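    To illustrate the point, here’s a deliberately tiny toy sketch (a bigram counter, nowhere near a real LLM): it only learns which word tends to follow which in its training text and samples something plausible from that, so “true” and “false” never enter the picture.

    ```python
    import random
    from collections import Counter, defaultdict

    # Toy "training data": count which word tends to follow which, then
    # sample plausible continuations. There is no fact-checking anywhere.
    training_text = "the cat sat on the mat the cat ate the fish".split()

    following = defaultdict(Counter)
    for current, nxt in zip(training_text, training_text[1:]):
        following[current][nxt] += 1

    def next_word(word):
        counts = following.get(word)
        if not counts:  # dead end in the tiny corpus: restart anywhere
            return random.choice(list(following))
        words, weights = zip(*counts.items())
        return random.choices(words, weights=weights)[0]

    word = "the"
    output = [word]
    for _ in range(5):
        word = next_word(word)
        output.append(word)
    print(" ".join(output))  # plausible-looking, neither "true" nor "false"
    ```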


  • Python is compiled at “runtime” to bytecode for its VM; it’s just missing the ELF headers that Linux binaries built with gcc typically have.

    My point is just that it’s a stupid, worthless distinction when the other points about poor implementations of common language frameworks are plenty on their own, and it’s needlessly snobbish.

    As far as class variable references go, though, I wish more languages required self-referencing. In my eyes it makes it far clearer, at a glance at any given line of code, where the hell a value came from, as opposed to just a bare name. A keyword like self::variableName, or maybe more aptly a &self reference like a C++ pointer, would be very clear. Rust does this, and Rust is very much, by the original definition, a programming language rather than a scripting one. Even Java, which is definitely not a scripting language even though it still runs inside a virtual machine, uses this. I don’t personally like the keyword this versus self, but eh. (There’s a toy example at the end of this comment.)

    Though if you want a hammer in a screw-driven world, look no further than Electron. I think it puts even anyone else’s purposeful attempts at such to shame.
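    A quick toy example of both points: CPython compiles this method to bytecode (which the standard-library dis module can print), and the explicit self makes it obvious at a glance that seconds is instance state rather than a local or a global. The Timer class is made up purely for illustration.

    ```python
    import dis

    class Timer:
        def __init__(self, seconds):
            # Explicit self.*: clearly instance state, not a bare local name.
            self.seconds = seconds

        def remaining(self, elapsed):
            return self.seconds - elapsed

    # CPython already compiled this method to bytecode when the class was
    # defined; dis just shows the instructions (LOAD_FAST, LOAD_ATTR, ...).
    dis.dis(Timer.remaining)
    ```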





  • Struggling to read all the comments on mobile, so apologies if this is a duplicate, but if you need recipes: Tandoor Recipes. I use it for hosting my own edits of recipes. Since I do baking streams, it’s great to be able to easily drop a link in my stream for folks who want the same recipe, including any tips or variations I’ve added, or something I’ve kinda come up with myself based off a standard formula.

    Plus, using the Kitshn app on a tablet makes for an absolutely gorgeous kitchen companion for reading recipes. Split-screening between the recipe and the chat has been awesome. For real, Kitshn is absurdly polished for an open-source app.