• 0 Posts
  • 80 Comments
Joined 2 years ago
Cake day: June 17th, 2023

  • I’m not sure what you were trying, but this works for me:

    Never use hardware encoding. It is intended for real-time transcoding. There aren’t many settings that work, since it is just sending the file to the video card and letting it do its thing.

    Slower is better. If you set the software encoder to Very Slow, it will produce output with very high quality per megabyte. I generally don’t care if it takes twice as long to encode as to watch; I queue it up and let it run overnight.

    Choose the right codec. I like 10-bit HEVC because I know it will work on the clients I play from. When you rip a DVD using MakeMKV, the video will be MPEG-2, a codec designed in the 1990s; converting the file to a modern codec will save a lot of space. I don’t re-encode 4K UHD rips much, since I don’t want to risk losing the HDR or other color features I like when watching those files.

    Audio tracks: I will rip out audio tracks for languages I don’t speak, and descriptive audio tracks, but I go out of my way to label things like director commentaries. I don’t re-encode the audio tracks at all; you won’t save much disk space by messing with them compared to the video tracks.
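    As a concrete (hypothetical) sketch of those settings, an ffmpeg invocation along these lines does a slow, software, 10-bit HEVC encode; the file names and CRF value are placeholders to tune to taste:

    ```shell
    # Software 10-bit HEVC encode, quality first (placeholder paths and CRF):
    #   -map 0              -> keep every stream from the source
    #   -preset veryslow    -> slowest practical x265 preset, best quality per MB
    #   -c:a copy -c:s copy -> leave audio and subtitle tracks untouched
    ffmpeg -i input.mkv -map 0 \
      -c:v libx265 -preset veryslow -crf 20 -pix_fmt yuv420p10le \
      -c:a copy -c:s copy \
      output.mkv
    ```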




  • To be honest, that seems like it should be the one thing they are reliably good at. It requires just looking up info in their database, with no manipulation.

    That’s not how they are designed at all. LLMs are just text predictors. If the user inputs something like “A B C D E F”, then the most likely next word is “G”.

    Companies like OpenAI will try to add context to make things seem smarter, like priming the model with the current date so it won’t just respond with some date from its training data, or looking up info on specific people, but at their core they are just really big autofill text predictors.
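    A toy sketch of the idea (a simple bigram counter; nothing like a real LLM’s scale, but the same “predict the most likely next word” job):

    ```python
    from collections import Counter, defaultdict

    def train(corpus):
        # Count which word follows which in the training text.
        following = defaultdict(Counter)
        words = corpus.split()
        for cur, nxt in zip(words, words[1:]):
            following[cur][nxt] += 1
        return following

    def predict(following, word):
        # Return the most frequent successor seen in training, or None if unknown.
        if word not in following:
            return None
        return following[word].most_common(1)[0][0]

    model = train("A B C D E F G A B C D E F G A B X")
    print(predict(model, "F"))  # "G", because "G" followed "F" every time
    ```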


  • The network effect is real. You can have the best, most awesomely-designed social media platform ever and it will be useless if you are the only person on it.

    You can try to convince all your contacts to switch away from whatever app is causing the most evil today, but you also have to convince all of your contacts’ contacts and all of theirs as well.









  • I have a Yamaha YAS-209. We use our TV as a dumb one with a Shield TV plugged into it. The Yamaha lets us either run the Shield signal through it or just have one cable going into the bar from the TV. It also has a subwoofer that connects wirelessly, which we have sitting behind the couch.

    It sounds great and works well for what it is. The only problem we have is that when we change the volume, no indicator is displayed on the screen. Our Yamaha receiver in the media room does a good job with on-screen notifications, but the sound bar does not.



  • I have an anecdote that says the opposite. We got the same fridge, washer, and dryer from LG when we moved into our house 10 years ago and have had no problems with any of them. My wife hates that we got a model with the freezer as a drawer on the bottom and would have preferred a side-by-side, but nothing has broken.

    Our Bosch dishwasher, on the other hand, had a gasket start leaking during the pandemic, and it took the repair people 4 or 5 months to get a replacement in. I think they were redesigning a faulty part at the same time as all the supply chain issues, so we had a really bad time with that. It was only a couple of years old at the time and has worked ever since.




  • Side note: don’t use hardware acceleration with Tdarr. You will get much better encodes with software encoding, which is great for archival and saving storage.

    Use hardware acceleration with Jellyfin for transcoding on the fly for a client that needs it.

    If you know what your clients’ specs are, you can use Tdarr to re-encode everything to what they need, and then you won’t have to transcode anything with Jellyfin.
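    As a sketch of that split (ffmpeg commands with placeholder file names; the second one assumes an NVIDIA card):

    ```shell
    # Archival pass (the kind of job Tdarr would queue):
    # software x265, quality first, speed doesn't matter.
    ffmpeg -i movie.mkv -map 0 -c:v libx265 -preset slow -crf 20 \
      -c:a copy -c:s copy archive.mkv

    # Real-time pass (what Jellyfin does on the fly for a client that needs it):
    # hardware NVENC, speed first, quality per megabyte is worse.
    ffmpeg -i archive.mkv -c:v hevc_nvenc -preset p5 -b:v 8M -c:a aac live.mkv
    ```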



  • You are correct that if you are on the Moon and have a Cs-133 atom with you, a second will take that many transitions. And if you do the same thing on Earth, a second will take the same number of transitions.

    But things get weird when you are on Earth and observe a Cs-133 atom that is on the Moon. Because you are in different reference frames, traveling at different speeds, and sitting in different gravity wells, time moves at different rates. This means that, from your point of view on Earth, a local Cs atom will transition a different number of times in a second than one you are observing on the Moon.

    And it would all be reversed if you were on the Moon observing a clock back on the Earth.

    They already have to account for this with GPS satellites. They all have atomic clocks on board, but those don’t run at the same rate as clocks on the ground. The satellites are moving at great speed and are farther from the center of the Earth than us, so the software that calculates the distance from your phone to each satellite has to use Einstein’s equations to account for the change in the rate of time.
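    You can estimate those two effects with round numbers (the orbit radius, speed, and Earth constants below are approximate textbook values, not official GPS parameters):

    ```python
    # Rough GPS clock-rate estimate using first-order approximations:
    # special relativity (speed) slows the satellite clock,
    # general relativity (weaker gravity) speeds it up.
    C = 299_792_458.0   # speed of light, m/s
    GM = 3.986e14       # Earth's gravitational parameter, m^3/s^2
    R_EARTH = 6.371e6   # Earth radius, m
    R_SAT = 2.6571e7    # GPS orbit radius (~20,200 km altitude), m
    V_SAT = 3_874.0     # GPS orbital speed, m/s
    DAY = 86_400.0      # seconds per day

    sr_loss = (V_SAT**2 / (2 * C**2)) * DAY               # seconds lost per day
    gr_gain = (GM / C**2) * (1/R_EARTH - 1/R_SAT) * DAY   # seconds gained per day
    net_us = (gr_gain - sr_loss) * 1e6                    # net drift, microseconds/day

    print(f"SR: -{sr_loss*1e6:.1f} us/day, "
          f"GR: +{gr_gain*1e6:.1f} us/day, net: +{net_us:.1f} us/day")
    ```

    The gravitational speed-up (about 45 µs/day) outweighs the velocity slow-down (about 7 µs/day), so GPS satellite clocks are deliberately set to tick slightly slow before launch, by roughly 38 µs per day.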

    Relativity is weird.