

CrossCode is really nice on the deck


Also the notepad guy is immediately faxing their notes directly to the police and any advertising data brokers who ask for them


We’re writing an online banking service entirely in Brainfuck. Backend, frontend, even middle-end if we have to


I really want to play this game, but I can’t stomach brain-parasite stuff :(


On macOS, that program is sometimes just Finder trying to calculate a folder’s size


Of course, ssh working only after a reboot doesn’t actually solve the problem


IIRC SteamOS reinstalls itself on every update, so just updating the software will probably undo all of these changes.


Does anyone know how much more accurate this is compared to other interferometer gyroscopes, like fiber-optic ones?


Ooh, ouch. It was a pain in the ass to remove.
My advice: look up a video of how the microSD latch mechanism works. That will help you understand how to unlatch it if you need to. Use a thin sewing needle or two to carefully push/pull the card out. It takes a little while and requires some patience.


For Jellyfin? You really don’t need much. A Raspberry Pi can run Jellyfin if you don’t need to transcode anything. My main server only uses a 6th-gen i3, and Jellyfin runs perfectly for everything. Never tried going above 1080p though.


Another glorious day of not having to worry about my nice, stable Debian server. It runs on an old Dell thin client I got on eBay, which isn’t much, but it gets the job done.


Who do you even email for that? [email protected]?


Until every television can tell the difference between your eyes and a video camera, video DRM is never gonna be effective


Interesting! My LCD Deck hasn’t had any issues with fan noise. I don’t think I’ve ever even noticed the fan while playing. My SD slot is a little hit or miss though.
Speaking of which… everyone remember to REMOVE THE MICROSD CARD BEFORE REMOVING THE BACK PLASTIC PLATE!! For some reason it’s designed to snap the poor SD card right in half if you don’t. Lost a good 512 GB to that mishap 🥲


My only experience is with GPU-side OpenGL, so here goes:
Your GPU is a separate device designed to run simple tasks with a staggering amount of parallelization. What does that mean? Basically, every vertex and pixel on your screen needs to be processed before it can be displayed, and the GPU has a bunch of small cores that do all of that for every single frame your monitor outputs. A programmer defines all this using shaders. In OpenGL, the shader language is called GLSL.
In the OpenGL graphics pipeline, the CPU-side code defines which effects apply to which geometry in what order. For example, you may want to render every opaque object first and then draw the translucent objects on top with blending (opaque-first ordering is a very common technique). Maybe you’d want a different shadow map for each light-emitting object. Maybe you’d want a setting that defines how much bloom to draw to the screen. Maybe you want to provide textures for the GPU to access. The possibilities are endless.
On the GPU side, we write code in shaders. The shaders, written in GLSL, get compiled by your device-specific drivers into the machine code your hardware runs. OpenGL has several types of shader, but there are two main ones: vertex and fragment shaders.
Vertex shaders run first. They run on every vertex in the scene and do the math that puts each vertex in the correct position on screen. You can also assign “varying” values to each vertex that get passed down the pipeline to the later shader stages.
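To make that concrete, here’s a minimal vertex shader sketch in GLSL (version 330; the attribute and uniform names like aPosition and uViewProjection are just illustrative, they’d be whatever your CPU-side code binds):

#version 330 core

// Per-vertex inputs supplied by the CPU-side code (names are illustrative)
layout(location = 0) in vec3 aPosition;
layout(location = 1) in vec3 aColor;

// Transformation matrices uploaded as uniforms by the CPU-side code
uniform mat4 uModel;
uniform mat4 uViewProjection;

// A "varying" value handed down the pipeline to the fragment shader
out vec3 vColor;

void main() {
    // Place this vertex in clip space
    gl_Position = uViewProjection * uModel * vec4(aPosition, 1.0);
    // Pass the per-vertex color along for interpolation
    vColor = aColor;
}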
Between the vertex and fragment shaders, the GPU automatically saves work by clipping away any geometry that ends up off-screen and discarding any triangle that’s definitely not visible to the camera (this is called culling), and then fills in each triangle with pixels called fragments (in a process called rasterization). Each fragment also gets access to the varying values of its three vertices, interpolated across the face of the triangle (ie the closest vertex has the most influence).
After this, the fragment shaders run on every pixel/“fragment” on screen - this is where you’d compute effects like lighting and shadows and apply textures. The fragment shader determines the color of the pixel as it appears on your screen.
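And a minimal fragment shader sketch to pair with the one above - it just writes the interpolated varying out as the pixel color (a real one would sample textures, compute lighting, and so on):

#version 330 core

// The varying from the vertex shader, already interpolated
// across the triangle by the rasterizer
in vec3 vColor;

// The final color written for this fragment/pixel
out vec4 fragColor;

void main() {
    fragColor = vec4(vColor, 1.0);
}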
There are other specialized shaders you can add too, like geometry and tessellation shaders - but your GPU needs to be new enough to support them.


I just use “for EGsample” and “for EXample”
Also “In Eoutherwords”
But for those wondering: ie and eg are actually Latin abbreviations!
i.e. - id est - “that is”
e.g. - exempli gratia - “for example”


ie basically means “in other words”
eg and ex mean “for example”


Dell OptiPlexes regularly sell on eBay for like $50-60, and they often have 6th-gen Intel i5s inside