I figured rule of threes meant it was funnier to leave it out. 2017 would have been sad gooning to pornhub during the first trump nightmare.
Then 2027 could be sad gooning to ai hyperporn during the second trump nightmare.
Maybe I should have used 20-year jumps, but “2037, I am jerking off because there’s no food, and the internet is nothing but ai porn” didn’t seem as funny a point for the “time shattering” bit.
I thought this was going to go Watchmen for a moment. Like…
It is 1997, I am a young boy, I am jerking off to a grainy porno playing over stolen cinemax.
It is 2007, I am in my dorm, I am jerking off to a forum thread full of hi-res porno.
It is 2027, I am jerking off to an ai porno stream that mutates to my desires in real time. I am about to nut so hard that it shatters my perception of time.
Like all good sci-fi, they just took what was already happening to oppressed people and made it about white/American people, while adding a little misdirection by extrapolation from existing tech research. Only took about 20 years for Foucault’s boomerang to fully swing back around, and keep in mind that all the basic ideas behind LLMs had been worked out by the 80s, we just needed 40 more years of Moore’s law to make computation fast enough and data sets large enough.
As always though, “technology is ruining the kids” is actually cover for “our policies are ruining the kids, let’s blame technology!”
Am I saying AI should teach kids? Hell no, but this is a distraction. The article, via the teacher herself, outright states the problems. One, she has no time or motivation to work with the kids individually, so she relies on tech to get more done. Two, that’s because she’s teaching 160 students. Three, that’s because we’ve slashed education funding to practically nothing. And with DOGE destroying the DOE, it seems we’ll now be educating the children with literally nothing.
This isn’t an AI problem. It’s a “we don’t give a fuck about children or the future” problem.
Wildly, in C# you can do either, and they behave differently. I believe a bare throw
doesn’t touch the stack trace, it keeps the original trace intact, while throw e
resets the stack trace (stored on the exception object) to the point of the catch and rethrow.
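For illustration, a minimal self-contained sketch of the difference (the method names here are mine, not from the thread):

```csharp
using System;
using System.Runtime.CompilerServices;

public class RethrowDemo
{
    // NoInlining keeps the Boom() frame visible in the stack trace.
    [MethodImpl(MethodImplOptions.NoInlining)]
    public static void Boom() => throw new InvalidOperationException("original failure");

    public static void RethrowBare()
    {
        try { Boom(); }
        catch (InvalidOperationException)
        {
            // Bare rethrow: the exception keeps its original stack trace,
            // so Boom() still shows up as the origin.
            throw;
        }
    }

    public static void RethrowByVariable()
    {
        try { Boom(); }
        catch (InvalidOperationException e)
        {
            // Rethrowing the variable resets the stack trace to this line;
            // the Boom() frame is lost.
            throw e;
        }
    }

    public static void Main()
    {
        try { RethrowBare(); }
        catch (Exception e) { Console.WriteLine(e.StackTrace.Contains("Boom")); } // prints True

        try { RethrowByVariable(); }
        catch (Exception e) { Console.WriteLine(e.StackTrace.Contains("Boom")); } // prints False
    }
}
```

The distinction matters mostly for debugging: a trace logged after throw e points at the catch block instead of the code that actually failed. (If you need to rethrow somewhere else entirely and still keep the original trace, .NET also has ExceptionDispatchInfo.Capture for that.)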
In C#, you can only throw objects whose class derives from Exception.
This is incorrect. The C# is valid. A bare throw in a catch block simply rethrows the caught exception. Source: I’ve been writing C# for 20 years, also the docs.
I won’t pretend MS didn’t steal core concepts and syntax from Java, but I’ve always thought C# was much more thoughtfully designed. Anders Hejlsberg is a good language designer; TypeScript is also a really excellent language.
PETA isn’t going to like all those für loops
At least the names are extremely self-documenting. Some of those German variable names are long enough they might even be self-aware!
Words also just rotate around in popularity like any other fad. Remember synergy? Paradigm shifts? Thinking outside the box?
Academia isn’t immune to memes, far from it. In the semi-contained world of higher education, trends in words and phrases are even more pronounced and likely to spread.
If this is evidence of LLM usage, it could easily be the machines reflecting trends back at us. These things also pick up on subtle cues in your prompts to match your tone, so I wouldn’t rule out human influence either, in the prompts or in the RLHF process.
It’s an untenable situation because it’s so much bigger than the tech world and open source. FOSS fundamentally works on a communal model: everyone needs lots of software, no one can hope to write it all themselves, so what if we distributed the labor among the community so that everyone can work on some things important to them and the whole community benefits?
Then, capitalist businesses entered the picture and began using more and more open software as backbone for their enterprises. Government entanglements further complicate the picture, but fundamentally the capitalist mindset is incapable of building or maintaining our current technological base. It isn’t capable of maintaining or building our infrastructure either: almost all of that was built on government subsidies, socialism.
And now that vulture capitalism is the law of the land, everything is falling apart because there’s no more “slack” in the system where people can engage in personal socialism on projects like FLOSS, every bit of our time is being stolen to pad the numbers of capitalists.
This bleeds over into attitude as well. Every entitled user who thinks their personal issue is more important than any other concern is a trump or musk in miniature, believing that the blowhard bravado of our current government is a model for forcing work to get done rather than a death spiral there’s no pulling out of.
You want FLOSS software that’s good? You want less burden on maintainers? You want a safer, saner, more human-centric technology base? You want a better tech world?
Eat. The. Rich.
Imagine being a director at this company. One of your employees brings you a report showing that your most active users, the backbone of your business, have a huge overlap with rape reports. This would destroy the company, and you know they’ll fire you for bringing it up and suppress it anyway. So you just… forget… to bring it up at the next quarterly. You used to work at Uber, and before that you covered up how gambling and gaming companies float on a raft made of addicts, so this is well-practiced blindness.
It was never about consumer behavior, just another front in the culture war while the people who could actually do something about pollution hide behind endless discussions about straws.
I’m being sarcastic but not by much. Nordic countries do have much better digital id systems and the EU overall looks to be following their model.
He’s complaining that a number isn’t unique and is being poorly used, but the number isn’t supposed to be unique, and what he’s actually complaining about is that it’s not being used in exactly the way experts are specifically warned never to use it.
But on a second, stupider layer, this is the system those numbers originate from. So however they use them is how they’re supposed to be used.
But then, back above that first stupid layer, on an even more basic and surface level degree of stupid, the government definitely uses SQL databases. It uses just… so many of them.
It’s wild too. I’ve been in the hospital a lot lately and in addition to a bar-code wristband, every healthcare worker, before doing anything with me (the patient) will ask my full name and either birthday or address and then double-check it against the wrist band. This is to make sure, at every step, that they didn’t accidentally swap in some other patient with the same name. (Not so uncommon, lots of men have their father’s name.)
Meanwhile in like Iceland, everyone gets assigned a personal GPG key at birth, so you can just present your public cert as identification, not to mention send private messages and secure your state-assigned crypto-wallet. Not saying such a system is without flaw, but it seems a lot better than what we’re doing!
This is a good summary. I had to go pull up wikipedia on it since I roughly knew that social security was a national insurance/pension kind of system but am actually hazy on details.
The major issue with it as id (aside from DBAs’ gripes about it) is that credit agencies and banks started to rely on it for credit scores and loans. You see, the US has a social scoring system (what we always accuse China of), but the only thing it tracks is how reliable you are about paying off debts. So with your home address, name, and SSN, basically anyone can take out loans or credit cards in your name. This will then damage your credit score, making it harder to get loans, buy a home, rent property, or even get a job.
That’s why Americans are always concerned about having our identity stolen: because you don’t need a lot of info to financially ruin someone’s life.
I’m hardly the king of databases, but always using a surrogate key (either an auto-incremented integer or a random uuid) has served me pretty well over the years. I once had to engineer a combination of a sequential timestamp with a hash extension as a key for one legacy system (keys had to be unique but mostly sequential), and an append-only log store would have been a better choice than an RDBMS, but sometimes you make it work with what you have.
Natural keys are almost always a bad idea though. SSNs aren’t even natural, which is one pitfall: you’re implicitly relying on someone else’s data practices by assuming their keys are natural. But also, nature is usually both more unique than you want (every snowflake is technically unique) and less unique than you’d hoped (all living things share quite a lot of DNA), which means uniqueness ends up depending on how good your taxonomy is. With surrogate keys, by contrast, you can guarantee uniqueness, by definition, for your needs.
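To make the surrogate-key point concrete, a minimal sketch (the Person/PersonStore names are hypothetical, not from any real system): identity comes from a Guid we mint ourselves, and the SSN is just a stored attribute that can be wrong, duplicated, or reissued without breaking any references.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical example: the record's identity is a key we mint, not the data.
public record Person(Guid Id, string Name, string Ssn);

public class PersonStore
{
    private readonly Dictionary<Guid, Person> _byId = new();

    public Person Add(string name, string ssn)
    {
        // Uniqueness is guaranteed by construction, not by hoping the
        // "natural" attributes (name, SSN) never collide or change.
        var person = new Person(Guid.NewGuid(), name, ssn);
        _byId[person.Id] = person;
        return person;
    }

    public Person Get(Guid id) => _byId[id];
}
```

Two people with identical names and even a reused SSN still get distinct keys, and correcting a bad SSN becomes an update to one row instead of a cascade through every table that references it.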
I’m sure folks on here know this, but you know, there’s also that 10K a day that don’t so…
What makes this especially funny, to me, is that SSN is the literal textbook example (when I was in school, anyway) of a “natural” key that you absolutely should never use as a primary key. It is often the representative example of the kind of data that seems like it’d make a good key but will absolutely fuck you over if you use it as one.
SSN is not unique to a person. They get reused after death, and a person can have more than one in their lifetime (if your id is stolen and you arduously go about getting a new one). Edit: (See responses) It seems I’m misinformed about SSNs, apologies. I have heard from numerous sources that they are not unique to a person, but the specifics of how it happens are unknown to me.
And they’re protected information due to all the financials that rely on them, so you don’t really want to store them at all (unless you’re the SSA, who would have guessed that’d ever come up though!?)
It’s so stupid that it would be hilarious if people weren’t dying.
Oh don’t worry, I get myself involved in plenty. I prefer to make problems at the architectural or “leadership” level though.