User agents can’t really be trusted anymore, since every browser stuffs every possible keyword into its string so that it never gets excluded by anything.
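To illustrate, here is what a typical Chrome-on-Windows user-agent string looks like (version numbers vary); the extra tokens are kept purely for compatibility with old browser-sniffing scripts:

```typescript
// A typical Chrome-on-Windows user-agent string (versions vary).
// Note it contains "Mozilla", "AppleWebKit", "Gecko" and "Safari"
// even though this is Chrome.
const chromeUA =
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 " +
  "(KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36";

// So naive keyword checks misfire:
console.log(chromeUA.includes("Safari")); // true, but this is Chrome
```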
Did you actually use all 7 ports at the same time? Computers had so many different ones because every accessory had its own port. It was a mess back then. Now everything is USB-C and MacBook Pros have 4 of them. Plenty of ports.
If you have no idea what Git is, that warning message is not telling you you’re about to delete 5000 files.
But I wonder if this person maybe does know about Git, because they used the word “stage”.
So then “bury” it behind a paywall. Why is that bad? A server costs money, so let the people who want to use that server pay their part. I see no problem with that.
I hated every second of it and it’s ugly as hell. Factorio is king.
(Trolling, because OP is so super sensitive about it)
Because pieces might break off and fall into the mix. Because the egg inside briefly touches part of the outside shell while it’s sliding out. Because you’re touching the egg with your hand and might accidentally touch something else before washing your hands. It’s just another layer of protection.
Affinity really is a great alternative, and it’s relatively cheap. There was a concern that when Canva bought the company they would force a subscription model on it, but apparently that’s not the case (yet).
Bring back chopping off hands for thieves.
Judging by the comments in this thread, this will be the one defining mistake that kills Microsoft.
Legendary talk
It does make sense when you mix them: you get the benefit of instant rendering and dynamic content all in one. And web dev becomes even more complicated…
There is no extra latency on static pages. They are rendered once as regular HTML and then saved on the server, immediately ready for the user. The server only does that initial data fetching and rendering once per page. If needed, it can be retriggered. This is great for blogs and other mostly static pages.
Server pages, on the other hand, do the initial fetch request on every visit, but once the page arrives, no data is missing and nothing has to be loaded after the fact. It’s not for everyone; regular dynamic pages still make sense, and every method has its use cases.
Disclaimer: I’m speaking from my experience with Next.js, which did the same thing long before; React now aims to make that easier. But I’m not sure if plain React has the same distinction between static and server. It’s all new, and I haven’t had a project to test it on yet.
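A minimal sketch of that static setup, assuming the Next.js App Router (the route, the API endpoint, and the revalidation interval are made up for illustration):

```tsx
// app/blog/page.tsx: a static page in the Next.js App Router.
// It is rendered once on the server and saved as plain HTML;
// `revalidate` lets that render be retriggered on a schedule.
export const revalidate = 3600; // re-render at most once per hour

export default async function BlogPage() {
  // Runs once per render on the server, not once per visitor.
  const res = await fetch("https://example.com/api/posts"); // hypothetical API
  const posts: { id: number; title: string }[] = await res.json();

  return (
    <ul>
      {posts.map((post) => (
        <li key={post.id}>{post.title}</li>
      ))}
    </ul>
  );
}
```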
It’s called Server Components. If you actually build a fully static website, there is no DOM modification going on. I wouldn’t recommend doing that with React, though, because it kind of defeats the purpose. The goal is to have a mix of both: the initial render is super fast because it’s prerendered once for everyone, then dynamic data is fetched where needed and those elements are replaced. It also improves SEO.
React 19 is not yet officially released, but you can read more about it here: https://react.dev/blog/2024/04/25/react-19
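A minimal sketch of that mix, assuming a framework that supports Server Components (the `LikeButton` component, `fetchArticle` helper, and API endpoint are made up for illustration):

```tsx
// Article.tsx: a React Server Component. It runs only on the server
// and ships no JavaScript of its own to the browser.
import LikeButton from "./LikeButton"; // a client component ("use client")

// Hypothetical data helper, standing in for a database or API call.
async function fetchArticle(id: string): Promise<{ title: string; body: string }> {
  const res = await fetch(`https://example.com/api/articles/${id}`);
  return res.json();
}

export default async function Article({ id }: { id: string }) {
  const article = await fetchArticle(id); // happens during the server render

  return (
    <article>
      <h1>{article.title}</h1>
      <p>{article.body}</p>
      {/* Only this interactive island hydrates in the browser. */}
      <LikeButton articleId={id} />
    </article>
  );
}
```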
I’m a React dev. You can create server-side websites, written in JS, that don’t require JS to be turned on in the browser. Granted, this only just became an official React feature, but it has already been available through React frameworks like Next.js.
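You can see the no-JS-in-the-browser part even without a framework, using React’s own server renderer; a minimal sketch:

```tsx
import { renderToStaticMarkup } from "react-dom/server";

function Greeting({ name }: { name: string }) {
  return <p>Hello, {name}!</p>;
}

// Emits plain HTML with no React runtime attached, so the result
// works in a browser with JavaScript disabled.
const html = renderToStaticMarkup(<Greeting name="world" />);
console.log(html); // <p>Hello, world!</p>
```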
It used to be huge.
That’s so sassy I kinda respect it.
Too bad for those who fall for it.
The manager has nothing to do with it, so there’s no reason to scream at them. You’re just ruining people’s day so you can feel like a hero.
So charge money for a notifications feature; what’s the big deal? If people don’t want it, they don’t have to pay.