“Depending on how Bluetooth stacks handle HCI commands on the device, remote exploitation of the backdoor might be possible via malicious firmware or rogue Bluetooth connections.”
I of course don’t know the details, but I’m basing my post on that sentence: “Backdoor may be possible via … rogue Bluetooth connections.”
You literally need physical access to the device to exploit it
You don’t need physical access. Read the article: the researcher used a physical USB connection to discover that the Bluetooth firmware has backdoors. Exploiting them doesn’t require physical access.
It’s Bluetooth that’s vulnerable.
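For background on what those HCI commands are: HCI is the standard host-to-controller command layer in Bluetooth, and vendor-specific commands sit under opcode group 0x3F. Here’s a minimal Python sketch of how such a command is framed; the OCF and payload are hypothetical placeholders, not the actual ESP32 opcodes from the research.

```python
# Illustrative only: how a vendor-specific HCI command packet is framed.
# The OCF and payload below are hypothetical, not the real ESP32 opcodes.
import struct

HCI_COMMAND_PKT = 0x01      # packet indicator for HCI commands
OGF_VENDOR_SPECIFIC = 0x3F  # opcode group reserved for vendor commands

def hci_vendor_command(ocf: int, params: bytes) -> bytes:
    """Frame an HCI command: indicator, 16-bit opcode (OGF<<10 | OCF), length, params."""
    opcode = (OGF_VENDOR_SPECIFIC << 10) | (ocf & 0x03FF)
    return struct.pack("<BHB", HCI_COMMAND_PKT, opcode, len(params)) + params

# Hypothetical vendor command with OCF 0x0001 and a two-byte payload.
print(hci_vendor_command(0x0001, b"\x01\x00").hex())
```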
He demonstrated that he is senile when he blamed the current trade agreements on the president who signed the last agreement. Trump signed the last agreement. He can’t remember.
That’s why he changes from tariff off to tariff on every week. He has no plan. He has lost his memory.
With Minecraft’s current monthly updates, Paper is always behind on the latest features. Its performance optimizations also introduce minor problems of their own.
Years ago Paper was critical for a good Minecraft experience, but any reasonably new PC (less than about six years old) runs vanilla just fine.
So if you’re planning to have the kid play on Switch or something like that, it’s not going to work.
You can run Geyser (a proxy that translates the Bedrock protocol) to let Bedrock clients play on your Java server.
Removing American products had very little to do with tariffs. It started after Trump threatened to invade Canada.
Which is weird because one of Rossmann’s sources claimed that they were on the phone with Brother, asked how to do manual registration, and were told it couldn’t be done unless a genuine Brother toner cartridge was installed.
“training a new model”
Is equivalent to “make a new game” with better graphics.
I’ve already explained that analogy several times.
If people pay you for the existing model you have no reason to immediately train a better one.
When you throw more hardware at them, they are supposed to develop new abilities.
I don’t think you understand how it works at all.
Data is collected. Training is done on the data. More training is done on top of that trained result (DeepSeek). You now have a model. That model is a static piece of software. It requires about 700 GB of RAM to run (DeepSeek). Throwing more hardware at the model does nothing but give you a quicker response.
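A toy sketch of that point, with made-up names and sizes (this is not any real model’s code): once the weights are frozen, extra hardware only splits the same work across more workers.

```python
# Toy illustration of "the model is a static program once trained".
# All names and numbers here are made up for the sake of the argument.
import numpy as np

rng = np.random.default_rng(0)

# "Training" happens once and produces fixed weights.
weights = rng.standard_normal((1024, 1024))  # frozen after training

def run_inference(prompt_vec: np.ndarray, n_workers: int = 1) -> np.ndarray:
    """Behaviour is fixed by `weights`; more hardware (n_workers) splits the
    work and answers faster, but the answer itself never changes."""
    chunks = np.array_split(weights, n_workers, axis=0)
    parts = [chunk @ prompt_vec for chunk in chunks]  # parallelisable work
    return np.concatenate(parts)

prompt = rng.standard_normal(1024)
assert np.allclose(run_inference(prompt, 1), run_inference(prompt, 8))
```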
If everyone pays you to use your model, you have no reason to develop a new one. Like Skyrim.
I’m supposed to be able to take a model architecture from today, scale it up 100x and get an improvement.
You can make Crysis run at a higher FPS. You can add polygons (remember ATI’s clown feet?). You can add detail to textures. https://research.nvidia.com/publication/2016-06_infinite-resolution-textures
But really the “game” is the model. Throwing more hardware at the same model is like throwing more hardware at the same game.
Which part of diminishing returns not offering as much profit did you not understand?
Current models give MS an extra 30% revenue. If they spend billions on a new model, will customers pay even more? How much more would you pay for a marginally better AI?
Current games have a limit. Current models have a limit. New games could scale until people don’t see a quality improvement. New models can scale until people don’t see a quality improvement.
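To put a rough number on that, here’s a back-of-the-envelope sketch using a Chinchilla-style power law; the constants are loosely modelled on published scaling-law fits but should be treated as illustrative only.

```python
# Back-of-the-envelope: loss under a Chinchilla-style power law,
# L(N) = E + A / N**alpha, with illustrative constants.
E, A, alpha = 1.7, 400.0, 0.34

def loss(n_params: float) -> float:
    return E + A / n_params**alpha

for billions in (10, 100, 1000):  # 10B -> 100B -> 1T parameters
    print(f"{billions:>5}B params: loss ~ {loss(billions * 1e9):.3f}")
# Each 10x in size shaves off a smaller slice of loss (~1.86 -> ~1.77 -> ~1.73),
# while the training cost grows far faster than 10x per step.
```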
Sorry, I meant WDM (wavelength-division multiplexing). Multimode was for home installs.
Just as games have diminishing returns on better graphics (it’s already photorealistic; few pay $2k for a GPU to render a few more hairs), AI has a plateau where it gives good-enough answers that people will pay for the service.
If people are paying you money and the next level of performance is not appreciated by the general consumer, why spend billions that will take longer to recoup?
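Rough payback arithmetic with entirely hypothetical numbers, just to make that incentive explicit:

```python
# Entirely hypothetical numbers; only the shape of the argument matters.
current_annual_ai_revenue = 10e9   # $10B/yr from the existing model (made up)
new_model_training_cost   = 5e9    # $5B to train the next one (made up)
price_uplift              = 0.05   # customers pay 5% more for "marginally better"

extra_annual_revenue = current_annual_ai_revenue * price_uplift
payback_years = new_model_training_cost / extra_annual_revenue
print(f"Payback on the new model: ~{payback_years:.0f} years")  # ~10 years
```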
And again data centers aren’t just used for AI.
If buying a new video card made me money, yes
But this rests on the supposition that not buying a video card makes you the same money. You’re forecasting free performance upgrades, so there’s no need to spend money now when you can wait and upgrade the hardware once the software improvements stop.
And that’s assuming it has anything to do with AI at all, rather than the long-term macroeconomics of Trump destroying the economy: MS is putting off spending because businesses will be slowing down due to the tariff war.
More efficient hardware use should be amazing for AI since it allows you to scale even further.
If you can achieve scaling with software, you can delay current plans for expensive hardware. If a new driver came out that gave Nvidia 5090 performance to games on GTX 1080-class hardware, would you still buy a new video card this year?
When all the Telcos scaled back on building fiber in 2000, that was because they didn’t have a positive outlook for the Internet?
Or when video game companies went bankrupt in the 1980s, that was because video games were over as entertainment?
There’s a huge leap between not spending billions on new data centers (which are used for more than just AI) and claiming that means AI is over.
Cancelling new data centers because DeepSeek has shown a more efficient path isn’t proof that AI is dead, as the author claims.
Fiber buildouts were cancelled back in 2000 because multimode made existing fiber more efficient. The Internet investment bubble popped. That didn’t mean the Internet was dead.
You missed the part where DeepSeek uses a separate inference engine to take the LLM output and reason through it to see if it makes sense.
No, it’s not perfect. But it isn’t just predicting text the way AI was a couple of years ago.
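Very loosely, the generate-then-check flow being described looks something like this; the functions are hypothetical stand-ins, not DeepSeek’s actual architecture or API.

```python
# Conceptual sketch of a generate-then-check loop; all names are hypothetical.
def draft_answer(prompt: str) -> str:
    """Stand-in for the base LLM's next-token prediction pass."""
    return f"draft response to: {prompt}"

def looks_consistent(prompt: str, draft: str) -> bool:
    """Stand-in for a reasoning pass that checks whether the draft holds up."""
    return prompt.lower().rstrip("?") in draft.lower()  # trivial placeholder check

def answer(prompt: str, max_rounds: int = 3) -> str:
    draft = ""
    for _ in range(max_rounds):
        draft = draft_answer(prompt)
        if looks_consistent(prompt, draft):
            break
    return draft  # best effort after a few rounds

print(answer("Does this reasoning make sense?"))
```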
The answer he gave was Firefox, but that seems out of date given their recent backtrack on not selling your data. His runner-up was Librewolf.
LLMs can now generate answers. Watch this:
Obsession with corporate sports is a problem, but many people enjoy it. I’m sure you have hobbies or activities that aren’t productive.
I don’t follow professional sports but I also don’t care if people aren’t interested in 3d printing, cats, cooking, sci-fi media or any of my other hobbies.
(Cooking falls into hobby because it’s not about survival but taste.)