If you do not have physical access, it is not yours. Trust absolutely no one.
- 4 Posts
- 70 Comments
Outwit1294@lemmy.today to Technology@lemmy.world • Human-level AI is not inevitable. We have the power to change course (English)
2 · 9 months ago
Yeah. Cheap labor is so much better than this bullshit.
Outwit1294@lemmy.today to Technology@lemmy.world • ChatGPT advises women to ask for lower salaries, study finds (English)
14 · 9 months ago
Step 2. Offer sexual favours
Pedophiles don’t deserve to die in jail, that is the easy way out. They deserve the shame, humiliation and sodomy till they die.
Outwit1294@lemmy.today to Leopards Ate My Face@lemmy.world • Texan Moved Fam to Russia to Flee Woke—Now He’s Headed to Ukraine Front Line (English)
21 · 9 months ago
I love Lemmy. This comment would have gotten you banned on Reddit.
Outwit1294@lemmy.today to Technology@lemmy.world • Reddit users in the UK must now upload selfies to access NSFW subreddits (English)
1 · 9 months ago
Kind of
Outwit1294@lemmy.today to Technology@lemmy.world • Reddit users in the UK must now upload selfies to access NSFW subreddits (English)
57 · 9 months ago
This whole thing is a security disaster waiting to happen.
I did say which one I used
Outwit1294@lemmy.today (Banned from community) to Privacy@lemmy.ml • [ANSWERED] Should I use KeePass* instead of Proton Pass, for privacy?
210 · 9 months ago
Never self-host critical things.
I don’t think it considers what the user wants to hear. It is concerned with what the data it was trained on would consider a logical answer.
But at that point, it is useful only for novice internet users who don’t know how to search for things. I am pretty sure a 30-second search engine query would yield the same result.
It can’t be that far away. We have been waiting for so many years. Trump is also making an effort to crash the market.
Bookmarked for watching/reading this week. Will let you know my thoughts.
I have experienced this first hand. Asking LLMs explicit things leads to “I can’t help you with that” but if I ask it in a roundabout way, it gives a straight answer.
I became biased after I used the products. I have no ethical concerns about AI, unlike most of this community.
What do you mean?
If hallucinations cannot be eliminated, how are they decreasing them (allegedly)?
It spouts out generic and outdated answers when asked specific questions, which I can identify as wrong (skill issue, lol).
If you are super confident using them, maybe you are not really knowledgeable enough about those things. Skill issue, I guess.
I did. It did not help.
Emphasis on *if* and *have to*.