With the latest release of android it now supports some Linux functionality.
Wait, it does? Gonna have to check that out.
I disagree, LLMs have been very helpful for me and I do not see how an open source AI model trained with open source datasets is detrimental to society.
I’ll bite, what is malicious about this?
You are missing the Muse for your Poet. :D
It’s a credit or debit card where you can change the number at any time. It prevents shady online shops from losing your credit card details. Revolut and N26 offer it, for example.
Of course, the shop still has to accept credit/debit cards, or you have to go through PayPal as a guest.
You are telling me you are paying via PayPal without involving the banking industry?
I have the choice to either tell PayPal and my bank what I’m buying or just my bank. I chose the latter.
I once got banned for “vote brigading” on Reddit because I upvoted a crosspost that was deleted later, so this comes as no surprise. They have been monitoring upvotes and downvotes since forever.
Do you need one?
I recently deleted my PayPal account and use a virtual credit card instead.
So far no issues with paying. In the rare case that credit card is not available you can always pay without account in PayPal via credit card.
Reminder that Blender is struggling with funding right now. https://topicroomsvfx.com/news/the-price-of-free-blenders-funding-crisis/
Make sure to leave it a few bucks if you use it. https://fund.blender.org/
If getting rid of Microsoft entirely is the goal, Samba does AD with GPOs just fine.
I installed Zen a while ago when Flathub recommended it to me. Didn’t really like the minimalist design, especially with the auto-hide title bar.
I heard Ubuntu got some big upgrades starting with 22.04 in terms of support for GPOs.
I never tested it personally, but they do have documentation for it, and Ubuntu machines can be joined to a Windows domain: https://documentation.ubuntu.com/adsys/en/latest/
This doesn’t make any sense to me either. Why do they need a license for what you type into Firefox if that data never gets shared with Mozilla?
I don’t know a single application that you need to give a license to so they can handle your data locally.
Which Firefox fork do people recommend? Ideally it should be available as Flatpak, keep the Firefox version number and not have a separate user-agent.
LibreWolf seems to be the best on first glance? https://flathub.org/apps/io.gitlab.librewolf-community
What? X11 has zero HDR support.
I’m not a composer but Kimai is indeed a very good time tracker. Beats the trash we use at work.
I’m curious. Say you are getting a new computer, put Debian on it, want to run e.g. DeepSeek with ollama in a container (e.g. Docker or Podman) and also play games. How easy or difficult is it?
On the host system, you don’t need to do anything. AMDGPU and Mesa are included on most distros.
For LLMs you can go the easy route and just install the Alpaca flatpak and the AMD addon. It will work out of the box and uses ollama in the background.
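If you go that route, the install is just a couple of Flatpak commands. A sketch, assuming Flathub is already set up; com.jeffser.Alpaca is the app ID on Flathub, but the exact ID of the AMD addon is an assumption here, so check before installing:

```shell
# Install Alpaca from Flathub
flatpak install -y flathub com.jeffser.Alpaca
# The AMD/ROCm addon ships separately; the ID below is an assumption --
# list the available addons first to get the exact name
flatpak search Alpaca
flatpak install -y flathub com.jeffser.Alpaca.Plugins.AMD
```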
If you need a Docker container for it: AMD provides the handy rocm/dev-ubuntu-${UBUNTU_VERSION}:${ROCM_VERSION}-complete images. They contain all the required ROCm dependencies and runtimes, and you can just install your stuff on top of them.
As for GPU passthrough, all you need to do is add a device link for /dev/kfd and /dev/dri and you are set. For example, in a docker-compose.yml you just add this:
devices:
  - /dev/kfd:/dev/kfd
  - /dev/dri:/dev/dri
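Before wiring that up, it’s worth confirming the device nodes actually exist on the host. A quick sketch; /dev/kfd only shows up once the amdgpu/ROCm kernel driver is loaded:

```shell
# Verify the ROCm compute node (/dev/kfd) and the DRI nodes (/dev/dri)
# are present on the host before passing them into the container
for dev in /dev/kfd /dev/dri; do
  if [ -e "$dev" ]; then
    echo "$dev: present"
  else
    echo "$dev: missing (is the amdgpu driver loaded?)"
  fi
done
```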
For example, this is the entire Dockerfile needed to build ComfyUI from scratch with ROCm. The user/group commands are only needed to get the container groups to align with my Fedora host system.
ARG UBUNTU_VERSION=24.04
ARG ROCM_VERSION=6.3
ARG BASE_ROCM_DEV_CONTAINER=rocm/dev-ubuntu-${UBUNTU_VERSION}:${ROCM_VERSION}-complete
# For 6000 series
#ARG ROCM_DOCKER_ARCH=gfx1030
# For 7000 series
ARG ROCM_DOCKER_ARCH=gfx1100
FROM ${BASE_ROCM_DEV_CONTAINER}
RUN apt-get update && apt-get install -y git python-is-python3 && rm -rf /var/lib/apt/lists/*
RUN pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm6.3 --break-system-packages
# Change group IDs to match Fedora
RUN groupmod -g 1337 irc && groupmod -g 105 render && groupmod -g 39 video
# Rename user on newer 24.04 release and add to video/render group
RUN usermod -l ai ubuntu && \
    usermod -d /home/ai -m ai && \
    usermod -a -G video ai && \
    usermod -a -G render ai
USER ai
WORKDIR /app
ENV PATH="/home/ai/.local/bin:${PATH}"
RUN git clone https://github.com/comfyanonymous/ComfyUI .
RUN pip install -r requirements.txt --break-system-packages
COPY start.sh /start.sh
CMD /start.sh
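The hard-coded GIDs in that groupmod line (105 for render, 39 for video) are just what my Fedora host uses. A small sketch for looking up your own host’s values before baking them into the Dockerfile:

```shell
# Print the host GID for each group the container user needs;
# "not-present" means that group does not exist on this machine
for grp in render video; do
  gid=$(getent group "$grp" | cut -d: -f3)
  echo "$grp=${gid:-not-present}"
done
```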
However, for some reason Bluetooth is often quirky on PC (Windows or Linux). My PC’s Bluetooth works through a dongle, so I wonder if an integrated card would do better.
Is it a USB dongle?
If so, make sure to add a short USB-A to USB-A extension cable between your PC and the dongle. Interference is a serious issue with 2.4 GHz USB wireless dongles when they are plugged directly into the mainboard.
That used to be the case, yes.
Alpaca pretty much allows running LLMs out of the box on AMD after installing the ROCm addon in Discover/Software. LM Studio also works perfectly.
Image generation is a little bit more complicated. ComfyUI supports AMD when all ROCm dependencies are installed and the PyTorch version is swapped for the AMD version.
However, ComfyUI provides no builds for Linux or AMD right now, so you have to build it yourself. I currently use a simple Docker container for ComfyUI which just takes the AMD ROCm image and installs ComfyUI on top.
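Concretely, the PyTorch swap is just installing from AMD’s ROCm wheel index instead of the default one, the same as in the Dockerfile earlier in the thread (rocm6.3 here assumes ROCm 6.3; pick the index matching your ROCm version):

```shell
# Replace the default (CUDA/CPU) PyTorch wheels with the ROCm builds
pip install torch torchvision torchaudio \
    --index-url https://download.pytorch.org/whl/nightly/rocm6.3
```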
Dope. It seems to not have landed in LineageOS yet, but the Terminal app is already installed. Just missing the toggle in the developer options.