Trying it out in Shadows of Doubt right now; it took performance from an unstable 25-31 fps to 61-71 fps with it set to performance mode and 2x FPS. I don’t really notice any input lag.
It’s not on the Decky store yet, so you have to download the extension zip manually.
Here’s the extension’s GitHub with full instructions and details.
Basically you’ll:
- Install the plugin. Once it’s on the Decky store you can install it from there, but in the meantime do this:
  - Download the .zip from the release page
  - In Game Mode, go to the settings cog in the top right of the Decky Loader tab
  - Enable Developer Options
  - In the new Developer tab, select “Install from zip”
  - Choose the “Lossless Scaling.zip” file you downloaded (likely in the Downloads folder)
  - If it does not show up, you may need to restart your device
- Purchase and install Lossless Scaling from Steam
- Open the plugin from the Decky menu
- Click “Install lsfg-vk” to automatically set up the compatibility layer
- Configure settings using the plugin’s UI controls:
  - Enable/disable LSFG
  - Set the FPS multiplier (2-4). Note: the higher the multiplier, the greater the input lag
  - Enable performance mode - reduces GPU load, which can sometimes significantly increase FPS gains
  - Adjust flow scale (0.25-1.0)
  - Toggle HDR mode
  - Toggle immediate mode (disables vsync)
- Apply launch commands to the game you want to use frame generation with (see the sketch after this list):
  - Option 1 (recommended): ~/lsfg %COMMAND% - uses your plugin configuration
  - Option 2: manual environment variables, e.g. ENABLE_LSFG=1 LSFG_MULTIPLIER=2 %COMMAND%
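As a concrete sketch, both options go into the game’s Properties > Launch Options field in Steam. The path and variable names below are the ones from the plugin’s instructions above; the multiplier value is just an example:

```sh
# Option 1 (recommended): the wrapper reads the plugin's own configuration.
~/lsfg %COMMAND%

# Option 2: set the environment variables by hand instead.
ENABLE_LSFG=1 LSFG_MULTIPLIER=2 %COMMAND%
```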
I was confused about what it meant by “Lossless” since it’s frame gen… there’s no compression or anything to lose; it’s starting from nothing.
As far as I can tell it means nothing, it’s just the branding for the “Lossless Scaling” tool on Steam. There’s no new lossless algorithm involved here.
My understanding is the tool was originally focused on upscaling, and “lossless upscaling” was the app’s main feature (along with allowing you to apply other kinds of upscaling).
So the frame generation part is more accurately “the lossless upscaling app’s unique frame generation”, but it’s shortened to just lossless frame gen even though that’s not really accurate.
Another useful use case is that the tool works on videos with mpv to interpolate to a higher frame rate. I know that subjectively not everyone likes that for film, but for footage that doesn’t rely on sets and the like, such as sport and YouTube videos, it’s a nice improvement.

In terms of quality vs. performance, I’d say it’s somewhere between the lower-quality SVP default and the higher-quality (but very resource-intensive) RIFE implementation. There’s also LSFG_PERF_MODE=1 and decreasing the flow scale; the former was a pretty obvious decline in quality, but might be needed on slower GPUs.
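If you want to try the video use case, a minimal sketch could look like the following, assuming lsfg-vk picks up the same environment variables outside of Steam (the variable names are the ones from the manual launch-command option above) and that mpv is presenting through Vulkan:

```sh
# Hedged example: enable the lsfg-vk layer for a single mpv run.
# ENABLE_LSFG / LSFG_MULTIPLIER come from the manual option above; whether they
# apply to non-Steam applications like mpv is an assumption on my part.
ENABLE_LSFG=1 LSFG_MULTIPLIER=2 \
  mpv --vo=gpu-next --gpu-api=vulkan some-video.mkv
```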
EDIT: Another piece of advice I’ll give is to set PROTON_USE_WOW64=1 if you’re trying to run a 32-bit game, as there isn’t a 32-bit build of lsfg-vk at the moment. That environment variable allows 32-bit games to use 64-bit dependencies, provided it’s a Windows game and you use a recent version of Proton (Experimental, and likely Proton GE 10 or greater). Combined with the recommended launch command, it would look roughly like the sketch below.
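Here’s roughly what that looks like as launch options, with ~/lsfg being the recommended wrapper from the plugin instructions above:

```sh
# Launch options for a 32-bit Windows game: force Proton's WoW64 mode,
# then hand off to the lsfg wrapper.
PROTON_USE_WOW64=1 ~/lsfg %COMMAND%
```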
So I’m 0-1 so far (Samsung-screened SD OLED). Tried Baldur’s Gate 3 with a large variety of settings; it either crashed upon boot or booted with no video.
I know it’s a DX11 game so it rarely agrees with tools like this, but I was hoping, lol. If I try anything else, I’ll edit this same post so as not to take over the thread.
EDIT: OMG. Make that 1-1. It was user error, I’d accidentally slid the files into the folder next door instead of the plugins folder.
After fixing it, I booted it up and… WOW. I already had BG3 set to 720p Balanced; after turning on the 3x multiplier (because who cares about input lag in this game), I’m now up into the 75 to 90 FPS range.
This is absolutely NUTS, I’ve never consistently gotten more than 30 on the Deck on this game. What a game changer, at least for non-action games. Will have to see how bad the input lag feels on action titles, but just speeding up modern/slower RPGs alone is a big, big thing.
Thanks for the heads up!
I have a couple of RPGs where I have zero concern on input lag, but they could definitely use a frame boost. I’m going to give this a try today. 😀
I’ve tried it with 2 games in my Legion Go with CachyOS.
Heaven’s Vault: It launched without forcing a particular version of Proton, but it did nothing, same framerate with it on or off.
Tacoma: I had to force GEProton in order for the game to run. It did nothing, same framerate with it on or off.
And yes, I’ve followed the instructions and put “~/lsfg %COMMAND%” as an environment variable. Not sure if I’m doing something wrong, or if it just doesn’t work with every game.
Hey, I figured out what I had done wrong, I dropped the files in the wrong place. Everything seems to be working now.
If you’re still struggling, this YT video shows you where you have to put everything to get it all to work. If CachyOS is Arch-based, it should be pretty similar.
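In case it saves anyone else the same mistake I made with folder placement: on SteamOS, Decky Loader plugins normally live under ~/homebrew/plugins, so after a manual install you’d expect something roughly like this (the exact plugin folder name is an assumption on my part):

```sh
# Rough expected layout after a manual install; the unzipped plugin folder
# needs to sit inside ~/homebrew/plugins, not in a sibling directory.
ls ~/homebrew/plugins/
# Lossless Scaling/
#   plugin.json
#   ...
```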
Since it didn’t crash for you, but didn’t seem to kick in, check and make sure that you’re not using fullscreen in your in-game settings. Go windowed or borderless instead.
I have used it a handful of times on Windows and that was always a prereq to get it to do anything.
Make sure games are windowed or borderless and that you don’t have an external frame cap, like the one in the Steam overlay.
Stupid question, but how does this work on Deck hardware? I thought framegen was only supported on newer GPUs?
Different framegen techs have different requirements. Some, like DLSS and the newer FSR, require specific GPU hardware; some require being built into the game specifically. Lossless is great because it works on most hardware and most games.
My understanding here is that it’s working as part of the Vulkan pipeline, but I don’t have enough knowledge in that area to answer more accurately than that. This article discusses what the dev of lsfg-vk had to do to get lossless framegen working on Linux, and it can give some insight into how it’s working.
Lossless Scaling’s implementation runs on leftover shader compute units, so it’s hardware agnostic, but heavily limited in terms of latency and quality of the interpolation.
I love how when Nvidia announced frame generation Lemmy and Reddit were full of threads about how that’s horrible and useless and the latency was too high.
Then AMD and Lossless Scaling launch their own versions, with significantly higher latency, and suddenly it’s great, you don’t even notice the artifacts, the latency isn’t that bad, trust me guys.
Neither Reddit nor Lemmy are monoliths. Yes, some are likely being hypocritical, but it’s also likely that there isn’t much overlap between those who were critical of Nvidia’s FG and those who aren’t critical of LSFG. I say this because there are still a lot of people shitting on FG in general, whether it’s justified or not.
Reddit/Lemmy don’t have to be monoliths to have a clearly defined and easily seen accepted narrative.
Posts and comments that considered frame generation viable when Nvidia was the only one with the technology were massively downvoted and got absurd replies, universally. Even more so here on Lemmy.
Now the Steam Deck subreddit is dominated by posts about Lossless Scaling and users claiming they can’t see any artifacts or latency at all. When AMD introduced their, quite objectively inferior, implementation of frame generation, Lemmy immediately shifted tone. I can go to any random thread and say “I think FSR frame generation is great!” and nobody bats an eye. A shift in overall tone is quite easy to see and measure, even if the site isn’t written by a single individual.
No, they’re still all awful, lossless scaling especially, but:
a) Nvidia simps have fewer excuses to support Corpo #1 over Corpo #2 now and
b) people using lossless scaling often have either weaker hardware or lower standards in the first place
> No, they’re still all awful
I don’t disagree, but that’s not my point at all
Having it available as a technology is great, what sucks is how Nvidia marketed their new 50 series on the basis of triple fake frames due to lack of actual hardware improvements. They literally claimed 5070 = 4090 performance.
The feature was released with the 4000 series, not 5000.
Single frame generation (2x FPS) was 4000, but 5000 added multi frame gen (3x/4x FPS).
Through that, they justified the 5070 matching the 4090.
So what? I never said anything about multi frame generation.
I’m talking about reception at launch on Reddit and Lemmy. This entire image is completely irrelevant to the point being made.
It depends on the game, framegen techs, and your base fps.
It can be a great way to squeeze more performance out of a game in some circumstances, but it’s a big problem when games like MH:Wilds rely on it to meet an acceptable fps at all.
I’m commenting on the contrast in attitude towards the feature when it’s made by Nvidia versus Lemmy’s beloved AMD, not really on whether the feature is actually useful.
I don’t know about anyone else, but the reason I say stuff like “fake frames, real flames” about Nvidia is that they include framegen in their performance stats.
As if it’s a tech that boosts actual performance. They use it to lie.
When they say “this card is twice as powerful as last gen,” what they really mean is: it is exactly the same, but we turned on 2x framegen. Never mind that there’s no reason you couldn’t do the same on the last-gen card.