Garlic chilli powder. An Indian mate of mine introduced me to this condiment and it changed my life. I add a few pinches of it to most of my dishes now (noodles, pasta, pizza, sandwiches, fried rice, stir-fries and of course curries) - and it elevates them to the next level. (I love spicy food btw so this may not be for everyone, but for me it opened up a whole new world).
Yeah originally I used ujust on my old PC, but that command is gone in the latest Bazzite for whatever reason, so (on my new PC) I installed it using the command here: https://github.com/DeterminateSystems/nix-installer
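For reference, the install is a single command from that repo's README (check the linked page yourself for the current invocation before piping anything to sh):

```shell
# Determinate Systems Nix installer - works on immutable distros like
# Bazzite/Bluefin since it doesn't modify the base image.
curl --proto '=https' --tlsv1.2 -sSf -L https://install.determinate.systems/nix \
  | sh -s -- install
```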
espanso
I'm on Bazzite (similar to Bluefin) and I installed espanso via Nix. It was just one command to install it and one setcap to grant it permissions. The good thing about using Nix instead of Distrobox or Flatpak is that you don't run into annoying sandbox limitations, since these binaries live on your real filesystem and can access all system resources.
The key thing to make it work is that the setcap command needs to be run against the actual Nix store executable and not the symlink in your home folder. This is also why a Distrobox export of this would never work: you'd be setcapping only the symlink, which is useless.
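Roughly what that looks like (the package name and capability are from memory of espanso's Wayland setup docs, so double-check against them):

```shell
# Install espanso from nixpkgs (one command)
nix profile install nixpkgs#espanso-wayland

# setcap must target the real binary in /nix/store, not the
# ~/.nix-profile symlink - capabilities are stored on the file's
# inode, so setcapping a symlink does nothing useful.
sudo setcap "cap_dac_override+p" "$(readlink -f "$(which espanso)")"
```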
It's four words but, because I'm a cool pwnz0r, the second and last words are written in leetspeak
correct h0r53 battery 5t4p13
?
Arch
I'm surprised you had an issue with Arch - I've got a 2015 MBA as well, and Arch installed without any issue, didn't have to mess around with any kernel boot parameters, nomodeset etc.
Indeed. I hope that design gets vetoed before being finalised. It would be good to have another native Linux gaming console.
Whoever designed this probably isn't a gamer, I reckon.
Waffle. It's like wordle+jumble in a waffle shape. You need to solve the puzzle in the least number of moves possible.
Heardle (Rock version). Technically less of a puzzle and more of a song guessing game. Still fun, nonetheless. I prefer the rock version cause I don't suck at it lol.
Not sure, there aren't many reports so it's hard to say. I know at least the ROG Ally version has its own service which sets the TDP so it's probably not affected.
As for the Steam Deck, if you're running this on an actual Deck it's not really a concern, because 15W is the Deck's default TDP. And Bazzite-Deck is a Steam-Deck-first distro, so you're still really better off using Bazzite.
Also, to clarify, this is a case of Steam (Gamescope) itself changing the TDP, so it's not a bug introduced by their devs.
The only Keen I know of (and acknowledge) is the Commander:
What sort of ML tasks exactly, and is it personal or professional?
If it's for LLMs you can just use Petals, a distributed service that doesn't need your own GPU.
If it's for SD / image generation, there are four ways you can go about it. The first is to rent a GPU from a cloud service like vast.ai, runpod.io, vagon.io etc, then run SD on the PC you're renting. It's relatively cheap - generate as much as you want for the duration you've rented. Last I checked, the prices were something like ~0.33 USD per hour, which is far cheaper than buying a top-end nVidia card for casual workloads.
The second option is using a website/service where the SD frontend is presented to you and you generate images through a credit system - buy X amount of credits and you can generate that many images etc. Eg sites like RunDiffusion, dreamlike.art, seek.art, lexica etc.
The third option is to go for a monthly/yearly subscription offering where you can generate as much as you want, such as MidJourney, DALL-E etc. This can be cheaper than a pay-as-you-go service if you've got a ton of stuff to generate. There's also Adobe Firefly, which is like a hybrid option (x credits / month).
Finally, there are plenty of free Google Colabs for SD. And there's also Stable Horde, which uses distributed computing for SD - plus an easy WebUI for it called ArtBot.
So yeah, there are plenty of options these days depending on what you want to do - you no longer need to actually own an nVidia card, and in fact for most users renting is the cheaper option. Say you wanted to buy a 4090, which costs ~$2000. If you instead spent that on cloud services at say $20 p/m, you'd get 8.3 years of usage - and most GPUs would become outdated in that time period, forcing you to buy a new one (whereas cloud GPUs keep getting better, and with as-a-service models you could get better GPUs at the same price). And I'm not even factoring in other expenses like power consumption, or time spent on maintenance and troubleshooting. So for most people it's a waste to buy a card just for ML, unless you're going to be using it 24x7 and actually making money off it.
Edit: A used 3090 is going for ~$715-850 at the moment, which works out to an equivalent of ~3+ years of image generation via cloud services, assuming you're going for paid subscription options. If you factor in the free options or casual pay-as-you-go systems, it can still work out a lot cheaper.
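The break-even arithmetic above, sketched out (using the rough figures quoted here - ~$2000 for a 4090, ~$715-850 for a used 3090, ~$20/month of cloud spend - not live market prices):

```python
# Rough break-even: buying a GPU outright vs. renting cloud GPU time.
# All prices are the ball-park figures from this thread, not current data.

def months_of_cloud(card_price: float, monthly_cloud_cost: float = 20.0) -> float:
    """How many months of cloud spend one card purchase would buy."""
    return card_price / monthly_cloud_cost

# A ~$2000 4090 buys 100 months, i.e. ~8.3 years, of $20/month cloud usage
print(f"4090 money buys {months_of_cloud(2000) / 12:.1f} years of cloud")

# A used 3090 at $715-850 works out to roughly 3-3.5 years
low = months_of_cloud(715) / 12
high = months_of_cloud(850) / 12
print(f"Used 3090 money buys {low:.1f}-{high:.1f} years of cloud")
```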
Thanks for confirming that. Yeah there was another report saying that the desktop image was fine. Seems like there's something extra in the Deck image that's enforcing this limit.
Mobile Suit Gundam