The Samsung TV that I bought for my son has this annoying overlay that pops up when you turn it on, showing all the different inputs and nagging about various things it thinks are wrong with the world. It is plugged into an Nvidia Shield that we do most things on, but you can't use the Shield until the overlay calms the fuck down and disappears.
It'd be great if you could just have the thing turn on and display an input like our older TVs do.
This damnable prison of log and ice eats away at my fibre. I find the lack of culture astonishing.
Agreed, it seems like they should have put just a little bit more in the standard feature set so every little window manager doesn't have to reinvent the wheel.
I learned what a tankie is, which is fun.
I've been commenting a bit, whereas on Reddit I would only post a comment a few times a year, when I could be bothered dealing with the likely burst of negativity that would come in response.
Kind of feels a bit more like Web 1.9 or so, from about 2003, which I think was about the sweet spot: minimal rage bait and craziness, but still a decent amount of user interaction and scale.
It would be about perfect if you could chop out the few folks trying to shoehorn politics into every little thing.
Appreciate the reply. Which desktop environment are you using?
My only experience with Wayland is also with KDE, whereas for the 27-ish years before that I used all sorts of stuff with X.
I've scripted the machine that drives the frontend for our video surveillance system to place windows exactly where I want them when it comes up.
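For anyone after the same on KDE under Wayland, I believe kdotool (an xdotool-style helper that drives KWin over DBus) can do the placement from a startup script. A rough sketch, with the window title and geometry as made-up placeholders:

    #!/bin/sh
    # Wait for the frontend to start, then pin it where we want it.
    # "Camera Wall" and the geometry below are placeholders for your setup.
    sleep 10
    win=$(kdotool search --name "Camera Wall" | head -n 1)
    kdotool windowmove "$win" 0 0
    kdotool windowsize "$win" 3840 2160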
I use a couple of dbus triggers that make the TV on the wall in my garage go to sleep from the shell, though I may not have tested them over ssh. They were pretty well the functional equivalent of some xset dpms commands that I used to use. Not sure if that is what you meant. I think I also had something working that disabled the output altogether, but that was pretty clunky as it used some sort of screen ID that would occasionally change. Sorry I'm hazy on the details, I'm old.
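From hazy memory the two commands were along these lines (the shortcut name and the output ID vary by Plasma version and hardware, so treat this as a sketch rather than gospel):

    # Blank the screen via PowerDevil's global shortcut (KDE Plasma)
    qdbus org.kde.kglobalaccel /component/org_kde_powerdevil invokeShortcut "Turn Off Screen"

    # Or disable the output entirely; list the names first with `kscreen-doctor -o`
    kscreen-doctor output.HDMI-A-1.disable
    kscreen-doctor output.HDMI-A-1.enable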
I'll try it all out when I get home, I've got to find some old serial crap for a coworker in the garage anyway.
Which workflows? Asking because I'd like to experiment with some edge case stuff.
I'm running KDE with Wayland on multiple machines of different vintages with AMD and Intel graphics, and it would take a lot for me to go back to the depressing old mess that was X.
The biggest improvement in recent times was absolutely pulling out all my Nvidia cards and putting in second-hand Radeon cards, but switching to Wayland fixed all the dumb interactions between VRR (and HDR) capable monitors with mixed refresh rates.
Even the little NUC that drives the three 4K TVs for the security cameras at work is a little happier with Wayland, running for weeks now with hardware decoding, rather than X crashing pretty well every few days.
Last week I did an install of Debian 12 on a little NUC7CJYH to use for web browsing and ssh sessions into work and ended up with Wayland by default. Seems to work great.
From what I have experienced, it goes great with Intel integrated graphics, great with a Radeon card, and can be made to work with Nvidia if you are lucky or up for a fight.
We did an address check when we could first order it, about a year and a half ago, and about a third of the folks in the office could get it. I know the majority of the address checks that we do for commercial locations in tenders come up positive now.
It is not cheap to get an off-the-shelf router that does a solid job of forwarding multiple gigabits, and the vast majority of folks (me included) will probably rarely notice the difference outside of speed tests. The last firewall build that I did for home was a pair of virtual Linux boxes with 10G interfaces, just so I could do a 2G or 4G GPON upgrade later on without having to throw everything out.
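The forwarding part itself is tiny on Linux; a minimal sketch of the NAT side (the interface name is a placeholder, and a real build obviously wants filtering rules on top of this):

    # Enable IPv4 forwarding (persist it via /etc/sysctl.d/ in practice)
    sysctl -w net.ipv4.ip_forward=1

    # Masquerade traffic leaving the WAN side; "eth0" is a placeholder
    nft add table ip nat
    nft 'add chain ip nat postrouting { type nat hook postrouting priority 100 ; }'
    nft add rule ip nat postrouting oifname "eth0" masquerade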
In New Zealand, 10G GPON services seem to be mostly cannibalising high-quality lit Ethernet services at 1G and 10G subrate rather than replacing consumer tails. So it is more likely a business goes from spending $1500 a month on uncontended 1G to spending $400 a month on contended 4G than a residential user goes from spending $150 on 1000/500 to $280 on 2000/2000.
My day job is building ISP networks. It's been about 20 years since I had a home connection that I didn't configure up both ends of myself.
I've got a 1G/500M tail into home where I am right now, not that that is particularly impressive. One of the jobs I've been putting off at work is standardising our usage of the 10G GPON platform available here in NZ; when I do that, I'll get one of the >1G tails to use at home.
Usually the answer is however much I can be bothered building, but my usage is pretty low.
The situation is mostly reversed on Linux: Nvidia has fewer features, more bugs, and stuff that plain won't work at all. Even onboard Intel graphics is going to be less buggy than a pretty expensive Nvidia card.
I mention that because language model work is pretty niche and so is Linux (maybe similarly sized niches?).