Trainguyrom

joined 1 year ago
[–] [email protected] 6 points 2 weeks ago

Bro needs to touch some grass

[–] [email protected] 3 points 2 weeks ago

I was a kid the first year after that went into effect. My computer automatically changed for the time change 2 weeks too early, so I had to have my parents fix it for me...twice. That was my first gripe with Dubya. Then his VP caused my luggage to get lost by seizing up all plane movement at Minneapolis-St. Paul: Dick Cheney had to take off, but then sat on the tarmac for 20 minutes for some reason while the whole airport was at a standstill for him, making every flight 30+ minutes late. So yeah, I've got some personal gripes with Dubya's administration.

[–] [email protected] 5 points 2 weeks ago* (last edited 2 weeks ago)

Especially with how normal memory tiering is nowadays, particularly in the datacenter (Intel's bread and butter), now that you can stick a box of memory on a CXL network and fill it with the memory from the last-gen servers you just retired, giving you a third or fourth tier of memory before swapping. And that's before the fun non-tiered stuff CXL enables. Really, CXL enables so much cool stuff that it's going to be incredible once it starts hitting small single-row datacenters.
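
Here's a rough sketch of what that tiering looks like to software on Linux today, where a CXL expander typically shows up as a CPU-less NUMA node. The node number below is an assumption (check `numactl -H` for the real topology); this is illustrative, not from any particular deployment:

```cpp
// Minimal sketch: placing buffers across memory tiers with libnuma.
// Assumes the CXL expander is exposed as CPU-less NUMA node 2 (hypothetical;
// verify with `numactl -H`). Build with: g++ tiers.cpp -lnuma
#include <numa.h>
#include <cstdio>
#include <cstring>

int main() {
    if (numa_available() < 0) {
        std::fprintf(stderr, "NUMA is not available on this system\n");
        return 1;
    }

    const int cxl_node = 2;        // assumption: the far (CXL) tier
    const size_t sz = 1ull << 30;  // 1 GiB per buffer

    void *hot  = numa_alloc_local(sz);            // tier 1: local DRAM
    void *warm = numa_alloc_onnode(sz, cxl_node); // lower tier: CXL expander

    if (hot)  std::memset(hot, 0, sz);   // touch pages so they actually fault in
    if (warm) std::memset(warm, 0, sz);

    numa_free(hot, sz);
    numa_free(warm, sz);
    return 0;
}
```

Recent kernels can also demote cold pages to the slower node automatically, which is what turns this from manual placement into real tiering.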

[–] [email protected] 4 points 2 weeks ago

> The main reason Intel can’t compete is the fact CUDA is both proprietary and the industry standard

Funnily enough, this is actually changing because of the AI boom. Would-be buyers can't get Nvidia AI cards, so they're buying AMD and Intel and reworking their stacks as needed. It helps that there are also translation layers available now that translate CUDA and other otherwise vendor-specific stuff to the open stacks supported by Intel and AMD.
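
To give a feel for why those ports are tractable, here's a toy CUDA vector add. Tools like AMD's HIPIFY translate code like this largely by renaming `cuda*` runtime calls to their `hip*` equivalents; the kernel body and launch syntax carry over unchanged (an illustrative sketch, not taken from any particular ported stack):

```cpp
// A toy CUDA vector add. Porting to HIP is mostly mechanical renaming:
// cudaMallocManaged -> hipMallocManaged, cudaDeviceSynchronize ->
// hipDeviceSynchronize, cudaFree -> hipFree; the __global__ kernel and
// the <<<grid, block>>> launch stay the same.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void vec_add(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    float *a, *b, *c;
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(float));
    for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

    vec_add<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();

    std::printf("c[0] = %f\n", c[0]);  // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```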

[–] [email protected] 12 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

> He’s not wrong that GPUs in the desktop space are going away because SoCs are inevitably going to be the future. This isn’t because the market has demanded it or some sort of conspiracy, but literally we can’t get faster without chips getting smaller and closer together.

While I agree with you on a technical level, I read it as Pat Gelsinger intending to stop development of discrete graphics cards after Battlemage, which is disappointing but not surprising. Intel's GPUs, while incredibly impressive, face an uphill battle with desktop users, and particularly gamers, who expect every game they wish to run to generally work without compatibility problems.

Ideally Intel would keep their GPU department going, because they have a fighting chance at holding significant market share now that they're past the hardest hurdles, but they're in a hard spot financially, so I can't say I'd be surprised if they're forced to divest from discrete GPUs entirely.

[–] [email protected] 7 points 2 weeks ago

Seriously, putting a couple gigs of on-package graphics memory would completely change the game, especially if it did some intelligent caching and used system RAM for additional capacity as needed.

I want to see what happens if Intel or AMD seriously lets a generation rip with on-package graphics memory for the iGPU. The only real drawback I could see is if the power/thermal budget just isn't sufficient and it ends up with wonky performance (which I have seen on an overly thin-and-light laptop in my personal fleet: it's got a Ryzen 2600, if I remember right, that's so horribly thermally limited it leaves a ton of performance on the table).

[–] [email protected] 11 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

To be fair, the Arm SoCs on phones use big.LITTLE cores: the system enables/disables cores on the fly and moves software around so it's running on either the big high-performance cores or the little low-power cores, based on the power budget at that second. So effectively, not all of those 6+ cores are available and in use at the same time on phones.
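
As a rough sketch of how that split is visible to software: on arm64 Linux, each core advertises a relative `cpu_capacity` (normalized so the fastest cores read 1024), which is one way to tell the clusters apart. Whether the attribute is exposed depends on the kernel and platform, so treat this as illustrative:

```cpp
// Minimal sketch: listing big vs. little cores via sysfs on arm64 Linux.
// cpu_capacity is normalized so the fastest core class reads 1024; the
// "big if == 1024" test is a rough heuristic, not a formal classification.
#include <fstream>
#include <iostream>
#include <string>

int main() {
    for (int cpu = 0; ; cpu++) {
        std::ifstream f("/sys/devices/system/cpu/cpu" + std::to_string(cpu) +
                        "/cpu_capacity");
        if (!f) break;  // no more CPUs, or the attribute isn't exposed here
        int capacity = 0;
        f >> capacity;
        std::cout << "cpu" << cpu << ": capacity " << capacity
                  << (capacity == 1024 ? " (big)" : " (little)") << "\n";
    }
    return 0;
}
```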

[–] [email protected] 1 points 2 weeks ago (1 children)

> Luke Skywalker taking a lucky shot at a vulnerability that a team of engineers and military men, all of whom were high-level Imperial defectors, with support from many planets of what is the Star Wars equivalent of Western Europe and North America, had found by analyzing the space station’s stolen blueprints, using computers and whatnot, is realistic.

I'm guessing you haven't seen Rogue One. The architect of the Death Star was sympathetic to the rebellion and deliberately built in the reactor vulnerability, which needs only a single well-placed shot to blow up the entire megastructure. He then sent a message to the rebellion explaining said flaw and instructing them to acquire the Death Star's designs to identify where the reactor is, so that they could exploit it.

Having been involved in large (software) projects, it seems quite plausible to me that someone near the top could intentionally leave a backdoor in there and have it go unnoticed all the way into live testing, especially given the mix of disciplines needed to construct such a megastructure.

[–] [email protected] 3 points 2 weeks ago (1 children)

Also, I've heard that the water that first comes out of those sprinklers is RANK from having sat in the pipes for years.

[–] [email protected] 3 points 2 weeks ago

Hey now, War Games had pretty dang realistic hacking!

[–] [email protected] 2 points 2 weeks ago

The worst was some show my MIL was watching. A team of super savants was trying to stop an ICBM from nuking Los Angeles, and not only did the show completely fail to understand orbital dynamics, it didn't even seem to follow any kind of rudimentary in-universe laws of physics (usually shows and movies just treat spaceships like submarines, which can still make for a decent story if it's at least consistent): the ICBM was somehow 30 seconds from impact 3 times over a 20-minute period.

[–] [email protected] 2 points 2 weeks ago

I have to disagree. When I tried out a VR headset at a con, I spent 2 hours with the headset on in Space Pirate Training Simulator thinking it had only been 20 minutes. This was on the $250 Meta Quest 2, with a heavy backpack on my back because I didn't have anyone with me to leave my bag with. I was trying to be conscious of not taking too much time with the headset so others could have a chance, and figured about 15-20 minutes would be appropriate, but apparently I was completely in the zone!

I can count on one hand the number of times a game has made time fly like that, so I'd say VR is a pretty dang cool experience, and once hardware costs come down (or headsets become more ubiquitous) it'll probably be a pretty big market for gamers, much like consoles are now.
