Please yes. The average person doesn't need an insane graphics card; if they can make an APU that's affordable, it'll fly off the shelves.
That would make a nice Linux-based gaming PC for my kids, that's for sure.
> if they can make an APU that's affordable
About that...
I mean, playing devil's advocate: you're paying for a CPU, GPU, and RAM in a single purchase. These things would basically be a mini PC without the motherboard and connectors.
You're paying for the mainboard as well; it's all one soldered package.
So, not really?
The middle one is a similar kind of configuration to my Mac Studio, and if you were to put this APU in an actual build it only comes out a little cheaper, so the pricing tracks.
Steam Deck success
I’m at the point where I am only interested in handheld gaming.
The only reason I’d buy a dedicated GPU these days is to run AI stuff locally.
See, the thing about this framing is that by this logic the PS5 is also doing this "without a graphics card". They're both APUs, they both have a GPU in them.
Agreed. I really hate this description of PCs with only iGPUs or APUs, since they are not "without a graphics card": almost all CPUs nowadays have some kind of integrated graphics solution, some better than others. It's just that in the past iGPUs were laughably bad, which is why many people needed a discrete one.
Does the PS5 really not have a dedicated graphics chip?
Pretty much every CPU these days has a GPU; integrated GPUs and APUs just have it on the same die. So it's technically correct to say they don't have a "dedicated graphics chip", but it's also a bit misleading, since it implies they don't have or need a GPU at all.
They kinda do, but they put the dedicated chip into the same die as the CPU so ¯\_(ツ)_/¯
Yep. Same as the Strix Point APUs and the M series Apple stuff and the Steam Deck and the Switch and a whole bunch of other things.
It's kinda weird that people are being shocked by this only now. I guess the long tradition of laptop iGPUs sucking has led to some weird assumptions.
🤯
The AMD Ryzen AI Max APUs for mobile, originally codenamed Strix Halo, could make an appearance in desktop PCs, as hinted at by Dr Lisa Su.
What is there to hint? The Framework Desktop was announced weeks ago.
APUs like the 5700G during the pandemic's GPU shortage were already an amazing deal and kept me from blowing too much cash on a couple of new builds. AMD's a real hero when it comes to value.
The new Asus ROG Flow Z13 with the AMD Ryzen AI Max+ 395 is impressive to say the least.
If it were 15-16 inches, I would get it right away.
Can't wait for what's to come.
The ROG Flow Z13 looks dope, and while I appreciate your preference for 15-16", the smaller the better in my book. Rocking a 10.1" OneXPlayer netbook 5 and it's okay. No Z13 though, but also way cheaper.
Potentially great news, but I need to see more than just the CEO saying this.
Looking forward to the Gamers Nexus (or Hardware Unboxed, etc.) review of the chip/products.
~This~ ~comment~ ~is~ ~licensed~ ~under~ ~CC~ ~BY-NC-SA~ ~4.0~
BOOO stupid long copyright signatures!
> BOOO stupid long copyright signatures!
Some info about that ...
https://lemmy.world/post/26711096/15639879
~This~ ~comment~ ~is~ ~licensed~ ~under~ ~CC~ ~BY-NC-SA~ ~4.0~
I know what it is. It’s obnoxious.
Like putting your sexual kinks in an email signature or telling everyone who didn’t ask that you’re vegan.
Maybe, but it is also a public forum and people are free to express themselves.
I do think it is annoying, though.
Would be nice if it could be set as some kind of post / comment metadata, hidden from view but there in the code.
> Would be nice if it could be set as some kind of post / comment metadata, hidden from view but there in the code.
I'd love that, if it would be legally respected by corporations/scrapers. I even mention my license intention for my content in my bio here on Lemmy as well. It would save me a lot of hassle arguing with others too.
Until then, I have to embed the license in the content itself, especially while the new AI law is just starting to sort out these issues.
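To sketch what I mean, something like this could work. To be clear, this is purely hypothetical: neither Lemmy nor ActivityPub defines a license field today, so the `license` key below is invented for illustration.

```python
# Hypothetical sketch: what a comment object could look like if the
# license lived in metadata instead of the visible comment body.
# The "license" key is made up; it does not exist in ActivityPub or
# the Lemmy API today.
comment = {
    "type": "Note",
    "content": "APUs are finally good enough for most people.",
    "license": {
        "spdx": "CC-BY-NC-SA-4.0",
        "url": "https://creativecommons.org/licenses/by-nc-sa/4.0/",
    },
}

# A client could then show or hide the license without cluttering the text.
print(comment["license"]["spdx"])  # CC-BY-NC-SA-4.0
```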
> Like putting your sexual kinks in an email signature or telling everyone who didn’t ask that you’re vegan.
(quoting another person, not the one I'm directly replying to)
But honestly, people: if you are bothered by a simple link in a comment that is allowed on Lemmy, then you really need to look within. If your client of choice is not displaying subscript formatting correctly, and hence the text/link looks worse than it should, then take it up with the devs of that client (as I mentioned in my 'FAQ' link, I'm using official Lemmy.World formatting). And if you don't like seeing a license in general, then I can't help you with that one, except maybe to offer some 'touch grass' advice (especially to the person I quoted above).
~This~ ~comment~ ~is~ ~licensed~ ~under~ ~CC~ ~BY-NC-SA~ ~4.0~
Signatures in Lemmy comments are spam.
Being only a little spam doesn’t make it cool. Being only a little spam about your copyleft views doesn’t make it cool either.
> Signatures in Lemmy comments are spam.
> Like putting your sexual kinks in an email signature
Touch grass. Really.
> Being only a little spam doesn’t make it cool. Being only a little spam about your copyleft views doesn’t make it cool either.
I would argue that ignorance tends to benefit corporations more than regular people without power. Avail yourself of any power you have, before it's all gone.
Also, if you don't defend others' views, then one day your views will not be defended.
~This~ ~comment~ ~is~ ~licensed~ ~under~ ~CC~ ~BY-NC-SA~ ~4.0~
MY message is important, not spam!
Reminder that all AMD's "AI" processors require soldered memory.
Also, they already made an appearance in the Framework Desktop. It's garbage.
No they don't, but the ones with the 8050S do. That's the tradeoff. If you don't want a soldered chip (the CPU is also soldered to the mainboard), buy the lower-tier but same-gen APUs with normal iGPUs, like the ones in the Framework 13.
I wonder if CAMM modules will ever take off. I've always been told the reason soldered memory exists is to cut down on latency.
Now I know that's probably bullshit to some extent, and more a way to sell people on the higher-tier SKUs,
but I also know moving off existing technologies carries costs that companies might not want to bear.
The Framework CEO said in a Q&A that AMD engineers did a simulation of this APU with CAMM and that they'd have to cut the memory bandwidth by half to make it work.
CAMM has been around for years at this point and I've only seen it in a couple of devices, so no, I don't think so. It still can't match the speed of soldered memory either.
Yes, they do.
That's incorrect.
The Framework Desktop features soldered LPDDR5X to achieve the 8000 MT/s speeds. The Framework 13 caps out at DDR5-5600 but has regular SO-DIMM slots.
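For anyone wondering where the "cut the bandwidth by half" figure comes from, here's the rough back-of-the-envelope math. The bus widths are my assumption based on commonly reported specs (256-bit for Strix Halo, 128-bit for an ordinary dual-channel SO-DIMM setup), and these are theoretical peaks, not real-world numbers:

```python
# Peak theoretical memory bandwidth: bus width in bytes * transfer rate.
# Assumed bus widths: 256-bit for Strix Halo (soldered LPDDR5X-8000),
# 128-bit for a standard dual-channel SO-DIMM configuration.
def peak_bandwidth_gbs(bus_width_bits: int, transfers_mts: int) -> float:
    """Theoretical peak bandwidth in GB/s (decimal)."""
    return bus_width_bits / 8 * transfers_mts / 1000

print(peak_bandwidth_gbs(256, 8000))  # Strix Halo, soldered: 256.0 GB/s
print(peak_bandwidth_gbs(128, 8000))  # same speed, half the bus: 128.0 GB/s
print(peak_bandwidth_gbs(128, 5600))  # Framework 13's DDR5-5600: 89.6 GB/s
```

Halving the bus width at the same transfer rate halves the bandwidth, which matches what the AMD simulation reportedly showed for CAMM.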
You didn't say anything to indicate that I was incorrect.
You said they require soldered RAM. But the Framework 13 doesn't...