this post was submitted on 31 Jan 2024
501 points (97.0% liked)

AMD’s new CPU hits 132fps in Fortnite without a graphics card. Also gets 49fps in BG3, 119fps in CS2, and 41fps in Cyberpunk 2077 using the new AMD Ryzen 8700G, all without the need for an extra CPU cooler.

[–] [email protected] 119 points 7 months ago (2 children)

I have routinely been impressed with AMD integrated graphics. For my last laptop I specifically chose one, since it meant I didn't need a dedicated GPU, which adds significant weight, cost, and power draw.

It isn't my main gaming rig, of course, but I've had no complaints.

[–] [email protected] 19 points 7 months ago* (last edited 7 months ago)

Same. I got a cheap Ryzen laptop a few years back and put Linux on it last year, and I've been shocked by how well it can play some games. I recently got Disgaea 7 (mostly to play on the Steam Deck), and it's so well optimized that I get a steady 60fps at full resolution on my shitty integrated graphics.

[–] [email protected] 13 points 7 months ago

I have a Lenovo ultralight with a Ryzen 7 7730U mobile chip in it, which is a pretty mid CPU... it happily plays Minecraft at a full 60fps while drawing around 10W on the package. I can play Minecraft on battery for about 4 hours. It's nuts.
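
(Aside on where a number like "10W on the package" can come from: on many Linux laptops the kernel exposes a cumulative package energy counter through the powercap/RAPL interface, and sampling it twice gives an average power figure. A rough Python sketch, assuming that interface exists on your machine, that the path below really is the package domain, and that you have permission to read it:)

```python
# Rough sketch: estimate average package power by sampling the RAPL
# energy counter exposed via the Linux powercap interface.
# Assumed path for the package domain; it varies by machine, may be
# absent on some laptops, and reading it may require root.
import time

RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_energy_uj() -> int:
    with open(RAPL_ENERGY) as f:
        return int(f.read())

def avg_package_watts(seconds: float = 5.0) -> float:
    start = read_energy_uj()
    time.sleep(seconds)
    end = read_energy_uj()
    # The counter is in microjoules and eventually wraps around;
    # for a short sample the simple delta is good enough.
    return (end - start) / 1e6 / seconds

if __name__ == "__main__":
    print(f"~{avg_package_watts():.1f} W average package power")
```

Tools like powertop report comparable estimates if you'd rather not poke at sysfs directly.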

AMD does the right thing and builds the iGPU out of CUs from their full graphics µarch on the new die, instead of trying to cram some poorly designed iGPU into the CPU package like Intel does.