Background story: I recently bought a computer with an AMD 7000-series CPU and GPU.

amdgpu_top reports 15~20 watts in normal desktop usage, but as soon as I have a video playing in VLC, it jumps to a constant 45 watts, which is undesirable behavior, especially in summer. (I hope that's just a reporting issue... but my computer does run hot.)

When I do DRI_PRIME=1 vlc and then play videos, amdgpu_top doesn't report the power surge. (I have the iGPU enabled.)

Is there anything more convenient than modifying individual .desktop files, as sketched below? KDE malfunctions when I put export DRI_PRIME=1 in .xprofile, so that's a no-go.
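
For reference, the .desktop approach I mean is copying the app's desktop entry into ~/.local/share/applications and prepending env DRI_PRIME=1 to its Exec line. A sketch (the exact Exec line varies by distro):

$ cp /usr/share/applications/vlc.desktop ~/.local/share/applications/
$ # then edit the copy so VLC launches on the other GPU, e.g.:
Exec=env DRI_PRIME=1 /usr/bin/vlc --started-from-file %U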


Solved: removing the Mesa hardware acceleration packages makes VLC fall back to libplacebo, which doesn't do these weird things.

[–] [email protected] 9 points 1 year ago (1 children)

Are you just running an AMD CPU with integrated graphics, or do you also have a dedicated graphics card? From what I can gather online, the DRI_PRIME variable is mostly used for render offloading to a dedicated GPU, but your question appears to be about iGPUs.

You can also try manually enabling hardware decoding in VLC's settings: go to Tools > Preferences > Input & Codecs and choose VA-API (AMD's preferred standard).
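
If you want to test this quickly without changing any settings, VLC can also be told which decoder backend to use from the command line (a sketch; the file name is a placeholder):

$ vlc --avcodec-hw=vaapi some-video.mkv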

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (1 children)

do you also have a dedicated graphics card?

Yes, an RX 7800 XT.

My worry is that playing a 1080p video needs 30 watts (assuming amdgpu_top isn't wrong). I would like to move that workload to the integrated GPU, which I enabled in the BIOS.

Thank you for your answer. I can confirm that switching to VA-API lowers my power usage by a lot (from 45 to a reported 20~21 watts).

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

Through some more testing, I found out that Mesa's hardware acceleration packages can cause this power surge; on Arch Linux they are mesa-vdpau and libva-mesa-driver.

If I don't have these packages installed, VLC reverts to libplacebo, which doesn't seem to cause the extra power usage.
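
On Arch that amounts to (a sketch; -Rs also removes dependencies nothing else needs):

$ sudo pacman -Rs mesa-vdpau libva-mesa-driver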

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago) (1 children)

One thing you could do is plug your monitor straight into the iGPU outputs and use DRI_PRIME only for applications that need the powerful dGPU.

Unless you want to run either everything or nothing on a specific GPU, I don't think there's a more convenient way than setting DRI_PRIME per application.
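
Note that with the display connected to the iGPU, the roles flip: the iGPU becomes the default GPU and DRI_PRIME=1 offloads to the dGPU. A sketch (the game binary is a placeholder):

$ DRI_PRIME=1 ./some-game   # rendered on the dGPU, displayed through the iGPU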

[–] [email protected] 2 points 1 year ago

I do this. It's part of a GPU passthrough setup, but in practice there aren't many applications that require PRIME offload. I don't use it for web browsers, where I watch a lot of videos. I haven't used VLC in a little while, but I'm pretty sure I don't use it there either; mostly just games and graphical applications. If I were doing video editing or modeling, I would probably want it there too.

[–] [email protected] 2 points 1 year ago (1 children)

Do you have a dedicated GPU?

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

Yes, an RX 7800 XT. I can confirm DRI_PRIME does switch to the integrated GPU on demand:

$ DRI_PRIME=0 glxinfo | grep "OpenGL renderer"
OpenGL renderer string: AMD Radeon Graphics (gfx1101, LLVM 16.0.6, DRM 3.54, 6.5.5-arch1-1)
$ DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
OpenGL renderer string: AMD Radeon Graphics (raphael_mendocino, LLVM 16.0.6, DRM 3.54, 6.5.5-arch1-1)

[–] [email protected] 1 points 1 year ago (1 children)

I am assuming you have the monitor connected directly to the 7800 XT, which is why it is the default GPU.

Is the decoding being done in hardware when watching the video? amdgpu_top shows whether the application (VLC in this case) is using the decoding hardware (the column named DEC).

Also, using the iGPU for video decoding should be more efficient, because the massive number of cores in the dGPU aren't needed for decoding, yet they're kept powered up whenever the dGPU is active.

[–] [email protected] 1 points 1 year ago (1 children)

The problem has been solved (it was caused by Mesa's video decoding packages), but I will answer anyway.

Yes, the VCN (Video Core Next) column stays at a constant value while playing video (3% for VA-API with Mesa, 5% for VDPAU with Mesa, 0% for libplacebo), and GFX fluctuates between 0% and 1%.

Just playing a 1080p video (not even a high-bitrate one) is enough to make the GPU fan spin up, which is disappointing.

[–] [email protected] 1 points 1 year ago

Hmm, must be some bug in Mesa or the way it interacts with VLC. I use VA-API with Mesa for my decoding purposes on a system (a laptop) with a Vega iGPU and an RDNA1 dGPU, and I don't see high energy usage. In fact, I get much better battery life with VA-API hardware decoding.

[–] [email protected] 1 points 1 year ago (1 children)

Which distro are you using? Fedora, Manjaro, and a few others disabled hardware acceleration for certain codecs, making CPU usage and power draw spike. For Fedora, you can enable RPM Fusion and install the hardware-accelerated versions to get back to normal.

https://rpmfusion.org/
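
On Fedora the usual steps are enabling the RPM Fusion free repo and swapping in the full Mesa VA-API driver; roughly this (a sketch; check the RPM Fusion site for the current instructions):

$ sudo dnf install https://mirrors.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm
$ sudo dnf swap mesa-va-drivers mesa-va-drivers-freeworld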

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

Turns out in my case it's the Mesa driver causing the problem. After removing Mesa's VA-API and VDPAU drivers, VLC still plays everything just fine, and CPU usage is at 2~3%.

[–] [email protected] 1 points 1 year ago (1 children)

Maybe you could override the video acceleration driver instead, via the LIBVA_DRIVER_NAME environment variable?

Hardware Video Acceleration
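
Something like this (a sketch; radeonsi is Mesa's VA-API driver name for AMD GPUs, and vainfo comes from libva-utils):

$ LIBVA_DRIVER_NAME=radeonsi vlc some-video.mkv
$ vainfo   # prints which VA-API driver is actually loaded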

[–] [email protected] 1 points 1 year ago (1 children)

I don't think that would work, since both GPUs are AMD and use the same driver.

[–] [email protected] 1 points 1 year ago

Oh yeah, I missed that :/