AMD CPU video encoding (Reddit discussion roundup)

I'm not a professional video editor, but I often use After Effects and Premiere Pro for personal video editing.

AMD's hardware encoder has long been weak for H.264; even the HEVC encoder is still severely lacking.

Scenario: ultra-low power usage, video editing/encoding, iGPU-accelerated, CPU only, without a dGPU. AMD APUs are better for gaming.

So in theory the CPU version will look marginally better, but in practice that difference is so minor that even if you are looking for it directly, 99.9% of people won't see it. Both can be used on your current motherboard, IMHO. Nvidia has much better GPU encoding with its NVENC.

Edit 2: For anyone interested in actual metrics, I'd recommend watching the video "AMD improves video encoding yet again! This time with Pre-Analysis" from Code Calamity. (It even beats everything for real-time encoding, but your CPU might be busy doing other stuff.)

AMD Ryzen 7000 has fast CPU cores for software encoding, but also supports hardware-accelerated 10-bit HEVC encoding, like Intel QSV but called AMD VCE. Maybe a 5600GE.

Also of note: I've had similar experiences (GPU encodes at higher CQ values being better quality than "better" CQ values on the CPU) on Intel GPU hardware.

These AMD processors are so much easier to cool than Intel's high-core-count CPUs, e.g. the 10900, 11900, or 12900.

I love AMD CPUs, but AMD still needs to work out the kinks with their GPUs. As others suggested, try streaming at 900p. Even Nvidia's NVENC for H.265 on the newest cards is a lot less efficient than software encoding with, e.g., x265.
Or an iBUYPOWER Gaming PC SlateMR 215a (AMD Ryzen 5 5600G, 3.9 GHz, ...).

I had an RX 580 and an RX 5700, and when gaming they'd run at full blast, get super hot, and warm up my room.

You can't just shuffle data to and from the ... You should know, though: AMD has a strange fetish for poor video-encoding quality. With AMD and Plex, hardware transcoding is only supported on some ...

Welcome to the official subreddit of the PC Master Race / PCMR! All PC-related content is welcome, including build help, tech support, and any doubt one might have about PC ownership. You don't necessarily need a PC to be a member of the PCMR. You just have to love PCs. It's not about the hardware in your rig, but the software in your heart!

Using the likes of x265 (the name of the open-source HEVC CPU encoder), 10-bit will be much slower to encode, as it uses the CPU cores, but provides far superior quality at the same file size compared to AMD VCE and Intel QSV.

Using the AMD software encoder in OBS: since the 7950X3D lacks an iGPU, the AMD HW H.264 (AVC) option in OBS is not applicable to your situation. "VCE" is the name of AMD's video-encoding hardware acceleration, while "UVD" is the name of its video-decoding hardware acceleration.

Apple, Intel, and AMD are all busy throwing more resources at various problems, but they aren't all focusing on the same areas. And serious work demands CPU encoding, not GPU-based encoding. CPU encoding is good for final encodes, where the video is used directly rather than uploaded to social platforms for later re-processing.

Hardware transcoding is when it uses a GPU.

Here is an editing-and-encoding scenario on the 3700X; I'm not including 4K/60 fps in this sample.

Hardware encoding simply uses fixed functions specialised for exactly one task.

Intel CPUs for hardware transcoding. You could always opt for a dedicated streaming PC. The Nvidia hardware video encoder is excellent and is critical to both Link and Virtual Desktop.

H.266 (Versatile Video Codec, successor to H.265/HEVC) was finalized on July 6.

While there aren't a huge number of workloads that will give the 5800X up to a 33% advantage over the 5600X, video encoding is definitely one of them.
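The x265 10-bit workflow praised above maps directly onto an ffmpeg invocation. A minimal sketch that only constructs the command line rather than running it; the file names, CRF 18, and the "slow" preset are illustrative assumptions, not settings taken from the thread:

```python
def x265_10bit_cmd(src: str, dst: str, crf: int = 18, preset: str = "slow") -> list[str]:
    """Build an ffmpeg command for a 10-bit x265 (CPU) encode."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx265",          # software HEVC encoder, runs on CPU cores
        "-preset", preset,          # slower presets trade speed for efficiency
        "-crf", str(crf),           # constant-quality mode
        "-pix_fmt", "yuv420p10le",  # 10-bit output
        "-c:a", "copy",             # leave audio untouched
        dst,
    ]

print(x265_10bit_cmd("input.mkv", "out.mkv"))
```

The same skeleton works for the hardware paths the commenters compare against by swapping `libx265` for `hevc_amf` (AMD VCE) or `hevc_qsv` (Intel Quick Sync).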
I tested encoding with both the CPU (x264) and the GPU (AMD AMF), but at a bitrate of 7,000 kbps. The HEVC encoder is miles ahead of H.264.

If you plan to use new hardware all around, you can maybe go for a Ryzen 7000-series CPU on AM5, where you have a Radeon 6000-series integrated GPU.

That's why GPU encoding is usually best: it's imperceptibly different and significantly faster to encode. GPU encoding on AMD is nowhere near as good as it is on Nvidia hardware.

The encode/decode block is likely the same as in other RDNA2 GPUs, so I wouldn't expect there to be a difference in capabilities.

Also, for video encoding you don't need a GPU; CPU encoding will always be better than GPU encoding. AMD's hardware path is GPU encoding, which is quite a bit worse than x264 (CPU encoding) in quality, but with a much lower FPS hit.

Expect perhaps a 20% speedup just from that for 4K Slow or Slower encodes.

On top of that, the 470 is a bit of an old card and is likely already being maxed out in games, so trying to do GPU encoding would likely be a mess.

Hello, I would like to know if there is a way to switch the video-encoding job to the CPU instead of the GPU.

H.266, the Versatile Video Codec, is the successor to H.265.

Now we have multiple CPUs on one chip with shared caches, both with access to the memory controllers, and the other North Bridge functionality has become the "uncore" or "system agent" on Intel CPUs.

It also doesn't help that browsers are terrible at hardware-accelerated video playback and still waste a lot of CPU resources while doing it.

Rule of thumb: (number of lines of video resolution) / (number of encode threads) > 30, and ideally > 50.

You don't get faster video encode/decode by throwing more CPU resources at it; CPUs are terrible at video encode/decode.
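The lines-per-thread rule of thumb quoted above turns into a one-line calculator. A sketch (the function name is mine, not from the thread):

```python
def max_encode_threads(height: int, lines_per_thread: int = 30) -> int:
    """Upper bound on useful encode threads for a given frame height,
    per the (video lines) / (threads) > 30 rule of thumb."""
    return height // lines_per_thread

print(max_encode_threads(1080))      # floor of the rule: 36 threads at 1080p
print(max_encode_threads(1080, 50))  # safer ">50" target: 21 threads
```

This is why extreme core counts stop helping a single H.264 encode: past the bound, extra threads shrink each slice until quality suffers.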
I'd like to go AMD with this one, specifically one of the low-power APUs.

AMD RDNA2 encoding performance is worse than NVIDIA NVENC; if you actually use both AMD and NVIDIA, there is a noticeable difference.

On fast CPU presets like "fast", quality is comparable with GPU encoding, but it runs slowly; there is no reason to use these presets.

"Video Encoding Tested: AMD GPUs Still Lag Behind Nvidia, Intel." It's a weird world we live in where AMD has quite successfully ...

GPU encoding is faster; CPU encoding is more accurate.

The stuttering does not happen if you use the Windows PC Oculus Store version of VRChat.

This is slightly more efficient in AMD CPU + AMD GPU cases, but the difference in performance over an Intel CPU + AMD GPU is a few percent at most, and relatively small compared to which actual CPU and GPU you choose.

Video encoding and decoding in GPUs actually use a dedicated application-specific integrated circuit somewhere on the GPU die.

If you can wait, the Ryzen 7000 series features AVX-512 instructions, and this does appreciably speed up the x265 encoder; the slower the preset, the greater the benefit.

Just wondering how much faster these new Ryzen laptops with the new CPUs would be for video encoding in Premiere? Grab the AMD Ryzen 7 ...

Which is why you can find videos of people using CPUs to render video games (badly); what they gain in broad compatibility, they lose in targeted workload efficiency.
When the video card has no, or an inferior, hardware encoder, the CPU is tasked with encoding the video stream that is used by Link and VD.

Edit: also, the 6000-series cards can handle streaming pretty well.

I just couldn't get a GPU encoder to reliably encode at a decent quality, whereas the CPU encoders do just fine.

I've of course searched this topic before; I know NVENC typically works far faster, but CPU encoding is deemed higher quality and more precise.

AMD hasn't improved their H.264 encoder hardware in a while, while Nvidia got an update with Turing and Intel has been steadily improving theirs for at least a couple of CPU generations now.

Now, this card (6500 XT) looks better than the GTX 1650 in gaming, but what is stopping me from buying the 6500 XT is that there are a lot of reviews about it being so bad at video rendering/encoding.

That's absolutely expected, as x264 is a software-based encoder that uses only the CPU to process the video, while any hardware-accelerated encoder (like AMD's AMF, NVIDIA's NVENC, and Intel's QSV) runs on the GPU, using a dedicated part of the chip to do the video processing.

So I have a question. I'm recording on a laptop with a CPU that has integrated graphics (AMD Ryzen 7 3700U with Radeon Vega 10 Mobile Gfx) but obviously no discrete GPU. Which do I use, hardware or software encoding? And if hardware, do I use H.264 or H.265? I record at 1280x720 at 30 fps.

You can improve the quality for x264 by choosing a slower preset, but this will ...

QSV (Quick Sync Video) from 11th gen. I also play a few relatively old games.
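For the 3700U recording question above, a software (x264) path is a reasonable default at 720p30. A sketch that only builds the command; the preset and CRF values are my illustrative assumptions, and `h264_amf` could be swapped in for `libx264` to try the Vega iGPU's hardware encoder instead:

```python
def record_720p30_cmd(src: str, dst: str, preset: str = "veryfast", crf: int = 23) -> list[str]:
    """ffmpeg command for a CPU (x264) 1280x720 / 30 fps recording encode."""
    return [
        "ffmpeg", "-i", src,
        "-vf", "scale=1280:720", "-r", "30",   # target resolution and frame rate
        "-c:v", "libx264",                     # software encoder on the CPU
        "-preset", preset,                     # fast preset keeps laptop load low
        "-crf", str(crf),                      # constant-quality target
        dst,
    ]

print(record_720p30_cmd("capture.mkv", "recording.mp4"))
```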
My advice: pick out a few movies from your collection, some modern/shot-on-digital and some older/shot-on-film.

If my SSD with Windows 10 is already GPT, can I go into my BIOS and just ...

The Intel Arc cards are actually pretty solid for video encoding.

The games I plan to host are very single-core-performance focused, so I'm looking at CPUs with good single-core performance, which according to my research is almost always Intel. So, if you want maximum speed, go for Intel.

An 8 Mbps CPU-only video will look better than an 8 Mbps Quick Sync video.

Which is almost as expensive as an i9 9900K (8c/16t).

Also try using the CPU encoder or different recording settings; I've tried streaming and recording both through OBS and the Adrenalin software.

You can only have so many encode threads under H.264 before you start to have quality problems.

The game-ready drivers can prevent hardware encode/decode from working correctly, so that may explain the increase in usage.

However, for single-PC gaming and streaming, if you are OK with a slight video-quality reduction, I would use the GPU encoder to save CPU performance. Not as good as NVENC, but still good. Cheers.

However, if you're still seeing high CPU usage, it's probably just a matter of the effects or whatever you're ...

Currently, AMD is still far behind on quality of H.264 encoding.

I currently have a PC with a GTX 1660 Ti using NVENC to chew through the batch queue, but I also have two dormant computers with AMD VCN (RX 6000-series GPUs) that I'd like to add to my Tdarr setup to assist with the transcoding. I don't want to use the CPU, due to the slower speed and the heat output that CPU encoding brings. (So the maximum number of encode ...)

I did hundreds of tests and can say that AMD VCE file size is double for the same picture quality and accuracy of motion prediction, plus it crashes a lot.

Parsec doesn't list a hardware decoder as an option when I use my AMD GPU.
But then I switched to a 1070, and even when gaming it was silent and would barely get hotter.

Makes sense not to send the video across the PCIe busses.

So yeah, you can buy a newer, more expensive CPU to correct one of the problems with ...

TBH, as an owner of a Ryzen 9 7940HS system, I was underwhelmed by hardware-accelerated AV1 encoding on this chip. Video quality doesn't look any better than HEVC (even when set at "High Quality"), and loses significantly to preset-8 SVT-AV1.

With Nvidia (unlocked drivers or > P2000), if you tap out the encoding chip, it will not fall back to the CPU. The CPU will do CPU AV1 encoding, just vastly slower than dedicated encode hardware; as of now, I don't believe any CPU has a built-in GPU with AV1 encoding.

The 5800X was another 40% reduction in encoding time over the 3600X in Handbrake.

The stuttering only happens if you are using Steam and the Steam install of VRChat.

This is available in integrated GPUs as well.

x264 has better quality than the AMD hardware encoder.

Hardware transcoding is a little more complicated on AMD, though. The 5600G will 100% support software transcoding.

Been doing a lot of rendering on my desktop lately, which is an all-AMD setup. Lots of options available, with Intel and AMD CPUs, Intel integrated graphics, Nvidia graphics, and AMD graphics.

GPUs are only beginning to support video encoding, and only for specific formats at the moment.

I can encode a 24-minute video with x265 and it will be like 300 MB, but VCE will be at least 600 MB. (A CPU encode of the same file at RF 16 was 658 MB.)

Hi, I'm planning on building a home server to use for hosting game servers (mostly Minecraft and Team Fortress 2) and a Plex server.

You get faster by throwing more specialised silicon at it.
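The "x265 at ~300 MB vs. VCE at ~600 MB for a 24-minute video" comparison is just bitrate times duration. A quick sanity check, with the bitrate figure back-derived from the commenter's file size rather than stated in the thread:

```python
def file_size_mb(bitrate_kbps: float, minutes: float) -> float:
    """Approximate video file size in MB (1 MB = 1e6 bytes),
    ignoring audio and container overhead."""
    return bitrate_kbps * 1000 * minutes * 60 / 8 / 1_000_000

# ~300 MB over 24 minutes corresponds to roughly a 1,667 kbps stream;
# the hardware encoder needing ~600 MB for similar quality means ~2x the bitrate.
print(round(file_size_mb(1667, 24)))
print(round(file_size_mb(3334, 24)))
```

Doubling the bitrate for the same perceived quality is exactly what "the hardware encoder is less efficient" means in practice.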
Next level would need a motherboard change: for video editing you'd use the high-core-count Zen 2 or Zen 3 CPUs.

Which is why things like 3D graphics rendering, number crunching, crypto mining, encoding, etc., are much better run on GPUs and ASICs.

Particularly the AMD CPU ones were interesting.

CPU encoding is brute-forcing it to happen: you need a lot more CPU performance and cores, which means it's terrible for streaming and uses a lot more power than a GPU with native hardware encoding. A hardware encoder can churn out H.265 video much faster than a CPU can.

... 3.9 GHz, AMD Radeon RX 6600 XT 8GB, 16GB DDR4, 480 GB SSD. I think I finally found ...

Does Frigate take advantage of Intel Quick Sync Video hardware encoding? Or would AMD be better for more raw cores/threads but no ... Does Shinobi use hardware encode/decode? Intel CPU for QSV vs. AMD with more ...

Any CPU will support software transcoding.

I am using the iGPU Vega 11, and it's really a pain watching high-res videos; the video-encode engine spikes like crazy and YouTube videos lag even at 720p.

It was true with H.264, was true with H.265, and remains true for AV1.

The 5900X is more than enough to handle any game and stream at the same time.

Since most of the streaming services will re-encode the video anyway (especially if it isn't H.264) ...

I managed to encode a 30-second 4K video in 14 seconds, so about 64 FPS, although my CPU was also at 100% during rendering (my GPU's Video Codec 0 engine sat between 65% and 80% during the render).

So the Arch wiki states that AMDGPU PRO is not recommended except if you have a professional GPU; home users are not intended to use this package. But I also read that the video encoding/decoding engine in AMD cards, AMF, can't be used without the amdgpu-pro package installed.

Some even have HDR built into the display.
For video editing and capturing, Nvidia is the best choice for now.

Only video encoding is performed by the hardware encoder. Nvidia calls it NVENC, AMD calls it VCE, and Intel calls it Quick Sync. But, importantly, Quick Sync lowers video quality.

I wonder if this is completely GPU-architecture-bound, because I was quite disappointed by the performance of the long-awaited VAAPI encode for AMD GPUs that Valve enabled for Steam In-Home Streaming earlier this year, on my ...

So, unfortunately, you won't be able to use an AMD iGPU for encoding, because your CPU doesn't have one.

Just think: for at least a decade, AMD has had the worse GPU, and just NOW they are developing a card that can compete with Nvidia. AMD is consistently the worst when it comes to quality.

It may be that getting the large uncompressed frame off the GPU into CPU memory for the iGPU to see would be more of a performance hit than firing up the video-encoder block on the GPU itself.

I have the AMD RX 5700 XT graphics card and a Ryzen 7 3800X processor.

What video encoding were all of those 7 transcodes using? x264 or x265?

Note that many of the people deep into CPU encoding employ multiple techniques to make their encoding as fast as possible: fully optimized binaries (native arch opts, O3, LTO, and even PGO), chunked encoding to maximize thread scaling, and a fully optimized OS flavor with performance governors for maximum speed. As a result, it is normal to have high (even 100%) CPU utilisation during encodes.

Even the Core i7 10700K stock build that I have consumes way more electricity (175-200 W +/-) and puts out more heat than my ...

Seems like not long ago I saw another AMD Reddit post begging AMD for a better encoder.

AMD vs. Intel CPUs using AV1: NVENC was the clear winner, with faster compression speeds, comparable file size, and better quality.

It's just using the CPU cores to transcode.
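The "chunked encoding" trick mentioned above splits the source into segments that are encoded in parallel and concatenated afterwards. A sketch of just the boundary math (the actual splitting and concatenation would be done with a tool like ffmpeg, which is not invoked here):

```python
def chunk_ranges(duration_s: float, chunks: int) -> list[tuple[float, float]]:
    """(start_seconds, length_seconds) pairs that tile the whole duration."""
    step = duration_s / chunks
    return [(i * step, step) for i in range(chunks)]

# A 1-hour source split 4 ways, for 4 parallel encoder processes:
print(chunk_ranges(3600, 4))
```

Splitting at keyframe-aligned boundaries in practice avoids re-encoding artifacts at the seams; that detail is deliberately left out of this sketch.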
Seeing these reviews, I don't think AMD AMF can be compared to NVIDIA NVENC.

In VCEEnc, I got 61 FPS at 4K; my Video Codec 0 engine sat at full usage, but the CPU wasn't used during the encode, sitting below 5%.

An A770 is still reasonably priced compared to a lot of the new stuff, especially since most viewers are watching on mobile anyway. They seem to be the best bang for the buck for encoding, as far as I can tell.

Generally speaking, you'll want to veer toward Intel CPUs for video encoding in single-GPU mainstream PC setups, to maximize single-core throughput. The best-value AMD Ryzen Threadripper CPU for encoding: the Threadripper 7960X.

Encode a good 15-20 minutes of each of them with both CPU and GPU.

Most AMD CPUs don't have a built-in GPU to do hardware encoding, but the higher core counts, higher-quality software encoding, and faster IPC almost make up for that fact.

My current CPU is the Intel Core i5 9600K. A 6c/12t CPU for this platform would be the old 8700K.

I've noticed that when rendering I can only use software encoding, whereas on my laptop (Intel 9th-gen i7 with an RTX 2070) I ...

If you use GPU encoding, no issues there, but for fluent CPU video encoding I would say a 5900X-5950X would do an excellent job without frame drops, depending on resolution/quality.

Reply from JirayD: You're mixing things up. That's H.265 10/8-bpc encoding, and AV1, VP9, and H.264 10/8-bpc decoding.

Computer Type: Desktop; GPU: Sapphire Radeon RX 5700 XT; CPU: AMD Ryzen 3600X; Motherboard: MSI X570 A-PRO; RAM: Corsair 16GB DDR4 3200MHz; PSU: EVGA 650W Gold GQ; Case: Aerocool Cylon Mid-tower; Operating System: ...

When rendering videos, I have the option of using NVENC (Nvidia GPU) or my CPU to do the heavy lifting.
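The "encode 15-20 minutes of each with both CPU and GPU" test is fairest with a losslessly cut sample clip, so both encoders see identical input. A sketch that only constructs the ffmpeg command (file names and the 10-minute start offset are placeholders of mine):

```python
def sample_clip_cmd(src: str, dst: str, start: str = "00:10:00", minutes: int = 15) -> list[str]:
    """ffmpeg command to cut a test clip without re-encoding."""
    return [
        "ffmpeg", "-ss", start, "-i", src,
        "-t", str(minutes * 60),   # clip length in seconds
        "-c", "copy",              # stream copy: bit-identical to the source
        dst,
    ]

print(sample_clip_cmd("movie.mkv", "clip.mkv"))
```

The same `clip.mkv` can then be fed to an x264/x265 run and to an AMF/NVENC/QSV run, and the outputs compared at equal file size.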
All modern GPUs now support hardware video encoding, which encodes H.264 and H.265 much faster than a CPU can.

AMD Threadripper, 16 cores. I'm not too sure about Premiere, in all honesty, as I encode videos using other tools such as StaxRip and Handbrake.

AMD did make some improvements to their API by adding B-frames and now pre-analysis, so their software wasn't great either. Ironically, Intel has the best.

I have a 1st-gen Ryzen 7 at 3.0 GHz and an RX 580. I edit and encode videos with Sony Vegas Pro.

You can't customize much; AFAIK it does not support film-grain synthesis, and even the slowest preset is a lot worse than normal CPU encoding.

CUDA just has a better pre-built infrastructure than AMD's stream processors.

So search for and watch his videos.

The software encoder isn't as important as the hardware for this use case. The CPU should always be the first choice at ... This is the quality you will get. Streamers used to buy a second ...

Thank you for the detailed explanation.

NVENC supports H.264 and HEVC B-frames, but AMD AMF currently only implements B-frame support on H.264.

Both AMD and Nvidia will be bringing new engines for AV1 encode, so there could perhaps be other enhancements as well.

AMD CPUs are the best for software encoding (smaller files, slowly) and NVIDIA for hardware acceleration. So just drop that DTS-HD audio track and you'll save way more hard-drive space than you will by CPU-encoding the video.
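The "drop the DTS-HD track" advice maps to a stream-copy command: keep the video bits untouched and replace only the bulky lossless audio. A sketch that only builds the command; the AC-3 codec choice and 640k bitrate are illustrative assumptions, not values from the thread:

```python
def recode_audio_cmd(src: str, dst: str, audio_bitrate: str = "640k") -> list[str]:
    """ffmpeg command: copy video as-is, re-encode audio to lossy AC-3."""
    return [
        "ffmpeg", "-i", src,
        "-map", "0",               # keep all streams from the input
        "-c:v", "copy",            # video untouched: zero quality loss, instant
        "-c:a", "ac3",             # replace lossless DTS-HD with lossy AC-3
        "-b:a", audio_bitrate,
        dst,
    ]

print(recode_audio_cmd("movie.mkv", "smaller.mkv"))
```

Because the video stream is copied, this runs at disk speed rather than encode speed, which is the point of the advice: large savings for essentially no CPU time.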
For AV1 encoding specifically, the oldest x86-land architectures which support it are Lovelace (NVIDIA), RDNA3 (AMD), and Alchemist (Intel).

Parallelization in video encoding can only get you so far. On this project, it was a 2-hour-32-minute video, stitched together from 64 clips.

In AMD's case, VCN4 is a pretty drastic departure from VCN3.

I think that was the case maybe a few years ago, but both NVIDIA's and AMD's encoders are currently really good (NVIDIA RTX 2000/3000/4000, Radeon RX ...).

You should have ample CPU headroom with that CPU and TF2 to use CPU encoding, which will give you a better result than either AMD or Nvidia, whether that's NVENC, Intel Quick Sync, or AMD.

I'm still a little curious about video quality in my made-up scenario (AMD vs. Nvidia with no game running), but it seems like a single 3090 should be adequate, but only if my CPU can't keep up with encoding video.

Radeon ReLive, Streaming, and Video Encoding.

All in all, that probably makes Ryzen 7000 CPUs perhaps 50% faster than Ryzen 5000 CPUs for x265 encoding. I hope AV1 gets properly implemented soon, though; then it will matter much less which one you pick.

Hardware support for media encode/decode has historically been part of the GPU, which the 5500 lacks entirely. This option is generally for systems with AMD GPUs or APUs (CPUs with integrated Radeon graphics).

We can also take frames from your CPU "medium" encoded video, compare them to the NVENC output, and see the lower quality of the CPU encode.

I would say that you use the CPU.

The GPU also integrates an AMD Video Coding Engine (VCE) for H.264 and H.265 encoding.

On AMD CPUs, the CPUs became chiplets and the North Bridge became the I/O die.

H.264, VCE, Quick Sync, and NVENC are not the same kind of thing: H.264 is a codec, while VCE, Quick Sync, and NVENC are hardware video encoders; CPU encoding can get better visual quality at any given bitrate.

As the title states, does the client's CPU or GPU matter for decoding H.265?
Additional detail: I'm hoping to get a laptop for use around the house.

In future chips it seems AMD is merging the two into one brand, called VCN, which is already used on Raven Ridge, and I suspect will likely be in Navi as well.

Hi. NVIDIA NVENC supports H.264 and H.265.

Every stage prior to and after video encoding, including decoding, filters, audio/video sync, audio encoding, muxing, etc., is performed by the CPU.

Don't know how fast the AMD and Intel solutions are, but NVENC can usually encode video around 4x-6x faster than the CPU. The CPU is significantly higher quality, but only with good presets like ffmpeg x264 "slower"; it's very noticeable at lower bitrates. At a certain point it becomes more of a bottleneck than an advantage.

The one generalization I can make is that it's better to use the GPU if you can, because the GPU will encode the frame buffer that is already in its VRAM; with CPU encoding, that whole frame buffer needs to be fed out to the CPU, encoded, written to RAM, and then sent.

I thought that was normal.

Specs: core count: 24. Looking to set up a PC to run Plex on. Yes but no. Will it support Plex's GPU encoding? From the documentation it appears not: "If your Linux computer also has a dedicated graphics ..."

H.264 is a video codec, a protocol for how to compress video streams.

I have a 3800X and a 6700 XT, just use CPU encoding at medium, and never have issues; the stream looks fantastic.

Update on the video-stuttering issue with AMD GPUs and video players in worlds like LSMedia.

A 5700G and a capture card, or one of those mini PCs with a 6950HX (or whatever those mobile AMD CPUs are) and an external capture card.
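The "4x-6x faster" claim is easier to reason about as wall-clock time. A quick sketch, using the 2h32m project mentioned earlier with illustrative encoder speeds of my own choosing:

```python
def encode_minutes(video_minutes: float, video_fps: float, encode_fps: float) -> float:
    """Wall-clock minutes to encode a clip at a given encoder speed."""
    return video_minutes * video_fps / encode_fps

# A 152-minute, 60 fps project: a 40 fps CPU encode vs. a ~5x faster hardware encode.
print(round(encode_minutes(152, 60, 40)))   # CPU: 228 minutes, slower than real time
print(round(encode_minutes(152, 60, 200)))  # hardware: about 46 minutes
```

The ratio of source fps to encoder fps also shows at a glance whether an encoder can keep up with a live stream (encode_fps must be at least video_fps).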
If you have an Nvidia GPU, use the AV1 NVENC encoder.

Streaming expert and YouTuber EposVox published a performance preview of AMD's new AV1 encoder, found in its RX 7000-series GPUs, and found the encoding quality to ...

So, to conclude, the issue seems to be related to AMD drivers, because recorded video quality is just fine with the 23.1 drivers.

AMD just doesn't have a proprietary core-utilization framework like Nvidia's CUDA.

Yes, Intel will win, and by a significant margin, if you're comparing Quick Sync (CPU + iGPU) performance vs. pure CPU performance on AMD.

CPUs tend to produce better quality in the final output than AMD's hardware encoder at lower (Twitch-limit) bitrates. (Around 200 FPS in Handbrake with a customized 1080p30 "slow" preset.)

I have heard that AMF isn't that great, and that you should go with x264 if you have the spare CPU power, but I'm curious if this is right for my specs. As someone who is contemplating getting a 6800 XT but who is also interested in streaming, the video ...

Linux is up to ~25% faster on AMD CPU+GPU combos compared to Windows 11!

I am still new to the AMD platform, and I hope to learn more.

It would be ideal to use the most efficient format, to make the best of an encoder that is a bit weaker to begin with. No.

Full specs show much higher encode/decode complexity and processing requirements; hardware (GPU) support is still likely 4+ years off, but the protocol was submitted in time for ...

Trying to decide between using x264 or AMF to encode my stream. There's also the issue of CPU-to-GPU bandwidth and latency.

The video was 1080p at 60 fps, 37 Mbps average and max, MainConcept AVC, and it was a two-pass encode.
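The AV1 NVENC suggestion corresponds to ffmpeg's `av1_nvenc` encoder on RTX 40-series (Lovelace) cards. A sketch that only constructs the command; the CQ and preset values are illustrative assumptions:

```python
def av1_nvenc_cmd(src: str, dst: str, cq: int = 30, preset: str = "p5") -> list[str]:
    """ffmpeg command for an NVIDIA hardware AV1 encode."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "av1_nvenc",   # NVIDIA hardware AV1 encoder (Lovelace and newer)
        "-preset", preset,     # p1 (fastest) through p7 (best quality)
        "-cq", str(cq),        # constant-quality target
        "-c:a", "copy",
        dst,
    ]

print(av1_nvenc_cmd("input.mp4", "out.mkv"))
```

On AMD RX 7000 or Intel Arc hardware, `av1_amf` or `av1_qsv` would take the place of `av1_nvenc` in the same skeleton.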
That said, I have tried the various GPU encoders, and the quality is just abysmal compared to the same quality settings on a CPU.

So it looks like the issue is actually a Steam issue with VRChat and AMD GPUs.