
AMD New Horizon Event Showed Off Ryzen CPU Technologies and a Peek at Vega GPU Performance

Category: CPU's
Posted: 06:51PM

Earlier today AMD held its New Horizon event, which was live-streamed for anyone interested in seeing the upcoming Zen-based CPUs put to use, along with some new information about the architecture. One of the announcements was that the name for the upcoming processors will be Ryzen. AMD also revealed SenseMI, which consists of multiple machine intelligence components meant to improve performance and efficiency. Following the Radeon Instinct announcement yesterday, we can clearly see AMD is trying to leverage this field.

The five SenseMI components are Pure Power, Precision Boost, Extended Frequency Range, Neural Net Prediction, and Smart Prefetch, and all of them run in the background to optimize performance. Pure Power uses the over 100 sensors within the CPU to selectively reduce power consumption, based on the millivolt, milliwatt, and single-degree-precision temperature measurements those sensors collect. Precision Boost is able to increase clock speeds in increments of 25 MHz, based on the task at hand and the measurements from those sensors, and it can do this up to one thousand times a second. Extended Frequency Range allows Precision Boost to push the clock speed higher than the normal boost clock, if the cooling system has the headroom. Neural Net Prediction is an artificial intelligence neural network that analyzes what an application does so it can predict which pathways it will need in the future. Smart Prefetch fetches data it predicts will be needed ahead of when it is requested, like other prefetch systems but apparently with more sophisticated learning algorithms. No doubt helping these last two components is the combined 20 MB of L2 and L3 cache.
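
To get a feel for how fine-grained those adjustments are, here is a purely illustrative Python sketch of a governor stepping a clock in 25 MHz increments based on sensor readings. The thresholds, function name, and the 3.8 GHz ceiling are invented for the example; this is not AMD's actual algorithm.

    # Purely illustrative: step a clock in 25 MHz increments toward or away from
    # a boost ceiling, the way Precision Boost is described as working. The
    # thresholds and the 3.8 GHz ceiling are invented for this sketch.
    STEP_MHZ = 25
    BASE_MHZ = 3400
    MAX_BOOST_MHZ = 3800  # hypothetical; XFR would raise this with cooling headroom

    def next_clock(current_mhz, temperature_c, load):
        """Return the next clock, moving at most one 25 MHz step per decision."""
        headroom = temperature_c < 70.0 and load > 0.9  # invented conditions
        if headroom and current_mhz + STEP_MHZ <= MAX_BOOST_MHZ:
            return current_mhz + STEP_MHZ
        if not headroom and current_mhz - STEP_MHZ >= BASE_MHZ:
            return current_mhz - STEP_MHZ
        return current_mhz

    # At up to ~1000 decisions per second, that is up to 25 MHz of movement per millisecond.
    clock = BASE_MHZ
    for _ in range(16):
        clock = next_clock(clock, temperature_c=62.0, load=0.95)
    print(clock)  # 3800 in this toy run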

Of course, the various new technologies and fancy marketing names are of little importance if the hardware cannot perform, so the Ryzen CPU, with 8 cores and 16 threads and clocked at 3.4 GHz without any boost, was pitted against the Intel Core i7 6900K in a number of tests. The i7 6900K, also an 8-core/16-thread processor, was effectively off-the-shelf with the default base clock of 3.2 GHz and boost clock of 3.7 GHz enabled (though comments I have seen point out that in a multithreaded task, we might not see it boost), and it was matched or beaten by the Ryzen CPU. These tests included rendering an image in Blender (you can download the project from the New Horizon webpage), transcoding video with Handbrake, and playing Battlefield 1 at 4K resolution. For the gameplay, the CPUs were each paired with a Pascal-based NVIDIA Titan X. By the way, the i7 6900K has a TDP of 140 W while the Ryzen CPU has a TDP of 95 W.
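
AMD did not publish the exact command lines behind these tests, but for anyone who wants a rough sense of how a similar CPU render test could be timed at home, a sketch using Blender's command-line rendering might look like the following; the .blend file name is a placeholder, not AMD's project.

    # Rough sketch of timing a CPU render with Blender's command-line mode.
    # "ryzen_demo.blend" is a placeholder; substitute the project downloaded
    # from the New Horizon webpage and your own frame number.
    import subprocess
    import time

    start = time.time()
    subprocess.run(
        ["blender", "--background", "ryzen_demo.blend", "--render-frame", "1"],
        check=True,
    )
    print(f"Render took {time.time() - start:.1f} s")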

At the end of the event some of the upcoming Star Wars Battlefront – Rogue One DLC was shown off, running at 4K and over 60 FPS on a Ryzen CPU and a Vega GPU. No specific information was given about the GPU, so details on it will have to wait.

The livestream is up on YouTube for anyone who would like to watch it.

 

Source: AMD and HotHardware



Waco on December 14, 2016 02:42
I watched it live and left disappointed that AMD thinks its fans are this oblivious to reality. The MOBA "demo" with a 6700K was worse than just misleading, since AMD itself announced a minimum-overhead streaming solution for its GPUs in the past week. There was nothing here to sate the enthusiast, and I found the entire thing a bit...distasteful. It could have easily been a well-thought-out demo day and instead they brought themselves down to dishonest demonstrations that meant essentially nothing.
Guest_Jim_* on December 14, 2016 04:50

For me it did not help that I had already seen a lot of the information via rumors and leaks, but I think your interpretation of the streaming demo is a bit off. I really do not mean to be insulting or anything like that, but how much streaming do you do, or how closely do you follow information about streaming? Because of how much I use OBS for streaming and recording, I very frequently check its forums, and while that is only a subset of streamers, there is a clear preference for CPU-based, software encoding (using x264) over the ASIC-based, hardware encoding AMD ReLive and NVIDIA ShadowPlay/Share offer. The reason is that software encoding can offer better quality streams than the ASICs can, but it is very hard to achieve real-time encoding, and it takes a very powerful CPU to manage that and play a game. That's why some actually use second PCs just for the encoding, to free up resources on the gaming computer while still having the quality of a software encoder.

There actually were some people on that forum who recently tested the various encoders available through OBS to rank them, though everything can vary depending on the game. They found that x264 on the veryfast or faster presets offers the best quality overall. The mathematical metric used seemed to indicate NVENC could trade blows with it, but the original tester was very clear that in high-motion scenes it falls behind. Someone else put Pascal NVENC ahead of x264, but it looks like they were also considering the minimal performance hit in their judgement, and they only tested at 720p30, not the 1080p or 60 FPS output some want to put out and that the original tester also looked at. Intel's QuickSync came in behind x264 and NVENC, and AMD's VCE behind that, while the superfast and ultrafast presets for x264 were discounted for their low quality. This testing was done with OBS and before ReLive came out, but that might not mean much. The AMF SDK that gives access to VCE has not been updated since before these tests, so it is possible neither performance nor quality has improved with the new drivers, as far as OBS is concerned. Hopefully updates to AMF will improve quality, as my understanding is that VCE has more potential than the SDK currently makes available, and hopefully Vega will offer an even better version of VCE that can compete more evenly with NVENC.
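
For what it is worth, the kind of metric-based comparison those testers ran can be approximated with FFmpeg: encode the same clip with x264 and NVENC at one bitrate, then score each result against the source with the PSNR filter. This is only a sketch, not the testers' exact methodology, and the file names and 3500 kbps figure are placeholders.

    # Sketch of a metric-based encoder comparison along the lines described above:
    # encode one source with libx264 (veryfast) and NVENC at the same bitrate,
    # then score each output against the source with FFmpeg's psnr filter.
    # File names and the bitrate are placeholders, not the forum testers' setup.
    import subprocess

    SOURCE = "gameplay.mkv"
    encoders = {
        "x264_veryfast": ["-c:v", "libx264", "-preset", "veryfast", "-b:v", "3500k"],
        "nvenc": ["-c:v", "h264_nvenc", "-b:v", "3500k"],  # needs an NVIDIA GPU / NVENC-enabled build
    }

    for name, args in encoders.items():
        out = f"{name}.mp4"
        subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *args, out], check=True)
        # Higher PSNR is roughly better, though it can miss the high-motion
        # artifacts the original tester called out; results print to ffmpeg's log.
        subprocess.run(
            ["ffmpeg", "-i", out, "-i", SOURCE, "-lavfi", "psnr", "-f", "null", "-"],
            check=True,
        )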

 

Of course the ASICs are very valuable to have, as they open up game streaming and recording to people without those powerful CPUs (like me), but that does not diminish the difficulty of real-time video encoding on a CPU while playing a game. Also, I sent an email to one of the PR contacts given with the press release asking about the software and settings used for the streaming demo, as I have not seen those stated anywhere. Hopefully I will hear back soon and can then share the answer.

Guest comment
Guest_Guest_* on December 14, 2016 23:05
I really hope AMD is not a bust this time, again. It appears it won't be, but that was also the impression the last couple of times, so only time will tell. Hopefully there is more on offer than just this combination of gaming, encoding, and streaming, because honestly, how many people do that? And I'm curious, for this combination of things, is the AMD chip doing well because of a new set of instructions, the raw power, or simply the larger number of cores? Because Intel has chips with an even larger number of cores, if that is what people really want to do.
Guest comment
Guest_Guest_* on December 15, 2016 18:49
I watched that video as well. That streaming part showed the i7 6700K struggling to play the game and still have enough power left to stream. But at the same time we have to realize the i7 6700K is a 4-core/8-thread chip and the AMD Ryzen was 8-core/16-thread, so pretty much double the core count, which meant the AMD had resources left to handle the extra load. I should also note that both systems had the same graphics cards installed. It does look like it might be promising for the about-to-be-released Ryzen, but only time will tell how it fares in the reviews, I guess.
Braegnok on December 15, 2016 20:10

 Smoke & mirrors,.. why not disclose boost frequency. :dunno:

 

[attachment=21088:comic-20111024.png]

Guest_Jim_* on December 17, 2016 00:03

Okay, I heard back from the AMD PR guy and got the information I wanted about the streaming test. It was done using OBS Studio (which, coincidentally, is also what I use), with the x264 encoder, at a constant bitrate of 3500 Kbps, a resolution of 1920x1080, the game set to run at 240 FPS, and, most importantly, the Fast preset. For reference, the order of these presets is:

  • Ultrafast
  • Superfast
  • Veryfast
  • Faster
  • Fast
  • Medium
  • Slow
  • Slower
  • Veryslow
  • Placebo

Those at the top of the list encode more quickly, but the image quality is worse as a result, so real-time encoding for streaming with the middle-of-the-pack Fast preset, while playing the game, is actually quite impressive.
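
For anyone who wants to see roughly what those settings amount to outside of OBS, here is an approximate FFmpeg equivalent of x264 at the Fast preset with a constant 3500 Kbps at 1080p. The input file, stream URL, framerate, and audio settings are placeholders; the exact x264 options OBS used were not disclosed.

    # Approximate FFmpeg equivalent of the disclosed stream settings: x264,
    # Fast preset, constant 3500 Kbps, 1920x1080. The source file, RTMP URL,
    # framerate, and audio settings below are placeholders.
    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "capture.mkv",
        "-c:v", "libx264", "-preset", "fast",
        "-b:v", "3500k", "-maxrate", "3500k", "-bufsize", "7000k",  # CBR-style cap
        "-s", "1920x1080", "-r", "60",
        "-c:a", "aac", "-b:a", "160k",
        "-f", "flv", "rtmp://example.invalid/live/streamkey",
    ], check=True)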

Waco on December 17, 2016 02:15
...except that nobody uses CPU encoding when they have an Intel, Nvidia, or AMD GPU.

Oh, right, that's everyone. It was a pointless comparison and an insult to any enthusiast's intelligence.
Guest_Jim_* on December 17, 2016 02:53

...except that nobody uses CPU encoding when they have an Intel, Nvidia, or AMD GPU.

Oh, right, that's everyone. It was a pointless comparison and an insult to any enthusiast's intelligence.

Yeah, except for all of those streaming enthusiasts who want the better quality only CPU encoding offers and will even go so far as to build a second computer specifically for CPU encoding. Just those enthusiasts who know the limitations of ASIC encoders. Oh, and maybe also those of us who recognize it is not even close to easy for a CPU to handle a video game and real-time encoding, let alone at the Fast preset and not something faster.

Seriously, Waco? Why are you not only dismissive but demeaning about this? It's irrelevant that you don't care about it, but don't start claiming to speak for my enthusiast opinion here, because at least my years of doing video capture and encoding disagree with you.

Waco on December 17, 2016 05:01

Yeah, except for all of those streaming enthusiasts who want the better quality only CPU encoding offers and will even go so far as to build a second computer specifically for CPU encoding. Just those enthusiasts who know the limitations of ASIC encoders. Oh, and maybe also those of us who recognize it is not even close to easy for a CPU to handle a video game and real-time encoding, let alone at the Fast preset and not something faster.
Seriously, Waco? Why are you not only dismissive but demeaning about this? It's irrelevant that you don't care about it, but don't start claiming to speak for my enthusiast opinion here, because at least my years of doing video capture and encoding disagree with you.

If you're going to tell me that GPU/hardware encoding looks so much worse than brute-force CPU encoding for live streaming that it makes a difference...well, that's why I'm dismissive. Because, in reality, it's not different enough to matter. Every comparison I've seen shows that they're very slightly different if you pause the videos and look for artifacts. If there's something to disprove that, I'm all ears, but so far I've never seen concrete evidence to make me want to waste CPU power encoding when a GPU can do it far faster.

It was a stunt by AMD to sell more cores. I get that, but when AMD just released a driver that will do encoding on the fly on their own GPUs, it left the demo feeling incredibly hollow. I want AMD to succeed more than anyone (the implications of an Intel-only world could easily cost me many millions of dollars at work), but the encoding / gaming demo was just an insult.
Guest_Jim_* on December 17, 2016 15:41

Perhaps to you it is not different enough to matter, but by dismissing it like you have been, the insult is not from AMD but from you and aimed at those of us who do want to produce the best quality we can. I don't care if you think the difference isn't important, that's for you to decide for yourself, but it does matter to me and I am sure plenty of others. That's why I now try to do all of my recordings losslessly and then re-encode them later. It takes a lot of time and forces me to work with massive files, but the product at the end is better and I can see the difference. Sometimes it's the lack of artifacts and sometimes it's smaller files, but it is there and I benefit from it either way.
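
For the curious, that two-step workflow looks roughly like the following with FFmpeg; the file names and the CRF/preset choices are only examples, not a prescription.

    # Sketch of the record-losslessly-then-re-encode-later workflow described above.
    # In practice the lossless master would come straight out of the capture tool;
    # file names and the qp/crf/preset choices are only examples.
    import subprocess

    # Step 1: produce a lossless x264 master (huge files, but no quality lost).
    subprocess.run([
        "ffmpeg", "-i", "capture_source.mkv",
        "-c:v", "libx264", "-qp", "0", "-preset", "ultrafast",
        "lossless_master.mkv",
    ], check=True)

    # Step 2, later and offline: re-encode from the master with a slow preset,
    # trading encode time for smaller files and fewer artifacts.
    subprocess.run([
        "ffmpeg", "-i", "lossless_master.mkv",
        "-c:v", "libx264", "-preset", "slow", "-crf", "18",
        "final_upload.mp4",
    ], check=True)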

Dismiss its value to you all you want; as I said, that is your decision, but please stop dismissing its value to me.

Waco on December 18, 2016 02:20
If you say so. I found it pointless and contrived.
Braegnok on December 18, 2016 11:55

Using different implementations, hardware or software, playing with encoding switches, AviSynth filters, etc. is all marginal tinkering.

 

It is actually very simple: to get good quality you must encode with a good codec and at a high bitrate, no exceptions!  http://help.encoding.com/knowledge-base/article/understanding-bitrates-in-video-files/
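
The arithmetic behind that point is simple enough to show: spread the bitrate over the pixels and frames it has to cover. A quick worked example, using the 3500 Kbps at 1080p disclosed earlier in the thread (the ~0.1 bits-per-pixel figure in the comment is a common rule of thumb, not a hard specification):

    # Worked example of the bitrate arithmetic the linked article is getting at:
    # bits per pixel = bitrate / (width * height * framerate).
    # Many rules of thumb put "good" H.264 quality around 0.1 bpp or higher.
    def bits_per_pixel(bitrate_kbps, width, height, fps):
        return (bitrate_kbps * 1000.0) / (width * height * fps)

    print(bits_per_pixel(3500, 1920, 1080, 30))  # ~0.056 bpp at 1080p30
    print(bits_per_pixel(3500, 1920, 1080, 60))  # ~0.028 bpp at 1080p60 -> leans hard on the encoder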


