
Serious Statistics Review


OCAT/PresentMon:

To be completely clear, I have not used PresentMon. I mention it to clarify what OCAT is, and because the PresentMon documentation explains what the data in the output CSV means. It is a tool that extracts Event Tracing for Windows (ETW) information related to swap chain presentation, which means it can record how long a frame takes to render, how long until it is sent to the display, and more. It can also recognize if a frame has been dropped and, in some cases, the runtime used.

An advantage of PresentMon over other tools that capture and record performance information is that it does not really care what API the game uses. With OCAT providing a convenient GUI that hooks into the application, starts recording data, and either stops recording at a keypress or after a set amount of time, you have a versatile tool that will basically just work. The catch for both pieces of software is that you get a LOT of data you do not necessarily know what to do with, though OCAT does also provide a perf_summary.csv file with average FPS, average frame time, and 99th-percentile frame time. I am not looking at that summary, though I am getting some of that information out of the full data.

To give you an idea of just how much data this is: CSV stands for Comma Separated Values, so these files separate what would be cells in Excel with commas but are otherwise plain text. The text files here ranged from just over 3.7 MB to almost 6 MB. (Technically there are two 2 MB files, but those are from a separate test I will cover at the end.) I happened to have written a book recently that comes in at 233 pages (letter paper), and its text files total just 0.5 MB, so there is a lot of information here.

A decent portion of the information in the CSV is actually not too important for this, as it is essentially just labelling what the application being recorded was. All I am interested in are the TimeInSeconds, MsBetweenPresents, and MsBetweenDisplayChange columns. (Two other columns, MsUntilRenderComplete and MsUntilDisplayed, are also of interest, but for the later test I mentioned.) TimeInSeconds is naturally the time since the recording started, in seconds. MsBetweenPresents is the time, in milliseconds, between the latest Present() API call and the previous one. I am not an expert in these matters, but a quick search reveals that the Present() function is what declares a frame has finished being rendered. This means the time between Present() calls is effectively the time it took to render the latest frame, i.e. the frame time.
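
To make that concrete, here is a minimal sketch of pulling the frame times out of a recording using R (the tool I cover in the next section); the filename is made up, and I am using base R's read.csv here rather than any particular CSV package:

# Load a hypothetical OCAT recording; base R's read.csv can parse it directly.
run <- read.csv("ocat_run.csv")

# MsBetweenPresents is the frame time in milliseconds,
# so 1000 divided by it gives the framerate in FPS.
frame_times <- run$MsBetweenPresents

mean(frame_times)          # average frame time (ms)
1000 / mean(frame_times)   # average framerate (FPS)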

The MsBetweenDisplayChange column is a bit easier to understand, as it is the time between two frames actually being displayed, which is also an important piece of information. For a 60 Hz monitor, like mine, an implementation of vertical sync means frames can only be displayed at multiples of 16.667 ms. Because of this, graphing the data of this column, with v-sync, results in the dots collecting right around those multiples. To make such a graph easier to read, I changed the unit on the relevant axis to be 16.667 ms, so that the axis represents how many display frames later a frame was actually shown. (Ideally, with v-sync, this will always be one.)
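
Converting MsBetweenDisplayChange into that 16.667 ms unit is just a division; this sketch assumes a 60 Hz display and reuses the run data frame loaded above:

# One refresh interval on a 60 Hz display is about 16.667 ms.
refresh_ms <- 1000 / 60

# How many refresh intervals passed between frames being shown;
# with v-sync working ideally this should always be about 1.
display_intervals <- run$MsBetweenDisplayChange / refresh_ms
summary(display_intervals)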

This display time measure is just as important as the frame time, as it is what will capture judder. The definition I found for judder is whenever a monitor needs to display a frame more than once because it has not received a new frame yet. I am comfortable saying this differs from stutter, as I would include noticeable input latency or lag with stutter, but not necessarily with judder. Fortunately, Serious Sam Fusion 2017 was almost completely free of judder and stutter for me, so it is hardly an issue currently, but I think I have put together ways to capture both.
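
Under that definition, one rough way to count judder is to flag frames that stayed on screen for more than one refresh interval, reusing the display_intervals values from the sketch above; the 1.5-interval threshold is my own assumption, just to allow for timing noise around exactly one interval:

# A frame displayed for more than one refresh interval means the monitor
# had to repeat the previous frame, which is judder by the definition above.
judder_count <- sum(display_intervals > 1.5)

judder_count                               # frames showing judder
judder_count / length(display_intervals)   # as a fraction of all frames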

One last thing I want to mention about OCAT specifically is that it has an overlay that will show FPS and frame time, as well as some of the summary information when a recording completes. It has almost never appeared for me in any of the various tests I have done with it since I first installed it. It did, however, like the Vulkan runs I did, and then seemed to have an unexpected interaction. Apparently, when running with the Vulkan API, Serious Sam Fusion 2017 decided to load up and run the relevant overlay DLLs whenever it launched, so even though I had OCAT closed, its overlay would still be running. I am not sure how to undo this, but it is not too big an issue for me.

Since I did these tests, OCAT 1.0.0 has been released, which may do a better job of applying the overlay, but I have not tested it yet.

 

R:

As I mentioned in the introduction, I do have a BS in mathematics, but as I am mostly interested in more abstract math, I have limited experience with statistics. It is hard to avoid picking up some statistics knowledge when working with graduate students covering applied math, and it is actually because of this that I first heard of R. It is a programming language and environment that is free to use and Open Source under the GNU General Public License. On its own, R may be a bit overwhelming for some, as it is almost like a command-line tool, but thankfully I have been working with the likes of FFmpeg and LaTeX long enough to feel right at home. All I needed was to learn the syntax and find the documentation for the commands I was interested in. There are applications that provide a potentially more user-friendly experience, but I will be honest, I found working directly in the code and scripts easier.

Like LaTeX, which is also a language and environment but for typesetting, there are a variety of packages people have made for R, extending what it is able to do, but for this work I have only used two. One is a package that let me load the CSVs and the other is ggplot2, which is what actually generated the plots you will see. Excel may offer me a neat spreadsheet view of everything, but getting the same information out of R is pretty easy, and I find it far superior for creating good-looking graphs.
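
As a rough idea of what that looks like, here is a minimal sketch of a ggplot2 frame time plot over the run data frame loaded earlier; my actual plots use more styling than this:

library(ggplot2)

# Frame time against elapsed time, one dot per frame.
ggplot(run, aes(x = TimeInSeconds, y = MsBetweenPresents)) +
  geom_point(size = 0.5) +
  labs(x = "Time (s)", y = "Frame time (ms)")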

Something I can do in both Excel and R is examine the percentiles of a data set, which are a handy characteristic to consider for performance information. Basically, there is always a chance of outliers in a dataset, but by considering percentiles you can cut them out and look at where 99% of the data falls, or any arbitrary percent. This is something I have observed several outlets use now (and some people on the Internet demand), but to be honest, I am not sure just how useful 1% and 99% data is when most of us are really just concerned with hitting 60 FPS consistently. To that end, I found the commands in both pieces of software to reverse the function, allowing me to tell you the percentile corresponding to 60 FPS, or any arbitrary framerate. (For example, being able to say that 99.9% of the frame times were less than 9.09 ms is nice, but I think being able to say the framerate was below 60 FPS only 0.006% of the time might be a bit easier to understand.)
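
In R, both directions are one-liners with the base quantile() and ecdf() functions; this sketch reuses the frame_times vector from earlier:

# Percentile to value: the 99th-percentile frame time in milliseconds.
quantile(frame_times, 0.99)

# Value to percentile: ecdf() returns a function giving the fraction of
# values at or below a point, so this is the share of frames slower than
# 60 FPS (i.e. frame times above 16.667 ms).
1 - ecdf(frame_times)(1000 / 60)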

The command R has that Excel lacks is the difference (diff) function. It will find the differences between adjacent values in a list (or vector, as R calls them). You will see later on that I use this to characterize how much frame times change from one frame to the next. I am actually somewhat happy Excel does not have this function, because it gave me a reason to explore using R and every sane and crazy thing I can do with it.
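
Here is what that looks like on the frame_times vector from earlier; diff() is part of base R, so no extra package is needed:

# Difference between each frame time and the one before it, in milliseconds.
frame_deltas <- diff(frame_times)

summary(abs(frame_deltas))          # how much frame times typically change
quantile(abs(frame_deltas), 0.99)   # 99th percentile of the change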

One example of a crazy thing I can do is generate a frame time overlay for a video, which I have a prototype of below. While I have put a lot of the work into scripts (that make the scripts to run), making an overlay video like this takes a fair amount of time, so I am not sure how much I may do this in the future. Also, it may not work with all games, as not every application likes being hooked into by others, like OCAT and OBS. It took a few attempts before Serious Sam Fusion 2017 decided not to crash so I could capture this video and data. (This run was using DirectX 11, as OBS cannot capture Vulkan games yet.)

(Video: frame time overlay prototype)

While capturing frame time data and video together might not always work, this will not stop me from using OCAT while playing a game to provide a more accurate and detailed description of how it performs.

 

Serious Sam Fusion 2017:

I should probably talk some about the game, or rather platform, I have been doing these tests in. Serious Sam Fusion 2017 is currently in beta and was created by Croteam, the Serious Sam developers, as a central hub for playing all Serious Sam games. I played its version of Serious Sam HD: The First Encounter, which was originally released in 2009 and was an HD remaster of the original 2001 Serious Sam: The First Encounter. As you might be able to guess, it has rather low requirements compared to many modern first-person shooters, so it also ran very well on my machine.

Prior to starting any of these tests, I played through the entire game, and I will admit I did so on Tourist difficulty. I wanted to get to the end so I would have every level unlocked and could pick from any of them for these tests. (Also Normal difficulty got a bit too hard for me to handle, so I dropped it down all the way.)



