Wednesday, May 29, 2013

Game Recording with MPEG-4: using H.264/AVC in Programs such as Dxtory, Bandicam and MSI's Afterburner (Text-Only, Long Version)


This article is a Text-Only version, showing how to use a few programs (one of them completely free) to utilize this efficient codec in game recording, using steps and settings that I personally found to optimize performance, and noting which ones slowed things down when recording. 


For a video example of how to set the x264/AVC codec recordings to be editable in
Sony's Vegas or Adobe's Premiere video editing applications, see my post about it here:
http://gametipsandmore.blogspot.ca/2013/06/and-more-how-to-record-with.html


The MPEG-4 video codec has been around for over a decade now. I remember recording TV shows to watch later on, on a system with an ATI All-In-Wonder videocard (back when videocards had only 8MB of VRAM!) and the joy of the changes I was seeing, going from compressing the shows in MPEG-2 format to MPEG-4 using either Quicktime or DivX (or its open-source competitor, XviD). Smaller file sizes and still decent quality? Awesome. Those were MPEG-4 Part 2 or ASP (Advanced Simple Profile) iterations of MPEG-4. Today, we are up to MPEG-4 Part 10 or AVC (Advanced Video Coding) and great times are to be had by all who record their video game adventures, as modern hardware and capturing apps allow not only for H.264/AVC to be used for video compression and archiving - it can also be used for small-filesize 'live' game recordings with great retention of detail, if desired.

Dxtory, Bandicam and MSI Afterburner all provide the ability to utilize the various codecs installed on your system to record with (others do as well, I am merely choosing these more popular game recording apps as examples). To record with MPEG-4/h.264/AVC, it is simply a matter of installing that codec on your computer [if it isn't already], then choosing it inside of whichever game recording app you prefer. The codec's interface (GUI, Graphical User Interface) will allow you to change whatever settings you wish - but these settings will be quite different from what you may be used to, if you have done any h.264/AVC video compression in the past. Why?

Because we are going to be balancing the settings - not just for retaining quality at a small file size (as you would like to when archiving a movie to keep on your computer) - but now also for recording speed. For instance, if we try to set things for high compression and attempt to keep detail at the same time (as we would for archiving a movie), it simply takes too long to process and compress the changes between frames 'on-the-fly' and save them into a file, when attempting to record game output. This would result in the game 'lagging' and dropping frames to try and keep up, as it falls behind analyzing, compressing and then writing the data, resulting in a video with 'choppy' playback as well. So, when recording our gameplay 'live', we must now consider the various settings and their effect on how quickly we can process the frames and write them to a file at the same time.

I will be addressing most of the settings in the H.264/AVC codec, but not all of them. I will be concerned mainly with the ones that will slow down processing, so that things do not take too long, fall behind and cause 'lag', both in the game and in the resulting video file itself. This differs from compressing for archiving our own movies, because instead of being only concerned with Quality (setting everything on 'high' and letting it take as long as it needs), we must now balance Speed of the compression as well, paying closer attention to each of these settings and how they can slow things down when recording the 'live' game rendering. As is the nature of live recording, we want it to easily and quickly process the frames and save them to a file. I will explain how to do all of this.


Recording with H.264/AVC


"x.264" is a free/open-source utilization of the h.264/AVC codec (the XiWave GNU GPL MPEG-4 Codec). It is normally a command-line driven executable [when you run it, you type things in, to get it to do things], so what we want for game recording with these programs, is a version with an 'interface' so that we can just tell our apps what to do with the mouse and buttons/sliders and it translates it into commands for the codec. All we would have to do is choose a few settings and checkboxes (what most people are used to - a nice, easy, graphical user interface).


Doing a search for 'x264 + windows' will turn up a few places where you can get the installer/setup program for x264 on Windows; here are the main ones:

This is the "Official" Open-Source Video For Windows version of the x264 codec (Red Logo) at the time of this writing:
http://sourceforge.net/projects/x264vfw/
This official codec is what is covered in the "Easymode" sections of this article


This is an "Unofficial" x264 Video For Windows Codec (Black Logo), at the time of this post, that allows for far more settings to be edited via checkboxes and pulldown menubars, but is more difficult to use (these two links are both the same thing):
http://www.digital-digest.com/software/x264_VFW_Codec.html
http://komisar.gin.by/index.html
This unofficial codec is what is covered in the "Hardmode" sections of this article


Basically, you can see that what we want is a "Video For Windows" version, which (thanks to all those great people that have worked on it over time!) has a nice, easy-to-use interface for picking the settings you want to use, without typing in a long line of commands every time. After installing the codec/interface into Windows, it's just a matter of opening whatever game recording program you prefer and selecting it to use it.


Here's how to select it for usage in these three game recording programs:


Recording with x264 and Bandicam
  • Once the codec is installed, run Bandicam and go to the Video tab and click on the Settings button
  • In here, under the Video category, next to Codec, click on the pull-down menu bar (with a little triangle at the end) and choose External Codec, which allows you to use other codecs installed in your system
  • Then, click on the ellipsis ("...") button and it will let you "Select external video codec"
  • Select the "x264vfw" H.264/MPEG-4 AVC codec from the list and click on the Configure button
  • This is the x264vfw configuration interface and in here are all the settings we will talk about next...

Recording with x264 and Dxtory
  • Once everything is installed (Dxtory requires .NET 4.0; a download link is on their main Download page), run Dxtory and click on the Movie settings button (which shows a little handy-cam with its LCD screen hanging out the side)
  • Under the Video Codec category, next to the word Codec, click the pull-down menu bar (with a little triangle at the end) and choose "x264vfw" H.264/MPEG-4 AVC Codec from the list and click on the little pen icon/button to the right, which opens the Configuration dialog box of the codec
  • This is the x264vfw configuration interface and in here are all the settings we will talk about next...

Recording with x264 and MSI Afterburner
  • Once the codec is installed, run MSI Afterburner and click on the Settings button at the bottom 
  • In here, go to the Video Capture tab and under the Video Capture Properties category, click on the pull-down menu bar (with a little triangle at the end) and choose "VFW compression", which allows you to use the other codecs installed in your system
  • Then, click on the ellipsis ("...") button and under Compressor, click on the pull-down menu bar (with the little black triangle at the end) and choose "x264vfw" H.264/AVC codec from the list and then click on the Configure button to the right
  • This is the x264vfw configuration interface and in here are all the settings we will talk about next...

» Note:  To record with x264/h264/AVC and have it easily-importable and recognized properly in NLE's (Non-Linear video Editing applications, such as Sony's Vegas and Adobe's Premiere lines of products [for example]) without glitches, corruption or other problems, one setting for MPEG-4 codecs must be changed from the Default Setting right from the start. I created a video example of how to change this setting in Bandicam, Dxtory and MSI's Afterburner, and the post with that video is the one linked near the top of this article:
http://gametipsandmore.blogspot.ca/2013/06/and-more-how-to-record-with.html


To understand some of the concepts and settings utilized by this codec (and helpful information to know for game recording and video compression in general), a quick word on Bitrate:
[This section is highlighted with green headings for navigation, to assist you if you are re-reading this article or feel you already know a lot about bitrate in video editing and wish to skip it]

Bitrate (in Layman's Terms)  /start


When talking about game recording, bitrate is an expression of the amount of data we are using to create the recorded file [literally, how many bits of information we are using]. It is usually expressed as how much information per second we are telling the codec to use, to represent the frames that are getting pushed through, and save them to our output video file.
The main thing to remember is that More Bitrate = Bigger Filesize
For example, if we record using a 1MB per second (1MB/s) setting, then after 2 minutes of recording (120s), our recorded file size will be 120MB. At that bitrate, if we recorded for one hour straight (3600 seconds), our recorded file size will be 3600MB (3.6GB).
If we record using a larger amount, let's say 2MB per second (2MB/s), then after 2 minutes of recording (120s), our recorded file size will be 240MB. At that bitrate, if we recorded for one hour straight (3600 seconds), our recorded file size will be 7200MB (7.2GB).
It's that simple. The more bitrate we use, the bigger the recorded file will be. 
For those used to video compression and editing, or even general users of multimedia, you may be more familiar with data rates in the realm of:
MP3 (MPEG Audio Layer III) song bitrates such as: 128kbps, 192kbps, 320kbps
PSP (PlayStation Portable) video bitrates such as: 768kbps, 1500kbps
DVD (MPEG-2) bitrates such as: 8000kbps, 9800kbps
Blu-Ray/HD bitrates such as: 16000kbps, 25000kbps, 50000kbps
All of these are usually expressed as Megabits/Kilobits/bits over time (in seconds), hence the "ps" at the end.
eg. 16 Mbps = 16,000 kbps = 16,000,000 bps (bits per second)
~bits (lower-case 'b') and Bytes (upper-case 'B') are different~
There are 8 bits (lower-case 'b') in 1 Byte (upper-case 'B')
8 bits in 1 Byte
8000 bits = 8 kilobits ('kilo' meaning one thousand) = 1 KiloByte
8000 kilobits = 8 megabits ('mega' meaning one million bits) = 1 MegaByte
For example, if you were rendering out a movie to upload to YouTube and you chose an output bitrate of "8000 kbps" in your editing/compression application, that is 8 Mbps (bits, lower-case 'b'), which means 8 Megabits per second. Converting that into Bytes (upper-case 'B') means that video will be running at 1 MBps, which is 1 MegaByte per second. 
At that bitrate (1MB/s), it is the same as our example above [just under where it says "More Bitrate = Bigger Filesize"] and it will take up roughly 60MB of space on your hard drive for every minute of recording. That is the interaction between Bitrate and File Size and how to convert between the two. The higher the bitrate setting used, the bigger the output file size (the recording) will be. 
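To make that bits-versus-Bytes arithmetic concrete, here is a tiny sketch (in Python, purely as a worked example of the math above) that turns a bitrate in kilobits per second into an estimated recording size:

# Estimate recording size from a bitrate in kilobits per second, using the
# same decimal math as above (8 bits per Byte, 'kilo' meaning 1000).
def recording_size_mb(bitrate_kbps, seconds):
    bytes_per_second = bitrate_kbps * 1000 / 8      # kilobits/s -> Bytes/s
    return bytes_per_second * seconds / 1000000.0   # Bytes -> MegaBytes

print(recording_size_mb(8000, 60))    # 8000 kbps = 1 MB/s -> about 60 MB per minute
print(recording_size_mb(8000, 3600))  # one hour -> about 3600 MB (3.6 GB)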
There is one other formula to remember: More Bitrate = Better Quality. This formula applies to almost everything digital: video (DVD/BluRay), audio (MP3/MP4), pictures (PNG/JPG), any multimedia that is digital. If you allow/use more bitrate, the picture/music/video is represented better [or to be more precise, closer to the original input] because there are literally more bits used to create the picture/sound/etc.
Quick example - think of a square that is divided up into 9 sections. Now pretend it is trying to 'represent' a painting, any painting you can think of. With only 9 blocks of data available (each one can only be a certain color), then it would look like nothing but some colored blocks and barely look like the painting at all. Now, imagine a square divided up into 80 sections. Pretend it is trying to represent the same painting. Even though it will be 'blocky' still, if each section can be only one color, it will still look 'more like the original' than the 9-sectioned block, right? 
That's the interaction of bitrate and quality.
Which means, for digital compression, it is essentially a 'balancing act' between Quality and File Size, with Bitrate being the tool to measure with. Do you want a high-quality output? Then turn up the bitrate and you'll end up with a large file size. Do you want a small file size? Then lower the bitrate and you'll get lower quality as well. That's the essence of bitrate, in a nutshell.

Bitrate (in Layman's Terms)  /end




The x264 Interface and Configuration in "Easymode"
(The Official interface with red x264 logo)



With the official version of the interface for using the x264 codec ("x264vfw"), there is only one screen, with pulldown bars to adjust most settings.


The Basic Section

To start with, under the Basic category, there is a Preset setting. This is a very handy setting, which makes many of the more obscure/hidden choices within this codec (there are dozens, behind the scenes) easy to configure. By clicking on the question mark near the bottom right corner ("?"), you can see the intimate details of the different Profiles and Presets that can be selected in these pull-down menu bars within this first "Basic" section of the interface.

For game recording, we are more concerned with speed. We don't want the codec to break out its magnifying glass and scrutinize each frame coming through in strict detail, because that would cause it to slow down, which will cause it to 'lag' behind, not only in the game, but in the recorded file itself as well ('choppiness'). 
So, the first thing to do, when using this codec for 'live' game recording, is to change the Preset setting from its original default of Medium to "Ultrafast". This is the fastest option for this setting.

The other options, such as Superfast, can also be used, but be aware that the more you go down the selections available, the more options are being 'turned on' behind-the-scenes with this codec, and while some of them do help apparent quality of the video, they are geared more towards a slow, steady, long-term compression session (like when you might archive a video/movie) with high-quality settings and slow, scrutinizing analysis of the frames. While not a bad thing, for game recording, we don't want that. We want it to save what we are seeing on the screen fast, to a file. We want speed. Feel free to take a look and learn about the various options the other Presets turn on [many of them are covered in this article further down], but for optimal speed, Ultrafast is the setting to use.
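As a rough reference (this is my own summary and the exact list can vary between builds of the codec), Ultrafast essentially switches off most of the heavier analysis options - the same ones covered one-by-one in the "Hardmode" sections further down:

# Approximate effect of the Ultrafast preset (not exhaustive, and it may
# vary by x264 build); each of these options is explained later in the article.
ULTRAFAST_APPROXIMATELY_SETS = {
    "partitions": "none",        # no extra partition analysis
    "max_frame_refs": 1,         # only one reference frame
    "b_frames": 0,               # no B-frames
    "me_algorithm": "Diamond",   # simplest motion estimation
    "subpixel_me": 0,            # no subpixel motion refinement
    "trellis": 0,                # Trellis analysis off
    "cabac": False,              # CABAC off
    "deblocking": False,         # in-loop deblocking filter off
    "weighted_p_frames": "None",
}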

The next pull-down set of options to choose from is Tuning. These are also pre-selected sets of options that, depending on which one you choose here, will enable or disable certain functions in the codec. These choices are helpful in easily fine-tuning the codec to compress a movie/video/clip of a certain type, as they enable options in the background that will help with efficient, detail-oriented compression of that type of material. 
For game recording, again we are more concerned with speed. These options are helpful in a slow, 'leave-it-overnight' compression session of clips or movies we want to save; for live game recording we want to leave it at its default setting of "None", so that the codec keeps its magnifying glass put away and doesn't spend any extra time analyzing the frames coming through.

There are two checkboxes below these first two pull-down menus and they are titled "Fast Decode" and "Zero Latency":
Fast Decode turns off a few options [behind the scenes], such as CABAC and Deblocking [these are explained in more detail further within this article], which reduces the processing 'load' of the video stream. Those options help keep some quality (especially at lower bitrates), but as the name proclaims ["Fast"], for optimal speed of recording we want this option "Checked", to enable it (which will turn off those extra options for now).
Zero Latency also turns off a few options behind the scenes, ones that increase compression time, such as B-frames [explained later in this post] and how far to 'look ahead' at frames coming in for analysis. Since we want speed for recording our game footage and less analysis [remember, more analysis slows things down], we want this option "Checked" as well, to turn off the extra options that are 'behind-the-scenes' with this setting.

The rest of the settings in this Basic section of the interface need not be adjusted for game recording, but a good Wikipedia page, talking about the various Profiles and Levels and their capabilities can be found here:
http://en.wikipedia.org/wiki/H.264/MPEG-4_AVC



The Rate Control Section

All of the below paragraphs between these red lines of text are the same for both the Official and the Unofficial versions of the x264vfw interface and are only duplicated for ease of reading their respective sections

The main thing in here (and really the only thing to adjust for game recording in this section) is the longest bar right in the middle: the main datarate control and compression decision to make is all in that one bar. I cannot even make a suggestion on what to use [ok, I can actually], since it will partially depend on what type of game you are recording, your hardware, what kind of compression you are looking for and many other factors. I will attempt to simplify it however and give a suggestion at the end. (More importantly for game recording, there is one choice that is slightly faster than all the others).

To begin, since we are going to be using x264 for game recording, we cannot use the Leeloo Dallas Multipass Options. Multiple passes (usually 2 or 3) when compressing/archiving video can greatly increase quality, but that means 2-3 times the analysis, double-checking and then further compression by the codec. Literally processing every frame twice (for 2-passes for example). That's great when we want to keep a movie in a small, 'as-good/high-as-we-can-get' quality, forever. When recording games however, we want [one guess] speed. We can only accept "Single pass" as our option, because we just want the codec to see what frames are coming in, take a quick glance, and compress them into our output video file. One pass.

Starting with ABR (Average BitRate), the "bitrate-based" setting: this allows you to punch in the average bitrate you want to record at and it attempts to stick with it (it will go lower, but try to never go higher than what you set here). This setting basically tells the codec, "Keep within this bitrate, I don't care if the quality goes down", because as the bitrate ceiling is reached, it will quickly degrade in quality as more/higher movement occurs on the screen/frame. It is good for keeping within a certain file size, if that is your desire, but it also causes a bit of 'lag' and does not seem optimized for 'live' capturing.

CQP (Constant Quantizer Parameter) is a setting where you are basically telling the codec, "Keep this level of Quality", and it will do its best to keep that level of quality for all frames/scenes. However, it will spend more time (and bitrate) on fast-motion/high-action scenes. This is good if you want to keep a movie clearly visible when a lot of things are going on, but it will also result in higher usage of bitrate, which means larger file sizes. This may sound like a good thing for game recording (and for quality, it is), but the time spent analyzing the faster-motion scenes means that it is actually slowing down (in terms of the codec breaking out its magnifying glass and scrutinizing the frames that are passing by), which results in 'lag', both in the game and in the recorded video file itself ("choppiness" on playback).

Lossless should attempt to lose no quality, processing only very little and passing all of that nice detail directly to the recorded video output. While sounding good in theory, in practice the utilization of 'lossless' in x264 seems geared towards slow, analyzed video compression and not 'as fast as we can get to avoid lag' "live" game recording, because the result of this setting [at this time] is actually a lossy, compressed (it does not seem to go far beyond 100,000kbps), low bitrate (compared to 'true lossless compression', which is much higher) capture. It does look decent, but it is also very demanding on the system and causes a large framerate drop for the level of detail that should be coming out of it [and doesn't]. This codec does not seem to be optimized for lossless recording at this time and I do not suggest using the Lossless setting here.

CRF (Constant Rate Factor) is sort of a combination of ABR and CQP. At any given rate factor, a certain bitrate is maintained, and when the motion on the screen goes very high and the bitrate gets too high to represent what is occurring in the frame (or hits the 'ceiling' bitrate that it is restricted to, which can be set), then the quality begins to suffer, as the codec ramps quality down and compresses the fast-moving ("not as easy for the human eye to see") material, until things settle down on the screen and there is slower motion (such as a person walking). Then the quality ramps back up (the bitrate staying within the specified parameters) to keep the apparent quality high to the human eye. This is how CRF is supposed to work, and it seems to do a good job of that. 

Game recording with CRF isn't as cut-and-dry as slow, long-term video compression/archiving with CRF, where it has time to figure out how to compress the fast/blurry scenes and make the slower/clearer scenes look better and change the bitrate/quantization respectively. Quantization can be thought of as 'apparent spoilage' of the material, where a certain amount isn't even noticeable to most humans, and in some fast-moving-high-action scenes, it is even preferable to some eyes. Low bitrate and/or high quantization would both result in loss of detail and 'blurring' or 'smoothing' of video most of the time and can result in compression artifacts such as Macroblocks and Gibbs Effects ("mosquito noise"), as the codec tries to decide 'what to keep' and 'what to lose' (lossy compression). With CRF, it usually will try to keep a slow-moving scene (someone walking, people talking) detailed, without much quantizing ('spoilage'), so that it looks good. It will take high-motion (fast action, fast changing) scenes and quantize them more (smoothing, blurring, 'spoiling') since the human eye won't notice it as much on fast-motion, already-slightly-blurred changes on the screen.

To summarize the differences between the types of data rate controls (the choices in this pulldown bar):
CQP is like stating, "Keep this quality, I don't care how big the bitrate/file gets", and ABR is like stating, "Keep this bitrate/filesize, I don't care how crappy you have to make the video look to stay within that", while CRF is more like stating, "Try to keep this bitrate, but change it a little as you need (within a certain amount) and make the video look slightly crappy if you need to as well, but don't let either one get too out of whack". As CRF also seemed to be the one with the least amount of effect on the system in the form of 'lag' [only slightly less than the other ones in testing], while being so highly configurable (compared to the other choices, where you state a bitrate to stay within and it adjusts itself), I suggest using CRF for your x264/AVC game capturing.


Some Data / Bitrates seen with CRF


Since it looks like we are going to be sticking with CRF as our main datarate control factor [it performs slightly better than the other choices in tests], here's an example of some bitrates that can result from using it (in kilobits per second):

Recorded game: Hitman: Absolution Benchmark [grainy, panning, high and low motion areas]
Recorded codec: H.264/AVC (MPEG-4 Part 10) using the x264vfw interface
Recorded settings [some]: No Deblocking, No Max Bitrate, No CABAC, adjusting only CRF

CRF 51  ~1200 kbps (lowest possible quality setting for CRF)
CRF 42  ~1600 kbps
CRF 32  ~6000 kbps
CRF 29  ~8500 kbps
CRF 26  ~14000 kbps
CRF 23  ~22000 kbps (codec default setting for CRF)
CRF 22  ~23000 kbps
CRF 21  ~32000 kbps
CRF 18  ~45000 kbps
CRF 15  ~64000 kbps
CRF 13  ~79000 kbps
CRF 10  ~101,000 kbps
CRF 5    ~119,000 kbps
CRF 1    ~119,000 kbps (highest possible quality setting for CRF)

As you can see, the higher the CRF, the lower the bitrate, so the lower the recorded file size will be (but also the lower the apparent quality). If you desire a higher-quality recording, then a lower CRF is what you want. Even with a very low CRF, the bitrate is nowhere near as high as, say, a FRAPS or YV12 recording (both of which can easily be over 500,000 kbps), but that is the nature of the codec as it tries to compress, at least slightly, everything that passes through it.

It can also be seen that it is not a strict/hard-and-fast rule of evenly-spaced steps, when it comes to the CRF setting and Bitrate. It cannot be easily calculated that "two steps up in CRF equals this much more bitrate". That is partially the nature of the compression and partially what occurs using CRF as a datarate control. When there is more motion on the screen and as it changes, the datarate will change as well, to try to keep within certain bitrate/quality boundaries and still accurately represent what is occurring on the screen/in the frames. With almost any form of compression/codec, only if the recording were using a completely fixed bitrate, or recording a static picture/view, would it be easy to calculate the adjustments required for a certain bitrate change. It is built into the codecs to adjust themselves as needed.
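Put another way, you can turn the measured bitrates above into rough disk-usage figures. Here is a small sketch using a few of the approximate numbers from the table (your own results will differ from game to game):

# Rough disk usage per minute of recording, from the approximate bitrates
# measured above (in kilobits per second); actual numbers vary game to game.
crf_to_kbps = {51: 1200, 42: 1600, 32: 6000, 23: 22000, 18: 45000, 10: 101000}

for crf, kbps in sorted(crf_to_kbps.items(), reverse=True):
    mb_per_minute = kbps * 1000 / 8 * 60 / 1000000.0   # kbps -> MB per minute
    print("CRF %d: roughly %.0f MB per minute of recording" % (crf, mb_per_minute))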


What CRF setting to use


So, what CRF setting to use? The general rule is: the lower the CRF number, the more bitrate/quality you are allowing it to use, but the bigger the recorded file size will be.

The choice is somewhat subjective, as CRF18 may look good to me to record a first-person-shooter game with (the default is CRF23), but you may not like how it looks and want to use CRF10 to keep more details that you want to see.
Someone may not like the compression artifacts they can see when recording Minecraft using CRF18 and want to move down to CRF15 so that it is crisper, with less 'mosquito noise' around the edges or corruption that they can see; but you may think it looks good enough at the default of CRF23 and leave it there, like someone else may do. You see?
Those are two very different types of video/games mind you, one is dark and grainy and the other is smooth, with hard edges, like animation; but you get the point. It can vary, not only person to person, but also game to game. A few short tests are all it will take however, and you will quickly find a CRF setting that you are happy to record with. You'll find your own balance between, what will essentially be, these considerations:

The higher the bitrate (lower CRF number) the higher the quality and file size will be
The lower the bitrate (higher CRF number) the lower the quality and file size will be

You will eventually decide, with just a couple of tests, what is 'good enough' quality for your eyes/uploading, and what CRF to use on that game (remember the 'balancing act' of bitrate vs quality mentioned at the beginning?). You will also find that the recorded quality isn't even fully kept after you compress the final/edited video to upload to a video sharing site somewhere.

All of the above paragraphs between these red lines of text are the same for both the Official and the Unofficial versions of the x264vfw interface and are only duplicated for ease of reading their respective sections


That's about it for the nice-and-easy one-panel official graphical interface for the x264vfw H.264/AVC codec. Just a few small changes and a decision on what quality/bitrate you would like to use, and it is all ready for you to record with whatever game recording program you prefer. Have fun with it!




What follows below is the unofficial, more detailed, three-tabbed version of the x264vfw H.264/AVC codec interface. If this version is not installed on your system, or you do not care to use the more complicated version (you definitely don't have to), then please skip down to the section called 
"Slash-whew-sweating-emoticon". Neither version is better than the other, by the way; they both utilize the same codec, one just gives you more options to set [and I happened to install it by accident when learning about recording with the x264 codec in the beginning], that's all.





The x264 Interface and Configuration in "Hardmode"
(The Unofficial interface with black x264 logo)



There are three main tabs in this version of the x264 Video For Windows interface: 
Main
Analysis and Encoding
Rate Control and Other

Each Tab has many different settings. We will talk about most of them - but not all of them - we are mainly concerned with the ones that will affect game capturing. I am going to be going through the Tabs in reverse order. This will help to explain the concepts in a more logical order [believe it or not].


Rate Control & Other Tab

In the 'Rate Control and Other' Tab, our main attention need only be on the VBV bitrate/buffer settings in the first Rate Control area. The other settings can be tweaked, but this is the only one that needs changing - when concerned only with game capturing - as it can dictate the final output filesize of our recording. 

Why all that math and talk about bitrate above? Because of the very first setting we are going to cover, part of the Rate Control section:

The very first setting in this third tab of the x264vfw interface is something called "VBV max bitrate" and it is expressed in "kbit/s". Yay, we just learned about that! It means that the value typed in here will be accepted as "kilobits per second".

What does the "VBV" mean? It stands for "Video Buffering Verifier" and this setting will let us state the overall maximum bitrate we want to restrict the video recording to be, so that it does not go over it. 
Why would we want to do that? Remember that the amount of bitrate (per second) that we are using to record with affects the output video file size. If it is a large amount, the file size will be bigger. If we wanted to restrict the recorded output file size and keep it smaller, we could put an amount in here and it will do its best to keep within that amount, allowing us to control how much bitrate it uses to record with (and thus control how big the game recording output will be).

So, to use our example just above, if we wanted to restrict the bitrate of the game recording output to be only 1MB/s (1 MegaByte per second), what do we do? 
That's right, we convert it to lower-case (small-'b') bits, first:
We want 1 MegaByte per second of a bitrate
1 MegaByte is 1000 KiloBytes
since 1 Byte = 8 bits
then 1000 KiloBytes = 8000 kilobits
so to restrict the maximum bitrate allowed for our capture to 1MB/s (taking up 60MB of drive space per minute of recording) we would put "8000" in the 'VBV max bitrate' box.
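If you wanted a different cap, the same conversion applies every time; here is a quick sketch of it:

# Convert a target cap in MegaBytes-per-second into the kilobits-per-second
# value to type into the 'VBV max bitrate' box (decimal units, 8 bits per Byte).
def vbv_max_kbps(target_mb_per_second):
    return target_mb_per_second * 8 * 1000   # MB/s -> Megabits/s -> kbit/s

print(vbv_max_kbps(1))     # 1 MB/s   -> 8000  (about 60 MB of disk per minute)
print(vbv_max_kbps(1.5))   # 1.5 MB/s -> 12000.0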

This box then, gives some control over how much diskspace you want to devote to the game recording file output. If you do not want to restrict the bitrate, do not put anything in these first two boxes on this third tab. 
If you do want to set a restriction here on the bitrate, set a "buffer size" as well (the box just below the first one). This 'buffer size' gives the codec a sliding window of data that it will keep track of and 'watch', to make sure it stays below the amount in the box above (the 'max bitrate' box). It is used more for compressing for portable devices (which have less RAM and are not able to buffer/keep track of a large amount at a time and therefore must have a restricted amount set here).  

For our purposes - game recording - remember that we must be more concerned with speed (to reduce lag), so in this box, we cannot put a very large amount. Why? Because the larger the buffer, the more that the codec will try to 'store up' in RAM, in order to 'watch it' (process it) and keep track of how big the bitrate gets, in order to keep our restriction of the 'max bitrate' box on it. So, a smaller amount such as "2000" up to maybe "4000" in this 'buffer size' box is a good amount. The overall buffer size also allows room for the video to 'go over slightly and come back down to within' the amount we set, but the larger the amount is, the more data will be held behind for processing that will be done on the game recording and the slower everything will get - and we don't want that - we want as close to 'Lag Free' recording as we can get. A "0" then, would be the optimal amount to put in here, when speed is a concern; but then we would not be able to set a maximum bitrate... more on this later in the article...

That's all we will adjust on the Rate Control & Other tab.


Analysis & Encoding Tab

Here's where we will do the most adjusting, as these settings in here are the ones with the most impact on speed and game recording.

In the Analysis section, the first few adjustments are checkboxes concerning Partitions. Partitions are exactly what they sound like: they control how much the codec will divide up the screen into sections, so that it can analyze each section and look at what is changing, decide how to compress it, how much to compress it, and so on. For game recording, remember we are more concerned with speed than with how the codec is going to scrutinize the screen, so we are going to uncheck everything in the Partitions section. Really. We are also going to uncheck the Adaptive DCT setting (which will deselect/lock out/grey out some of the checkboxes for us).

For game recording, we are going to leave a checkmark in the "Fast P-skip" setting.


Frame Types (in Layman's Terms)  /start


There are three main types of frames in video encoding:
I-type frames
P-type frames
B-type frames

I-frames are 'Intra-coded Frames', sometimes called Information Frames, because they hold the most information in a 'group of frames' (which is what makes up a video) and therefore I-frames take up the most space on your hard drive. They also look the best, as they usually use a higher bitrate (which is why they take up the most space). They are utilized by the codec to indicate a large change in what is going on in the video/on the screen, like a sort of 'reset'. They can usually be identified by the screen being 'cleared up' of blocks/glitches from video compression, as the compressed video is played back. 
For instance, if you are running down a dark hallway and turn a corner and there is an explosion, in order to more accurately represent that large change in color and motion in the game recording, the codec will most likely decide to insert at least one I-frame and use a larger amount of data to keep more detail as it tries to represent all of the huge changes going on, on the screen and in the frames.
I-frames also serve some other purposes: the 'intra-coded' means that they are not dependent on other frames around them in the video [which is a collection of frames], as they hold all the information needed to represent the frame within themselves [remember, 'inter-murals' in school involved other schools and 'intra-murals' were contained all within your own school - this is the same thing]. 
They are also usually used as Keyframes. Since I-frames contain all the data needed to represent the entire frame on their own, video applications can use them as a 'starting point' or 'reference point' ("keying off of it"), which allows for more compatible cutting/editing on them and lets the editor start 'seeking' from them when editing a video. The two main things to remember about I-frames/Keyframes [the same thing] are that (1) I-frames can stand alone all by themselves, as they have all the information needed to display an image/frame, and (2) I-frames allow video editing programs [like Sony's VegasVideo/MovieStudio/Pro and Adobe's Premiere/Pro] to 'reset' and show you where you are starting to edit from - if there is a large number of frames between these I-frames ("keyframes"), it will take longer for the program to read through all of them and then show/allow you to start editing from where you have selected in the video editing program (it will take longer overall to edit by hand, as the program starts and stops and you have to wait while editing).  
MJPEG game recordings are made up entirely of I-frames, as every frame in the captured video is its own JPEG-compressed picture, and it is the most editing-friendly and compatible codec to capture with, no matter what game capturing program you use. In fact, most video editors can read MJPEG without installing any other codecs on your system. 
P-frames are called 'Predicted Frames'. They hold far less information in a group of frames (video) and take up far less space. They actually only hold the differences from the Previous I-frame, that is, they only keep track of the changes on the screen/what has changed on the screen since the last I-frame, which is why they are far smaller than I-frames. 
For instance, if you are playing a character in a game and you are just standing at the mailbox reading an in-game message, there isn't much changing on the screen at all. Perhaps beside the mail window there is only someone running by, and that is all that is happening on the screen - nothing else is moving at all. P-frames will only keep track of the movement of that person running by and not keep track of the mail message or anything else going on in the frame (things that aren't moving), and it will save only those changes to the game recorded file as P-frames. Hence, they can indeed take up a much smaller amount of space in a game recording, since they are only recording the things that are changing on the screen.
B-frames are 'Bi-directional Frames' (also called 'Bi-Predictive Frames'). They hold even less information than P-frames, as they are only keeping track of the differences between frames before or after themselves (even using P-frames that are around them and not needing an entire I-frame to start from). This means they are only keeping track of very small changes (only one frame ahead or behind) and can therefore be extremely small and take up very little space in a video file. B-frames are one of the reasons why we can archive/compress movies in such small file sizes - only the differences of what is going on, on the screen/in the frames, are being kept track of. Both P and B frames help to save a ton of space when it comes to video compression.
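If it helps, here is a toy sketch of the P-frame idea (it mirrors the mailbox example above; it is only an illustration, not how a codec actually stores data):

# Toy illustration of the I-frame vs P-frame idea: the 'P-frame' stores only
# what changed since the previous frame, so it needs far less data to describe.
i_frame = ["sky", "mailbox", "mail message", "empty road"]   # full picture
p_frame_changes = {3: "road with a person running by"}       # only the difference

next_frame = list(i_frame)
for position, new_content in p_frame_changes.items():
    next_frame[position] = new_content
print(next_frame)   # the full second frame, rebuilt from the tiny 'P-frame'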

Frame Types (in Layman's Terms)  /end




Why all that talk about the different types of frames in a group of frames (video)? Because now we can easily understand the next few settings in the configuration window. 

Fast P-skip is a setting that allows the codec to look at P-frames and decide if there are enough changes to analyze it further, or 'skip' over them and just process them as they are. This is decided on by the codec when it looks at some other settings on this tab (such as the "ME" settings) but for the most part, it helps speed - our main concern with game recording - more than it helps things to 'look good', so we will leave a checkmark in it, as we briefly stated above.

Max frame refs and Mixed refs are talking about Reference Frames and how many to use in a group of pictures/frames. When encoding, a codec can use frames in front (P-frames) or in front and behind (B-frames) of the current frame it is working with, to keep track of what has changed between frames. When encoding movies, we want as many as possible (or as many as the codec decides to put in) in order to have things keep a lot of detail wherever it is needed, especially for scene changes; but for game recording, we want things to run faster (so that the game capture does not lag the game or cause lag within the recorded file - "choppiness" in playback), so we actually want the codec to use as few Reference Frames as possible.

Remember, every time the codec has to 'think' or analyze the frames more, it will slow down to break out its magnifying glass and scrutinize, so what we want for this setting is the lowest number of Reference Frames - one (put a "1" in the Max frame refs box) - so it doesn't have to do much more work than just looking quickly one frame ahead or behind when figuring out what has changed. 
Mixed reference frames allow the codec to compress frames based on changes in frames that are also being referenced by other frames [if you can follow that]. Again, the codec is going to have to slow down and take a look, and for game recording, we want things to stay lean and mean (fast), so uncheck this bad boy (if you make the Max frame refs a "1", it will be disabled anyway, since it does not have enough frames to look around at).

The next few settings down that all say "ME" refer to Motion Estimation. This is the process of keeping track of where things are moving on the screen and how to deal with compressing the changes it is keeping track of. For game recording, we want the codec to analyze as little as possible (so it doesn't lag behind what is going on, on the screen) so we must choose the lowest settings for all of the "ME" options. 
That means "Diamond" analysis as the ME algorithm (pulldown menu choice), an ME range of the smallest we can choose from ("4") and the lowest amount of Subpixel ME refinement we can have ("0" or no subpixel refinement). Remember, we want the codec to just save what is happening on the screen and analyze as little as possible, so it doesn't fall behind.

With the choices above, Chroma ME and the Psycho-visual Rate Distortion Optimization strength will be locked out/not applicable, which is good for game capturing, as it scrutinizes the frames that are coming from the game even less.

The next couple of settings, in the middle column of the tab, that say GOP in them, are talking about the size of the Group Of Pictures we want in our capture. If you recall what we have already said about how a codec decides to compress frames, it looks at frames ahead and behind at times, looking at the differences between them, and then decides how to compress the frames based on what it finds (and the settings we set). For movie compression, we want a large number, so that the codec mainly keeps track of the differences between frames as much as it can, and a large GOP size allows it a lot of 'room to move' and look around. For game capturing, we don't want it to be quite as big, as the more frames on its plate it has to deal with, the more it will slow down and take time to decide what to do with them. 

There is also a chance that some video editing applications will not like the large (or any) Groups Of Pictures and may see the video stream as corrupted or give weird artifacts/effects when playing it back. This is one reason why using MJPEG as a game recording codec is suggested often, as its built-in GOP is "1" - every frame is its own self-contained 'group' and no editing app has to mess with it or over-analyze it. It is the most compatible, but you can use any codec you wish; you may just have to change the codec's Keyframes or GOP to "1" for compatibility [more on this in future articles!]

Hence, we will set a nice low number [I personally used "50" around the time of this post]. Be aware that if the number is a multiple of the framerate you are recording in, videophiles may notice the screen 'shifting' or having other odd effects 'in time with' the framerate (eg. if recording at 60fps and the GOP size is 60, some people notice the screen 'correcting' or 'shifting' or 'flickering' every second, as it ends the group of frames and starts a new one every second).
You can use "1", which will make the codec act like MJPEG in a way, but then the codec will not have any room to analyze/compare/compress anything, and the resulting file will require much more bitrate to keep detail (to accurately represent what is going on in each frame)... more on that later in this article...

Weighted P-frames and Max consecutive B-frames mean yet more analysis for the codec to do (especially B-frames, which remember, are frames that 'look both ways' for changes to frames in front and behind, and can therefore slow down recording a lot). For game capturing, we gots'ta keep setting things for speed and not letting the codec analyze too much. Change the Weighted P-frames to "None" (via a pull down bar) and the B-frames to "0" if they aren't already. This is effectively telling the codec "you ain't got time fo' B-framez", so the rest of the settings that say "B-frames" should be locked/greyed out after that.
[In tests, enabling B-frames would actually crash the x264vfw app]

Under the Encoding section, the last column of this tab, the first group of settings has to do with Deblocking.

Deblocking is a way to try and 'hide' areas where the codec has cut out detail or 'let go of data' (lossy) in attempts to conform to the other settings in the codec and/or compress the frames highly. I'm sure almost everyone has seen Compression Artifacts like macroblocks or Gibbs Effects (mosquito noise) in high/over-compressed video and video streams. It's not very pretty and turning on the Deblocking filter is one way of masking these artifacts, so that the video overall looks nicer to the human eye. 

The problem with Deblocking is that the codec will smooth things out, when trying to hide artifacts (especially ones that occur with lower bitrates/quality). It can be forced to keep detail with negative values (useful for movies/games with film grain), but overall it still takes more compression time to use, as the codec stops and analyzes the frames, looking for what to Deblock. So, for game recording for the most part, for optimum speed, Deblocking should be disabled, that is, "No Checkmark" in the box for In-loop deblocking filter (which will disable the two Deblocking settings below it).

[In my own trials/experiments, I found that a low amount of deblocking (as in "1") for these two recording settings is acceptable and helps to hide some otherwise icky-looking compression effects that occur at lower bitrates (bitrate control is covered more in the next section), but for optimum speed (it is not needed at higher bitrates anyway) and especially for older/less powerful systems, turning deblocking off will always speed things up.]

CABAC is extra analysis that can really make a difference in compressing video. Some mobile players cannot even use it though, as it requires more processing power. The optimal setting for this, as it is extra analysis/processing being done - for game recording - would be "Off" (unchecked).

[I have successfully done many recordings with CABAC on, as it seems to have only a slight effect on recording 'lag' on my system; but for optimum speed/less lag when recording (and at a slight cost to quality retention), it should be disabled... for speed.]

The only other thing to adjust in this tab, when game recording, is the Trellis setting. Trellis is a way of analyzing and attempting to keep certain detail when compressing, especially at lower bitrates, but as many times already mentioned, we want speed for game recording, so set the Trellis analysis to "Off" (via a pulldown bar). 


Main Tab

All of the below paragraphs between these red lines of text are the same for both the Official and the Unofficial versions of the x264vfw interface and are only duplicated for ease of reading their respective sections

The main thing in here (and really the only thing to adjust for game recording in this tab) is the longest bar right in the middle: the main datarate control and compression decision to make is all in that one bar. I cannot even make a suggestion on what to use [ok, I can actually], since it will partially depend on what type of game you are recording, your hardware, what kind of compression you are looking for and many other factors. I will attempt to simplify it however and give a suggestion at the end. (More importantly for game recording, there is one choice that is slightly faster than all the others).

To begin, since we are going to be using x264 for game recording, we cannot use the Leeloo Dallas Multipass Options. Multiple passes (usually 2 or 3) when compressing/archiving video can greatly increase quality, but that means twice the analysis, double-checking and then further compression by the codec. Literally processing every frame twice (for 2-passes). That's great when we want to keep a movie in a small, 'as-good/high-as-we-can-get' quality, forever. When recording games however, we want [one guess] speed [I just realized this whole article could be its own drinking game]. We can only accept "Single pass" as our option, because we just want the codec to see what frames are coming in, take a quick glance, and compress them into our output video file. One pass.

Starting with ABR (Average BitRate), the "bitrate-based" setting: this allows you to punch in the average bitrate you want to record at and it attempts to stick with it (it will go lower, but try to never go higher than what you set here). This setting basically tells the codec, "Keep within this bitrate, I don't care if the quality goes down", because as the bitrate ceiling is reached, it will quickly degrade in quality as more/higher movement occurs on the screen/frame. It is good for keeping within a certain file size, if that is your desire, but it also causes a bit of 'lag' and does not seem optimized for 'live' capturing.

CQP (Constant Quantizer Parameter) is a setting where you are basically telling the codec, "Keep this level of Quality", and it will do its best to keep that level of quality for all frames/scenes. However, it will spend more time (and bitrate) on fast-motion/high-action scenes. This is good if you want to keep a movie clearly visible when a lot of things are going on [I hate watching a low-quality video stream online and it goes absolutely-stupid blurry at high-motion scenes just because there are many things happening on the screen], but it will also result in higher usage of bitrate, which means larger file sizes. This may sound like a good thing for game recording (and for quality, it is), but the time spent analyzing the faster-motion scenes means that it is actually slowing down (in terms of the codec breaking out its magnifying glass and scrutinizing the frames that are passing by), which results in 'lag', both in the game and in the recorded video file itself ("choppiness" on playback).

Lossless should attempt to lose no quality, processing only very little and passing all of that nice detail directly to the recorded video output. While sounding good in theory, in practice the utilization of 'lossless' in x264 seems geared towards slow, analyzed video compression and not 'as fast as we can get to avoid lag' "live" game recording. The result is actually a lossy, compressed (it does not seem to go far beyond 100,000kbps), low bitrate (compared to 'true lossless compression') capture. It does look decent, but it is also very demanding on the system and causes a large framerate drop for the level of detail that should be coming out of it (and doesn't). This codec does not seem to be optimized for lossless recording at this time and I do not suggest using the Lossless setting here.

CRF (Constant Rate Factor) is sort of a combination of ABR and CQP. At any given rate factor, a certain bitrate is maintained, and when the motion on the screen goes very high and the bitrate gets too high to represent what is occurring in the frame (or hits the 'ceiling' bitrate that it is restricted to, which can be set), then the quality begins to suffer, as the codec ramps quality down and compresses the fast-moving ("not as easy for the human eye to see") material, until things settle down on the screen and there is slower motion (such as a person walking). Then the quality ramps back up (the bitrate staying within the specified parameters) to keep the apparent quality high to the human eye. This is how CRF is supposed to work, and it seems to do a good job of that. 

Game recording with CRF isn't as cut-and-dry as slow, long-term video compression with CRF, where it has time to figure out how to compress the fast/blurry scenes and make the slower/clearer scenes look better and change the bitrate/quantization respectively. Quantization can be thought of as 'apparent spoilage' of the material, where a certain amount isn't even noticeable to most humans, and in some fast-moving-high-action scenes, it is even preferable to some eyes. Low bitrate and/or high quantization would both result in loss of detail and 'blurring' or 'smoothing' of video most of the time and can result in compression artifacts such as Macroblocks and Gibbs Effects ("mosquito noise"), as the codec tries to decide 'what to keep' and 'what to lose' (lossy compression). With CRF, it usually will try to keep a slow-moving scene (someone walking, people talking) detailed, without much quantizing ('spoilage'), so that it looks good. It will take high-motion (fast action, fast changing) scenes and quantize them more (smoothing, blurring, 'spoiling') since the human eye won't notice it as much on fast-motion, already-slightly-blurred changes on the screen.

To summarize the differences between the types of data rate controls (the choices in this pulldown bar):
CQP is like stating, "Keep this quality, I don't care how big the bitrate/file gets", and ABR is like stating, "Keep this bitrate/filesize, I don't care how crappy you have to make the video look to stay within that", while CRF is more like stating, "Try to keep this bitrate, but change it a little as you need (within a certain amount) and make the video look slightly crappy if you need to as well, but don't let either one get too out of whack". As CRF also seemed to be the one with the least amount of effect on the system in the form of 'lag' [only slightly less than the other ones in testing], while being so highly configurable (compared to the other choices, where you state a bitrate to stay within and it adjusts itself), I suggest using CRF for your x264/AVC game capturing.


Some Data / Bitrates seen with CRF


Since it looks like we are going to be sticking with CRF as our main datarate control factor [it performs slightly better than the other choices in tests], here's an example of some bitrates that can result from using it (in kilobits per second):

Recorded game: Hitman: Absolution Benchmark [grainy, panning, high and low motion areas]
Recorded codec: H.264/AVC (MPEG-4 Part 10) using the x264vfw interface
Recorded settings [some]: No Deblocking, No Max Bitrate, No CABAC, adjusting only CRF

CRF 51  ~1200 kbps (lowest possible quality setting for CRF)
CRF 42  ~1600 kbps
CRF 32  ~6000 kbps
CRF 29  ~8500 kbps
CRF 26  ~14000 kbps
CRF 23  ~22000 kbps (codec default setting for CRF)
CRF 22  ~23000 kbps
CRF 21  ~32000 kbps
CRF 18  ~45000 kbps
CRF 15  ~64000 kbps
CRF 13  ~79000 kbps
CRF 10  ~101,000 kbps
CRF 5    ~119,000 kbps
CRF 1    ~119,000 kbps (highest possible quality setting for CRF)

As you can see, the higher the CRF, the lower the bitrate, so the lower the recorded file size will be (but also the lower the apparent quality). If you desire a higher-quality recording, then a lower CRF is what you want. Even with a very low CRF, the bitrate is nowhere near as high as, say, a FRAPS or YV12 recording (both of which can easily be over 500,000 kbps), but that is the nature of the codec as it tries to compress, at least slightly, everything that passes through it.

It can also be seen that it is not a strict/hard-and-fast rule of evenly-spaced steps, when it comes to the CRF setting and Bitrate. It cannot be easily calculated that "two steps up in CRF equals this much more bitrate". That is partially the nature of the compression and partially what occurs using CRF as a datarate control. When there is more motion on the screen and as it changes, the datarate will change as well, to try to keep within certain bitrate/quality boundaries and still accurately represent what is occurring on the screen/in the frames. With almost any form of compression/codec, only if the recording were using a completely fixed bitrate, or recording a static picture/view, would it be easy to calculate the adjustments required for a certain bitrate change. It is built into the codecs to adjust themselves as needed.


What CRF setting to use


So, what CRF setting to use? The general rule is: the lower the CRF number, the more bitrate/quality you are allowing it to use, but the bigger the recorded file size will be.

The choice is somewhat subjective, as CRF18 may look good to me to record a first-person-shooter game with (the default is CRF23), but you may not like how it looks and want to use CRF10 to keep more of the details that you want to see.
Someone may not like the compression artifacts they can see when recording Minecraft using CRF18 and want to lower it to CRF15 so that it is crisper, with less 'mosquito noise' around the edges or corruption that they can see; but you may think it looks good enough at the default of CRF23 and leave it there, like someone else may do. You see?
Those are two very different types of video/games, mind you - one is dark and grainy and the other is smooth, with hard edges, like animation - but you get the point. It can vary, not only person to person, but also game to game. A few short tests are all it will take, however, and you will quickly find a CRF setting that you are happy to record with. You'll find your own balance between what will essentially be these considerations:

The higher the bitrate (lower CRF number) the higher the quality and file size will be
The lower the bitrate (higher CRF number) the lower the quality and file size will be

You will eventually decide, with just a couple of tests, what is 'good enough' quality for your eyes/uploading, and what CRF to use on that game (remember the 'balancing act' of bitrate vs quality mentioned at the beginning?). You will also find that the recorded quality isn't even fully kept anyway, after compressing the final/edited video for uploading to a video sharing site somewhere.

All of the above paragraphs between these red lines of text are the same for both the Official and the Unofficial versions of the x264vfw interface and are only duplicated for ease of reading their respective sections




Slash-whew-sweating-emoticon


That was a lot to take in at once - but you made it through - and now you are set to record using H.264/AVC, whether you prefer Dxtory, Bandicam or the completely free-to-use Afterburner from MSI. You are also now knowledgeable in both the Official and an Unofficial version of the x264 Video For Windows graphical way of setting up the codec and changing its settings.

Other than the Rate Control setting above (CRF), which you can adjust and set how you want it, the rest of the settings I have personally tested, finding which ones allow for faster recording performance and which ones negatively affect the speed of recording, utilizing the x264vfw interface (which these programs use to record in H.264/AVC). Turning off/down the ones I have stated throughout the article should give you the fastest, closest-to-'lag-free' recording you can get with this codec, while still taking advantage of its compression capability. [My own Personal Notes/Opinions/Settings are below]

Please note that if your system is older or not as capable (perhaps you have a notebook/laptop, which has less capability than a full desktop system with its own dedicated videocard and soundcard), you may have to do things like lower the recording resolution (or the resolution you are playing at in the game), or increase the CRF number (so that it uses slightly less system resources to process the recording, and will then use less bitrate and less disk space).

There are many things that can be done to help with speed, make for smoother recording, less lag, etc.
Here is a link to an article I wrote earlier on this blog, with general Tips to help with game recording:
http://gametipsandmore.blogspot.ca/2012/05/tips-for-game-recording-currently-text.html
No matter what game you play or what program you are using to record, these Tips will help you overall, with trying to record your games.

Lastly, please note, dear reader, that I am not saying "This codec is the best one to record with" or "use this one only". I am merely showing that it is possible to use it, and how to tweak it for quality or file size, according to your own personal tastes. There are many codecs out there to choose from when game recording and, although some are more apt for certain types of games than others, overall it is your own choice to do with as you will.

As always, do some of your own testing, see what works the best for you on your system and get things looking exactly how you want them to look - and have fun with it!






Personal Short Version/Opinion
and 
Settings I use:

H.264/AVC (MPEG-4 Part 10) is not normally a recommended codec to record with, for most people. It does not have the capability of the super-high, almost-what-you-see-is-what-you-get quality that other codecs like FRAPS1, YV12, Lagarith or RGB Raw would give you.
Heck, the highest bitrate possible (using the CRF setting, the fastest of the Rate Control settings in my tests) is only about 100,000kbps. That's still about 2-4x the bitrate of the average BluRay movie, mind you, and the recorded videos look fine to me.

Even if I had Terabytes of hard drive space to record to, I would still like to record in as small a filesize as I can, while attempting to keep 'good enough' quality at the same time. Gone for me are the days of filling my drives with FRAPS recordings (which do have very high quality) when today, with more powerful CPUs and GPUs, I can record at a high MJPEG setting (for compatibility with editing apps) or a high MPEG-1 setting (which saves a lot on diskspace as well and has decent compatibility with video editors) - and now I can save even more space and record with MPEG-4 AVC, if desired.
For instance, on my older system, a dual-core CPU with an NVIDIA GTS 250, recording with x264/AVC brings my framerate down by about ten frames per second and sometimes more. On my newer system, a six-core CPU with an AMD HD 6870, recording with x264/AVC with the same settings brings my framerate down by only a couple of frames per second. Wonderful stuff.

I did many, many tests to find out what specific settings made differences (especially with the Unofficial version of the interface) and for the most part, it turned out as intuitive as it seemed at the outset: any setting that increased analysis and processing of the frames slowed things down. 
I had to test it all of course, and I found many settings that didn't make a 'huge' impact on performance; with more powerful hardware these are able to be used and enjoyed, helping to compress the frames even further, keeping decent detail and resulting in a very small recorded file size. As you can adjust/optimize settings far more in the Unofficial interface (at least a bit more easily/graphically), I tend to use that one more than the Official version; but in essence, both are the same thing. Here are some of my findings [my settings are in these square brackets]:

For the Official ("Simple", red x264 logo) interface:

I found that, on a more modern desktop system, the Preset can be turned 'down' a bit (the amount of analysis going up a bit), to Superfast or even sometimes Veryfast, since many of the hard-CPU-hitting options like Trellis and larger Motion Estimation ranges don't kick in at these settings. Anything lower/slower than that makes it lag behind too much as it processes frames.
[I usually use Superfast]

While Zero Latency is pretty much required for live game recording, Fast Decode is not entirely needed, especially if you have a faster rig that can power the game and record at the same time. My computer isn't even top of the line, but it can handle NOT using Fast Decode (which, when enabled, disables CABAC and a couple of other things). CABAC can have quite a bit of an effect on quality, and since it doesn't seem to have a huge performance hit, I use it. Fast Decode also disables Deblocking, and that leads into my next setting...
[I always use Zero Latency]
[Even though it helps with speed, most of the time I don't use Fast Decode]
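As a rough point of reference only (an approximation of what these GUI choices mean, not something you need to type anywhere), the equivalent standalone x264 command-line options would look something like:

--preset superfast --tune zerolatency

and turning the Fast Decode checkbox on would be roughly the same as changing that to --tune zerolatency,fastdecode.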

The only other thing I change in the interface is deciding on the main Rate Control type and Rate Factor. In my own tests, CRF seemed to have the least amount of effect on framerate out of them all, and I fluctuate on what CRF I use, depending on the game I'm recording and what the recording is for. If I'm recording a high-quality test, of course it will be a low CRF, but it need not be a 1 to look good. I am quite satisfied with a CRF of 15-18 for decent-quality captures. The bitrate of a 1080p capture at that CRF ran about 45,000kbps, which is a high-quality BluRay movie's bitrate. For just average gameplay for fun, I usually run a CRF of 21 up to 23. The quality is 'good enough' then, especially with a low Deblocking setting helping out (the Preset of Superfast leaves Deblocking "On" at the default 0:0 setting). I don't recommend going higher than CRF23 though, with deblocking or not, as it just becomes too 'garble-y', to use a more technical term... Web browser/low-motion games, or hard-edged/animated games that look like Minecraft, also need a higher bitrate (lower CRF), so that the hard edges aren't messed up and the smooth/flat areas aren't too compressed and don't get 'blocky'.
[I use a CRF in the range of 18 for higher-quality capturing, going up to 23 if disk space is low, and don't suggest higher than 23, so that quality doesn't suffer too much] 

Nothing else needs to be adjusted in the Official x264/red logo interface, but I sometimes add "--keyint 50" or "--keyint 20" into the 'for advanced users' area at the bottom (no quotation marks). This is a command that will force a keyframe/I-frame after that many frames have gone by. This helps to keep the seeking time low when editing and allows for closer cuts when not recompressing the clip in editing. If you are having a problem with editing MPEG-4/AVC files, try using a --keyint setting of "1", so that every frame is an editable/seekable keyframe (a higher GOP is better for file size, but can take longer to seek when editing, and Vegas or Premiere might show 'trails' or 'corruption', so they might need a GOP setting of 1). You may want to increase the bitrate to keep quality then though, as the codec can't really compress/do much of its 'magic' without room to work between frames. As stated above, the optimal codec for editing is MJPEG, but many do not like the quality (try turning it up to 100%).
Leaving it blank and not using any "--keyint" command will allow the codec to use many more frames (250 is the default, I believe) for compression, and the resulting recorded file size will be very small.
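As a made-up example of what could go in that 'advanced users' box (just an illustration - the CRF itself is set with the Rate Factor control above, not in the box):

--keyint 50 (a keyframe every 50 frames: small-ish files, reasonable seeking in an editor)
--keyint 1 (every frame is a keyframe: easy editing in Vegas/Premiere, but much bigger files)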

For the Unofficial ("Complicated", black x264 logo) interface:

The first tab is just the Rate Control setting. CRF seemed to have the least effect on framerate.
I also sometimes add the --keyint setting to the command box at the bottom, just like I talked about a couple paragraphs up.
[I use a CRF in the range of 18 or less for higher-quality capturing, going up to 21 - or, if disk space is low, up to 23 at the most - and don't suggest higher than 23, so that quality doesn't suffer too much]

For the second tab, there is a lot to change, but I already talked about the 'why' of most of it earlier in the article, so I'll just state what I like to use here, after this paragraph on Deblocking:

Deblocking can be forced to 'leave things in'/keep details, by using a negative value, especially when dealing with effects such as film grain and minute details and noise; but for the most part deblocking will 'smooth things out', especially trying to hide where a lot of detail has been lost, where compression artifacts like Macroblocks and Gibbs Effects (mosquito noise) around edges would show up. I don't recommend putting deblocking up too high though, as it can really make things downright blurry. With deblocking completely off, all of the detail is kept, but many things can look like they have a 'wood grain' pattern on them. With a high bitrate/low CRF, there is no need for deblocking at all anyway, as there would be a lot less compression/artifacting. When recording a high-quality test or something like that, I leave it off. When just recording random gameplay or myself doing something, I turn it on at a low setting, to help with keeping things looking slightly more 'clean'.
[I use Deblocking at 1:0 (CUDA default) up to 3:0, when I use it, but any more and things start to look 'too smoothed out' to me]
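For reference, the two numbers are the deblocking filter's strength and threshold offsets (alpha:beta in x264 terms); negative values hold on to more grain/fine detail, positive values smooth more. On the standalone x264 command line (shown only as an approximate equivalent of the GUI boxes), the same ideas would look roughly like:

--deblock -2:-1 (keep grain and fine detail)
--deblock 1:1 (light smoothing to help hide blocks)
--no-deblock (deblocking off entirely)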

For this second tab then:
[No Partitions (all unchecked), Fast P-skip checked, 1 Max reference frame, ME algorithm on Diamond, ME range of 4, Subpixel ME refinement of 0 or 1 (any higher causes more lag), GOP size of 20 or 50 (this is the --keyint setting from above; higher is better for file size, but can take longer to seek when editing, and Vegas or Premiere might show 'trails' or 'corruption', so they might need a GOP setting of 1), No Weighted P-frames, Max consecutive B-frames at 0 (none), In-loop deblocking filter unchecked (off) or checked (on) with a 1:1 deblock, CABAC on (it's slower for less-capable systems), DCT decimation checked (on), Trellis set to Off]
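If it helps to see those checkboxes another way, they map very roughly onto standalone x264 options like the following (an approximation only - the GUI is what these recording programs actually drive):

--partitions none --ref 1 --me dia --merange 4 --subme 1 --keyint 50 --weightp 0 --bframes 0 --no-deblock --trellis 0

(or --deblock 1:1 instead of --no-deblock when the in-loop filter is turned on; CABAC and DCT decimation are simply left at their defaults, which is on)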

In the third and last tab of the Unofficial (black x264 logo) interface, I sometimes set a VBV max bitrate of about 40000 (kbps), just to keep the filesize down, especially when I'm going to edit and recompress for uploading somewhere (which usually gets rendered out for upload at that bitrate or lower anyway); it looks 'good enough' to me. This also requires a buffer of some sort, to allow it to check the bitrate as it goes up and down, so I put in a buffer size of 4000 or 10000 if I set a VBV max bitrate.
[No max bitrate usually, unless you want to save some harddrive space (eg. 50000k)]
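In standalone x264 terms (again, just an approximate equivalent of what those two boxes do), that capping would look something like:

--vbv-maxrate 40000 --vbv-bufsize 10000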

That's all I change for both of those interfaces, from their defaults, as of the time of this post. 
Keep in mind that as games are updated and re-optimized, drivers are updated, and new hardware is released/purchased, I may change these settings (and I advise you to try some tests too, if you wish, when things change). Also, I am not a stickler for the utmost quality and don't mind things getting compressed a bit, so you might want tighter/higher settings.
Mainly, I hope that this info helps some of you to record with less lag (many people report that with more optimized MPEG-4/AVC settings they have less apparent lag), or at least teaches you some terms and concepts that may help when editing and compressing your own videos/shows/etc, and overall just helps you record better/smoother. I actually enjoyed testing all of these things out and learning about it, and I hope it helps anyone that is recording their gameplay adventures.


See you in the games!


4 comments:

  1. This is an amazing post, but I'm having some issues with playback in Sony Vegas and Premiere - it's all corrupted and impossible to edit. So I changed the GOP min to 0 and max to 1 as advised, and it works, but the files are huge now and it completely defeats the purpose of using this over Lagarith. So what else can I do to fix it? I have tried fixing it for almost 3 months now and have no idea what I can do.

    The file size/quality is incredible for my 2560x1440 recordings. But since I can't edit it, I have to re-encode every single clip into Lagarith or another lossless codec. Please help :)

  2. @Anonymous said...
    This is an amazing post...

    Reply:
    Thanks! I'm glad that it has helped/edified anyone at all, that's great.

    To start, you are at least able to open it in Vegas/Premiere, so it isn't a codec problem, correct? Just making sure (you said "since I can't edit it").

    It seems you have run into the exact problem I did when I started recording with MPEG-4/AVC while trying to keep the quality with a GOP/keyframe interval of "1" - the problem is no longer trying to maintain Quality, it is the increased file sizes...

    You are correct: adjusting the GOP/keyframe interval to be 1, it will open/be editable in Vegas and Premiere, but the file size will go up... You are also correct in your inference that it basically takes away the function of the H.264/AVC compression algorithms (keeping only the differences between the frames), since then there are no differences kept between the frames - they are all then Full, Independent frames - it basically becomes a high quality MJPEG clip! There are still a bunch of advantages over MJPEG however, as MJPEG has no anti-blocking capability or the other artifact-hiding tricks that H.264/AVC has up its sleeve:

    Some of the things you can do [I may write an entire post on this], if you have to record with a GOP of 1, are to cap the bitrate usage and enable Deblocking, among some other features. Since your system can handle 1440p recording with h.264/AVC, you should be able to enable a few of these:

    - cap the bitrate: this can be done either by setting a buffer and max bitrate
    [try starting at a buffer size of 5000 or 10000 and a max bitrate of 50000k for 1440p, see how it looks to you, and adjust up or down from there]
    or by increasing the CRF/CQP [whichever you prefer to use], which will allow a higher quantizer and lower the file size output [stopping around a number that starts to look 'not good enough' to you]

    - enable Deblocking: turning this on will add some smoothing at the block edges, to try to hide the macroblocking artifacts that may start to show up. Even if there are no 'blocky' artifacting problems with the recordings, enabling it will also lower the output filesize a bit, by its byproduct of slight smoothing where the blocks meet

    I wrote another article on maintaining Quality and some tests that I did with recording and 2K output here:
    http://gametipsandmore.blogspot.ca/2013/07/quality-test-h264avc-game-recording-2k.html
    and it will show some screenshots of the Max Bitrate (of 50000k), etc., and how to set those settings.

    A couple other suggestions (also shown in that article) would be to make sure that the Motion Estimation ("ME" settings) are low, which will make some things happen a little faster with the codec and allow it to soften things a bit behind-the-scenes, which will reduce the amount of small detail/data that the codec will try to hang onto - and it shouldn't look much worse to human eyes (eg. ME algorithm = diamond, Subpixel ME = 1 (fastest)).
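    Pulled together (as a rough sketch only - adjust the numbers to your taste and hardware), those suggestions would look roughly like this in x264 option form:

    --keyint 1 --crf 18 --vbv-maxrate 50000 --vbv-bufsize 10000 --deblock 1:1 --me dia --subme 1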

    So capping the bitrate, enabling Deblocking, and adjusting the Motion Estimation are just a few things you can try to start off with, to lower the overall filesizes when recording with h.264/AVC, all while keeping the GOP at 1 so you can edit it. Try it out and come back if you need more ideas :)

  3. Could you be so kind as to show what your render settings were in Sony Vegas?

    1. Sorry I didn't mention that... I usually don't actually, as I change my settings a lot, based on a few concepts, such as:

      » The resolution I'm uploading in

      If I'm not going to be uploading in 1080p, but am going to use 720p instead, I choose lower bitrates, as the lower resolution doesn't need as much data-per-second to represent the material (it can do a 'good enough' job with a lower setting), so the resolution will be 1280x720 ("720p HD") - and the bitrate consideration leads into the next thing...

      » The bitrate used will be based on the material

      If I have recorded mostly a Desktop Tutorial, with many still areas and not a lot of detail (I mean textures, colours, etc), I will use a lower bitrate, as it can still be represented well, with something like 8Mbps, whereas
      If I have recorded a game with a lot of motion going on, such as a First Person Shooter (Counter-Strike, Battlefield, Call Of Duty, etc), I will use a higher bitrate, as it needs the higher data-per-second to more properly represent the material (so it doesn't lose a lot of detail), using something like 15-25Mbps or even higher and the compression brings up another concept...

      » The upload time and the compression quality

      If I have a longer video and want to reduce the upload time, I would lower the quality a bit by choosing a higher compression setting (a lower bitrate, which means more compression is done). I would sacrifice a bit of quality, while still trying to let it look good, in order to save space and upload time. This leads into the last idea of...
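      As a quick, made-up example of that trade-off: a 10-minute desktop tutorial at 8 Mbps works out to roughly 8 × 600 ÷ 8 = 600 MB to upload, while the same 10 minutes of a fast shooter at 25 Mbps is roughly 1875 MB (about 1.9 GB) - just over three times the upload time for the same length of video.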

      » Personal Preference

      Not only are recordings different (source material or project targets) but everyone is different with how they want to do their video. I might not mind uploading a 720p video on a "just for fun" gameplay recording, but another person might always want to upload at 1080p or higher. Everyone is different, so I usually don't say to people "use this, it's the best", as there are always different uses for different settings, and different reasons to change things.

      [On a personal note, I don't like using the word "best" as things are changing and improving all the time as people try new things and technology changes, and I don't like saying "this is it, it will never get better". And I don't like telling people what to do (I don't like being told what to do either, heh) and so I always present my information as suggestions, that they can try; but I always want people to do some little tests on their own and find what they will like to use for their own projects :]

      Because you and others have been asking, I think I would like to start some articles on something like "Examples of Rendering Settings for Vegas", where I show some settings that I would use (which, for me, depend on what I was recording (a game with a lot of detail or not) and what I wanted to upload in (lower quality and lower resolution, or higher resolution while keeping higher quality))... I may do just that soon!

      I hope these few ideas show how different it can be from person to person and project to project, but thanks to your query, I WILL present some rendering settings in the future :)
