Prevent Audio Desynchronization when recording via GeForce Experience

If you’ve ever tried to record gameplay footage using GeForce Experience, you may have noticed that the audio gradually desynchronizes over time. When the footage is under 5 minutes, you won’t notice it, but above 20 minutes, the audio delay can be as much as 2 seconds. (Technically, it’s not the audio that’s delayed; rather, it’s the video not being synced properly.)

The following is an example of me trying to record gameplay at 60 fps when my machine could only produce 30-50 fps. The original recording was over 45 minutes long and I had to edit the footage by cutting and resyncing the audio multiple times. It was a pain to edit.

Valheim running at 30-50 fps while recording at 60 fps

The most common solution you will find online is to limit your recording to 5-10 minutes per clip. This way, the desynchronization resets at the start of each new clip. However, I found a way to keep the audio and video in sync. The following video’s original footage was over 20 minutes long, but I didn’t need to edit much to resync the audio and video.

Valheim captured and recorded at 30 fps

The main cause of the issue is simple: video games are rendered at a variable frame rate, measured in frames per second (fps). Videos, on the other hand, are played back at a constant frame rate such as 29.97 fps (NTSC) or 25 fps (PAL), among many other standards. To minimize, if not eliminate, audio desynchronization, it is best to match the game’s frame rate with the recording frame rate. GeForce Experience has 2 recording frame rate options: 30 and 60 fps.

GeForce Experience recording settings

For the game, you must set a limit to your frame rate.

Frame Rate option in Overwatch

But note: limiting the frame rate in the game simply sets a maximum. It doesn’t guarantee that the game will actually hit that frame rate. If your game is running below the desired frame rate, you’ll need to tweak some settings to increase it. The frame rate primarily depends on 2 things:

  • Graphics Settings
  • Graphics Processing Unit (GPU) or Video Card

Your graphics settings determine the computational power needed while your GPU performs those computations. The higher the quality of your graphics settings, the more powerful a GPU you’ll need. If money isn’t a problem, then simply buy a more powerful graphics card and your problems are solved – no need to read the rest of this guide. Who said money can’t solve your problems? But if you’re not related to Richie Rich, then please continue reading.

Lowering the following graphics options will have the highest impact on improving your frame rate.

  • Screen Resolution
  • Shadow Quality
  • Reflections/Refractions
  • Particles
  • Post-Processing (Bloom, HDR, Depth of Field, etc)

The Screen Resolution setting is pretty straightforward – a lower resolution means fewer pixels to draw, less work for the GPU, and a higher frame rate. 4K resolution has 4 times the number of pixels of 1080p, so dropping from 4K to 1080p should, in theory, quadruple your frame rate; in practice, other factors prevent that much gain. Those factors are deep in the details of GPU architecture, which I do not plan to cover here. Nonetheless, the effect is immediately noticeable.

Shadow Quality affects how sharp the shadows are. Shadows are one of those graphics techniques that add a lot of immersion to video games simply because our brain intuitively uses shadows to determine things like depth and distance. Unfortunately, in video games, shadows are an extra render pass. This means the scene is rendered once per shadow-casting light source to determine which objects occlude light. The shadow quality settings control the quality of these render passes. You will see significant frame rate boosts when lowering shadow quality, or turning shadows off completely, in scenes with shadow-casting light sources.

Comparison of Shadow Quality Settings (Low, Medium, High, Ultra) in Overwatch

Side note: do not confuse Shadows with Lighting. Lighting is when a light source illuminates objects. When a light source is blocked by an opaque object, a shadow is cast behind it. In reality, these 2 work in conjunction. In video games, lighting is a separate computation from shadowing. In fact, there are advanced lighting techniques (e.g., Deferred Lighting) that allow hundreds of lights in the frame while barely affecting the frame rate. Shadows, on the other hand, are normally limited to 1-2 light sources since they’re computationally expensive. Games cheat around this limitation by putting a maximum distance on the shadows that a light source can produce.

Reflections and Refractions are uncommon in video games for the simple reason that they’re computationally expensive with little benefit to the overall experience. Similar to shadows, these features add extra render passes, but unlike shadows, they have less effect on immersion. These options are normally a toggle, but sometimes they come as a resolution quality. To see the effect at all, you need to be looking at a reflective or refractive surface, which, again, is not commonly in front of you in many scenes.

Particles are a visual effect used to denote fluid-like motion such as the flames of a campfire, the explosion of dynamite, smoke from a burning bush, sparks from an exposed live wire, bubbles when exhaling underwater, the exhaust of jet engines, etc. The applications are endless and they add a lot of realism and signaling to video games. GPUs manufactured in the last 5 years can easily render thousands to tens of thousands of particles every frame. But just in case your game renders millions of particles, or your GPU is a bit on the aging side, lowering the particle quality or count could give a small boost to your frame rate.

Post-Processing effects are very popular nowadays. They add visual effects that mimic how the eye dilates depending on the amount of light present (HDR), how cameras feather out strong lights (Bloom), how the eyes focus on objects at varying distances (Depth of Field), etc. These techniques are pretty standard and can easily be added to video games. However, they are an additional computation which could lower your frame rate. Some effects cost more than others, and a bit of experimentation is necessary to measure your mileage.

It is possible for frame rates to drop for other reasons. For example, if you have hundreds of pigs roaming and path-finding on a 3D map (AI), or thousands of balls bouncing off one another in a confined space (Physics), then you would see significant drops in frame rate. Ideally, the game wouldn’t let you reach that point unless you did something crazy.

Unlimited lox, wolf, boar, and chicken farm in Valheim

Let’s do a recap on how to record your footage with minimal audio desynchronization:

  1. Leave your frame rate uncapped (for now)
  2. Lower your graphics settings until you get a higher-than-desired frame rate. For example, if you’re aiming for 60 fps, lower your graphics options until you reach 70 fps.
  3. Cap your frame rate; either to 30 or 60.
  4. Set your GeForce Experience recording target frame rate to your desired frame rate.
  5. Double check that your game can run at your desired frame rate almost constantly.
  6. Hit record!

I hope this helps and may you produce high quality epic footage of your gaming experiences!

Did you notice a mistake in my post? Or simply have the urge to chat with me? Either way, feel free to reach out in this Twitter thread.


New Workstation has arrived!

My current gaming desktop/workstation turns 4 this year. I bought it when I was still in Florida in late 2018. And since I’m applying for remote work with the intention of moving, I decided that I needed a new mobile machine for my game development and game playing needs. Instead of getting a gaming laptop, I settled for a creator laptop. It’s designed for multimedia content creators such as video editors and graphics artists. It’s powerful enough to be a gaming machine without the fluff of an RGB keyboard or a bulky exterior. I wanted something sleek and simple on the outside that packs a punch on the inside. Hence, I got an ASUS Vivobook 16X OLED straight from the ASUS store.

New ASUS laptop on top of Old ASUS laptop

Here’s a quick rundown of my tech specs:

  • Win 11 Home
  • AMD Ryzen 7 5800H
  • 16GB RAM
  • NVIDIA GeForce RTX 3050 Ti Laptop GPU
  • 16.0-inch, 4K (3840 x 2400) OLED Display

As with all new machines, I needed to install my tools and toys, and I keep forgetting what they are.

The following is a list of software for future reference. I hope it helps you too in one way or another.

I’ll update the list as needed.


Which came first; Awake(), OnEnable(), or Start()?

All Unity developers know that Awake() happens before OnEnable(), and both get called before Start(). Just to make sure, we visit the documentation to help us sleep at night. We never bother to test it and simply assume that all Awake() calls happen before all OnEnable() calls, which happen before all Start() calls. But let’s say, out of boredom, we did test it.

Imagine 2 scripts, ScriptA and ScriptB, that do exactly the same thing: print out their game object’s name, the script name, and the function being called (Awake(), OnEnable(), or Start()). Let’s have 2 game objects: Obj1, with both scripts, and Obj2, with just ScriptB. We run it in the editor and check the logs to confirm what we already know.
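In case you want to try it yourself, here’s a minimal sketch of what ScriptA could look like (ScriptB is identical apart from the class name; the exact log format is my own choice):

```csharp
using UnityEngine;

// Prints the game object's name, the script's name, and the event being
// called. ScriptB is the same body with "ScriptA" renamed to "ScriptB",
// in its own file.
public class ScriptA : MonoBehaviour
{
    void Awake()    { Debug.Log(gameObject.name + " ScriptA Awake()"); }
    void OnEnable() { Debug.Log(gameObject.name + " ScriptA OnEnable()"); }
    void Start()    { Debug.Log(gameObject.name + " ScriptA Start()"); }
}
```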

Awake(), OnEnable(), and Start()

Surprise! Awake() can get called after OnEnable(). This may not look like a big deal until you start debugging with the wrong assumptions.

Let’s have another example: a single game object with a Rigidbody and the following 2 behaviors.
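A rough reconstruction of the 2 behaviors (each class in its own file; the AddForce call and jump force are assumptions for illustration):

```csharp
using UnityEngine;

// ScriptC caches its Rigidbody in Awake() and exposes Jump().
public class ScriptC : MonoBehaviour
{
    Rigidbody rb;

    void Awake()
    {
        rb = GetComponent<Rigidbody>();
    }

    public void Jump()
    {
        // Throws a NullReferenceException if rb hasn't been assigned yet.
        rb.AddForce(Vector3.up, ForceMode.Impulse);
    }
}

// ScriptD calls ScriptC.Jump() as soon as it is enabled.
public class ScriptD : MonoBehaviour
{
    void OnEnable()
    {
        GetComponent<ScriptC>().Jump();
    }
}
```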

With the assumption that all Awake() gets called before all OnEnable(), there should be no error. However, on the slight misfortune that ScriptD gets executed before ScriptC, ScriptD::OnEnable() will call ScriptC::Jump() before ScriptC gets to keep a reference to its Rigidbody via ScriptC::Awake(). This results in a confusing Null Reference Exception that could haunt you for hours.

What the documentation doesn’t say is that Awake() is called before OnEnable() per behavior, not globally. All Start() calls do occur after all Awake()/OnEnable() calls. We cannot determine which script gets executed first unless we manually specify it in the Script Execution Order settings.

With that in mind, I pray to the Unity Gods that your bugs be all gone and your games be fun.

For those of you who are too lazy to type, feel free to download the source code below.

3.2 KiB

P.S. yes, you can color your Debug Log messages 😀


Dragon(fruit) Jam by the Game Dev Network

Last week, I joined a 10-day game jam that ended last September 10. I’m proud of the output despite the time and asset constraints (all assets, including sfx and bgm, had to be created within the duration of the jam).

In keeping with my personal rules for game jams, I worked with people I have not worked with before and I explored a mechanic that I have not done before.

I worked with Brandon Bittner, a multi-talented artist and co-worker at Autodesk Inc, and with Steve Biggs, a programmer and 3D artist working at Bettis Atomic Power Laboratory. Together, we made…


A resource-management boat-skirmish simulator

You are the captain of your boat and you assign your crew to certain stations. There are 3 types of stations: Sails, Wheel, and Cannons. Putting more crew on the Sails makes you move forward faster. Putting more people on the Wheel allows you to turn faster. Putting more people on the Cannons (there are 4 of them) makes them aim and reload faster. Destroying an enemy boat drops a survivor which you can rescue (by ramming into them).

You can download and play the game here. Feel free to leave your honest feedback and rate the game. (I need the ratings to win this game jam, so if you would be so kind. Rating ends on Sept 17. Thank you in advance!)

I have 2 technical experiments here:

  • a dynamic and loosely-coupled resource system
  • procedurally-generated sea

Resource System

For this game, the resources are the crew members that you assign. The classes in this system are as follows:

  • ResourceManager, which holds the count of available resources
  • ResourceConsumer, which takes resources from the manager and can return them
  • ResourceManagerUI, which links the ResourceManager to UI elements
  • ResourceConsumerUI, which links the ResourceConsumer to UI elements and hotkeys

The consumers should be able to send an OnLevelUp or OnLevelDown message to listeners. I preferred to use Unity’s SendMessage() functionality to prevent any tight coupling between the resource system and the gameplay.
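To illustrate, here’s a rough sketch of how the manager/consumer pair could fit together. This is not the actual Deckhand code; the field names, method names, and SendMessage payloads (beyond OnLevelUp/OnLevelDown) are assumptions:

```csharp
using UnityEngine;

// Holds the count of available crew (the shared resource pool).
public class ResourceManager : MonoBehaviour
{
    public int available = 10;

    public bool TryTake()
    {
        if (available <= 0) return false;
        available--;
        return true;
    }

    public void Return() { available++; }
}

// A station that takes crew from the manager and can return them.
public class ResourceConsumer : MonoBehaviour
{
    public ResourceManager manager;
    public int level;

    public void LevelUp()
    {
        if (!manager.TryTake()) return;
        level++;
        // SendMessage keeps the resource system decoupled from gameplay:
        // any other behavior on this object may implement OnLevelUp.
        SendMessage("OnLevelUp", level, SendMessageOptions.DontRequireReceiver);
    }

    public void LevelDown()
    {
        if (level <= 0) return;
        level--;
        manager.Return();
        SendMessage("OnLevelDown", level, SendMessageOptions.DontRequireReceiver);
    }
}
```

A station’s gameplay script (say, the cannon aiming speed) only needs an OnLevelUp(int) method to react; it never references the resource classes directly.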


Procedural Sea

I followed the tutorials on procedurally generating a grid here and generating a height map from noise here. For Deckhand, I needed to fulfill the following criteria:

  • Appear endless
  • Mimic waves in water

To make it appear endless, I needed the sea to follow the player (which is followed by the camera) while keeping the illusion that the sea itself isn’t moving. Meaning, if there was a lump at (2, 3) of the grid and I moved the grid by (1, 1), the lump should now be at (1, 2) of the grid. Unfortunately, simply using Mathf.PerlinNoise(x – pos.x, y – pos.y) didn’t result in what I wanted.

Water Test 1

Water Test 2 – Unscaled

Notice in both Tests above that there’s a certain position at which the sea flattens out. That position is wherever x or y is a whole number.

What I’ve discovered about Unity3D’s Mathf.PerlinNoise(x, y) is that it’s periodic. Meaning, (0.5, 0) and (1.5, 0) as parameters return the same value. This actually makes sense because Perlin Noise is naturally periodic. What I didn’t understand was that when x or y approaches 0 or 1, the returned value approaches 0 as well.

I had to play around with the values I supplied to the noise generator. The biggest takeaway was that I needed to use the scale of the grid as a factor. See the downloadable example later on.
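As a sketch of the idea (not the actual package code; the scale, amplitude, and speed values are made up):

```csharp
using UnityEngine;

// Samples the height of a sea vertex so the lumps stay fixed in world
// space while the grid itself follows the player.
public class SeaHeight : MonoBehaviour
{
    public float scale = 0.15f;     // noise frequency, tied to the grid's scale
    public float amplitude = 0.5f;  // wave height
    public float flowSpeed = 0.3f;  // time-based offset so waves move when idle

    public float HeightAt(float worldX, float worldZ)
    {
        float t = Time.time * flowSpeed;
        // Sampling in *world* coordinates (scaled by the grid's scale)
        // keeps a lump at the same world position as the grid slides
        // under it, instead of flattening at whole-number offsets.
        return amplitude * Mathf.PerlinNoise(worldX * scale + t, worldZ * scale + t);
    }
}
```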

For the second criterion, mimicking waves, the Red Blob Games tutorial (linked above) demonstrates how to do just that. See the section on Elevation: Frequency and Octaves. Lastly, I added a high-frequency time-based offset to show waves flowing even when the player is stationary. Initially, I didn’t want to put a texture on the sea, but without one it’s very difficult to tell when the player is turning. I created a texture with a few white dots (stars) and clouds, as if reflecting the sky, which gave the player a sense of reference.

Here’s a unity package of the sea:

Procedural Sea
1.9 MiB


I’ve successfully met my objectives for this game jam. I am particularly happy about the sea and the resource management. The graphics look awesome, thanks to Steve. I loved the user interface and icons that Brandon put in.

My last-minute sound effects were hilarious. The wood creaking sound (when speeding up or turning) came from a wooden table at home. I nudged it and it made the perfect wood creaking sound. For the others, I mouthed the sound effects. And yes, that was me doing weeeee! and nooooo!

Boss Fight

I do understand that it’s not perfect. Play testing the game with my coworkers revealed that space + 1234 doesn’t make sense. In fact, it’s better to just hit 0 and reassign the crew rather than unassign them one by one. The biggest frustrations were of the “where’s my crew?” variety when trying to assign crew. I failed to inform users about the UI on the upper left indicating the health and available crew. The station icons sitting at the edge of the screen didn’t help users make intelligent decisions either, since their eyes are normally at the center. The suggestion was to make the icons smaller and float them above the actual cannons they represent.

I originally wanted to let go of this project but Brandon suggested making it into a mobile space game. Think of Artemis for mobile. What do you think?


Experiment at the GGJ17

My biggest takeaway from my GGJ17 experience was implementing a ‘sprite sheet animation’ selector based on a given angle. This is only applicable to 2D games with more than 2 views per animation. If you haven’t read about our game in my last post, you can get more info here.

Here are the requirements:

  • Surfers could be going right, down right, down, down left, or left
  • Dolphins could go left, up left, up, up right, or right
  • The Tiki could point left, up left, up, up right, or right

Each of these directions is a sprite sheet/sequence animation. The solution:

  • Each direction is a game object that animates through the sprite sheet/sequence
  • These direction game objects are children of a selector game object
  • The selector’s criterion is an absolute direction (to a target object, the mouse, etc.)
  • The selector activates the direction closest to the criterion and deactivates all others

If the directions weren’t animations (like the Tiki pointing), then a simple sprite selector based on angle would be enough.
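Here’s a sketch of how such a selector could be written. The field names and the Atan2-based angle math are assumptions, not the exact project code; it assumes one child per entry in angles, in the same order:

```csharp
using UnityEngine;

// Activates whichever child (one per direction) is closest to the angle
// toward a target, and deactivates the rest.
public class AnimationAngleSelector : MonoBehaviour
{
    // One entry per direction child, in degrees, in child order.
    // e.g. right = 0, down-right = -45, down = -90, and so on.
    public float[] angles;
    public Transform target; // what the sprite should face

    void Update()
    {
        Vector3 dir = target.position - transform.position;
        float angle = Mathf.Atan2(dir.y, dir.x) * Mathf.Rad2Deg;

        // Find the direction whose angle is closest to the target angle,
        // accounting for wrap-around via DeltaAngle.
        int best = 0;
        for (int i = 1; i < angles.Length; i++)
        {
            if (Mathf.Abs(Mathf.DeltaAngle(angle, angles[i])) <
                Mathf.Abs(Mathf.DeltaAngle(angle, angles[best])))
                best = i;
        }

        // Activate only the chosen direction's animation object.
        for (int i = 0; i < transform.childCount; i++)
            transform.GetChild(i).gameObject.SetActive(i == best);
    }
}
```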

A public repo of the project can also be found here. Here’s a UnityPackage of the demo:

AnimationAngleSelector
1.5 MiB


My 5th GGJ!

Another Global Game Jam completed! Woohoo! Congrats to all who participated and successfully completed a working prototype in just 48 hours! Whether you worked solo, in a team of strangers, or a team you’ve worked with before; the experience gives valuable lessons that would hopefully push you closer to that game dev career you’ve always dreamed of. Again, CONGRATULATIONS to ALL!

This is my 3rd GGJ location – from Manila to Seattle and, now, Pittsburgh. This one was hosted by the Pittsburgh IGDA (PIGDA). Every game jam inspires me to leave my comfort zone in order to experiment and explore. It is invigorating to be connected with passionate game developers of different professions, varying age groups, interesting backgrounds, and diverse interests.

For those who missed it, the keynote can be found here. The end of the video also shows the theme: WAVES! I like this theme. It’s very open – a lot of possibilities and interpretations. You can basically make any game and insert waves somehow.

My Team

Keeping with my primary objective of working with people I’ve never worked with before, I teamed up with 4 people I met at the jam to form Team Surfer Babies!

Team Surfer Babies

(left to right) Sean, Addie, Francisco, Richard, and me

Together, we created Tiki Vs The Surfers! It’s a defense game where you use dolphins, mermaids, and whales to fend off surfers who want to defile your peaceful island!

Richard and Sean are the 2 amazing artists. Sean made the surfers (baby and tourist) and the Tiki while Richard made the weapons (dolphin, mermaid, whale), background, and UI. They collaborated on the intro slideshow which explains why the Tiki was upset with the surfers. Addie was our talented musician and foley artist. She created all the audio in the game. She even voice-acted the dolphin and whale! Francisco and Francis (me) were the programmers, implementing the different behaviors and integrating everyone’s work.

Here are some more pictures of the event.

Nova Place, venue for Pittsburgh GGJ 17

The cake was not a lie!

Richard playtesting the game

Most hardcore electric guitar ever! c/o Addie the Musician/Foley Artist

See my next post to read about my technical learnings from our game.


Recommended Facebook Privacy Settings

This post is not related to Game Development or Programming, but I find it relevant enough to spread. I hope you find it useful for your online reputation.

Have you ever had a stranger like a picture on your Facebook? Or someone you don’t know suddenly commenting on your status? Or did you know that whenever you get tagged, it shows up on your wall without your permission? But most importantly, do you want to have more control over the privacy of your Facebook profile? Continue reading


Are arrays pointers?

New C/C++ programmers often have the misconception that all arrays are pointers. Well… they’re kinda, sorta, but not really. Let’s consider the following code:

Try compiling the code snippets using Coding Ground.

The pointer p points to the first character of arr. Remember that the index operator ([]) is automatically converted to an addition followed by a dereference.

Note: 2[p] would be evaluated as *(2 + p), giving us the same result as p[2]. It’s bad practice to use that form even though it works.

And dereferencing arr is similar to getting the first element.

So when do arrays and pointers differ?

Continue reading


Setting Up SDL2 using XCode 6 in OSX

My first semester at DigiPen has just ended and I get to make tutorials again! One of my classes was Game Engine Fundamentals, for which I used SDL2 to create my game engine. SDL2 is a cross-platform C library for creating games. My favorite tutorial for learning SDL is Lazy Foo. He gives detailed tutorials on multiple aspects of the library. However, his setup guide does not yet cover Mac. Hence, I have created this post.

Continue reading