01/29/23

Prevent Audio Desynchronization when recording via GeForce Experience

If you’ve ever tried to record gameplay footage using GeForce Experience, you may have noticed that the audio gradually desynchronizes over time. When the footage is under 5 minutes, you won’t notice it, but when it’s above 20 minutes, the audio delay can be as much as 2 seconds. (Technically, it’s not the audio that’s delayed; rather, it’s the video not being synced properly.)

The following is an example of me trying to record gameplay at 60 fps while my machine could only produce 30-50 fps. The original recording was over 45 minutes long, and I had to edit the footage by cutting and resyncing the audio multiple times. It was a pain to edit.

Valheim running at 30-50 fps while recording at 60 fps

The most common solution you will find online is to limit your recording to 5-10 minutes per clip. This way, the desynchronization resets at the start of every new clip. However, I found a way to keep the audio and video in sync. The following video came from an original recording of over 20 minutes, yet I barely needed any editing to resync the audio and video.

Valheim captured and recorded at 30 fps

The main cause of the issue is simple: video games render at a variable frame rate, measured in frames per second (fps), while videos play back at a constant frame rate such as 29.97 fps (NTSC) or 25 fps (PAL), among many other standards. To minimize, if not eliminate, audio desynchronization, it is best to match the game’s frame rate with the recording frame rate. GeForce Experience has 2 recording frame rate options: 30 and 60 fps.

GeForce Experience recording settings

In the game, you must set a limit on your frame rate.

Frame Rate option in Overwatch

But note: limiting the frame rate in the game merely sets a maximum. It doesn’t guarantee that the game will actually hit that frame rate. If your game is running below the desired frame rate, you need to tweak some settings to raise it. The frame rate primarily depends on 2 things:

  • Graphics Settings
  • Graphics Processing Unit (GPU) or Video Card

Your graphics settings determine the computation power needed, while your GPU performs those computations. The higher the quality of your graphics settings, the more powerful a GPU you’ll need. If money isn’t a problem, then simply buy a more powerful graphics card and your problems are solved – no need to read the rest of this guide. Who said money can’t solve your problems? But if you’re not related to Richie Rich, then please continue reading.

Lowering the following graphics options would have the highest impact on improving your frame rate.

  • Screen Resolution
  • Shadow Quality
  • Reflections/Refractions
  • Particles
  • Post-Processing (Bloom, HDR, Depth of Field, etc)

The Screen Resolution setting is pretty straightforward – a lower resolution means fewer pixels to show, less work for the GPU, and a higher frame rate. 4K (3840 x 2160, about 8.3 million pixels) has 4 times the pixels of 1080p (1920 x 1080, about 2.1 million). In theory, dropping your screen resolution from 4K to 1080p could increase your frame rate by up to 4x, but there are other factors that prevent that much gain. Those factors are quite deep in the details of GPU architecture, which I do not plan to cover here. Nonetheless, the effect is immediately noticeable.

Shadow Quality affects how sharp the shadows are. Shadows are one of those graphics techniques that add a lot of immersion in video games, simply because our brain intuitively uses shadows to determine things like depth and distance. Unfortunately, in video games, shadows are an extra render pass. This means that the scene is rendered once per shadow-casting light source to determine which objects occlude light. The shadow quality settings are your option for lowering the cost of these render passes. You will see significant frame rate boosts when lowering shadow quality, or turning shadows off completely, in scenes where shadow-casting light sources exist.

Comparison of Shadow Quality Settings (Low, Medium, High, Ultra) in Overwatch

Side note: do not confuse Shadows with Lighting. Lighting is when a light source illuminates objects. When a light source is blocked by an opaque object, a shadow is cast behind it. In reality, these 2 work in conjunction. In video games, lighting is a separate computation from shadowing. In fact, there are advanced lighting techniques (e.g., Deferred Lighting) that allow hundreds of lights in the frame while barely affecting the frame rate. Shadows, on the other hand, are normally limited to 1-2 light sources since they’re computationally expensive. Games cheat around this limitation by capping the distance at which a light source casts shadows.

Reflections and Refractions are uncommon in video games for the simple reason that they’re computationally expensive with little benefit to the overall experience. Similar to shadows, these features add additional render passes, but unlike shadows, they have less effect on immersion. These options are normally a toggle, but sometimes they come as a resolution quality setting. To see the effect, you need to look at a reflective or refractive surface, which, again, is not commonly in front of you in many scenes.

Particles are a visual effect used to depict fluid-like motion: the flames of a campfire, the explosion of dynamite, smoke from a burning bush, sparks from an exposed live wire, bubbles when exhaling underwater, the exhaust of jet engines, etc. The applications are endless, and they add a lot of realism and signaling to video games. GPUs manufactured in the last 5 years can easily render thousands to tens of thousands of particles every frame. But just in case your game renders millions of particles, or your GPU is a bit on the aging side, lowering the particle quality or count could give your frame rate a small boost.

Post-Processing effects are very popular nowadays. They mimic how the eye dilates depending on the amount of light present (HDR), how cameras feather out strong lights (Bloom), how the eyes focus on objects at varying distances (Depth of Field), etc. These techniques are pretty standard and can easily be added to video games. However, they are additional computation, which could lower your frame rate. Some effects cost more than others, and a bit of experimentation is necessary to measure your mileage.

Frame rates can also drop for other reasons. For example, if you have hundreds of pigs roaming and path-finding on a 3D map (AI), or thousands of balls bouncing off one another in a confined space (Physics), then you will see significant drops in frame rate. Ideally, the game won’t let you reach that point unless you do something crazy.

Unlimited lox, wolf, boar, and chicken farm in Valheim

Let’s do a recap on how to record your footage with minimal audio desynchronization:

  1. Leave your frame rate uncapped (for now)
  2. Lower your graphics settings until you get a frame rate higher than your target. For example, if you’re aiming for 60 fps, lower your graphics options until you reach 70 fps.
  3. Cap your frame rate at either 30 or 60 (fellow devs: see the sketch after this list).
  4. Set your GeForce Experience recording frame rate to your desired frame rate.
  5. Double-check that your game can run at your desired frame rate almost constantly.
  6. Hit record!
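Speaking of capping (step 3): if the game you’re recording happens to be one you’re developing yourself in Unity, here’s a minimal sketch of enforcing the cap in code, assuming only Unity’s standard scripting API:

    using UnityEngine;

    public class FrameRateCap : MonoBehaviour
    {
        void Awake()
        {
            // VSync overrides targetFrameRate, so turn it off first,
            // then cap rendering at 60 fps to match the recording.
            QualitySettings.vSyncCount = 0;
            Application.targetFrameRate = 60;
        }
    }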

I hope this helps and may you produce high quality epic footage of your gaming experiences!

Did you notice a mistake in my post? Or simply have the urge to chat with me? Either way, feel free to reach out in this Twitter thread.

09/23/22

New Workstation has arrived!

My current gaming desktop/workstation turns 4 this year. I bought it when I was still in Florida in late 2018. Since I’m applying for remote work with the intention of moving, I decided that I needed a new mobile machine for my game development and game playing needs. Instead of getting a gaming laptop, I settled for a creator laptop. It’s designed for multimedia content creators such as video editors or graphics artists. It’s powerful enough to be a gaming machine without the fluff of an RGB keyboard or a bulky exterior. I wanted something sleek and simple on the outside that packs a punch on the inside. Hence, I got an ASUS Vivobook 16X OLED straight from the ASUS store.

New ASUS laptop on top of Old ASUS laptop

Here’s a quick rundown of my tech specs:

  • Win 11 Home
  • AMD Ryzen 7 5800H
  • 16GB RAM
  • NVIDIA GeForce RTX 3050 Ti Laptop GPU
  • 16.0-inch, 4K (3840 x 2400) OLED Display

As with all new machines, I needed to install my tools and toys, and I keep forgetting what they are.

The following is a list of software for future reference. I hope it helps you too in one way or another.

I’ll update the list as needed.

09/12/17

Dragon(fruit) Jam by the Game Dev Network

Last week, I joined a 10-day game jam that ended last September 10. I’m proud of the output despite the time and asset constraints (all assets, including sfx and bgm, had to be created within the duration of the jam).

In keeping with my personal rules for game jams, I worked with people I had not worked with before, and I explored a mechanic I had not done before.

I worked with Brandon Bittner, a multi-talented artist and co-worker at Autodesk Inc, and with Steve Biggs, a programmer and 3D artist working at Bettis Atomic Power Laboratory. Together, we made…

Deckhand

A resource-management boat-skirmish simulator

You are the captain of your boat, and you assign your crew to certain stations. There are 3 types of stations: Sails, Wheel, and Cannons. Putting more crew on the Sails makes you move forward faster. Putting more people on the Wheel allows you to turn faster. Putting more people on the Cannons (there are 4 of them) makes them aim and reload faster. Destroying an enemy boat drops a survivor, whom you can rescue (by ramming into them).

You can download and play the game here. Feel free to leave your honest feedback and rate the game. (I need the ratings to win this game jam, so if you would be so kind: rating ends on Sept 17. Thank you in advance!)

I have 2 technical experiments here:

  • a dynamic and loosely-coupled resource system
  • procedurally-generated sea

Resource System

For this game, the resources are the crew members that you assign. The classes in this system are as follows:

  • ResourceManager which holds the count of available resources
  • ResourceConsumer which takes resources from the manager and can return them
  • ResourceManagerUI which links the ResourceManager to UI elements
  • ResourceConsumerUI which links the ResourceConsumer to UI elements and hotkeys

The consumers should be able to send an OnLevelUp or OnLevelDown message to listeners. I opted for Unity3D’s C# SendMessage() functionality to prevent any tight coupling between the resource system and the gameplay.
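To illustrate, here’s a minimal sketch of how a manager and a consumer could interact (the class and message names follow the list above, but the details are illustrative, not the actual jam code):

    using UnityEngine;

    public class ResourceManager : MonoBehaviour
    {
        public int available = 10; // crew members not yet assigned

        public bool Take()
        {
            if (available <= 0) return false;
            available--;
            return true;
        }

        public void Return()
        {
            available++;
        }
    }

    public class ResourceConsumer : MonoBehaviour
    {
        public ResourceManager manager;
        public int assigned;

        public void Assign()
        {
            if (!manager.Take()) return;
            assigned++;
            // Loose coupling: any script on this GameObject (sails, wheel,
            // cannon) can listen without the resource system knowing it.
            SendMessage("OnLevelUp", assigned, SendMessageOptions.DontRequireReceiver);
        }

        public void Unassign()
        {
            if (assigned <= 0) return;
            assigned--;
            manager.Return();
            SendMessage("OnLevelDown", assigned, SendMessageOptions.DontRequireReceiver);
        }
    }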

Sea

I followed the tutorials on procedurally generating a grid here and generating a height map from noise here. For Deckhand, I needed to fulfill the following criteria:

  • Appears to be endless
  • Mimics waves in water

To make it appear endless, I needed the sea to follow the player (which the camera follows) but give the illusion that it isn’t moving. Meaning, if there was a lump at (2, 3) of the grid and I moved the grid by (1, 1), the lump should now be at (1, 2) of the grid. Unfortunately, simply using Mathf.PerlinNoise(x - pos.x, y - pos.y) didn’t give the result I wanted.

Water Test 1

Water Test 2 – Unscaled

Notice in both tests above that there are certain positions where the sea flattens out. Those positions are where x or y is a whole number.

What I’ve discovered with Mathf.PerlinNoise(x, y) of Unity3D is that it’s periodic. Meaning, (0.5, 0) and (1.5, 0) as parameters return the same value. This actually makes sense because Perlin Noise is naturally periodic. What I didn’t understand was that when x or y approaches 0 or 1, the returned value approaches 0 as well.
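You can see the repetition for yourself with two quick calls (the exact value logged depends on your Unity version, but both calls should match):

    // Logs the same value twice: the sample repeats one unit apart on x.
    Debug.Log(Mathf.PerlinNoise(0.5f, 0f));
    Debug.Log(Mathf.PerlinNoise(1.5f, 0f));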

I had to play around with the values I supply to the noise generator. The biggest takeaway was that I needed to use the scale of the grid as a factor. See the downloadable example later on.

For the second criterion of mimicking waves, the Red Blob Games tutorial (linked above) demonstrated how to do just that. See the section on Elevation: Frequency and Octaves. Lastly, I added a high-frequency time-based offset to show waves flowing even when the player is stationary. Initially, I didn’t want to put a texture on the sea, but without one it’s very difficult to tell that the player is turning. I created a texture of a few white dots (stars) and clouds, as if reflecting the sky, which gave the player a point of reference.
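Putting those takeaways together, here’s a minimal sketch of the sampling (names and constants are illustrative; the actual implementation is in the package below):

    using UnityEngine;

    // Sketch of the height sampling only; the grid/mesh generation itself
    // follows the tutorials linked above.
    public class SeaSampler : MonoBehaviour
    {
        public Transform player;
        public float noiseScale = 0.15f; // grid scale as a factor; a fractional
                                         // scale keeps samples off the flat spots
                                         // at whole-number coordinates
        public float waveSpeed = 0.5f;   // time-based offset for flowing waves

        // x and y are a vertex's local grid coordinates.
        public float Height(float x, float y)
        {
            float wx = (x + player.position.x) * noiseScale;
            float wy = (y + player.position.z) * noiseScale;
            float t = Time.time * waveSpeed;
            // Two octaves: a broad swell plus higher-frequency ripples.
            return Mathf.PerlinNoise(wx + t, wy + t)
                 + 0.5f * Mathf.PerlinNoise(2f * wx + t, 2f * wy + t);
        }
    }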

Here’s a Unity package of the sea:

Procedural Sea (procedural_sea.unitypackage, 1.9 MiB)

Post-Mortem

I’ve successfully met my objectives for this game jam. I am particularly happy about the sea and the resource management. The graphics look awesome, thanks to Steve. I loved the user interface and icons that Brandon put in.

My last-minute sound effects were hilarious. The wood creaking sound (when speeding up or turning) came from a wooden table at home. I nudged it and it made the perfect wood creaking sound. For the others, I mouthed the sound effects. And yes, that was me doing weeeee! and nooooo!

Boss Fight

I do understand that it’s not perfect. Playtesting the game with my coworkers revealed that space + 1234 doesn’t make sense. In fact, it’s better to just hit 0 and reassign the crew rather than unassign them one by one. The biggest frustration was “where’s my crew?” when trying to assign crew. I failed to inform users about the UI on the upper left indicating health and available crew. The station icons sitting at the edge of the screen didn’t help users make intelligent decisions, since their eyes are normally at the center. The suggestion was to make them smaller and float them above the actual cannons they represent.

I originally wanted to let go of this project but Brandon suggested making it into a mobile space game. Think of Artemis for mobile. What do you think?

01/23/17

My 5th GGJ!

Another Global Game Jam completed! Woohoo! Congrats to all who participated and successfully completed a working prototype in just 48 hours! Whether you worked solo, in a team of strangers, or a team you’ve worked with before; the experience gives valuable lessons that would hopefully push you closer to that game dev career you’ve always dreamed of. Again, CONGRATULATIONS to ALL!

This is my 3rd GGJ location – from Manila to Seattle and, now, Pittsburgh. This one is hosted by Pittsburgh IGDA (PIGDA). Every game jam inspires me to leave my comfort zone in order to experiment and explore. It is invigorating to be connected with passionate Game Developers of different professions, varying age groups, interesting backgrounds, and diverse interests.

For those who missed it, the keynote can be found here. The end of the video also shows the theme: WAVES! I like this theme. It’s very open – a lot of possibilities and interpretations. You can basically make any game and insert waves somehow.

My Team

In keeping with my primary objective of working with people I’ve never worked with before, I teamed up with 4 people I met at the jam to form Team Surfer Babies!

Team Surfer Babies

(left to right) Sean, Addie, Francisco, Richard, and me

Together, we created Tiki Vs The Surfers! It’s a defense game where you use dolphins, mermaids, and whales to fend off surfers who want to defile your peaceful island!

Richard and Sean are the 2 amazing artists. Sean made the surfers (baby and tourist) and the Tiki, while Richard made the weapons (dolphin, mermaid, whale), background, and UI. They collaborated on the intro slideshow, which explains why the Tiki was upset with the surfers. Addie was our talented musician and foley artist. She created all of the audio in the game. She even voice-acted the dolphin and whale! Francisco and Francis (me) were the programmers, implementing the different behaviors and integrating everyone’s work.

Here are some more pictures of the event.

Nova Place, venue for Pittsburgh GGJ 17

The cake was not a lie!

Richard playtesting the game

Most hardcore electric guitar ever! c/o Addie the Musician/Foley Artist

See my next post to read about my technical learnings from our game.

12/29/14

Are arrays pointers?

New C/C++ programmers often have this confusion that all arrays are pointers. Well… they’re kinda, sorta, but not really. Let’s consider the following code:
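(A minimal snippet consistent with the discussion below; the array contents are illustrative.)

    #include <stdio.h>

    int main(void)
    {
        char arr[] = "hello"; /* an array of 6 chars, including '\0' */
        char *p = arr;        /* arr decays to a pointer to its first element */

        printf("%c %c\n", arr[2], p[2]); /* both print 'l' */
        return 0;
    }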

Try compiling the code snippets using Coding Ground.

The pointer p points to the first character of arr. Remember that the index operator ([]) is automatically converted to pointer addition followed by a dereference: p[2] becomes *(p + 2).

Note: 2[p] evaluates to *(2 + p), giving us the same result as p[2]. It’s bad practice to use that form even though it works.

And dereferencing arr is the same as getting the first element:
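(Continuing the snippet above:)

    printf("%c %c\n", *arr, arr[0]); /* both print 'h' */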

So when do arrays and pointers differ?

Continue reading

05/15/14

Mandala Design by Programming

I can confidently say that many of my students enjoyed this one lab exercise we did in class last school year. The exercise was to create circular designs using lines in C++. It may sound weird for students to enjoy a combination of programming and trigonometry to create digital math art, but it did happen. When I uploaded their submissions to Facebook, I got a couple of private messages from former students complaining that they didn’t get to do it. And a few even got curious enough to ask me to teach them. Unfortunately, the exercise was a cumulative effort across different activities, and I couldn’t teach it in just one sitting. Since then, I’ve had “Mandala Design by Programming” on my to-do list of video lectures.

And here it is…

The CPNG header and source files can be downloaded here. The PNG encoder can be acquired here. Make sure to get lodepng.cpp and lodepng.h.

In the video, I used Xcode to compile. In case you’re compiling from the terminal, include the following flags: -ansi -pedantic -Wall -Wextra -O3. For example, if your source code is main.cpp and you wish the executable to be called main.exe, you can compile with:
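(Presumably something like the following, assuming g++; add the CPNG source file to the command as well, under whatever name your download uses.)

    g++ -ansi -pedantic -Wall -Wextra -O3 main.cpp lodepng.cpp -o main.exe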

Ideally, this video is for fellow programming teachers to inculcate the value of loops. It can also be a side exercise where students manipulate lines and colors instead of the usual text or number processing. I think this can also be valuable for Math teachers teaching trigonometry, helping students appreciate sine and cosine.
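To give a taste of the idea, here’s a minimal sketch of the kind of loop involved (drawLine is a hypothetical stand-in for CPNG’s actual drawing routine):

    #include <cmath>

    /* Hypothetical stand-in for CPNG's line-drawing routine. */
    void drawLine(int x0, int y0, int x1, int y1)
    {
        (void)x0; (void)y0; (void)x1; (void)y1; /* draw via CPNG in the real exercise */
    }

    int main()
    {
        const double PI = 3.14159265358979;
        const int cx = 256, cy = 256; /* center of a 512x512 image */
        const int radius = 200;
        const int points = 36;        /* points placed evenly around a circle */

        /* Connect every point on the circle to every other point; the
           overlapping chords form the mandala pattern. */
        for (int i = 0; i < points; ++i) {
            for (int j = i + 1; j < points; ++j) {
                double a = 2 * PI * i / points;
                double b = 2 * PI * j / points;
                drawLine(cx + (int)(radius * cos(a)), cy + (int)(radius * sin(a)),
                         cx + (int)(radius * cos(b)), cy + (int)(radius * sin(b)));
            }
        }
        return 0;
    }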

If you encountered issues while creating your own Mandala Design, e-mail me at francis dot serina at gmail dot com. Also, I’d like to see your own mandala designs so please post in the comments below the results of your own artwork 😀

05/2/14

Token from TechSmith

Last December, I won the TechSmith ScreenChamp Awards 2013. With that, I was interviewed and showed up in one of their segments, 48 in 24. It was fun because it was my first time and I got to share my experiences with creating video lectures. I actually forgot about the interview until today. It turns out that TechSmith sent a token of appreciation for the interview and it arrived today 🙂

Drinking Cup

Back of the Drinking Cup with the different TechSmith products

Thanks and Drinking Cup

Thank You card and Drinking Cup

Thank You card

Thank You Card signed by the 48in24 Crew

With that, I immediately searched for the interview and I found it! Feel free to watch and heckle me. The Skype video call was low quality, and the video is out of sync. Oh well, bear with me.

Ugh… there’s an electric fan at the back. Makes me look poor. And my friends pointed out the hanger on the closet door. Eeep! Don’t even ask what I was wearing down there. Ahahaha!

And darn, they cut out my favorite part. They asked me “If you were to put the experience into one sentence, what would it be?” and I said “It was worth the shame” 😀

02/4/14

Prizes from TechSmith

As some of you know, I joined the ScreenChamp Awards 2013 last December and came away with the People’s Choice Award. My prizes arrived today in 3 boxes.

3 boxes from TechSmith

It left Amazon last Jan 20 and arrived in the Philippines on the 24th. It was stuck in Pasay City for a week due to a clearance delay, but FedEx got it through and it arrived at my doorstep yesterday. My mom, who received it, was herself excited to see what was inside. She helped me open the boxes and take pictures of the contents.

box1 from TechSmith

And in box #1 is a foldable, reversible pop-out background panel! I saw the pictures on Amazon but I didn’t expect it to be this big. That’s my mom behind the panel, by the way 🙂

box2 from TechSmith

And in box #2 was my ScreenChamp Trophy 🙂 That’s one beautiful trophy. It’s the first time I’ve seen it, and I wasn’t sure what it was when I took it out of the box. I’m quickly running out of space in my room. I have no idea where to put this…

And lastly…

box3 from TechSmith

Box #3 contained the prizes that make ScreenChamp worth fighting for. I got a red The Forge T-shirt, a Canon T5i DSLR + EF 50mm lens, a Zoom H4 Audio Recorder, a Shure Wireless Lavalier, 4 The Forge Stickers, 2 32GB SDHC Memory Cards, a Moleskine Notebook, and Staedtler pens.

I had already received my other prize, a 250 USD gift certificate to Premium Beat, through e-mail.

To all those who liked my video, a big THANK YOU! This wouldn’t be a success without you!

02/2/14

Targeting Multiple Aspect Ratios using Unity3D

Unity3D GUI is hell. Add multiple aspect ratios and you’d pretty much be pulling your hair out.

For developers who are new to tackling multiple aspect ratios, here’s a project that you can test out:

MultiAspectRatioTest (MultiAspectRatioTest.zip, 182.0 KiB)

The main issue with different aspect ratios (and screen resolutions) is that they “see” things differently. You could solve this either with Pillarboxing/Letterboxing or with Stretching. Pillarboxing/Letterboxing is totally out of the question; nobody likes losing screen real estate. Google around and you’ll see suggestions to stretch the GUI by manipulating GUI.matrix via Matrix4x4.SetTRS(), but stretching is not cool.

For 3D games, this is barely an issue; for 2D games, it is. It’s also an annoyance to develop User Interfaces for different aspect ratios. Then I stumbled upon How to support 3 iOS aspect ratios in Unity3D, which showed how to use a single camera across multiple devices with different aspect ratios. The solution: change the orthographic size of the camera! However, there were some discrepancies with the exact pixel values, which led me to create the test project above. Nigel also mentioned using the orthographic size for the UI.
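Here’s a minimal sketch of that idea, with illustrative numbers: it guarantees that a design-time “safe zone” is visible on every aspect ratio.

    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class SafeZoneCamera : MonoBehaviour
    {
        // Illustrative safe zone: 10 x 6 world units must always be visible.
        public float safeWidth = 10f;
        public float safeHeight = 6f;

        void Awake()
        {
            Camera cam = GetComponent<Camera>();
            // orthographicSize is half the visible height. Take whichever
            // size fits both the safe width and the safe height on screen;
            // wider or taller devices simply see extra space past the zone.
            float halfHeight = safeHeight / 2f;
            float halfHeightForWidth = (safeWidth / cam.aspect) / 2f;
            cam.orthographicSize = Mathf.Max(halfHeight, halfHeightForWidth);
        }
    }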

As for the User Interface, to hell with Unity3D GUI since you can’t even see it while you’re coding. Sprites are way better; they make the world a better place for game developers. Combined with a dedicated camera to draw the user interface from sprites, and a Text Mesh for dynamic text, we can actually replace most of the form elements in GUI.

I ran my project on an iPhone 5s, an iPhone 3GS, and an iPad 3, and the results are as follows.

iPhone 5s

iPad 3

iPhone 3GS (with a different font size)

What does this mean? Keep your game elements inside the green box and they will be seen in any of the 3 aspect ratios. Then, for your UI, use sprites; see how I aligned those orange boxes to the top/bottom edges of the screen. As for the extra space on wider aspect ratios, that’s up to you, actually.

01/17/14

MGJ2014 Interview at ANC

This afternoon, I was interviewed by ABS-CBN News Channel regarding the Manila Game Jam 2014!

Check it out here!

Ack, there’s a reason why I prefer to be behind the camera.

Nonetheless, the MGJ2014 Orientation is in less than 11 hours! MGJ2014 is in 6 days! Talk about anxiety attacks!