As technology has advanced, game development has made significant progress in creating immersive and realistic game worlds. In this post, we will go through the main technologies behind realistic game worlds: AI for characters and environments, physics, sound, and graphics. Let's look at what is leading the field right now, where these technologies are used, in which projects, and which companies are developing them.
What’s happening
Developing AAA games can now cost as much as making a comparable movie, if not more. Studios strive to outdo competitors in realism and immerse the player in a fictional world as deeply as possible. All of this requires the most advanced development tools and process optimization.
Generative AI and more
Artificial intelligence has always been part of games, but generative AI entered developers' arsenal only recently. Some believe it has the potential to revolutionize game development, while others criticize it.
One way developers are already bringing AI into the development process is the generation of art and assets. A generative model trained on a handful of reference images can produce large numbers of similar works faster than a human. Tools like DALL-E 2 and ChatGPT can help game writers create original stories, expand on core ideas, or write in-game text.
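As a rough sketch of what asset generation can look like in practice, here is how one might batch-generate concept-art variations with the OpenAI Python client; the prompts, sizes, and output handling are illustrative assumptions, and the exact client interface depends on the SDK version.

```python
# Minimal sketch: batch-generating concept-art variations with a text-to-image API.
# Assumes the OpenAI Python SDK (v1.x) and an OPENAI_API_KEY in the environment;
# the prompts and sizes are illustrative only.
from openai import OpenAI

client = OpenAI()

prompts = [
    "hand-painted fantasy potion bottle, game asset, isolated on white",
    "hand-painted fantasy potion bottle, glowing green liquid, game asset",
]

for i, prompt in enumerate(prompts):
    result = client.images.generate(
        model="dall-e-2",   # text-to-image model mentioned above
        prompt=prompt,
        n=2,                # two variations per prompt
        size="512x512",
    )
    for j, image in enumerate(result.data):
        print(f"prompt {i}, variation {j}: {image.url}")
```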
Scenario recently opened up its generative AI platform, GenAI, to developers. It claims to offer customizable generators that developers can use to create art in their specific style. Scenario CEO Emmanuel de Maistre showed how it works by creating dozens of small pieces of art from a series of prompts, and demonstrated the process on Twitter by generating several images of potion bottles with Scenario.
Inworld AI plans to use generative AI to allow NPCs in games to produce more dialogue in response to what the player says to them. They have already used this technology to create a demonstration of a virtual Santa Claus that reacts to children’s interactions with him.
NVIDIA Omniverse Avatar Cloud Engine (ACE) offers a fast and versatile way to bring interactive avatars to life at scale. Game developers can use ACE to integrate NVIDIA AI into their applications, including NVIDIA Riva for expressive character voices with speech AI and translation, or Omniverse Audio2Face and Live Portrait for AI-driven 2D and 3D character animation.
Audio2Face is already used in game development and helps artists animate characters more efficiently without the tedious manual process. The latest release introduced a headless mode and a REST API, so the application can run in a data center and process audio files from many users at once.
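As a purely hypothetical sketch of that workflow, the snippet below shows how a headless animation service could be driven over REST. The host, endpoint path, and payload fields are invented for illustration and are not the actual Audio2Face API; consult NVIDIA's documentation for the real interface.

```python
# Hypothetical sketch of driving a headless audio-to-facial-animation service over REST.
# The host, endpoint path, and payload fields are invented for illustration;
# the real Audio2Face API differs.
import requests

A2F_HOST = "http://localhost:8011"  # assumed local headless instance

def animate_clip(audio_path: str, character: str) -> dict:
    """Send one audio clip for processing and return the job result."""
    payload = {
        "audio_file": audio_path,   # path visible to the service
        "character": character,     # which face rig to drive
        "export_format": "usd",     # assumed export option
    }
    response = requests.post(f"{A2F_HOST}/process", json=payload, timeout=300)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    result = animate_clip("recordings/line_042.wav", "npc_guard")
    print(result)
```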
Machine learning
NPCs are often a pain point. They have a limited number of scripts, so such characters quickly become predictable and uninteresting to the player. Machine learning can make these characters smarter and more lifelike: reinforcement learning, for example, allows NPCs to adapt their behavior and decision-making to the player's actions.
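To make that concrete, here is a minimal sketch of tabular Q-learning for a hypothetical NPC that learns whether to attack, defend, or flee based on the player's last move; the states, actions, and reward model are all invented for illustration.

```python
# Minimal sketch: a hypothetical NPC adapting to the player with tabular Q-learning.
# States, actions, and rewards are invented for illustration.
import random
from collections import defaultdict

ACTIONS = ["attack", "defend", "flee"]
STATES = ["player_rushes", "player_retreats", "player_heals"]
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount, exploration

q_table = defaultdict(lambda: {a: 0.0 for a in ACTIONS})

def choose_action(state: str) -> str:
    """Epsilon-greedy choice: mostly exploit, sometimes explore."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(q_table[state], key=q_table[state].get)

def update(state: str, action: str, reward: float, next_state: str) -> None:
    """Standard Q-learning update rule."""
    best_next = max(q_table[next_state].values())
    q_table[state][action] += ALPHA * (reward + GAMMA * best_next - q_table[state][action])

# Toy training loop: the "state" is simply the player's last observed move.
for _ in range(10_000):
    state = random.choice(STATES)
    action = choose_action(state)
    # Invented reward model: e.g. defending against a rush pays off.
    reward = {"player_rushes": {"defend": 1.0},
              "player_retreats": {"attack": 1.0},
              "player_heals": {"attack": 1.0}}[state].get(action, -0.1)
    update(state, action, reward, random.choice(STATES))

print({s: max(a, key=a.get) for s, a in q_table.items()})
```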
Civilization games use AI to create intelligent opponents that can adapt to the player’s strategy. The AI in the game can analyze the player’s actions and react accordingly, making the game more challenging and fun.
Minecraft uses reinforcement learning (RL) to train agents, often called "bots," to perform various tasks in the game. Minecraft is also the basis of Microsoft's Malmo platform, which lets developers train and test RL algorithms inside the game's virtual world.
Procedural Content Generation, or PCG, is a widely used technique in the gaming industry that helps create game levels, environments, and other content using algorithms. Developers can quickly and efficiently create complex game worlds while maintaining a high level of variety and unpredictability.
The space exploration game No Man's Sky is a real-world example of PCG in action. In No Man's Sky, the algorithm generates an effectively infinite universe of planets, flora, fauna, and terrain. Using predefined parameters and rules, it creates a particular type of planet based on its distance from its star, the elements present, and so on. This keeps exploration unpredictable.
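A toy sketch of the same idea, with invented rules rather than No Man's Sky's actual algorithm: derive a planet's attributes deterministically from a seed and its distance from the star, so the same seed always reproduces the same system.

```python
# Toy procedural generation sketch: planet attributes derived from a seed and
# orbital distance. The rules are invented for illustration.
import random
from dataclasses import dataclass

@dataclass
class Planet:
    name: str
    distance_au: float
    climate: str
    has_life: bool

def generate_planet(seed: int, index: int) -> Planet:
    rng = random.Random(seed * 100_003 + index)   # deterministic per (seed, planet)
    distance = round(rng.uniform(0.3, 30.0), 2)   # distance from the star in AU
    if distance < 0.8:
        climate = "scorched"
    elif distance < 2.0:
        climate = rng.choice(["temperate", "arid"])
    elif distance < 6.0:
        climate = rng.choice(["frozen", "toxic"])
    else:
        climate = "gas giant"
    has_life = climate == "temperate" and rng.random() < 0.4
    return Planet(f"Planet-{seed}-{index}", distance, climate, has_life)

# The same seed always reproduces the same system.
system = [generate_planet(seed=42, index=i) for i in range(5)]
for planet in system:
    print(planet)
```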
Machine learning helps balance game difficulty by optimizing the game ecosystem and mechanics, reducing bugs and glitches, and improving the gameplay experience. By analyzing player data, behavior, and game statistics, ML algorithms help developers fine-tune game mechanics.
FIFA uses ML to control difficulty. Algorithms analyze data about the team and the player's behavior and dynamically adjust the difficulty to match the player's skill level, keeping the game challenging without becoming overwhelming. ML algorithms are also used to analyze player movement and positioning, allowing the characters to move and act like real footballers on the pitch.
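As a simplified sketch of dynamic difficulty adjustment, the snippet below nudges an opponent's difficulty based on recent match statistics; the metrics and thresholds are invented and far cruder than what a shipped sports title would use.

```python
# Simplified dynamic difficulty adjustment sketch. The metrics and thresholds
# are invented; real systems tune these adjustments from large amounts of
# player data rather than hard-coding them.
from dataclasses import dataclass

@dataclass
class MatchStats:
    goals_for: int
    goals_against: int
    possession: float     # 0.0 - 1.0
    pass_accuracy: float  # 0.0 - 1.0

def adjust_difficulty(current: float, recent: list[MatchStats]) -> float:
    """Nudge AI difficulty (0.0 easy - 1.0 hard) toward a close, competitive match."""
    if not recent:
        return current
    avg_margin = sum(m.goals_for - m.goals_against for m in recent) / len(recent)
    avg_control = sum(m.possession for m in recent) / len(recent)

    if avg_margin > 2 and avg_control > 0.6:      # player is dominating
        current += 0.05
    elif avg_margin < -2 or avg_control < 0.35:   # player is struggling
        current -= 0.05
    return min(1.0, max(0.1, current))

history = [MatchStats(4, 1, 0.65, 0.88), MatchStats(3, 0, 0.70, 0.90)]
print(adjust_difficulty(0.5, history))  # -> 0.55: opponent AI gets slightly harder
```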
The Black & White games use AI to create a god-game experience, letting players control and manipulate the game world through their actions. The game's AI learns from the player's choices, adapting to their playstyle and providing a unique experience.
Sound
3D audio modeling allows developers to create realistic and immersive soundscapes where sounds are positioned in 3D space based on the player’s position and orientation. This creates a feeling of depth and realism.
To achieve 3D audio simulation, developers use a variety of techniques, including binaural audio and positional audio. Binaural audio uses two microphones to create a stereo effect, mimicking the way the human ear processes sound. Positional audio uses algorithms to calculate the position of sound in 3D space and adjusts the volume and direction of the sound based on the player’s movements.
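A minimal sketch of the positional part: compute a gain from distance and a simple left/right pan from the listener's orientation. Real engines layer HRTFs, occlusion, and reverb on top of this.

```python
# Minimal positional-audio sketch: distance attenuation plus a simple stereo pan
# computed from listener orientation. Real engines add HRTFs, occlusion and reverb.
import math

def positional_audio(listener_pos, listener_forward, source_pos, max_distance=50.0):
    """Return (gain, pan) for a sound source; pan is -1.0 (left) .. 1.0 (right)."""
    dx, dy = source_pos[0] - listener_pos[0], source_pos[1] - listener_pos[1]
    distance = math.hypot(dx, dy)

    # Linear falloff with distance (many engines use inverse or logarithmic curves).
    gain = max(0.0, 1.0 - distance / max_distance)

    # Pan: project the direction to the source onto the listener's right vector.
    fx, fy = listener_forward
    rx, ry = fy, -fx                      # right vector, perpendicular to forward
    pan = (dx * rx + dy * ry) / distance if distance > 0 else 0.0
    return gain, pan

# Listener at the origin facing +Y; an enemy 10 units away to the right.
print(positional_audio((0, 0), (0, 1), (10, 0)))  # -> (0.8, 1.0)
```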
For example, The Last of Us used a wave propagation technique to help players work out where sounds were coming from: the location of enemies, doors, and any other activity. The game's sound engine processes about 1,500–2,500 waves per frame. As a result, sound in the game travels much as it does in real life; it is spatial and contextually responsive.
Blockchain, NFT and other buzzwords
And finally, let's talk about blockchain, NFTs, and the industry's other favorite buzzwords in the context of game development.
Blockchain has given rise to a new subgenre called “crypto games.” They differ from traditional games in that they allow players to earn cryptocurrency (or digital currency) by playing the game. Cryptocurrency earned in these games can be converted into real money or used to purchase new items in the game.
Blockchain-based games could have a significant impact on the gaming industry by giving players a new level of ownership and control over their gaming assets. Because blockchain-based games use a decentralized network, players can securely own and transfer in-game assets without intermediaries such as game developers or publishers.
Blockchain games also provide a new way for developers to monetize their games. In traditional games, developers earn income from game sales or in-game purchases. In blockchain games, developers can create and sell unique in-game items or currency that players can buy, sell, or trade with each other.
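To make the ownership idea concrete, here is a heavily simplified, purely illustrative sketch of NFT-style item ownership: each item has exactly one owner, and only the current owner can transfer it. Real crypto games implement this as smart contracts on a blockchain, not as an in-memory Python class.

```python
# Heavily simplified, illustrative sketch of NFT-style item ownership:
# each item has one owner and only that owner can transfer it. Real crypto
# games implement this as smart contracts on a blockchain.
class ItemRegistry:
    def __init__(self):
        self._owners: dict[str, str] = {}   # item_id -> owner address

    def mint(self, item_id: str, owner: str) -> None:
        if item_id in self._owners:
            raise ValueError("item already exists")
        self._owners[item_id] = owner

    def transfer(self, item_id: str, sender: str, recipient: str) -> None:
        if self._owners.get(item_id) != sender:
            raise PermissionError("only the current owner can transfer this item")
        self._owners[item_id] = recipient

    def owner_of(self, item_id: str) -> str:
        return self._owners[item_id]

registry = ItemRegistry()
registry.mint("legendary-sword-001", "player_alice")
registry.transfer("legendary-sword-001", "player_alice", "player_bob")
print(registry.owner_of("legendary-sword-001"))  # -> player_bob
```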