
Microsoft Project Acoustics Details Point to New Era of Audio on Xbox Series X


Last month, Microsoft discussed its audio ray-tracing solution for the Xbox Series X. Called Project Acoustics, the software “models wave effects like occlusion, obstruction, portaling and reverberation effects in complex scenes without requiring manual zone markup or CPU intensive raytracing.”

In a blog post last year, Microsoft compared the solution to integrating static lighting meshes, which allows shadows and light sources to be calculated ahead of time. The solution is now available to developers in the Unity engine.

“Ray-based acoustics methods can check for occlusion using a single source-to-listener ray cast, or drive reverb by estimating local scene volume with a few rays. But these techniques can be unreliable because a pebble occludes as much as a boulder. Rays don’t account for the way sound bends around objects, a phenomenon known as diffraction. Project Acoustics’ simulation captures these effects using a wave-based simulation. The acoustics are more predictable, accurate and seamless.”

Last month, Microsoft confirmed some core specs of its upcoming Xbox Series X console. During the announcement, the company said audio ray-tracing delivered by Project Acoustics was already part of the Unity game engine.

It is available for Unity as drag-and-drop middleware. Developers can apply the software to Unity audio sources through a C# controls component attached to each audio object.

Workflow

On its documentation page, Microsoft details the workflow for Project Acoustics:

Pre-bake: Start by setting up the bake: select which geometry responds to acoustics (ignoring light shafts, for example), then edit the automatic material assignments and select navigation areas to guide listener sampling. There is no manual markup for reverb, portal, or room zones.

Bake: An analysis step runs locally, performing voxelization and other geometric analysis on the scene based on the selections above. Results are visualized in the editor to verify the scene setup. On bake submission, the voxel data is sent to Azure, and you get back an acoustics game asset.

Runtime: Load the asset into your level, and you're ready to hear its acoustics. Design the acoustics live in the editor using granular per-source controls. The controls can also be driven from level scripting.
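Since the runtime step centers on a per-source C# controls component that level scripts can drive, a minimal Unity sketch may help illustrate the idea. The component and property names below are hypothetical stand-ins, not the actual Project Acoustics API:

```csharp
using UnityEngine;

// Hypothetical per-source acoustics controls, standing in for the C#
// controls component Project Acoustics attaches to each audio object.
public class AcousticsSourceControls : MonoBehaviour
{
    [Range(0f, 2f)] public float occlusionMultiplier = 1f; // scales baked occlusion
    [Range(0f, 1f)] public float wetness = 0.5f;           // reverb send level
}

// Example level script: when the player enters a large hall, open up
// the reverb by driving the per-source controls at runtime.
public class HallTransition : MonoBehaviour
{
    public AcousticsSourceControls controls; // assigned in the Inspector

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            controls.wetness = 1f;               // full reverb in the hall
            controls.occlusionMultiplier = 0.8f; // soften occlusion slightly
        }
    }
}
```

The point of the design is that the bake supplies the physically simulated baseline, while per-source controls like these let sound designers adjust the result artistically without re-baking.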

Source: Winbuzzer

Chioma Ugochukwu
