Case Study: Optimization of a PC-developed Metaverse for Meta Quest 2 and 3

Project Background: The project entailed adapting a visually rich and complex metaverse, originally designed for PC, for the Meta Quest 2 and 3 VR platforms. The goal was to maintain high visual quality while achieving stable performance metrics suitable for VR experiences.

Problem Statement: Initial testing on the VR platforms revealed significant performance drops, particularly when the camera was directed towards complex scenes with high polygon counts and numerous dynamic elements. Frame rates would drop below acceptable levels, affecting the immersive experience and causing visual discomfort.

Optimization Strategy:

  1. Early Assessments and Adjustments:
    • Early Z Pass: Disabled initially to assess its impact in VR; no measurable performance change was observed, confirming it could be removed from the rendering pipeline.
    • Material Optimization: Introduced binary switches in materials to selectively reduce shader complexity, drastically cutting down the number of samplers used.
  2. Mesh and LOD Management:
    • Road Mesh Merging: Simplified the environment by merging multiple road meshes into a single asset to reduce draw calls.
    • Hierarchical Instancing: Applied to large groups of similar objects (like foliage and small structures), significantly reducing draw calls while maintaining LOD control to ensure detail at appropriate distances.
  3. Character and NPC Optimization:
    • LOD for Characters: Generated distance-based LODs for character models, reducing detail as the player moved away from characters and saving rendering resources.
    • Base Class for NPCs: Centralized common behaviors and attributes into a single base class to streamline updates and optimize performance across all NPCs.
  4. Lighting Optimization:
    • Baking Local Lights: Converted all previously stationary lights to baked lights, reducing real-time computations and halving the number of draw calls.
    • Dynamic Light Adjustment: Eliminated shadows from dynamic lights and optimized settings for non-essential lighting elements, including complete removal of directional lights in favor of image-based lighting.
  5. Texture and Asset Streaming:
    • Optimized Texture Sizes: Addressed inefficient memory usage by resizing textures to power-of-two dimensions, ensuring optimal streaming and mipmapping.
    • Visibility and Culling Adjustments: Disabled precomputed visibility in favor of dynamic cull distance adjustments, enhancing performance, especially in dense scenes.
  6. Post-Processing and Rendering Settings:
    • Adjusted Post-Process Settings: Ensured that costly post-processing effects like HDR were optimized or disabled to meet the hardware constraints of the VR platforms.
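The distance-based LOD selection described in steps 2 and 3 can be sketched as follows. This is a minimal illustration of the general technique, not the engine's actual implementation; the threshold distances are hypothetical.

```python
# Minimal sketch of distance-based LOD selection (illustrative
# thresholds, not the project's actual values).

def select_lod(distance, thresholds=(10.0, 30.0, 80.0)):
    """Return the LOD index for a given camera-to-object distance.

    LOD 0 is the full-detail mesh; each subsequent index is a
    progressively simplified version of it.
    """
    for lod, limit in enumerate(thresholds):
        if distance < limit:
            return lod
    return len(thresholds)  # beyond the last threshold: coarsest LOD

print(select_lod(5.0))    # nearby character -> LOD 0
print(select_lod(50.0))   # mid-distance -> LOD 2
print(select_lod(200.0))  # far away -> LOD 3 (coarsest)
```

In practice the engine evaluates this per object per frame, which is why cheaper coarse LODs at distance translate directly into a lower triangle count.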
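The base-class refactor in step 3 can be sketched as: shared per-frame logic lives in one place, and each NPC type overrides only what differs. The class and method names below are hypothetical, not the project's actual code.

```python
# Sketch of centralizing shared NPC behavior in a base class
# (names are hypothetical, not taken from the project).

class NPCBase:
    """Common attributes and per-frame update shared by all NPCs."""

    def __init__(self, position=(0.0, 0.0, 0.0)):
        self.position = position
        self.health = 100

    def update(self, dt):
        # Shared logic (animation, LOD selection, AI ticking) runs
        # here once, instead of being duplicated in every NPC type.
        self.tick_behavior(dt)

    def tick_behavior(self, dt):
        pass  # subclasses override only what differs


class Vendor(NPCBase):
    def tick_behavior(self, dt):
        self.greeting = "Welcome!"


v = Vendor()
v.update(0.016)
print(v.greeting)  # -> Welcome!
```

Because every NPC shares one update path, a fix or optimization applied to the base class benefits all NPC types at once.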
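The power-of-two resizing in step 5 amounts to snapping each texture dimension to a power of two so mipmap chains and streaming work efficiently. The sketch below rounds to the nearest power, which is an assumption; a real pipeline might always round down to save memory.

```python
# Sketch of snapping texture dimensions to powers of two
# (rounding to the nearest power is an assumption).

def nearest_power_of_two(n):
    """Return the power of two closest to n (ties round up)."""
    if n < 1:
        return 1
    lower = 1 << (n.bit_length() - 1)  # largest power of two <= n
    upper = lower << 1
    return lower if (n - lower) < (upper - n) else upper

print(nearest_power_of_two(1000))  # -> 1024
print(nearest_power_of_two(300))   # -> 256
```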
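The dynamic cull-distance adjustment in step 5 typically ties an object's cull distance to its size: small props disappear sooner than large structures. A minimal sketch under that assumption, with hypothetical bucket values:

```python
# Sketch of size-based cull distances: smaller objects are culled
# at shorter distances. Bucket values are hypothetical, not the
# project's actual settings.

CULL_BUCKETS = [
    (1.0, 50.0),             # objects up to 1 m across: cull beyond 50 m
    (5.0, 200.0),            # up to 5 m across: cull beyond 200 m
    (float("inf"), 1000.0),  # everything larger: cull beyond 1 km
]

def cull_distance(object_size):
    for max_size, distance in CULL_BUCKETS:
        if object_size <= max_size:
            return distance

def is_culled(object_size, camera_distance):
    return camera_distance > cull_distance(object_size)

print(is_culled(0.5, 60.0))   # small prop far away -> True (culled)
print(is_culled(10.0, 60.0))  # large building nearby -> False
```

Skipping small distant objects this way reduces both draw calls and triangle count in dense scenes, which is where the pre-optimization frame drops were worst.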

Results:

  • Pre-Optimization Performance:
    • Severe frame-rate drops below 30 FPS when viewing complex scenes.
  • Post-Optimization Performance:
    • Consistent frame rates between 70 and 90 FPS on both Meta Quest 2 and 3 across various scenes.
    • Draw Calls: Reduced from 2500 to under 400.
    • Triangle Count: Decreased from 2.5 million to under 500,000.
    • LOD Pop-Ups: Virtually eliminated, ensuring a seamless visual experience.

Conclusion:

The comprehensive optimization strategy led to a successful adaptation of the metaverse for the Meta Quest 2 and 3. By addressing mesh complexity, material efficiency, lighting, and asset streaming, the team was able to overcome significant performance challenges. The improvements not only stabilized the frame rate but also maintained the visual integrity of the original PC version, making the VR experience both immersive and comfortable for users. This project serves as a benchmark for similar high-fidelity VR adaptations and underscores the importance of targeted optimizations in VR development.
