Now that the WebXR Hand API has an experimental implementation in the wild (as covered in issue #12), developers like Stewart Smith are building higher-level hand utilities. This week Smith released Handy.js, a library for defining and detecting complex hand gestures. Combined with rendering libraries like Three.js, PlayCanvas (see below), and Babylon.js, Handy.js makes it possible for users to see and intuitively use their hands on the immersive web.
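To give a flavor of the kind of logic a gesture library builds on, here is a minimal sketch of a pinch check using joint positions like those the WebXR Hand API reports. The function names and the 0.02 m threshold are illustrative assumptions, not Handy.js's actual API.

```javascript
// Hypothetical sketch: detect a pinch by measuring the distance between
// the thumb-tip and index-finger-tip joints. Positions are plain
// {x, y, z} objects in meters, as WebXR joint poses report them.
// The 2 cm threshold is an assumption for illustration.

function distance(a, b) {
  const dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

function isPinching(thumbTip, indexTip, threshold = 0.02) {
  return distance(thumbTip, indexTip) < threshold;
}
```

In a real session, an app would obtain these positions each frame via the standard WebXR call `frame.getJointPose(inputSource.hand.get('thumb-tip'), referenceSpace)` and feed them into checks like this; libraries such as Handy.js wrap that plumbing and expose named gestures instead.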
Following the recent release of PlayCanvas version 1.33.0, which added support for the experimental WebXR Hand API, prolific developer Max M created a demo with support for WebXR Controller Profiles. Together, these two technologies ensure that users with or without tracked controllers can enjoy immersive web experiences without much extra effort from developers.
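Controller profiles work by having each `XRInputSource` expose a `profiles` array ordered from most to least specific; a renderer walks that list and loads the first controller model it has an asset for. A minimal sketch of that matching step, with an illustrative (not real) asset registry:

```javascript
// Sketch of WebXR controller-profile matching. An XRInputSource's
// `profiles` array runs from most to least specific, e.g.
// ['oculus-touch-v2', 'oculus-touch', 'generic-trigger'].
// The registry contents below are assumptions for illustration.

function selectProfile(profiles, availableAssets, fallback = 'generic-trigger') {
  for (const id of profiles) {
    if (availableAssets.has(id)) return id; // first match wins
  }
  return fallback; // no known profile: show a generic controller
}
```

This fallback chain is what lets one demo serve both users with vendor-specific tracked controllers and users whose hardware only matches a generic profile.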
Three pieces of history landed on the immersive web this week. First, the rather amazing museum at Bletchley Park released a detailed immersive scan of a wide variety of antique computers, including the WWII-era Bombe and Enigma machines. Second, the famous San Francisco Bay area hackerspace, Noisebridge, was scanned and released on Sketchfab. Finally, noclip (an amazing collection of classic video game levels) now has WebXR support for people who want to escape 2020 by traveling through a few old maps.
Khronos revealed that DGG's RapidCompact web-based service now supports the latest glTF standard. DGG's tool converts massive computer-aided design (CAD) files and detailed location scans (see previous article) into standards-based, web-sized files that won't saturate users' net connections or burn through their batteries.