Apple Vision Pro
Early in the Apple Vision Pro development cycle, the company toyed with bespoke VR controllers for the headset, but ultimately decided that eye and finger tracking with cameras was the way to go.
At some point in early development, Apple reportedly considered a device like a smart ring to control the Apple Vision Pro. According to the latest newsletter from Mark Gurman, the company also reportedly tested third-party VR controllers, at least from HTC.
And it apparently has no intention of developing its own VR controller. Apple's distaste for this kind of controller also extends to a lack of support for third-party controllers, at least for now.
The Apple Vision Pro does, however, work with game controllers from Microsoft and Sony that are intended for consoles. Apple's macOS and iOS support a wide array of controllers as it stands now, and it's not clear whether that full range will work on the Vision Pro.
Apple has an in-air keyboard for the Vision Pro, of course. The headset will also support a Bluetooth keyboard, or one connected to a Mac.
Apple's work on a "smart ring" may have been a predecessor to Apple Vision Pro
The company has been working on smart ring technology for some time, researching both the rings themselves and accessories for them.
Most recently, a newly granted patent, "Skin-To-Skin Contact Detection," covers multiple ways of detecting "contact or movement gestures between a first body part and a second body part," such as snapping fingers or the gestures shown in the WWDC keynote that revealed the Apple Vision Pro. That includes options that seem more relevant to Apple Watch bands, or even just a person's hands interlocking, but it comes down to both skin and gesture detection.
One of the example illustrations shows how a person's hand position changes when they press their finger and thumb together. A second illustration shows the moment when "the index finger is now making contact with the thumb."
Apple's proposed system for that smart ring, and likely for the Apple Vision Pro too, would recognize the touch and also generate "a sense output signal when the index finger and thumb make and break contact."
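To illustrate the idea, here is a minimal Swift sketch of how a "make and break contact" signal could be derived from a stream of per-frame contact readings. The type names and logic are hypothetical assumptions for illustration only; they are not drawn from Apple's patent or from any Apple API.

```swift
import Foundation

// Hypothetical event type, mirroring the patent's "make and break contact" idea.
enum PinchEvent {
    case contactMade    // index finger and thumb have just touched
    case contactBroken  // index finger and thumb have just separated
}

// A minimal state machine: feed it one contact reading per frame and it emits
// an event only when the contact state changes, i.e. on "make" and "break".
struct PinchContactDetector {
    private var isTouching = false

    mutating func update(contactDetected: Bool) -> PinchEvent? {
        defer { isTouching = contactDetected }
        if contactDetected && !isTouching { return .contactMade }
        if !contactDetected && isTouching { return .contactBroken }
        return nil
    }
}

// Example: a short stream of per-frame contact readings from some sensor.
var detector = PinchContactDetector()
for reading in [false, false, true, true, false] {
    if let event = detector.update(contactDetected: reading) {
        print(event) // prints contactMade, then contactBroken
    }
}
```

The sketch only shows the edge-detection step; in a real system the per-frame contact reading would come from the skin or camera sensing the patent describes.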
Other examples of Apple's research on gesture detection show a user pressing a finger of one hand into the palm of the other, which in theory doesn't necessitate a physical controller.