Today the US Patent & Trademark Office published a patent application from Apple that relates to extended reality (XR) environments and to how future devices, like their Mixed Reality Headset, will be able to provide users with visualizations of non-visible phenomena. No, this isn't a script for a future Sci-Fi movie, but it certainly reads like one at times. Describing a user of Augmented Reality eyewear as someone seeing "non-visible phenomena," based on environmental sensor feedback, is a new descriptive twist. This could apply to both consumer and industrial uses.
To begin with, Apple clarifies that their future mixed reality devices could take on many different forms such as head mountable systems, projection-based systems, heads-up displays (HUDs), vehicle windshields having integrated display capability, windows having integrated display capability, displays formed as lenses designed to be placed on a person's eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, input systems (e.g., wearable or handheld controllers with or without haptic feedback), smartphones, tablets, and desktop/laptop computers. Projection systems may also be configured to project virtual objects into the physical environment, for example, as a hologram or on a physical surface.
XR systems display a virtual representation of non-visible features of a physical environment, such that a user of the XR system perceives the non-visible features at the location of the non-visible features in the physical environment.
For example, a device may detect, and/or receive information regarding, one or more non-visible features within a direct or pass-through field of view of a physical environment, and display a visualization of those detected non-visible features at the correct location of those features in the physical environment.
For example, responsive to the detection of a non-visible feature of a physical environment, the device may display a visualization of the non-visible feature overlaid on the view of the physical environment at a location that corresponds to the detected feature.
The non-visible features may correspond to, for example, electromagnetic signals such as Wi-Fi signals, airflow from an HVAC system, temperatures of physical objects, fluids or gases, an audible fence created for a pet (e.g., using ultrasonic pitches), sounds generated by a musical instrument, and/or hidden physical objects such as objects with known locations that are obscured from view by other physical objects (as examples).
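Strip away the patent language and the underlying process is a simple sense-classify-overlay loop: poll the specialized sensors, work out what and where the phenomenon is, and pin a visible proxy to that spot in the passthrough view. Below is a minimal Swift sketch of that loop; the `PhenomenonSensor` protocol, the feature type and the overlay step are all hypothetical stand-ins, not Apple's actual implementation.

```swift
// Hypothetical categories drawn from the patent's examples.
enum PhenomenonKind {
    case wifiSignal, airflow, surfaceTemperature, gasLeak, ultrasonicFence, instrumentSound
}

// A detected non-visible feature: what it is, where it is, how strong it is.
struct NonVisibleFeature {
    let kind: PhenomenonKind
    let worldPosition: SIMD3<Float>  // meters, in the device's world frame
    let magnitude: Float             // normalized 0...1 sensor reading
}

// Stand-in for whatever specialized sensors the device would expose.
protocol PhenomenonSensor {
    func detectFeatures() -> [NonVisibleFeature]
}

// Core loop: detect each feature, then anchor a visual proxy for it at the
// feature's real-world location so the wearer perceives it in place.
func renderOverlays(from sensor: PhenomenonSensor) {
    for feature in sensor.detectFeatures() {
        // A real system would spawn an anchored 3D entity in the passthrough
        // view; this sketch just describes the overlay that would be drawn.
        print("Overlay \(feature.kind) at \(feature.worldPosition), intensity \(feature.magnitude)")
    }
}
```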
This takes augmented reality to a whole new level.
If the HMD, smartglasses or other eyewear detects a non-visible phenomenon (via specialized sensors), the device will then allow the user to actually see the range of an ultrasonic fence or see where a dangerous gas actually is within their view.
Apple's patent FIG. 6 below illustrates a flow chart of an example process for providing computer-generated visualizations of non-visible phenomena.

In the example of FIG. 3, a visualization #302 of a non-visible ultrasonic fence generated by ultrasonic fencing device #210 is also displayed, to be perceived by a user of electronic device #105 at the three-dimensional location of the non-visible ultrasonic fence.
Apple's patent FIG. 4 below illustrates other visualizations of non-visible phenomena that can be provided by an eyewear device such as electronic device #105. In the example of FIG. 4, a visualization #400 of airflow from vent #206 is illustrated. Here the visualization is a representation, in the visible light spectrum, of airflow that exists outside the visible light spectrum, based on a visual context that includes wavy lines with a wave frequency and a separation that increase with decreasing airflow.
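That wavy-line encoding reduces to a function from the measured flow to two drawing parameters. A short, hedged Swift sketch of the mapping follows; the numeric ranges and the `AirflowWaveStyle` type are illustrative assumptions, not values from the filing.

```swift
// Drawing parameters for the wavy-line airflow visualization.
struct AirflowWaveStyle {
    let waveFrequency: Float   // oscillations per meter of drawn line
    let lineSeparation: Float  // meters between adjacent wavy lines
}

// Per the figure description: wave frequency and line separation both
// increase as airflow decreases. `flow` is a normalized reading in 0...1.
func waveStyle(forFlow flow: Float) -> AirflowWaveStyle {
    let f = max(0, min(1, flow))  // clamp to the valid range
    return AirflowWaveStyle(
        waveFrequency: 2.0 + 8.0 * (1.0 - f),    // assumed range: 2...10 waves/m
        lineSeparation: 0.02 + 0.08 * (1.0 - f)  // assumed range: 2...10 cm
    )
}
```

At full flow the lines would be drawn tight and close together; as the vent weakens, they spread out, matching the figure's description.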
The airflow may be detected using an airflow sensor in the eyewear device that detects airflow directly from the movement of air through the sensor and/or indirectly.
In Apple's patent FIG. 5 above, a musical instrument implemented as a guitar #500 is shown being tuned using a visualization #501 of the non-visible sound generated by a string of the guitar. In this example, the visualization is generated based on a visual context that includes two sine waves such as a first sine wave #502 and a second sine wave #504.
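One plausible reading of the two-wave display is a reference tone drawn against the detected string pitch, with detuning visible as drift between the two curves. The Swift sketch below works through that comparison; the target frequency, detected pitch and sampling window are assumptions for illustration, not details from the patent.

```swift
import Foundation

// Sample a sine wave at a given frequency over a short time window.
func sineWave(frequency: Double, sampleCount: Int = 64,
              window: Double = 0.02) -> [Double] {
    (0..<sampleCount).map { i in
        let t = window * Double(i) / Double(sampleCount - 1)
        return sin(2 * .pi * frequency * t)
    }
}

// Compare the detected string pitch against the target pitch for a string.
// When the two waves coincide, the string is in tune.
let targetA2 = 110.0   // standard guitar A string, Hz
let detected = 112.3   // hypothetical pitch estimated from the microphone
let referenceWave = sineWave(frequency: targetA2)
let detectedWave = sineWave(frequency: detected)
let maxDrift = zip(referenceWave, detectedWave)
    .map { abs($0 - $1) }
    .max() ?? 0
print(String(format: "Detuned by %+.1f Hz, peak wave drift %.2f",
             detected - targetA2, maxDrift))
```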
For more details, review Apple's patent application number 20220292821.
Apple's patent application lists the following inventors:

Ying Bai: AR/VR Software QA Engineering Manager
Marco Cavallo: Data Visualization & Machine Learning Tools
Arun Srivatsan Rangaprasad: Researcher, developing algorithms for AR/VR systems.
Tiejian Zhang: Computer Vision & Machine Learning Engineer
Kieran Dimond: Engineering Project Manager who joined Meta Reality Labs 9 months ago.
Posted by Jack Purcher on September 15, 2022 at 04:38 AM in 1A. Patent Applications