In early August Patently Apple posted a patent report titled “Apple Invents an Augmented Reality Windshield that even Supports FaceTime Calls between Different Vehicles.” Today the US Patent & Trademark Office published a patent application from Apple that reveals more aspects of Apple’s augmented reality assisted windshield invention, which could be used on manned and unmanned vehicles, aerial vehicles, submersible vehicles and more. Apple even envisions that someday AR windshields will be used in driver education, in both real and simulator environments.
Apple begins their patent application with a broad overview. Autonomous machines, including robots, aerial vehicles, aerospace vehicles, submersible vehicles, automobiles, other ground vehicles, and the like, are capable of operating within their respective environments with limited input from a user.
Conventional autonomous machines generally fail to compensate for this lack of operational engagement between the user and the autonomous machine. As a result, the user may be unaware of future actions planned by the autonomous machine and/or lack an understanding of a basis of a decision or action of the autonomous machine, which may unnecessarily decrease a confidence of the user in the autonomous machine or otherwise make the user uneasy.
It is with these observations in mind, among others, that various aspects of the present invention were conceived and developed.
Apple’s invention generally relates to systems and methods for interaction between a user and a machine, such as an autonomous machine, for example an autonomous vehicle.
Generally, a primary interface controller generates an interactive user interface for interaction between the user and the machine. The interactive user interface facilitates user understanding of statuses, actions, and decisions of the machine, while permitting the user to impact or control various operations of the machine.
For example, the interactive user interface may permit the user to dynamically modify a route of the machine along a path of travel. In one particular aspect, the interactive user interface includes a timeline, a communication layer, and indications to provide simple and intuitive means for the user to interact with and understand the machine.
The various systems and methods disclosed generally provide for interaction between a user and a machine. The example implementations discussed in Apple’s patent application reference an interactive user interface for a manned autonomous vehicle for transporting a user. However, it will be appreciated by those skilled in the art that the presently disclosed technology is applicable in other human machine interface (HMI) contexts and to other manned and unmanned autonomous machines, including, without limitation, robots, aerial vehicles, aerospace vehicles, submersible vehicles, ground vehicles, and the like.
The presently disclosed technology may be further applicable to other types of machines and user devices, such as a personal computer, workstation, terminal, mobile device, smartphone, tablet, multimedia console, and other computing devices.
Next Generation Driving School Tool
In Apple’s patent FIG. 9 below we’re able to see that the interactive user interface #100 includes a virtual reality interface #324 presenting various interactive virtual features that simulate a physical presence and experience.
For example, the interactive windshield interface #104 may turn opaque, such that a user cannot see through the windshield, and display a simulated environment for a student to learn how to drive.
In addition, the interactive windshield interface #104 may permit the transmission of light and overlay artificial elements through augmented reality as a learning course while the student is driving.
In some cases, the interactive dashboard interface #102 provides driving feedback #326, including instructions, recommendations for improving technique, scoring, and/or the like.
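The patent doesn’t specify how driving feedback would be computed, but the idea of scoring a student’s technique and offering recommendations can be sketched in a few lines. The metric names and thresholds below are purely illustrative assumptions, not details from Apple’s filing.

```python
# Hypothetical sketch of a driving-education feedback scorer.
# Metrics and thresholds are invented for illustration only.

def score_maneuver(speed_mph, limit_mph, lane_offset_m, braking_g):
    """Return a 0-100 score and a list of improvement tips for one maneuver."""
    score = 100
    tips = []
    if speed_mph > limit_mph:
        # Deduct 3 points per mph over the limit, capped at 30.
        score -= min(30, 3 * (speed_mph - limit_mph))
        tips.append("Reduce speed to the posted limit.")
    if abs(lane_offset_m) > 0.5:  # metres from lane centre
        score -= 20
        tips.append("Keep the vehicle centred in the lane.")
    if braking_g > 0.4:  # harsh-braking threshold in g
        score -= 15
        tips.append("Brake earlier and more gently.")
    return max(score, 0), tips
```

A dashboard interface like #102 could render the returned score and tips after each maneuver, whether in a simulated or a real driving lesson.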
Find me a Coffee shop on my Route
In another example, we’re able to see in Apple’s patent FIG. 4 above that the user may interact with timeline #300 to express an interest in stopping for coffee on the way to the final destination point #308.
In one implementation, the timeline may automatically offer coffee as a waypoint option based on user preferences or history. In another implementation, the user simply asks Siri to “Find me coffee” and an option is provided in context with the user’s final destination.
The options may be displayed and selected in a variety of manners. For example, as shown in FIG. 4, the options may be presented in a message bubble #310 where the user can drag and drop one of the options onto the timeline or tap one of the options for more information.
Alternatively or additionally, the options may be projected into a field of view of the user through augmented reality using the interactive window interface 104.
The projected options may appear at their respective locations in the field of view, as shown in FIG. 4. To select one of the options, the user may point to it.
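The drag-and-drop interaction described above amounts to inserting a new stop into an ordered route timeline. A minimal sketch of that data flow, with invented class and field names (the patent gives no implementation details):

```python
# Hypothetical sketch: adding a waypoint to a route timeline in ETA order.
from dataclasses import dataclass

@dataclass
class Waypoint:
    name: str
    eta_minutes: int  # estimated minutes from trip start

def insert_waypoint(timeline, stop):
    """Insert a stop into the timeline, keeping waypoints in ETA order."""
    timeline.append(stop)
    timeline.sort(key=lambda w: w.eta_minutes)
    return timeline

# A coffee stop dropped onto the timeline lands between origin and destination.
route = [Waypoint("Origin", 0), Waypoint("Final destination", 45)]
insert_waypoint(route, Waypoint("Coffee shop", 20))
```

In a real system the interface would also re-plan the route and update downstream ETAs, but the user-facing gesture reduces to this kind of ordered insertion.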
Augmented Reality Assisted Windshield
As noted at the top of our report, we covered Apple’s first AR Windshield patent back in August that elaborately detailed an autonomous vehicle navigation system. It also covered a wide range of elements involved in a future autonomous vehicle, such as internal and external sensors, control elements, vehicle interfaces and the ability to hold a FaceTime chat using the windshield as the display.
Today the U.S. Patent Office published Apple’s second patent application covering an AR windshield, further describing this system.
Apple notes that the interactive window interface #104, shown in the patent figures presented above, provides a view of an environment external to the autonomous vehicle. The view may be a live direct view through a transparent cover disposed in an opening in the autonomous vehicle, such as a windshield, window, porthole, door, and/or the like.
The view may alternatively be a live indirect view reproduced using one or more output devices. In one implementation, the interactive window interface #104 provides an augmented reality view where the external environment is interactive and digitally manipulable.
Stated differently, the interactive window interface may utilize augmented reality techniques, along with computer vision techniques, to modify one or more elements in the external environment and/or to overlay interactive features or information.
The interactive user interface includes one or more interactive features generated based on machine status information. The machine status information may include, without limitation, navigation information, vehicle motion information, vehicle perception information, vehicle status information, vehicle environment information, and/or the like.
The navigation information may include any information pertaining to a travel path of the machine from an origination point to one or more destination points. For example, the navigation information may include, without limitation, mapping information; location information; environmental condition information, such as traffic information, obstacle information, weather information, road closure information; and/or the like.
The vehicle motion information includes data pertaining to movement of the autonomous vehicle, such as location information, inertial information, and kinetics.
The vehicle perception information provides the autonomous vehicle’s perception of the external environment and may be generated from raw data of the external environment captured with one or more sensors.
The vehicle status information pertains to a status of each of the subsystems of the vehicle. For example, the vehicle status information may indicate an operational status of various engine components, fuel level, battery charge, tire pressure, window orientation (e.g., open or closed), and statuses of other interior and exterior components.
The vehicle environment information relates to interior vehicle services, such as climate control, music and sound control, seat orientation, and other user experience services.
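The five categories of machine status information enumerated above map naturally onto a simple grouped data structure. The sketch below is an assumption about how such data might be organized; the class and field names are invented, and the patent prescribes no particular representation.

```python
# Hypothetical grouping of the patent's five machine-status categories.
from dataclasses import dataclass, field

@dataclass
class MachineStatus:
    navigation: dict = field(default_factory=dict)   # mapping, traffic, weather, closures
    motion: dict = field(default_factory=dict)       # location, inertial data, kinetics
    perception: dict = field(default_factory=dict)   # sensor-derived environment model
    vehicle: dict = field(default_factory=dict)      # engine, fuel, battery, tires, windows
    environment: dict = field(default_factory=dict)  # climate, audio, seat orientation

# An interface controller could populate only the categories it has data for.
status = MachineStatus(vehicle={"battery_charge": 0.82, "tire_pressure_psi": 34})
```

A primary interface controller like the one described earlier could consume an object of this shape to generate the interactive features shown on the dashboard and window interfaces.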
Apple’s patent application was originally filed back in Q1 2018. Considering that this is a patent application, the timing of such a product to market is unknown at this time.
Patently Apple presents a detailed summary of patent applications and/or granted patents with associated graphics for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application should be read in its entirety for full and accurate details.