Okay, Apple has been granted another patent – but this time it’s a goodie.
Yesterday Apple was awarded a patent for an augmented reality system that lets users tag objects on screen in live video from the phone’s camera, pulling up information about those real-world items in a heads-up display. In English: this key feature enables you to build visual, interactive layers on top of reality with your phone, to map and annotate data, and to share that data with other iOS users.
Basically, it’s Google Glass on a phone screen. Which is why this patent is important.
“Information about the real world environment can be stored and retrieved as an information layer which can be overlaid on the live view and interacted with by a user,” Apple explained in the filing.
Still confused? Maybe the picture below will help – it shows an example of some of the features the app would offer.
In one example given by Apple, a teacher could hold the device over a student’s exam paper and “an outline showing incorrect answers to exam questions can be displayed in the live video to assist the teacher in grading the exam paper.”
“Data can be received from one or more onboard sensors indicating that the device is in motion,” the filing said. “The sensor data can be used to synchronize the live video and the information layer as the perspective of video camera view changes due to the motion. The live video and information layer can be shared with other devices over a communication link.”
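For the geeks: the patent doesn’t reveal how Apple would actually build this, but the “use sensor data to keep the information layer in sync” idea maps neatly onto Apple’s public Core Motion framework. Here’s a rough, hypothetical Swift sketch – the view controller, the annotation text and the tuning constant are all made up for illustration – that nudges an on-screen tag as the device rotates, so it seems stuck to a real-world spot:

import UIKit
import CoreMotion

// Toy sketch of an "information layer" pinned over a live camera view.
final class AnnotationOverlayViewController: UIViewController {
    private let motionManager = CMMotionManager()
    private let tagLabel = UILabel()   // stand-in for one tagged annotation
    private var referenceYaw: Double?  // device yaw captured when the tag was placed

    override func viewDidLoad() {
        super.viewDidLoad()
        // A real app would put an AVCaptureVideoPreviewLayer underneath;
        // it's omitted here to keep the sketch short.
        tagLabel.text = "Coffee shop · 4.5 stars"   // made-up annotation
        tagLabel.textColor = .white
        tagLabel.sizeToFit()
        tagLabel.center = view.center
        view.addSubview(tagLabel)

        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self = self, let motion = motion else { return }
            let yaw = motion.attitude.yaw
            if self.referenceYaw == nil { self.referenceYaw = yaw }
            // Slide the tag opposite to the device's rotation so it appears
            // anchored to the same real-world spot (crude single-axis model).
            let delta = CGFloat(yaw - (self.referenceYaw ?? yaw))
            let pixelsPerRadian: CGFloat = 500   // arbitrary tuning constant
            self.tagLabel.center.x = self.view.bounds.midX - delta * pixelsPerRadian
        }
    }

    deinit { motionManager.stopDeviceMotionUpdates() }
}

Apple’s actual system would obviously do far more (the live camera feed, object recognition, sharing over a communication link), but that’s the basic trick the filing describes: the sensor data drives the overlay, not the video itself.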
[Source:Mashable]