Okay, Apple has been granted another patent – but this time it’s a goodie.
Yesterday Apple was awarded a patent for an augmented reality system that lets users tag objects on screen in live video relayed by the phone’s camera, bringing up information about those real-world items in a heads-up display. In English: the key feature lets you build visual, interactive layers on top of reality with your phone, map and annotate data, and share that data with other iOS users.
Basically, it’s Google Glass on a phone screen. Which is why this patent is important.
“Information about the real world environment can be stored and retrieved as an information layer which can be overlaid on the live view and interacted with by a user,” Apple explained in the filing.
Still confused? Maybe the picture below will help – it shows an example of some of the app’s features.
In one example given by Apple, a teacher could hold the device over a student’s exam paper and “an outline showing incorrect answers to exam questions can be displayed in the live video to assist the teacher in grading the exam paper.”
“Data can be received from one or more onboard sensors indicating that the device is in motion,” the filing said. “The sensor data can be used to synchronize the live video and the information layer as the perspective of the video camera view changes due to the motion. The live video and information layer can be shared with other devices over a communication link.”
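To make the sensor-sync idea concrete: as the camera pans, any on-screen tag has to slide across the frame so it stays pinned to its real-world object. Here’s a minimal sketch of that one piece – this is not Apple’s implementation, and the focal length, frame size, and function name are all illustrative assumptions (a simple pinhole-camera, horizontal-pan model):

```python
import math

# Assumed camera parameters (illustrative, not from the patent).
FOCAL_PX = 800.0     # focal length expressed in pixels
FRAME_WIDTH = 1280   # frame width in pixels

def reproject_x(tag_x, pan_radians):
    """Return a tag's new horizontal screen coordinate after the camera
    pans right by `pan_radians`. In a pinhole model the scene shifts by
    roughly focal_length * tan(angle) pixels in the opposite direction,
    so the overlay annotation must slide left to stay on its object."""
    shift = FOCAL_PX * math.tan(pan_radians)
    return tag_x - shift

# A tag at the frame centre (x = 640); the gyroscope reports a 0.1 rad
# pan to the right, so the annotation moves left by about 80 pixels.
new_x = reproject_x(640.0, 0.1)
```

A real system would do this per frame for full 3D rotation and translation (and the patent talks about fusing data from multiple onboard sensors), but the principle is the same: sensor motion in, overlay offset out.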
[Source: Mashable]