According to a new report from Ming-Chi Kuo (via 9to5Mac), a reliable analyst on all things Apple, the company has been working on an augmented reality headset and could launch the device soon. This pair of glasses could go into mass production as early as Q4 2019 and should be available at some point during the first half of 2020.
It’s still unclear what you’ll be able to do with this mysterious headset. Kuo says that it’ll work more or less like an Apple Watch: you won’t be able to use the AR headset without an iPhone, as it’ll rely heavily on the phone.
The glasses will act as a remote display, putting information right in front of your eyes, while your iPhone does the heavy lifting when it comes to internet connectivity, location services and computing. I wouldn’t be surprised if the AR headset relies on Bluetooth to communicate with your iPhone.
Kuo’s report doesn’t say what you’ll find in the headset. Apple could embed displays and sensors so that the AR headset is aware of your surroundings; an AR device only makes sense if it can detect the things around you.
Apple has already experimented with augmented reality with its ARKit framework on iOS. Developers have been able to build apps that integrate digital elements in the real world, as viewed through your phone cameras.
While many apps have added AR features, most of them feel gimmicky and don’t add much real value. There haven’t been a ton of AR-native apps, either.
One interesting use case for augmented reality is mapping. Google recently unveiled an augmented reality mode for Google Maps: you can hold your phone in front of your face to see arrows indicating where you’re supposed to go.
Apple has also been rebuilding Apple Maps with its own data. The company isn’t just drawing maps; it is collecting a ton of real-world data using LiDAR sensors and eight cameras attached to a car roof. Let’s see if Apple Maps will play an important part in Apple’s rumored AR headset.
I first encountered the founders of Litho, a new hardware and software startup developing a finger-worn controller, at London’s Pitch@Palace last April. The event sees startups pitch in front of the British royal family and other esteemed guests, and naturally the company’s young founders, 24-year-old Nat Martin (CEO) and 25-year-old Charlie Bruce (CTO), were a little overawed by the occasion, just like many of the other founders pitching that day. However, perhaps unbeknownst to them, Litho was also one of the more notable companies, not least because, as the saying goes, hardware is hard.
Fast-forward to today and the young company is ready to show the world the first publicly available iteration of what it has been building: an innovative finger-worn device that provides control over various “spatial interactions” and should find applications ranging from AR and VR to the smart home and the control of other IoT devices. The next stage for Litho is to offer the controller and access to its SDK to developers who join the startup’s beta programme for $199/£179.
“Computing is increasingly structured around the real world rather than the desktop,” says Litho’s Nat Martin. “With the advent of smart devices such as lights, thermostats and door locks, physical things are becoming digitally connected. Equally, with the advent of AR, digital things are becoming physically anchored in the real world. These are two sides of the same coin — digital interactions are entering physical space.”
The status quo, however, is for the smartphone to be the primary interface for these spatial interactions, even though smartphones were designed to interact with 2D content on screens and are struggling to make the leap. “Trying to interact with objects in the real world through a smartphone is like trying to do heart surgery with a spork,” says Martin. “More often than not our phones end up being a frustrating barrier to the digital world, rather than a tool to enable interactions with it.”
Solving this problem requires a combination of hardware and software. The Litho device itself is described as an unobtrusive finger-worn controller that connects via Bluetooth Low Energy to a smartphone or AR headset. The controller has a capacitive touch surface on the underside, which allows for precise 2D input, scrolling and tapping. More significantly, it also has an array of motion sensors and provides haptic feedback.
The Litho SDK uses the popular 3D game development platform Unity, and Martin says developers will be able to make apps that can not only identify the direction (or vector) in which the wearer is pointing, but also what they are pointing at in the real world. It also provides an interaction framework of off-the-shelf solutions for core interactions, including templates for tools such as object creation, movement and deletion, making it easier for developers to quickly build “delightful and intuitive experiences.”
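Litho’s actual SDK is a Unity package, but the core idea behind resolving “what the wearer is pointing at” is a ray cast: take the controller’s position and direction vector and intersect that ray with the objects the app knows about. A minimal, illustrative sketch in Python (the function and the scene here are hypothetical, not part of Litho’s API):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def raycast_sphere(origin, direction, center, radius):
    """Distance along the ray from origin to the first hit on a sphere,
    or None if the ray misses."""
    norm = math.sqrt(dot(direction, direction))
    d = tuple(x / norm for x in direction)            # normalise the pointing vector
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * dot(oc, d)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None                                   # pointing past the object
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0 else None                      # hit must be in front of us

# Wearer at the origin pointing along +z toward a hypothetical smart bulb,
# modelled as a 25 cm sphere two metres away.
hit = raycast_sphere((0, 0, 0), (0, 0, 1), (0, 0, 2.0), 0.25)
```

An app would run a test like this against each connected device or virtual object and act on the nearest hit, which is essentially what “point and swipe to dim your lights” requires.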
“Having an input device designed from the ground up for 3D interaction opens a whole new paradigm of mobile interactions,” he adds. “Instead of an awkward and frustrating interface, developers can create precise yet effortless interactions in 3D space. This opens up a whole new range of use cases — architects and designers can create precise 3D models in the context of the real world, and gamers can create a virtual theme park in their back garden simply by pointing and drawing. At home, instead of opening up a smartphone app, searching for the right bulb and operating a virtual dimmer, you can simply point and swipe to dim your lights.”
Meanwhile, Litho has already picked up a number of notable investors. The burgeoning startup has raised an undisclosed amount of seed funding from U.S. venture firm Greycroft, Paul Heydon (an early investor in Unity and Supercell) and Chris Albinson (who co-led investments in DocuSign, Pinterest and Turo), along with several other unnamed angel investors.
Researchers at the New Jersey Institute of Technology, while testing the “station keeping” functions of the glass knifefish, have created an augmented reality system that tricks the animal’s electric sensing organs in real time. The fish keeps itself hidden by moving inside its refuge, and the researchers wanted to understand what kind of autonomous sensing functions it uses to keep itself safe.
“What is most exciting is that this study has allowed us to explore feedback in ways that we have been dreaming about for over 10 years,” said Eric Fortune, associate professor at NJIT. “This is perhaps the first study where augmented reality has been used to probe, in real time, this fundamental process of movement-based active sensing, which nearly all animals use to perceive the environment around them.”
The fish isn’t wearing a headset; instead, the researchers simulated the motion of a refuge waving in the water.
“We’ve known for a long time that these fish will follow the position of their refuge, but more recently we discovered that they generate small movements that reminded us of the tiny movements that are seen in human eyes,” said Fortune. “That led us to devise our augmented reality system and see if we could experimentally perturb the relationship between the sensory and motor systems of these fish without completely unlinking them. Until now, this was very hard to do.”
To create their test, they put a fish inside a tube (its refuge) and synced the motion of the tube to the fish’s movements. As the fish swam forward and backward, the researchers watched to see what happened when the fish could see that it was directly affecting the motion of the refuge. When they synced the refuge to the motion of the fish, they were able to confirm that the fish could tell the experience wasn’t “real” in a natural sense. In short, the fish knew it was in a virtual environment.
“It turns out the fish behave differently when the stimulus is controlled by the individual versus when the stimulus is played back to them,” said Fortune. “This experiment demonstrates that the phenomenon that we are observing is due to feedback the fish receives from its own movement. Essentially, the animal seems to know that it is controlling the sensory world around it.”
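The closed-loop versus playback distinction Fortune describes can be sketched with a toy model. This is an illustration of the concept only, not the researchers’ actual paradigm: at each timestep the fish nudges toward the refuge, and in the “augmented reality” condition the rig also moves the refuge in response to the fish’s own movement.

```python
def simulate(gain, feedback, steps=200):
    """Toy tracking loop. Each step the fish nudges toward the refuge;
    with feedback > 0 the rig also moves the refuge in response to the
    fish's own movement (closed loop), while with feedback = 0 the
    refuge trajectory is independent of the fish (open-loop playback)."""
    fish, refuge = 0.0, 1.0
    errors = []
    for _ in range(steps):
        move = gain * (refuge - fish)    # fish tracks the refuge
        fish += move
        refuge += feedback * move        # "augmented reality" coupling
        errors.append(refuge - fish)
    return errors

closed = simulate(gain=0.1, feedback=0.5)     # fish partly controls the stimulus
open_loop = simulate(gain=0.1, feedback=0.0)  # pure playback
# With these (made-up) parameters, the tracking error shrinks by 5% per
# step in the closed loop versus 10% per step in playback: the same motor
# commands now have different sensory consequences.
```

That difference in sensory consequences for identical motor commands is the signature the researchers looked for when they concluded the fish “knows” it is controlling the stimulus.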
Whether or not the fish can play Job Simulator is still unclear.
“Our hope is that researchers will conduct similar experiments to learn more about vision in humans, which could give us valuable knowledge about our own neurobiology,” said Fortune. “At the same time, because animals continue to be so much better at vision and control of movement than any artificial system that has been devised, we think that engineers could take the data we’ve published and translate that into more powerful feedback control systems.”
“Yeah! Well of course we’re working on it,” Facebook’s head of augmented reality Ficus Kirkpatrick told me when I asked him if Facebook was building AR glasses at TechCrunch’s AR/VR event in LA. “We are building hardware products. We’re going forward on this . . . We want to see those glasses come into reality, and I think we want to play our part in helping to bring them there.”
This is the clearest confirmation we’ve received yet from Facebook about its plans for AR glasses. The product could be Facebook’s opportunity to own a mainstream computing device on which its software could run after a decade of being beholden to smartphones built, controlled, and taxed by Apple and Google.
This month Facebook launched its first self-branded gadget out of its Building 8 lab, the Portal smart display, and now it’s revving up hardware efforts. For AR, Kirkpatrick told me “We have no product to announce right now. But we have a lot of very talented people doing really, really compelling cutting-edge research that we hope plays a part in the future of headsets.”
There’s a war brewing here. AR startups like Magic Leap and Thalmic Labs are starting to release their first headsets and glasses. Microsoft is considered a leader thanks to its early HoloLens product, while Google Glass is still being developed for the enterprise. And Apple has acquired AR hardware developers like Akonia Holographics and Vrvana to accelerate development of its own headsets.
Mark Zuckerberg said AR glasses were 5 to 7 years away at F8 2017
Technological progress and competition seem to have sped up Facebook’s timetable. Back in April 2017, CEO Mark Zuckerberg said, “We all know where we want this to get eventually, we want glasses,” but explained that “we do not have the science or technology today to build the AR glasses that we want. We may in five years, or seven years.” He added that “We can’t build the AR product that we want today, so building VR is the path to getting to those AR glasses.” The company’s Oculus division had talked extensively about the potential of AR glasses, yet similarly characterized them as far off.
But a few months later, a Facebook patent application for AR glasses was spotted by Business Insider that detailed using a “waveguide display with two-dimensional scanner” to project media onto the lenses. Cheddar’s Alex Heath reports that Facebook is working on Project Sequoia, which uses projectors to display AR experiences on top of physical objects, like a chess board on a table or a person’s likeness for teleconferencing. These indicate Facebook was moving past AR research.
Facebook AR glasses patent application
Last month, The Information spotted four Facebook job listings seeking engineers with experience building custom AR computer chips to join the Facebook Reality Lab (formerly known as Oculus Research). And a week later, Oculus’ Chief Scientist Michael Abrash briefly mentioned, amidst a half-hour technical keynote at the company’s VR conference, that “No off-the-shelf display technology is good enough for AR, so we had no choice but to develop a new display system. And that system also has the potential to bring VR to a different level.”
But Kirkpatrick clarified that he sees Facebook’s AR efforts as more than a mixed reality feature of VR headsets. “I don’t think we converge to one single device . . . I don’t think we’re going to end up in a Ready Player One future where everyone is just hanging out in VR all the time,” he tells me. “I think we’re still going to have the lives that we have today where you stay at home and you have maybe an escapist, immersive experience or you use VR to transport yourself somewhere else. But I think those things like the people you connect with, the things you’re doing, the state of your apps and everything needs to be carried and portable on-the-go with you as well, and I think that’s going to look more like how we think about AR.”
Oculus Chief Scientist Michael Abrash makes predictions about the future of AR and VR at the Oculus Connect 5 conference
Oculus virtual reality headsets and Facebook augmented reality glasses could share an underlying software layer, though, which might speed up engineering efforts while making the interface more familiar for users. “I think that all this stuff will converge in some way, maybe at the software level,” Kirkpatrick said.
The problem for Facebook AR is that it may run into the same privacy concerns that people had about putting a Portal camera inside their homes. While VR headsets generate a fictional world, AR must collect data about your real-world surroundings. That could raise fears about Facebook surveilling not just our homes but everything we do, and using that data to power ad targeting and content recommendations. This brand tax haunts Facebook’s every move.
Startups with a cleaner slate like Magic Leap and giants with a better track record on privacy like Apple could have an easier time getting users to put a camera on their heads. Facebook would likely need a best-in-class gadget that does much that others can’t in order to convince people it deserves to augment their reality.
You can watch our full interview with Facebook’s director of camera and head of augmented reality engineering Ficus Kirkpatrick from our TechCrunch Sessions — AR/VR event in LA:
The Pansar Augmented is a Swedish smart watch that looks like a standard three-handed wristwatch. However, with the tap of a button, you can view multiple data points including weather, notifications, and even sales data from your CRM.
Pansar is a Swedish watch company that uses Swiss movements and hand assembled components to add a dash of luxury to your standard workhorse watch.
The watch is fully funded on Kickstarter. It costs $645 for early birds.
The watch mostly displays the time, but when the data system is activated, the hands move to show any data you’d like.
The world is full of interesting data, be it the quest for information on the perfect wave, keeping track of your stock value, or the number of followers you’ve acquired since yesterday. Pansar Augmented collects the data that matters to you and streams it conveniently to the hands of your watch. This is made possible by the unique dual-directional Swiss movement combined with the Pansar Augmented app.
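Pansar hasn’t published its app’s internals, but the basic mapping (normalise a metric against a target range, then turn that fraction into a hand angle) is simple to sketch. The function name and the sales-target example below are hypothetical:

```python
def value_to_hand_angle(value, lo, hi, sweep=360.0):
    """Map a metric in [lo, hi] onto a hand angle in degrees,
    clamping out-of-range readings to the ends of the dial."""
    frac = (value - lo) / (hi - lo)
    frac = max(0.0, min(1.0, frac))   # don't spin past the dial's limits
    return frac * sweep

# At 50% of a (hypothetical) $10,000 monthly sales target,
# the hand points straight down at 180 degrees.
angle = value_to_hand_angle(5000, 0, 10000)
```

The app would compute an angle like this for each configured metric and send it to the watch, which the dual-directional movement can reach by turning the hand either way.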
The watch comes in three models: the Ocean Edition that shows “relevant data on weather, wind, and swell amongst others,” the Accelerator Edition that shows website visits or Instagram views, and the Quantifier Edition for the “analytical mind” that wants to track sales numbers.
It’s definitely a clever twist on the traditional smart watch vision and, thanks to some handsome styling, these could be appealing pieces for folks who don’t want the distractions of a normal Apple Watch or Android Wear device.