Apple could launch augmented reality headset in 2020

According to a new report from Ming-Chi Kuo, a reliable analyst on all things Apple (via 9to5Mac), the company has been working on an augmented reality headset and could launch it soon. This pair of glasses could go into mass production as early as Q4 2019 and should be available at some point during the first half of 2020.

It’s still unclear what you’ll be able to do with this mysterious headset. Kuo says that it’ll work more or less like an Apple Watch: the headset will rely heavily on your iPhone, and you won’t be able to use it without one.

The glasses will act as a remote display, putting information right in front of your eyes. Your iPhone will do the heavy lifting when it comes to internet connectivity, location services and computing. I wouldn’t be surprised if the AR headset relies on Bluetooth to communicate with your iPhone.

Kuo’s report doesn’t say what you’ll find inside the headset. Apple could embed displays and sensors so that the device is aware of your surroundings; an AR headset only makes sense if it has sensors to detect the things around you.

Apple has already experimented with augmented reality with its ARKit framework on iOS. Developers have been able to build apps that integrate digital elements in the real world, as viewed through your phone cameras.
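To give a sense of what that looks like in practice, here’s a minimal, illustrative ARKit sketch in Swift (the view controller and the hard-coded cube are my own example, not code from Apple or any particular app): it starts world tracking and anchors a small virtual box in front of the camera.

```swift
import UIKit
import ARKit
import SceneKit

// Minimal view controller that starts world tracking and anchors
// a small virtual cube half a metre in front of the session origin.
class ARDemoViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // Track the device's position and detect horizontal surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)

        // A digital element "integrated in the real world":
        // a 10 cm cube placed 0.5 m in front of where the session started.
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        let node = SCNNode(geometry: box)
        node.position = SCNVector3(0, 0, -0.5)
        sceneView.scene.rootNode.addChildNode(node)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```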

While many apps have added AR features, most of them feel gimmicky and don’t add any real value. There haven’t been a ton of AR-native apps either.

One interesting use case for augmented reality is mapping. Google recently unveiled an augmented reality mode for Google Maps. You can hold your phone in front of your face to see arrows indicating where you’re supposed to go.

Apple has also been rebuilding Apple Maps with its own data. The company isn’t just drawing maps. It is collecting a ton of real-world data using LiDAR sensors and eight cameras attached to a car roof. Let’s see if Apple Maps will play an important part in Apple’s rumored AR headset.


Microsoft bringing Dynamics 365 mixed reality solutions to smartphones

Last year Microsoft introduced several mixed reality business solutions under the Dynamics 365 enterprise product umbrella. Today, the company announced it would be moving these to smartphones in the spring, starting with previews.

The company announced Remote Assist on HoloLens last year. This tool allows a technician working onsite to show a remote expert what they are seeing. The expert can then walk the less experienced employee through the repair. This is great for those companies that have equipped their workforce with HoloLens for hands-free instruction, but not every company can afford the new equipment.

Starting in the spring, Microsoft is going to help with that by introducing Remote Assist for Android phones. Just about everyone has a phone with them, and those with Android devices will be able to take advantage of Remote Assist capabilities without investing in HoloLens. The company is also updating Remote Assist to include mobile annotations, group calling and deeper integration with Dynamics 365 for Field Service, along with improved accessibility features in the HoloLens app.

iPhone users shouldn’t feel left out, though, because the company announced a preview of Dynamics 365 Product Visualize for iPhone. This tool lets users show a customer what a customized product will look like as they work through the options together. Think of a furniture seller working with a customer in their home to customize the color, fabric and design right in the room where the furniture will go, or a car dealer offering different options such as color and wheel styles. Once a customer agrees to a configuration, the data gets saved to Dynamics 365 and shared in Microsoft Teams for greater collaboration across a group of employees working with that customer on a project.

Both of these features are part of the Dynamics 365 spring release and are going to be available in preview starting in April. They are part of a broader release that includes a variety of new artificial intelligence features such as customer service bots and a unified view of customer data across the Dynamics 365 family of products.



Litho is a finger-worn controller for augmented reality, IoT and other ‘spatial’ interactions

I first encountered the founders of Litho, a hardware and software startup developing a new finger-worn controller, at London’s Pitch@Palace last April. The event sees startups pitch in front of the British royal family and other esteemed guests, and naturally the company’s young founders, 24-year-old Nat Martin (CEO) and 25-year-old Charlie Bruce (CTO), were a little overawed by the occasion, just like many of the other founders pitching that day. However, perhaps unbeknownst to them, Litho was also one of the more notable companies, not least because, as the saying goes, hardware is hard.

Fast-forward to today and the young company is ready to show the world the first publicly available iteration of what it has been building: an innovative finger-worn device that provides control over various “spatial interactions” and should find applications ranging from AR and VR to the smart home and the control of other IoT devices. The next stage for Litho is to offer the controller and access to its SDK to developers who join the startup’s beta programme for $199/£179.

“Computing is increasingly structured around the real world rather than the desktop,” says Litho’s Nat Martin. “With the advent of smart devices such as lights, thermostats and door locks, physical things are becoming digitally connected. Equally, with the advent of AR, digital things are becoming physically anchored in the real world. These are two sides of the same coin — digital interactions are entering physical space.”

For now, though, the smartphone remains the primary interface for these spatial interactions, even though smartphones were designed to interact with 2D content on screens and are struggling to make the leap. “Trying to interact with objects in the real world through a smartphone is like trying to do heart surgery with a spork,” says Martin. “More often than not our phones end up being a frustrating barrier to the digital world, rather than a tool to enable interactions with it.”

Solving this problem requires a combination of hardware and software. The Litho device itself is described as an unobtrusive finger-worn controller that connects via Bluetooth Low Energy to a smartphone or AR headset. The controller has a capacitive touch surface on the underside, which allows for precise 2D input, scrolling and tapping. More significantly, it also has an array of motion sensors and provides haptic feedback.
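Litho hasn’t published its protocol, but purely to illustrate what “connects via Bluetooth Low Energy to a smartphone” involves on the phone side, here is a minimal CoreBluetooth sketch in Swift; the service UUID is a made-up placeholder, not anything documented by Litho.

```swift
import CoreBluetooth

// Minimal BLE central that scans for a peripheral advertising a given
// service and connects to it. The UUID below is a placeholder.
final class ControllerScanner: NSObject, CBCentralManagerDelegate {
    private let serviceUUID = CBUUID(string: "FFF0") // hypothetical service
    private var central: CBCentralManager!
    private var controller: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [serviceUUID], options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        // Keep a strong reference, stop scanning and connect.
        controller = peripheral
        central.stopScan()
        central.connect(peripheral, options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didConnect peripheral: CBPeripheral) {
        print("Connected to \(peripheral.name ?? "controller")")
        // From here the app would discover services and characteristics
        // and subscribe to notifications carrying touch and motion data.
    }
}
```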

The Litho SDK uses the popular 3D game development platform Unity, and Martin says developers will be able to make apps that can identify not only the direction (or vector) in which the wearer is pointing, but also what they are pointing at in the real world. The SDK also provides an interaction framework of off-the-shelf solutions for core interactions, including templates for tools such as object creation, movement and deletion, making it easier for developers to quickly build “delightful and intuitive experiences.”
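The Litho SDK itself is Unity-based and its API isn’t public, but as a generic sketch of how a pointing vector gets turned into “what the wearer is pointing at,” here’s a small ray-versus-target intersection example in Swift (the targets and numbers are invented for illustration):

```swift
import simd

// Generic illustration (not the Litho SDK): given a pointing ray,
// find which of a set of spherical targets it hits first.
struct Target {
    let name: String
    let center: SIMD3<Float>
    let radius: Float
}

/// Returns the nearest target intersected by the ray, if any.
func pick(origin: SIMD3<Float>, direction: SIMD3<Float>, targets: [Target]) -> Target? {
    let d = simd_normalize(direction)
    var best: (Target, Float)? = nil
    for t in targets {
        let oc = origin - t.center
        // Solve |origin + s*d - center|^2 = r^2 for s (quadratic in s).
        let b = simd_dot(oc, d)
        let c = simd_dot(oc, oc) - t.radius * t.radius
        let disc = b * b - c
        guard disc >= 0 else { continue }
        let s = -b - disc.squareRoot()
        guard s >= 0 else { continue } // hit must be in front of the user
        if best == nil || s < best!.1 { best = (t, s) }
    }
    return best?.0
}

// Example: pointing roughly at a lamp one metre ahead.
let lamp = Target(name: "lamp", center: SIMD3<Float>(0, 0, -1), radius: 0.2)
let hit = pick(origin: .zero, direction: SIMD3<Float>(0.05, 0, -1), targets: [lamp])
print(hit?.name ?? "nothing") // "lamp"
```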

“Having an input device designed from the ground up for 3D interaction opens a whole new paradigm of mobile interactions,” he adds. “Instead of an awkward and frustrating interface, developers can create precise yet effortless interactions in 3D space. This opens up a whole new range of use cases — architects and designers can create precise 3D models in the context of the real world, and gamers can create a virtual theme park in their back garden simply by pointing and drawing. At home, instead of opening up a smartphone app, searching for the right bulb and operating a virtual dimmer, you can simply point and swipe to dim your lights.”

Meanwhile, Litho has already picked up a number of notable investors. The burgeoning startup has raised an undisclosed amount of seed funding from U.S. venture firm Greycroft, Paul Heydon (an early investor in Unity and Supercell) and Chris Albinson (who co-led investments in DocuSign, Pinterest and Turo), along with several other unnamed angel investors.


Researchers are putting fish into augmented reality tanks

Researchers at the New Jersey Institute of Technology, while testing the “station keeping” functions of the glass knifefish, have created an augmented reality system that tricks the animal’s electric sensing organs in real time. The fish keeps itself hidden by sheltering inside holes and other refuges, and the researchers wanted to understand what kind of autonomous sensing functions it uses to keep itself safe.

“What is most exciting is that this study has allowed us to explore feedback in ways that we have been dreaming about for over 10 years,” said Eric Fortune, associate professor at NJIT. “This is perhaps the first study where augmented reality has been used to probe, in real time, this fundamental process of movement-based active sensing, which nearly all animals use to perceive the environment around them.”

The fish isn’t wearing a headset; instead, the researchers simulated the motion of a refuge waving in the water.

“We’ve known for a long time that these fish will follow the position of their refuge, but more recently we discovered that they generate small movements that reminded us of the tiny movements that are seen in human eyes,” said Fortune. “That led us to devise our augmented reality system and see if we could experimentally perturb the relationship between the sensory and motor systems of these fish without completely unlinking them. Until now, this was very hard to do.”

To create their test, they put a fish inside a tube and synced the motion of the tube to the fish’s own movements. As the fish swam forward and backward, the researchers watched what happened when the fish could see that it was directly affecting the motion of the refuge. With the refuge synced to the fish’s movements, they were able to confirm that the fish could tell the experience wasn’t “real” in a natural sense. In short, the fish knew it was in a virtual environment.
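As a rough way to picture the two conditions (this is a toy sketch, not the researchers’ actual rig or code): in the closed-loop case the refuge position is computed from the fish’s own movement, while in the replay case the same trace is simply played back, decoupled from what the fish does.

```swift
// Toy sketch of the two experimental conditions (illustrative only).
let gain = 1.0                       // hypothetical coupling between fish and refuge
var recordedRefuge: [Double] = []    // refuge trace captured during the closed-loop run

// Closed loop: the stimulus is driven by the animal's own movement.
func closedLoopStep(fishPosition: Double) -> Double {
    let refugePosition = gain * fishPosition
    recordedRefuge.append(refugePosition)
    return refugePosition
}

// Replay: the identical stimulus, now decoupled from what the fish is doing.
func replayStep(at t: Int) -> Double {
    recordedRefuge[t]
}

// Closed-loop run: the refuge tracks the (simulated) fish.
for fishPosition in [0.0, 0.1, 0.25, 0.2] {
    _ = closedLoopStep(fishPosition: fishPosition)
}

// Replay run: the fish now sees the recorded motion regardless of its own movement.
let replayed = (0..<recordedRefuge.count).map { replayStep(at: $0) }
print(replayed) // [0.0, 0.1, 0.25, 0.2]
```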

“It turns out the fish behave differently when the stimulus is controlled by the individual versus when the stimulus is played back to them,” said Fortune. “This experiment demonstrates that the phenomenon that we are observing is due to feedback the fish receives from its own movement. Essentially, the animal seems to know that it is controlling the sensory world around it.”

Whether or not the fish can play Job Simulator is still unclear.

“Our hope is that researchers will conduct similar experiments to learn more about vision in humans, which could give us valuable knowledge about our own neurobiology,” said Fortune. “At the same time, because animals continue to be so much better at vision and control of movement than any artificial system that has been devised, we think that engineers could take the data we’ve published and translate that into more powerful feedback control systems.”
