Google will test glasses with integrated Translate in August

Google is going to start real-world testing of its prototype augmented reality glasses, which offer real-time language translation through Translate. The company needs to run tests outside the laboratory because of the limitations of lab settings.
Testing in the real world should allow them to better understand how these devices can help people in their daily lives. It will also help the glasses adapt to the varied accents and pronunciations of real speakers of the same language.
Google moves forward with its audio-translating glasses
According to a Google blog post, the company is developing experiences such as AR navigation that need to "take into account factors such as weather and busy intersections, which can be difficult, sometimes impossible, to fully recreate indoors." One of the first prototypes is a pair of simple glasses that Google has been testing in its labs. The glasses offer real-time translation, with the transcription placed directly on the lens.
These research prototype glasses offer experiences such as translation, transcription, and navigation. They look like normal glasses, with a display inside the lens and audio and visual sensors, such as a microphone and a camera, to collect information. Google has stated that it will investigate use cases that rely on audio detection, such as speech transcription and translation, along with visual detection, which uses image data for text translation and for positioning during navigation. These prototypes do not support photography or videography, although image data will be used for use cases such as navigation, translation, and visual search.

As an example, if you are in a country where you do not understand the local language, you can use the camera to translate posters and signs and receive directions to your destination. Google claims that the image data is deleted once the experience ends, except when it is kept for analysis and debugging. In that case, the image data is first scrubbed of sensitive content, such as faces and car license plates, that could violate people's privacy.
The data will be stored on a server with access limited to a small number of Google employees for analysis and debugging, and, per the privacy policy, it is deleted after 30 days. An LED indicator will light up whenever image data is being saved for analysis and debugging, so people nearby can request data deletion and so the glasses comply with local privacy and recording laws.
