As a general rule, AI isn’t great at using new info to make better sense of existing info. Facebook thinks it has a clever (if unusual) way to explore solutions to this problem: send AI on a virtual vacation. It recently conducted an experiment that had a “tourist” bot with 360-degree photos try to find its way around New York City’s Hell’s Kitchen area with the help of a “guide” bot using 2D maps. The digital tourist had to describe where it was based on what it could see, giving the guide a point of reference it could use to offer directions.
The project focused on collecting info through regular language (“in front of me there’s a Brooks Brothers”), but it produced an interesting side discovery: the team learned that the bots were more effective when they used a “synthetic” chat made of symbols to communicate data. In other words, the conversations they’d use to help you find your hotel might need to be different from those used to help, say, a self-driving car.
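To see why a symbolic channel can beat free-form language, here’s a toy sketch. The vocabulary, codes, and helper names below are invented for illustration and aren’t Facebook’s actual protocol; the point is just that a fixed symbol set is compact and unambiguous where natural language is variable.

```python
# Invented toy vocabularies -- not from Facebook's research.
LANDMARKS = {"brooks_brothers": 0, "subway": 1, "bank": 2}
RELATIONS = {"in_front": 0, "left": 1, "right": 2}

def encode(landmark, relation):
    # Pack an observation into two small integers: a fixed, exact code.
    return (LANDMARKS[landmark], RELATIONS[relation])

def decode(msg):
    # Invert the vocabularies to recover the original observation.
    inv_l = {v: k for k, v in LANDMARKS.items()}
    inv_r = {v: k for k, v in RELATIONS.items()}
    return inv_l[msg[0]], inv_r[msg[1]]

natural = "in front of me there's a Brooks Brothers"   # variable, ambiguous
symbolic = encode("brooks_brothers", "in_front")       # compact: (0, 0)
print(symbolic, decode(symbolic))
```

The symbolic message round-trips losslessly, which is one plausible reason the bots did better with it than with paraphrase-prone natural language.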
The research also helped Facebook’s AI make sense of visually complex urban environments. A Masked Attention for Spatial Convolutions (MASC) system could quickly parse the most relevant keywords in the bots’ messages, so they could more accurately convey where they were or needed to go.
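The core idea of attention here is weighting tokens by relevance to a query and keeping the heaviest ones. The following minimal sketch uses hand-crafted embeddings and an invented query vector, not the actual MASC model, to show that mechanism:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def top_keywords(tokens, embeddings, query, k=2):
    # Score each token against the query, normalize with softmax,
    # and return the k highest-weighted tokens.
    weights = softmax(embeddings @ query)
    order = np.argsort(weights)[::-1][:k]
    return [tokens[i] for i in order]

tokens = ["in", "front", "of", "me", "there's", "a", "Brooks", "Brothers"]
# Hand-crafted 2-d embeddings: landmark-like tokens point along axis 0.
embeddings = np.array([
    [0.10, 0.90], [0.20, 0.80], [0.10, 0.90], [0.00, 1.00],
    [0.20, 0.70], [0.10, 0.80], [0.90, 0.10], [0.95, 0.05],
])
query = np.array([1.0, 0.0])  # "which tokens name a landmark?"
print(top_keywords(tokens, embeddings, query))  # ['Brothers', 'Brooks']
```

With these toy embeddings, attention pulls out the landmark words, which is the kind of keyword extraction that would let a guide bot locate the tourist on its map.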
As our TechCrunch colleagues observed, this is a research project that could improve AI as a whole rather than an immediate precursor to a navigation product. With that said, it’s easy to see the practical implications. Self-driving cars could use this to find their way when they can’t rely on GPS, or to offer directions to wayward humans using only vague descriptions.