Facebook releases new research dataset to help AI communicate better
Facebook Inc. today released an artificial intelligence dataset called Talk the Walk that aims to help computers learn how to interact with humans more naturally by exploring New York City.
Currently, the communication capabilities of AI software are fairly limited. That's because artificial neural networks learn language by analyzing sample text for statistical patterns, a method that captures surface regularities but doesn't ground words in real-world experience.
Facebook has created Talk the Walk to let AI models learn in a way that more closely resembles how humans do. “One strategy for eventually building AI with human-level language understanding is to train those systems in a more natural way, by tying language to specific environments,” Facebook researchers Douwe Kiela and Jason Weston wrote in a blog post.
“Just as babies first learn to name what they can see and touch, this approach — sometimes referred to as embodied AI — favors learning in the context of a system’s surroundings, rather than training through large data sets of text (like Wikipedia),” they explained.
With Talk the Walk, researchers can use New York City as the learning environment. The dataset contains maps of different neighborhoods, 360-degree pictures of each block and sample dialogues in which one person tries to guide the other to a certain location. The files are accompanied by “baselines” that Facebook researchers have created to establish how an AI could use the information.
According to the company, Talk the Walk makes it possible to create realistic training scenarios in which machine learning models can hone their language skills. Carrying out a simulation requires two AIs. One is a "tourist" with access to the 360-degree photos, while the other is a "guide" that only has the maps. The basic idea is to have the two models communicate with one another until they exchange enough details for the guide to steer the tourist to a target location.
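The tourist/guide exchange described above can be sketched as a simple dialogue loop. The snippet below is an illustrative toy only: the grid map, landmark strings, and function names are assumptions for demonstration, not the actual Talk the Walk data format or Facebook's models.

```python
# Minimal sketch of the tourist/guide protocol, assuming a toy 2x2
# "neighborhood" grid. Each cell holds a landmark the tourist can "see"
# (standing in for the 360-degree photos); the guide sees only the map.
GRID = {
    (0, 0): "coffee shop", (1, 0): "bank",
    (0, 1): "subway station", (1, 1): "bakery",
}

def tourist_observe(position):
    """Tourist: describe the landmark visible at the current position."""
    return f"I see a {GRID[position]}"

def guide_instruct(message, target):
    """Guide: localize the tourist on the map from their description,
    then emit a move toward the target."""
    landmark = message.removeprefix("I see a ")
    position = next(p for p, l in GRID.items() if l == landmark)
    if position == target:
        return position, "stop"
    dx, dy = target[0] - position[0], target[1] - position[1]
    if dx:
        return position, "east" if dx > 0 else "west"
    return position, "north" if dy > 0 else "south"

def apply_move(position, action):
    """Advance the tourist one grid cell in the instructed direction."""
    moves = {"east": (1, 0), "west": (-1, 0), "north": (0, 1), "south": (0, -1)}
    dx, dy = moves[action]
    return (position[0] + dx, position[1] + dy)

def run_dialogue(start, target, max_turns=10):
    """Alternate tourist observations and guide instructions until the
    guide has localized the tourist at the target."""
    position = start
    for _ in range(max_turns):
        located, action = guide_instruct(tourist_observe(position), target)
        if action == "stop":
            return located
        position = apply_move(position, action)
    return None

print(run_dialogue((0, 0), (1, 1)))  # guide steers the tourist to the bakery
```

In the real task both agents are learned models exchanging natural-language messages, and localization from photos is the hard sub-problem; the hand-coded lookup here only shows the shape of the interaction.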
That’s no easy task. According to The Verge, Facebook estimates it could take a few years before researchers create AI models that can reliably generate navigation instructions using Talk the Walk data. The company’s hope is that the new AI capabilities developed with the project will be applicable not just for navigation but for other fields as well.
Kiela and Weston wrote that “our own experiments shed light on the sub-problems of localization and communication, but our hope is that others will use the task to better understand goal-oriented dialogue, navigation, visual perception and other challenges.”
Source: SiliconANGLE