Variables in grounded language learning

I keep reading grounded language learning papers, trying to figure out what the minimal ingredient is for achieving a grounded representation in computers. One theme I keep coming back to is that while everything is grounded, compositionality may need more abstract machinery.

It seems like there must be some recursive computation or structure. Consider this interesting definition I saw for “to damage”:

X did something to Y that was bad for Y, and it changed Y like this: before this happened, some parts of Y were good, more than now; before this happened, Y could do some things, more than it can now.

The phrase “like this” in that definition is a form of recursion: you have to parse the embedded sentences before you can understand the original one.
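To make that concrete, here is a toy sketch (entirely my own construction, not from any paper) of how understanding a definition can require recursively expanding an embedded clause first:

```python
# Toy lexicon: a definition can embed a placeholder ("<like this>")
# that must itself be expanded before the outer definition makes sense.
definitions = {
    "damage": "do something bad to Y <like this>",
    "<like this>": "(some parts of Y were good before; Y could do more before)",
}

def expand(term, defs):
    """Recursively replace embedded terms with their definitions."""
    text = defs.get(term, term)
    for key in defs:
        if key != term and key in text:
            text = text.replace(key, expand(key, defs))
    return text

print(expand("damage", definitions))
# The inner clause is expanded before the outer definition is complete.
```

The point of the sketch is only the shape of the computation: the meaning of the outer sentence is not available until the inner one has been processed, which is the recursion being described.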

Noam Chomsky is famous for arguing that recursion is a core property of language.

A different way to say it is that there must be some support for variables.

The definition above directly uses X and Y, which are variables.

Think of other words such as “to do”, “to happen”, and “because”. These are general words that combine with many other types of words. In the example above, “X did something to Y”, “to do” acts as a generic variable verb: someone can understand the sentence regardless of whether X punched, kicked, ripped, or pushed Y to damage it. The same idea extends to generic variable nouns like “someone” and “something”.
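A minimal sketch of what I mean by a generic variable verb (the names and structure here are my own invention, just for illustration): a schema like “X did something to Y” can be satisfied by any concrete event, whatever the specific verb turns out to be.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    verb: str     # the specific action: "punch", "kick", "rip", ...
    agent: str    # X
    patient: str  # Y

def did_something_to(event, x, y):
    """The abstract schema 'X did something to Y': true for any
    concrete event with agent X and patient Y, ignoring the verb."""
    return event.agent == x and event.patient == y

# Any concrete verb instantiates the same abstract schema:
e1 = Event("punch", "Alice", "the wall")
e2 = Event("rip", "Bob", "the paper")
print(did_something_to(e1, "Alice", "the wall"))  # True
print(did_something_to(e2, "Alice", "the wall"))  # False
```

The interesting (and unsolved) part is of course not this symbolic scaffolding, but how a grounded learner could acquire the abstraction that makes the verb slot a variable in the first place.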

I think if you can find a way to represent “to do”, “to happen”, and “because” in computers in a generalized way, you may have solved AGI.
