Richard Feynman famously wrote: “You have to keep a dozen of your favorite problems constantly present in your mind, although by and large they will lay in a dormant state. Every time you hear or read a new trick or result, test it against each of your twelve problems to see whether it helps. Every… Continue reading On Richard Feynman and your core questions
Tag: composability
Evidence of Compositionality in the human brain
Artificial Intelligence and Cognitive Science researchers have taken for granted that compositionality exists in the human brain. Fodor, Chomsky, and many others have written about the human brain's ability to create infinite meaning from a finite set of tokens (words). There are also people who argue that the brain is not compositional. You can read… Continue reading Evidence of Compositionality in the human brain
Representing Moving vs Doing
The formal definition of “to move” is “to change from one place or position to another” or “to set or keep in motion”. Things usually move by their own volition, but they can also be moved by others, as a “patient”. It should be easy to know if something is moving and all animal intelligence… Continue reading Representing Moving vs Doing
The Language of Thought hypothesis and Compositionality were worked on hundreds of years before computers existed!
I just learned that Gottfried Wilhelm Leibniz (1646-1716), famous for co-inventing calculus, was also interested in the philosophy of mind. He proposed a concept called the alphabet of human thought: the idea that all human thought can be represented with a small set of primitive components. With these primitive components, we should… Continue reading The Language of Thought hypothesis and Compositionality were worked on hundreds of years before computers existed!
Testing compositionality on GCCKR
Can we just represent each prime as a point in a vector space? We would need a programming language that could interpret the vector, so we would be moving the meaning of the vector into the interpreter. We need to test compositionality: can you use multiple vectors as a sequence to represent new concepts/words? The… Continue reading Testing compositionality on GCCKR
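A minimal sketch of what this could look like, with invented primes and an invented interpreter (the actual GCCKR primitives are not specified here): each prime is a fixed vector, and a tiny interpreter assigns meaning to an ordered sequence of primes, so the compositional work lives in the interpreter rather than in the vectors themselves.

```python
# Hypothetical sketch, not the GCCKR design: primes are fixed random vectors,
# and meaning comes from how the interpreter reads a sequence of them.
import numpy as np

rng = np.random.default_rng(0)
DIM = 8

# A made-up inventory of primes for illustration.
primes = {name: rng.normal(size=DIM) for name in ["AGENT", "MOVE", "OBJECT"]}

def interpret(sequence):
    """Toy interpreter: a 'concept' is the ordered concatenation of prime vectors.
    Two sequences denote the same concept only if they match element-wise in order."""
    return np.concatenate([primes[p] for p in sequence])

# Composing primes into a new concept/word as a sequence of vectors.
chase_like = interpret(["AGENT", "MOVE", "OBJECT"])
print(chase_like.shape)  # (24,) -- order matters, so ["OBJECT", "MOVE", "AGENT"] differs
```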
Notes on Neural Module Networks (NMN)
I’ve been studying Neural Module Networks from Jacob Andreas, who focuses on compositionality and grounding, my two favorite subjects. I just read two of his papers: https://openaccess.thecvf.com/content_cvpr_2016/html/Andreas_Neural_Module_Networks_CVPR_2016_paper.html https://openaccess.thecvf.com/content_iccv_2017/html/Hu_Learning_to_Reason_ICCV_2017_paper.html I’ll go over the first paper, “Deep Compositional Question Answering with Neural Module Networks”. He has created a new type of neural network architecture where he builds… Continue reading Notes on Neural Module Networks (NMN)
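To make the module idea concrete, here is a hedged sketch of assembling a small network from reusable modules; the module names (Find, Describe), shapes, and the hand-picked layout are invented for illustration and are not the authors' implementation.

```python
# Toy illustration of the NMN idea: pick a per-question "layout" of modules
# and chain them, instead of running one monolithic network.
import torch
import torch.nn as nn

class Find(nn.Module):
    """Attends over image regions (e.g. regions matching a word in the question)."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, 1)

    def forward(self, image_feats):             # (regions, dim) -> (regions, 1)
        return torch.softmax(self.proj(image_feats), dim=0)

class Describe(nn.Module):
    """Maps attended region features to an answer distribution."""
    def __init__(self, dim, n_answers):
        super().__init__()
        self.out = nn.Linear(dim, n_answers)

    def forward(self, image_feats, attention):
        attended = (attention * image_feats).sum(dim=0)   # attention-weighted pooling
        return self.out(attended)

dim, n_answers = 16, 10
image_feats = torch.randn(36, dim)               # e.g. 36 region features from a CNN
layout = [Find(dim), Describe(dim, n_answers)]   # layout would normally come from the question
attention = layout[0](image_feats)
answer_logits = layout[1](image_feats, attention)
print(answer_logits.shape)                       # torch.Size([10])
```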
Notes on “Creativity, Compositionality, and Common Sense in Human Goal Generation”
Just read this short and sweet paper from Guy Davidson et al. Brenden Lake, several of whose papers I have read, is also a co-author. Paper link: https://psyarxiv.com/byzs5/ They built mini programs in a DSL that represent game rules; the programs act as reward-generating functions (goals as reward-generating programs). Their line of reasoning follows the… Continue reading Notes on “Creativity, Compositionality, and Common Sense in Human Goal Generation”
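A hedged sketch of what "goals as reward-generating programs" can mean in code; the predicates, game state, and scoring below are invented for illustration and are much simpler than the paper's DSL.

```python
# Toy "goal program": a function that scores behavior, so the program itself is the goal.
def goal_program(state):
    """Made-up rule: +1 reward each step the ball is on the bed AND the agent is standing."""
    on_bed = state.get("ball_location") == "bed"
    standing = state.get("agent_pose") == "standing"
    return 1.0 if (on_bed and standing) else 0.0

trajectory = [
    {"ball_location": "floor", "agent_pose": "standing"},
    {"ball_location": "bed",   "agent_pose": "standing"},
    {"ball_location": "bed",   "agent_pose": "crouching"},
]
total_reward = sum(goal_program(s) for s in trajectory)
print(total_reward)  # 1.0 -- only the second state satisfies the invented rule
```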
A computational definition of grounded compositionality
“The meaning of a whole is a function of the meaning of the parts and of the way they are syntactically combined.” There seems to be no clear consensus on how we would implement compositionality in computers, so I have gone through several research papers and collected several different definitions. Systematic: the most common definition… Continue reading A computational definition of grounded compositionality
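One way to read the quoted definition as code is below; the two-word vocabulary, the MODIFY rule, and the dictionary-style meanings are invented purely to illustrate that the whole's meaning depends only on the parts' meanings plus the combining rule.

```python
# Toy syntax-directed composition: meaning(whole) = f(meaning(parts), rule).
part_meaning = {
    "red":  {"kind": "property", "value": "red"},
    "ball": {"kind": "object",   "value": "ball"},
}

def combine(rule, left, right):
    """The rule name decides how the parts' meanings merge; nothing else leaks in."""
    if rule == "MODIFY":   # adjective + noun
        return {"kind": "object", "value": right["value"], "property": left["value"]}
    raise ValueError(f"unknown rule {rule}")

whole = combine("MODIFY", part_meaning["red"], part_meaning["ball"])
print(whole)  # {'kind': 'object', 'value': 'ball', 'property': 'red'}
```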
Candidate data structures and algorithms for representing ground-able compositional knowledge in computers
Some of the most interesting ways to represent knowledge in computers using mathematical structures:
graphs and graph neural networks
multi-spatial grid cell representations
grid representations
probabilistic programming
holograms
Voronoi tessellations – they allow distance calculations between any concepts https://towardsdatascience.com/the-geometry-of-thought-700047775956
word2vec – it's such a simple model, it's worth thinking about how to scale this up.
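As a small illustration of why vector models like word2vec keep coming up here: with vectors, "distance between any concepts" (the property noted above for Voronoi tessellations) is just cosine similarity, and a crude form of composition is vector addition. The 4-dimensional vectors and tiny vocabulary below are invented, not trained embeddings.

```python
# Toy concept vectors: distance and naive composition with plain numpy.
import numpy as np

emb = {
    "dog":   np.array([0.9, 0.1, 0.8, 0.0]),
    "cat":   np.array([0.8, 0.2, 0.7, 0.1]),
    "car":   np.array([0.1, 0.9, 0.0, 0.8]),
    "puppy": np.array([0.9, 0.0, 0.9, 0.1]),
}

def cosine(a, b):
    """Similarity between any two concept vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(emb["dog"], emb["cat"]))   # high -- nearby concepts
print(cosine(emb["dog"], emb["car"]))   # low  -- distant concepts

small_dog = emb["dog"] + emb["puppy"]   # naive additive "composition"
print(cosine(small_dog, emb["cat"]))    # the composed vector is still comparable to anything
```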
Variables in grounded language learning
I keep reading grounded language learning papers, trying to figure out what the minimal ingredient is for achieving a grounded representation in computers. One theme I keep seeing and coming back to is that while everything is grounded, compositionality may need more abstract thinking. It seems like there definitely must be some recursive computation or… Continue reading Variables in grounded language learning