Notes on “Time, space, and events in language and cognition: a comparative view”

I recently learned about Peter Gärdenfors and his ideas on conceptual spaces, which use Voronoi tessellations. I have been reading through some of his papers and came across one he co-authored with Chris Sinha in 2014: “Time, space, and events in language and cognition: a comparative view”. I have been studying time as well… Continue reading Notes on “Time, space, and events in language and cognition: a comparative view”
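As a quick aside on how a Voronoi tessellation carves up a conceptual space, here is a toy sketch of my own (not code from any of the papers): prototype points in a quality space induce Voronoi cells, and any point is categorized by its nearest prototype. The dimensions and prototype coordinates below are made up for illustration.

```python
import math

# Hypothetical 2-D quality space with made-up prototype points.
prototypes = {
    "warm": (0.8, 0.2),
    "cool": (0.2, 0.8),
    "neutral": (0.5, 0.5),
}

def categorize(point):
    """Assign a point to the Voronoi cell of its nearest prototype."""
    return min(prototypes, key=lambda name: math.dist(point, prototypes[name]))

print(categorize((0.9, 0.1)))   # lands in the "warm" cell
print(categorize((0.45, 0.55))) # closest prototype is "neutral"
```

Nearest-prototype assignment like this is exactly what partitions the space into convex Voronoi regions, which is why Gärdenfors uses it to model natural concepts as convex regions.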

Notes on “Programs as Causal Models: Speculations on Mental Programs and Mental Representations”

I just read this paper by Nick Chater and Mike Oaksford, which I found via “Creativity, Compositionality, and Common Sense in Human Goal Generation”. Very cool paper: https://pubmed.ncbi.nlm.nih.gov/23855554/ They follow Judea Pearl’s thinking that counterfactuals and causality are central to intelligence, both natural and artificial. I know of Judea Pearl’s work and… Continue reading Notes on “Programs as Causal Models: Speculations on Mental Programs and Mental Representations”
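To make Pearl’s intervention idea concrete, here is a toy structural causal model of my own (not from the paper): each variable is a function of its parents, and a do-intervention replaces a variable’s equation with a fixed value. All variable names are hypothetical.

```python
# Toy causal chain: rain -> sprinkler -> wet_grass.
def run_model(do=None):
    """Evaluate the model, optionally overriding variables via do(X = x)."""
    do = do or {}
    rain = do.get("rain", True)
    # The sprinkler normally turns on only when it does not rain.
    sprinkler = do.get("sprinkler", not rain)
    wet_grass = do.get("wet_grass", rain or sprinkler)
    return {"rain": rain, "sprinkler": sprinkler, "wet_grass": wet_grass}

print(run_model())                    # the observed world
print(run_model(do={"rain": False}))  # counterfactual: had it not rained
```

Note that the grass ends up wet in both worlds, since the sprinkler compensates when rain is forced off; tiny programs like this are what make counterfactual questions answerable at all, which is the connection the paper draws between programs and causal models.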

Notes on “Are there Semantic Primes in Formal Languages?”

The researchers compared NSM to a few formal languages such as OWL, PDDL, and MOF. They found that a subset of the semantic primes already exists in formal languages, but about 40 of the 65 are missing, so roughly 60% of these words have no formal-language counterpart. The abstract: “Abstract. This paper surveys languages used to enrich contextual information with… Continue reading Notes on “Are there Semantic Primes in Formal Languages?”

Notes on Neural Module Networks (NMN)

I’ve been studying Neural Module Networks from Jacob Andreas. He focuses on compositionality and grounding, my two favorite subjects. I just read two of his papers: https://openaccess.thecvf.com/content_cvpr_2016/html/Andreas_Neural_Module_Networks_CVPR_2016_paper.html and https://openaccess.thecvf.com/content_iccv_2017/html/Hu_Learning_to_Reason_ICCV_2017_paper.html I’ll go over the first paper, “Deep Compositional Question Answering with Neural Module Networks”. He created a new type of neural network architecture where he builds… Continue reading Notes on Neural Module Networks (NMN)
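The core NMN idea is that a question is parsed into a layout of small specialized modules that get composed into one network. Here is a schematic, non-neural illustration of that composition using plain Python functions over a made-up toy scene (the real paper composes learned neural modules, not hand-written rules):

```python
# Made-up toy scene standing in for an image.
scene = [
    {"name": "dog", "color": "brown"},
    {"name": "cat", "color": "black"},
]

def find(category):
    """find[dog]: attend to the objects matching a category."""
    return [obj for obj in scene if obj["name"] == category]

def describe(attribute, attended):
    """describe[color]: read an attribute off the attended objects."""
    return attended[0][attribute] if attended else "nothing"

# Layout for "What color is the dog?" -> describe[color](find[dog])
answer = describe("color", find("dog"))
print(answer)  # brown
```

The point is only the wiring: different questions yield different module layouts, reusing the same small parts.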

Notes on “Creativity, Compositionality, and Common Sense in Human Goal Generation”

Just read this short and sweet paper from Guy Davidson et al. Brenden Lake, several of whose papers I have read, is also a co-author. Paper link: https://psyarxiv.com/byzs5/ They built mini-programs in a DSL that represent game rules. The programs act as reward-generating functions (goals as reward-generating programs). Their line of reasoning follows the… Continue reading Notes on “Creativity, Compositionality, and Common Sense in Human Goal Generation”
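To show what “goals as reward-generating programs” can mean in miniature, here is a toy sketch of my own (not the paper’s actual DSL): a goal is a small rule expression, and interpreting it yields a function that scores a game state. Every predicate and object name below is hypothetical.

```python
# Hypothetical mini-DSL: a goal is a list of (predicate, args, reward) clauses.
goal_program = [
    ("in", ("ball", "bin"), 10),
    ("touching", ("block", "wall"), 2),
]

def reward(state, program=goal_program):
    """Sum the rewards of every goal clause the state satisfies."""
    total = 0
    for predicate, args, value in program:
        if (predicate, args) in state:
            total += value
    return total

state = {("in", ("ball", "bin"))}  # the ball made it into the bin
print(reward(state))  # 10
```

The appeal of this framing is compositionality: new goals are just new programs assembled from the same predicates, which is what lets the authors talk about generating creative goals.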

Notes on “Symmetry-Based Representations for Artificial and Biological General Intelligence”

The paper: https://arxiv.org/abs/2203.09250 It’s from a team at DeepMind, and its purpose is to urge neuroscientists to look for symmetry-based representations in the brain. From the paper: “The idea that there exist transformations (symmetries) that affect some aspects of the system but not others, and their relationship to conserved quantities has become central in modern physics,… Continue reading Notes on “Symmetry-Based Representations for Artificial and Biological General Intelligence”

My notes on “A Benchmark for Systematic Generalization in Grounded Language Understanding”

I had been meaning to read this paper for a long time and finally got around to it. Paper link: https://arxiv.org/abs/2003.05161v2 Laura Ruis is the main author; I hope she goes on to do more work in this “general direction” 🙂 Jacob Andreas of NMN fame is a co-author, awesome! Brenden Lake, who does tons of interesting… Continue reading My notes on “A Benchmark for Systematic Generalization in Grounded Language Understanding”

What is grounded language learning

“An unknown, but potentially large, fraction of animal and human intelligence is a direct consequence of the perceptual and physical richness of our environment, and is unlikely to arise without it.” – John Locke (1632-1704) State-of-the-art machine learning systems such as word2vec and GPT-3 are powerful models that seem to… Continue reading What is grounded language learning