Notes on Judea Pearl

I’ve known about Judea Pearl and his work for many years, but I only recently started studying it in detail. Many recent papers I have read cite his work directly. Cause and effect is one of the central ideas that came out of my model building engine thought experiment. If humans…

The Language of Thought hypothesis and Compositionality was worked on hundreds of years before computers existed!

I just learned that Gottfried Wilhelm Leibniz (1646-1716), famous for independently inventing calculus, was also interested in the philosophy of mind. He proposed a concept called the Alphabet of human thought: the idea that all human thought can be represented with a small set of primitive components. With these primitive components, we should…

What is grounded language learning

“An unknown, but potentially large, fraction of animal and human intelligence is a direct consequence of the perceptual and physical richness of our environment, and is unlikely to arise without it”. – John Locke (1632-1704) State-of-the-art machine learning systems such as word2vec and GPT-3 are powerful models that seem to…

Examining concepts through different filters

I think the human mind is an autonomous, interlinked, model building database of the world. I define models and concepts to be the same thing. One important aspect of this system that has been nagging me is the idea that the human mind can examine ideas/models/concepts through different filters. In my article about the mind as…

Why logarithms are used in computation, STEM, and machine learning

I used logarithms a lot in school while studying math and computer science, but as I got older I forgot a lot about the why. Logarithms come up often in machine learning, so I wanted a refresher on why we use them. Mathematics The logarithm is the inverse of exponentiation. So…
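As a quick illustration of one reason logarithms matter in machine learning (my own sketch, not from the post): the identity log(a·b) = log(a) + log(b) turns long products into sums, which avoids floating-point underflow when multiplying many small probabilities, such as in a log-likelihood.

```python
import math

# Multiplying many small probabilities directly underflows to 0.0,
# but summing their logarithms stays well within floating-point range.
probs = [1e-5] * 100  # 100 independent small probabilities

direct = math.prod(probs)                     # underflows to exactly 0.0
log_total = sum(math.log(p) for p in probs)   # log of the true product

print(direct)     # 0.0 due to underflow
print(log_total)  # roughly -1151.29, still perfectly representable
```

This is why ML libraries work with log-probabilities (log-likelihood, log-softmax, cross-entropy) rather than raw probabilities.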

Do grid like cells appear in the neocortex?

Grid cells exist in the entorhinal cortex (EC) and are known to fire in a hexagonal grid pattern during spatial navigation. Grid cells, along with the other cells in the EC, are implicated as a key component enabling generalization and conceptual maps in the human brain. Recently I was reading “A Framework for Intelligence…

The human mind is an autonomous interlinked model building database of the world

My definition of intelligence has become more refined over time. In my article about the human mind being a simulation engine, I already wrote out some of the points I am going to make, but my thinking on the topic has become clearer. The human mind is composed of tens of thousands of…

Compositional vs composable in AGI research

These two terms come up often in software development, philosophy of language, and machine learning. Sometimes they are used interchangeably, and I’m not sure whether that is on purpose, but their definitions, while related and similar, have slight differences. Compositional means that the meaning of the whole expression or structure of something…
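To make the composability side concrete (a minimal sketch of my own; the function names are illustrative, not from the post): composable pieces have simple input/output contracts so they can be freely combined, and the behavior of the combination follows from the behavior of the parts plus the way they are combined.

```python
def compose(*fns):
    """Right-to-left function composition: compose(f, g)(x) == f(g(x))."""
    def composed(x):
        for fn in reversed(fns):
            x = fn(x)
        return x
    return composed

# Two small, composable building blocks.
double = lambda x: x * 2
increment = lambda x: x + 1

# The meaning of the whole is determined by the parts and how
# they are combined: (3 + 1) * 2 = 8.
double_after_increment = compose(double, increment)
print(double_after_increment(3))  # 8
```

In this framing, `compose` itself is the combination rule, and the pipeline's meaning being derivable from its parts is the compositional property.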