Deep and Durable Learning


Cooperate with Your Brain

Image by ElisaRiva from Pixabay

(The strict right-brain vs. left-brain division suggested by this illustration misunderstands how the brain works.)

What do we know about how the brain learns?

That’s the foundational question we’ll be exploring this season.

We’re exploring this question, not just because I’m a biologist, but because we can’t hope to learn effectively if we ignore the way the brain was designed to function.

Educators have traditionally focused on delivering information, and learners have been rewarded for their recall of that information. That has been true for hundreds of years, and it has not changed even as we have gained insight to the contrary.

Human brains are designed to pursue understanding of significant ideas; they are not particularly good at storing and retrieving information!

You may recall from last season that I made the case that knowledge is far more than fact retrieval. Knowledge is justifiable belief. The process of justification entails a logical appeal to ideas (concepts) and evidence. Parroting facts that others have unearthed doesn’t count.

The human brain is without parallel in the animal world and even in the computer world.

Computers are unrivaled in data and information retrieval, but human brains are without peer in their ability to recognize and create patterns. The strengths of the brain are on display in a 2017 article in the journal Neuron titled “The Brain as an Efficient and Robust Adaptive Learner.” The authors lead off with this paean:

“The brain is a hugely complex, highly recurrent, and nonlinear neural network. This network is surprisingly plastic and sustains our amazing capability for learning from experience and adapting to new situations.”

 

Although we are rightly in awe of the brain, neuroscience has surprisingly little insight into how the brain learns. To be sure, we know more now than we did fifty years ago. Much of this is due to improvements in brain imaging, such as fMRI, which allow imaging while subjects are conscious and engaged in various cognitive tasks. Imaging tells us something about the areas of the brain that “light up” while doing a computation, retrieving a fact, or solving a puzzle, for example. The large number of different brain structures involved on both sides of the brain in apparently simple tasks has been one of the big surprises emerging from advanced imaging. It must be emphasized, however, that anatomy answers a what (or where/when) question, not a how (or why) question.

 

To be clear, I’m not playing down the great strides neuroanatomy has made through projects like the Human Connectome Project (begun in 2010), which aimed to discern a “wiring diagram” for the brain’s 1,000 trillion (one quadrillion) synaptic connections. The wiring diagram could, in principle, go all the way down to individual neurons. The current map identifies large clusters of neurons in specific anatomical positions within the brain.

Image by Gordon Johnson from Pixabay

 

A wiring diagram would only be a beginning, however. It would not explain the logic embodied in the “circuits.” (The brain’s noted plasticity says it is simplistic to expect such a diagram that applies to all humans under all conditions.)
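To make the point concrete, a wiring diagram is essentially a graph: nodes (neurons or clusters of neurons) and the connections between them. The toy sketch below is only an analogy, written in plain Python with invented region names rather than real connectome data; it records who connects to whom while saying nothing about what the circuit actually computes.

```python
# A toy "wiring diagram" as an adjacency list: invented regions, not real data.
connectome = {
    "A": ["B", "C"],   # region A projects to regions B and C
    "B": ["C"],
    "C": ["A"],        # a loop back to A -- connectivity, but no meaning
}

# The map answers the "what connects to what?" question...
for source, targets in connectome.items():
    print(f"{source} -> {', '.join(targets)}")

# ...but nothing here says what the circuit does. The "logic" of the circuit
# (excitation vs. inhibition, timing, plasticity) is simply not in the diagram.
```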

Many different disciplines are invested in exploring brain function. These include neuroanatomy, neurophysiology, molecular biology (nerve impulse conduction and neurotransmitter action at synapses), psychology, and learning theory, to name the major players. These disciplines are typically siloed, isolated from one another by the lack of even a consistent shared vocabulary. Given that they are all studying the same brain, the lack of integration is telling!

Because we all own a brain, we come to the study of brain function with some insider perspective (and some insider limitations).

“We divide the real estate of the brain according to our preconceived ideas, assuming — wrongly, as far as I’m concerned — that those preconceived ideas have boundaries, and the same boundaries exist in brain function,” says György Buzsáki, neuroscientist at the NYU School of Medicine.

Psychologist Lisa Feldman Barrett concurs with Buzsáki.

“Scientists for over 100 years have searched fruitlessly for brain boundaries between thinking, feeling, deciding, remembering, moving and other everyday experiences. . . .” [Current neuroscience shows these categories] “are poor guides for understanding how brains are structured or how they work.”

I don’t mean to imply that we know nothing of how the human brain functions, only that we are very far away from any kind of reductionistic explanation. A related philosophical question is whether the brain—or even a collection of human brains—can understand the brain. My answer is no.

No human brain, or collection of brains, occupies the “meta” position necessary even to ask the right questions about structure-function relationships.

Human brains are not computers. Computers execute instructions in linear sequences, while human brains are nonlinear: they rely on parallel processing, handling multiple stimuli simultaneously (1). Human neural networks are also recurrent, meaning that a circuit’s output is fed back into its own input (2). (This is the basis for positive and negative feedback.)
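As a rough illustration only (toy Python with random numbers, not a model of real neurons), the sketch below contrasts a one-way, feed-forward pass with a recurrent update in which the output is fed back in as part of the next input, the loop structure that feedback depends on.

```python
import numpy as np

rng = np.random.default_rng(0)

# Feed-forward: the input flows one way through the layer, and that's the end of it.
x = rng.normal(size=4)            # a toy stimulus vector
W_in = rng.normal(size=(4, 4))    # input weights
y = np.tanh(W_in @ x)             # one pass; no memory of earlier inputs

# Recurrent: the layer's previous output is fed back as part of its next input,
# so the current state depends on the whole history of stimuli -- the kind of
# loop that positive and negative feedback are built on.
W_rec = rng.normal(size=(4, 4)) * 0.1
state = np.zeros(4)
for t in range(5):                                     # five time steps of stimuli
    stimulus = rng.normal(size=4)
    state = np.tanh(W_in @ stimulus + W_rec @ state)   # output loops back in
print(state)
```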

Enough of what we don’t know.

What do we know about how the brain functions and the implications for learning?

Brains are:

  • major consumers of oxygen

  • voracious and require 20%+ of our calories

At a gross level, we know the brain is a major consumer of oxygen because it is metabolically expensive, accounting for roughly 20% of our caloric consumption. Adequate nutrition is needed for proper brain function.

  •  excellent at ignoring sensory information

The brain ignores most of the sensory information that it receives but is acutely sensitive to changes in the sensory channels. The brain would rapidly be overwhelmed if it treated all sensory input as equally important. This means that learning must be compelling if it is to get processing time.

  • extremely limited in short-term memory

The brain has a serious bottleneck in short-term (or working) memory. This is the path into the brain’s cognitive centers.

The brain can hold somewhere between three and nine items in short-term memory (depending on the nature of the items). We feel this limitation when trying to remember phone and account numbers. That is why we break credit card numbers into groups of four digits and write phone numbers as an area code followed by groups of three and four digits (see the short sketch after this list). Lectures filled with facts to remember work against this limited input channel.

  • encoders of sensory information as concepts

The brain does not simply record information from sensory channels. Instead, it encodes what arrives through the eyes, ears, and other senses by creating concepts or forging new links between existing concepts. Learning should center on helping learners develop ideas (concepts) rather than on repeating information.

  • creators of patterns

In relation to cognitive function, we know that the brain is excellent at recognizing patterns (and creating them) and far outstrips any computer system in this regard. Artificial intelligence attempts to imitate brain function by adopting some of the brain’s strategies, such as neural networks. The ideas (concepts) the brain traffics in are really patterns. Consider, for example, the multiplicity of particulars wrapped up in the notion of a car. Helping students recognize patterns needs to be central to teaching and learning.

  • highly associational

The brain creates elaborate networks of association and logical connection between concepts. The “car” concept is linked to related concepts such as wheels, engines, transmissions, fuel, drivers, etc. Teaching and learning require a consistent conscious effort to create logical linkages of ideas into networks.

  • concrete

The brain strives to connect cognitive processing of concepts with concrete items in the physical world through analogies or experiences. Learners and their teachers should strive to find concrete parallels in the physical world to even the most esoteric ideas.

  • excellent at forgetting

I don’t just mean that brains fail to remember. All of us have been let down on that account.

In addition to ignoring most sensory information, the brain erases much of the rest that survives sensory inattention and the limitations of short-term memory. This active erasure is based on the perceived irrelevance of the information, and it happens during deep sleep. The positive side of this process, called consolidation, is that ideas judged important because of their logical and associational connections are strengthened and retained. Failure to consolidate means the instruction is lost. Adequate deep sleep is essential for learning!
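As promised above, here is a minimal sketch of the chunking strategy that phone and credit card numbers already exploit. It is plain Python with an invented card number, not a memory model: breaking a long digit string into groups of three or four lets each group occupy a single “slot” in working memory.

```python
def chunk(digits: str, size: int = 4) -> list[str]:
    """Split a long digit string into short groups to ease recall."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

card = "4111111111111111"        # an invented 16-digit number
print(" ".join(chunk(card)))     # 4111 1111 1111 1111 -> four chunks,
                                 # comfortably inside the 3-to-9-item limit
```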

 

This season we’ll be exploring the 7C’s of Human Cognition. We’ve touched on some of them above. Next time I’ll be blogging and podcasting on Curiosity—which is foundational to all learning.

Until then, if you want to explore another illustration of the sensitivity of the brain to patterns, consider music. This YouTube video makes the point in less than two minutes:

The Human Connectome Project: A retrospective. NeuroImage 244 (2021). https://doi.org/10.1016/j.neuroimage.2021.118543

György Buzsáki https://www.quantamagazine.org/mental-phenomena-dont-map-into-the-brain-as-expected-20210824/

Lisa Feldman Barrett https://www.quantamagazine.org/mental-phenomena-dont-map-into-the-brain-as-expected-20210824/

Brain Function:

(1) https://dictionary.apa.org/parallel-processing

(2) https://dictionary.apa.org/recurrent-circuit

Podcast episodes at:

https://feeds.buzzsprout.com/1817564.rss
