Your brain lives 15 seconds ‘in the past’ to help you see the world with stability

BERKELEY, Calif. — People often accuse others of living in the past, but it turns out everyone is — by exactly 15 seconds! Researchers from the University of California-Berkeley have discovered that the human brain shows you images from 15 seconds in the past, instead of trying to update your vision in real-time.

Just like a social media feed, our eyes constantly deliver new and rich visual stimuli to the brain. However, to keep everything our eyes take in every second of every day in order, the study finds the brain actually presents us with an image from 15 seconds earlier.

The findings provide new insights into what scientists call the mind's "continuity field," a function of perception in which the brain smooths together what our eyes see from moment to moment to provide a sense of stability. Without it, study authors say, the world would appear as a blurry jumble before your eyes.

“If our brains were always updating in real time, the world would be a jittery place with constant fluctuations in shadow, light and movement, and we’d feel like we were hallucinating all the time,” explains study senior author David Whitney, a UC Berkeley professor of psychology, neuroscience, and vision science, in a university release.

“Our brain is like a time machine. It keeps sending us back in time. It’s like we have an app that consolidates our visual input every 15 seconds into one impression so we can handle everyday life,” adds study lead author Mauro Manassi, an assistant professor of psychology at Scotland’s University of Aberdeen and former postdoctoral fellow in Whitney’s lab at UC Berkeley.

Our brains always return to an older image

Researchers studied this time-traveling effect of the brain by examining the mechanisms of “change blindness.” This is when people fail to notice subtle changes which take place over time, like when a TV show switches out an actor for their stunt double.

The team recruited nearly 100 people to view close-up images of faces morphing in age and gender. The time-lapse videos lasted just 30 seconds and showed only a person's eyes, brows, nose, mouth, chin, and cheeks, not their head hair or facial hair.

(Image: Time-lapse videos of faces morphing from young to old and male to female demonstrate how the brain lags when processing visual changes. Courtesy of Mauro Manassi)

When researchers asked each participant to identify the face they saw in the video, the group almost always picked a frame from roughly halfway through the time-lapse, rather than the most recent image.

“One could say our brain is procrastinating,” Whitney says. “It’s too much work to constantly update images, so it sticks to the past because the past is a good predictor of the present. We recycle information from the past because it’s faster, more efficient and less work.”

Living in the past is not always a good thing

Although this visual “lag” generally has a positive impact on how people perceive the world around them, not seeing things in real-time can have its drawbacks as well.

“The delay is great for preventing us from feeling bombarded by visual input in everyday life, but it can also result in life-or-death consequences when surgical precision is needed,” Manassi explains. “For example, radiologists screen for tumors and surgeons need to be able to see what is in front of them in real time; if their brains are biased to what they saw less than a minute ago, they might miss something.”

Despite these potential issues, Whitney says the continuity field and its effect on perception is just an interesting example of “what it means to be human.”

“We’re not literally blind,” Manassi concludes. “It’s just that our visual system’s sluggishness to update can make us blind to immediate changes because it grabs on to our first impression and pulls us toward the past. Ultimately, though, the continuity field supports our experience of a stable world.”

The study is published in the journal Science Advances.
