

How much of human intelligence is genetic versus acquired? Is it even possible to get smarter?

Intelligence, a blend of genetics and environment, centers on problem-solving abilities and is best enhanced through learning and formal education.

Tibi Puiu
May 20, 2024 @ 6:22 pm


AI-generated image. Credit: DALL-E 3.

Is intelligence a gift of nature or a product of nurture? This question has intrigued psychologists for more than a century, leading to extensive research and debate. The allure of enhancing intelligence has fueled a bustling, multi-billion-dollar 'brain-boosting' supplement market for those seeking cognitive enhancements. Yet the effectiveness of these products remains dubious; most lack regulatory approval and scientific backing.

Our society reveres genius and stigmatizes perceived lesser intellects, which helps explain why some people are obsessed with becoming smarter. But can intelligence be increased? Is it genetically predetermined, or does it wane with age? What does 'intelligence' mean in the first place? Let's dive into what the latest science has to say about all this.

What is intelligence anyway? What about IQ?

There is no clear-cut definition of what constitutes intelligence. The consensus among scientists in this field is that intelligence refers to a set of mental abilities for problem-solving. There are many subsets of intelligence, including verbal ability, numerical ability, spatial ability, and even “emotional intelligence” (the ability to manage both your own emotions and understand the emotions of people around you).

However, the most important subset seems to be “general intelligence”, also known as the “g factor” among intelligence researchers. This factor plays a major role in differentiating individuals in standardized IQ tests, contributing to at least half of the variance observed in these tests’ scores. The g-factor is closely related to fluid intelligence, which involves problem-solving and reasoning skills.

The first attempts to quantify intelligence date back to the 1800s. Sir Francis Galton, an English polymath, pioneered quantitative methods to study intelligence. He was the one who coined the term ‘Nature vs Nurture’ in 1874, for a debate that persists today: Are smart people mostly born this way, or is this ability acquired through life?

Another key figure, Charles Spearman, proposed a 'General Intelligence' factor in 1904, suggesting a single underlying cognitive ability. Around the same time, psychologist William Stern introduced the concept of 'intelligence quotient' (IQ) as a comparative measure of intelligence. Meanwhile, psychologists Alfred Binet and Théodore Simon developed intelligence tests for children, which eventually led to the standardization of IQ tests.

However, these tests were not without criticism. Even Stern, the originator of the IQ concept, recognized their limitations, emphasizing the inherent value and complexity of individual psychological lives.

In 1963, psychologist Raymond Cattell challenged the idea of a singular intelligence factor, splitting it into two kinds of ability: fluid and crystallized intelligence. Fluid intelligence is the capacity to reason and solve novel problems, while crystallized intelligence is intelligence manifested through the use of knowledge previously acquired through education or experience.

How accurate are IQ scores?

Do IQ scores accurately predict general intelligence? Critics argue that factors like socioeconomic status can skew IQ scores, potentially misrepresenting innate intelligence.

However, the consensus among experts in the field of intelligence research is that IQ is a good proxy for intelligence. Studies show that high-IQ people are more educated, achieve better grades at school, have higher incomes, are healthier and happier, are less likely to develop an addiction, and generally have higher social status than those with lower scores.

“The question is, does an IQ test that is administered to a young child predict these later life outcomes that we believe are indicative of intelligence? The answer to this question is an overwhelming ‘yes.’ IQ tests have enormous predictive capacity,” Louis Matzel, an expert from Rutgers University, wrote in a MetaFact review.

However, there are caveats. IQ tests generally fail to capture broader aspects of intelligence such as creativity, practical skills, emotional and social intelligence, and wisdom (common sense).

Intelligence is mostly genetic

Intelligence is undeniably influenced by genetics, yet environmental factors also play a significant role. Twin and adoption studies reveal that genetic influence grows with age, and by adolescence, a significant portion of IQ differences can be attributed to genetics.

“There is a tremendous amount of empirical research clearly showing that intelligence is hereditary,” Dimitri van der Linden from Erasmus University Rotterdam responded to a MetaFact review. “The estimates range somewhere between .50 to .80 (and are likely closer towards the latter).”

However, the expression of genes is shaped by environmental factors, and small genetic predispositions can be influenced by nurturing environments. This interplay between genes and environment shouldn’t be underestimated or overlooked.
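To make the heritability figures above concrete: a heritability of, say, 0.60 means that about 60% of the variation in a trait across a population is attributable to genetic differences, with the rest coming from environment. The following is a minimal, purely illustrative Python sketch using synthetic numbers (not real data), assuming the variance-partition model behind those estimates:

```python
import random

# Illustrative only: simulate a trait whose variance splits into a genetic
# part and an environmental part, then recover heritability (h^2) as the
# share of total variance that is genetic.
random.seed(42)

H2 = 0.6        # assumed heritability, within the .50-.80 range quoted above
N = 100_000     # simulated population size

# Independent genetic and environmental contributions, scaled so that the
# genetic variance is H2 and the environmental variance is 1 - H2.
genes = [random.gauss(0, H2 ** 0.5) for _ in range(N)]
envs = [random.gauss(0, (1 - H2) ** 0.5) for _ in range(N)]
trait = [g + e for g, e in zip(genes, envs)]  # deviation from the mean

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

est_h2 = var(genes) / var(trait)  # recovered heritability estimate, ~0.6
print(round(est_h2, 2))
```

This is only the simplest additive model; real behavioral-genetic estimates come from twin and adoption designs and must also contend with gene–environment interplay, which is exactly why the paragraph above cautions against overlooking it.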

Can you improve intelligence? Yes, but not with gimmicky brain games

Contrary to the belief that intelligence is fixed, research shows it is subject to change. The 'Flynn effect' describes a steady rise in average IQ scores over the 20th century, associated with improvements in education. Additionally, schooling correlates with increased intelligence: every year of education adds roughly 1 to 5 IQ points. So, yes, technically it is possible to raise your IQ score.

These effects are explained by the brain’s neuroplasticity — its ability to reorganize and form new neural connections throughout life, allowing it to adapt in response to learning, experience, or injury.

 “The brain is neuroplastic and keeps changing,” argued Gavin Brown from the University of Auckland in a MetaFact thread. “Stimulus from environments which requires flexibility in processing and exploitation of schematized structures keeps the brain active forming new paths.”

However, brain-training games (the kind you see aggressively advertised all over the internet) do not work as promised. The most you can hope for from these games is to get better at their specific, limited tasks, which is true of anything you practice. There are no general benefits. Nor is there evidence that brain games help stave off dementia, as some companies claim.

“Meta-analytic reviews of the empirical literature indicate either tiny or absent gains,” Nachshon Meiran from the Ben-Gurion University of the Negev told MetaFact. “In my opinion, given what we know, it is unfair (or worse) to promise otherwise.”

Focused cognitive training may temporarily boost specific abilities, but these effects often fade. The most substantial gains in intelligence appear to stem from formal education, which exercises the mind and brain through the acquisition of real knowledge.

This article appeared in January 2024 and was recently updated with new information.

