Where did generations really come from?

Worse lenses are easier to believe, but they distort the way we see the world

We're living through an interesting marketing moment. There's growing evidence about what works, what doesn't, and, more recently, how to measure the impact of things we've always intuitively believed to be effective, like branding. As a result, we're beginning to let go of old certainties (like last-click attribution) and to replace tools and abstractions with more robust ones.

In qualitative research, we've always known that our choice of analytical framework, often referred to as a lens (thematic analysis, ethnography, semiotics, etc.), significantly changes how we view things.

However, not every lens or abstraction is useful, and the problem with the bad ones is that they shape how we perceive the world and distort what we're seeing.

The lenses we use absolutely influence our perception of reality. Have you noticed any patterns or groupings in this projection?

Our macro environment is conducive to this

In a country like Brazil - ranked first in falling for fake news, with a very high incidence of functional and digital illiteracy, where more than half the population believes there are hidden cancer cures suppressed by commercial interests, and where sports betting is rationalized as an investment - it's no surprise that simplistic or misleading ideas gain significant traction. We have incredible qualities as a people and a culture, but critical thinking isn't collectively our strong suit.

The Forer Effect and the tricks we fall for because of it

The Forer effect (or Barnum effect) is a cognitive bias that makes people believe that generic personality descriptions apply specifically to them, even though they could apply to almost anyone. You know those Buzzfeed-style quizzes that tell you which Harry Potter house you belong to or which Succession character you are, and you think it's spot-on? Let's understand why that happens.

Bertram Forer, an American psychologist, conducted a very interesting experiment. He administered a personality test to his students, informing them that, based on their responses, they would receive a personalized analysis. Afterward, each student received a written description of their personality - with the crucial detail that everyone received the exact same text.

The description consisted of vague and ambiguous statements that appeared to have psychological depth. Among them were assertions like: "You have a great need for other people to like and admire you," "Although you have some personality weaknesses, you are generally able to compensate for them," and "You tend to be critical of yourself." Then, Forer asked the students to rate, on a scale from 0 to 5, how well the description applied to them. The result was surprising: the average score was 4.26 - meaning the students judged the completely generic descriptions as very accurate. Now do you understand the talk about "Generation Z seeks authenticity"? I've written about this before.

This effect is also known as the Barnum effect because of P.T. Barnum, a 19th-century circus owner and notorious charlatan, supposedly the author of the phrase "There's a sucker born every minute." The effect partially explains why we believe in things like horoscopes, the Enneagram, and other scientifically unfounded attempts to explain personality traits.

Meanwhile, the Big Five (or OCEAN), the most validated model for studying personality, remains relatively unknown outside academia and HR - do all the legwork, get none of the credit?

But according to studies, it's not just vague descriptions, or ones that apply to everyone (like Buzzfeed quizzes), that make these things believable. What else makes us believe:

  • Seeing the source as an authority - academics, scientists, celebrities, and large companies all have an innate ease in pushing nonsense onto others. Several notorious charlatans have master's degrees and PhDs, thousands of followers, a lot of money, or all of the above.

  • When the characteristics mentioned tend to be positive

  • When we attribute personal meaning to what we're receiving, or filter the truth of what we're hearing through our individual experiences

You know what we hear about daily that ticks all these boxes, and adds on top the magical ingredient of tribalism and rivalry between groups that makes so many things go viral on social media (think remote vs. in-person work)? Generations.

Generations: A timeline of how a poorly structured idea from the U.S. took over our feeds and became global

A timeline of how generations became the standard framework for reading society:

Germany, 1928

Karl Mannheim published "The Problem of Generations," proposing to group individuals of similar age because they share experiences during their formative youth period. For him, chronological age is insufficient: identity arises from active involvement in those events, which influences values and behaviors. He also accounted for fundamental differences in class, culture, and location, and was transparent about the limitations of the theory he proposed - unlike most of what followed.

United States, 1991

In the book "Generations," playwright William Strauss and historian Neil Howe divided the United States' generations into four archetypes (the very idea is more comparative mythology than science - sorry, Jung fans and branding folks!): Artists, Prophets, Nomads, and Heroes, linked to the four generations alive in the country at the time: Silent, Baby Boomers, X, and Millennials. The term Millennial itself is Neil Howe's creation.

Based on this not-so-solid foundation, they extended these same archetypes to explain the past and predict the future, suggesting that they repeated and would continue to repeat throughout American history - a simplistic and pretentiously prophetic historical determinism, more Nostradamus than Hobsbawm. The book's subtitle is "The History of America's Future from 1584 to 2069."

If this sounds more like the Wheel of Time to you than a theory that truly explains what it ambitiously aims to, you're not alone. There's a detailed critical analysis discussing the theory's limitations and omissions here - the more politically inclined will note that Steve Bannon is one of the defenders of their ideas.

United States, late 1990s

The American media embraces the idea of generations, and soon stereotyping and criticism of youth disguised as concern for the future (a historically recurring pattern) grace the covers of magazines like Time.

Worldwide, 2000s - What was once only for the U.S. becomes global

At a certain point, given the sheer influence of marketing as an industry in the U.S., the idea of generations is exported as if it applied worldwide. Thanks to globalization (at the time an unstoppable force, though the story is different today) and the internet, the promise was that our major historical milestones and life circumstances would be increasingly shared from then on - something that's not even completely true within the U.S. itself.

In the 2020s, criticism finally begins to emerge

Huge generalizations based on tiny samples, varnished with beautifully designed carousels and PPTs with misleading hooks, became a constant. Many repeat about Generation Z, the new main characters, exactly the same nonsense that was said about Millennials in the 2010s: they "seek authenticity," "value self-expression," "are digital natives," "are more conscious" - not coincidentally, characteristic traits of youth as a whole, not of any specific generation.

Various scholars and institutions began publishing studies and critiques on the subject. In 2021, a group of demographers sent an open letter to the Pew Research Center - perhaps the organization most responsible for popularizing the idea of generations worldwide, and one of the leading authorities on public opinion in the U.S. - stating that it was an arbitrary, unscientific idea that hinders serious research on the subject. Pew relented and committed to almost completely eliminating the criterion, except in cases where different generations can be compared at the same life stage - and they are one of the few organizations worldwide with enough historical data to make such comparisons. If they, the idea's main propagators, abandoned it because the criticisms are legitimate, what are we waiting for?

From HBR to BBH Labs and Mark Ritson, the criticism began to move beyond the social sciences and reach the market. A common point among them all is that similarities within a generation are minimal, and that insisting on them creates false stereotypes and factoids. HBR's article even states that stereotypes change our behavior.

Here in Brazil, Superinteressante (a major pop science magazine) published a scathing article, but the silence in the research and insights market is deafening, with rare exceptions, including someone in my network who brilliantly summarized the problem: "People aren't born in vintages." No one wants to be the killjoy who spoils the fun? The client is always right? Anything for engagement?

Is it still possible to give the benefit of the doubt and treat the insistence on an imprecise, unscientific, and arbitrary criterion as ignorance, not malice?

The problem is there’s a harsh human truth here — we only condemn pseudoscience and fake news when others believe in them and when we’re not emotionally or financially invested in them, or when they’re not part of our identity. That’s how conspiracy theories are born. There’s even an episode of The Simpsons about this — the feeling of belonging and community weighs heavily. We had a global demonstration of this phenomenon with the rise of anti-vaccine movements during the pandemic.

That’s why understanding cognitive biases and fallacies is fundamental for researchers, strategists, and marketers: to recognize flaws in one’s own reasoning. Are generations the marketing world’s favorite flat earth theory?

Why abandon generations as a criterion?

1 - The central premise treats historical context as far more universal and determining than it actually is. Ironically, Mannheim, one of the pioneers, took this limitation into account, but most of what came after did not.

2 - We ignore far better criteria, backed by decades of serious research, which focus precisely on what changes less. Adolescence, middle age, and other life stages are studied culturally, socially, and psychologically in established fields like developmental psychology. Unlike generations, age brackets tend to be more stable: they're linked to things like brain development, socialization, and social roles at different life stages - things that vary much less over time and are therefore far more enduring. But we ignore all that to listen to Joe Blow, LinkedIn's Gen Z specialist, whose repertoire basically revolves around his social circle. But hey, his carousel posts are soooo cool!

3 - It assigns to generations characteristics that actually belong to life stages. We're almost always very different people at 20 and at 40. All those studies showing generational slices mislead readers - for example, by portraying the search for stability and risk aversion as traits of Gen X and Boomers rather than of older people in general (were they like that in their 20s too?), or by emphasizing idealism in Gen Z - will they really stay the same when they're older?

4 - It’s a particularly bad lens for understanding children and youth. For example, in 2025, Pew Research’s definition of Gen Z covers ages 13 to 28 — meaning it spans puberty through the average age at which Brazilian women have their first child! What similarities could a group that broad possibly have, when every 2-3 years people change so much? Not to mention differences in income, education, culture, lifestyle, etc.

On the other hand, working with a global leader in youth entertainment, I was struck by the level of detail they had in segmenting products by life stage (early and middle childhood, tweens, teens, young adults, etc.), sometimes with smaller subdivisions, always highlighting what makes each period unique - cognitive and behavioral issues - with incredible detail that undoubtedly helps explain the success of almost everything they do. Nuances matter!

5 - “Digital nativity” is a flawed concept that the idea of generations amplifies, ignoring the impact of purchasing power and other variables on tech adoption - always assuming older people are necessarily less able or interested in using new tech. Ageist and worrying in a rapidly aging world, including Brazil.

6 - If we’re talking marketing, the primary criterion is the relationship with the category - which generally best predicts how people will act. Meaning, if we want to understand how people consume coffee, the most important things should be the specifics of that relationship: frequency, what types they buy, how they prepare it, and so on. If I prefer a moka pot over drip coffee, it’s unlikely my generation has anything to do with that.
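For the data-inclined, the contrast can be sketched in a few lines of Python. Everything below is invented for illustration (hypothetical respondents, a hypothetical `behavioral_segment` rule): the point is simply that a behavioral criterion groups people who act alike, regardless of birth year.

```python
# Hypothetical sketch: segmenting consumers by their relationship with the
# category (here, coffee-drinking behavior) instead of by birth year.
# All respondents and thresholds below are invented for illustration.

respondents = [
    {"birth_year": 1968, "cups_per_day": 3, "method": "moka pot"},
    {"birth_year": 1999, "cups_per_day": 3, "method": "moka pot"},
    {"birth_year": 1975, "cups_per_day": 0, "method": None},
    {"birth_year": 2001, "cups_per_day": 1, "method": "drip"},
]

def behavioral_segment(r):
    """Classify by usage intensity - the kind of criterion that actually
    predicts category behavior (thresholds are illustrative)."""
    if r["cups_per_day"] == 0:
        return "non-user"
    return "heavy user" if r["cups_per_day"] >= 3 else "light user"

for r in respondents:
    print(r["birth_year"], "->", behavioral_segment(r))
```

Note how a respondent born in 1968 and one born in 1999 land in exactly the same behavioral segment: birth year tells us nothing about their coffee habits, while usage data does.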

How long will those controlling advertising budgets, strategy, and branding keep applauding nonsense and treating groups of millions as if birth years were destiny and as if our values and choices were just products of the historical context? How long will marketing folks, agencies, and unfortunately even researchers - who should be leading this debate - keep claiming to be “data driven” while insisting on a criterion that is proven to be one of the worst for understanding humans in groups? In a time of so much change, wouldn’t it be more rational to focus on what changes less and has more consolidated knowledge?

Not to mention how this insistence punishes intellectual honesty, which should be a more encouraged trait in our field. If we remove from the room those who are transparent about blind spots and methodological limitations, who’s left?

If you’re feeling bad after reading this — maybe reflecting on generalizations you made, decisions you took, or studies you’ve been involved with — you’re not alone! Let me extend a hand. Since childhood, I’ve always been curious about personality, behavior, culture. On long car trips, I would quiz adults about how people from other countries were and what the future would be like. In elementary school, I devoured books on mythology and astrology, and for a science fair project, my presentation was casting people’s astrological charts using a borrowed laptop. No, no teacher objected to an astrology presentation at the science fair — embarrassing! I’ve worked in research for over 20 years, and I too once believed generations might be an interesting angle to understand consumer trends and made presentations about it for clients, much earlier in my career. But I never stopped studying the subject (and I try to read viewpoints opposite to mine) and here I openly admit: I was wrong before! Because that’s what growth is, right? Changing your mind when you find better arguments and evidence.

On the other hand, if you radically disagree with me, I’d love to hear your arguments - comments are open - but for a fair conversation, bring data. Don’t bring a knife to a gunfight.

To conclude: you know the drill - next time someone tries to sell you a perspective “through the generational lens,” remember that lenses, literal or metaphorical, shape what we see and can create terrible distortions. And in our info-overloaded world, seeing poorly is a luxury we can’t afford.

If it’s just for fun, maybe it’s better to read Susan Miller (huge in Brazil BTW!): she’s a brilliant storyteller, but you won’t use that to justify strategic decisions about brands or people. And more: she figured out that no one comes back to check if last month’s forecast was right - sound familiar to some content you read out there? It’s time to separate entertainment from understanding.

But Rodrigo, everyone uses it. You want me to stop? To correct others?

One step at a time. We once believed the Earth was the center of the universe, thought it was normal to drive without a seatbelt and to smoke (or be smoked) indoors. Change starts with doubt, then a conversation, an argument, a choice. This text is just an invitation. Like when we calmly explain the value of qualitative research to those who ask, “Can I trust something done with 12 people?” And as your mom would say: you’re not everyone. That’s why you got this far. Thanks for reading till the end!