
“Gens”: Generation, Generosity, Genocide

 

by Roman Sympos

 


Part 2: Generation

 

 

People try to put us d-down
Just because we g-g-get around
Things they do look awful c-c-cold
I hope I die before I get old

                                                      --The Who

 

When I was a boy of 14, my father was so ignorant I could hardly stand to have the old man around. But when I got to be 21, I was astonished at how much the old man had learned in seven years.

                                                      --Mark Twain

 

 

As we saw last month, individuals and the groups with which they identify come to understand who they are by making sure they all understand who they’re not, namely, an “Other” that corresponds to an abstraction or amalgam of undesirable traits and tendencies that cannot exhaust the infinite complexities of any single specimen of a human being. “We’re not like that” is where the journey to “We are like this” begins.


In most cases, as with hobbies, amateur sports, professional organizations, or avocations generally, this process of negation takes the relatively benign form of “not knowing”: the out-group doesn’t “know that” or “know how,” while the in-group does.


The gens ("family" or "tribe") alone anchors “Othering” in a genealogy that traces the origins of the group-that-is-not-Other to a single blood ancestor or family of progenitors. This entails a process of renewal and reconnection because, in order to perpetuate itself, the gens must produce new members to replace those who have “passed on” and at the same time get each of them to identify with their genealogical predecessors, beginning with their parents. The gens has deposited all the tools necessary for this task—all the rituals, codes, signs, and symbols—in a single cultural tool kit: the wisdom of the elders.


We’re talking about enculturation, otherwise known as “brain-washing.”


Of course, with kids, you’re really talking about brain-writing, since they’re pretty much born with the proverbial tabula rasa or “blank slate” firmly lodged between their ears. There's nothing on it to wash off.


Nowadays we know there’s more to that slate than just a flat, thin slab of sedimentary rock. The brain offers a much more sophisticated and receptive surface for cultural inscription than that. What gets inscribed on the blank brain is what we call Mind. The hand that inscribes it belongs to adults and the language in which it is inscribed is the language of their family, clan, tribe, or nation.


But with the adults wielding not only the upper but often the only hand busily at work writing the cultural Ur-text of Mind on every child’s blank brain, how do successive generations ever get to the point of wanting to differentiate themselves, as groups that are chronologically distinct iterations of a common bloodline, from those coming before and after?


Clearly, children, as individuals, know what and who they are not: adults, whether living or dead. But how does that negation lead to the further negation that allows children—and now we’re talking about adolescents, teens, and young adults—to think of themselves as a distinct cohort, defined by age and date of birth (the Beats, the Boomers, X, Y, Z), set apart from the generations preceding them, including their parents and grandparents? After all, that’s where the power lies: the power of autonomy, of authority, of knowledge handed down from time immemorial, not to mention physical size and strength. Why would any child cling to the notion that they belong to a generation distinct from that of the omnipotent and omniscient adults? Why wouldn't they want to join the club of grown-ups as soon as possible?


In premodern times and societies, they did, and still do. Premodern societies do not distinguish successive generations by giving them group names and identities. There are children and there are adults and there's a process for turning the former into the latter, once they've reached a certain age.


The meaning of “premodern” has been much debated since it was coined by Reinhart Koselleck in 1970. Roughly (very roughly), it’s the opposite of what we used to call “primitive” (no longer PC) or, in Cold War parlance, the “Third World” (the “undeveloped” or “developing” countries). Modern societies and economies call themselves (immodestly) the “First World.”


Signs of “modernity” include, among other things, economies based on industrialization, large-scale monetary investment, the division of labor, and the commodification of goods; bureaucratized governments that gather information, distribute tasks based on certified expertise, and enforce explicit rules of law rather than implicit norms of behavior; and a scientific worldview validated by rapid technological advance.


But however we define the term, all modern societies are marked by their tendency to erode the power of the gens at the family and community levels by redistributing it in larger units at the level of town, state, or nation. Power moves from the gens to the government, from the sphere of what German sociologists call Gemeinschaft (groups bound by common interests, practices, and values) into the sphere of Gesellschaft (groups bound by rules devised to balance competing self-interests). The first is intimate—in the words of the theme song to that old TV show “Cheers,” it’s the place where “everybody knows your name” (and a good thing, too!). The second is impersonal—privacy is highly valued (you don’t want strangers to know your name), money talks louder than your word, and lawsuits take the place of working things out over the back fence or a shared meal.


In short, premodernity puts the interests of the community first, the rights of the individual second. Modernity reverses this distribution of value. As a result, kids in modern societies tend to see grown-ups, including their parents, merely as powerful individuals, not as authorized and legitimated keepers of the gate to adulthood. In premodern societies, as a rule, the keepers of the gate are understood, without question, to be the elders of the family, clan, or tribe, and that gate is clearly demarcated. It’s the rite of initiation.


We have vestiges of initiation ceremonies in modern societies, coexisting happily with rampant depersonalization, proliferating litigation, and the commodification of anything that moves. But these rituals are, by and large, merely ceremonial and non-binding. They may, as in the Bar or Bat Mitzvah, represent something real and meaningful to the inductee’s family, relatives, and local synagogue (their gens), and even impose some serious requirements, like learning Hebrew. But the child who eventually decides Judaism is not for them will not be ostracized or banished by family or faith, unless, perhaps, that family and faith are Orthodox or Ultra-Orthodox. In many households, there is nothing resembling such a ceremony.


The initiation ceremony allows the initiate—literally, the naïve and unknowing “beginner”—to join the ranks of the knowing and powerful. Like baptism, which symbolizes being “born again” from the life-giving waters of the womb, it bestows upon the child a new identity, in this case, one that is gendered and adult, rooted in but different from its pre-sexual and dependent one. The initiation ceremony invests the child with an authority and power ballasted by real-world responsibilities (usually symbolized by a difficult trial, test, or ordeal) for perpetuating and sustaining the gens. In patriarchal versions of premodernity, this means, for women, bearing and caring for children, and for men, hunting for food and defending the gens from enemies.


There are no such obligations, as a rule, imposed on children in a modern society, nor are there any universally recognized rituals of maturation for imposing them. (High school graduation may be the nearest thing.) In a polity that values the individual over the gens—freedoms over obligations, competition over cooperation, unearned autonomy over earned authority—children are left to devise their own coming-of-age rituals, which they do with other children, their peers. After all, they’ve been told by adults, at least since 1972, when the song was first released, that they’re all “free to be you and me.”


There’s much to be said for modernity’s respect for individual freedom. When genuine and not used as ideological cover for special interests (as in, “We’re a Christian nation, so religious freedom applies only to Christians”), it has alleviated a great deal of suffering among those who don’t fit the roles traditionally prescribed for its members by the gens, e.g., for the transgendered, the differently abled, the non-religious or religiously anomalous, the professional woman. But individual freedom, if pursued too relentlessly, brings its own set of anxieties, sometimes bordering on panic. When you are not given, by those with the know-how, the tools to construct your own identity or a set of plans to follow, you have to make your own. It’s the equivalent of trying to build yourself a place to live with a stone ax, a crayon, and some pieces of driftwood just as the leading edge of a thunderstorm starts to peek over the horizon.


The rituals and codes and costumes and ceremonies that children in modern societies use to turn themselves into adults—or try to—differ from one generation to the next, but bear a striking resemblance that spans the last three or four centuries at least (that’s how long modernization has been gaining ground in one part of the world or another): they all affirm the autonomy—the “adulthood,” if you will—of the members of the younger cohort not by adopting or imitating the practices of their autonomous elders, but by shutting out, flouting, defying, mocking, or blatantly transgressing those practices. Which is to say, by negating them.


Rebellion is the most appropriate coming-of-age gateway for children in a society that prizes individual freedom and autonomy over the gens. Perhaps it is the only one possible. In contrast to the initiation trial facing premodern children, which tests their ability to shoulder the responsibilities of adulthood, the modern child must demonstrate that it is, like its elders, a free and autonomous individual, and thus capable of making the only choice that a society with such values will recognize as legitimate: the free choice to become one of them. Anything else would look like capitulation, a loss of autonomy. The child, in short, must refuse to obey to win acknowledgement of its ability to choose to obey. As we’ll see in a moment, modern children usually make the right choice, at long odds far in excess of what true freedom of choice would predict and, moreover, without realizing they’ve done so—that is, while still thinking of themselves as not like their parents.


Marlon Brando, playing Johnny Strabler, leader of the Black Rebels motorcycle gang in The Wild One (1953), spoke for his entire R&R cohort of pot-smoking juvenile delinquents, bongo-bopping Beats, and postwar gang-bangers generally when he answered the question posed by one sweet-sixteener, “What are you rebelling against?” with the reply, “Whadayagot?”


Wars are particularly fertile epochs for spawning generational solidarity through negation, first, among young surviving combatants and secondly, among the combatants’ offspring. World War II’s “Greatest Generation” negated the horrors of combat by burying them under piles of consumer goods, affordable houses, and kids. The ordeal legitimizing their coming-of-age was the War itself. The kids who grew up under war conditions but never saw combat, like Johnny, negated their parents’ bland, boring “return to normalcy” by rebelling against nothing more than their elders’ complacency, but the next generation, the postwar Boomers, who were born in the course of this prolonged Pax Americana, found a cause, in fact, quite a few, to underwrite their rebellion: Civil Rights, Anti-Imperialism, Free Speech, and the topper, Vietnam.


There is nothing like a cause to energize a generation’s sense of autonomy and self-direction and dismantle the authority of the elders. A cause entitles the younger generation to pull rank on the adults by pointing out where they got it—and are still getting it—wrong. It helps if the adults being targeted are hypocrites, which would include most of the human race. The Lost Generation of the 1920s had the Great War to hold against their elders, the Boomers Vietnam. The Zs have Gaza. We’ll return to them, and their cause, when we get to our last “G,” Genocide, in Part 4.


Generational identification didn’t begin with the Boomers, and it doesn’t always need a cause. The Lost Generation of disillusioned ex-combatants who survived World War I had to put up with the antics of a younger, less burdened cohort, the playboys and flappers of the Roaring Twenties, who never saw combat but knew how to get a rise out of their elders by wearing short skirts and saddle-shoes and dancing the Charleston unchaperoned. A century before that, during the Napoleonic Wars, the Dandy debuted on Britain’s home front. Unhappy with the mess his elders were making of the world, this young man expressed his disdain by wearing fake military regalia and “borrowing” passenger coaches to tear around the streets of London like a Hessian cavalry officer—an early version of joy-riding.


As our editor and guest essayist, Charles Rzepka, writes in this issue’s "Essay of the Month," it was in the long shadow of rapid industrialization and rural depopulation following the Civil War that thousands of deracinated young women left their family farms and rural communities to take clerical and sales jobs in the booming economies of big cities like Chicago and New York, far from the normative gaze of parents, siblings, and community of origin. They became known as New Women. Among them, as our essayist observes, we can count L. Frank Baum’s Dorothy, savior of Oz.


But if each new generation in a modern society is busy defining itself in opposition to those that came before—aka, “the old fogies”—how can the gens ever perpetuate those fundamental norms, values, and rituals that enable it to survive not just biologically, but culturally? It can, by letting each oncoming generation of adult aspirants negate them in the effort to affirm its own autonomy. Negation doesn’t mean the utter destruction or annihilation of what's negated. What you resist, persists, usually stronger than ever and in direct proportion to the magnitude of your resistance. You can cast out others, but you can’t cast out the Other, which, as we saw in Part 1, is an indispensable and intrinsic part of your identity as a member of the group. It lurks deep inside you, ignore it as you will, and shapes your worldview and behavior even as you deny, vociferously, that it does or can. For the rebellious adolescent, even death may be preferable to becoming a grown-up (or seem so). “I hope I die before I get old,” sang The Who’s Roger Daltrey. But his hopes were dashed. On March 1 of this year, he turned 80, and while he’s still rockin’ and rollin’, there’s no mistaking the fact that he’s turned into a mature, well-behaved, and law-abiding adult.


Like Daltrey, most teen rebels eventually “come round right,” to quote the old Shaker hymn, and become one or another version of their parents, usually by their early twenties. The recent series of TV ads plugging Progressive Insurance to post-Progressives (I’m sure Progressive’s ad agency appreciated the irony) makes this point with wry humor: what could be less youthfully exhilarating and reckless or more stodgy, parental, and risk-averse than the idea of buying insurance? As the ads suggest, it’s about on a par with hosting a dinner party or showing off your new gas grill. Progressive’s “Dr. Rick” is there to help young adults cope with creeping “parentamorphosis.”


I say “cope” because there is no cure, and chances are good you won’t even notice you’ve succumbed. Mark Twain, to his credit, did notice, and what’s more, noticed (eventually) that he really hadn’t—that is, he noticed that he’d managed to ignore noticing. It wasn’t until many years later, in describing his contrasting impressions as a boy of 14 and a young man of 21, that he implicitly acknowledged what I’m describing as the work of negation, which is the projection of what cannot be accepted in oneself onto the straw man of the Other, in this case, his father—all without noticing he’d done so. From the young man’s perspective, it’s not he who’s come round right, but his dad.


Twain’s quotation is apocryphal. His father died when he was eleven, and if Twain did write or say anything of the sort, it must have been in the voice of an imaginary character (who has yet to be discovered in anything he wrote). That only makes the depth of his wisdom more remarkable and aptly proportionate to the vigor of his imagination—which is to say, his ability to imagine himself as another.


And that’s a topic, the sympathetic imagination, with which we’ll begin next month, when we turn our attention to “generosity.”
