So in this series we’ve picked up the whole idea of idol pop like we would a Magic 8 Ball, jiggled it and tried to see what answers it would give us, tried to pry it open and look into its inky insides. We’ve looked at how idol pop became a trend that spread from Britain to the United States to Japan to South Korea to Kazakhstan; how it becomes intertwined with social and linguistic self-assertion, and can be used in the service of nationalist soft power; how it draws from some traditions of gender expression, and runs counter to others; how it might give its most successful performers (and, more often, their bosses) room to indulge in their worst impulses and get away with it. We’ve discussed whether idol pop is harmful just by the nature of its shallowness, or by the way it seems to encourage fans to make servants of their idols. And we’ve also looked at Ninety One, a very particular collection of serious idol performers who are also frequent on-camera goofballs, and bilingual Kazakhstani nationalists, and potential subtle critics of the Nazarbayev regime. We’ve taken this analysis just about as far as anyone has the patience for it.
But we still haven’t gotten to why. We still haven’t explained what it is that prompts some people to pursue a career in idol pop in the first place, and others to attend their shows and replay their videos. Idol pop is silly, often thoroughly so, and expensive, and difficult; a great deal like professional sports, without the twin advantages of direct competition and easily quantifiable measurements of progress and skill. So why pay it any mind at all?
Well, you ask cynically, why do anything? And that actually gets us somewhere. I would reply that idol pop, as silly and shallow and sparkly as it is, emerges from the same drive that produces buildings, and policy papers, and microbrews, and novels whose authors could not be more disgusted by the idea of silly sparkles, and children, and statues of Ozymandias, King of Kings. Idol pop is a hedge against death.
“The idea of death, the fear of it, haunts the human animal like nothing else,” wrote Ernest Becker in his 1973 book The Denial of Death.1 I picked it up after hearing it recommended several times for thinking about human mortality, although I should warn you that if you’re not already familiar enough with the works of Freud, Kierkegaard, and Otto Rank to give them thorough and respectful critiques, as Becker does, it can be tough going.2 Becker is partly concerned with the future of psychoanalysis, and partly with the future of science, but mainly at pains to demonstrate that nothing motivates and terrifies any human being more than death.
By death I do not mean dying; it doesn’t matter, for Becker’s purposes, whether the death is sudden and violent or peaceful. The terror comes from the prospect of nonexistence; in fact, the certainty of it; and worse, the simultaneous knowledge that one lives, and one will die. As Becker puts it:
Man has a symbolic identity that brings him sharply out of nature. He is a symbolic self, a creature with a name, a life history. He is a creator with a mind that soars out to speculate about atoms and infinity….
Yet, at the same time, as the Eastern sages also knew, man is a worm and food for worms. This is the paradox: he is out of nature and hopelessly in it; he is dual, up in the stars and yet housed in a heart-pumping, breath-gasping body that once belonged to a fish and still carries the gill-marks to prove it. His body is a material fleshy casing that is alien to him in many ways–the strangest and most repugnant way being that it aches and bleeds and will decay and die. Man is literally split in two: he has an awareness of his own splendid uniqueness in that he sticks out of nature with a towering majesty, and yet he goes back into the ground a few feet in order blindly and dumbly to rot and disappear forever. It is a terrifying dilemma to be in and to have to live with.
The terror of nonexistence is such that it makes life itself terrifying: for to live more greatly is to feel the distance between the god-self and worm-self more greatly, to lose more upon certain death. There has to be something to ease this constant terror. One solution is, simply, to live less. “Modern man is drinking and drugging himself out of awareness,” Becker writes acidly in the last chapter of The Denial of Death, “or he spends his time shopping, which is the same thing.” If you’ve ever reproached yourself for doing some time-wasting activity, or for being too much in your own head and not “present” enough, consider the possibility that “being present” brings with it the threat of having to become more aware of your own mortality.
Another is to fight the mortal quality of the worm-self, to invest in what Becker calls “heroism” and transcend death. The methods of heroism vary by time and place, but all societies have them, for it is impossible for people to function without them:
It doesn’t matter whether the cultural hero-system is frankly magical, religious, and primitive or secular, scientific, and civilized. It is still a mythical hero-system in which people serve in order to earn a feeling of primary value, of cosmic specialness, of ultimate usefulness to creation, of unshakable meaning. They earn this feeling by carving out a place in nature, by building an edifice that reflects human value: a temple, a cathedral, a totem pole, a skyscraper, a family that spans three generations. The hope and belief is that the things that man creates in society are of lasting worth and meaning, that they outlive or outshine death and decay, that man and his products count.
The building is easier to do when everyone agrees on which edifices count for cosmic specialness. One of the functions of religion has always been to help the mind cope with death: by promising some sort of afterlife, or proposing a moral code by which one can live and take solace in living in accordance with divine wishes, or constructing a conception of the self such that death does not destroy it. Religion offers the promise that the individual, small person can be heroic without being known for heroism. Our chosen gods are witnesses to our heroism, their testimony permanent and unshakable. “For the Lord knoweth the way of the righteous,” reads the very first Psalm, “but the way of the ungodly shall perish.”
But what if there are no gods? Then the only way to ensure one’s own value is to have it affirmed by other people. The threat of insignificance becomes even greater with no reassuring divine presence. We are all at risk of suffering the fate of Jones, Aaronson, and Rutherford in 1984: put to death, and then the photograph of them burned, their posthumous narrative completely erased and rewritten. And so we move even more desperately to create things that will extend our presence, somehow, after our death, that will serve as proof that we were here, that we were more than worms.
I was thinking of this while listening to Hamilton (with my children, who far, far prefer it to any kind of idol pop). At the beginning of “The Room Where It Happens,” Alexander Hamilton and Aaron Burr have a brief conversation:
Burr. Didja hear the news about good old General Mercer?
Burr. You know Clermont Street?
Burr. They renamed it after him. The Mercer legacy is secure.
Burr. And all he had to do was die.
Hamilton. That’s a lot less work—
Burr. We oughta give it a try.
This is a twenty-first-century approach to immortality. General Mercer wasn’t a character before this scene, and isn’t mentioned again. Lin-Manuel Miranda’s annotations on the Genius page for “The Room Where It Happens” describe him dismissively as “some general who died two years into the war.” We have no idea what his conduct was, or what prompted anyone to propose naming a New York City street after him.3 General Mercer is, in the Hamilton universe, merely famous for being famous. And yet that is enough for the characters to declare that “the Mercer legacy is secure.” It is Mercer’s name, not Mercer’s deeds, that needs preserving, and so long as people are still saying his name, Mercer is blessed with a respite from nonexistence.
We have been living with filmed celebrity for over a century now, long enough to have some evidence that filmed celebrity can, in fact, give you some insurance against nonexistence. Even if everyone who ever saw any movie you were in, or heard you perform live, is now long dead, your name and your likeness can still get passed down; and that won’t be true for millions of people who were alive at the same time you were.4 There is more concrete evidence for immortality, of a sort, through filmed celebrity than there is for any sort of divine recognition. When Andy Warhol (or one of his friends) made the comment about the future being a place where everyone would be famous for fifteen minutes, he did not add that everyone would still be grappling with certain death, and that everyone would thus need that fifteen minutes; that that fifteen minutes might be the triumph of the god-self over the worm-self.
In this context idol pop becomes an attractive weapon against the terror of nonexistence, over and above its inherent time-wasting diversionary qualities. First, idol pop draws only from the young and seemingly healthy: 25 is considered almost impossibly old for an idol debut. Singing, dancing, energetic, sensual, the idols can serve as Tadzios to all sorts of von Aschenbachs in the audiences. Even the teens and preteens who serve as the most common audience for idol pop want to believe in eternal youth, to be distracted from the fact of death. As the mortician and writer Caitlin Doughty once put it:
When a child is about 8 years old two things happen. One, they first develop feelings of love and desire (eros) and two, they discover the harsh reality of death (thanatos). Artists like Justin Bieber are designed, packaged, and set out into the world to act as beacons of unadulterated eros….
We watch (in 3D) as young girls are whipped into a mass hysteric frenzy as Bieber floats about the crowd in a giant suspended metal heart apparatus. They love him, they require him. They cry and scream away their budding thanatos. Bieber loves them, creates life, and keeps the demons at bay. “I want to marry him,” they scream. He WILL be mine. I will possess him, subsume him. Make him a part of me so I am not doomed to be alone in this world I have only recently discovered contains the promise of death.
But modern idol pop has more weapons against death than just youth. Its second weapon is the sheer amount of recording it does, all that adorkable content. The more often an idol is filmed, the better the chances that that idol’s image will somehow survive posthumously. Idol-pop fans like to talk about “eras,” referring to former styling concepts as if they were demarcations worthy of historical note. They take and upload fancams of performances, doing their best to help recopy and preserve the idol’s image—keep backups, as it were. So while the idol’s physical body, the worm-self, continues to age and decay, the past image of the idol, the god-self, is retained.
And idol pop’s third, and possibly most potent, weapon is its encouragement of the idea of idol pop as a shared project between performers and management (think of Ninety One making a repeated public point of their affection for Yerbolat Bedelkhan, and vice versa), performers and fans, fans and fans. Ninety One spent a good first two years as a group yelling Eaglez! at every opportunity. They sat down, on camera, and watched fan tributes together six months after their debut and again for their fifth anniversary fanmeeting. They, like other idol-pop groups, make a point of thanking their fans repeatedly. The group tells the fan: you help make us immortal. By uploading fancams, by streaming videos, by attending concerts and joining fanclubs, by contributing to the idols’ fame, the fan gets to claim some small part of the group’s immortality. The fan has found a heroic project. The fan can say: I am a worm and I am food for worms, but I have this.
I haven’t seen much research on the question of whether idol pop is more likely to be popular in countries where religious belief has declined, though such a correlation wouldn’t surprise me.5 If idol pop fills a need previously filled by religious belief, it is not, as we saw in Part 8, that fans have turned from worshiping gods to worshiping pop idols. Instead fans adopt idols as a means to an end, a route to addressing the terrifying paradox of humanity.
All well and good, so long as you agree idol pop is worth preserving, and celebrity is immortality enough.
Sturgeon’s Law applies to idol pop as much as it does anything else; and that ten percent of idol pop that isn’t crud is still pop music. The four minutes’ worth of distraction fades, and you’re left with an uneasy feeling that is about as far from cosmic specialness as it gets: that, instead of being a creator with a soaring mind, you are a mindless pawn who has let an opportunity for imagination slip by, about as special as a cow in mid-chew.
If celebrity is one answer to the terrifying paradox Becker identified, another is the phrase “the right side of history.” It gets used a great deal in American politics; Barack Obama was particularly fond of it. “We have a mission and a mandate to be on the right side of history,” the late Congressman John Lewis said about his vote to impeach Donald Trump. But Lewis’s political opponents use the phrase too; the conservative pundit Ben Shapiro made it his book title. A good many people want to be on the right side of history; which is to say, want to be reassured that future generations will approve of present choices. They are afraid of being remembered as unheroic: villainous, even.6
Again, this is a sensible response when religious certainty is lost but the terror of nonexistence remains. We have a lot more evidence to believe that future generations will judge us than we do of the presence of gods—here we are, judging past generations, after all. A century ago, for example, many Americans, occupying all sorts of spots on the political spectrum of the time, believed in ideas about eugenics and innate racial difference that we find appalling today.7 In 2019 Vox asked a series of contributors to guess at what common practices of the present day will be “unthinkable” by 2070: the answers ranged from meat-eating to abortion to tackle football. But you could also guess buying cheap clothes, or drinking water out of plastic bottles, or failing to learn a second language, or playing video games, or saving our valuable data in nebulous “clouds,” or smoking marijuana, or spending money on idol pop.
None of us know: and therein lies the problem with trying to govern one’s own behavior by looking forward to the judgment of history. By definition we don’t have as much information as future generations will have in hand when judging our behavior. Moreover, to talk about the “right side of history” assumes that history has only one side, when it is constantly changing and being reinterpreted, with evidence added or discounted or weighed differently. Think of all those Confederate statues being torn down, having been put up in the first place in an attempt to assert control over the collective narrative of the American Civil War. Think, too, of Nursultan Nazarbayev, insisting on his status as father of his country; dictators want to dictate to history as much as to any other audience.
But no one gets to control history; it keeps shifting and being rebuilt. Long-ago agreed-upon truths get questioned, new contributions (or fabrications) are added into the mix, once-reviled figures are rehabilitated. The only certain thing about history is that, in the very long run, it all gets ground into finer and finer dust. If we look far enough ahead we can glimpse our own irrelevance. Someday we will all not only be dead but unremembered, every building we ever laid eyes on destroyed, every person who ever heard our name dead with us. Our languages will mutate past all recognition, our national borders be redrawn, our societies dissolve and regroup, our very planet succumb to cosmic obliteration. Immortality is impossible. We know this, all of us, and we cannot deal with it, any of us.
There is an alternative to the frantic striving to appease the god-self in its bid to outlast the worm-self. I won’t claim to understand it well, or to practice it well myself. But at present it seems to me the best insurance against the self-destruction that can result from a bid for immortality, whether that self-destruction is as overwhelming as violence towards other people or as small and simple as working on a project you don’t actually believe in. The alternative strategy is: cultivate hopelessness, as far as the god-self is concerned. Assume all of your bids for immortality will fail and fail hard. Don’t live so that people will only say nice things about you at your funeral; live in the expectation that no one will attend your funeral at all.
That then sets up the question: Would I still do this if I knew for certain that it would not be remembered? To be fair, in certain contexts the question loses its moral utility. (It shouldn’t be used to justify shoplifting, for example.) But in thinking about projects undertaken to bid for immortality, it can help you figure out whether the project is worth doing in itself. We do lots of things simply for the sheer joy of them, don’t we? Including listening to music, or dancing or singing along.
Trying to let go of the ambitions of the god-self is also a more accurate approach to the judgment of history. Which is to say: humility. You don’t know how history will remember your actions, or if history will remember you at all. On top of everything else, history tends to favor the dramatic over the mundane, the murderous over the gently careful, and the authoritarian over the deferential, which is why we read books about serial killers and not about interesting, kind, funny people who happen to have their lives unfairly and violently ended.8 You can live a good life completely independent of history, if you want. The god-self will protest, but: good can be good without being remembered. This is even more true in the case of creative work: the work you think is terrific goes neglected; the work you think is terrible strikes a chord. All you can do is do, and let the god-self stamp in frustration. All you can do is work for the sake of working, to say you made the work, even if no one else cares.
Idol pop mainly runs off of celebrity, but it draws from a need to create that goes deeper than celebrity. Suppose you went to all the aspiring idols, in South Korea and Japan and Kazakhstan and Kyrgyzstan and China and Thailand and the Philippines and everywhere else, and put this scenario to them: they might get to perform, they might have moments of satisfaction in practice rooms, they might be able to say at the end of their career that they performed to the best of their ability, but they would be guaranteed not to be remembered. Would they still want to pursue the tough job of becoming an idol? Some might drop out. A good many, perhaps. But I don’t think all of them would.
In 2016 one of the Kazakh TV stations ran a short and generally skeptical piece on Ninety One, under the headline “Business on Pop.” After briefly recapping the group’s confrontations with its critics—this was before the oft-disrupted fall tour—and following the members as they get their hair dyed and styled, the narrator says, “The stilyagi had a hard time being understood, too.”9 I’m not well trained in inflection of emotion in Russian, and so I can’t tell whether the comparison is sympathetic or sarcastic or neither.
Stilyagi is not a Kazakh word but a Russian one (derived from стиль, style) that was applied to a subculture of young Russians in the 1950s in cities such as Leningrad and Moscow. They were known for wearing their clothes bright and loud: wide pants, narrow ties, carefully pomaded hair. (A 1950s stilyaga and a 1990s rockabilly would have a lot of notes to compare.)10 Stilyagi also liked to chew gum (or paraffin wax, gum not being widely available) and listen to American rock ‘n’ roll and jazz, famously copying music onto discarded X-rays when they had to. Reportedly the look died out after the 1957 Youth Festival in Moscow, when curious stilyagi were able to get a glimpse of current Western fashions and update their wardrobes accordingly.
But during their heyday, the stilyagi just wanted to dress up and dance. Despite the emphasis on bright fashions, they weren’t making some grand queer statement: stilyagi culture included the idea of women as “chicks” to be pursued or displayed rather than as potential equal participants in the fun. Similarly, the stilyagi were not particularly pro-Western or anti-Communist. “The majority of trendsetters,” wrote Juliane Fürst in her 2010 book Stalin’s Last Generation, “were not only at the height of fashion, but also represented the height of political indifference… Anti-Sovietness, unlike a sense of superior difference, was not a required part of the stiliagi repertoire.”11
Nonetheless, the official Soviet response to the stilyagi, while not as harsh as it might have been during the worst of the Stalinist times, was still openly hostile. Multiple scholars have suggested that the stilyagi, by pushing the limits of how Soviet youth could express themselves, contributed to the “thaw” under Khrushchev that led to greater access to Western art and cultural products in Soviet Russia. But in the early 1950s, “bone records” traders were known to serve multiple-year prison sentences, and the sanctioned satirical magazine Krokodil was mocking stilyagi as early as 1949: “His lips, eyebrows and thin mustache were made up, and the most fashionable lady in Paris would have envied his ‘permanent’ hair-do and manicure.”
The stilyagi posed much less of a threat than, say, more openly rebellious or curious samizdat circulators. But they posed a threat nonetheless, for being overly individual, peacock-like, self-absorbed: unuseful. The site Seventeen Moments in Soviet History, a collection of primary sources, features translated excerpts from a 1953 short story, “Grigorii Dudko Finds Friends”, in which Grigorii, a promising young man, is led astray by a stilyaga:
Once Grigorii asked: “Where do you work, Victor?”
The young man chuckled: “Why work? Anyone with a head on his shoulders can get along without working. I wouldn’t even have begun to study, but Dad wouldn’t give me any rest.”…
Once, after an exam, Deev took Grigorii by the sleeve and, making a face, asked: “Couldn’t you dress a little better?”
Grigorii wanted to object and say that his uniform was not so bad, but for some reason he decided not to but simply nodded his head affirmatively.
Grigorii’s subsequent fall, marked by his expulsion from the Youth Communist League, is arrested by reproaches from his comrades and his hardworking love interest Nadia. The story ends with the pair happily married and Grigorii redeemed: “He has become the best welder in the shop and also a brigade leader.”
As best can be told from the excerpts, there’s no actual sexual seduction; the evil stilyaga Victor tempts Grigorii into indifference and indolence, not homosexuality. Nonetheless, the stilyaga represents a challenge to Grigorii’s commitment to the collective, both at work and in his relationship with Nadia. The stilyaga is both untraditional in his approach to masculine presentation and unwilling to think of the larger collective, and the two vices are linked. His dancing, his listening to forbidden bone records, his interest in fashion are all ends in themselves. He is not thinking of the judgment of history. In an empire whose entire system ran on collective pledges to the future—even when said collective pledges led to murderous famine—such indifference to the judgment of history could only be threatening.
And this is not at all unique to the Soviet Union.12 We’ve seen it in Chinese worries that its anime-addled, xiǎo xiān ròu-worshipping youth will be unable to make the nation proud, and in American mockery of disco. We’ve seen it in leftist declarations that time spent listening to, or composing, inferior music is just falling into complicity with oppressive capitalism. We’ve seen it in South Korea, whose innovation was to solve the dilemma by casting idol pop as a nation-serving export. And we’ve seen it in Kazakhstan, where Ninety One were judged as disruptive and dangerous just by putting on eyeliner, before they even started getting near statements that might upset the Nazarbayev coterie.
Idol pop, for all its formulaic touches, seems to contain something that sparks unpredictable emotions, which do not do us the favor of fitting neatly into containable political boxes. Idol pop, like a lot of the products of human imagination, retains the capacity to surprise. It provides an argument for us to be humble about the consequences of our behavior; we can’t, in fact, predict when a seemingly shallow or selfish action will result in increasing, rather than shutting down, human connection. As ZaQ and AZ put it in “Why’m”: Often we know the answers to the questions in advance, but you come first!
2020 has been a weird year for Ninety One, quite possibly their most challenging period since they stopped having to defend themselves in fistfights on a regular basis. In March they ended up quarantining together. (“We’re a family!” they said on Instagram Live, and made a show of giving thumbs-ups to the camera, and I started wondering how long it would take for the fistfights to move in-house.) Over the summer they gave an interview to the Korean English-language newspaper JoongAng Daily, promising to release a song that has yet to surface. Soon afterwards AZ struck out on his own, and the group moved to rebound, giving a bunch of (as yet untranslated) interviews and releasing a four-song EP, simply titled 91. (It’s better than Aiyptama but not as good as Dopamine or Men Emes.) They did manage to film a music video for the lead single, “Señorita (TekkeTekke),” but they can’t tour to support the EP like they did for the previous three releases, a tough situation for a group that generally doesn’t produce physical copies of its albums and can’t move merchandise outside of Kazakhstan.
But in the midst of promoting 91, Ninety One also released a full album—actually, ZaQ released a full album, titled DO LOT, a play on his real first name, Dulat. The grandiose introductory video for DO LOT actually made it to YouTube before the “Señorita” video did.
The absence of AZ notwithstanding, this is one of the more Ninety-One-ish of Ninety One videos, given that it has ZaQ philosophizing at length about balance and language and the role of music in his life, and ZaQ saying, “I believe that what I’m doing is a sign of patriotism,” and, since Ninety One is an idol-pop group, footage of them all being goofy in the studio together, complete with Ace trying to make it rain with no actual rain available. The video ends with ZaQ standing on a hill, gazing out into the distance—but the distance isn’t the majestic steppe of official state public relations, but a scattered, slightly fogged-in collection of apartment buildings and industrial structures.
“Kazakhs should be modern,” ZaQ told the Zhas Otan audience. Remember?
To be “modern” is to sit at a distance from one’s antecedents: to be less easily identifiable, less obviously tribal, less pure. ZaQ likes to talk about “balance,” but we could just as easily substitute in “ambiguity,” or “dualism,” or “mixing,” or even “corruption.” Outside Kazakhstan, and quite possibly within it as well, it’s easier to get our heads around the concepts of “Kazakh” and “modern” separately than to say, Kazakh and modern. Or loyal Soviet citizen and not particularly political. Or masculine and made up. Or loving one’s fans and insisting on keeping them at a distance. Or committed to doing good and at peace with one’s own eventual nonexistence. Or making silly music and very serious about finding rewards in doing so.
These ambiguities, mixings, corruptions are harder to describe than the simple and clear. This makes them less likely to be swept up by history, which generally likes its stories brisk and obvious. They are also not necessarily easy to like in the present; “ambiguous,” in English, usually brings with it negative overtones. Who knows what might be combined next, after all, and to what ends? There’s no predictability available, no reassurance. One ambiguity opens up the possibility of a million others, which leaves the thinker exposed to the ultimate known unknowns: when are we going to die, and how, and what happens after that?
It is human and understandable to shrink away from ambiguities. We all do it, all the time. We rely on gross generalizations and mental shortcuts because life is exhausting enough without them. Anyone who claims to be always open and in favor of ambiguity is setting themselves up for charges of hypocrisy. I’ll fully cop to reading things that reinforce what I already believe, especially when I’m tired or worried or both.
But to me it seems better to run toward ambiguities when you can. Braver, and more creative, and more loving. It doesn’t do anything to extend our immortality projects—if anything ambiguity makes chasing memorialization harder. “Kaida?,” a pleasant, utterly unsurprising 2017 single by Kazakhstani singer Erke Esmakhan, has more YouTube views than every Ninety One music video combined.
I still say the ambiguity is where we expand our concept of what it is to be human, where we make ourselves greater. More ridiculous, and more potentially beautiful. I don’t know what’s going to happen with Ninety One; I don’t know what kind of work, good or bad or otherwise, they have in their future; I love them anyway. And so it goes, for all the bad music and bad actors: I am not sorry to find something unpredictable and worthwhile in idol pop.
My copy is a 1985 reprint by the Free Press.↩
Both Mark Manson’s The Subtle Art of Not Giving a F-ck (HarperOne, 2016) and Oliver Burkeman’s The Antidote: Happiness for People Who Can’t Stand Positive Thinking (Faber and Faber, 2012) contain informal summaries of The Denial of Death’s main points.↩
General Hugh Mercer, who was almost 51 when he died from wounds suffered during the Battle of Princeton in 1777, was a close friend of George Washington; born in Scotland, he had previously served as a doctor in the army of Bonnie Prince Charlie and had to become a fugitive after the Battle of Culloden. In between military stints he opened his own apothecary; you can still visit the historic site in Fredericksburg, Virginia. If you’re considering entering the crowded field of biographies of Revolutionary War figures, he’s not a bad choice for a subject.↩
One exception to the rule is a 2018 survey of Malaysians in their early 20s, which found that positive attitudes towards Korean idol pop didn’t vary with religion, though the authors were measuring religious self-identification, not strength of religious belief. See Grace Phang Ing, A.A. Abdul-Adis, and Zaitan Osman, “Korean Wave and Malaysian Young Adults: Attitudes, Intention, and Behaviour,” Malaysian Journal of Business and Economics, Volume 5, No. 1 (2018).↩
Bill Bryson’s One Summer: America, 1927 (Doubleday, 2013) includes an overview of how popular eugenicist language and explanations were in America in the 1920s. See also Thomas C. Leonard, “Eugenics and Economics in the Progressive Era,” Journal of Economic Perspectives, Volume 19, No. 4 (2005), a prelude to his 2016 book on the same subject, which includes the tidbit that in 1928 there were more than 300 college courses available on eugenics.↩
See also Devin Kelly’s recent essay, “Out There: On Not Finishing”: “And yet: when was the last time anyone ever told a man to be ordinary? Think of the difference that would make, to begin to dismantle our need to be heroes, to finish things, to consider ourselves defined by accomplishment…”↩
The translation by the Qpop Translations sub team reads, “In the end ‘stilyagi’ of 80’s, too, weren’t understood.”↩
Stalin’s Last Generation: Soviet Post-War Youth and the Emergence of Mature Socialism was published by Oxford University Press in 2010. See also Yulia Karpova’s 2009 master’s thesis at Central European University, “Stilyagi: Soviet Youth (Sub)Culture of the 1950s and Its Fashion.”↩
For yet more examples of “temporary, noisy, intense, ecstatic” popular culture opposed to, and suppressed by, grim collective immortality projects in action, see Charles Paul Freund’s fantastic 2002 essay for Reason, “In Praise of Vulgarity.”↩