Saturday, August 05, 2017

links for the weekend, Atlantic Monthly pieces about men posing as women authors and women finding they'd rather work for men in corporate settings; and stuff like the distinction between mastery and brainstorming in the creative process

Over at The Atlantic there were a couple of features that stood out about men and women.  One of them was the following piece about how men have taken up feminine pseudonyms to publish novels.


https://www.theatlantic.com/entertainment/archive/2017/08/men-are-pretending-to-be-women-to-write-books/535671/
 
Almost 10 years ago, Martyn Waites, a British crime writer, was having coffee with his editor. Waites, who was at something of a loose end project-wise, was looking for new ideas. His editor, though, was looking for a woman. Or, more specifically, a high-concept female thriller writer who could be the U.K.’s Karin Slaughter or Tess Gerritsen.
 
“I said I could do it,” Waites recalls. His editor was skeptical. But then Waites outlined an idea for a book based on a news story he’d once read, about a serial killer targeting pregnant women and cutting out their fetuses. The concept, he admits somewhat bashfully, was a gruesome one.
 
“That’s exactly what we’re looking for,” was his editor’s response.
 
That idea became The Surrogate, a crime thriller published in 2009, and Waites simultaneously became Tania Carver, his female alter ego. Before he started writing, he embarked on a period of research, reading novels by popular female crime writers, and made “copious notes” about their various heroes and villains. Waites was an actor before he was a writer, and “Martyn” and “Tania” soon became different personas in his head, almost like characters. He’d sit down to write as Tania and then realize the concept was much better suited to Martyn. Martyn books, he explains, “were more complex, more metaphorical. The kind of things I like in writing.” Tania books were simpler: mainstream commercial thrillers aimed at a female audience. And they rapidly became more successful than any of Waites’s previous books had been.
 
The case of a male author using a female pseudonym to write fiction was relatively unheard of when Tania Carver emerged, but the explosion of female-oriented crime fiction in the last five years has led to an increasing number of male authors adopting gender-neutral names to publish their work. Last month, The Wall Street Journal’s Ellen Gamerman considered the phenomenon, interviewing a number of writers who fessed up to being men: Riley Sager (Todd Ritter), A.J. Finn (Daniel Mallory), S.J. Watson (Steve Watson), J.P. Delaney (Tony Strong), S.K. Tremayne (Sean Thomas). The trend is ironic, Gamerman pointed out, because the history of fiction is littered with women writers adopting male or gender-neutral pseudonyms to get their work published, from the Brontë sisters to J.K. Rowling.

Another is about women and men in the corporate world and how women find they don't like working for women in high-powered contexts.

https://www.theatlantic.com/magazine/archive/2017/09/the-queen-bee-in-the-corner-office/534213/
...
After 16 months, Shannon decided she’d had enough. She left for a firm with gentler hours, and later took time off to be with her young children. She now says that if she were to return to a big firm, she’d be wary of working for a woman. A woman would judge her for stepping back from the workforce, she thinks: “Women seem to cut down women.”
 
Her screed against the female partners surprised me, since people don’t usually rail against historically marginalized groups on the record. When I reached out to other women to ask whether they’d had similar experiences, some were appalled by the question, as though I were Phyllis Schlafly calling from beyond the grave. But then they would say things like “Well, there was this one time …” and tales of female sabotage would spill forth. As I went about my dozens of interviews, I began to feel like a priest to whom women were confessing their sins against feminism.
 
Their stories formed a pattern of wanton meanness. Serena Palumbo, another lawyer, told me about the time she went home to Italy to renew her visa and returned to find that a female co-worker had told their boss “that my performance had been lackluster and that I was not focused.” Katrin Park, a communications director, told me that a female former manager reacted to a minor infraction by screaming, “How can I work when you’re so incompetent?!” A friend of mine, whom I’ll call Catherine, had a boss whose tone grew witheringly harsh just a few months into her job at a nonprofit. “This is a perfect example of how you run forward thoughtlessly, with no regard to anything I am saying,” the woman said in one email, before exploding at Catherine in all caps. Many women told me that men had undermined them as well, but it somehow felt different—worse—when it happened at the hands of a woman, a supposed ally.
 
 
Even a woman who had given my own career a boost joined the chorus. Susannah Breslin, a writer based in Florida, yanked me out of obscurity years ago by promoting my work on her blog. So I was a bit stunned when, for this story, she told me that she divides her past female managers into “Dragon Ladies” and “Softies Who Nice Their Way Upwards.” She’d rather work for men because, she says, they’re more forthright. “With women, I’m partly being judged on my abilities and partly being judged on whether or not I’m ‘a friend,’ or ‘nice,’ or ‘fun,’” she told me. “That’s some playground BS.”
 
Other women I interviewed, meanwhile, admitted that they had been tempted to snatch the Aeron chair out from under a female colleague. At a women’s networking happy hour, I met Abigail, a young financial controller at a consulting company who once caught herself resenting a co-worker for taking six weeks of maternity leave. “I consider myself very pro-woman and feminist,” Abigail said. Nevertheless, she confessed, “if I wasn’t so mindful of my reaction, I could have been like, ‘Maybe we should try to find a way to fire her.’” [emphasis added]

It's fascinating how Phyllis Schlafly seems implicitly not to be part of the sisterhood implied in the article.  Women can be described both as a historically marginalized group and as a group in which the late Phyllis Schlafly might not be a welcome participant ... ?  This kind of writing reminded me of Hanna Rosin's proposal a few years back that what many women writers regard as the sisterhood might be more properly described as rich white ladies who can afford to live in New York on the income they make as writers, and that women the world over do not really fit this sisterhood.  The odds of a woman like Margaret Thatcher being celebrated by authors at Slate's Double X seem comfortably close to zero, for instance. 

One of the things social psychologist Roy Baumeister has written about in his book Is There Anything Good About Men? is that male social systems tend to develop in ways that ensure the membership or participation of any one male in the system is contingent.  To put it in blunter, practical terms: unless you can prove there's a good reason to keep you on the team and that you fulfill the requisites of participation, there's no good reason to keep you around.  A tension between this and women's participation in the upper reaches of the corporate world may be that while women want to contribute at the higher levels of corporate life at which society is often guided, the competition and viciousness of that world is often alienating. 

It's possible that what this article circles around but doesn't get at directly is that the social dynamics inside a corporate context call for a type of socialization that requires brutal pragmatism, for better and worse, and that perhaps women are socialized to refuse the kind of compartmentalization that men are socialized to accept within post-industrial Western societies.  It involves a gender stereotype, yes, but the stereotype is that men can hate you at a personal level and still work with you in the grudging concession that you're good enough at what you do to be the best person for the job, personal animosities notwithstanding.  Perhaps what women have been observing is that in a corporate setting where they work for women this capacity to differentiate is arrived at with greater difficulty or, to go by the sum of the complaints, never arrived at when the aggressive supervisor is female, the proverbial Dragon Lady.

That conflation of professional and personal socialization may not really be "playground BS"; it may simply be how women in Western contexts have been socialized from birth. 

There may be a flip side to this in the male stereotype, the guy who can cut you loose because you're just not getting the job done but can recognize that you're a nice enough person who would be a promising employee in another context at some other company. 

Of course ... there's also the proverbial good old boy network, in which men who by outsider standards would be regarded as completely unfit for a job, and not even competent, are granted huge levels of power, access and privilege because the right person knows them and decides to give them a job.  While it's possible to conjecture at some examples of this sort of corporate culture, it can exist just as conspicuously in non-profit settings and can be just as bad--after all, it's not as if no one these days can think of churches where sub-par employees had influence and power within an organization exceeding their provable abilities because someone upstairs decided they were on mission.  It seems safe to guess that there's nothing about being male or female that exempts managers from engaging in nepotism, cronyism and networking in the good-old-boy way to the detriment of an organization.

Over at Quartz, a little piece proposes that pursuing creativity as a goal in itself is very likely a waste of time compared to simply (irony alert) acquiring a comprehensive mastery of the field you're working in.

https://qz.com/996936/why-mastery-beats-creativity-every-time/

Why mastery beats creativity—every time
 
The idea that comes out of nowhere. The eureka moment. If we could figure out how to get there faster and automate up the process, humankind would be forever changed, right? This is something we can’t stop obsessing about as a society—but maybe we’re thinking too hard about it.
 
Two years ago, the New York Times reported on a whimsical new trend on college campuses: studying creativity itself. Schools were suddenly offering minors in creative thinking and asking their students to problem-solve for problem-solving’s sake. The classes seemed to make the students more confident, and had benefits that were tangible if slight: one student figured out a quicker way to re-shelve DVDs at his library job.
 
And yet this worship of creativity has haunted me since I first read the article. Aren’t we thinking of “creativity” too broadly here? Is it truly something we can study on its own, divorced from the problems and distractions and flash cards of the real world?
 
...
 
So yes, a creative studies minor can be useful for the first part of “being creative”—the convergent phase. But when it comes to the divergent phase, learning to be broadly “creative” isn’t enough. In the divergent phase, where you assess the ideas you’ve generated, existing knowledge is incredibly important. In other words, “being creative” starts to depend heavily on what we already know.
 
This prior knowledge of a system or field may be the most important aspect of “creativity”—much more so than convergent thinking.
 
Some of the most compelling experimental evidence describing brain activity patterns during the “divergent” phase of a creative task implicates the medial temporal lobe and hippocampus, which is the part of the brain that humans use when making, storing, and accessing memories—and the hippocampus lights up like a firecracker during memory recall. Evidence of hippocampal activation during the “divergent” thinking part of the creative process may indicate that subjects are calling upon existing knowledge to complete the task, in order to ultimately generate unique or novel outputs. The mathematician Terry Tao hinted at the same end point, albeit less neurologically, when he said that the ability to apply and intuit arises from mastery.
 
This is why learning to brainstorm and listening to the Muse isn’t enough when it comes to studying creativity. In order to “be creative,” in order to problem-solve with the best of them, we need to work on becoming not just artists—but experts.  [emphasis added]



In the category of "we're better than average", there were a couple of links at ArtsJournal about how people who pursue artistic experiences and participate in their local arts communities are more likely to be charitable givers.


https://psmag.com/social-justice/artists-are-also-altruists
https://www.thestage.co.uk/news/2017/people-experience-arts-likely-donate-charity-new-research/

Michael Lind's argument over at The Smart Set is that we should stop bragging about charitable generosity and work toward an economic system in which it's less and less necessary to solicit that sort of charity.  Certain types of cranky folks on the internet might say the above links about artists and arty types as altruists are pharisaical virtue signaling.  It's not as if the coverage of the Boyle Heights situation made it seem that everybody thought artists, and the gentrification associated with arts venues, were actually making life easier for longtime residents, but let's leave that at that for the time being.  We can note in passing that an engagement with the arts and a concern about the arts scene can be found in someone like a Zhdanov, too, so we should be extremely cautious about how virtuous we think it is to be into the arts, as if that were a thing in itself to be regarded as great.  After all, ArtsJournal links are probably not going to be pieces touting the remarkable musicianship and cogent philosophical musings of a band like Rush.  Which gets us to ... .

There's more being written about David Weigel's book about progressive rock, and it's been taken up as an opportunity for authors to vent about a musical genre they don't exactly enjoy, once again over at The Atlantic, by a James Parker, in a way that could faintly come across as though it was penned by what's colloquially known in critical circles as a "rockist":

  
...
 
Money rained down upon the proggers. Bands went on tour with orchestras in tow; Emerson, Lake & Palmer’s Greg Lake stood onstage on his own private patch of Persian rug. But prog’s doom was built in. It had to die. As a breed, the proggers were hook-averse, earworm-allergic; they disdained the tune, which is the infinitely precious sound of the universe rhyming with one’s own brain. What’s more, they showed no reverence before the sacred mystery of repetition, before its power as what the music critic Ben Ratliff called “the expansion of an idea.” Instead, like mad professors, they threw everything in there [emphasis added]: the ideas, the complexity, the guitars with two necks, the groove-bedeviling tempo shifts. To all this, the relative crudity of punk rock was simply a biological corrective—a healing, if you like. Also, economics intervened. In 1979, as Weigel explains, record sales declined 20 percent in Britain and 11 percent in the United States, and there was a corresponding crash in the inclination of labels to indulge their progged-out artistes. No more disappearing into the countryside for two years to make an album. Now you had to compete in the singles market.
 
Some startling adaptations did occur. King Crimson’s Robert Fripp achieved a furious pop relevance by, as he described it, “spraying burning guitar all over David Bowie’s album”—the album in question being 1980’s Scary Monsters (And Super Creeps). Yes hit big in 1983 with the genderless cocaine-frost of “Owner of a Lonely Heart.” And Genesis, having lost ultra-arty front man Peter Gabriel, turned out to have been incubating behind the drum kit an enormous pop star: the keening everyman Phil Collins.
 
These, though, were the exceptions. The labels wanted punk, or punky pop, or new wave—anything but prog. “None of those genres,” grumbled Greg Lake, retrospectively, “had any musical or cultural or intellectual foundation … They were invented by music magazines and record companies talking together.” Fake news! But the change was irreversible: The proggers were, at a stroke, outmoded. Which is how, to a remarkable degree, their music still sounds—noodling and time-bound, a failed mutation, an evolutionary red herring. (Bebop doesn’t sound like that. Speed metal doesn’t sound like that.) [emphasis added]
 
I feel you out there, prog-lovers, burning at my glibness. And who knows? If the great texts of prog had inscribed themselves, like The Lord of the Rings, upon my frontal lobes when they were teenage and putty-soft, I might be writing a different column altogether. But they didn’t, and I’m not. The proggers got away with murder, artistically speaking. And then, like justice, came the Ramones.
 
What might make this sort of condescension toward progressive rock and its fans more diplomatic would be if Parker had demonstrated enough knowledge of musical history to show us that this kind of thing has happened before.  There's plenty of music from the eighteenth century that has simply not stood the test of time even within the "classical" tradition.  Why?  Haydn's complaint about one of his contemporaries, for instance (citation pending but, trust me, I'll eventually dig it up), was that the composer flitted from one idea to the next and made nothing of his themes, so there was nothing to treasure in the heart. 

To put that lament in more 21st-century terms, some composers don't respect the cognitive constraints of the human brain as much as they regard their own virtuosity; they show off their awesome chops and make the mistake of thinking that because they can see how it all holds together on the page, the audience (who must surely show some gratitude) must be able to hear it.  Like forgotten eighteenth-century symphonies with too many ideas for any one hook to take hold, progressive rock could be considered a comparable dead end.  It's just more fair-minded to the aspirations of the musicians themselves to suggest that the problem was less one of the tunes than that, as a friend of mine from college put it, there are too many tunes, and in too few songs do the prog rockers commit to the tunes they have.

Whether it was the James Parker review of a book about progressive rock or Richard Brody's recent write-up about Die Hard, both reminded me of a piece by Arthur Krystal.

http://www.chronicle.com/article/Should-Writers-Reply-to/131157
...
I'm not complaining—OK, I am complaining, but not because reviewers find fault, but because given a chance to perform they forget they're rendering a service to the reader, not one to themselves. [emphasis added] A flawed book gives no one license to flog it in print. If there are mistakes, why not sound regretful when pointing them out instead of smug? If the book doesn't measure up to expectations, why not consider the author's own expectations with regard to it? While no one wants shoddy work to escape detection, a critic must persuade not only the impartial reader but also the biased author—as well as his biased editor and biased family—that the response is just.

And tone matters, tone is crucial. Even writers who check their personalities at the door often condescend without meaning to. Perhaps it can't be helped. There's a reason, after all, that a judge's bench overlooks the courtroom: Sentences must appear as if passed down from on high. I'm not saying only Buddhists should review, but wouldn't it be nice if the superior attitude, the knowing asides, and the unshakeable convictions could disappear from the world of print? From personal experience, I can tell you that my own books have been discussed by people who had no idea what most of my essays were about, but whose pontifical airs demonstrated (as if further proof were needed) that lack of knowledge is never an obstacle to self-esteem.

I got to the end of Brody's write-up on Die Hard and it seemed the world would have been no worse a place if he'd never bothered to watch Die Hard.  Brody, over time, has come across as the kind of arts critic who can look down on the half-century of Star Trek as mass culture without being willing to simultaneously repudiate the social and political ideals it has stood for. If there is a vice to which professional critics seem particularly prone in Anglo-American journalism, it's that they would prefer to review films in which they can revel in their powers of introversive and extroversive observation about the sum of cinematic art rather than review a movie whose moralizing agenda is 1) patently obvious in its presentation and 2) just possibly not the moralizing lesson they would wish to have presented.  When a reviewer at Salon said of Christopher Nolan's The Dark Knight Rises that Nolan was a fascist, there was apparently no real need to resort to evidence; the assertion was enough.  Now it's possible Nolan has a political view you or I would disagree with, but that a reviewer at Salon could so confidently assert Nolan was a fascist filmmaker suggests that when journalists despair of Trump fans riffing viciously in comboxes they may not fully appreciate the extent to which they themselves have been leading the way, just with the insulation of institutional imprimatur.  Just another idea to consider for the weekend there.

If even professional critics can be found guilty of writing bad reviews as a service to themselves rather than to the reader, as Arthur Krystal put it, then how shocked should we be that on the internet trolls are trolls?  How shocked should we be if a sea of people who are not professional arts critics are even more self-referential or self-serving in spraying vitriolic comments about books or movies and any and all associated creative people who were involved in them?  If the biggest topic about the American Ghost in the Shell remake was its whitewashing, that may not signal that the professional critical scene is really engaging with the ideas of the manga or the anime so much as being, well, skin deep. 

It's not necessarily just pop culture criticism or contemporary journalism.  It can happen in academia.  Kyle Gann has blogged about how he actually likes Clementi better than Mozart much of the time.  If we wanted to lionize a composer for being studiously devoted to intensive development of a small set of thematic ideas you'd think Clementi would be more highly regarded in academic musicology but, nope, he's no Beethoven.   While as a guitarist composer I've found myself benefiting from studying Haydn's and Clementi's sonata forms, in more formal academic land Beethoven and Mozart are held up as the better and more profound artists.  That's a shame, and not because I really dislike Beethoven or even all Mozart (though I have to admit I am bored with most of it).  It's that the museum nature of the academic canon excludes as it embraces.  I have felt over the last fifteen years that being a guitarist has been a kind of advantage in a new music (i.e. classical music with a modifier) scene.  But more on that, perhaps, some other time.

Finally, longtime readers know that this blog has featured an awful lot of material about the rise and fall of what used to be called Mars Hill.  In a somewhat fiery period between 2013 and late 2014 there was this thing going on where we might quote something here and then, a week after the quoting happened, the material would go down.  Then, with the use of the Wayback Machine, the material could be brought back up for public consideration to keep information in public view.  Then websites might go down or get modified, and robots.txt files would get introduced to keep search engines and archiving sites from retrieving the pages.  For better or worse Wenatchee The Hatchet managed to document a lot of stuff faster than Mars Hill admins and leadership could take it down, so a lot of raw material for historical research (emphasis on raw!) has been preserved here for the public record.  But the challenge has been that with so much of the history of the former church existing in the virtual reality of cyberspace rather than in books, a lot of material has been purged in ways that make it hard to recover.  Well, for those who might be curious about what robots.txt is/does ... 

https://blog.archive.org/2017/04/17/robots-txt-meant-for-search-engines-dont-work-well-for-web-archives/
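
For those curious about the mechanics, here's a minimal sketch using Python's standard-library robots.txt parser. The rules and URLs below are hypothetical stand-ins, not anything taken from the archive.org post or from the old Mars Hill sites; "ia_archiver" is simply the user-agent the Wayback Machine historically honored under the policy that post discusses.

```python
# A hypothetical robots.txt that blocks the archive crawler while leaving
# ordinary search crawlers alone. Parsed with urllib.robotparser (stdlib).
from urllib.robotparser import RobotFileParser

HYPOTHETICAL_ROBOTS_TXT = """\
User-agent: ia_archiver
Disallow: /

User-agent: *
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(HYPOTHETICAL_ROBOTS_TXT)

# The archive crawler is shut out of everything; a generic crawler is not.
for agent in ("ia_archiver", "Googlebot"):
    ok = rp.can_fetch(agent, "https://example.com/2013/some-old-post.html")
    print(f"{agent} may fetch the old post: {ok}")
```

Under the retroactive interpretation described in the archive.org post, dropping a file like that onto a domain was enough to make years of previously archived pages unavailable, which is the purging problem described above.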

Thursday, August 03, 2017

a few links about the sins of the middlebrow from, well, kinda more than vaguely middlebrow publications. But first let's start with some dystopian apocalypse

http://www.vulture.com/2017/08/william-gibson-archangel-apocalypses-dystopias.html

Sometimes it seems as though a dystopian narrative is simply a future projection of the perceived consequences of a social or policy trend that we've already committed to as a society and cannot reverse.  In a way a dystopian narrative can be a prophetic warning ... but in a secular context can there really even be prophetic warnings or would this be supplanted by a statistically replicable prediction?  Then again ... if it could be statistically replicated ... .

https://lareviewofbooks.org/article/ordinary-life-greil-marcus-commonplace-songs-and-commonplace-listening

AS SCIENCE EATS AESTHETICS, as rationality consumes imagination, and as what Marco Roth and the editors of n+1 diagnosed in 2013 as the “sociology of taste” devours the chance, freedom, pleasure, and individualism of art, including music, and leaves nothing but bones on the sandy floor of the cultural arena, all listening threatens to become socially determined. Increasingly in this spectacle the point of every song is to take its place in a system (a genre, the charts, a certain history), the point of every singer is to take her place as a representative of a certain interest or community (indie, drill, queer, celebrity, neoliberal), and the point of the nation is to provide the gladiatorial
stadium for a series of contests into which everyone is drawn. More and more it seems Guy Debord was correct when he wrote in 1967’s The Society of the Spectacle that “all individual reality has become social reality directly dependent on social power and shaped by it.” What music you listen to reveals your class status and aspirations; your opinions reveal the same, and their expression in conversation is really just part of a social game played to accumulate prestige. [emphasis added]

That might have been the gist of a complaint at blouinartinfo about Scott Timberg's Culture Crash book as a middlebrow rant.  Turns out ...

https://pagesix.com/2017/07/06/artinfo-used-fictional-bylines-after-outsourcing-staff-to-india/

although if middlebrow and middle class values are as lame as some highbrow leftists think they are ... would that be reason to celebrate the decline of the middle class?

http://www.pewsocialtrends.org/2015/12/09/the-american-middle-class-is-losing-ground/

It seems as though even among middle-brow publications there's a certain suspicion of working class or even middle class personas.  Take Richard Brody's recent piece in which he finally got around to watching Die Hard.

http://www.newyorker.com/culture/richard-brody/i-watched-die-hard-for-the-first-time

It's begun to seem as though there is a pious cliche at publications like this in which it's good to lament the cinematic depiction of redemption through violence.  It's a shame, the lament goes, that films tell stories in which you are redeemed or doomed based on who you physically attack.  Yet respectable middlebrow and highbrow cinematic narratives are more apt to traffic in how you are redeemed or doomed depending on who you end up banging.  The myth of redemptive violence is not necessarily different in its end from the myth of redemptive boinking, is it?  Why is one considered preferable to the other?  Perhaps neither is really "preferred", but in the pages of The New Yorker we can read features on a new innovation in the adultery novel. 

Brody was the film critic who claimed that Lady Susan Vernon didn't break any of the "important rules" in the Whit Stillman film Love & Friendship.  That was enough to have me briefly wonder what planet Richard Brody has been living on, since Austen's narrative and Stillman's adaptation of that narrative make it pretty hard to escape the impression that Susan Vernon only breaks the important rules, and gets away with it for as long as she does, because of her pedantic dedication to the smaller and unimportant rules.  When she tries to strong-arm her daughter into not divulging what her mother is doing by invoking a commandment from the Decalogue, while betraying that she doesn't even really know the ten commandments herself, that's ... heroic to Richard Brody.  Brody impressed me in a negative way by rambling at length about the kind of Fantastic Four film we "would" have gotten a few years ago if the studio hadn't interfered.  Right.  Brody closed his thoughts on Die Hard thusly:

Of course, pop culture already, and always, existed. In the fifties, however, French critics saw some Hollywood movies as the artistic equals of any in the world—and as art equal to that of any painting, novel, or musical composition. American critics soon followed suit, and suddenly attention to Hollywood became an essential part of intellectual and artistic life. But, by the mid-seventies, the idea was stood on its head: in the wake of the sixties’ great political, social, and generational disruptions, nostalgia (for the fifties, and sometimes for earlier) took hold, as did the dream
and the yearning for the kind of cultural unity that seemed (but only seemed) to have been lost. (In fact, the earlier mainstream simply excluded vast swaths of the population, large spans of experience.) Popularity itself, and its correlate, celebrity, became an intrinsic value—the mere fact of widespread knowledge and familiarity became a reason to pay attention. Suddenly, filmmakers, critics, and viewers all became aware that they were functioning in an environment of pop culture, as if fish had suddenly become aware of living in water, and the attention paid to the most prominent
productions of mass media further amplified them, turning filmmaking into a mighty feedback machine of cultural self-reflection. “Die Hard,” like many movies of the eighties, is in effect a signifier of itself. There’s no need for eighties nostalgia—because, in this regard, the eighties have never ended.

This comes off like the kind of pious bromide a professional critic traffics in for stuff that isn't in his or her wheelhouse.  It's not like he has to like Die Hard, obviously. But imagining that these dynamics, supposedly distilled in the 1980s, were really unique to the 1980s seems dumb.  What if we float the idea that the French New Wave had already established that film, at a global level and as a globally practiced art form, had become a mighty feedback machine of cultural self-reflection?  Reportedly Orson Welles' complaint about Godard was that the man didn't make films for audiences but for film critics.  Unless "cultural self-reflection" was supposed to mean that film began to reflect back to the stupid masses what the stupid masses wanted, rather than a cultural self-reflection in which filmmakers made films to reflect back to the literate critical classes what they imagined to be true about themselves? 

Thanks to my parents' generation the 1960s never exactly ended either.  At some point the last of the Beatles will die off and the cultural narrative about the pop culture that changed the world can get reassessed.  Until then ... the song remains the same, kinda like Led Zeppelin said it was. 

My brother once told me that there seems to be a generational rule that "you" think rock and roll, or whatever the popular music of the era may be, became "dead" the year you decided you were a grown-up.  It's not really that the musical style died so much as that you decided you're a grown-up now, so you've heard everything worth hearing.  Someone once wrote that when critics lament the death of an art form, like let's say ballet, the odds are decent that the old man is not so much confronting the mortality of the art form as evading the approach of his own mortality. 

I have a slightly more upbeat variant of this idea, which is to propose that the day you complain that cinema is spent is probably not a sign that cinema is spent; it may be nothing more than a sign that you've been watching way too many movies and need to immerse yourself in other art forms. 

I get coming back to a decades-old movie and feeling obliged to say it wasn't as amazing as advertised even if there were some fine things about it.  I'm planning to do this about Oshii's Ghost in the Shell ... . 

Maybe it's easy for mainstream liberal critics to act as though things ended in the 1980s, but I can't shake the fact that the alternative timeline presented in Alan Moore's Watchmen simply did not happen.  Besides, what was "I'm with her" doing if it wasn't trading on a Clintonian nostalgia?

Tuesday, August 01, 2017

Fredrik deBoer has a simple checklist for how you can tell whether you or your kid is primed for academic success: be born upper-middle class or better without being born too soon

https://fredrikdeboer.com/2017/07/31/the-academic-success-sequence-get-lucky-at-birth-mostly/

...

I sometimes get anxious emails from parents, wondering what they need to do to make sure their children are going to be OK academically. And because of networking effects and the nature of who reads this small-audience education blog, I can mostly tell them accurately that they don’t really have to do much of anything; they’ve already set up their children to succeed simply by virtue of having them. Here’s the real Academic Success Sequence:
  1. Be born to college-educated parents.1
  2. Be born to middle-class-or-above parents.
  3. Be born without a severe cognitive or developmental disability.
  4. Don’t be exposed to lead in infancy or early childhood.
  5. Don’t be born severely premature or at very low birth weight.
  6. Don’t be physically abused or neglected.
There's more but the checklist is the salient thing. 

Now unlike deBoer I don't think there will ever be socialism, and I don't believe it's worth attempting. I side with Ellul in believing the evidence of the entirety of human history so far indicates that there is ultimately no solution for the plight of the working class and that the very idea of collective ownership of the means of production is pure fiction.  I'll grant that in some sense a highly decentralized distribution of participation in the means of production might be feasible, but in this respect the socialist and the libertarian might be on the same page ... potentially.   The workers will never own the means of production; there will always be grotesque inequality that cannot be ameliorated at more than a remedial level at best.  Imperialism is the baseline of all human civilizations in the end. 

We can't even legitimately say that all hunter-gatherer societies were egalitarian, because there's a wealth of information about the Native American tribes of the Pacific Northwest indicating that, despite being hunter-gatherer societies, they had complex codes regarding property ownership and even a caste system of chiefs, free people and slaves.  Our best shot at a better civilization is not to pretend that humans have not been commodifying themselves since the dawn of humanity but to recognize how prevalent this tendency is across human history and take steps to curtail its excesses. 

Thematically that might lead to an idea that deBoer mentions, that there are student activists demonstrating that what they want is to shut down views that, while conservative and perhaps objectionable to many on the left, can be considered mainstream.


https://fredrikdeboer.com/2017/07/27/yes-campus-activists-have-attempted-to-censor-completely-mainstream-views/

His concern, which seems like a fair one, is that if people on the left want Republicans to stop entertaining the idea that it would be just as well to dismantle the education system as we know it, they should stop giving them ammunition by agitating against mainstream views in ways that inspire retaliation at a legislative and budgetary level. 

John Halle, over at his blog, was noting how some folks seem ready and able to exploit cycles of outrage:

http://johnhalle.com/outragesandinterludes/on-the-exploitation-of-outrage/

Over the past few years, the following sequence has occurred often enough to have become a familiar pattern.

1) Professor X, a relatively obscure academic (as most academics are), shares an incendiary statement on social or broadcast media. While recognizable as a left position on racial justice, Palestinian rights or the Trump administration, it is conspicuous for implicitly or explicitly condoning violence. Furthermore, its tone is emotional, overheated and hectoring. Few regard it as highly effective as it is more likely to antagonize rather than convince those not already inclined to agree.

2) The right seizes on the most extreme interpretation of the statement, calling for X’s firing, sometimes being able to recruit elected officials in their support (particularly if X is at a public university). Whatever the subsequent outcome, it is mostly irrelevant as the main purpose is to fan the flames of right wing vitriol. The story is invariably entered into wide circulation at Breitbart, Fox and talk radio, likely (though this can’t proven) advancing both the right agenda and the range and intensity of its influence .

3) The left responds (reasonably) by strongly defending X’s first amendment rights. Letters are circulated with hundreds of signatures, including from those who have serious reservations about the original statement. For so-called free speech absolutists, the content of the statement is irrelevant as the right to free expression should always be defended. These and other statements of support are widely reported on left wing media such as Democracy Now, the Real News, Jacobin, etc. X is a frequent guest on these and other outlets.

4) As a result of 3), X is no longer obscure, rather the opposite: having made the rounds of left wing media X is now a bona fide left celebrity, a status which is maintained after the commotion resulting from 1) has subsided. They go on to become go to sources for a left perspective on their own areas of expertise, race relations, Middle East politics or Central American liberation movements and sometimes even outside of these.

As should be obvious, 4) should be a matter of some concern. That’s because those who should be speaking for us are those who can be counted on not only to represent a left consensus viewpoint but to do so effectively. The paradox here is that they are being promoted to this status is for exactly the opposite reason: Having put the left on the defensive and provided the right with an issue to exploit for their own advantage is an indiction not of successfully communicating our message but of failing to do so.

I have to admit that, given the way some leftists and libertarians go on about eschewing violence, it's as though they don't want to believe something biblical authors took for granted: that social order has always been enforced by the sword.  If that hasn't changed in twenty thousand years, what makes people think it's going to change in our lifetimes?  To believe that would be tantamount to believing that the Rapture's going to happen in a few weeks because Jack Van Impe said something. 

NYT article claims that movies these days are based on intellectual property as opposed to screenplays? A few thoughts on missing the shift in IP inspirations in film from stuff you look at to things you play with

A friend of mine, James Harleman, has said that every time a film critic complains that Hollywood has run out of ideas, that film critic reveals his or her profound ignorance of the reality that Hollywood has never had its own ideas.

That was a thought that came to mind reading Alex French's piece over at the NYT, "How to Make a Movie Out of Anything — Even a Mindless Phone Game"

It's like journalists for mainstream papers and magazines learned absolutely nothing from anybody associated with the Frankfurt school some of them name-drop so much. 

Take the very premise of the article, the use of the word "mindless" in particular.  Does this mean mindless or immersive?  Is the phone game mindless because it's truly mindless or because a journalist doesn't think it's worth writing or thinking about apart from the assignment?  And then there's this:
...
So over time, Vinson has moved toward making movies backed by intellectual property. He was the executive producer of the so-bad-it-was-good ‘‘Hansel & Gretel: Witch Hunters’’ (2013), which barely broke even domestically but went on to record a worldwide gross of $226 million. He also produced the ‘‘Journey To’’ franchise (‘‘Journey to the Center of the Earth,’’ 2008; ‘‘Journey 2 the Mysterious Island,’’ 2012) based on Jules Verne’s stories, which has been solidly profitable, with a worldwide gross of over $500 million. (A third installment is in development.) He is now working with Disney on a film about Snow White’s sister, Rose Red. And following the trend of taking successful movie concepts to TV, Vinson has started on a serialized version of the Martin Scorsese film ‘‘The Departed.’’

Did Fritz Lang invent the plot and characters for Metropolis out of whole cloth?  No, obviously not!  F. W. Murnau got into some trouble with the Stoker estate because somebody thought there was a case that he cribbed Stoker's Dracula story to create the film Nosferatu.  There's nothing new about making use of existing intellectual property to make films.  What might be new is the pervasive use of intellectual property that is not so much narrative-based as it is trademark-based.  This would be most true of stuff like superheroes, board games, and the like. 

In the article, though, French seems to believe, or writes as if believing, that there's some fundamental distinction between intellectual property of the sort leveraged by studios today and ... screenplays, which would, technically and legally, also fall under the domain of what's known as intellectual property. 

It's as though the author had a chance to see a simple thread running through video games and toys and mobile phone games and board games and didn't see it.  Twenty or thirty years ago screenplays were stories we got told about other people; these days films are apt to be made about games or toys we have actually played with.  The fact that so many of these films are so often terrible is not necessarily the point.  Studios are looking for stuff they can adapt from existing trademarked intellectual properties that, as far as can reasonably be controlled for, have pre-committed existing audiences.  How do you measure pre-existing audiences in a contemporary market?  Well, one possible way is to keep track of game, video game and toy sales.  Sure, let's insist that athletics and sports count, too.  Just because America's pastime may have shifted from baseball to something like Angry Birds doesn't mean Americans aren't still killing time.

Let's take this back to a scene from the recent Spider-Man film where Michael Keaton's Adrian Toomes muses upon how kids used to draw cowboys and Indians.  The idea there, for what it's worth, was that kids would draw the stuff that was part of their aspirations or fantasies of play.  The gist is that Toomes resents a little (and later a lot) that the kids of today are aspiring to play at being superheroes rather than something more conventional from his own era.  Film critics may have a comparable lament, really, when they complain about superheroes.  Why can't kids these days want to emulate a Kubrick or a Godard?   Why, indeed, would they wish to?

The stuff your generation played with in childhood or played at in childhood will very likely become the tentpole cinematic and televised franchises of your middle age or, if not that, then in some way inspire homage or reaction.  What is Game of Thrones in the end if not a kind of reaction to Tolkien?  It can also be thought of as what you get if you multiply a narrative like the Book of Judges by 20,000 but let's just set that aside for the time being.

Film critics seem to be more ready to get into films that have author surrogates than audience surrogates.  It's easier for a film critic to cut some slack to a Woody Allen for more or less continuously playing himself or having other people play stand-ins for him than for a Tom Hanks to keep playing a relatable everyman who is, at some level, expected to be a stand-in for the movie-goer rather than the movie-maker.  The trouble with this kind of metric of philistinism is that it's not necessarily more sophisticated in the end for a film critic to bask in the glory of being able to read himself or herself into the deliberately blank slate of an artier film than it is for a regular shmuck to imagine that in some sense he could be Steve Rogers or that she could be that Meg Ryan character who gets the guy in the end.  Mystery Science Theater 3000's Mike Nelson once observed that it was really weird how a character played by Harry Connick Jr. could be a horror story antagonist for only doing exactly the same things that a Meg Ryan character could do in a romantic comedy that would be regarded as cute and charming and completely defensible.  Maybe both these types of characters belonged behind bars. 

But let's ask for a moment whether the difference between prestige television and television you don't see critics talking about has something to do with what gatekeepers as a group decide has to be talked about to be with the times.  For as much as I've seen people talking about Game of Thrones, there's another show that passed the seven-season point lately, My Little Pony: Friendship is Magic.  It's even getting a movie this fall.  Sure, Archer has season 8 wrapped up and Rick and Morty has a season 3 starting up now.  South Park has just passed the twenty-year mark.  But one of these shows does not give the journalistic scene every opportunity to opine about current events by way of tagging everything as having a potential headlining corollary.  Somebody could talk about how Game of Thrones is about climate change because they can and do read that idea onto the show.  Does anybody do that for My Little Pony?  It's not as though there aren't fans of MLP who are similar in fundamental outlook to Trekkies, utopians who regard their pet show as emblematic of all that will make humanity better and usher in a possibly divine (if you're into that) era of social cohesion.  I can get to how the Hasbro properties better embody the ideals of the total work of art in the European avant garde some time later, though ... .

When critics resentfully ask how many more of these superhero movies are going to get made they should just ask themselves, how many audience surrogates do they think the studios think they can crank out before the market bottoms out?  There's more than one kind of audience surrogate pandering film out there.  Who's to say that Spotlight isn't ultimately in that category, for instance?  It's not that it's a badly made movie, but then it's not like Wonder Woman is a badly made movie, either.  I enjoyed both movies as a matter of fact. 

Having mused upon these things a little bit, let's end with a downbeat haiku I wrote in the last year or so about arts critics and art criticism:

every arts critic
must consecrate consumption
to live with their craft

Sunday, July 30, 2017

over at The Week, Ryan Cooper suggests "The Great Recession never ended"; Mere Orthodoxy's Meador considers young Christians leaning left and the market not promising anything while American Christian polemicists define adulthood in terms of market activities

http://theweek.com/articles/714423/great-recession-never-ended

In the first few years after the 2008 economic crisis, a great deal of political attention and energy was focused on continuing economic problems. The Obama stimulus was too small, and it was followed by tons of austerity after Republicans swept the 2010 midterms, so unemployment came down with grinding slowness. But as unemployment has finally reached something like normal levels — and as the ongoing catastrophe of the Trump presidency has consumed everyone's attention — possible economic under-performance has faded from view.
 
But the problems are still there — indeed, in some ways things are actually getting worse. The Great Recession never fully ended.
 
 
First, recall that an economic crash is caused by a collapse in demand — essentially, total spending plummets throughout the economy. Virtually nobody disagrees that this was the case right after the 2008 financial crisis, but many today think that the demand problem has been solved.
 
Economist J.W. Mason, a professor at John Jay College and a fellow at the Roosevelt Institute, has compiled a detailed argument that lack of demand is still the major problem in a brilliant paper. The most obvious and jarring part of the case is the fact that from the end of the Second World War to 2007, inflation-adjusted American GDP per person trundled upwards at a rate of 2.2 percent per year. Any periods of slower growth were followed by periods of catch-up faster growth.
 
But after 2007, there was not only the worst economic crash since the Great Depression, but no catch-up growth whatsoever. On the contrary, the succeeding years after the immediate crash have seen much slower than average growth — and as a result, the gap between what forecasters thought the trajectory of economic output would be in 2006 is actually bigger today than it was in 2010, and getting steadily worse. ...
 
In other words, it's quite possible that not only are we very far from maximum economic output — meaning literally trillions of economic output gone unproduced every year, and more importantly millions of people left unemployed for no reason — but also that maximum output might be receding ever further over the horizon.
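
Just to make the arithmetic of "no catch-up growth" concrete, here's a purely illustrative sketch in Python. The 2.2 percent trend rate comes from the excerpt above; the 1 percent post-2007 rate and the 2007 index level are made-up numbers for illustration, not Mason's or Cooper's figures. The only point is that when growth never re-accelerates, the shortfall against the old trend compounds and keeps widening.

```python
# Illustrative only: a pre-crisis trend vs. a permanently slower path.
# 2.2% is the pre-2007 trend cited in the excerpt; 1.0% is a hypothetical
# post-crisis growth rate, and 100.0 is an arbitrary index for 2007 output.
TREND_RATE, SLOW_RATE = 0.022, 0.010
BASE_2007 = 100.0

for year in range(2008, 2018):
    t = year - 2007
    trend = BASE_2007 * (1 + TREND_RATE) ** t
    actual = BASE_2007 * (1 + SLOW_RATE) ** t
    gap = (trend - actual) / trend * 100  # shortfall as a % of the trend level
    print(f"{year}: trend {trend:6.1f}  slower path {actual:6.1f}  gap {gap:4.1f}%")
```

Run as-is, the gap column only grows year over year, which is roughly the shape of the argument that the forecast gap is bigger now than it was in 2010.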

For everyone who read those think pieces on the Great Recession being the "mancession" and caught a few pieces about how more women were getting advanced degrees than men, that's stuff to bear in mind if the Great Recession never really ended.  We might want to ask whether it's a good idea that more women hold advanced degrees than men, not so much because of some patriarchal polemic that women should be at home but for another reason: it's not as if the patriarchy doesn't run the lending institutions to which everyone will owe mountains of student debt. 

But I can put it another way: let's take various pieces at Mere Orthodoxy concerned about people not growing up by getting married or embracing adult life.  Jake Meador had a piece recently about how ...

https://mereorthodoxy.com/young-christians-socialism/


3. I suspect that Joe will respond, with some reason, that such a career path or an even more difficult one is not really anomalous and may even be a good thing for most people. That may be the case, but my point here is that the free market system we grew up in promised us one thing—a relatively smooth path to affluence following graduation from college—and it still hasn’t really delivered for many of us. [emphasis added]

That prompted a response from a commenter James McClain:
http://disq.us/p/1kv8dx3

...

With respect to the third footnote above, the "free market system" didn't promise anything to anyone; that's part of the point.


as well as:

http://disq.us/p/1kwcjr8

I want to second this comment about the free market system promising an easy road to affluence. I have seen this sentiment frequently used as a justification for things like Occupy Wall Street, and basically Millenial dissatisfaction in general. I just don't get it.

Jake, how did you come to understand that this was the case? How was this promise made, and how was it broken? I'm serious. I'd really like to understand this. This seems to be such a strong motivating factor for Millenials, but I don't understand how it happened or where it came from.

Well, in a way it's not that difficult to understand, or it shouldn't be for anyone from Generation X or older because we're the ones who received and reformulated the script that merely getting a college education would secure better-paying and more secure employment.  This wasn't exactly true even during the Clinton administration because a lot depended on what you knew, who you knew, where you were and at what time. 

Maybe it's not about what the free market promised but about which activities on the market evangelicals and social conservatives keep insisting are the benchmarks of having arrived at real adulthood, and how the not-ended Great Recession rendered a lot of those market activities so moot for younger generations that they just aren't bothering to embrace what were previously considered normal market behaviors.

Let's take another entry from Mere Orthodoxy, for instance.

https://mereorthodoxy.com/ben-sasse-vanishing-american-adult-book-review/

or this one

https://mereorthodoxy.com/strenuous-life-martin-bucer-ben-sasse-vanishing-american-adult/

Evangelicals and social conservatives have a penchant for telling people, on the one hand, what hoops they must jump through in order to be considered truly responsible adults; on the other hand we'll get commentary that the market doesn't promise them any of the things they have so often been told in church contexts and political contexts are the requisite benchmarks of adulthood.  It's as if imposing burdens of this sort on people, burdens you won't lift a finger to help them carry, might get you accused of being like a Pharisee or something ... .

The free market didn't make the promise; parents and teachers made the connection, through their pedagogy and example, that if you do X, Y and Z then A, B and C become possible.  Material blessing as a reward for diligence and obedience is kind of a theme in Proverbs.  Sure, it gets undercut in Ecclesiastes with cause, but back when Mark Driscoll was more of an up-and-comer he was willing to speculate that a recovery of the wisdom literature might lead to a new kind of awakening of people being wise about stuff, or something vaguely like that.  Mark Driscoll had a long season in which he would extol hard work and thrift and doing things responsibly, and people ate that up within evangelicalism and socially conservative American Christianity for a time ... and then we found out he didn't quite live up to that stuff all the time ... and maybe significantly soft-pedaled the amount of wealthy patronage and clemency he really needed to get where he got. 

So let's do a little thought experiment: let's imagine that market-friendly Christians step back and consider that if the free market hasn't promised millennials anything, then the smartest thing millennials can do with respect to the market-anchored benchmarks of functional adulthood in the United States is simply pass on pursuing those benchmarks, whether it's buying a car, buying a home, getting married, having children or any of that other stuff.  Or, as the hippies of yore put it ... drop out.  It's not that they are necessarily explicitly rejecting the ideals of home and heart in the way progressives might want to believe on the one hand or that conservatives might dread on the other; it might just be that they like the idea in theory but realize they can't possibly afford this stuff in practice, so they may go for whatever surrogates are available on ... the market.

It may be precisely the preferred surrogates that freak out evangelicals and social conservatives, and not necessarily without cause ...

even contributors to Slate can see how the gaming thing might have drawbacks that are related to what are actually strengths in the gaming approach as a socialization process:
...
In social science there’s a framework called self-determination theory, developed by psychologists Edward Deci and Richard Ryan, that seeks to explain human motivation. It suggests that humans are not driven simply by rewards and punishments, but also (and in many cases even more strongly) by innate psychological requirements for autonomy, competence, and relatedness. Activities that are driven primarily by these three basic factors are considered intrinsically motivated; extrinsically motivated actions are undertaken to gain rewards or avoid punishment. [emphasis added] Deci found that if you offer people money for an intrinsically motivated task—like working on a puzzle or generating newspaper headlines—people will actually spend less time on it.
 
Psychologists in the field have since sought to facilitate intrinsic motivation to improve learning in schools and employee investment in the workplace. They’ve found that giving people more choice of tasks (autonomy) tends to increase their motivation while restricting them decreases it. They’ve also found that offering positive praise instead of money (reinforcing competence) for an intrinsically motivated task increases subsequent time spent on that task rather than decreasing it. Tasks that involve an element of creativity or skill tend to be intrinsically motivated while simpler, more repetitive tasks are more extrinsically motivated and respond more normally to rewards and punishments.
 
That brings us back to video games. Games have always offered the player a chance to experience competence by requiring them to solve puzzles or master new skills. In this way they’re similar to other intrinsically motivated tasks like working on a physical puzzle or playing a sport. Most of today’s games include elements of autonomy so that players can make choices about where to explore, which goals to pursue, or how to customize their characters and gear. That’s two of self-determination theory’s basic psychological needs. The one that remains is relatedness—a feeling of connection to other people. With the advent of massively multiplayer online role-playing games and live-streaming services like Twitch, social contact is increasingly part of gaming. It’s likely no coincidence that the people who are most likely to feel comfortable and find their peers in these social gaming environments are young men—the very people who are apparently choosing to forgo work hours in favor of more game playing
 
Of course, we want our hobbies to fulfill us, and you can find competence, autonomy, and relatedness in anything from softball to crochet to crossword puzzles. If you’re a young man living in a community where the available jobs are repetitive and low-skilled, offer little prestige, don’t have a path for advancement, and aren’t particularly well compensated, however, there may not be many other opportunities for you to meet these needs. Video games offer an alluring, almost sinister ability to flatter you into feeling competence, soothe your need for autonomy by offering in-game choices, and connect you to other people. Games may be doing their job too well, keeping players from seeking true creative outlets, forging independent paths through life, and achieving in school or the workplace, because their needs are being blunted by a synthetic substitute. [emphasis added]

You could go so far as to suggest that the market has gotten so good at producing surrogates for the benchmarks of adulthood that ...

Let's tie this back for a moment to the earlier admonition that the free market never promised any of these people anything.

What took place in the Dead Men sessions at Mars Hill circa 2001 to 2002 was a kind of integration propaganda campaign in which participants were offered a shot at autonomy, competence and relatedness, and the men jumped at it.  Since I was there, I think I can fairly safely assert this.  There were systems of rewards and punishments, too, in a way, by way of Midrash battles.  But here in 2017 it looks as though social conservatives have worked out that the hoops they want millennials to jump through to become adults no longer constitute intrinsic motivation, for some reason, while there seem to be no compelling systems of rewards or punishments to get them to start jumping through those traditional market-based hoops that define functional adulthood.

They could back up and take a page from social psychologist Roy Baumeister, who observed that the foundational baseline for the transition from boyhood to manhood, independent of rites and touchstones, is that the male produces more for his social unit than he consumes from it.  That simple measure of manhood does not require marriage or home ownership or buying cars or ever having sex.  Yet the explicit and implicit standard of manhood from social conservatives would have it that you're not a real man until you've married and had kids.

So maybe the market didn't promise millennials anything, but if that's the case social conservatives may need to stop being aghast when millennials conclude that, since the market has promised them nothing, they get more reward from the new surrogates than from trying to jump through the old hoops.  Why bother?  If you try and fail, the verdict of the market is that you failed to be an adult and the pundits will condemn you.  On the other hand, if you don't try, then you've forsaken the path to adulthood and you'll get condemned for that.  In the social commentary it's basically a damned-if-you-try, damned-if-you-don't scenario.

What progressives may need to grapple with is the possibility that the promise is a lie, and if the promise of a better income and standard of living by way of a college education is a lie, particularly where the liberal arts are concerned, then the question of why liberal arts degrees can cost so much, particularly at more prestigious schools, becomes a bluntly ethical one.

As Chris Jones put it:

Under the Obama administration in 2015, the Department of Education toughened up a set of rules known as the Gainful Employment Regulations. The rules were designed to protect students from being buried in debt by for-profit trade school programs that created far more student indebtedness than verifiable value for money. No one expected a graduate arts program at Harvard University, no less, to be ensnared in the netting of these regulations.

But, remarkably, that is exactly what just happened. In recent days, it was announced that a graduate program in theater at Harvard would suspend admissions for the next three years after receiving a so-called failing grade from the Department of Education that could result in a loss of access to federal student loans. [emphasis added]

The finding, which I first read about in the Boston Globe, should be a shot across the bow for elitist arts programs with high tuitions, programs that long have ignored the realistic economic prospects of their graduates.

Simply put, the federal policy looks at the debts-to-earnings ratios of career-training programs  (and, yes, the arts are a career) in an attempt to discern whether the programs provide students reasonable returns on their investment in tuition. The 2015 regulations hold that the average student's debt from the program should not exceed 20 percent of their discretionary income or 8 percent of their total income. If that is not the case, then the program could lose access to federal student loans. When it announced the new tougher regulations, the department estimated that 99 percent of the affected programs would be at for-profit institutions. [emphasis added]

There is a new development: On June 30, Secretary of Education Betsy DeVos announced, as part of a wider move to cut regulations, the department was "pressing pause" on Gainful Employment, giving affected institutions an extra year to comply with disclosure requirements. This after the Trump administration actually defended the measure in March in federal court. And in any case, this does not take it off the books.

...

In many cases, these students are going into debt to acquire credentials and, yet more importantly, a network to aid them in a profession that, to its detriment, is growing ever-more nepotistic and lazily elitist, especially when it comes to its dominance by a few well-known training programs. [emphasis added]
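Since the quoted thresholds boil down to a pair of ratios, here's a minimal sketch of how that debt-to-earnings test works, using entirely hypothetical numbers.  The poverty-line figure, the way discretionary income is derived, and the treatment of "debt" as an annualized loan payment are all assumptions for illustration here, not the regulation's exact mechanics.

POVERTY_LINE = 12_000  # hypothetical single-person poverty guideline, for illustration only

def passes_gainful_employment(annual_loan_payment: float,
                              annual_income: float) -> bool:
    """Pass if the payment is <= 8% of total income OR <= 20% of discretionary income."""
    # Discretionary income is assumed here to be income above 150% of the poverty line.
    discretionary = max(annual_income - 1.5 * POVERTY_LINE, 0)
    total_ratio = annual_loan_payment / annual_income
    disc_ratio = (annual_loan_payment / discretionary
                  if discretionary > 0 else float("inf"))
    return total_ratio <= 0.08 or disc_ratio <= 0.20

# A hypothetical arts-program graduate: roughly $9,000 a year in loan payments on a $36,000 income.
print(passes_gainful_employment(9_000, 36_000))  # False -- the program fails

On those made-up numbers the program fails both tests, and that kind of arithmetic is evidently what caught up with the Harvard theater program.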

While I could link to any number of pieces editorializing about the problems in arts funding and education, it is the weekend, and there's only so much blogging a person wants to do on a weekend, even someone like yours truly.  Social conservatives are more apt to tout technical schools and the like, but in both directions the question may ultimately get dodged: if you jump through the hoops and the market hasn't promised you anything, then if you fail, that's on you.  If you succeed?  You may just become evidence that the system works, whatever that system is supposed to be according to the partisans who want to say their system works.

These days it's hard to shake the impression that everybody left, right and center is cherry-picking so much that the whole pie seems rotten and there's little point in debating the merits of how to divide up the pie.