Fool's gold?

A couple of days ago, the Times Higher Ed published a piece about journals ranking their reviewers (in terms of who gets their reviews in on time). This led to a discussion on twitter about how fair it is to judge and rank people who are doing a voluntary task.
The timeliness of reviewers is an important element in publishing, but it is understandably hard for many scholars to prioritize putting time into reviewing if they are facing other time-sensitive demands from their employer (like, you know, TEACHING).

One thread of the discussion moved to “why shouldn’t reviewers be paid?” I had an exchange with another twitter user on this which highlighted the divide between STEM and humanities academics. He was thinking of the publishing “behemoths” like Elsevier and SAGE, who probably could afford to throw some money at reviewers. I was thinking of the small scholarly associations (who publish many humanities journals), and who definitely could not.

This got me thinking again about scholarly publishing and who pays for it. There have been some insightful discussions over at Scholarly Kitchen on Open Access in the last year. Some of their analysis has discussed the cost (or savings) to institutions of Gold open access. For some it would be a saving (they would pay less in submission fees for their academics than they currently pay for subscriptions), while at highly productive institutions it might actually cost more.

My feeling has tended to be that Open Access (and the expectation of it) will increasingly be used as a cudgel against smaller publishers and academic societies, whose main income stream will be cut off if they lose subscriptions and memberships. Large publishers may have the financial cushion to subsidize the transition, but smaller players will not. Meanwhile, the author-pays element will increasingly freeze out scholars who are marginally employed, or at institutions that don't have the cash to pay for their submissions. My sense is that this will hit the humanities much harder, for two reasons: first, more of our journals are published by academic societies rather than Elsevier et al.; second, there are more independent scholars trying to publish in the humanities (it's easier to do humanities work as an independent scholar than it is when your research requires access to lab facilities).

The scholarly associations I am thinking of often have a membership/subscription bundle - all members are subscribers and vice versa. While it may be that they can replace the subscription funding with author payments to the journal, they will lose the membership element. Membership confers much more than just access to a journal. These are often societies going back over a century, with a history of advocating for their members. They organise conferences, give scholarships, create disciplinary networks. Without members, who will feel invested in them? And what will scholars lose when we don’t have these groups to speak for us?

Then it fell apart....

It is some years since I worked as a film critic, but I still occasionally see a film that I want to write about. In this case, the long-awaited Jason Bourne.

[mild spoilers]

I yield to no-one in my love of the Bourne series, and I was looking forward to this. There was no clear need for a sequel after all this time, 14 years after The Bourne Identity and 9 after The Bourne Ultimatum. But the question of what had become of Bourne was one that could have been worth answering.

Unfortunately, the answer suggested by this film is not a great one. We first see Bourne making a living in prize fights in Eastern Europe, and the camera lingers over the signs of the toll his life has taken. Scars from bullet wounds, the greying hair at the temples, the sinewy muscles.

The story here involves Nicky Parsons (Julia Stiles) hacking in and retrieving information about CIA black operations, and passing this to Bourne. The NCIS-level plotting has the CIA storing such data in a file labelled “Black Ops”. The CIA operative who spots the hack then attempts to track down Parsons, and through her, Bourne.

This young officer (Alicia Vikander) is a computer whiz who was hired straight from university - which, from the looks of her, was about six weeks ago. Vikander is fine as an actress, but the character is totally implausible. She's far too young not only to be given control of an operation, but then to have the stones to ask to be made CIA director.

Vikander's character should have been 10-20 years older, like Pam Landy (Joan Allen in The Bourne Supremacy). Landy was torn between loyalty to her employers and her doubts about the Black Ops enterprise. Landy's character also demonstrated what many mid-career women experience in male-dominated workforces: she gets gaslighted by her male colleagues, who plan to scapegoat her when a project goes wrong.

That level of nuance and development among the agency staff is absent here. Vikander's character is apparently driven only by ambition. And having introduced a junior tech genius who could trace a hack, the film needed a character somewhere in the hierarchy between her and the old guy in the corner office. The character of "Generic senior CIA guy who wants Bourne gone" (previously played by Chris Cooper, Brian Cox, and David Strathairn) is played by Tommy Lee Jones.

The limited storyline also involves some folderol about Bourne’s father, and Bourne’s motive for volunteering for Treadstone. But we learn nothing really new about Bourne, and this additional backstory is too hackneyed to build on the web of subterfuge created in the earlier films. Although his name is the title, Bourne is more the object than the subject of the action.

Bourne's nemesis, or rather, his opposite number, is a character not even given a name beyond "The Asset". "The Asset" is played by Vincent Cassel, a lean actor with high cheekbones and a sardonic pout. An award-winning actor in France, he has appeared in action-thrillers making use of his martial arts skills, as well as more serious dramas. His forays into English-language cinema have been few, notably the choreographer in Black Swan and the cat burglar in Ocean's 13. However, he is wasted here. In what passes for character development, we are told the Asset was held hostage in Syria and tortured. But whatever sympathy that was designed to elicit evaporates the second we see him killing civilians left, right, and centre. He is the Big Bad, with no moral gradations.

In the earlier films, Bourne - and the system that created him - existed in a moral grey area. Individuals might be bad or good, but they were all in the same swamp of ambiguity. In The Bourne Identity, another Treadstone assassin (played by Clive Owen) was sent after Bourne. In him we saw that Bourne was not alone: here was another agency man trapped in his role. As he died, he said to Bourne: "Look at this. Look what they make you give". Both men knew that they were victims of the same system. The idea that such men are morally compromised, have doubts, and are psychologically damaged by their work is the heart of the Bourne character. But here, such nuance has disappeared.

The tone has also changed. There are two big chase sequences: one, on a motorcycle through Athens, is pure Bourne. The other, in Las Vegas, looks like it belongs in Mission Impossible or some other, glitzier action flick. (One driver is even in an armored SWAT vehicle, and the smashing of casinos reminded me of Con Air, in which a plane was landed on the Vegas strip.) The earlier Bourne films always avoided glamorous locations. In European cities, the action took place in rail stations or anonymous suburban streets. That gritty realism was part of what gave the films their edge. Glitzy action and a flimsy plot: Jason Bourne deserved better than this.

In my enjoyment of the series, I have wondered about what the future would really hold for Bourne. Since he wants to stay off the grid, he can't seek legitimate employment or start a business as a security contractor. How, as our net of surveillance grows tighter, does someone really escape? (Especially someone who has been on an Interpol watchlist for the past decade.)

In this film, people talk about Bourne "coming in", but never say why. To work for them? To face trial? This is unclear. But his return to the nest would at least have been a reason for a final film after such a hiatus. As he gets too old to be scratching at the margins of the cash economy, why not finally decide that the only way out is back to the start? Surely, while beating up Serbian toughs on the Greek border for a handful of sweaty banknotes, the option must have occurred to him. Plot-wise, a story in which his hand was forced - finding himself in a foreign jail, say, or stumbling onto a genuine threat to the USA - could have set up the choice he's been avoiding for years.

A film could end with him settling into an office at Langley, with JASON BOURNE (or even DAVID WEBB) on the door. That would have been a better outcome for this troubled but patriotic (and pragmatic) character. Extreme ways indeed.

How brave of you!

Recently an academic told me he had "pledged" to not participate in any all-male conference panels.

I wasn't sure how to respond: was I supposed to be grateful, as a female academic, that he was making this gesture? It seemed both condescending and smug. But I couldn't clearly articulate quite why it pissed me off until I read this piece in Jezebel:

Damn, You're Not Reading Any Books by White Men This Year? That's So Freakin Brave and Cool

If you want a change to be normalised, just do it. Don't ask for applause.

Here, there, and everywhere

I was recently fortunate enough to interview members of ICE's human rights group, who work tracking down war criminals in the United States. I wrote about it for TIME.

http://time.com/3746936/ice-historians/

I've also written for the Chronicle, on teaching

http://chronicle.com/article/Digital-Natives-Like-a-Good/150301/

and university finances

https://chronicle.com/article/Where-the-Money-Is-Not-At/190061/

Are the only real "Expendables" women?

The Expendables 3 is coming out. I will confess I did see the previous two, but I doubt that's necessary to follow the plot.

I notice there is a woman in this one, getting main billing on the poster with all the men. Will this film pass the Bechdel test? I somehow doubt it; I suspect it's going to be Smurfette syndrome.

The female star is Ronda Rousey. She is a mixed martial artist, which means she should be convincing in the fight scenes. (I've no idea about her acting skills, but it's not like anyone else in this line-up is Laurence Olivier.) Also, at 27, she's young enough to be the daughter of most of the men in the film.

But where are the VETERAN female action stars? Where are Linda Hamilton, Sigourney Weaver, or Michelle Yeoh? Or Milla Jovovich, Michelle Rodriguez, Kate Beckinsale, Lucy Liu?

I mean, come on, Kelsey Grammer? A guy whose most action-y role so far was playing a psychiatrist who probably sprained his ankle hurrying to the box office for opera tickets?
I want to see Geena Davis (who has spoken out so much against the underrepresentation of women in film) as a bounty hunter.

Face to face with the man who sold the world

The last three weeks have been a hectic whirl of four countries and a nightmare of moving that I won't even begin to describe.

In London I went to see David Bowie Is...., a mindblowing exhibition. It's rare for any museum to put on a show about such a talented, fashion-shaping individual, and the way they laced together different phases of his life (personal notes, clothing - lots of clothing - and drawings) was so involving; the soundtrack on the audioguide was brilliant. I have never seen anything like it—and I doubt I will again.

Meantime, my own scribblings have appeared elsewhere: 

In Slate, I reviewed a book on the History of Neon.  

And for The Australian, a book on the history of whale hunting.  

And now through the miracle of technology, I can track the ship carrying my possessions from Munich to the Pacific.  

Need a big loan from the girl zone?

Salon just published a piece saying that the gay rights movement has an advantage over the abortion rights movement, in that celebrities are willing to come out as gay, while few come out as having had an abortion.

This distinction can be parsed further though, in ways they do not: gay is an identity, having an abortion is an event - in a life full of other events.  

Nonetheless, I said recently to my students that people will more readily admit to a drug addiction than to an abortion. Most (especially the male students) seemed surprised by this. But it's obviously true: there is a TV show called Celebrity Rehab, not Celebrity Abortion Clinic.

I realise this is because abortion strikes at the twin axes on which women are most harshly judged. One: their sexual behaviour (if you need an abortion, you must be a slut having sex out of wedlock), and Two: their maternal skills (getting rid of a pregnancy is about as unmaternal as one can get). The women who tend to speak publicly about abortions are the repentant, putting it in the "terrible mistakes I made when I was young" category, not the "damn straight I did it and I'd do it again" school.  

It's true that if women more willingly admitted to having ended pregnancies, this would have the power to transform the debate on the issue, as politicians would realise how many women they know have made use of the right to choose.

All I need is a rhythm divine

In 1941, Desi Arnaz was an up-and-coming performer, with some Broadway and Hollywood credits, looking to become a headliner. He had married Lucille Ball the previous year, after they met on the set of Too Many Girls.

But when he premiered his song "Babalu" at Newburgh’s Ritz Theater, few in the audience would have known the significance of the date: December 17.

Babalú, first a Yoruba deity, crossed the Atlantic with his adherents when they were transported as slaves. As Santeria emerged in Cuba as a syncretism of African spirit worship and Catholic rituals, Babalú became one of the most popular of the Cuban Lucumí Santería orishas. He has been linked with Saint Lazarus, and now shares that Saint’s feast day of December 17.

The name Babalú-Ayé translates as “Father, lord of the Earth”. He is believed to be associated with epidemics—curing them, as well as bringing them down upon communities when displeased. In Cuba, thousands of devotees gather each December at the Church and Leprosorium of Saint Lazarus in El Rincón.

The Spanish version of the song, written by Margarita Lecuona and performed by Arnaz, describes invoking the god with offerings of tobacco and alcohol, and lighting seventeen candles in the shape of a cross. He asks Babalú for protection.

Ta empezando lo velorio
Que le hacemo a Babalu
Dame diez y siete velas
Pa ponerle en cruz.
Dame un cabo de tabaco mayenye
Y un jarrito de aguardiente,
Dame un poco de dinero mayenye
Pa' que me de la suerte.

However, the English version published that year (lyrics by S. K. Russell) doesn't translate the story, but changes it completely. The lyrics set the piece back in Africa, making the (implicitly white) singer an observer, not a participant, in the worship of Babalú. They also refer to Babalú as a "Voodoo" god of love.

Jungle drums were madly beating
In the glare of eerie lights:
While the natives kept repeating
Ancient jungle rites.
All at once the dusky warriors began to
Raise their arms to skies above
And a native stepped forward to chant to
his Voodoo Goddess of love.

Of course in December 1941, just days after the attack on Pearl Harbor, audiences were also looking for a distraction. Arnaz’s music, and his long conga solo, would have been a welcome source of entertainment. A few years later “Babalú” became the signature song of his character, Ricky Ricardo, on I Love Lucy. That he chose to perform it first on December 17 shows that perhaps he was hoping for some protection, and luck, too.

I am the words, you are the tune

I like Regina Spektor's music. But when I heard "On The Radio" the other day—being sung along to by a couple of girls—I was struck by the line "everyone must breathe, until their dying breath". Well, yes, I suppose they must. But where does this pseudoprofundity come from? (Or rather, who decided this was an OK line to go ahead and record? Cole Porter wouldn't have gone near it; Jimmy Webb might have, but would have regretted it the next morning.) There seems to be a lot more of this kind of nonsense in music these days.

The popular singer-songwriter is a modern creature, dating only from the availability of sheet music and recorded music. Earlier singer-songwriters were troubadours, who adjusted their ballads to suit the audience, or related recent events—and what they performed generally wasn't written down.

Folk songs, which everyone knew, weren't written down either, and were probably written collaboratively over generations by Uncle Tom Cobley and all.

There were also people who wrote down songs that aspired to universal themes, about life, death, and God. Those songs are called hymns. "Time, like an ever-rolling stream, bears all its sons away".... more affecting than something about breathing until you stop breathing.

Once we got into the twentieth century though, people wrote (and recorded) songs typically on the themes of:

1. I'm in love!
2. The person I love doesn't love me, and it's a bummer.
3. My whole life's a bummer. Dead dog, no job.
4. Everyone's life's a bummer. Try to look on the bright side. (a popular theme in the Great Depression).
5. Commentary on blue-collar life and/or prison.
6. The apparent suicide of Billie Joe McAllister.

But songs that include the singer's (banal) pronouncements on the meaning of life began increasing in the 1960s (with the arrival of a new theme: 7. Stick it to the Man), and seem to have really proliferated today. It's one thing to get musings on life from people who've really clocked up some city miles, like Tammy Wynette, or Johnny Cash, or Neil Young, or Edith Piaf. But learning that "If the light is off, then it isn't on" (Hilary Duff) brings to mind the all-time-stupid of Des'ree's preference for toast over ghosts.

A song relating personal feelings, performed in the first person, can be one of the most moving art forms we have. "Love Me Tender" and "Cry Me a River" are aimed at specific, second-person recipients. But any listeners can imagine themselves either as the singer or the recipient. These songs paradoxically achieve universality by being personal. Many of us have acquired our emotional vocabulary through such songs.

However, when songwriters strive for universal observations on the human condition, they wind up sounding like fortune cookies. We don't experience life in general, but in specifics.

If you think I'm being harsh towards Spektor (and before anyone tells me that English isn't her first language, I'm pretty sure that "dying breath" stuff doesn't sound any better in Russian), let me be clear: she is genuinely innovative musically, and it's precisely this kind of lousy lyric that lets her down.

That same song has the far more evocative passage:

And then you take that love you made
And stick it into some
Someone else's heart
pumping someone else's blood

This reminds me of another form of songwriting, the factory-like production of pop and dance songs by Ester Dean to be performed by the likes of Rihanna. The astonishingly talented Dean's feelings, her words, are put into someone else's heart, filling someone else's album. In fact, she is able to step into another performer's persona, in creating a song to suit their image.

Of course, it remains to be seen which of today's singles become part of tomorrow's songbook. As Van Halen said, "Only time will tell if we stand the test of time."

Going in for the kill

Today I visited the Museum of Crime in Vienna, a wonderfully old-school museum. No interactivity here, or apologies to the squeamish. Tracing the history of criminal policing in Vienna since the early modern period, the museum has displays on numerous celebrated crimes from each era.

These are illustrated (for the earlier periods) by woodcuts, then by newspaper illustrations (themselves entertaining, showing people variously shot, stabbed, or thrown out of carriages in the best traditions of the sensational press).

But they also have various artifacts from some of the crimes. These include a mummified head, the skull of a multiple murderer which was said to demonstrate "abnormalities", plus fragments of clothing, murder weapons, and a guillotine. There are death masks, life masks, and wax models made of victims' chests showing stab or bullet wounds. There are even the skulls of two small children who were murdered by their father.

They have a photocopied leaflet giving some information (to call it a guide would be generous) - but the displays themselves are abundantly textual, albeit all auf Deutsch. Their English leaflet is itself an adventure in linguistic crime, which includes the sentence "Lively bloodcurdling ballads were distributed in ancient Vienna until the end of public executions in 1868, bringing a farewell to the idyll of the Biedermeier period". I've always found public hangings idyllic, haven't you?

What the museum does remind us of is how graphically crime was reported in the past. Today, even as we hunger for more crime-focused stories (look at CSI, and murder mystery novels, many today with levels of gruesome detail that would have given Agatha Christie the vapours), the reporting of actual crime is ever more sanitised. At least in Western countries, we don't normally see photos of corpses or crime scenes in the paper.

One of the museum's displays relates to a crime in 1685, when the dismembered body of a woman was found (piece by piece, over successive days), reassembled, and put on public display in the hope that someone would identify her. Nobody did; the tactic of displaying the bodies of unknown victims, however, seems to have continued (we see both photos and newspaper illustrations of looky-loos lining up to gawp at the corpse).

The idea of putting a corpse on public display to aid identification seems unpalatable now. Even the Doe Network, dedicated to helping identify unknown (or "John Doe") victims, uses identikit sketches of the deceased, not photos.

The museum also shows the evolution of police uniforms, and its displays cover celebrated crimes up until the 1980s. Towards the end, there is more about the development of forensics, which is also interesting. The pictures here show a criminal's skull and a display relating to a counterfeiting case.

If you're in Vienna, and willing to look at this kind of thing, it's worth a visit. If you're of a gentle disposition, skip it and go to the art museum.

Older property, seeking loving owner, for LTR

On Thursday I had the opportunity to walk through a house that is up for auction, near where I am living in Newburgh. I had been curious about this unusual property from the outside, and wondered what the interior must be like. The answer: in severe need of repair, but still retaining some of its best features, including the original wood panelling.

It has become city property due to non-payment of taxes, and they are auctioning it. This 1870 house was once a beautiful example of the city's heyday. Originally symmetrical, an early owner modified it to add the turret on one side. The rooms in the turret actually feature curved windows, with curved glass.

So the fate of this house will be decided by the purchaser. For someone with a passion for re-habbing (and deep pockets), this could be the one. From the back, there are views of the Hudson - this house has the potential to be a stunning home.

Whoever they are, they will join a growing community in Newburgh restoring these elegant properties. The city is keen to encourage these projects, and tax breaks and special loans are available for historic restoration.

The house is at 288 Grand St, and there will be further open house opportunities on October 5th and 12th.

Bells will ring

Last week I was back in Cambridge to attend a friend's wedding. The marriage took place in a college chapel, and I stayed in a guest room that had that Cambridge smell. The chapel was beautiful, and kneeling on that hard pillow with my back straining reminded me of so many evensongs.

I felt oddly nostalgic, but was reminded too of the reasons living in England again would be difficult. Cambridge remains as it ever was, a contradictory place. A short stroll will show you a city that is, by turns, elegant and trashy, beautiful and stark, crowded and still.

Seeing friends who have stayed on in academe—and those who have not—raised the usual contemplations of my career and what graduate school really does to people. But one of the things it has done for a number of my friends, including the one just married, is bring them together with their life partner.

The intellectual atmosphere and forced proximity of graduate school make it the ideal venue for academic over-achievers to pair off. This is such a recent phenomenon (in terms of when many top universities got around to admitting women) that it remains to be seen what effect it will have on academe long-term. Perhaps the notion of the social spaces within academia will change. We have already moved on from the—once common—acknowledgment of gratitude in a book or thesis to the "wife who typed my manuscript".

Strange days indeed, to still be at the very tail end, generationally, of the bachelor dons who once filled Oxbridge colleges.

How's my restoration?

In Newburgh, where I am now working for the Newburgh Historical Society, there are many people fixing up old houses.

Just across the street from the Crawford House (the society's HQ), the owner put up this sign, asking for input on paint choice.

Which paint color do you prefer? (I chose maroon).

It shows the kind of community spirit often demonstrated in places like Newburgh by those involved in restoration. It invites neighbours to feel invested in what's going on in their environment. Those improving a house are not just fixing up their own home, but adding something to the area.

Bravery, and historical pursuits

Recently on twitter, Maureen Ogle suggested that we historians have "ceded the field" of writing history for a mainstream audience. Journalists, novelists, and others, have filled the gap. I don't entirely disagree, but I'm not sure I'd characterise it as "ceding". Many historians just don't have the access to the popular media and trade publishers that established journalists do.

VIDA's study of the under-representation of women in many literary venues—and the editorial responses to it—show that editors might not be consciously trying to keep women out, but they tend to stick with the (male) writers they already know. The same situation probably holds for academics trying to break into the mainstream market. If Harper's wants to run a historically themed piece, they're likely to give that assignment to a writer they already work with, not start looking for an academic. Indeed, an academic is probably the last person they'd ask. Far from serving as a qualification to get one's foot in the door, I've found that having a PhD in the subject area makes magazine editors very wary. One admitted as much to me, saying academics tend to be bad writers. I do want to engage a popular audience, and I'm trying very hard to do so. So it's not a case of my ceding anything, but of not having the platform.

But are we, the experts, the best at communicating our knowledge of the past? Sometimes yes, sometimes no. William Cronon (President of the American Historical Association) rekindled the debate on whether academic writing is too dull to appeal to a wide audience, which prompted a range of replies, including that not everyone in academe wants to appeal to popular readers.

I tweeted recently about trying to peel the sticky resin of academese from my writing. Writing a PhD and various other academic works has made my writing worse than it was before. Mark Twain may have said that "Education is the path from cocky ignorance to miserable certainty", but a PhD program is the path to miserable uncertainty. We use the passive voice, we equivocate, we acknowledge multiple interpretations of the events of the past. Partly this is to pre-emptively fend off critiques from fellow academics, who will nail us for not addressing various sub-issues and tangential debates. We lack confidence. There's an acquired style in academe, and I acquired it.

Such confidence is partly why journalists and other non-academics can produce more readable, arresting historical texts. Dan Snow (who has not passed through the confidence-eradication process of graduate school) has a twitter account, "Dan's History Fact", in which he posts various nuggets of historical information, frequently incorrect. He's been called out on this many times, but doesn't seem to care. I mention this because any academic historian would have curled up dead from embarrassment at having posted so many "facts" that were urban legends or just plain wrong. But why should Snow care? He still has a large number of followers.

I'm struggling right now to regain some confidence and authority in my writing. I received comments on a recent piece which could be summed up as "be less dull". I have to remember how to write as myself, not as the platonic academic ideal.

I like the way sparkling earrings lay...

I was reading a book for my work on missing persons, and I came upon an interesting comment by a New York detective of the 1930s: that when he found the body of a young woman with pierced ears, he could assume she was foreign-born or the daughter of immigrants. He also mentions elsewhere—in relation to older cases—that ear piercing was something that had been more popular in the nineteenth century.*
Plenty of movies and photographs (and vintage stores, and grandmothers' jewellery boxes) show us that clip-on earrings were very popular from the 1930s to the 1960s, when pierced ears became standard once again. But why did the custom drop off? Was it precisely the association of pierced ears with immigrants: that the arrival of large numbers of people from southern Europe, who tended to pierce the ears of their infant daughters, made the practice seem déclassé to the WASP middle classes? This is just my stab-in-the-dark guess; I'd be interested to know if any readers have more information. (It seems to have dropped from popularity far too early for blood-borne diseases to have been a concern.)
We know that in the classical world, Greek sailors wore a gold earring that they could use to pay the boatman across the river Styx, and the Song of Solomon mentions earrings. There is plenty of evidence of some women in the early modern period in Europe having their ears pierced (some earrings still exist, and portraits show at least elite women had them). But like so many small details of women’s lives, particularly those relating to beauty customs, we have sketchy evidence even of recent generations.

*John Ayers, Missing Men (New York, 1932).