Who’s missing in modern academia: solitary geniuses or something much more significant?

Cross-posted from The H Word blog, where this post first appeared on 10 December 2013.

Portrait of Isaac Newton as solitary genius.

When Peter Higgs, of Higgs boson fame, was quoted in the Guardian on Friday as saying “Today I wouldn’t get an academic job” because he would not “be regarded as productive enough”, it prompted much nodding and retweeting from academics.

Coming as it did on the tail of British academics’ rush to complete submissions to the REF (Research Excellence Framework), in a term that has seen two strikes over fair pay in Higher and Further Education, and at a time when there are reports of long working hours and other pressures on academics affecting wellbeing, it is hardly surprising that there was sympathy towards Higgs’s negative judgement of today’s focus on “productivity” and publication.

When Higgs was quoted as saying “It’s difficult to imagine how I would ever have enough peace and quiet in the present sort of climate to do what I did in 1964”, many academics undoubtedly heaved a sigh and got back to the marking, teaching preparation, grant application, or whatever other non-research-related activity they were currently engaged in.

It seems, though, that Higgs’s comments struck a wider chord, perhaps because of the extent to which they conform to the stereotype of the solitary scientific genius. His “peace and quiet” of 1964 (aged 35) brings to mind Newton’s escape to his Lincolnshire family home in 1666 (aged 24), and it is contrasted in the article with “expectations on academics to collaborate and keep churning out papers”. This is the kind of thing we want to hear our science Nobel winners saying.

Teaching, which takes up a huge proportion of most academics’ time, is not mentioned in this piece. I have no idea what kind of a teacher Higgs was, but Isaac “lecture to the walls” Newton would clearly have been a flop on Rate My Professors and a liability for a university anxious about its position in the National Student Survey. He would probably have been just as problematic for the REF. Although he was to go on to have a staggering impact (or Impact), Newton was famously, for much of his life, reluctant to publish.

In many ways Newton and his mythology became a model for how we think of genius, particularly in the physical sciences. Stories of his youthful moment of inspiration, his forgetfulness, his oddness, his solitariness and his immersion in his work abound. Yet he was also someone who learned not just from books but from his Cambridge tutors, his colleagues and a wide correspondence, who approached the Royal Society with scientific papers and the gift of his reflecting telescope, and who went on to become an MP and to lead the Royal Mint and the Royal Society.

Science is profoundly collaborative, relying on communication with peers and students, and on collaboration with colleagues and a whole range of other “stakeholders”. It goes without saying that there have always been many people doing scientific work who not only put up with but also thrived on all those other activities. Science would not have developed without them.

While there are some, perhaps justified, fears about modern academia effectively losing the insights of the next Newton, it’s worth recalling the circumstances in which many of the well-known figures in the history of science conducted their work. While they may not have been writing grant reports or marking exams, they were likely seeking patronage, carrying on journalistic careers, undertaking the duties of a doctor or a vicar, teaching, running a family business or otherwise making a – usually non-scientific – living.

Those who really were excluded were not solitary geniuses who could not find sufficient time for thinking, but those who were, as a result of class, geography, race or gender, never likely to have the opportunity to begin an education, let alone contribute to the established scientific societies and journals. And this affected the science that was done: ample research shows how the norms, assumptions and interests of elites have shaped supposedly value-free science.

Science and academia today remain embarrassingly homogeneous. However, the fear is not so much that we might be failing to find or support working class, black or female geniuses, but that we are more broadly missing out on other perspectives and experiences that would help frame different questions and solutions. It is for this – as well as the good health and useful productivity of academics – that we need to fight not just for better investment in Higher Education, supporting excellent outreach and teaching as well as research, but for a fairer society.


Women in science: a difficult history

Cross-posted from The H Word blog, where this post first appeared on 15 October 2013.

Caricature of women attending a 19th-century meeting of the British Association for the Advancement of Science.

Today, as you’ll probably see from many tweets and blog posts, is Ada Lovelace Day. As the Finding Ada website explains, this

aims to raise the profile of women in science, technology, engineering and maths by encouraging people around the world to talk about the women whose work they admire.

Many of the talks and posts that mark the day will be about mentors and leaders in science today. Many will highlight the sometimes overlooked work of women in the history of STEM.

As I discussed on this blog last year, I find myself somewhat conflicted about Ada Lovelace Day and similar projects that focus on highlighting women in the history of science. On the plus side, I am wholeheartedly supportive of the attempt to encourage young women to think about scientific careers and to appreciate the work of women in the past, when opportunities were even more circumscribed. I am also glad to see stories from the history of science getting wider attention.

However, I am also wary. In celebratory mode, there is a tendency to overplay the work that the women highlighted actually did. There is no better example of this than Lovelace herself, who is wrongly credited with writing the first computer program. Likewise, just as with the heroic “great man” mode of history, focusing on individuals can hide the extent to which science is always a collaborative enterprise. Finally, although some women are rescued from the background shadows, other individuals and groups, equally deserving of attention, remain ignored.

As a historian, I am always likely to be suspicious of the use of history to serve particular purposes, whether that is to get more women into, or more funding for, British science. Laudable though those aims might be, there is a risk that the historical evidence will be selected or distorted to suit the current purpose.

Certainly there were (are) remarkable women in all spheres of life, but the more important story is the one that explores why there were so few and how and why their talents – and those of whom we’ll never hear – were wasted. While it’s good to encourage girls and women to have the confidence to succeed in science or elsewhere, we also need them to look at the societies that have made, and still make, this a difficult task.

It always strikes me that should women of the past read some of what is written about them today, they would be hugely surprised and perhaps even offended. Before the 20th century, and often after, women who did scientific work tended to present themselves as a support to science or men rather than as pioneers. Although this is a reflection of the patriarchal society in which they lived, and they may sometimes have said things they did not privately believe in order to appear acceptable, it was their chosen self-presentation.

Recently, for example, a post was published that claimed that William Whewell had coined the word “scientist” to describe Mary Somerville. The response suggested that this is something that people really wanted to believe. However, while it is true that the first published appearance of the word was in a review of Somerville’s book On the Connexion of the Physical Sciences (1834), neither Whewell nor Somerville would have dreamed of its being applied to her. Women, Somerville suggested, did not have original ideas, but the female mind might, as Whewell wrote, provide a “peculiar illumination” in explaining the ideas of others.

Somerville undertook aspects of science that were “women’s work”: writing, translation, popularisation. She also frequently highlighted her role as a wife and mother. Others, who approved of and supported her, did likewise. The American astronomer Maria Mitchell met Somerville and wrote:

I could not but admire Mrs Somerville as a woman. The ascent of the steep and rugged path of science had not unfitted her for the drawing-room circle; the hours of devotion to close study have not been incompatible with the duties of wife and mother; the mind that has turned to rigid demonstration has not thereby lost its faith in those truths which figures will not prove.

Somerville and others underlined the distinction between men’s and women’s minds, and their appropriate spheres of activity, because of the society in which they lived. Somerville was a supporter of women’s education, and it was important to show that learning mathematics and the sciences would not turn young women into unattractive, barren spinsters. Those campaigning for women’s suffrage faced a similar choice: emphasise your acceptable femininity or reinforce the stereotypes of Punch caricatures.

We like to think that we have moved on. In many ways we have: women in the UK can vote, be educated and enter careers and remain in them even after getting married or having children. Yet they are still radically underrepresented in the most highly paid and esteemed positions, and overrepresented at the other end of the scale. We are, as Alice Bell explained yesterday, still in a society that asks for female intelligence to be mitigated with the use of lipstick and a focus on domestic details.

By all means celebrate individuals, but understand them as they would have understood themselves. Make sure to think of the society in which they operated and look hard at what has and what has not changed.

Messing with time

Cross-posted from The H Word blog, first posted on 31 March 2013, the first day of British Summer Time.

Analogue clock

It’s hardly surprising that I’ve become very aware of time and how we measure it since beginning work at the Royal Observatory Greenwich. What has really struck me is how much we resent, on the one hand, any suggestion that we alter the way we measure time and yet, on the other, how many times we have done just that over the course of history.

Although it’s useful publicity for the Observatory, I never cease to be amazed that the change of the clocks, which happens twice a year every year like … err … clockwork, is always a news story. Suggestions that we might do away with British Summer Time, or shift to what is euphemistically called Double Summer Time (aka Central European Time), are huge debating points. Both the fact that we add leap seconds and the possibility that we might cease to do so receive many column inches.

Although the introduction of BST was a very 1900s notion of making better – that is, more healthful and productive – use of our long summer days*, history suggests that it takes something like a world war to implement such seemingly radical changes. There is often a sense that diverting from whatever it is we are used to is, somehow, unnatural. It certainly seems unnatural when we are ripped untimely from our beds but, even before the arrival of this annual ritual, there was nothing natural about the way we measured time.

Take GMT, which traditionalists are keen to defend by rejecting summer time and by adding leap seconds to ensure that the Sun does not drift too far from being above the Greenwich meridian at noon. In the scheme of things, it is almost as much of a novelty as BST. It only became official standard time in Britain in 1880, even if it had been used in specific contexts, like railway timetables or navigation, for some time before. Its use as the standard to which international time zones are aligned was a matter of slow adoption from the late 19th century onward.

The implementation of a national standard time, which might be some 20 minutes different from your local time should you live in the west of the country, was seen by some as an unnecessary novelty. There were those who made a stand, and even today the chimes at Christ Church in Oxford still stick to local time. As can be imagined, the idea, floated at the end of the 19th century, that there might be a universally adopted International Time was much mocked, resented or worried about.
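To put rough numbers on that divergence: the Earth turns through 360 degrees in 24 hours, so local mean time shifts by four minutes for every degree of longitude. Here is a minimal sketch in Python (the function name and the approximate figure for Oxford’s longitude are my own illustrations):

```python
def local_mean_time_offset_minutes(longitude_deg_east: float) -> float:
    """Offset of local mean time from Greenwich Mean Time, in minutes.

    The Earth rotates 360 degrees in 24 hours, i.e. 1 degree every 4 minutes.
    Negative values mean local noon falls after Greenwich noon (places west).
    """
    return longitude_deg_east * 4.0


# Oxford sits at roughly 1.25 degrees west of Greenwich, so its local mean
# time runs about 5 minutes behind GMT - hence Christ Church's chimes,
# still kept to local time, sound "late" by Greenwich reckoning.
print(local_mean_time_offset_minutes(-1.25))  # -5.0 minutes
```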

But the national or local times that seemed more meaningful to people are, of course, also artificial products. They are expressed as mean time, an averaging out of the Sun’s apparent motion that was adopted in deference to the introduction of a new technology – the clock. Clock time is produced by an artificial system that imperfectly mimics “real” time, which is a product of the Earth’s daily rotation and annual orbit.

The difference between mean time and “real” or apparent time is expressed by what’s called the Equation of Time. This was first calculated in the second half of the 17th century, when the introduction of pendulum clocks made the business of artificial timekeeping sufficiently precise. The fact is, the Sun is rarely at its highest point over the Greenwich meridian at the moment our watches show noon.
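For the curious, there is a widely used trigonometric approximation to the Equation of Time, good to within a minute or so; the sketch below is illustrative rather than a precise astronomical calculation:

```python
import math


def equation_of_time_minutes(day_of_year: int) -> float:
    """Approximate Equation of Time: apparent (sundial) minus mean (clock) time.

    A common approximation, accurate to roughly a minute. Positive values
    mean the Sun crosses the meridian before the clock shows noon.
    """
    b = 2 * math.pi * (day_of_year - 81) / 365  # angle from a reference day in late March
    return 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)


# Around 3 November (day 307) the sundial runs about 16 minutes ahead of the
# clock, the largest gap of the year; in mid-February it lags by about 14.
print(round(equation_of_time_minutes(307), 1))  # ~16.4
```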

Things get even more complex when we start looking into how calendars have been calculated and established. These are fascinating histories that are intertwined with everything from the story of astronomy to how we live and order our lives. Our time has been messed with for centuries and it is not, forgive the pun, a clock that can be turned back.

* Personal gripe: many people seem to think that putting the clocks back in Winter is “daylight saving” and is designed, somehow, to give us more daylight. Guys: “daylight saving” is used in Summer and nothing, but nothing, will create more daylight in Winter.

School history: what worked for me

Much has already been said about the proposed new history curriculum. This piece by David Cannadine in the TLS is a good place to start, as is the Historical Association’s forum on the topic and, of course, Richard Evans in the FT. There is not much point in my adding to all this, but I did want to share something that looking at this contents page of a curriculum made me recall.

As many of the critics of the proposed curriculum have pointed out, it all begins promisingly enough: it should allow children to “understand historical concepts such as continuity and change, cause and consequence, similarity, difference and significance”, and “how evidence is used rigorously to make historical claims, and discern how and why contrasting arguments and interpretations of the past have been constructed”. But what follows seems specifically designed to undermine such aims, with a chronological list of names, abstract ideas and events that children as young as six are supposed to get through in just an hour a week.

There is much concern that this dry list, with its often age-inappropriate topics, will be a complete turn-off and that numbers taking history at GCSE will plummet. It will now be much harder for primary teachers to make history come alive by drawing on local history, talking to people who remember past events, taking advantage of local museums, or discussing topics that fit the age of the children being taught – evacuation, for example, might be a powerful topic to discuss with children young enough not to be able to imagine leaving their parents. Instead, seven-year-olds will be discussing “concepts such as civilisation, monarchy, parliament, democracy, and war and peace”.

Thinking about what I don’t like about this curriculum got me thinking about my own experience of history at school. It didn’t make much of an impression on me at primary school: I got most of my history at home and on family trips to museums, monuments and galleries. It is this kind of thing, which many children will not have had, that Gove claims his education reforms make up for. But, of course, it was never presented to me as a chronological ‘island story’: it involved going to places, asking questions and making haphazard connections.

The little history I remember from primary school seemed equally haphazard, but that is no bad thing. You gain a sense of historical perspective not by slogging, over years, through a long chronology, but by thinking one day about Roman gladiators and then thinking about how different the world was when Henry VIII was on the throne, or when your house was built, or when aeroplanes were invented. The things that stick most in my mind had a connection to where I lived: the history of the city, the use of the buildings surrounding me. On one class visit to Edinburgh Castle, we dressed up as the French prisoners kept there during the Napoleonic Wars. We offered, as they had, our craftwork for sale and, of all things, sang the Marseillaise as we walked up to the gates.

At secondary school I started to really enjoy history, sometimes because of excellent teaching, sometimes in spite of it. Things were worst when we had to slog through a topic that covered a long period of history, and when we were most obviously cramming in facts, people, dates and themes to prepare for exams. Things were best when we had discrete topics that we could cover in sufficient detail to get a feel for the period, the people involved and different perspectives.

Another post today, from a fellow history of science curator – Charlie Connelly of the Science Museum – suggested one approach to countering Gove’s curriculum: tell good stories, something with which many a watcher of TV documentaries and reader of popular histories would agree, not to mention many public historians and those who come to history outside the usual school and academic route. Charlie explains that it was good stories, even if ‘bad’ histories – like Sobel’s Longitude – that got her to change her mind about history, having given up on it at 14, “finding it an unbelievably dry and tedious subject”.

I am a bit more doubtful, as readers of this blog will know, about using misleading stories as a hook. I also don’t recall “stories” being what got me interested in history at school (although I do remember getting a book called “100 Great Lives” out of the library over and again, undoubtedly a text Gove would have approved of). In fact, the thing that really excited me about history was that the more we know, the more we see that stories can be questioned. That was a powerful feeling.

Three lessons in my first year of secondary school stick out. For each of them the teacher prepared packs of images and texts and allowed us to go through them and draw our own conclusions, before class discussions and his conclusions. The first lesson was based on a fictional crime. We were given a range of evidence from the scene and about suspects and had to see if we could solve the crime. The next lesson did something similar, but with real images, newspaper reports, letters and statements, about the assassination of JFK. The final one looked at the assassination at Sarajevo.

For the real cases it became abundantly clear that the evidence we had was contradictory, that it could be very different in form and that different narratives could be created. This was exciting. We weren’t being taught facts, we were investigating, doing our own thinking and drawing our own conclusions or creating our own narratives based on the evidence we had. Aside from the fact that the evidence was given to us on coloured paper rather than our finding it ourselves in the archive, this really is a little bit like what real historians do.

It is also a little bit like what we all do, when we see the news, read newspapers, talk to friends and family and try to understand the world around us. These are the kinds of skills and the type of knowledge (not knowledge of facts but the knowledge that you are equipped to question or investigate) that school history can give to citizens and future voters. It goes without saying that they are also useful for many kinds of work. In what possible way can Gove’s curriculum compete with that?

Goldacre, Gove and evidence

Something about Ben Goldacre’s recent report on Building Evidence into Education has left me feeling concerned. This feeling is not just caused by the “I love teachers” and “I wrote this lovely paper about lovely teachers” tweets, or the sense of medical research approaches and the “gold standard” RCT [edit: corrected from RTC – it is, of course, Randomised Controlled Trial] coming to the rescue of the floundering field of education research, or the suggestion that there is no quantitative research expertise within existing academic education departments. It wasn’t even the fact that his report fails to mention things like Ofsted, teaching unions, teacher training or curriculums, or to do more than nod to the huge changes to the profession that would be required.

The thing that really worried me, I suppose, was that this report was commissioned by and launched while sharing a stage with Michael Gove. This is the same Gove who has consistently ignored evidence and expert advice in education policy on a whole host of issues, including, of course, the writing of a ludicrous history curriculum. As David Cannadine, who led a research project on the history of history teaching, wrote in the TLS a few days ago, Gove has in this case taken a course that is “the exact opposite of what has been repeatedly recommended by people who know what they are talking about”.

Is it simply that Gove has a greater respect for the evidence revealed by RCTs than he has for that produced through historical and sociological research? I am not wholly convinced. Does it seem likely that this government will start supporting state schools sufficiently to allow them time and resources to engage in research trials, not to mention the kind of pay scales that might be commensurate with a profession that undertakes such work and is listened to? Again, not convinced.

The report was launched at the Bethnal Green Academy, and it is interesting to put it alongside Gove’s commitment to academies and free schools. It is surely schools of this sort, along with private and independent schools – rather than state primaries and secondaries, tied to government-defined curricula and competing on the most unlevel of playing fields – that would be in the best position to attract and support the kind of teachers who could undertake research. This would only exacerbate the divergence in provision, with teachers in different kinds of schools working in increasingly different circumstances. In addition, it seems likely that evidence would come mainly from particular kinds of schools, lacking any real randomness in the trials.

If the model is medicine, these schools are, like general practice surgeries, businesses. They too will be testing products and techniques offered by private companies which, like pharmaceutical firms, are in the business of finding ever more conditions that require intervention. Meanwhile, Goldacre’s urging of teachers to take the initiative and ape their betters is as get-on-your-bike, pull-your-socks-up and self-help as any Thatcherite would like.

I’ll admit that this is not a topic I know enough about, and the foregoing may be complete twaddle. If so, I apologise, but would be interested to hear other views. I know that the schools I went to (long, long ago) and the school my son goes to today would not be capable of adding research – quantitative or qualitative – to what they already, just about, manage to deliver. I also know that much of education and education research is already a commercial business. Goldacre’s report won’t shift policy, but it is sobering to think about where it might sit in Gove’s world view.

————

Postscript [18/3/13 20:58]: Ben Goldacre has today published a piece in the Guardian on his report, which, despite a tweet that stated “People keep asking what I think about [Gove] and evidence. Here:”, did absolutely nothing to suggest he has thought deeply about the politics, about Gove or, indeed, about the school-based, education-focused context of his topic.

Since publishing this post, I’ve had some interesting discussions on twitter and been pointed to some useful posts and articles. Here is some further reading:

Behavioural Insights Team, published in collaboration with Ben Goldacre and David Torgerson, Test, Learn, Adapt: Developing Public Policy with Randomised Controlled Trials – a policy paper from June 2012 that is fairly similar to the education piece, arguing that RCTs should be used widely in public policy, debunking some ‘myths’ about them and then explaining how they should work in an ideal world.

Tom Bennett, ‘Thinking in the right direction; just don’t put all your faith in RCTs. Ben Goldacre’s vision for evidence based learning’ – a post from an education blog that highlights the problems and impossibilities of the Goldacre vision. His description of Bethnal Green Academy – “Nice looking school; it’s got BSF written all over every plane and pane. The livery outside the school shouted every second sentence of the latest Ofsted report.” – was one of the things that got me thinking about the positioning of Goldacre’s report for Gove, although Bennett also felt that Gove’s very presence was a positive sign for research and evidence in education.

Paul Cotterill, ‘The dangerous Dr Goldacre: Cochranian hero or just peddling garbage’ – a post that points out the politics always and inevitably (and rightly) inherent in education and social policy, and that “a worthy aspiration can all too easily be hijacked by … governments (not just the current one) very keen on the idea of using research evidence to impose their own views of what success in education looks like, and less, but much less keen on the development of the kind of ambitious, democratically oriented research governance infrastructure that he advocates”. It also quotes from the following:

Will Davies, ‘The problem of evidence centres’ – a post looking at the recently announced evidence centres and the “somewhat naively optimistic view” that RCTs are a universal solution. But what are, and how do we define, ‘good’ and ‘bad’ outcomes across the many facets of educational experience?

Warren Pearce and Sujatha Raman, ‘Evidence: a means of making expertise public? The RCT movement in public policy’ – outline of a paper forthcoming at the International Interpretive Policy Analysis Conference that will be asking “How can RCTs deal with the problem of expertise if experts are still needed to produce, interpret and apply evidence in particular circumstances? What sorts of institutions of epistemic governance might be required to ‘open up’ RCTs vis-à-vis other forms of evidence for policymaking? To what extent have classical concerns about the ethical challenges posed by RCTs in social context been addressed in the recent literature?”

Dave O’Brien, ‘Drowning the deadweight in the rhetoric of economism: what sports policy, free swimming, and EMA tell us about public services after the crash’ (Jan 2012, £) – an article that, exploring the Coalition government’s plans and decision-making processes in public service provision, notes “a specific rhetorical use of economic evidence to present controversial decisions as technical exercises”.

Skeptics and scepticism

Cross-posted from The H Word.

I was somewhat disconcerted to see something completely erroneous appear in Guardian Science’s own Notes & Theories blog. It was this:

A word about the distinction between sceptics and skeptics. A generic “sceptic” questions accepted beliefs. In this way, we have “man didn’t go to the moon” sceptics. (Some people won’t believe anything.) Skeptics are different: they espouse the evidence-based approach – and find the world wanting in many respects.

Yikes! As an early commenter rightly pointed out, the sceptic/skeptic spellings are simply UK and US variants, although later commenters denied this and continued to perpetuate the error. Somehow the British spelling now denotes “bad” scepticism (i.e. questioning scientific consensus on topics as varied as vaccination, lunar landings and climate change) and the US spelling is identified only with “the evidence-based approach” to … something-or-other.

It is true that the capital-S Skeptic movement uses the US spelling even in the UK, but that is an extremely circumscribed use of the word, one that is not widely known or understood outside particular communities. Before about 2010, when I started blogging and using Twitter, it was something I had never come across (and I say that as someone who has an interest in science, is an atheist and attempts to make decisions rationally, based on evidence).

To compound matters, this was written by Deborah Hyde, editor of The Skeptic magazine. To not understand the meaning and history of the title of your own publication is a worry.

Scepticism, or skepticism, is neither denialism nor a movement. Based on the Greek skeptomai, which means to think or consider, it usually means doubt or incredulity about particular ideas, or a wider view about the impossibility of having certain knowledge. This uncertainty is a philosophical position, and philosophical scepticism includes attempts to deal with it, through systematic doubt and testing of ideas.

So, let’s be clear. In the US you can be a climate skeptic. In the UK you might consider yourself a Skeptic and approach knowledge in a sceptical way. It also appears that it is possible to be a Skeptic and yet not be a sceptic. Hyde’s parenthetical “Some people won’t believe anything” dismissal of “bad” sceptics suggests very little understanding of what scepticism really means.

This goes to the heart of much recent criticism of Skeptics, often coming from within the movement itself. The charge is that many self-identified Skeptics are not properly sceptical (or skeptical) of the positions that they or their leading figures take up. Rather, a tribalism or group mentality develops in which – unthinkingly – certain positions are condemned or approved.

It would be wrong to tar every self-identified Skeptic with the same brush. However, all too often what comes over to those on the outside is a rather narrow and repetitive focus on particular topics and, more importantly, a condescending, over-confident tone in engaging with those who disagree or who have given such things little thought.

These things matter if Skeptics are really interested in changing or opening minds rather than getting together and having a good laugh about whacky beliefs. Hyde’s article suggests it is the former that now takes precedence:

Many skeptics retain a hobbyist’s level of delight in debunking psychic powers or ghost stories, and that’s where the movement started. But the subject matter has become more serious and political. In the last decade, the most formidable opponents of alternative medicine have not been government regulators, but skeptics.

She adds vaccination, the teaching of evolution in schools, gay rights and abortion rights. Her claim is that Skeptics, or nerds (or geeks), are “the people with the best intellectual tools to rebut the traditional postulates”. I would query that, if her “nerdocracy” means the self-selected (and not necessarily experienced or qualified) group that might identify with the term. As it stands, they may not be the best (and should certainly not be the only) group to attempt to communicate the issues to the broader public.

I’ll end simply with a reminder that the etymology of scepticism implies enquiry and reflection, not dismissiveness.

Getting women into history and science is great, but where are the men?

Cross-posted from The H Word.

Now that Ada Lovelace Day – which seeks to highlight inspirational stories about women in science – is safely over, I can mention something about it that concerns me. It is a broader concern than this one day, and is basically this: where were the men?

As I explained in an earlier post, I am all for seeking out women’s stories in history, because they give us a different and equally important perspective on how science happens. I also showed, in the context of Wikipedia editing, that it is hugely important for women to be making and sharing knowledge. However, I am concerned by the fact that the two things seem so often to have to go together: women, the assumption runs, should be writing about women.

In fact, men and women should be writing about men and women (and boys and girls, animals and instruments, seas and skies, or whatever). I think it would be hard to disagree with this point, and yet so often we find ourselves positioned against this.

I very happily went along last Tuesday to host an Ada Lovelace special for PubSci, to hear several women speaking about the women in history who helped provide inspiration in their scientific careers. But I would have loved it if there had been one or two men speaking about women too. (Though many thanks to the men who organised and attended the event!)

In my academic career, writing about gender is something that has both held an appeal and been something I have purposely avoided. Having written my undergraduate dissertation on women in 17th-century drama, I was keen not to find myself pigeonholed as a woman who only wrote about women, and for some time I focused fairly squarely on the men of science who formed the core of my postgraduate research.

Fairly or unfairly, my sense was that this reinforcement of the view that women’s history is by and for women can serve to trivialise both the makers and subjects of that research. It becomes a niche, when it should be an integral part.

The same danger can arise when all the work and publicity about bringing women into research/academia/politics/business is delivered by women. If we agree that it is important that female voices should be much better represented in a host of important and influential places, then that is a concern for everyone.

That it should be a shared concern is the reason why I do not agree with those (men) who complain about women-only shortlists, training sessions aimed at women, or the need to create “binders of women” that can be waved under the nose of politicians suffering from female-selective blindness. But these things should be endorsed, created and delivered by men as well as women, until such time as these particular distinctions cease to hold significance in the workplace.