Farewell Greenwich Mean Time (see you in October)

Cross-posted from The H Word blog, where this was first published on 30 March 2014.

The 24-hour Shepherd Gate Clock outside the Royal Observatory, Greenwich, displaying Greenwich Mean Time to the public.


It has become something of a tradition on this blog to mark the biannual change of the clocks and, although I no longer work at the Royal Observatory Greenwich, it’s a habit that sticks. This time, as we say farewell to it until the autumn, it seems a good opportunity to reminisce about Greenwich Mean Time.*

Why Greenwich time? And what’s mean about it?

Mean time is clock time. It is a regularised, idealised version of solar time that is tracked not by the apparent motions of the Sun, observed by shadows on sundials, but by a mechanical device that splits the solar day into equal parts. Mean time ticks away at the same pace no matter the season. The difference between the two is described by the equation of time.
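In modern notation (a sketch only – sign conventions vary between sources), the relationship can be written as

$$\text{apparent solar time} = \text{mean solar time} + \Delta t_{\mathrm{EoT}}$$

where $\Delta t_{\mathrm{EoT}}$, the equation of time, swings between roughly $-14$ and $+16$ minutes over the course of the year.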

Establishing the relationship between mean solar time and apparent solar time only really became possible, or useful, with the arrival of the pendulum clock in the 1650s. This made the mechanical clock, for the first time, a scientific instrument. Christiaan Huygens, who developed the first prototype pendulum clock in 1656, was able to produce reasonably accurate tables of the equation of time in 1665.

However, it fell to John Flamsteed to publish tables in 1672-3 that tackled the problem in what became the standard way. He provided the formula by which apparent solar time could be converted into Mean Time.

Just a couple of years later, Flamsteed was appointed the first Astronomer Royal and moved into the newly built observatory in Greenwich. There, he and his patrons had installed state-of-the-art pendulum clocks by the best clockmaker available, Thomas Tompion. With observations of the Sun and the help of his tables, Flamsteed set these clocks to the local time: Greenwich Mean Time.

Greenwich time became important because there were people measuring it and because other people made use of astronomical observations based on it. Flamsteed’s catalogue of stars, which was to become a standard reference work for the following decades, listed their positions based on Greenwich time.

It was one of Flamsteed’s successors, Nevil Maskelyne, Astronomer Royal from 1765 to 1811, who did most to ensure that GMT mattered to more than just astronomers. Under his initiative, observations made at Greenwich were processed into tables that could be used by navigators and cartographers to establish positions at sea or on land. This was the Nautical Almanac, first published for the year 1767.

Surveyors of the Royal Navy and the Ordnance Survey relied on data that was based on observations made at Greenwich, meaning that their charts and maps used Greenwich as a reference point. More precisely, this was the meridian (north-south line) on which the chief telescope at Greenwich was mounted. The Greenwich meridian thus became a prime meridian for British mapping, and east-west position was measured from there. To establish longitudes it was necessary to know the difference between local time and GMT. This could be worked out with astronomical observations and the tables of the Nautical Almanac and, increasingly, with chronometers set to GMT.
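The arithmetic underlying this is worth spelling out (a simple sketch, not the working of any particular almanac). The Earth turns through $360^{\circ}$ in 24 hours, so each hour of difference between local time and GMT corresponds to $15^{\circ}$ of longitude:

$$\lambda = 15^{\circ}\text{ per hour} \times (T_{\text{local}} - T_{\text{GMT}})$$

A navigator whose local noon falls two hours ahead of Greenwich noon is therefore at longitude $30^{\circ}$ east; two hours behind, $30^{\circ}$ west.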

The move of GMT from the specialist worlds of astronomy, navigation and surveying into civilian life was down to the increasing role of technologies and cultures that demanded standardisation. The arrival of railways made timetabling a necessity. Telegraph systems made it both desirable and possible to know what time it was elsewhere. Factory work made production and payment dependent on timekeeping.

GMT became “Railway Time” in the 1840s, and Britain’s legal standard time in 1880. Despite what you’ll often read, it did not become an international standard in 1884. In that year an international conference did recommend the adoption of the Greenwich meridian as the world’s reference point for time and longitude, but it was just a recommendation.

What actually happened as a result of the International Meridian Conference, and what did not, is a story for another post. See you back here on 26 October.

* Our standard time is now in fact Coordinated Universal Time (UTC), derived from International Atomic Time but as close as darn it to GMT.

Cosmos and Giordano Bruno: the problem with scientific heroes

Cross-posted from The H Word blog.

 

Statue of Giordano Bruno, erected at Campo de’ Fiori in Rome, 1889.


Although it’s not as big news in the UK as it has been in the US, readers of the Guardian science pages may have noticed that Carl Sagan’s classic series Cosmos: A Personal Voyage is being remade by Fox and presented by Neil deGrasse Tyson as Cosmos: A Spacetime Odyssey.

After it was broadcast in the US last Sunday, I saw a lot of love being expressed on my Twitter timeline. However, it has also prompted some interesting comments from historians of science. We in the UK can see it for ourselves this Sunday (if we have access to the right channels), but here are some articles and posts that give food for thought.

In The Atlantic, Audra Wolfe looked at the Cold War context in which the original Cosmos succeeded, or could, at least, be credited by many with having kicked off a decade-long “popular science boom”. What the Cosmos effect actually was does not seem to have been measured but, even if real, Wolfe points out that times have changed. She argues that Cosmos Can’t Save Public Support for Science today, particularly if it is “weigh[ed] down with Cold War-era fantasies that confuse the public understanding of science with its appreciation.”

Other historians have been prompted to comment on Cosmos because, as in the original, history of science is part of the package. Much has been said about the importance of the remake as a high-profile broadcast that can reflect the extent to which science has moved on since 1980. History of science has also moved on: is this reflected in the new series?

The answer, it seems, is yes (a bit) and (mostly) no. In the first episode, a rather hefty portion of airtime (11 out of 43 minutes) is devoted to an animation on the life of Giordano Bruno. Burnt at the stake by the Roman Inquisition in 1600, he was there to play the role of scientific hero and martyr. It is an ill-fitting part for this idiosyncratic Dominican friar.

Laudably avoiding any temptation to snark, Meg Rosenburg took the sudden interest in this reasonably obscure figure as an opportunity to help those who might Want to Know More About Giordano Bruno. While Bruno’s cosmological poetry and mystical thought included heliocentrism, he was not, of course, a scientist, nor was he sentenced to death for “scientific” ideas or anything like “the nice-mannered, doe-eyed dissenter” that appears on the screen.

In fact, Bruno is so obviously a problematic choice as a scientific martyr that several non-historians have also picked up on the issue. Corey S. Powell in Discover Magazine suggested that Cosmos picked the wrong hero, and that another – even more obscure but significantly more astronomical – early Copernican, Thomas Digges, might have been a better bet. Hank Campbell at The Federalist picked the Bruno problem as the most significant of Five Things that Cosmos Gets Wrong.

Becky Ferreira at Motherboard carefully explained What Cosmos Gets Wrong About Giordano Bruno, the Heretic Scientist, although, as she notes, it was not all bad as the account “did a pretty good job of covering its butt by shoehorning in some of Bruno’s contradictions, like the fact that he was a crappy scientist (and many historians argue he shouldn’t be considered one at all).”

Yet the overriding message appears to have been about heroic passion for truth against dogma, and science versus religion. Despite the nod to nuance, this is a case of turning history into parable.

This is problematic for many reasons, one of which is that it doesn’t exactly sit well with claims to champion evidence-based knowledge. Another is that hiding parts of Bruno’s story that undermine the image of the scientific martyr plays into the hands of those who are only too pleased to highlight what might appear to be anti-religious propaganda coming from the scientific and media establishment (thanks to Rosenburg for tweeting that link).

Historical figures who lived in a very different world, very differently understood, cannot be turned into heroes who perfectly represent our values and concerns without doing serious damage to the evidence. It reminds me of one of the 19th-century men of science-cum-historians I researched, who learned this lesson the hard way.

In 1831 David Brewster published a short biography of Isaac Newton, portraying him as a hero who represented everything the author wanted to say about the moral status of science and its practitioners, and how they should be supported in late Georgian Britain. A couple of decades later he produced a much expanded biography, this time based in part on the unpublished archive. Lo and behold: Newton was a nasty piece of work, he was unorthodox in his Christian belief and he was a dedicated alchemist.

Poor Brewster! Although, as a reviewer said, he attempted to “do his best” by his hero, he was sufficiently dedicated to the evidence to “admit” the faults in public. It undermined his overriding narrative and seems to have caused him real personal anguish. Let this be a cautionary tale against those who invest too much in their heroes – and a call for some evidence-based history to help us better understand what science has been, is now and could be in the future.

 

Information, images and imagination: Beautiful Science exhibition explores data visualisation

Opening tomorrow [NB this post is cross-posted from The H Word, where it was first published on 20 February - the exhibition remains open until 26 May], the British Library’s Beautiful Science exhibition raises fascinating questions about the power of visualisations and how we might tell their history

 

Florence Nightingale’s “rose diagram”, showing the Causes of Mortality in the Army in the East, London, 1858. Photograph: British Library


Beautiful Science: Picturing Data, Inspiring Insight, which opens at the British Library tomorrow, is a small but thought-provoking display that looks at how scientific data has and can be visualised. Prompted by today’s interest in big data and infographics, it merges modern digital displays with historic texts and images.

Perpetual Ocean. Photograph: NASA/Goddard Space Flight Center Scientific Visualization Studio, 2011.


According to the exhibition’s curator, Johanna Kieniewicz, it is the British Library’s “first science exhibition”, which seems extraordinary, given the extent to which its collections can reflect the display’s theme.

However, science has often featured in the Library’s larger, more overtly historical exhibitions. Henry VIII: Man and Monarch included a section that – slightly handwavingly – indicated new approaches to knowledge and development of key areas like navigation. The current Georgians Revealed exhibition likewise has a section that notes, particularly, the new technologies, innovation and entrepreneurial spirit of the 18th century.

The Out of this World exhibition, while focusing on science fictions, nevertheless revealed a great deal about the excitement, expectations, humour and fears surrounding science and technology in a broad range of periods. Points of View dwelt on the scientific and technical, as well as artistic, development of photography. Magnificent Maps was a perfect demonstration of how travel and precision techniques create new knowledge in ways that suit different audiences.

The key display artefacts in Beautiful Science are, as in these other exhibitions, historical texts, charts, maps and illustrations, so we might wonder what is new. The answer, it seems, is that it is the first display to have been led by the Library’s Science and Digital teams, rather than that it displays science per se. What difference might this make?

The display items are well-chosen, and include some key examples of innovation in data collection and presentation. However, the science- rather than history-led interpretation of the 17th- to 19th-century texts is clear in the fact that their selection reflects trends and concerns of the present, rather than a concern to reveal those of the past. There is, likewise, an emphasis on progress toward ever better and more accurate approaches to data visualisation (although in a post at PLOS Blogs, Kieniewicz suggests that designers have recently stolen a march on scientists in the display of data).

The three themes of the display are Weather and Climate, Public Health and (rather less obviously) Tree of Life. The first includes Halley’s world map of trade winds, a persuasive form that masked his lack of data, and two 18th-century log books from ships of the East India Company. The latter are there less because they have much to say about visualising data in the past (although the recorded observations did feed into charts by Halley and others, and one of the log books includes a charming sketch of a sea bird) and more because of current climate science projects that are attempting to make use of old data.

Such data, however, also has much to say about how 18th-century mariners saw their world and Empire, what they understood about weather or climate, and what they considered important to record, all of which maps only poorly onto our own concerns.

“Public Health” naturally includes John Snow’s famous map of cholera cases in London’s Soho and Florence Nightingale’s “visually gripping” rose diagrams representing the effects of her sanitary reforms on mortality during the Crimean War (top). Here the power of visual data is made clear, being, above all, an extremely effective tool of persuasion for public opinion and government action.

The “Tree of Life” section is an excuse to bring out some lovely early modern illustrations and, while it seems a bit too simplistic to connect these theological and metaphysical meditations directly to modern taxonomies and diagrams “based on scientific data and information”, we are prompted to reflect on how older views have left their mark. If a branching tree of evolutionary theory recalls a Great Chain of Being, then it is far too teleological (that is, progressive and purposeful) to represent natural selection.

The Pedigree of Man, from Ernst Haeckel, The Evolution of Man, London, 1879. Photograph: British Library


In comparing earlier and modern taxonomies, it is interesting too to speculate on changing criteria. All systems of categorisation are to some degree unnatural, despite claims to be representing nature. Today, in a way that would have bemused earlier taxonomists, genomic data trumps visual description. Has the (family) tree analogy made this inevitable?

The British Library is the perfect institution for discussions between science, arts and the humanities to take place. While defined as a “science exhibition”, visitors to the display and participants in the accompanying events programme should be encouraged to see the aesthetic and the historical in it too – just as the science of the Tudor or Georgian eras should be recognised as part of their history.

Beautiful Science runs from 20 February to 26 May 2014. See the British Library’s website for the full list of related events.

 

Captain Cook and Australia Day: invasion, exploitation and science

Cross-posted from The H Word blog, where this was first posted on 27 January 2014.

Captain Cook’s contested reputation casts him as imperialist villain or man of science. Whatever we think of him, the two roles are not mutually exclusive

Statue of Captain James Cook outside the National Maritime Museum. Photograph: David Iliffe/Wikimedia Commons

Yesterday [26 January] was Australia Day and, thanks in part to social media, it seems to have been more overtly contested than ever before. As a much-shared piece on this website stated, for many Australia Day is a time for mourning, not celebration. Marking the anniversary of the arrival of the 11 British ships known as the First Fleet in 1788, its choice as a national holiday has long been contested. In my Twitter feed, #invasionday was more prevalent than the trending Happy Australia Day.

As a historian of science working on the history of 18th-century navigation, I’ve noticed how often Captain Cook appears as the symbol of the British invasion. Yesterday, for example, Australian comedian Aamer Rahman joked on Twitter that he had a Cook-shaped piñata to celebrate the holiday (that wept white tears when hit) and, earlier in the week, Cook’s family cottage was graffitied with slogans, including “26th Jan Australia’s shame”.

This is odd, in some ways, as Cook died nearly a decade before the Fleet sailed. He did not invade or settle, nor was his ship even the first European vessel to reach Australia. However, the fact that his cottage was vandalised in Melbourne, having been moved there from Yorkshire in 1934, perhaps tells us almost everything we need to know about how Cook’s reputation has been welded to his brief visit to Australia, and how he has been both near-deified and villainised ever since.

In illustration of the complexity of Cook’s legacy, ex-pat New Zealander Vicky Teinaki alerted me to a film on display at the Captain Cook Birthplace Museum in Yorkshire. The museum’s website describes it as “recording the reaction of contemporary communities to Cook’s legacy” and these, Vicky said, could be generalised into three groups: “acknowledging he was a great & brave explorer, anger at the white man diseases he brought, or ‘better English than French’”.

The reaction from this side of the world depends, I think, on whether Cook is viewed as the military man – a blue-coated, gun-toting officer of the Royal Navy – or the explorer and man of science. He was, of course, both, for the categories are not mutually exclusive. On the Endeavour voyage he was paid by and carried out the instructions of the Navy and the Royal Society of London. He was both a vessel commander and one of two astronomers charged with carrying out a range of observations, including the 1769 transit of Venus and longitude determinations.

Those who cast Cook as a man of science note not only his ability in astronomical observation and mathematical calculation, but also his careful observation of the new lands, flora, fauna and peoples he encountered. Regarding Cook’s journal descriptions of the latter, the National Library of Australia is careful to note that “Lord Morton, President of the Royal Society, had advised Cook by letter to treat with respect the Indigenous people he encountered and to communicate peacefully with them.”

Yet it is obvious that all the science undertaken on his and similar voyages was part and parcel of the process of exploration and colonisation. The transit of Venus observations were bound up with attempts to improve navigation and cartography, which, along with botany, geography and ethnography, provided information about how best to exploit new territories.

Cook is, perhaps, less directly worthy of vilification than those who developed policies for colonisation and who governed societies that forgot the caution and respect that Morton had urged. Equally, however, he is among those to whom we might attach collective guilt for their role in making empire and exploitation possible.

If Cook is guilty in this way, were not also many of those who stayed at home? Morton and the Royal Society, who linked their enterprise firmly to Britain’s imperial interests? John Harrison and the Commissioners of Longitude, who looked for ways to make long-distance sea voyages and the data they brought home more reliable?

This train of thought led me to recall an interview I recently heard on Radio 4, with a scientist brought in to discuss the Moon’s potentially exploitable natural resources. How might we manage the claims of different nations (limited, in theory, by international agreement) and private companies (currently unlimited in law) to these minerals? Might this lead to conflict, injustice and over-exploitation?

The planetary scientist pushed the questions away. We do not yet know if anything useful is there, he said, and no one yet has the resources to make lunar mining profitable. His aim is simply to find out what is there, not to worry about the consequences. Given what history tells us, it might seem better to resist looking. At the very least, it seemed shockingly blasé to say that any future conflicts, rivalries and ruination would have nothing at all to do with the curiosity-driven likes of him.

Cook could not foresee the results of his actions. Understanding of the transmission of disease or the consequences of introducing alien species was limited; rivalry for worldwide empire was, as yet, in its infancy, and belief in the virtue of spreading European knowledge and values was firm. Cook is blamed because of hindsight, a little of which should always prompt a greater sense of responsibility today.

Historical images of women using scientific instruments

Cross-posted from The H Word.

Biologist Beatrice Mintz (b. 1921), with microscope. Photograph: Smithsonian Institution/flickr

 

There is something about one of my Pinterest boards that seems to have caught the imagination. It is, as the platform allows, simply a way of collecting and displaying images that I have culled from elsewhere across the internet, hitting a particular theme. This one is called Women using scientific instruments.

At present it holds only 42 images, from the 14th century to the 1970s, the majority coming from the mid 19th to the mid 20th century. It has largely been created by chance and targeted Googling, and offers no narratives and little interpretation. Yet it seems to have provided something that at least some people were looking for.

I started it some time back. Having written a blogpost including an image of putti using scientific instruments, I got into conversation with Danny Birchill on Twitter and mentioned that this had once been a fairly common trope and that, pre-19th century, images of people actually using scientific instruments were relatively rare. Danny was prompted to make his own Pinterest board, Putti of Science (there are many other examples).

This was my introduction to Pinterest, and I set about creating some history of science-themed boards myself. I hadn’t really promoted them but, after I happened to mention the boards on Twitter, Alice Bell tweeted:

Historian of science, @beckyfh has a ‘women using scientific instruments’ board on Pinterest. And it’s a delight. http://www.pinterest.com/beckyfh1/women-using-scientific-instruments/

And it took off from there, with lots of re-tweets and follows on the board. As well as Alice’s “it’s a delight”, comments included “This is so great I may not sleep tonight”, “This gives me goosebumps” and “1st time I understand Pinterest”. I am not sure I have ever put together anything that seems to have had such an overwhelmingly positive response.

It is particularly interesting for me to have been part of this, given that I have sometimes found problems with the way that women in the history of science have been celebrated. Historical facts are rather too often ignored in favour of good stories and the creation of scientific heroes. Yet, the response to this set of images helps remind me how much women in science and science communication need to see themselves reflected in history.

It is also, as someone pointed out on Twitter with a link to this hilarious gallery of stock photography of women, a perfect response to the way women are so often depicted in the media. On my board I have eschewed the modern, posed images of “female scientist” and “woman with test tube”, and instead have largely gathered images of women who actually made use of the instruments they are shown with.

For me, it’s also important that only a few of these women are well known. This is not about creating heroines of science, or making any sort of claim beyond the fact that these particular women were there. It remains obvious that there are far more historical images of men with scientific instruments, and the images also show that women’s experience of science was often mediated by men. But this, and the fact that for much of history they were more likely to be part of the audience or figuring as a muse, should be recalled rather than swept under the carpet.

I, however, will remember the response to this simple collection. It was an excellent reminder that the past does not just belong to historians.

• Rebekah Higgitt will be speaking as part of a panel on Doing Women’s History in a Digital Age at the Women in Science Research Network conference this May. Comment here or tweet her @beckyfh with suggestions for the Pinterest board.

Who’s missing in modern academia: solitary geniuses or something much more significant?

Cross-posted from The H Word blog, where this post first appeared on 10 December 2013.

1974 portrait of Isaac Newton as solitary genius.

When Peter Higgs, of Higgs boson fame, was quoted in the Guardian on Friday as saying “Today I wouldn’t get an academic job” because he would not “be regarded as productive enough”, it prompted much nodding and retweeting from academics.

Coming as it did on the tail of British academics’ rush to complete submissions to the REF (Research Excellence Framework), in a term that has seen two strikes over fair pay in Higher and Further Education, and at a time when there are reports of long working hours and other pressures on academics affecting wellbeing, it is hardly surprising that there was sympathy toward Higgs’s negative judgement of today’s focus on “productivity” and publication.

When Higgs was quoted as saying “It’s difficult to imagine how I would ever have enough peace and quiet in the present sort of climate to do what I did in 1964”, many academics undoubtedly heaved a sigh and got back to the marking, teaching preparation, grant application, or whatever other non-research-related activity they were currently engaged in.

It seems, though, that Higgs’s comments struck a wider chord, perhaps because of the extent to which they conform to the stereotype of the solitary scientific genius. His “peace and quiet” of 1964 (aged 35) brings to mind Newton’s escape to his Lincolnshire family home in 1666 (aged 24), and it is contrasted in the article with “expectations on academics to collaborate and keep churning out papers”. This is the kind of thing we want to hear our science Nobel winners saying.

Teaching, which takes up a huge proportion of most academics’ time, is not mentioned in this piece. I have no idea what kind of a teacher Higgs was, but Isaac “lecture to the walls” Newton clearly would have been a flop on Rate my Professor and a liability for a university anxious about its position in the National Student Survey. He would probably have been just as problematic for REF. Although he was to go on to have a staggering impact (or Impact), Newton was famously, for much of his life, reluctant to publish.

In many ways Newton and his mythology became a model for how we think of genius, particularly in the physical sciences. Stories of his youthful moment of inspiration, his forgetfulness, his oddness, his solitariness and his immersion in his work abound. Yet he was also someone who learned not just from books but also from his Cambridge tutors and colleagues and wide correspondence, who made his approaches to the Royal Society with scientific papers and the gift of his reflecting telescope, and who went on to become an MP and to lead the Royal Mint and Royal Society.

Science is profoundly collaborative, relying on communication with peers and students, and collaboration with colleagues and a whole range of other “stakeholders”. It goes without saying that there have always been many people doing scientific work who not only put up with but also thrived on all those other activities. Science would not have developed without them.

While there are some, perhaps-justified, fears about modern academia effectively losing the insights of the next Newton, it’s worth recalling the circumstances in which many of the well-known figures in the history of science conducted their work. While they may not have been writing grant reports or marking exams, they were likely seeking patronage, carrying on journalistic careers, undertaking the duties of a doctor or a vicar, teaching, running a family business or otherwise making a – usually non-scientific – living.

Those who really were excluded were not solitary geniuses who could not find sufficient time for thinking, but those who were, as a result of class, geography, race or gender, never likely to have the opportunity to begin an education, let alone contribute to the established scientific societies and journals. And this affected the science that was done: ample research shows how the norms, assumptions and interests of elites have shaped supposedly value-free science.

Science and academia today remain embarrassingly homogeneous. However, the fear is not so much that we might be failing to find or support working class, black or female geniuses, but that we are more broadly missing out on other perspectives and experiences that would help frame different questions and solutions. It is for this – as well as the good health and useful productivity of academics – that we need to fight not just for better investment in Higher Education, supporting excellent outreach and teaching as well as research, but for a fairer society.

Clock change challenge

Cross-posted from The H Word, where this post first appeared on 27 October 2013.

Alarm clock

Spring forward; fall back. Or was that spring back and fall forward (equally possible)? And will we ever have a mnemonic that works for those of us who usually talk about autumn rather than fall? At least it’s always an hour and only twice a year, right?

It’s hard to imagine, but the person who lobbied most vigorously for the introduction of British Summer Time at the beginning of the last century actually suggested that this confusion be extended over the course of four weeks. At each end of the summer the clocks would be shifted 80 minutes, rather than an hour, in 20-minute chunks. This is what William Willett suggested in 1907 in a pamphlet that worried about The Waste of Daylight.

Imagine having to remember if we were in week 0 or 1, 4 or 5. Weeks’ worth of excuses for being late for that meeting or missing the train! Today, of course, things are markedly easier, since most of us have devices that update the time they display automatically, but keeping on top of that in the early 20th century would surely have been a significant challenge.

Willett’s enthusiasm for fiddling with the clocks apparently derived from a revelation that hit him when out riding early one morning in Petts Wood. It was a beautiful summer’s day but all around him were drawn curtains and closed blinds. His fellow men and women were missing out on this joyful and healthful experience. Willett was sufficiently sure of the benefits of “early to bed and early to rise” to be convinced that it might make the whole nation “healthy, wealthy and wise”.

Thus, although people always moan about clocks changing (back to GMT) in autumn, with our evenings and afternoons plunged into increasing darkness, the idea is and was all about making the best of the available daylight in summer rather than worrying about farmers in Scotland or kids going to school on dark mornings in winter. Daylight in winter is scarce at high latitudes, and no amount of clock-fiddling will change that.

It was the additional light in summer that had the potential to improve the “health and strength of body and mind”, Willett thought. He therefore proposed:

that at 2 a.m. on each of four Sunday mornings in April, standard time shall advance 20 minutes; and on each of four Sundays in September, shall recede 20 minutes, or in other words that for eight Sundays of 24 hours each, we shall substitute four, each 20 minutes less than 24 hours, and four each 20 minutes more than 24 hours.
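Unpacking the arithmetic behind the scheme (my reconstruction of the sum, not Willett’s own working): once all four shifts had been applied, the clocks would stand

$$4 \times 20\,\text{min} = 80\,\text{min}$$

ahead of standard time, and over a week that yields

$$80\,\text{min/day} \times 7\,\text{days} = 560\,\text{min} = 9\,\text{h}\,20\,\text{min}$$

of shifted daylight – the “extra” figure Willett quotes below.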

Easy peasy! Willett pointed out that this adjusting of clock hands – which was, perhaps, a more familiar task in days when timepieces still had to be wound regularly – would provide an “extra” 9 hours and 20 minutes for beneficial activity each summer week. The 20-minutes-a-week-for-4-weeks idea was intended to make the switch less sudden and, therefore, less objectionable. It was, he suggested, akin to the change in time experienced, without ill effect, by those travelling east or west by ship.

He had a point: there was no “ship lag” akin to the jet lag that aeroplanes introduced. There has been research that suggests that the loss of an hour in spring can be enough to cause fatigue and increase road accidents. Perhaps gradual change would be better, so long as half our minds aren’t busy wondering which 20-minute increment we’re currently on.

There are, of course, alternative ideas. There’s so-called Single/Double Summer Time, which would put us in synch with Central European Time, and please a number of campaigners and interest groups. However, when we last tried sticking with BST in winter, in 1968-71, the even longer and deeper morning darkness proved to be a deal breaker.

We could, of course, do away with daylight savings altogether. Perhaps we could extend working hours in summer and reduce them in winter? Or simply shift business hours or starting times rather than clock time. This is what Benjamin Franklin suggested, as he joked that shutters could be taxed and candles rationed to “encourage” his fellow man to make better use of light. Church bells and cannon sounding as the Sun rose might reinforce the point:

All the difficulty will be in the first two or three days; after which the reformation will be as natural and easy as the present irregularity.

Rebekah Higgitt scheduled this post last night in the fond hope that she might get her additional hour in bed this morning. If you check her @beckyfh, however, you’ll more than likely find her moaning about the fact that young children don’t take the blindest bit of notice of the clock.