Captain Cook and Australia Day: invasion, exploitation and science

Cross-posted from The H Word blog, where this was first posted on 27 January 2014.

Captain Cook’s contested reputation casts him as imperialist villain or man of science. Whatever we think of him, the two roles are not mutually exclusive

Statue of Captain Cook at Greenwich
Statue of Captain James Cook outside the National Maritime Museum. Photograph: David Iliffe/Wikimedia Commons

Yesterday [26 January] was Australia Day and, thanks in part to social media, it seems to have been more overtly contested than ever before. As a much-shared piece on this website stated, for many Australia Day is a time for mourning, not celebration. Marking the anniversary of the arrival of the 11 British ships known as the First Fleet in 1788, its choice as a national holiday has long been contested. In my Twitter feed, #invasionday was more prevalent than the trending Happy Australia Day.

As a historian of science working on the history of 18th-century navigation, I’ve noticed how often Captain Cook appears as the symbol of the British invasion. Yesterday, for example, Australian comedian Aamer Rahman joked on Twitter that he had a Cook-shaped piñata to celebrate the holiday (that wept white tears when hit) and, earlier in the week, Cook’s family cottage was graffitied with slogans, including “26th Jan Australia’s shame”.

This is odd, in some ways, as Cook died nearly a decade before the Fleet sailed. He did not invade or settle, nor, even, was his ship the first European contact with Australia. However, the fact that his cottage was vandalised in Melbourne, having been moved from Yorkshire in 1934, perhaps tells us almost everything we need to know about how Cook’s reputation has been welded to his brief visit to Australia and has been both near-deified and villainised ever since.

In illustration of the complexity of Cook’s legacy, ex-pat New Zealander Vicky Teinaki alerted me to a film on display at the Captain Cook Birthplace Museum in Yorkshire. The museum’s website describes it as “recording the reaction of contemporary communities to Cook’s legacy” and these, Vicky said, could be generalised into three groups: “acknowledging he was a great & brave explorer, anger at the white man diseases he brought, or ‘better English than French’”.

The reaction from this side of the world depends, I think, on whether Cook is viewed as the military man – a blue-coated, gun-toting officer of the Royal Navy – or the explorer and man of science. He was, of course, both, for the categories are not mutually exclusive. On the Endeavour voyage he was paid by and carried out the instructions of the Navy and the Royal Society of London. He was both a vessel commander and one of two astronomers charged with carrying out a range of observations, including the 1769 transit of Venus and longitude determinations.

Those who cast Cook as a man of science note not only his ability in astronomical observation and mathematical calculation, but also his careful observation of the new lands, flora, fauna and peoples he encountered. Regarding Cook’s journal descriptions of the latter, the National Library of Australia is careful to note that “Lord Morton, President of the Royal Society, had advised Cook by letter to treat with respect the Indigenous people he encountered and to communicate peacefully with them.”

Yet it is obvious that all the science undertaken on his and similar voyages was part and parcel of the process of exploration and colonisation. The transit of Venus observations were bound up with attempts to improve navigation and cartography, which, along with botany, geography and ethnography, provided information about how best to exploit new territories.

Cook is, perhaps, less directly worthy of vilification than those who developed policies for colonisation and who governed societies that forgot the caution and respect that Morton had urged. Equally, however, he is among those to whom we might attach collective guilt for their role in making empire and exploitation possible.

If Cook is guilty in this way, were not also many of those who stayed at home? Morton and the Royal Society, who linked their enterprise firmly to Britain’s imperial interests? John Harrison and the Commissioners of Longitude, who looked for ways to make long-distance sea voyages and the data they brought home more reliable?

This train of thought led me to recall an interview I recently heard on Radio 4, with a scientist brought in to discuss the Moon’s potentially exploitable natural resources. How might we manage the claims of different nations (limited, in theory, by international agreement) and private companies (currently unlimited in law) to these minerals? Might this lead to conflict, injustice and over-exploitation?

The planetary scientist pushed the questions away. We do not yet know if anything useful is there, he said, and no one yet has the resources to make lunar mining profitable. His aim is simply to find out what is there, not to worry about the consequences. Given what history tells us, it might seem better to resist looking. At the very least, it seemed shockingly blasé to say that any future conflicts, rivalries and ruination would have nothing at all to do with the curiosity-driven likes of him.

Cook could not foresee the results of his actions. Understanding of the transmission of disease or the consequences of introducing alien species was limited; rivalry for worldwide empire was, as yet, in its infancy, and belief in the virtue of spreading European knowledge and values was firm. Cook is blamed because of hindsight, a little of which should always prompt a greater sense of responsibility today.

Historical images of women using scientific instruments

Cross-posted from The H Word.

Biologist Beatrice Mintz (b. 1921) with microscope
Biologist Beatrice Mintz (b. 1921), with microscope. Photograph: Smithsonian Institution/flickr

 

There is something about one of my Pinterest boards that seems to have caught the imagination. It is, as the platform allows, simply a way of collecting and displaying images that I have culled from elsewhere across the internet, hitting a particular theme. This one is called Women using scientific instruments.

At present, it is only 42 images, from the 14th century to the 1970s, the majority coming from the mid 19th to the mid 20th century. It has largely been created by chance and targeted Googling and offers no narratives and little interpretation. Yet it seems to have provided something that at least some people were looking for.

I started it some time back. Having written a blogpost including an image of putti using scientific instruments, I got into conversation with Danny Birchill on Twitter and mentioned that this had once been a fairly common trope and that, pre-19th century, images of people actually using scientific instruments were relatively rare. Danny was prompted to make his own Pinterest board, Putti of Science (there are many other examples).

This was my introduction to Pinterest, and I set about creating some history of science-themed boards myself. I hadn’t really promoted them but, after happening to mention it on Twitter, Alice Bell tweeted:

Historian of science, @beckyfh has a ‘women using scientific instruments’ board on Pinterest. And it’s a delight. http://www.pinterest.com/beckyfh1/women-using-scientific-instruments/

And it took off from there, with lots of re-tweets and follows on the board. As well as Alice’s “it’s a delight”, comments included “This is so great I may not sleep tonight”, “This gives me goosebumps” and “1st time I understand Pinterest”. I am not sure I have ever put together anything that seems to have had such an overwhelmingly positive response.

It is particularly interesting for me to have been part of this, given that I have sometimes found problems with the way that women in the history of science have been celebrated. Historical facts are rather too often ignored in favour of good stories and the creation of scientific heroes. Yet, the response to this set of images helps remind me how much women in science and science communication need to see themselves reflected in history.

It is also, as someone pointed out on Twitter with a link to this hilarious gallery of stock photography of women, a perfect response to the way women are so often depicted in the media. On my board I have eschewed the modern, posed images of “female scientist” and “woman with test tube”, and instead have largely gathered images of women who actually made use of the instruments they are shown with.

For me, it’s also important that only a few of these women are well known. This is not about creating heroines of science, or making any sort of claim beyond the fact that these particular women were there. It remains obvious that there are far more historical images of men with scientific instruments, and the images also show that women’s experience of science was often mediated by men. But this, and the fact that for much of history they were more likely to be part of the audience or figuring as a muse, should be recalled rather than swept under the carpet.

I, however, will remember the response to this simple collection. It was an excellent reminder that the past does not just belong to historians.

• Rebekah Higgitt will be speaking as part of a panel on Doing Women’s History in a Digital Age at the Women in Science Research Network conference this May. Comment here or tweet her @beckyfh with suggestions for the Pinterest board.

Who’s missing in modern academia: solitary geniuses or something much more significant?

Cross-posted from The H Word blog, where this post first appeared on 10 December 2013.

1974 portrait of Isaac Newton as solitary genius.

When Peter Higgs, of Higgs boson fame, was quoted in the Guardian on Friday as saying “Today I wouldn’t get an academic job” because he would not “be regarded as productive enough”, it prompted much nodding and retweeting from academics.

Coming as it did on the tail of British academics’ rush to complete submissions to the REF (Research Excellence Framework), in a term that has seen two strikes over fair pay in Higher and Further Education and at a time when there are reports of long working hours and other pressures on academics affecting wellbeing, it is hardly surprising that there was sympathy toward Higgs’s negative judgement of today’s focus on “productivity” and publication.

When Higgs was quoted as saying “It’s difficult to imagine how I would ever have enough peace and quiet in the present sort of climate to do what I did in 1964”, many academics undoubtedly heaved a sigh and got back to the marking, teaching preparation, grant application, or whatever other non-research-related activity they were currently engaged in.

It seems, though, that Higgs’s comments struck a wider chord, perhaps because of the extent to which they conform to the stereotype of the solitary scientific genius. His “peace and quiet” of 1964 (aged 35) brings to mind Newton’s escape to his Lincolnshire family home in 1666 (aged 24), and it is contrasted in the article with “expectations on academics to collaborate and keep churning out papers”. This is the kind of thing we want to hear our science Nobel winners saying.

Teaching, which takes up a huge proportion of most academics’ time, is not mentioned in this piece. I have no idea what kind of a teacher Higgs was, but Isaac “lecture to the walls” Newton clearly would have been a flop on Rate my Professor and a liability for a university anxious about its position in the National Student Survey. He would probably have been just as problematic for REF. Although he was to go on to have a staggering impact (or Impact), Newton was famously, for much of his life, reluctant to publish.

In many ways Newton and his mythology became a model for how we think of genius, particularly in the physical sciences. Stories of his youthful moment of inspiration, his forgetfulness, his oddness, his solitariness and his immersion in his work abound. Yet he was also someone who learned not just from books but also from his Cambridge tutors and colleagues and wide correspondence, who made his approaches to the Royal Society with scientific papers and the gift of his reflecting telescope, and who went on to become an MP and to lead the Royal Mint and Royal Society.

Science is profoundly collaborative, relying on communication to peers and students, and collaboration with colleagues and a whole range of other “stakeholders”. It goes without saying that there have, always, been many people doing scientific work who not only put up with but also thrived on all those other activities. Science would not have developed without them.

While there are some, perhaps-justified, fears about modern academia effectively losing the insights of the next Newton, it’s worth recalling the circumstances in which many of the well-known figures in the history of science conducted their work. While they may not have been writing grant reports or marking exams, they were likely seeking patronage, carrying on journalistic careers, undertaking the duties of a doctor or a vicar, teaching, running a family business or otherwise making a – usually non-scientific – living.

Those who really were excluded were not solitary geniuses who could not find sufficient time for thinking, but those who were, as a result of class, geography, race or gender, never likely to have the opportunity to begin an education, let alone contribute to the established scientific societies and journals. And this affected the science that was done: ample research shows how the norms, assumptions and interests of elites have shaped supposedly value-free science.

Science and academia today remain embarrassingly homogeneous. However, the fear is not so much that we might be failing to find or support working class, black or female geniuses, but that we are more broadly missing out on other perspectives and experiences that would help frame different questions and solutions. It is for this – as well as the good health and useful productivity of academics – that we need to fight not just for better investment in Higher Education, supporting excellent outreach and teaching as well as research, but for a fairer society.

Clock change challenge

Cross-posted from The H Word, where this post first appeared on 27 October 2013.

Alarm clock

Spring forward; fall back. Or was that spring back and fall forward (equally possible)? And will we ever have a mnemonic that works for those of us who usually talk about autumn rather than fall? At least it’s always an hour and only twice a year, right?

It’s hard to imagine, but the person who lobbied most vigorously for the introduction of British Summer Time at the beginning of the last century actually suggested that this confusion be extended over the course of four weeks. At each end of the summer the clocks would be shifted 80 minutes, rather than an hour, in 20-minute chunks. This is what William Willett suggested in 1907 in a pamphlet that worried about The Waste of Daylight.

Imagine having to remember if we were in week 0 or 1, 4 or 5. Weeks’ worth of excuses for being late for that meeting or missing the train! Today, of course, things are markedly easier, since most of us have devices that update the time they display automatically, but keeping on top of that in the early 20th century would surely have been a significant challenge.

Willett’s enthusiasm for fiddling with the clocks apparently derived from a revelation that hit him when out riding early one morning in Petts Wood. It was a beautiful summer’s day but all around him were drawn curtains and closed blinds. His fellow men and women were missing out on this joyful and healthful experience. Willett was sufficiently sure of the benefits of “early to bed and early to rise” to be convinced that it might make the whole nation “healthy, wealthy and wise”.

Thus, although people always moan about clocks changing (back to GMT) in autumn, with our evenings and afternoons plunged into increasing darkness, the idea is and was all about making the best of the available daylight in summer rather than worrying about farmers in Scotland or kids going to school on dark mornings in winter. Daylight in winter is scarce at high latitudes, and no amount of clock-fiddling will change that.

It was the additional light in summer that had the potential to improve the “health and strength of body and mind”, Willett thought. He therefore proposed:

that at 2 a.m. on each of four Sunday mornings in April, standard time shall advance 20 minutes; and on each of four Sundays in September, shall recede 20 minutes, or in other words that for eight Sundays of 24 hours each, we shall substitute four, each 20 minutes less than 24 hours, and four each 20 minutes more than 24 hours.

Easy peasy! Willett pointed out that this adjusting of clock hands – which was, perhaps, a more familiar task in days when timepieces still had to be wound regularly – would provide an “extra” 9 hours and 20 minutes for beneficial activity each summer week. The 20-minutes-a-week-for-4-weeks idea was intended to make the switch less sudden and, therefore, less objectionable. It was, he suggested, akin to the change in time experienced, without ill effect, by those travelling east or west by ship.
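Willett’s arithmetic checks out, and a quick sketch (mine, not the pamphlet’s) shows how: four Sunday shifts of 20 minutes amount to an 80-minute advance, and 80 extra evening minutes a day comes to 9 hours 20 minutes over a week.

```python
# Four Sundays in April, each advancing standard time by 20 minutes.
shifts = [20] * 4
total_shift = sum(shifts)          # total advance once all shifts applied: 80 minutes

# An extra 80 minutes of usable evening daylight per day, over a 7-day week.
extra_per_week = total_shift * 7   # 560 minutes
hours, minutes = divmod(extra_per_week, 60)

print(total_shift)                 # 80
print(f"{hours}h {minutes}m")      # 9h 20m
```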

He had a point: there was no “ship lag” akin to the jet lag that aeroplanes introduced. Research suggests that the loss of an hour in spring can be enough to cause fatigue and increase road accidents. Perhaps gradual change would be better, so long as half our minds aren’t busy wondering which 20-minute increment we’re currently on.

There are, of course, alternative ideas. There’s so-called Single/Double Summer Time, which would put us in synch with Central European Time, and please a number of campaigners and interest groups. However, when we last tried sticking with BST in winter, in 1968-71, the even longer and deeper morning darkness proved to be a deal breaker.

We could, of course, do away with daylight savings altogether. Perhaps we could extend working hours in summer and reduce them in winter? Or simply shift business hours or starting times rather than clock time. This is what Benjamin Franklin suggested, as he joked that shutters could be taxed and candles rationed to “encourage” his fellow man to make better use of light. Church bells and cannon sounding as the Sun rose might reinforce the point:

All the difficulty will be in the first two or three days; after which the reformation will be as natural and easy as the present irregularity.

Rebekah Higgitt scheduled this post last night in the fond hope that she might get her additional hour in bed this morning. If you check her @beckyfh, however, you’ll more than likely find her moaning about the fact that young children don’t take the blindest bit of notice of the clock.

Women in science: a difficult history

Cross-posted from The H Word blog, where this post first appeared on 15 October 2013.

Caricature of women attending a 19th-century meeting of the British Association for the Advancement of Science
Caricature of women attending a meeting of the British Association for the Advancement of Science.

 

Today, as you’ll probably see from many tweets and blog posts, is Ada Lovelace Day. As the Finding Ada website explains, this

aims to raise the profile of women in science, technology, engineering and maths by encouraging people around the world to talk about the women whose work they admire.

Many of the talks and posts that mark the day will be about mentors and leaders in science today. Many will highlight the sometimes overlooked work of women in the history of STEM.

As I discussed on this blog last year, I find myself somewhat conflicted about Ada Lovelace Day and similar projects that focus on highlighting women in the history of science. On the plus side, I am wholeheartedly supportive of the attempt to encourage young women to think about scientific careers and to appreciate the work of women in the past, when opportunities were even more circumscribed. I am also glad to see stories from the history of science getting wider attention.

However, I am also wary. In celebratory mode, there is a tendency to overplay the work that the women highlighted actually did. There is no better example of this than Lovelace herself, who is wrongly credited with writing the first computer program. Likewise, just as with the heroic “great man” mode of history, focusing on individuals can hide the extent to which science is always a collaborative enterprise. Finally, although some women are rescued from the background shadows, other individuals and groups, equally deserving of attention, remain ignored.

As a historian, I am always likely to be suspicious of the use of history to serve particular purposes, whether that is to get more women into or more funding for British science. Laudable though those aims might be, there is a risk that the historical evidence will be selected or distorted to suit the current purpose.

Certainly there were (are) remarkable women in all spheres of life, but the more important story is the one that explores why there were so few and how and why their talents – and those of whom we’ll never hear – were wasted. While it’s good to encourage girls and women to have the confidence to succeed in science or elsewhere, we also need them to look at the societies that have made, and still make, this a difficult task.

It always strikes me that should women of the past read some of what is written about them today, they would be hugely surprised and perhaps even offended. Before the 20th century, and often after, women who did scientific work tended to present themselves as a support to science or men rather than as pioneers. Although this is a reflection of the patriarchal society in which they lived, and they may sometimes have said things they did not privately believe in order to appear acceptable, it was their chosen self-presentation.

Recently, for example, a post was published that claimed that William Whewell had coined the word “scientist” to describe Mary Somerville. The response suggested that this is something that people really wanted to believe. However, while it is true that the first published appearance of the word was in a review of Somerville’s book On the Connexion of the Physical Sciences (1834), neither Whewell nor Somerville would have dreamed of its being applied to her. Women, Somerville suggested, did not have original ideas, but the female mind might, as Whewell wrote, provide a “peculiar illumination” in explaining the ideas of others.

Somerville undertook aspects of science that were “women’s work”: writing, translation, popularisation. She also frequently highlighted her role as a wife and mother. Others, who approved of and supported her, did likewise. The American astronomer Maria Mitchell met Somerville and wrote:

I could not but admire Mrs Somerville as a woman. The ascent of the steep and rugged path of science had not unfitted her for the drawing-room circle; the hours of devotion to close study have not been incompatible with the duties of wife and mother; the mind that has turned to rigid demonstration has not thereby lost its faith in those truths which figures will not prove.

Somerville and others underlined the distinction between men’s and women’s minds and their appropriate spheres of activity because of the society in which they lived. Somerville was a supporter of women’s education and it was important to show that learning mathematics and the sciences would not turn young women into unattractive, barren spinsters. Those campaigning for women’s suffrage had a similar choice: emphasise your acceptable femininity or reinforce the stereotypes of Punch caricatures.

We like to think that we have moved on. In many ways we have: women in the UK can vote, be educated and enter careers and remain in them even after getting married or having children. Yet they are still radically underrepresented in the most highly paid and esteemed positions, and overrepresented at the other end of the scale. We are, as Alice Bell explained yesterday, still in a society that asks for female intelligence to be mitigated with the use of lipstick and a focus on domestic details.

By all means celebrate individuals, but understand them as they would have understood themselves. Make sure to think of the society in which they operated and look hard at what has and what has not changed.

Astronomers Royal, scientific advice and engineering

Cross-posted from The H Word blog, where this post first appeared on 12 September 2013.

The collapsed Tay Bridge

This evening, the Astronomer Royal, Lord Rees, will weigh into the debate about climate change and geoengineering in an address at the British Science Festival.

Finding such fixes, as well as more efficient forms of alternative energy, may well be problems focused on by the new challenge prize that Rees has helped set up. That he, as Astronomer Royal, will be judging what has been called a new ‘Longitude Prize’, seems appropriate, but the innovations under consideration may be a long way from his own field of astronomy and cosmology.

Today the post of Astronomer Royal is honorary. It means simply, as Alok Jha’s article on Rees’s speech suggests, that he is “one of Britain’s most senior scientists”. Like a Chief Scientific Advisor, or the head of a scientific society, the Astronomer Royal can be expected to give all sorts of opinions about science and science policy, straying at least occasionally, if they wish, well beyond their area of research.

Was it always like this? Yes and no. Until the 1970s the post of Astronomer Royal was synonymous with director of the Greenwich Observatory (at Greenwich, Herstmonceux and then Cambridge). Before the 19th century, the AR was also an active observer, in fact one of only two observers in the institution.

Nevertheless, Astronomers Royal were often called upon to make judgements and offer advice in areas that did not relate to making observations or managing an observatory. Because the Royal Observatory was funded by government, being under the administration of first the Board of Ordnance and then the Admiralty, there was potential for them to be asked to consider a whole range of technical and scientific issues.

For much of the AR’s history, the most obvious place in which this happened was the Board of Longitude. While many of the ideas under consideration were astronomical (involving knowledge of astronomical theory, mathematics, optics and instrumentation), others were based on geomagnetism or, of course, horology. Understanding clocks and timekeeping was essential to astronomy, but the specifics of horological theory and manufacture would have been beyond the AR’s experience.

ARs also advised on areas like cartography, instrument design, and weights and measures that involved techniques closely allied to astronomy. But they were also asked to consider a wide range of fields of interest to the Admiralty and other branches of government, simply because they were the most readily available scientific expert.

One of the ARs who most obviously became the government’s go-to scientific and technical guy was George Airy, who was in post from 1835 to 1881. Airy covered a great deal of ground, intellectually and practically. Unlike all his predecessors he was not much involved with daily observations, and he had a significantly larger workforce at the Observatory, to whom observation, calculation and even management could be delegated.

Airy, for example, did a considerable amount of work on the effect of iron ships’ hulls on compass use and design. He also advised, like many other ARs, on education and he was involved in the organisation of the Great Exhibition. He was, perhaps most intriguingly, called in to advise the Great Western Railway on track gauges and the engineer Thomas Bouch about the pressures that might be exerted by wind on the planned rail bridge crossing the Forth.

That latter advice got him into trouble. It was first applied by Bouch to the Tay Bridge and, when that collapsed in 1879 [see image above], Airy was called in by the enquiry. He claimed that his advice had been specific to the circumstances of the Forth and the design for that bridge (which was now speedily discarded). The enquiry agreed, suggesting that Bouch “must have misunderstood the nature of [Airy's] report”.

Airy did know quite a lot about engineering. He was, apart from anything else, closely involved with the design of large instruments and their mounts at Greenwich. Times and the nature and range of expertise have changed considerably since the 19th century, however. Lord Rees is not an Astronomer Royal who can offer specific or technical engineering expertise, rather he is calling for research and funding. Whether or not you agree with his statements is a different matter.

Barbados or bust: longitude on trial

Cross-posted from The H Word blog, where this post first appeared on 9 September 2013.

Barbados

Barbados beach scene (perhaps not quite what Nevil Maskelyne experienced in 1763)

On 9 September 1763 a young curate and Fellow of Trinity College, Cambridge, set off for Portsmouth. He was to travel to Barbados on a voyage that would test the accuracy and practicality of three different methods of finding longitude at sea. At stake were potential rewards from the Board of Longitude.

The curate, Nevil Maskelyne, was also an astronomer and mathematician who became Astronomer Royal in 1765. I am currently editing a book of essays centred around Maskelyne, which, like the book I am co-authoring on the history of longitude, is due out next year for the tercentenary of the first Longitude Act. Working toward that anniversary, I spotted this one.

Back in 1763, Maskelyne was instructed to do two things. Firstly, he was to make longitude-determining astronomical observations during the voyage and, secondly, to make observations on land when the ship arrived in order to determine the island’s position, a prerequisite for an effective trial.

The three “methods” under trial in 1763 would be deemed successful if they predicted Barbados’s longitude to within a degree or half a degree. They were:

1) A marine chair made by Christopher Irwin that was intended to steady an observer to allow him to measure the positions of Jupiter’s satellites at sea. (Eclipses of Jupiter’s moons were already used as a celestial timekeeper* to determine longitude on land: these were the observations Maskelyne made at Barbados.)

2) The latest version of the lunar tables of Tobias Mayer, which helped predict the position of the Moon and allow it to be used as a timekeeper using the lunar-distance method.

3) The latest mechanical marine timekeeper, and first sea watch, made by John Harrison.

Maskelyne and his assistant, Charles Green, were to make the ship-board observations and calculations necessary for the use of the first two methods. Harrison’s watch, now known as H4, would travel out separately with Harrison’s son William.

All of the methods worked in theory; the sea trial was to establish whether they worked reliably in practice. Only Irwin’s chair was a failure. Remarkably, two plausible methods of finding longitude had, finally, come to fruition at almost exactly the same time:

1757: Mayer sent his theory of the Moon’s motion to the Board of Longitude. It proved capable of making pretty good predictions – an object that had defeated Isaac Newton’s best efforts. Harrison, who had received rewards amounting to £2750 between 1737 and 1757, abandoned the development of his large marine clocks (H1, H2, H3) and threw his efforts into his watch.

1761: The potential of Mayer’s tables and the lunar-distance method was demonstrated by Maskelyne and his assistant, Robert Waddington, during a voyage to St Helena, where they had been sent by the Royal Society to observe the transit of Venus. Harrison sent his watch on trial to Jamaica and claimed an excellent result. Unfortunately, the trial was declared void because of uncertainties about the longitude of Jamaica and the watch’s rate, and Harrison had to make do with another £1500.

1763: The Barbados trial was the really significant one – Mayer’s tables (via the lunar-distance method) and Harrison’s watch were both officially found to have met the necessary criteria. The Board of Longitude had two methods on their hands… potentially.

The lunar-distance method was complex and time-consuming and could only become useful if enough navigators were trained to undertake the required observations and calculations. Ideally, part of the work needed to be done for them, via the publication of regularly updated predictive and pre-calculated tables.

Harrison’s watch had worked well, but the question was whether another such machine could ever be made. Could one be made by another workshop? Could a marine timekeeper be made that was less costly than the exquisite H4?

In 1765, an Act was passed that divvied up the spoils and aimed to help make these potential methods “practicable and useful”. Harrison would receive £10,000 only if he revealed his method (i.e. the mechanism and the methods and materials involved in the construction of his watch) to other artisans. A further £10,000 would be paid out if more timekeepers could be made and successfully tried.

Tobias Mayer had died in 1762, but £3000 was paid to his widow in return for his papers. £300 went to the mathematician Leonhard Euler as a reward for his equations, which had greatly enhanced the accuracy of Mayer’s tables. A further £5000 was held out as a reward for the future improvement of the tables and, perhaps most significantly, the Board committed to the regular publishing of a Nautical Almanac, to be overseen by the brand new Astronomer Royal, Nevil Maskelyne.

The Barbados trial was not a competition or a race for a prize, although Christopher Irwin certainly found his marine chair out of the running. Rather, it confirmed two promising methods that required further investment. The Board of Longitude committed to this, seeing that they were not mutually exclusive. The lunar-distance method could be made available more quickly and was the only means of checking the performance of a ship-board timekeeper.

While Harrison’s paranoid belief that Maskelyne was prejudiced against him and his watch has become the dominant version of this story, it is not backed by the evidence. As Astronomer Royal and Commissioner of Longitude from 1765 to 1811, Maskelyne was to aid the development of both of the methods that his 1763 voyage had helped to prove.

* The difference in longitude of two places is equal to the difference in their local times.
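That relationship is what makes any reliable timekeeper a longitude-finder: the Earth turns 360° in 24 hours, so each hour of time difference corresponds to 15° of longitude. A minimal sketch of the conversion (the function name and the sample times are mine, for illustration only, not from the post):

```python
# The Earth rotates 360 degrees in 24 hours: 15 degrees of longitude per hour.
# Comparing local time (found from the Sun or stars) with reference time
# (e.g. Greenwich time, kept by a chronometer or read off the Moon's position)
# gives the longitude difference directly.
def longitude_from_times(local_hours: float, reference_hours: float) -> float:
    """Degrees of longitude east (+) or west (-) of the reference meridian."""
    return (local_hours - reference_hours) * 15.0

# Local noon while the reference clock reads ~15.967 hours (15:58):
# roughly 59.5 degrees west, in the vicinity of Barbados.
print(longitude_from_times(12.0, 15.967))
```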