
Two Pence

Two Pence features opinion pieces on issues that affect Britain, British history, and Britain’s interactions with the wider world.

The Cambridges in Australia: Balmorality 2.0?

Dr. Helene von Bismarck, Associate Editor

Students of the British Empire and students of the British monarchy face a similar dilemma: the lack of a clear set of rules that would make either system intelligible. The Empire, with its wide variety of complicated constitutional arrangements and its formal, semi-formal and informal parts, may be largely gone today, but its successor organization, the Commonwealth of Nations, can only be understood if it is put into the context of the gradualism and pragmatism that characterized British imperialism for centuries and that left its legacy across substantial parts of the globe. A historical approach is also essential for those seeking to grasp the political and cultural relevance of the British monarchy, an institution whose role has continuously evolved in a country with no written constitution. The recently concluded visit to New Zealand and Australia by the Duke and Duchess of Cambridge and their little son, Prince George, has been a good opportunity to look back on the historic relationship between monarchy and Empire-Commonwealth, and to discuss the consequences of the past for the present. In many ways, this royal tour can be interpreted as an attempt to breathe new life into two interrelated concepts that have defined the role of the British monarchy and its connection with the Empire, and later the Commonwealth, for at least a century: family and visibility.

When the Victorian writer Walter Bagehot famously remarked in his 1867 study of The English Constitution that ‘a family on the throne is an interesting idea’, he probably was not thinking of a baby prince on a play date in New Zealand.[1] However, little Prince George’s recent encounter in Wellington with a group of fellow toddlers selected carefully – and with due political correctness – from among his future subjects, and the Duke and Duchess of Cambridge’s informal meeting with their parents, could easily have been used by Bagehot to illustrate his argument that the advantage of a family monarchy lay in turning the otherwise abstract concept of sovereignty into something ordinary people could understand and relate to.[2] The pictures of the happy young family on their trip Down Under fit neatly into a tradition which began with Victoria and Albert and their numerous children, was continued and reinforced by George V and Queen Mary, George VI and his wife, Queen Elizabeth, and, of course, Queen Elizabeth II, but was drastically interrupted during the 1990s, when one widely publicized family drama after another struck the House of Windsor. The whole idea of family monarchy has rested on the condition that the royal family would live and behave in a certain manner, conveying a set of values – ‘Balmorality’, as Sir David Cannadine calls it – that excludes scandal, adultery and divorce.[3] At least until now, the Cambridges have met that standard, and in an age when any photograph of them travels the globe in an instant, everyone can see it.

The fact that the Duke and Duchess have chosen this trip to Australia and New Zealand, rather than public engagements in Britain, to finally give the world more than a glimpse of Prince George, and to introduce themselves as a small family jointly working for the monarchy, is also meaningful, and not only because of the continuity demonstrated by a visit of two future kings to their subjects at the other end of the globe. Ever since the later years of Queen Victoria’s reign, which coincided with the high age of imperialism, the British monarch has been portrayed and staged as the head of two different families, one real, one metaphorical: the royal family, and the ‘great imperial family’.[4] This was part of a process in the course of which the Empire gave the British monarchy a new role after it had been deprived of most of its political power at home: to act as a symbol of the connection between Britain and its colonial dependencies. Splendid ceremonial events, such as the Delhi Durbars of 1877, 1903 and 1911, when Victoria, Edward VII and George V were proclaimed Empress and Emperors of India, were used to reinforce that image.[5] The strategy of being seen and, following the invention of the wireless, being heard, both in Britain and across the Empire, was taken to a new level by King George V and his wife, Queen Mary, after the First World War. Shaken by the series of revolutions that had swept away the monarchy in so many countries on the European continent and cost the King’s cousin, Tsar Nicholas II, his life, George V and his consort added new elements to their roles by making themselves visible to the people, putting a greater emphasis on royal charity, cooperating with the press, broadcasting royal speeches and dispatching their grown-up children to tour the Empire. Their legacy for the British monarchy could be felt throughout the twentieth century and can still be felt today.[6]

The Empire no longer exists, but the metaphor of the family ties binding its former components together has survived and continues to resonate to the present day.[7] Interestingly, the use of this rhetoric is not one-sidedly British or restricted to the royal family. In his speech on ANZAC Day 2014, delivered during a war memorial service attended by the Duke and Duchess of Cambridge, Australia’s Prime Minister Tony Abbott underlined the ‘family’ ties that in his opinion continue to bind his country to Great Britain, even if the latter could no longer be called Australia’s ‘mother country’.[8] Recent opinion polls show that support for the monarchy in Australia is much stronger now than it was during the 1990s, and especially pronounced among the young generation. This contradicts earlier expectations that the role of the monarchy in the Commonwealth realms would automatically fade away in the post-imperial age.[9] The question remains, of course, whether the revived popularity of the monarchy in Australia can be regarded as a sign of a continued link with Britain, or whether it results from the fact that, in the age of digital media, the Duke and Duchess, and now also their little son, have turned into global icons likely to attract great interest anywhere, not just in the Commonwealth. In any case, the masterfully staged royal tour to Australia and New Zealand has shown that the next generation of the British royal family is ready to embrace the strategy of public Balmorality. The most important prerequisite for its use is in place: people are watching.



[1] Walter Bagehot, The English Constitution (London 1963), p. 117.

[2] On Bagehot’s line of argument see Philip Murphy’s enlightening new book, Monarchy and the End of Empire. The House of Windsor, the British Government, and the Postwar Commonwealth (Oxford 2013), pp. 1-2.

[3] David Cannadine, History in Our Time (London 1997), pp. 3-4.

[4] Cannadine, Ornamentalism (London 2001), p. 119; History in Our Time, p. 4.

[5] Cannadine, Ornamentalism, pp. 101-107.

[6] Miranda Carter, George, Nicholas and Wilhelm. Three Royal Cousins and the Road to World War I (New York 2009), p. 422.

[7] Murphy, Monarchy and the End of Empire, p. 3.

[9] On the prominence of this perception during the age of decolonization see Murphy, Monarchy and the End of Empire, p. 8. 

A Manifesto on Academic Integrity

Bryan S. Glass, General Editor

When we think of academic integrity, the first issue that comes to mind is cheating. But does cheating alone define academic integrity? What other types of behavior might fall under its heading? The time is right to define academic integrity.

Cheating is very straightforward. From the time we enter primary school we are taught that cheating is unacceptable. Cheating entails, among other things, looking over a classmate’s shoulder during a test, committing plagiarism, or paying someone else to write a paper or sit an exam for you. For all of us, cheating deserves the harshest penalties. Full stop.

Cheating, however, is just the most obvious, and most easily punishable, stain on academic integrity. Another issue of academic integrity, and the focus of this manifesto, revolves around the academic publishing industry. It is a time-honoured practice that when you submit a book proposal it must not be under consideration by any other publisher: you submit to one publishing house at a time. To prevent dual submissions, our Book Proposal Form, which all potential authors must fill out, states the following: ‘if … the proposal is under consideration by another publisher … please do let us know.’ Upon learning of a dual submission, we will refuse to review the proposal. The reasons are straightforward: loss of time and money. Palgrave pays reviewers, either in money or in books, to review proposals once the editors have read through the offered monograph and approved it as a fit. The loss of time and money caused by authors shopping around to see who offers them better contract terms should not drive academic publishing. After all, an author is always free to turn down a contract offer from a publisher and then submit elsewhere, but book submissions should never occur concurrently. Authors should always submit to their first-choice publisher and wait to see what happens. To submit concurrently angers the publisher and places the author in an ignominious position. It is safe to say that such an author will never be published by the spurned publisher in the future. The academic book publishing industry has a long memory.

But what should we think of the rule, imposed by journals, that you may only submit your article to one venue at a time? This is a bit trickier than with book proposals. If you need to get an article published within six months to save your career (especially in Britain) in the publish-or-perish atmosphere of academia, and it takes nearly that long to get a response from your targeted journal, you could be in jeopardy of losing your job. What if the journal wants you to revise and resubmit? What happens if they simply turn you down? Game over. With our journal, Britain and the World, I am the first to admit that we haven’t always completed the peer-review process rapidly. In fact, given our own track record, I do not believe that submitting an article to numerous journals simultaneously is an integrity problem. It is a measure of desperation in the interests of self-preservation. However, it is neither healthy nor productive for journal editors to spend a great deal of time determining whether an article should be sent for peer review, and then having it reviewed, only to be told by the author that he or she has secured publication with another journal. This makes editors angry and causes ill will towards the author.

So I propose an alternative set of norms for journal submissions. First, journal articles must be reviewed within two months of receipt. Under these strict time constraints, authors must adhere to the time-tested rule of submitting to one journal at a time. Once the two months have passed without an answer, the author should feel free to send the article to another journal for publication consideration. Under this strict two-month policy, an author will be able to submit an article to six different journals over the course of a year. This model will provide authors with the peace of mind that an answer on their research is coming by a certain date, and it should end the problem of concurrent submissions. Concurrent submissions make authors look bad, as editors question their academic integrity. And rightfully so. But journal editors must understand that part of the problem lies with them, and only they can correct the turnaround times on submissions. As a model, Britain and the World is implementing this policy from January 2012: all articles submitted to us will be peer-reviewed and returned to the author within 60 days. This system eliminates the perceived need to submit concurrently and upholds the academic integrity of pressured authors. It is the right solution for our time.

So what, you may be asking, is the difference between book publishing and journal publishing? Why are concurrent submissions of book proposals always condemned as immoral? The answer lies in timescales: books take much longer to publish than journal articles. You don’t wake up one morning and say, ‘I need to get my book from proposal to publication in six months.’ Many academic monographs take between five and ten years to complete, and there is a great deal of time built into the tenure and promotion (or self-preservation) system for the publication of a book if you keep to schedule. This is not always the case with journal articles. While dual submissions are never acceptable, they are more understandable when they happen with journal articles as opposed to monographs.

Academic integrity plays a major role in the lives of all academics.  No one wants to be tarred with the duplicity brush.  But it is necessary to establish ground rules to prevent academic dishonesty from taking hold in these trying times.

Legacy of Lucian Freud

Rebecka Black
University of Houston

According to the Huffington Post on 14 October 2011, a small portrait of a young boy painted by Lucian Freud, who passed away in July 2011, has sold for £3.2 million to an anonymous bidder. Boy’s Head, painted in 1952, depicts Freud’s neighbor Charlie Lumley, who became one of the British figurative painter’s most recognizable subjects. Lumley, who is now 79, still recalls his visits to sit for Freud.

Lucian Freud is most remembered for his portraits and for his close friendship with fellow 20th-century British painter Francis Bacon. In 2008, Freud’s painting of an overweight nude woman set a record for the highest price paid for a living artist’s work. Although the portrait of Charlie Lumley is small, its selling price is representative of Freud’s stature within art history.

Freud, a grandson of Dr. Sigmund Freud, is considered something of an anomaly in 20th-century painting. His style remained figurative – depicting recognizable forms – throughout the century, despite the demands of Modernism for complete abstraction, à la Jackson Pollock’s drip paintings or the color-field explorations of a Mark Rothko piece, with its large blocks of color absent of any distinguishable forms or figures. Also separating Freud from the avant-garde of mid-century British art, specifically, was his figurative style’s distance from Pop Art, which is accepted as beginning in England with artists such as David Hockney but was made famous by American artists such as Andy Warhol and Roy Lichtenstein, among others. The slick, mechanical style of Pop Art, which often incorporated collage, offered a cultural critique of post-war consumerism and stood in direct opposition to the painterly style of portraiture, which emphasizes the evidence of the artist’s hand through visible brushstrokes. Where Pop Art attempted to remove the humanity, figurative works and portraits embraced the human form.

It was this figurative style of artists such as Lucian Freud that maintained a connection to the celebrated early 20th-century portraiture traditions in England. Freud’s style of portraiture acknowledges this heritage by showing the subject not as a sitter but as an individual; Charlie Lumley is not presented in idealized terms, but rather as he is: a young boy, perhaps bored, perhaps daydreaming, but still a figure worthy of study. It is this value accorded to the figure that allowed Freud’s figurative style to weather the onslaught of mid-century abstraction and Pop Art, which aimed to de-humanize art, emphasize formal aesthetic qualities, or critique consumerism. For this reason, Freud’s works remain invaluable within British history and art history.

The Contemporary History of House of Lords Reform

Martin Farr
Newcastle University

At the centre of the conundrum that is the British constitution sits the House of Lords, reform of which is once again being discussed by Parliament, though conspicuously not by the public. The enduring mystification inherent in the subject is clear from first principles: the upper chamber is in practice the lower chamber. Since the 1911 Parliament Act, the House of Commons has been superior to the House of Lords. 1911 was, however, also the last major reform; that it was also the first major reform is not a coincidence. The 1911 Act mattered greatly, as hitherto an unelected chamber could thwart the will of the one that had been elected. With that clear democratic outrage removed, there was no consensus over subsequent reform: though the Act conveyed the intention “to substitute for the House of Lords as it at present exists a Second Chamber constituted on a popular instead of a hereditary basis”, with some perspicacity it went on that “such substitution cannot be immediately brought into operation”. Thus, one hundred years on, with no public interest, therefore no partisan political benefit, and so in turn no momentum, the situation remains.

There have been many attempts at reform. Of those that succeeded, the most important were the 1958 Life Peerages Act, which provided for the creation of life peers (and so for the admittance of women for the first time), and the 1999 House of Lords Act, which reduced the number of hereditary peers from 747 to 92. The 1999 Act was typical: the product of compromise in the absence of consensus, avowedly only part of a process, but still greatly controversial to those actually exercised over the issue. It began a process which continued with a Royal Commission and the Wakeham Report of 2000, which led to a public consultation that produced no consensus and hundreds of different opinions, before a Joint Committee of both houses offered parliamentarians seven options, ranging from a wholly appointed to a wholly elected house. None of the options gained a majority. A Department for Constitutional Affairs was established in 2003, demonstrating that, whatever else they were, the Blair governments were certainly the most radical in relation to constitutional change since those of Asquith before the First World War. The new Department managed reform of the judiciary, but Lords reform again foundered on the absence of any agreement, or, indeed, any will.

The latest initiative comes amidst an unprecedented period of constitutional upheaval, as Nick Clegg, Deputy Prime Minister, one hundred years after Asquith – one of his predecessors as Liberal leader – sought to use the unexpected and probably fleeting opportunity of a presence in government permanently to modernise the constitution. The first reform, and the most pluralistic – electoral reform – has already been lost, in a referendum in May 2011. The embarrassment of that defeat at least gave face-saving momentum for Clegg to pursue Lords reform, to demonstrate the progressive constitutional vitality of at least the Liberal Democrat part of the Coalition. Another of Clegg’s predecessors, David Steel (a life peer since 1997), has suggested a bill to break the deadlock, which would replace 1958’s unaccountable system of patronage with a Statutory Appointments Commission, create a system of retirement (at present, peers are peers unto death), provide the means to remove peers guilty of serious transgressions, and, by ending the by-elections held when hereditary peers die and converting the existing hereditaries into life peers, finally remove heredity from Britain’s legislature (though not, of course, from the constitution).

Even if Steel’s suggested bill were eventually passed, the essential issues of the second chamber would remain unresolved: whether it should be elected or appointed, or a mixture of the two, and, if a mixture, in what proportions. Some maintain that no legislator should be anything but elected; others hold that an elected Lords might effectively reverse, or at least rebalance, the 1911 settlement and claim democratic legitimacy over the Commons, or at the very least confuse matters. Then there is the nature of the members of the Lords themselves: if elected, the problems of party politics may be reproduced in a chamber currently characterised by its relative distinctiveness from (increasingly unpopular) machine politicians, whilst also in all likelihood doing away with the accumulated years of experience in all areas of national life that the present 789 peers can call on when scrutinising legislation – peers who would be unlikely to want to stand the rigours of campaigning if they were required to be elected. If members were appointed, who should appoint? There appears at least to be a consensus that a hybrid system – of election and appointment – should be introduced, but no agreement whatever as to the proportions has been, or is likely ever to be, reached. More fundamentally, one could ask whether there needs to be a second chamber at all. Such debates have taken place, and continue, in the face of widespread public indifference.

Public indifference, however, is not a reason to do nothing about an issue, though it is the most specious of the claims made by those resisting any change, just as it was for those opposing electoral reform. Nor should one refrain from stating that elites are not inherently undesirable if those elites are open, or that voting does not necessarily equal democracy: a democratic second chamber therefore need not be elected. Those who argue that every legislator has to have been elected, or else the system is not democratic, may be called democratic dogmatists, implying as they must that the Commons is a model of the form, and overlooking that the Lords have consistently displayed a freedom and independence of mind based on wider life experience. Lords reform is no longer a matter of left and right, as the unholy alliance of Michael Foot and Enoch Powell demonstrated when it derailed another doomed effort, in 1968. Indeed, the most recent impetus for ‘democratising’ the Lords has come from the right of the Conservative Party, after the Lords defeated two ‘democratising’ bills of the present government, for elected police chiefs and for a ‘referendum lock’ before ratifying future European treaties – measures for which the term “democratic dogmatism” could have been invented.

A minority, easily derided, support a wholly appointed chamber, to preserve the best of the old and, with the creation of a transparent, inclusive, and rigorous appointments commission and a corresponding retirement process, to introduce the best of the new. The Steel ‘bill’ is as close to such a reform as has been mooted, though it fulfils the wishes neither of Asquith nor of Clegg. Unlike the monarchy – another component of the conundrum, which has in common with the Lords the probability that it would not exist in any constitution that had actually been created rather than merely evolved – the Lords holds no public attention one way or the other. Unlike the winning campaign in the referendum on electoral reform, however, Lords reform will not be subject to ‘democracy’ in its purest form: a direct poll of that minority of the electorate that could be bothered to vote after a campaign disfigured by far from impartial and heavily funded publicity and press coverage. So, in the absence of ‘democracy’, and even in the event of the failure of the Steel bill, the upper chamber seems likely, in its anachronistic way, to continue to exercise scrutiny and restraint over the lower chamber, reform of which is long overdue.

The “Essential Relationship”?

Bryan S. Glass
University of Texas at Austin

At first glance, Barack Obama and David Cameron appear to be strange bedfellows. Given the short and rather distant relationship between the Obama Administration and Downing Street, many people were caught off guard when the two leaders of the most powerful Anglophone countries in the world co-wrote an op-ed piece for The Times. Perhaps even more unusual was its title: “Not just special, but an essential relationship.” Yet given the economic, cultural, linguistic, philosophical, political, religious, and historical ties between the two countries, this op-ed should not have surprised historians of Britain, America, and the world.

We should always expect Britain and the United States to be close. Of course, there have been disagreements in the past between the two countries, some much more severe than others, as in the case of the War of 1812. Overall, though, these disagreements have been overshadowed by a sense of cooperation and fraternity underscored by our common heritage. The British Scholar Society is an organization dedicated to understanding Britain’s interactions with the entire world from the seventeenth century to the present, and few relationships are easier to understand than that between Britain and the United States. Perhaps most importantly for the interested layman, we speak the same language, which drives the high level of understanding between our two countries. This linguistic bond, it is true, should never be underestimated. But history also gives us insight as to why the two countries, barring some catastrophe, will always remain close. The common values shared by Britain and the United States include the rule of law, representative government, dedication to human rights, and a free-market economy. These commonalities are pointed out by President Obama and Prime Minister Cameron when they state that “we look at the world in a similar way, share the same concerns and see the same strategic possibilities.” This similar worldview has only been made possible by the test of time and its concomitant trials and errors. You need not look far to find disagreements between the two countries: most recently, there have been differences over the most effective way to handle the economic crisis, and no analysis of major disagreements would be complete without discussing the twentieth-century Anglo-American conflicts over the Suez Crisis, Palestine, civil rights, and imperialism. The truth, however, is that all of the disagreements between the two countries over the past 235 years cannot mask the innate connections that make the relationship unbreakable.

History bears testament to the bonds of kith, kin, and ideology that have always made the relationship between the US and Britain more than just special. Language strengthens the connection. The joint article by President Obama and Prime Minister Cameron should not be viewed as groundbreaking: the relationship between Britain and the United States has always been essential. This is just the first time it has been articulated as such.

At The British Scholar Society we applaud the two leaders for recognizing the importance of committing “to strong collaboration between our universities and research facilities.” As we are dedicated to increasing understanding of the interactions between these two countries, we believe that such collaboration will only bring benefits. With this conviction in mind, we anticipate a stimulating and successful annual conference at the University of Edinburgh in 2012. Let us hope that from now on the relationship will always be viewed as essential. History already attests to this reality.
