Employability may be an ugly word, but it is increasingly an important part of teaching and learning at my higher education institution. In a world of tuition fees, student satisfaction scores and information gathering about leavers’ destinations, I imagine that its importance will also continue to grow. Whilst I am not a fan of any of the aforementioned trends, employability is something that I have been thinking about. If students are spending their time and resources on degree study because they think it will make them more employable, then they should be reflecting on what precisely it is they have learned that makes them distinct from people who didn’t attend university or who took a different course. They need self-awareness about their own development and the ability to articulate this to a potential employer in a meaningful way.
If you want to know more about employability then the HEA has a framework for ‘embedding’ it in your institution, but if you haven’t got time to wade through it, I can tell you that the sorts of provision that universities offer include: help with CVs; mock interviews; confidence-building activities; work experience and placements; shadowing; and help researching the job market.
Which leads me to wonder: what is the role of the academic tutor here? Is it our responsibility to talk about skills and ‘employability’ in our history seminars and lectures? Or is that something better left to the professionals in Careers Services? If tutors do have a role, what is it?
Update: links to other posts in the #histsources series:
Diaries: sources that gently bruise the consciousness
Doing history by numbers: bibliometrics and counting things
Court depositions: the stories we tell about ourselves
Churchwarden accounts: teeming with all sorts of life
Primary sources are where histories come from. The stuff left over from the past that by accident or design has survived down to this day is the lifeblood of historical study. Sources are our direct (if not always reliable) witnesses to the events, people and processes of moments now long gone. The creative and self-aware use of the complexities of evidence often produces the best histories as historians read against the grain, contextualise, and dissect the stuff of the past to extract new meanings from it.
In 2000, Ludmilla Jordanova wrote that ‘there has been a decline in (primary) source-based undergraduate teaching’, but in 2016 it certainly feels like the opposite is true. Partly thanks to the internet (although printed transcriptions remain a vital resource), primary sources have never been more available or accessible for university lecturers and their students. Given that history is to some extent defined by its methodology, it doesn’t make sense not to use primary materials with undergraduates – how else to teach the dynamic relationship between the sources, the historian and their history? How else to understand the vantage points that we can and can’t find on what happened in the past?
The many-headed monster’s mini-series ‘On Periodisation’ really struck a chord with our readers, prompting an outpouring of comments both below the line and on Twitter. I have captured many of these in this Storify – thanks so much to everyone who took the time to offer their thoughts, and my apologies to anyone whose comments I missed, but it was hard to keep up!
The digested version is that comments tended to fall into three categories: those who were prompted to reflect on periodisation in relation to their own research; those who offered a transnational perspective; and those who added an interdisciplinary slant to the discussion. Whilst debates on this topic are a constant of historical research, social media has the benefit of creating a more diverse conversation which encourages broader perspectives and raises new complications. If the debate continues I intend to add to the story in due course, so please do join the conversation.
My original intention was to try to summarise these contributions in another post, but when it came to it I struggled because the responses were both (a) too various, and (b) too contingent. Thus this post instead focuses on what the responses did share: a series of questions that people ask about periodisation.
This is the first post in our new Monster Mini-Series on periodisation. Click here for the Series introduction.
There are many different ways to divide the past up into analytical chunks, but some ways are more popular than others. In this post I offer a brief overview of some of the most common periodisations. It is of course a broad brush summary with a tendency to generalisation. Please do flesh it out with your own comments and refinements below the line – we have had a great response to the series on Twitter and I will be collating many of these contributions for a later blog.
Binaries: modern / pre-modern
Starting with the simplest division: if you are short of time, the strongest chronological distinction often appears to be between the modern age, and all the stuff that happened before it. You know, when everyone was blindly superstitious, 99% of the population spent their lives covered in sheep poo, there was no electricity, no penicillin and no roads, and a bunch of kings ordered everyone about whilst riding over-mighty dragons. Or something. Sometimes undergraduate ‘survey’ modules are organised along these chronological lines: at Exeter the ‘pre-modern’ module covers c. 500-1750; the ‘modern’ module covers 1750-present.
Pre-modernity: knights, witches, muddy peasants and that sort of thing.
Of course this rather oversimplifies things. Europe in 700 didn’t look or feel anything like Europe in 1700. The divide also massively prioritises the last 200 or so years of history and diminishes the previous 1,300. It is a useful shorthand for showing potential undergraduates the breadth of your teaching programme or identifying yourself at a multi-disciplinary event, but not much more.
The holy trinity: medieval / early modern / modern
In Europe and North America, history is often chopped into three: the medieval (c. 500-1500), the early modern (er… let’s say c. 1500-1800) and the modern (c. 1800 – present).
This is the introductory post to our new occasional Monster Mini-Series on periodisation in history. In a series of related blogs we will be exploring historical chronologies, examining the ways in which we chop up the past into more digestible chunks. We are interested not only in how we do this, but also in why, and in what the consequences are both for how we conceive of the past and for how this in turn affects the organisation and practice of the discipline.
The University of Exeter’s community day: What does early modern mean, and who is this strange horned and hoofed man?
I quite often find myself needing to explain what period of history I work on, and it isn’t always straightforward. For example, my students are often not familiar with the term ‘early modern’ and remain convinced that anything that isn’t modern is ‘medieval’. At the University of Exeter’s recent Community Day, one of the most popular questions from visitors to the ‘Centre for Early Modern Studies’ stand was: what dates / centuries / years does that refer to? (In case you are interested, the other top question was: ‘what’s the devil’ from children colouring in woodcuts of said infernal being).
This is an issue that has a bearing on the many-headed monster too. We self-identify as an early modern blog, and the Monster’s tagline is that we offer “the history of ‘the unruly sort of clowns’ and other early modern peculiarities”. Whilst fellow academics might have a grasp of what that means, the blog aspires to reach an audience beyond professional historians, to whom the term may appear rather opaque.
The answer to what ‘early modern’ means is, of course, 1480-1700. Or perhaps 1500-1750. Or maybe 1450-1800. Actually it really depends. Oh, and if you are outside of Europe and North America you might not recognise the term at all, being equipped with a completely different way to think about your national past.
Every year, universities across Britain and beyond place hundreds of advertisements on jobs.ac.uk seeking to appoint historians to academic posts. Although accessing this data is not easy and analysing it is far from straightforward, recent listings do provide some potentially useful information about the state of the academic job market for historians.
In my previous posts on jobs, I have focused primarily on the ‘supply side’ of the equation – how many people are completing PhDs each year and how that is changing. It is much more difficult to get hard data on the ‘demand side’ – i.e. jobs available – though I’ve tried various proxies such as the numbers of historians with university posts or the number of incoming history undergraduates. The cohort studies of Warwick and Cambridge PhDs allowed me to link doctorates to academic posts more directly, but that was only a small and probably unrepresentative sample.
New data and methodology
The advertisements on jobs.ac.uk allow us to get a better sense of how many jobs are actually listed in a given year.
My analysis of official public data suggests that the number of PhDs in history in the UK is growing significantly almost every year, whereas the number of undergraduates and university-based historians is expanding only very slowly. However, these figures are only really useful for providing a sense of changes in the relative balance between undergrads, teachers and PhDs over time. They do not provide any concrete sense of the likely destinations of successful doctoral candidates.
In September, Rachel Stone, a medievalist at KCL, put together a very useful ‘cohort study’ of the 66 people who had completed PhDs in history at Cambridge in 2005. Her headline result was that 36 of them (55%) seemed to have academic posts a decade later. She also found that modernists and men were more likely to have academic jobs than medievalists and women, though she noted that it is difficult to know which factor is the most influential as women were more likely to be medievalists. Katrina Gulliver did a similar analysis of those granted Cambridge history PhDs in 2007-8 and found that ‘fewer than 50% have a permanent academic job’ after seven years.
I did my PhD at Warwick (completed 2009, awarded 2010), so I thought it might be useful to look at some cohorts there as a comparison. However, as the Warwick History Department is much smaller than Cambridge’s History Faculty, I decided to look at a longer period, namely 2001 to 2013.