Sunday, July 13, 2014

Ritter Sport ratings 2014

Some more Ritter Sport. The first two I picked up and reviewed in Marburg at the DGfS conference, but apparently never got round to posting. The next two, Eiscafé and Chocolate Mousse, I picked up myself while passing through Munich on the way to and from Budapest for a conference. The last two were gifts from Tine Breban, who thoughtfully raided a petrol station to fuel my addiction.

Baiser Nuss: 8/10
These very fine, dry, crunchy nuts give one a taste experience akin to walking over gravel, with a warm autumnal aftertaste. Highly recommended.

À la Crema Catalana: 6/10
Not much of a likeness to the eponymous dessert (though there are hints). Again, rather yoghurty, with a strong and cloying aftertaste.

Eiscafé: 8/10
This experience is eerily reminiscent of the Continental caffeinated beverage with ice cream. That combination of warm and cold, in a chocolate bar? No way? Yes way, apparently.

Chocolate Mousse: 7/10
Ritter cleverly divided their square into 3x3 rather than 4x4 "für mehr Mousse-Genuss" ("for more mousse enjoyment"). That gives a nice squidgy bite to this one and to the next. Ultimately a bit bland, though.

Vanilla Mousse: 7/10
As above, really. Very creamy, lots of vanilla. Straightforward.

Caramel Orange: 9.5/10
Oh. Oh, my. I've carped on in the past about Ritter combining citrussy sharpness with much mellower tones, and this one really delivers the goods. The caramel is dreamy enough to float away on, while the orange delivers a sharp kick up the backside. My life is substantially better after having eaten this one. A rare Winter-Kreation, picked up out of season by Tine, for which I am eternally grateful.

Wednesday, April 02, 2014

Open Access Linguistics: You're Doing It Wrong

If you're a linguist - any kind of linguist - then you, like me, will probably have received an email from the Open Journal of Modern Linguistics, inviting you to submit your work.

I'm extremely committed to open access in linguistics, and in academia more broadly; here's why. But OJML is doing it wrong, and the rest of this post aims to explain why. The tl;dr list version of this post is as follows:
  • Don't ever submit your work to OJML.
  • Tell your friends never to submit to OJML.
  • If you know someone who's on the editorial board, gently ask them not to be.
So, what's so very wrong with OJML? The short answer is that it is run by the wrong people and threatens to bring the entire, very promising, open access movement into disrepute by charging stupidly high APCs and skimping on quality, both typographical and intellectual.

The "costs" of progress: predatory publishers

Let's take a look at OJML's guidelines on Article Processing Charges (APCs). It's $600 per article, but only if that article is within ten printed pages: in linguistics, that's barely out of squib status. For each additional page above ten, an extra $50 is whacked on.

This may not seem like much, given that Elsevier charge up to $5000. But for a 20-page article, which is still short by linguistics standards, we're talking $1100. Moreover, this kind of incremental model penalizes thorough argumentation and, in particular, proper referencing. It might not even be so bad if what you paid for was worth it - but I'll argue below that it isn't even close.
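Incidentally, OJML's pricing model is simple enough to write down. A quick sketch in Python (the figures are as quoted above; the function name is my own):

```python
def ojml_apc(pages):
    """Estimate OJML's Article Processing Charge in US dollars:
    $600 flat for up to ten printed pages, plus $50 per page beyond that."""
    return 600 + 50 * max(0, pages - 10)

# A 20-page article, still short by linguistics standards:
print(ojml_apc(20))  # 1100
```

For a paper of more typical linguistics length - say 40 pages - the charge climbs to $2100, which starts to approach the subscription publishers' rates.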

The open access community has a name for this kind of publishing practice: "predatory". Jeffrey Beall maintains a list of predatory publishers on his website, along with criteria for inclusion. Surprise, surprise: "Scientific Research Publishing" (SCIRP), the publishers of OJML, are on the list at number 206.

What's in it for them? Large amounts of money, made from academics' naivety. Last year, journalist John Bohannon conducted a "sting" operation by submitting 304 deliberately flawed manuscripts by fictional authors to gold open access journals, many of them ostensibly peer-reviewed. More than half of the journals accepted the papers, including many that apparently sent the paper out for review; 16 journals accepted despite their reviewers spotting the damning flaws.

The journal Science, who hosted Bohannon's piece, were keen to trumpet the failure of open access (unsurprisingly, as they represent the status quo that open access threatens). However, there are a lot of problems with Bohannon's approach, which have been ably summarized elsewhere. In particular, since Bohannon didn't include a "control group" of traditional subscription journals, there's no evidence that open access peer review practices are any worse than those. And even if they were, the existence of exploitative behaviour within open access doesn't entail that open access itself is a bad thing. But it's clear from Bohannon's experiences and those of others that, where there are new ways of making shady money, there will be crooks who leap to seize them; gold open access in general, and OJML in particular, simply illustrates this.

Bad production standards

One of the areas where any publisher can claim to add value is in ensuring the formal quality of their published submissions: typesetting, copy-editing, proofreading, redrawing complex diagrams or illustrations, etc. If a publisher does this well, they may merit at least some of the fees that they typically charge for open access. However, OJML's performance in this area shows that they hardly even look at the papers they publish. Here are some examples from Muriungi, Mutegi & Karuri's 2014 paper on the syntax of wh-questions in Gichuka (which, at 23 pages, must have cost them a pretty penny):
  • Glosses are not aligned (e.g. in (6) on p2).
  • The header refers to the authors, ridiculously, as "M. K. Peter et al".
  • There are clauses which contain clear typographical errors, e.g. "the particle ni which in Bantu, which is referred to as the focus marker", on p3.
  • In (17), the proper name "jakob" is not capitalized.
  • There are spelling errors: "Intermadiate", in table 1, p8.
  • The tree on p14 has been brutally mangled.
  • Some of the references are incomprehensible garbage: "Norberto (2004). Wh-Movement. http.www.quiben.org/wp.content/uploads"
A quick glance through any OJML paper will reveal that these aren't isolated occurrences, and little of this is likely to be the fault of the authors: at least, any linguistically-informed copy-editor or proofreader should have picked up on all of these points instantly, and any proofreader at all should have picked up on most of them.

Low quality papers

What about the academic quality of the papers accepted? I don't want to pick on any particular paper: in fact, I'm sure that there are nuggets of gold in there (the Muriungi et al. paper mentioned above, for instance, is a valuable syntactic description of an aspect of an understudied language). But I invite you to skim some of the papers and draw your own conclusions.

In particular, the dates of acceptance and revision of the papers aren't exactly indicative of a thorough review process. For instance, the paper by Muriungi et al. was "Received 7 June 2013; revised 9 July 2013; accepted 18 July 2013". Again, this isn't unusual for the papers in this journal. It's certainly not impossible for quality peer review to take place at this speed - and it's certainly desirable to move away from the unacceptable slowness of some of the big-name journals - but it is at least doubtful. And one thing that is extremely eerie is how many of the articles are dated as having been revised exactly one month after receipt, suggesting that the process may have been even shorter and that SCIRP is trying to cover itself, by means of outright lies, against exactly the kind of allegation I'm making.

The fields of linguistics given under their Aims & Scope don't inspire confidence, either, with "Cosmic Linguistics" and "Paralinguistics" among them.

Why is this important?

OJML is symptomatic of exactly the wrong approach to open access. Open access, to me, is about disintermediation, about putting power back into the hands of academics. There are several good open access operators out there: Language Science Press is a prime example in the domain of books, the e-journal Semantics and Pragmatics has been performing a valuable no-fees open access service for years, and the Linguistic Society of America recently took a step in the right direction by making papers in its flagship journal Language openly accessible after a one-year embargo period. These initiatives are all run by researchers, for researchers.

In contrast, OJML is about opportunistic money-making. Here's a quote from SCIRP's About page, in relation to why their base of operations is in China while they're registered as a corporation in Delaware: "What SCIRP does is to seize the current global trade possibilities to ensure its legitimate freedom with regard to where to do what." If this sort of creepy graspingness doesn't put you off submitting to OJML, and the problems outlined in the previous sections don't either, then I don't know what will.

Unless we nip this problem in the bud, then it threatens to damage the reputation of the Open Access movement more generally. Time to boycott OJML, and to spread the word.

Sunday, March 16, 2014

On dieting

There will be aspects of grumpy rant to this post, but in order to contextualize it I'll need to do a little autobiographical sketch first. Please excuse both the self-indulgence and the rant.

I've never been skinny, and I have reason to suspect I never will be (genetics, and also past experience; see below). When I was younger I was always one of the fat kids, though never one of the really fat kids, and because of that I was the butt of jokes. When I came back from my Year Abroad in Germany in 2007 I was like that: a bit flabby, but nothing too noticeable. Over the course of my fourth year I gradually put on a fair bit of weight, causing someone who hadn't seen me for a year in the summer of 2008 to make an oblique reference to "too much good living". There were probably a number of reasons for this: I had a flatmate who was an absolutely wonderful cook, but who indulged me in a lot of carbs, and besides there was the stress of the final year in Cambridge (at one point I wrote 16 essays over an 8-week period, I believe).

During my MPhil year (2008-9), nothing much changed. I was getting on well, but was eating the carb-rich diet I was used to, and sometimes snacking grotesquely. Besides that I was drinking a lot of beer. Towards the end of that year I felt like a change was in order, and I took up a friend's offer to introduce me to the local gym. The guy who did the induction seemed like a nice bloke, and offered some trial personal training sessions afterwards, which (after checking my bank balance) I accepted. At that stage when I stepped on the scales I was 107.6 kilos – well into the 'beached whale' section of the BMI chart.

Things changed. I stuck with my personal trainer, kept a food diary, and completely turned my diet around as well as exercising for an hour three times a week. Between the summer of 2009 and the autumn of 2010 I lost about 30 kilos of weight, which I'm told is pretty good going. Not sure exactly what my lowest weight was, but for a while I was consistently under 80 kilos. I didn't feel skinny, and that's because I wasn't: I still had a belly that jutted out, and some handles that shouldn't have been there. Though I felt pretty good about myself, I didn't have girls queuing up to check me out, and my BMI was barely into the "normal" range (for my height, 5'11", normal is below about 81kg, and obese is anywhere above about 97). Okay, these are both stupid metrics: girls aren't that interested in BMI, which in any case is a terrible measure of healthy body composition. I should certainly have cut myself some more slack: at this stage I was doing 10km runs fairly regularly, and did elicit one or two positive comments about my change of shape from people who'd known me for longer.
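For the curious, the thresholds I cite are just BMI arithmetic. A quick sketch, taking my height as roughly 1.80 metres and using the standard cutoffs of 25 ("overweight") and 30 ("obese"):

```python
def weight_at_bmi(bmi, height_m):
    """Weight in kg at which someone of the given height reaches a BMI value
    (BMI = weight in kg divided by height in metres squared)."""
    return bmi * height_m ** 2

height = 1.80  # roughly 5'11" in metres

print(round(weight_at_bmi(25, height)))  # 81 - upper end of "normal"
print(round(weight_at_bmi(30, height)))  # 97 - threshold for "obese"
```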

I managed to keep this up for... not sure how long. A year? By mid-2012 I was up to ninety-something kilos, anyway. Then I moved to Manchester, stopped worrying about my diet, stopped exercising, and just kind of hoped that living a normal life would cause me to stay at a healthy weight. Unfortunately I hoped wrong. By the end of 2012 I weighed in at over 100kg, so I signed up to the local gym. Time constraints and general apathy meant that I didn't go more than about 10-12 times over the course of the year, though. My weight has stayed pretty much constant since then, at about 111-113kg. I cancelled the gym membership at the end of 2013 - figured it was a waste of money - and instead bought an exercise mat and bench and some dumbbells. My reasoning was this: a) these things will last me for much longer than the year of gym membership, b) I can exercise in the comfort of my own home rather than surrounded by meatheads and scarily-fit old people, and c) strength training was always the part of my gym workout that I actually enjoyed. I would go running too, but the only local options seem to involve canal towpaths strewn with broken glass.

The weight gain is unsurprising: I was stressed with a new job and new place, and started eating pretty badly as well as stopping exercising. I'd like to get fit again, since I feel that my posture is being damaged by my oversized belly and I'm often robbed of breath by things that wouldn't have bothered me three years ago. Plus, who wants to be at increased risk of diabetes and heart disease?

The dieting is going to be crucial here, though, and I need to explain how that works for me. My basic meal structure hasn't changed since the time in Cambridge when I lost all those kilos. It goes like this, when I'm at home or on a normal working day:
  • Breakfast: poached egg; porridge (made with jumbo oats and water); pint of water
  • Lunch: salad (any combination of lettuce, peppers, tomato, cucumber, coleslaw, olives) with chicken, tuna or some other meat; banana; pint of water
  • Dinner: roast chicken (sometimes something else like breaded fish); two green veg (usually broccoli and green beans); small yoghurt or two; pint of water
I know this is a good template. I know that because it helped me lose 30kg of body fat. The problem is what else I do. When I'm at work I frequently buy a chocolate bar, a latte and a muffin as an afternoon snack. And on the way back from work or other events I will buy a bag of Sainsbury's double chocolate chip cookies and scoff the lot, and this is by no means a rare occurrence. I also snack a lot on cheese and oatcakes.

So that's me, and that's where I am today. I'm not my own best friend, sure, but judging by the above you might not expect me to be as fat as I am. (I also walk to work for half an hour every day and back, for instance.) That's bad genes for you, and I've learned to accept that.

This incredibly long confessional was meant to be a prelude to a grumble about dieting. It goes like this: pretty much every piece of dieting advice I've ever seen or heard is bad. The worst are the ones that pretend that you can keep eating the things you love. "It's not one of those faddy diets that require you to give up X and Y! You can keep eating wholesome and nutritious meals that are exactly the things you would eat anyway!" Really? WTF? If I wanted to do that, I wouldn't go on a diet in the first place. My sympathies are actually with the faddy diets, since at least they're not pretending to achieve the impossible. And the science behind the diets seems to be all over the place. The Hairy Dieters state that "we focus on the energy equation: your calories in via food and drink versus your calories burnt through exercise". There must be something to it, because it's a bestselling product, right? But I was told by my trainer in no uncertain terms that calorie counting was a radically misconceived approach to fat loss, since not all calories are equal – and that seems to be the standard line in Atkins-style approaches to dieting.

Something else I have heard on occasion is "dieting is bad". Really? Well, I guess it depends what you mean by dieting. All I mean is a change in diet, and there does seem to be evidence (to put it mildly!) that doing that is useful for weight loss. Certainly in my case it worked (for a while). So dieting can't be all bad.

But dieting is hard. Do you a) give yourself absolute prohibitions against certain foods, or b) acknowledge that certain foods aren't great and therefore resolve to limit your intake of them? The rigid a) approach has been problematic for me, since it leads to cravings of exactly those foods. The looser b) approach has in my experience tended to lead to "food creep" where the consumption of those foods has become more and more common. It seems to be a lose-lose situation.

The one thing that I'd hold up as fact throughout all the bullshit is this: dietary change requires willpower. There's just no way around that. I successfully dieted for long enough that I don't think it can be called a faddy phase that I eventually reacted against. If I had the willpower (and maybe I do?), I could do so again. But dieting is hard, and what I hate about most of the dietary advertising out there is that it pretends that it's easy.

Thursday, January 23, 2014

Wine per head of student population

Following the publication of this, giving the amounts spent on wine by the different Cambridge colleges, I thought someone ought to compare it to this, the number of students in each college, and do the maths. Here are the results:


College               Students   Total spent on wine   Per head
King's                  677      £338,559              £500.09
St John's               912      £260,064              £285.16
Jesus                   814      £212,256              £260.76
Trinity                1044      £223,291.98           £213.88
Pembroke                668      £141,692              £212.11
Peterhouse              412      £82,133               £199.35
Trinity Hall            641      £127,186              £198.42
Emmanuel                709      £131,127              £184.95
Sidney Sussex           581      £97,507               £167.83
Corpus Christi          490      £79,254               £161.74
Magdalene               542      £68,192               £125.82
Gonville and Caius      829      £96,994               £117.00
Christ's                614      £71,055               £115.72
Downing                 675      £77,798               £115.26
Queens'                 987      £111,112.64           £112.58
Churchill               801      £87,685               £109.47
Clare                   768      £79,989               £104.15
St Catharine's          695      £62,432               £89.83
Selwyn                  583      £49,498               £84.90
Robinson                556      £44,722.39            £80.44
Clare Hall              236      £17,400               £73.73
Murray Edwards          518      £32,917               £63.55
Girton                  699      £30,051               £42.99
Wolfson                 927      £39,647.10            £42.77
St Edmund's             459      £19,304               £42.06
Newnham                 656      £27,263               £41.56
Fitzwilliam             767      £23,028               £30.02
Darwin                  674      £17,510               £25.98
Hughes Hall             594      £14,033.58            £23.63
Homerton               1342      £27,974.55            £20.85
Lucy Cavendish          341      n/a                   n/a

Both sets of figures are based on the 2012-13 academic year. Figures per head are rounded to the nearest 1p.
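The per-head figures are just total spend divided by student numbers; here's a quick sketch of the calculation, using two rows from the table above as a sanity check:

```python
# (students, total wine spend in pounds) for two colleges from the table above
wine_spend = {
    "King's": (677, 338_559.00),
    "Trinity": (1044, 223_291.98),
}

for college, (students, total) in wine_spend.items():
    print(f"{college}: £{total / students:.2f} per head")

# King's: £500.09 per head
# Trinity: £213.88 per head
```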

There's no particular reason to suspect that the amount spent on wine would correlate particularly well with the number of students. Other factors are likely to be much more important: perhaps size of endowment, age of establishment, etc. And we shouldn't pretend that the students and staff actually get to drink all this wine for free. Many colleges actually sell their wine to the students for formal dinners. I suspect also that these figures include wine bought to be served at conferences hosted at the colleges, in which case the colleges will likely be making a hefty profit.

Still, when the data are presented like this, some small colleges (Peterhouse, Trinity Hall) come off looking spendthrift, and big colleges like Queens' don't look quite as bad. It's interesting also to note that colleges for a) graduates/mature students and b) women are clustered at the bottom of the table; too bad we don't have data for Lucy Cavendish, which is both.

Sunday, June 09, 2013

Ritter Sport again

In my brief jaunt to Bamberg & Würzburg last month, I only managed to pick up one new Ritter Sport to road-test. And it was a pretty good one:

Erdbeer Vanille-Waffel: 7.5/10
Well played, Ritter, well played. If this had been just a strawberry waffle combo, it would in all likelihood have been overly sweet. As it is, the vanilla adds a touch of creaminess to the proceedings, which in turn is then put in relief by the crunchy waffle. (The waffle makes this variety particularly satisfying to open using the Knick-Pack technique.) For some reason, I can't bring myself to rate this one among my all-time favourites, but it was nevertheless a highly pleasurable experience.

And here's an updated list of my all-time preferences:

Cherry & Mini Smarties: 10/10
Rhubarb, strawberry and yoghurt: 9/10
Milk Chocolate: 9/10
Alpine Milk Chocolate: 9/10
Knusperkeks: 9/10
Caramel & Nut: 9/10
Kakaosplitter: 9/10
Mixed Fine Nuts: 8.5/10
Corn Flakes & White Chocolate: 8.5/10
Cookies & Cream: 8.5/10
Nougat: 8/10
Cappuccino: 8/10
Hazelnut (milk chocolate): 8/10
Hazelnut (dark chocolate): 8/10
Edel-Bitter: 8/10
Rum, Raisin & Nut: 8/10
Orange & Marzipan: 8/10
Amarena Kirsch: 8/10
Fruits of the Forest & Yoghurt: 7.5/10
Peach & Passionfruit: 7.5/10
Bourbon & Vanilla: 7.5/10
Himbeer-Cranberry Joghurt: 7.5/10
Marzipan: 7/10
Blood Orange: 7/10
Raisin & Nut: 7/10
Coconut Batida Liqueur Truffle: 7/10
Vanilla Liqueur Truffle: 7/10
Knusperflakes: 7/10
Stracciatella: 7/10
Vanilla Cookie: 7/10
Waldbeer Joghurt: 7/10
Crema Catalana: 7/10
Milk & White Chocolate: 6.5/10
Alpine Cream & Praline: 6.5/10
Hazelnut & Almond Crumble: 6.5/10
Sunny Crisp (sunflower seeds): 6/10
Espresso Crunch: 6/10
Half Dark Chocolate: 6/10
Marc de Champagne Truffle: 6/10
Amaretto Truffle: 6/10
Whole Peanut: 6/10
Hazelnut (white chocolate): 6/10
Raisin & Cashew: 6/10
Nut in Nougat Cream: 5.5/10
Dark chocolate with Creme a la chocolate mousse: 5/10
Jamaica Rum: 5/10
Kakaocreme: 5/10
Peppermint: 5/10
Bourbon Vanille: 5/10
Whole Almond: 4/10
Golden Peanut: 4/10
Yoghurt: 4/10
Napolitaner Waffel: 4/10
Lemon: 3/10
Egg Liqueur Truffle: 3/10
Coconut: 2/10
Diet Half Dark Chocolate: 1/10

That means, by my count, that I've sampled 57 varieties. Heinz would be proud.

Wednesday, April 17, 2013

More chocolate!

This blog has got awfully serious, hasn't it? Time to talk about chocolate again. Today's instalment of the Ritter Sport ratings comes to you courtesy of my chocoholic colleague (chococolleague?) Tine Breban, who raided the stores of the Ritter Sport Welt in Berlin specially. Good batch, too!

Cookies & Cream: 8.5/10
This is an excellent translation of the concept of Ben & Jerry's Cookie Dough ice cream into chocolate-bar form. It has the right notes of salty crunchiness combined with sweetness, and feels delightfully indulgent.

Crema Catalana: 7/10
An inspired idea, rendering crème brûlée in Ritter Sport form. And it does translate well, with the creamy layer very reminiscent of its target. Rather lacking in execution, though; what I love most about crème brûlée is the slightly burnt, dark and crispy layer on the top, and this was missing from this interpretation, which consequently receives a somewhat lower rating than it may deserve on its own merit.

Himbeer-Cranberry Joghurt: 7.5/10
This one has crunch, but is overwhelmingly sweet. The cranberry, by contrast, is easy to miss entirely in the powerful assault of raspberry. I have a sweet tooth, but this one was really too much for me. A little less brutal and it could have been a firm favourite.

Thanks also to chococolleague Laurel MacKenzie (of TV fame) and chocoholic collaborator (chocollaborator?) Anne Breitbarth, who also kept me supplied with delicious chocolate over this beautiful Spring period. But because it wasn't Ritter Sport, you don't get to hear about it.

Tuesday, April 09, 2013

Journal of Historical Syntax: interim report

My little Journal of Historical Syntax has been in existence for a year and a half now. The Executive Committee of the LSA has requested some facts and figures on the eLanguage journals, and I thought that readers might be interested to see these numbers as well. Enjoy!

Since its inception in summer 2011, the Journal of Historical Syntax has received 13 submissions: 1 in 2011, 9 in 2012, and 3 so far in 2013.

Of those 13 submissions:

3 were rejected.
4 were advised to revise and resubmit (of which 1 was subsequently accepted).
4 were accepted with changes (plus the 1 mentioned above).
2 are currently under review.

36 individuals have been involved in reviewing. The average time between receipt of the manuscript and date of the decision (not counting papers that were not sent out for review) is 97 days. 2 peer-reviewed papers have so far been published (1 in 2012, 1 in 2013). For these two, the times between receipt of the manuscript and publication were 275 and 187 days respectively. The articles have received 158 and 138 views respectively, and their abstracts have received 454 and 257 views respectively.

2 book reviews have also been published (1 in 2012, 1 in 2013), and a third is in the works. The two reviews have received 200 and 106 views respectively, and their abstracts have received 420 and 184 views respectively.

Many thanks to all our reviewers, authors and readers!

Tuesday, April 02, 2013

The case for Open Access

The Pirate Party UK asked me to write a piece for them on Open Access. You can find it here:

http://www.pirateparty.org.uk/blog/2013/mar/30/the-case-for-open-access/

I'm glad I've finally got my views on this down in writing somewhere! This sort of material was originally intended for another "What's wrong with academia?" post, which I probably now won't write. It glosses over some important issues, such as the whole furore (pointless, in my view) around CC-BY licenses, and whether to opt for green or gold Open Access. But as a quick introduction to an increasingly complex debate, I'm quite happy with it.

Tuesday, March 26, 2013

The Manchester beneath our feet

In a previous post I've blogged about my inexplicable affection for railways, especially abandoned ones. It may then come as no surprise that I feel the same way about hidden networks of tunnels. I think part of it is a boyish excitement at seeing discrete mathematics realized concretely - the same reason the Beck map of the London Underground is so iconic. The angular, artificial dreamworld constructions of maps and networks forcing their way into a reality which seems determined to be fuzzy in so many other respects.

When I was a child, my Dad used to take me to Manchester's Museum of Science and Industry. There were all sorts of awesome things on display there: a little train which would tootle a few hundred yards before coming back (and which now occasionally wakes me up at weekends with its tooting!), an "experimental" area with all sorts of cute demonstrations of the power of physics, and an early computer which allowed you to take a virtual tour of the solar system while playing Holst's Planets. (This last is sadly now gone.) But possibly my favourite part was the Underground Manchester gallery. Here it was explained how the sewer systems of the city had developed over the two thousand years of Manchester's settlement, from the Romans to the present day. In the middle was a section of Victorian sewer you could walk through, built with genuine sewer bricks and featuring an inconspicuous model rat halfway down. My excitement at this was not diminished when I revisited it last year.

Then, in my quest to discover the optimal walking route from Castlefield to the University, what did I discover?

A black door. Surrounded by an (ineffectual) metal fence - but also the kind of British box hedge that screams "Nothing interesting going on here". Set into a concrete stairwell, and leading to... well, who knows? Underground, is all that matters.

So this got me interested in figuring out what was really there beneath our feet. Not in a conspiracy-theoretic way; that's not what floats my boat. But a city as big as Manchester must have some skeletons in the closet, right? Or at least a secret closet in which it could in principle store skeletons. Preferably an underground closet.

The internet is one's friend in these matters, and I soon found out about various things. The creepily-named "Arndale void" - apparently built as the first stage of a proposed tunnel leading from Piccadilly to Victoria station. The abandoned Manchester and Salford Junction Canal, with its tunnel from the Bridgewater Hall to the River Irwell near the Granada Studios, used as a public air raid shelter during the Second World War. And, most scarily of all, the Guardian Underground Telephone Exchange.

Now, if you'd told me that there exists a secret network of tunnels stretching from Ardwick to Salford, built to withstand a nuclear blast, I probably wouldn't have believed you. That sort of underground insanity challenges even my overactive subterranean imagination. But it does seem to exist: a recent exhibition, infra_MANC, co-run by fellow Manchester lecturer Martin Dodge and based on local government documentation, presented some of the results. The catalogue is still obtainable via the Modernist Society, and so I ordered a copy. It's well worth a look - it contains maps, photos, and all kinds of discussion. But if tantalizing speculation is more your thing, take a look at the pages here and here. A recent fire in the tunnel reportedly caused 130,000 homes to be without phone connections.

I'm going to order this book, and also go on this tour with some friends. Hopefully I'll be blogging more about this stuff in the near future!

Saturday, December 22, 2012

Prezi for teaching linguistics?

This semester I've been using Prezi as a visual aid in my lectures. I decided it would be a useful exercise to summarize my thoughts on it for the benefit of others, so that they don't have to reinvent the wheel by discovering all of Prezi's features and flaws for themselves.

The conclusion, for those of you with short attention spans, is that I probably won't be using it again, at least not for regular lecturing. But let's start at the beginning.

Scenario

I used Prezi for eleven one-hour lectures, which constitute the lecture part of the first-year course Introducing English Grammar at the University of Manchester. It's a basic course, designed to get everyone up to speed on a basic framework for understanding English grammar, from those who have no prior knowledge to those who might have substantial experience with grammatical terminology in a different framework. The course is highly terminology-laden: lots of names for things, and lots of tests one can apply in order to identify those things. There were about 220 students on the course this year.

Prezi, for those of you who haven't come across it, is a piece of presentation software marketed as an alternative to Powerpoint (and to Keynote and Beamer, which are basically the same thing: slides), even as a "Powerpoint killer" in some markets. I won't give a full introduction; take a look for yourself at some of the sample presentations on their website if you're interested. The key thing is that instead of just moving from slide to slide you zoom in and out and move around on one giant canvas.

On to the advantages and disadvantages.

Advantages

  • Wow factor. This is not to be underestimated. Students are a jaded bunch, and it's difficult to impress them with technology; but by and large they have responded well to the general look and feel of it. In my mid-term survey, which had 58 respondents, 74% found the presentations to be attractive, and 0% thought they were ugly. (Though this must in part be due to my general awesomeness as a designer, and not entirely to the software...) Obviously this wow factor will diminish the more people use Prezi for teaching.
  • Clearer conveyance of complex arguments. I genuinely believe Prezi is better for this than its slide-based competitors. Let's say you're presenting a list of things, for instance constituency tests. Rather than having a sequence of slides and presenting them one by one, you can have a kind of spider diagram and zoom in to each test in turn. And having some screens embedded in small form within other screens is great for capturing part-whole relations, and relative importance. In short, here I think Prezi lives up to its claims. In the mid-term survey, 41% stated that they found Prezis easier to follow than Powerpoint presentations, and 19% said the opposite.
Some of the positive comments I received in the survey:
  • "I definitely much prefer the prezi presentations compared to powerpoint presentations, and the Grammar lecture is one of my favourites because of this."
  • "The lecture slides are very attractive and engaging which helps make the concepts more memorable."
  • "The way the information is portrayed is much more interesting than a powerpoint presentation, and the examples are always useful to refer back to." 

Disadvantages

  • Poor facilities for typography. Prezi has no bold, underline, italic, superscript, subscript, anything like that. This means that using it for linguistics can be very frustrating. (Sure, you can create a "subscript" by creating a new text box and making it smaller, then manually positioning it in the right place. But that's not a sustainable solution.) Writing out formulas, for instance, or labelled bracketings, is virtually impossible. Prezi also gives you an extremely limited colour palette.
  • No tables. You have to draw all the lines by hand, and space it out by hand; it's not impossible, but very tedious.
  • Incredibly time-consuming. You probably figured this one out from the above two, but it's the main barrier to using this software in a sustainable way. Even a very basic lecture, with nothing fancy added, takes hours to create; much longer than it would in Powerpoint, at any rate.
  • Extremely buggy. The browser-based editor, which I used to create all my Prezis, is bugged beyond belief. Sometimes you'll copy-and-paste something, only to have it appear in a random position somewhere else in the presentation. Sometimes, after moving something, it will suddenly decide to spring back to where it came from. Sometimes line breaks will arbitrarily delete themselves. A colleague who used the desktop version of the editor tells me it's no better. This is unbelievably frustrating even for experienced users (I'd now count myself as one). There is a facility to import Powerpoint slides, but it's just as buggy as you'd expect.
  • Massive files, no handouts. You can download a "portable Prezi" to present offline, but the file size is usually between 40 and 50 megabytes. Not useful. Furthermore, it's difficult to create handouts. You can create PDFs of the screens (which are themselves large-ish files; mine were all between 4 and 13 megabytes), but there's no easy way to do, say, 6 per page. (I found a workaround for this, but like many workarounds it's time-consuming; just what you don't need with Prezi.)
  • Not supported on all computers. Though the portable Prezis are supposed to be standalone files, they need a certain version of Flash to work, and apparently sometimes need an internet connection as well. Anyway, some Manchester computers, such as the ones in the library, simply won't play them.
  • Motion sickness. Though only 2% of students (one) found the lecture presentations to be nausea-inducing, apparently this can be a general problem with Prezi if you do too much panning.
Some of the negative comments I received in the survey:
  • "The prezi presentations are very effective in lectures but when reviewing I haven't found a way of quickly getting to the slide I want except through flicking through all of them. Similarly, they can't be printed off except one to a page, so for both these reasons I prefer powerpoint for slides that I am going to refer to later."
  • "lectures slides are too long it would be very helpful if  you use powerpoint instead"

Evaluation

Overall, then, I'm sad to say that I think the negatives outweigh the positives. The "wow factor", as I noted, is only relevant to the extent that Prezi is a minority technology: if everyone uses it, it will become much less impressive. That leaves its only lasting advantage as the clarity of representation of complex chains of thought. While that's nice to have, it doesn't justify the time and effort spent creating the presentations, or the additional problems created for students who want handouts.

I therefore can't recommend Prezi for regular lecturing, and won't be using it in that function myself in future. Still, it was a fun experiment, and I've certainly learned from it - and I hope you find this useful too!

Sunday, August 26, 2012

More chocolate

Greetings (and ratings) from Stuttgart, where everything is closed on Sundays. (Ah, Catholic Europe, how I love thee.)

Waldbeer Joghurt: 7/10

A tasty treat, though very sweet and rather gooey. This variety certainly succeeds in bringing out the distinctive sharp taste of the berries - it's a bit like a Fruits-of-the-Forest cheesecake encased in milk chocolate. Only a slight yoghurty clogginess detracts from it.

Napolitaner Waffel: 4/10

Though wafers and Ritter Sport milk chocolate are good on their own, I don't think the combination quite succeeds. Wafers are at their best when dry and crispy, which the very creamy chocolate coating prevents. The result is a bit soggy and nondescript, though by no means unpleasant.

Friday, July 20, 2012

What's wrong with academia? Part 1A

I wasn't planning to write anything more on the issue of job security, but I've been really pleasantly surprised by the number of people who've taken the time to engage seriously with my previous post, both in blog and Facebook comments and in private responses. Thanks for your thoughts - I really appreciate it. And I hope that the debate has helped a few people to clarify their own position on this issue, whatever that might be. It's certainly had that effect on me.

I should start by saying that I am extremely unlikely to be in a position where I can implement any of the sweeping changes I proposed. That's for the best, for a number of reasons. For one thing, like Neil (Facebook comment), I'm actually more conflicted than the previous post made out; in that post I was trying to take a line of argumentation to its (approximate) logical extreme, and though it's an extreme that I am sympathetic to, I'm not too fond of extremes in general. For another thing, I'm not sure I'd have the balls to make big changes like this.

I think two major issues have been raised with regard to the alternative system I sketched (as well as a host of more minor ones, such as the increased danger of funding cuts under such a system, as Christine pointed out in a blog comment, and the difficulty of keeping long-term projects afloat, as Katie pointed out in a Facebook comment). These are: "juking the stats", and the issue of job security as an incentive per se (the "family argument"). I'll address these in turn.

Juking the stats
"Impact is up 42%, and the Mayor's gonna love our project on the Big Society."
I think this issue was stated most clearly by Tim (Facebook), Lameen (blog) and Unknown (blog), though in different ways. It's closely related to the "flavour of the month" approach to research funding mentioned by Orestis (blog). Essentially the key problem as I understand it is this: the intention of abolishing permanent positions is to force academics to continue to come up with innovative new work. But one alternative for academics is to become cynical, and to try to game the system by either a) producing a load of hackwork (or at best work that's a "safe bet") and passing it off as research activity, or b) deliberately focusing your research priorities on what others think is awesome (grant-awarding bodies, employers, research assessment bodies, the media) and generating hype and hot air rather than ideas. (On reflection, I guess that a and b are variants of one another.)

This is a genuine concern, and a clear potential practical problem for any approach like the one I sketched. It's worth mentioning that it's a problem right now as well. For instance, in Lisbon recently I was discussing with colleagues a project that had been awarded vast amounts of money by a major grant-awarding body but that seemed to us to be mostly spin. Similarly, as I mentioned in my previous post, research assessment as carried out at present is not enormously difficult to juke, at least insofar as the intent of research assessment is to assess research quality: the metrics used by, for instance, the REF in the arts and humanities are a fairly poor reflection of that. (Publication counts, essentially: you have to submit four; monographs count for two [why two? why not four, or ten, or zero?].) Other metrics used as a proxy for research assessment at present are also not great: citation counts, for instance. It's not as if you cite something solely because you believe it's wonderful research.

Given that the problem exists now, it would only be quantitatively greater under the approach I sketched, not qualitatively different. This leads me to suspect that the issue is an independent one: can a robust metric for research quality or for innovation be devised? I've seen no demonstrative argument to the effect that this is impossible either in principle or in practice (though I'm damned if I can think of anything that would work). More generally, though, when it's put this way it's pretty clear that the increased influence of juking the stats under the approach I outlined is not an argument against the approach. Consider an analogy from the school system. In order to assess pupils' achievements (as well as teaching efficacy etc.), exams are needed. This much is uncontroversial, though the exact extent of examination at primary and secondary level gives rise to heated debates. Now consider a system in which pupils only take one examination - in order to assess their suitability to enter the school in the first place (sorta like the old 11+ in the UK) - and then are left to their own devices, without any assessment. They might advance from year 7 to year 8, say, but this (as, ultimately, in the school system) would be based solely on age. This seems to me to be fully analogous to the current system of permanent academic positions. (In particular, though it's not unheard of for pupils to repeat a year, being demoted to the year below on account of poor performance is not something that often happens, to my knowledge.)

The point is that one has to doubt any argument that goes as follows: "Assessment (of pupils, academics, the Baltimore police force, etc.) is really difficult, and all metrics so far devised are imperfect reflections of what we're actually trying to measure. Therefore, let's not do any assessment at all past a certain point." At best it's a slippery slope argument, and we all know that slippery slope arguments lead to much, much worse things. ;)

The family argument
"Won't somebody please think of the children?"
This is the argument most clearly and repeatedly made against my position, e.g. by Chris, Liv, Katie and Neil (Facebook) and Darkness & Light (blog) and by more than one person in private responses as well.

There are many strands to this argument, but before I mention them I should perhaps explain why in my first post it seemed like I was dismissing the family argument so cavalierly. Underlying that post was the desire to optimize the individual academic's research output. I was tacitly assuming that this is the only goal of academia - which of course it isn't. There are many other sides to academia: teaching, admin (yay!), training others to become good researchers, etc. While the approach I sketched might be good for the research output of individuals, it doesn't look as promising for any of these other sides.

One strand of the family argument is simply a human argument: it's not as good for us as people if we don't have permanent jobs. We can't plan in advance to nearly as great an extent, and of course it's much harder to do things like buying a house and raising a family. Well, this is all obviously true, though of course it will bother some people more than others. I personally don't particularly want to raise a family; I have no particular ties; I am young and mobile. (To those of you in different situations, this particular bias must have seemed painfully obvious from my post.) To the extent that optimizing individual research output is the goal, however, it's irrelevant.

However, note the word "individual" with which I've carefully been hedging. As Chris pointed out in his Facebook comment and subsequent clarification, if we consider the research community as a whole, that could suffer. People who do want to raise a family might decide that academia is not for them, and we might have a mass exodus on our hands. This reduces the "genepool", and is hence bad.

There are a couple of ways of responding to this criticism, though both are super tendentious. First of all, one might hold that the absence of permanent positions shouldn't be restricted to academia but should be more prevalent at large. (As, in fact, it already is among people of my generation. One good friend has had several jobs now, in the real world, and found career advancement to be nearly impossible - putting this down to the fact that "old people can't be fired".) If the whole world worked in the way that I've been suggesting, then academia would just be one field among many.

Secondly - and I should emphasize that I don't believe this, though the argument could in principle be made - do we really need all those people who would leave the field? Academia is already massively oversubscribed to the extent of the job market being a joke, at least in the arts and humanities. But the smaller genepool must be a bad thing in itself - unless it could be argued that the people who desire permanence, who want to raise families etc. are inherently less good at research than flexible, asocial freaks like me. But I really don't want to go down that road; I'll just note that it's an open question, which could presumably be investigated empirically. (Actually the argument could be put the other way round, as one private response to my post did. If academia is robbed of all the people who are embedded in stable social contexts such as families, it becomes distanced from the social "mainstream", which encourages precisely the kind of philistinism I was scared of in my previous post.)

The final key strand of the family argument is not about families: it's about the other roles of academics. Certainly for teaching purposes, constant change is bad. Departmental leadership and continuity of that kind will also suffer. Perhaps most importantly, as again emphasized in a private response, the role of senior academics in mentoring more junior academics would be compromised. Again, on a narrow reading of optimization of the individual research output, none of this is a problem. But again, if we consider the output of the research community, it's bad.

In this section I haven't been concerned with defending my original argument, at least not beyond pointing out the tacit (and, ultimately, flawed) assumption that underlay it. There's more to academia than the individual's research, that much is clear.

Well, I think I'll stop here. Other interesting points were raised; in particular, my impression is that a lot of the sort of changes I'm suggesting are already in place in the sciences (and that people heartily dislike them). But I don't have the background or knowledge necessary to consider that further, and I wouldn't want to generalize beyond the arts and humanities (which is itself a stretch from linguistics). So, yeah.

Saturday, July 14, 2012

What's wrong with academia? Part 1: Job security

Update, 17th March 2021: I wrote this post nearly a decade ago, and have since become convinced that it's the single worst thing I've ever written. This is especially true given that, at the time, I'd recently taken up a permanent position myself, so it's sick-makingly tone-deaf. Unsupported assertions about 'human nature', unironically appealing to 'meritocracy'... honestly, it'd be better for my reputation if I just deleted it, or retconned it à la Dom Cummings. I'm leaving it here only for the sake of intellectual honesty and accountability. Perhaps unsurprisingly given the fierce reactions this post engendered (see the comments), I never ended up writing parts 2 and 3.

What follows is a collection of musings on various topics that have come to bother me during my first six months in a lectureship. In the interests of structure, I'll focus on three main areas: job security, the relationship between teaching and research, and publishing.

If you're familiar with my general left-wing leanings, you might think you can already anticipate the bones of contention that form the skeleton of this blog post. With regard to job security, for instance, one might expect me to bewail the decreasing availability of permanent positions; and one might expect me to extol the virtues of the oft-unnoticed synergies between teaching and research. In both these cases I will do neither of these things; if anything, the complete opposite viewpoint will emerge. (With regard to publishing, given my own editorial activities, the thread of argument will be a bit more predictable.)

Whether any of this is consistent with the aforementioned left-wing leanings or with my life philosophy in general, or whether I should instead be counted among the hypocrites, is an interesting question. I'm convinced that my stance is consistent, but that's a discussion for another time; in any case, I do welcome thoughts on this or any other part of the post.

1. Job security

As I've mentioned, it's fashionable and commonplace to find the decreased availability of permanent academic positions deeply worrying - so much so that it's entered into mainstream media discourse. Now this seems to go hand in hand (at the moment, at least) with a general decline in the availability of academic jobs tout court. I'd be the first to say that the latter is an extremely worrying trend, especially when coupled with the general philistinism as regards academia in the UK. Consider the following comment, a response to a Guardian article about the AHRC supposedly being told to study the Big Society:
The country spends £100m on 'arts and humanities research'???

Please cut it all and let's see if we miss it....
Worryingly, this comment is 'recommended' by 62 people... and this is the Guardian we're talking about, not the Daily Fail. And in the meantime, we pay £2 billion a year for a collection of Cold War relics to gather dust, and some people defend this with their lives. Ho hum.

So I'm against a reduction in jobs across academia as a whole. However, this issue is logically separate from the question of whether those jobs should be permanent or temporary/fixed-term. What's more, I've never heard a good argument for permanent academic positions.

Permanent positions make a necessity out of virtue. They are disproportionate post hoc rewards for research achievements, and give no incentive to advance the state of knowledge (which I take to be the primary function of academia as a whole). Let's say you write a decent PhD thesis and a few publications, meet some nice people at conferences, get lucky, and then end up with a job for life. Why is this considered to be a good thing? From that point onwards, it's human nature to kick back and do nothing. From my observations of other supposedly research-active staff (admittedly a small and varied group), if this happens, the worst that the university can do to you is shout at you a little bit. But because you're contractually protected, you can more or less continue to do nothing with impunity.

But let's say that's not the case. Let's say that instead you sit down and churn out the four publications needed to become REFable every few years - or even more. Where is the incentive to innovate, to produce research that will change the state of ideas?

Worse is that academic advancement (at least in the fields with which I'm familiar in the arts and humanities) is still so closely tied to age. 'Being on the ladder', many reflexively call it, and with good reason. Once you're in at the ground floor, every decade or so, a promotion comes along and you go upstairs. You never go downstairs again. Who ever heard of a reader being demoted to lecturer? Or a professor to reader? Why not? Furthermore, ask yourself how many professors you've met who are under the age of 40. Then think about who's doing the top quality research in your field right now - the work you're really excited about, the work that is changing the way people think. How old are they? What is their job title? Whatever the outcome, chances are this group of researchers won't be anything like coextensive with the 50-something professors who have climbed highest on the ladder. This fact seems to be so obvious that I'm amazed at the level of acceptance that exists for it. At best one can conclude that pay in academia isn't in any way performance-related.

My solution? Well, it's not a novel one. One's position at a given time should be related to two things: a) the quality of the work one is doing at that time (in practice, since this is difficult to assess, a fixed time span immediately preceding can serve as a proxy) and b) the quality of one's research proposal. There was a massive outcry a while back when King's College London threatened to make everyone reapply for their own jobs. In principle, as long as the total number of jobs and amount of funding stays proportionate, I think this is an excellent idea. It forces researchers to think about exactly what they're doing and why - and to up their game in order to stay in it. I can see no harm in stipulating that academic positions last for a maximum fixed term of five years. In fact, a lot of good would surely come out of it.

Now one could object that the proposal I'm making here is precisely what grant funding is supposed to achieve in the UK. My response is twofold. Firstly, grant funding (again, at least in the arts and humanities) constitutes only a small amount of the money academics receive: I don't have numbers, but I'd wager that far more is paid on an annual basis to salaried, tenured professors. Therefore, the grant funding solution doesn't go nearly far enough. Secondly, the grant application system is so massively broken in the UK as to be almost completely worthless from the point of view of advancing the state of knowledge. The reason is a classic Catch-22. Grant applications to bodies like the AHRC are like double-blind peer review - except that, crucially, the reviewers know exactly who you are. They need to know this (so I'm told) because they need to assess your suitability for leading a project team, and for managing grant money. How is this assessed? Well, of course in terms of your experience of leading a project team, and of managing grant money. If speculative business financing in general worked on this basis... well, it wouldn't. Work, that is. No interesting project would ever get off the ground. The emphasis on grant-handling experience is particularly bemusing in light of the fact that actually AHRC-funded projects often have no obvious output or endpoint at all. (I use the term 'output' non-traditionally here, to refer to 'any resource that advances the state of knowledge' rather than the more typical 'publications'.) It seems that the AHRC and bodies like it have little concept of what it means for a project to be successful; which makes it all the more odd that they set such store by the ability of the project leader to achieve success. (Once again, let me emphasize that publications in and of themselves are NOT 'success'. This will become a lot clearer in part 3.)

The preceding two paragraphs are perhaps a bit deliberately polemical, but you should at least be disabused of the notion that funding bodies are the great levellers. Even if funding bodies played a significant enough role in actual funding to be the deal-breaker, they couldn't guarantee the advancement of knowledge, because their priorities are wrong and their funding criteria flawed.

The moral of all of this? Academics make such a big deal out of meritocracy in principle that it's hard to see how things could have gone so drastically wrong. Throughout your school, undergraduate and graduate career you're fighting to jump through the next hoop, to advance yourself, to educate yourself. Then when you enter the job market the logic is reversed: you find a hole to crawl into, where you'll be paid a reasonable sum of money. And if you churn out enough publications, take care not to ruffle any feathers in teaching or administration, and maybe get a grant or two, you'll probably get promoted every ten years or so. Whatever happened to onward and upward?

Tuesday, June 19, 2012

Chocoholics anonymous

Went to Norway this month. Went on the train via Germany, just so I could get some Ritter Sport on the way. What?

Kakaosplitter: 9/10

This one tastes like crushed-up cocoa beans in chocolate cream encased in chocolate, and indeed it's hard to imagine how that combination could fail. This is an energy-granting variety, which wasn't particularly advantageous for me given that all I had to do that day was sit on trains for 9 hours - but in addition it really lifted my spirits. A lovely balance of smooth and slightly crunchy, one of this year's spring varieties.

Amarena Kirsch: 8/10

Not bad at all. Very fruity, but not overpowering, either in terms of the fruit or the modest liqueur content. Perhaps a little overly sweet, and - as with many varieties - this might have been overcome with the use of dark chocolate. But all in all this was a very enjoyable eat. A summer variety.

Bourbon Vanille: 5/10

Here you could barely taste anything except the vanilla-flavoured yoghurty goo. Not offensively bad, just boring and ill-judged. A spring variety that won't compensate for the April showers.

Saturday, May 19, 2012

Back to Babel?

I've just finished reading David Bellos's recent book on translation, 'Is That a Fish in Your Ear?' (Penguin, 2011). It veered between an entertaining read and a frustrating one. Perhaps unsurprisingly, I found the book to be at its least entertaining and convincing when it touched on subjects in my own area of expertise, linguistics. On the other hand, aspects of the book – such as the story of the dragomans of the Ottoman Empire (chapter 11), and the paradoxical language policies of the European Union (chapter 21) – were fascinating, and well narrated.

This isn't meant to be a full review, but since I feel that ITaFiYE misinterprets linguistics and/or sells it short at more than a few points throughout the book, I decided to set the record straight on the open web, especially with regard to three particular chapters.

Chapter 6: Native Command: Is Your Language Really Yours?

This chapter, on what it means to be an L1 or L2 speaker of a language, starts off promisingly enough. Bellos correctly observes that the traditional term 'mother tongue' is misleading, since we learn our L1 just as much from our peers as from our parents. He then goes on to claim that 'communicative competence' is acquired 'between the ages of one and three' – but that the language learned during this period is not always the one that adult speakers feel most comfortable using. The example he cites for this is Latin 700–1700, which uncontroversially had no native speakers during this period, but which was used as a vehicular language for various purposes. Then comes an astonishing leap (p59):
But if a clear distinction can be made between the language learned from your mother and the language in which you operate most effectively for high-born males in Western Europe between 700 and 1700 CE, the very concepts of 'mother tongue' and 'native speaker' need to be looked at again.
Um, really? The distinction seems pretty clear to me. The muddying of the waters in Bellos's book starts with the use of the nebulous term 'communicative competence', which does not enter into mainstream definitions of native speaker status, for which grammatical competence is far more crucial. This is a minor quibble. But the claim that high-born males operated most effectively in Latin for a millennium is an incredible one. It may well be the case that for 'formal speech and writing', as well as for 'diplomacy, philosophy, mathematics, science and religion', Latin was the language of choice. However, these highfalutin academic purposes constitute a tiny minority of our total language use. Is the claim really that these people spoke (as adults) to their parents and peers in Latin in everyday situations? The suggestion that they 'thought' in Latin is even more absurd. There's a long literature on the 'language of thought' and how closely it approximates the languages we hear spoken, but the idea that a 15th-century Dutch nobleman, say, would wake up and think Esurio ('I'm hungry') does not enter into it. The evidence from vernacular written traditions in Western Europe also speaks against this assertion. From the very beginning of the period 700–1700, writing – even for academic and ecclesiastical purposes – began to be carried out not in Latin but in the local languages of the area. Alfred's great program of translation into West Saxon English (not mentioned in this book), or the monastery translations of Boethius and other such texts in the Old High German-speaking area, are prime examples. These do not indicate that Latin was a language of thought, or even an effective operating language. Instead they indicate that the Latin of the period was a language on life-support, for which cribs had to be devised so that keen young men could get their heads round it.
The worst part of this little paragraph is that even if it were true that a distinction could be drawn between 'the language learned from your mother' (read: first language) and 'the language in which you operate most effectively' in this instance, it wouldn't follow that this somehow invalidates the concept of a 'native speaker'.

This distinction continues to be made throughout the chapter, with the implication that languages learned during the early years of life are of little importance. There is, no doubt, a difference between 'first learned language', in Bellos's terms, and 'operative language'. Bellos then adduces two examples – his father, whose mother spoke to him in Yiddish but who learned English at school age, and his wife, who initially acquired Hungarian but who began to learn French at the age of five. The aim seems to be to deny the significance of the 'first learned language' or 'mother tongue'; and in these terms, it's a reasonable aim. But it misses the point that linguists and specialists in acquisition are trying to make when they talk about something called the 'critical period' or 'critical threshold', a term dating back to Lenneberg (1967). Very simply, in Trudgill's (2011) terms:
Lenneberg’s term refers to the well-known fact, obvious to anyone who has been alive and present in normal human societies for a couple of decades or so, that while small children learn languages perfectly, the vast majority of adults do not, especially in untutored situations.
To be sure, there is disagreement about what the relevant age is, or whether the term 'critical threshold' is really appropriate as opposed to a gradual tailing-off of language learning abilities. Meisel (2011) provides a recent summary. But what is uncontroversial is that adults do not learn languages as well as children. If, indeed, it is possible to isolate a specific age at which language learning ceases to be a cake-walk, that age is more like 7 (Meisel 2011: 134) than the 3 proposed by Bellos. Both of the examples that Bellos gives, then, may be evidence that 'first learned language' or 'mother tongue' is not what is important. But neither is problematic for the idea that languages learned during the critical period are learned better than languages learned after.

This misrepresentation colours the rest of the chapter, including Bellos's conclusion (pp65–66), to the effect that it is not important for translations to be into the translator's L1:
[I]t would be futile to insist that the iron rule of L1 translation be imposed on all intercultural relations in the world without also insisting on its inescapable corollary: that every educational system in the world's eighty vehicular languages devote significant resources to producing seventy-nine groups of competent L1 translators in each cohort of graduating students. The only alternative to that still utopian solution would be for speakers of the target languages to become more tolerant and more welcoming of the variants introduced into English, French, German and so forth by L2 translators working very hard indeed to make themselves understood.
There are a few things wrong with this. First, there's a straw man hiding amidst the prose. If we don't accept L1 translators, do we really have to devote huge amounts of money to training enormous numbers of language professionals? (Not that that would be a bad thing, in my opinion.) Not at all, because of something that Bellos himself mentions later in the book: translations can perfectly well be carried out via other languages. To find a translator from Welsh into Cantonese may be tricky, but when the translation passes via English it's a piece of cake. Of course there's a loss of fidelity involved in this two-step process – but one of the most convincing parts of Bellos's overall thesis (see chapter 10) is that the very notion of 'fidelity' is suspect when applied to translation anyhow.

More problematically, it seems no less 'utopian' to believe that L2 translators 'trying very hard' is the right solution. I'm not a translator, but I've worked in the industry, and my father's been in it for thirty years, so I feel I know enough to comment. 'Trying very hard' may be good enough when it comes to translation on a hypothetical academic-philosophical level, or literary translation (addressed in chapter 27 of Bellos's book but assumed tacitly throughout in the examples used). But in the real world of translation, mistakes can be deadly. When I was working in Aachen, translating patient information leaflets from German into English, I was acutely aware that if I made a mistake people could be killed. With that in mind, it seems daft to insist that well-meaning L2 translators are as good as the real deal.

(There's also at least one factual error in this chapter. It is stated that 'all babies are languageless at the start of life'. That's not quite true: in fact, the process of language acquisition begins well before birth, as shown in experimental work by Kisilevsky et al. 2003 among others.)

Chapter 14: How Many Words Do We Have For Coffee?

Unlike chapter 6, this chapter sets out to argue for something reasonable. Its aim is to assess the evidence for linguistic relativity – the idea that language shapes thought – and its conclusion (in stark contrast to the gushing quasi-religious masturbatory rhetoric we so often see in the popular press surrounding the issue, for instance Boroditsky 2010) is sensible (p170).
If you go into a Starbuck's and ask for 'coffee' the barista most likely will give you a blank stare. To him the word means absolutely nothing. There are at least thirty-seven words for coffee in my local dialect of Coffeeshop Talk ... You should point this out next time anyone tells you that Eskimo has a hundred words for snow.
This is in general a strong and interesting chapter, even though the work of 'neo-Whorfians' like Boroditsky and Levinson over the last decade is rather unaccountably left out of consideration.

My problem with it is only in how it begins: Bellos trots out a well-worn passage from Sir William Jones's 1786 Discourse, commenting (p161) that this 'is generally reckoned to be the starter's gun' in the development of comparative linguistics. The idea that Jones had this pivotal role is part of the origin mythology of historical linguistics, to be sure – but his significance has almost certainly been massively overestimated, as shown in detail by Campbell & Poser (2008: chapter 3). You can read their chapter if you want the real story, but the gist of it is this:

Firstly, Jones was not particularly original in his contributions. Commonalities between Indo-European languages had frequently been observed before his time, and even the relationship of Sanskrit to these languages was not a new idea.

Secondly, Jones made a lot of mistakes. He considered Peruvian, Chinese and Japanese languages to be part of the same family as the more familiar Indo-European languages, for instance, while leaving out others he should have included, such as Pahlavi, which he classed as Semitic (Campbell & Poser 2008: 37–38).

Thirdly, Jones was working within a biblical framework and viewed his own work as having 'confirmed the Mosaic accounts of the primitive world'; specifically, all the languages of the world could be traced back, according to him, to one of Noah's three sons Ham, Shem and Japhet (Campbell & Poser 2008: 40) – hardly the methodological backbone of comparative linguistics as we know it.

There's more to say, but the point should be clear enough. The idea that Jones was the founder of comparative linguistics is just as much of a myth as the idea that Eskimo has one hundred words for snow. The repetition of the myth is frustrating within the narrow confines of linguistics, and the situation can only get worse if books like this one, intended for a popular audience, perpetuate it further.

Afterbabble: In Lieu of an Epilogue

Epilogues are typically unambitious: summaries of the content and main argument of the book, perhaps, or suggestions for future research. Not so for the ITaFiyE epilogue, which tries – in 34 short pages – to solve the problem of language, the universe, and everything.

Well, perhaps that's overstating the case. But it does attempt to address the problem of the evolution of language, which is almost as thorny an issue. As modern theorists are fond of observing, in 1866 the Linguistic Society of Paris banned debates on the subject. Those same modern theorists often then argue that we have come far enough, nowadays, to lift the ban and talk about the origins of language with impunity. I disagree – though even among linguists I feel like I'm still in the minority here. We're barely any closer to understanding the genetic basis of our language capacity than we were a century ago, and there is still substantial debate as to what language even IS. The concrete proposals made by Noam Chomsky to that effect, as for example in Chomsky (1986), are very often rejected on the basis that they don't tally with our hazy pretheoretical intuitions about language – such as the idea that it is a social phenomenon, whatever that means; see e.g. Enfield (2010). We don't know nearly enough about human prehistory to say when language emerged, and that situation is unlikely to change. Most painfully, very often theorists still fail to distinguish between 'glossogeny', i.e. change in 'languages' (as we pretheoretically know them), and 'phylogeny', the emergence of the human biological capacity for language (whatever form that takes). (On this distinction, see Hurford 1990.)

In historical linguistics, meanwhile, it is quite normal to suggest that standard comparative methods typically can't take us more than 8,000–12,000 years into the past (Campbell & Poser 2008 have a discussion of this); this is not due to any flaw in the methods themselves, but rather to the build-up of confounding factors and the paucity of relevant data the further back one goes. Any claim about what was going on 40,000 years ago or more is likely to meet with extreme scepticism from any sensible historical linguist. Nevertheless, this is what specialists in the evolution of language get up to constantly. It's perhaps not surprising, then, that I can't shake the feeling that the whole field is a waste of time. Until someone has something more evidence-based to say, I'm inclined to take the simple route proposed by Berwick & Chomsky (2011): a tiny mutation emerged at some point, in one fell swoop, giving us the ability to put words together like we know we can; that mutation was (unsurprisingly) selectionally advantageous in the long run; and that's all there is to be said. (Curiously, critics of this 'saltationist' viewpoint are often the same people who rake Chomskyan linguistic theory over the coals for its apparent baroque complexity...)

But back to Bellos. He attacks the assumption that 'all languages are, at bottom, the same kind of thing, because, at the start, they were the same thing' (p341). Whether or not we believe the 'because' clause (and there's certainly no linguistic evidence that would lead us to; see again Campbell & Poser 2008), Bellos gives us no reason to doubt the underlying sameness of languages. The fact of linguistic diversity has very little bearing on this; the very fact that we have a concept of language at all, on the other hand, even a pretheoretical one, is evidence for sameness. At some level, we can judge whether something is linguistic or non-linguistic. That alone suggests unity in diversity.

Beyond that, though, there are linguistic arguments for sameness, many of which have been controversial over the years. Bellos groups these together as the argument that languages have 'a grammar', and disposes of it quickly and unconvincingly: rather than being an empirical matter, 'the "grammaticality hypothesis" is an axiom, a circular foundation stone' (p342). Why? Because...
Since traffic lights and the barking of dogs seem to have no discernible rules of combination or no ability to create new combinations, they have no grammar, and because all languages have a grammar in order to count as languages, dog barking and traffic lights are not languages. QED.
One might legitimately argue that this is hardly circular reasoning. Rather, we're attempting to understand the defining characteristics of language in terms of concrete properties, in order to establish what exactly it is that we're doing when we describe something as linguistic or non-linguistic. If these properties turn out to be a poor match for our intuitive conception of language, or not sharp enough to distinguish language from other things, we can reject them, refine them, retain them on the understanding that no better model has yet been proposed, OR decide that our concrete properties in fact tell us that our intuitive conception is wrong or not clear enough. This is what has happened over the last fifty years with Hockett's (1960) 'design features' of language (which seems to be what Bellos is bashing here, though he doesn't cite it), and with purported universals both in the Chomskyan tradition and the Greenbergian (e.g. Greenberg 1963). It seems to me to be normal scientific practice. For example, we now know that spiders are not insects, despite all our intuitions telling us otherwise, and this is a natural consequence of adopting a particular model of taxonomic classification that is superior to our vague intuitions. We now know that whiteness isn't necessarily a property of swans. Likewise, by the botanical definition we adopt, tomatoes are classed as fruits rather than vegetables. This isn't just axiomatics: we're learning something that we didn't already know. This is scientific progress.

But Bellos, like many authors in and around linguistics, refuses to give up on the intuitive definition. He proceeds as follows (p342):
In a similarly circular way, the axiom of grammaticality pushes to the edge of language study all those uses of human vocal noises – ums, hums, screams, giggles ... and so forth – that don't decompose neatly into nouns, verbs and full stops.
Quite apart from the disingenuousness of this comment (no one ever seriously proposed that full stops were a property of human language, axiomatically or otherwise, as Bellos must well know), I fail to see the problem. Describing these types of noise as 'non-linguistic' seems to me to be entirely fair and reasonable. (Note that this is very different from saying that they don't constitute a worthy object of study in their own right, a claim that no linguist I know would want to be associated with.) What we discover by doing this kind of scientific work is that our intuitive conception of language is so fuzzy and all-encompassing as to be effectively unusable, a point that Chomsky has been making for years (see again Chomsky 1986). If, like Bellos, you find definitions in the Chomskyan mould unpalatable, then the onus is on you to come up with a better operational definition if you want to be thought of as doing serious work on language.

After listing various ways in which languages can be odd (evidentials always seem to come up whenever anyone puts together a list like this!), Bellos somewhat uncharitably (but perhaps not unfairly) states that the attempt to discern what all grammars share 'has got about as far as the search for the Holy Grail'. He then builds another straw man which he proceeds to rip apart: 'all grammars regulate the ways in which free items may be combined to make an acceptable sentence' (p344). The obvious problem is the word 'sentence' here – what does it mean? Nothing (either inside or outside a theory of grammar, as far as I'm aware), and of course we don't speak in sentences, as Bellos points out.

Nor is it really a problem that 'no living language has yet been given a grammar that accounts for absolutely all of the expressions' (p344). Even if this goal were a reasonable one for linguistic theory (Chomsky 1986 argues that it isn't), and even if a living language like 'English' were a coherent object of study (see virtually any work by Chomsky for a brief but irrefutable demonstration that it isn't), does this stymie any attempt? In physics, our best theories of reality can't account for phenomena like dark matter; all this shows is that science (any science) is a work in progress. So it's an absolute nonsense to claim, as Bellos does (p344), that:
Flaws of this magnitude in aerodynamics or the theory of probability would not have allowed the Wright Brothers to get off the ground or the National Lottery to finance the arts.
First off, there's no reason that our scientific theory has to be practically applicable in order to be worth something (look at string theory, for instance). But, in any case, Bellos should look at state-of-the-art work in computational linguistics, where parsers based on handwritten grammars in combination with a simple statistical learning algorithm can robustly parse up to 92.4% of an average corpus of English (see Fossum & Knight 2009). That doesn't seem like crash-and-burn to me.

The afterbabble goes on to compare dialectal variation to primate grooming, and to propose this as a potential evolutionary origin for language, following work by Robin Dunbar. I won't discuss this in any detail, but suffice it to say that the conclusion – that 'The most likely original use of human speech was to be different, not the same' (p351) – presupposes precisely what has been argued so vigorously against earlier in the same chapter, namely a definable object that is 'human speech' (which, since animals fairly intuitively don't have it, must have evolved somehow).

In short, this chapter (and the book as a whole) overreaches itself. Though issues of translation are inevitably bound up with deep questions about the nature of language, ITaFiyE would have been a better book if it had chosen to stick closely to the former and leave the latter to specialists.

References

Berwick, Robert C., & Noam Chomsky. 2011. The biolinguistic program: the current state of its evolution and development. In Ana Maria di Sciullo & Cedric Boeckx (eds.), The biolinguistic enterprise: new perspectives on the evolution and nature of the human language faculty, 19–41. Oxford: Oxford University Press.
Boroditsky, Lera. 2010. Lost in translation. Wall Street Journal, 23 July.
Campbell, Lyle, & William J. Poser. 2008. Language classification: history and method. Cambridge: Cambridge University Press. 
Chomsky, Noam. 1986. Knowledge of language. New York: Praeger.
Enfield, Nicholas J. 2010. Without social context? Science 329, 1600–1601.
Fossum, Victoria, & Kevin Knight. 2009. Combining constituent parsers. Proceedings of NAACL HLT 2009: Short Papers, 253–256.
Greenberg, Joseph H. 1963. Some universals of grammar with particular reference to the order of meaningful elements. In Joseph H. Greenberg (ed.), Universals of language, 73–113. Cambridge, MA: MIT Press.
Hockett, Charles F. 1960. The origin of speech. Scientific American 203, 89–97.
Hurford, James R. 1990. Nativist and functional explanations in language acquisition. In I. M. Roca (ed.), Logical issues in language acquisition, 85–136. Dordrecht: Foris.
Kisilevsky, Barbara, Sylvia Hains, Kang Lee, Xing Xie, Hefeng Huang, Hai Hui Ye, Ke Zhang, & Zengping Wang. 2003. Effects of experience on fetal voice recognition. Psychological Science 14, 220–224.
Lenneberg, Eric. 1967. Biological foundations of language. New York: John Wiley & Sons.
Meisel, Jürgen M. 2011. Bilingual acquisition and theories of diachronic change: bilingualism as cause and effect of grammatical change. Bilingualism: Language and Cognition 14, 121–145.
Trudgill, Peter. 2011. Sociolinguistic typology: social determinants of linguistic complexity. Oxford: Oxford University Press.

Friday, December 02, 2011

Schokoladenfreude

So, I arrived in Berlin yesterday and made my way immediately to a rather special place: the Ritter Sport Bunte Schokowelt ('colourful choco-world'). What an amazing place. It's not very big, but there's a lovely exhibition where you can learn all about the history of Ritter Sport and how they are made. In the process, I finally found out what the 'Sport' is all about. Apparently, in 1932 chocolate bars were too long to fit into the recalcitrantly quadratic jacket pockets of football fans, so Clara Ritter suggested to her husband Alfred that they do something about this, and the Ritter Sport was born. (And in case you're inclined to dismiss this as just the kind of nonsense that I routinely make up, here's a link to prove it).

The best thing about the Schokowelt, however, is that you can CREATE YOUR OWN RITTER SPORT. And so I did!

Cherry & Mini Smarties: 10/10
Let's be honest: I would have given this one full marks even if it had tasted terrible. But even so I felt that this combination was inspired. The sour-sweet cherry pieces, a little chewy, contrast perfectly with the zippy crunch of the mini Smarties, all, of course, surrounded by creamy milk chocolate. Mmm. (There's also the option of white chocolate or half-dark chocolate.) Fortunately I had the prescience to order two of these, and will take the other one back to the UK with me - definitely something to look forward to. Ritter, you are a wonderful, wonderful company for giving me this opportunity. I won't forget this!

There was other awesome stuff, too. A polo-shirt, for instance, which I almost bought. Also more varieties than I knew existed: Olympia, chocolate mousse, all sorts of Bio varieties, and more. It was a toss-up between buying all of them and thinking of my health; in the end, I opted for only one, in addition to the variety I created myself.

Mixed Fine Nuts: 8.5/10
It's kind of difficult to distinguish between all the different types of nut (macadamia, cashew, almond) when they're encased in chocolate, but I've found that they get stuck in your teeth during the process, and then you can taste them individually. That warrants an extra half point over the excellent hazelnut varieties. That and it's the special limited-edition jubilee variety celebrating 100 years of Ritter (though not quite 80 years of Ritter Sport), so I was favourably inclined towards it.

As you can imagine, I'm a pretty happy bunny right now!

Friday, July 22, 2011

Li-fi (linguistic science fiction)? Embassytown, by China Miéville

A while back (actually a couple of decades), Geoffrey Pullum published a piece called 'Some lists of things about books'. You can find it in his essay collection The Great Eskimo Vocabulary Hoax..., if you're interested. One of the lists includes Heffers, in Cambridge, as one of 'five bookstores where you can find a really serious stock of linguistics books'. How things change. Nowadays, Heffers has one measly case of linguistics books, and it's hidden away quite cunningly, not to mention being populated mostly by Crystal, Pinker and Bryson. But I digress.

Another of the lists presented six science fiction books for linguists. Now it's never been clear to me that linguistics and fiction mix particularly well. Science fiction in particular has a bad rep with regard to linguistic accuracy. In Stargate SG1, for example, it's revealed that the reason the Ancient Egyptians spoke the language that they did was that they were visited by aliens who also spoke that language. Fine, so far, perhaps... but we're then led to assume that the Norse got their language from the grey-skinned Asgard aliens, and that the Romans learned Latin from the mysterious 'Ancients'. All these alien races are completely unrelated, of course. Enough to make a comparative linguist's brain overheat (though fortunately Asgard technology is capable of curing that). Worse still, almost all the myriad human societies they encounter on other worlds speak... English. I love SG1 dearly, but still.

Then Doctor Who and Star Trek cheat by employing a universal translator. (Oh hai computational linguists, could you build me one of those?) And even when alien races are given a foreign language to speak, assuming it isn't a code, or, worse, a cypher like Gnommish in the Artemis Fowl books (why would fairies speak an enciphered version of English?), it's usually constructed broadly as a human language, like the Na'vi language created for Avatar. (OMG ejectives!) In some ways, this is worse; it shows something of a lack of imagination, in any case. There are any number of ways that the different conceptual-intentional and sensorimotor systems of different alien life forms could be hooked up to one another. Why would they all behave like human languages?

In fairness, the works I'm criticizing really aren't science fiction in the purest sense of the term. They all fit much better into a category of 'space fantasy' or 'space adventure' that's far removed from the works of Wells, Clarke, Dick, Čapek, Lem and the real pioneers of concept science fiction. Fun though the former may be, I'm always on the lookout for a work of the latter kind that takes language seriously. China Miéville's Embassytown may be such a book.

The Ariekei, or 'Hosts', have two mouths each, and can vocalize only using both simultaneously. More interestingly, their 'language' (known as 'Language') has no deixis, and they are unable to lie (and, by extension, to use metaphor; simile is a grey area). They are also incapable of comprehending Language unless it is spoken by an entity that they recognize as having a single mind, so two ordinary people can't fake it, nor can speech synthesis.

I'm not entirely convinced that all of Miéville's setup actually makes sense, especially the dénouement. But it's really refreshing to read someone who's actually attempted to play with the boundaries of what language is all about. More to the point, the book is a very enjoyable read. And it even has a linguist as a prominent character:
"I got almost all of it," Scile told me afterwards. He was very excited. "They shift tenses," he said. "When they mentioned the negotiations they - the Ariekei, I mean - were in present discontinuous, but then they shifted into the elided past-present. That's for, uh..." I knew what it was for, I assured him. He'd told me already. How could you not smile at him? I'd listened to him with affection, if not always with interest, over hundreds of hours. "Does it ever occur to you that this language is impossible, Avice?" he said. "Im, poss, ih, bul. They don't have polysemy. Words don't signify: they are their referents. How can they be sentient and not have symbolic language? ...
So yeah, go away and read this book. It may not be as page-turningly thrilling as Miéville's other work The City & The City, but it makes up for that with soaring imagination. And to my non-linguist friends: if I ever start going on like Scile, slap me upside the head and remind me of this.