There ain’t no such thing as a free journal (or lunch)?

The History of Economics Review is the journal of the History of Economic Thought Society of Australia (HETSA). The journal started in 1981, first as a news bulletin of the society; within half a decade it was also publishing original research. As of this Thursday the journal is published by Taylor & Francis. The move brings a new editorial team and a desire to make the journal an obligatory read for the global history of economics community, as the editors pledge in their brief opening statement. (My humble suggestion to them is to encourage a review of the gender balance of their editorial board.)

To commemorate the occasion, the first T&F issue is open access, and can be accessed here, forever and ever… free, gratis, no money… for real!

You will find original papers on A.W.H. Phillips by Selwyn Cornish and Alex Millmow, on T. R. Malthus by John Pullen, and on John Rae by David Reiss. You can read a polemic pitting James Forder against Thomas E. Hall and William R. Hart, and in the book review section you will find a critique of Playground writer Floris Heukelom’s book Behavioral Economics: A History, which nearly deserves a polemic of its own.

(For back issues contact HETSA; a membership to the Society gives you unlimited access to the trove, which is interesting both for original research and as documents of the history of our community.)

Pop Archives

I was amused by two projects by Shaun Usher: to “gather and sort fascinating letters, postcards, telegrams, faxes, and memos” in his blog Letters of Note, and to present interesting letterheads in his Letterheady blog.

In the former one can see images and the transcript of a scathing letter from John Lennon to Paul and Linda McCartney in the early 1970s, and letters from other pop figures such as Mark Twain, Yoko Ono, Thom Yorke (Radiohead), Edgar Allan Poe, and Caltech chemist Erick Carreira writing to his post-doc, among many others. In the latter blog, one finds letterheads from people and companies like Paul Simon, Elizabeth Taylor, Ozzy Osbourne, Marvel, and Capitol Records, among many other beautiful ones.

It is interesting that Usher, despite having “a seemingly endless supply of correspondence to plough through,” invites cyberfellows to contribute their own images. But he warns them: “If you already know it’s fake, don’t send it.”

Just fun!

Halls of fame

At about the same time economists were publishing the AER’s hall of fame, the team that brought us the Google n-gram viewer published their own version of a hit parade: an all-time, all-discipline ranking of scientists. Success is counted in milli-Darwins (mD) and measures the frequency of citation in a corpus representing 4% of all the books ever printed:

http://www.sciencemag.org/site/feature/misc/webfeat/gonzoscientist/episode14/index.xhtml

In the first 200 names, where are the economists?

John von Neumann (137 mD) and Harold Hotelling (27 mD), if you want. But that’s cheating. As far as I could see, the only economist by training cited in the first 200 is Herman Daly (48 mD). Certainly a surprise to me (not a bad one), and food for thought about the cultural imprint of economics!

(btw: is it a coincidence that Hotelling, like Herman Daly, contributed to the study of the finiteness of natural resources?)

Tweeting and digital humanities

I have had a twitter account for a long time. After faint-hearted attempts at tweeting (“Going for lunch”, “Really appreciated my week-end in Paris”), I just gave up. What is this service for?

Developing an interest in digital humanities changed my opinion. It is not quite a field yet: I am not aware of established journals devoted to digital humanities, or of international societies with annual meetings. But it is certainly a community of interest. The trouble for this community is that its members come from widely different backgrounds: history, demography, philology, but also machine learning and software development, to mention just a few. How do they get to know and learn from each other?

Twitter happens to be a very convenient space for this purpose. It is commonly used by computer scientists, who tweet furiously about their ideas, results, and the events they organize. And some social scientists have started participating in the discussion. They are very few for now, but the principle of twitter is that each “tweet” can quote links and keywords which can then be followed (and re-tweeted, etc.), so that their voices are amplified, and in the end one gets quite a broad view of the social sciences in the digital humanities.

For instance, what triggered the writing of this post was reading a fascinating blog post by a Princeton scholar on the design of databases for historians, which I discovered by following a link on twitter (http://sappingattention.blogspot.com/2011/03/what-historians-dont-know-about.html). The point I want to make is that, instead of following the work of this Princeton guy in particular (even if in this case that might be a good idea!), it might actually be a better idea to use twitter and take advantage of its “echo chamber” effect, which will bring you a view of his work whenever it gets referred to in links, and a much vaster view of the digital humanities in general by simply tracking a few keywords and individuals.

If you are tempted, here are a few of my favorites to follow on twitter:

#digitalhumanities
#nltk (for textual analysis; see the sketch after this list)
#sna (for social network analysis)
@jonathanstray (a professional journalist and computer scientist, now at AP, the news agency)
@wmijnhardt (an exec at my univ, tweets a lot about science management)
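
To give a flavor of what the #nltk chatter is about, here is a minimal sketch of textual analysis with the NLTK library: tokenizing a text and counting word frequencies. The sample sentence is invented for the example; this is an illustration, not anyone’s canonical workflow.

```python
# Minimal sketch: tokenize a text and count word frequencies with NLTK.
# The sample sentence is invented for the example.
import nltk

nltk.download("punkt", quiet=True)  # one-off download of tokenizer models

text = ("The history of economic thought is sometimes called the "
        "history of economics, and sometimes it is not.")
tokens = nltk.word_tokenize(text.lower())
words = [t for t in tokens if t.isalpha()]  # drop punctuation tokens
freq = nltk.FreqDist(words)
print(freq.most_common(3))  # e.g. [('the', 2), ('history', 2), ('of', 2)]
```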

One who does not tweet, but gets cited a lot in the chatter: Elijah Meeks from Stanford – another fine scholar in the digital humanities.

Happy tweets!

@seinecle

[EDIT: again following a link on twitter, I found this contribution by Anthony Grafton, worrying about the conspicuous absence of historians in one large Harvard history project in digital humanities (“culturomics”): http://www.historians.org/Perspectives/issues/2011/1103/1103pre1.cfm

Time for historians to make a move!]

N-graming

Mid-December, Google gave the nerds of the world an early Xmas gift: the N-gram viewer, a visualization tool to plot the frequency of words (and word strings, “n-grams,” of up to 5 words) in its Google Books corpus. There is a Science article to go with it, heralding a new field of “culturomics” (apparently a Harvard University object). Looking beyond the Steven Pinker-enabled hype (better methods exist to probe corpora for meaningful subtexts and cultural themes), the N-gram viewer is just fun.

There are plenty of clever queries out there, but the ones I liked most I found at Datavisualization. Closer to our interests are the queries of the Economic History Blog. My contribution is a bit poor in imagination, but here goes.

Occasionally, the label of our community becomes a subject of debate. What best represents us: history of economics (blue) or history of economic thought (green)? According to the n-gram viewer, the latter gets the most use. At least until 2000 there is not much movement between one and the other: their frequencies move in parallel. I am sorry to report that our subject peaked in the mid-1990s (in books at least). [Update, 8th Feb. 2011: I included two graphs, with caps and without, thanks Andrej!]

The triad of Masters programs in economics is Micro/Macro/Econometrics, but how do these fare in mentions? I was surprised that econometrics was ahead for most of the period, with macro coming out on top only in the 1980s.

How about subjects in the work of economists (and everyone else)? Growth is the word that explodes into consciousness, particularly post-1945, as Wealth slowly declines.

The classic: supply or demand? Demand, of course!

Finally, a query that is not much history of economics but is important. How has the lay public been referred to: as citizens and taxpayers (political), or as investors, consumers, and producers (economic)?
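
For the curious, such queries can also be run programmatically rather than through the web interface. Here is a minimal sketch in Python, assuming the (unofficial and undocumented) JSON endpoint that sits behind the viewer; the URL, its parameters, and the numeric corpus id are assumptions and may change without notice.

```python
# Minimal sketch: fetch yearly n-gram frequencies for a few phrases.
# The JSON endpoint below is unofficial and undocumented (inferred from
# the web interface), so treat this as a sketch, not a supported API.
import json
import urllib.parse
import urllib.request


def ngram_frequencies(phrases, year_start=1900, year_end=2008):
    """Return {phrase: [relative frequency per year]}."""
    params = urllib.parse.urlencode({
        "content": ",".join(phrases),
        "year_start": year_start,
        "year_end": year_end,
        "corpus": 15,    # numeric corpus id (assumed; see the web UI)
        "smoothing": 0,  # raw yearly values, no moving average
    })
    url = "https://books.google.com/ngrams/json?" + params
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req) as response:
        data = json.loads(response.read().decode("utf-8"))
    return {series["ngram"]: series["timeseries"] for series in data}


freqs = ngram_frequencies(["history of economics",
                           "history of economic thought"])
for phrase, series in freqs.items():
    peak = 1900 + series.index(max(series))
    print(f"{phrase!r} peaks in {peak}")
```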

Zombies

The economic crisis has a new trope. Zombies. An Australian economist, professor, and blogger, John Quiggin, has published a book titled Zombie Economics: How Dead Ideas Still Walk Among Us. And Paul Krugman wrote a recent op-ed, “When Zombies Win.” (It is not the first time he has played with the term.) The message that Quiggin and Krugman express is that some ideas rise from oblivion, just won’t go away, won’t die, i.e. zombies.

I believe I know a thing or two about zombies. I watched, at a very impressionable age, The Night of the Living Dead, credited with having originated the concept (zombies are the most modern of monsters), although, not being a horror buff, I always preferred the zombie comedy: Army of Darkness, Zombieland, and that classic Shaun of the Dead. I am now an adept of zombie videogames, often playing Left 4 Dead 2 with my main bro, and looking forward to playing Call of Duty: Black Ops, where I will choose between JF Kennedy, Nixon, McNamara, and Castro and fight for survival against zombies in an underground complex (really! no kidding!).

These are my extensive zombie credentials, and with them I feel confident in saying a thing or two about the semiotics of zombiehood.

Survival. The first and last element of all zombie tales is survival. The financial crisis is dire, but it hardly seems the stuff of life-and-death struggle, chainsaw in hand. In this, the analogy presses urgency, but not action. Survival in a zombie world means escaping, keeping out of sight, laying low, and waiting for someone with big guns to come clean up the place. This is not Krugman’s approach: he wants us to go out there and fix the economy…

Sadism. In most of its comedic and particularly in its videogame versions, the real pleasure of zombiedom is sadism, and indulgence in its exploration. Zombies look like people, but their status as infected or cursed allows you to dispense with them with extreme prejudice. The human body is dehumanized, and somehow it’s ok. Here is what worries me most about the zombie analogy and the crisis: it invites some level of dehumanization of your opponents. These zombie ideas are also zombie people, and it is ok to terminate them with righteous violence. I don’t predict physical extermination, but an unhesitating deletion of the other from public discourse is not implausible.

Closure. Along with sadism, the underlying theme of a zombie story is that of a loved one (or a peer) that cannot rest, cannot go in peace, and that is possessed by a hunger for your flesh. Tough call. Zombie narratives are about letting go, about forgetting. How do you kill a zombie? You destroy his brain, his memory, his mind, his idea. This is a hot subject in science studies right now: the construction of ignorance, of forgetting. Take the work of Naomi Oreskes on climate change and her Merchants of Doubt: like the tobacco companies many decades ago trying to deny the links between smoking and cancer, the climate change deniers of today are attempting to turn back the clock on knowledge. Krugman and Quiggin might be claiming that pseudo-science is holding us back from knowledge, but as a historian I worry about their appeal for forgetting and closure. I sympathize with the zombies. Sure, if they bite you, you will get very ill, but I plan to keep my distance.

(This post is all the more appropriate since AMC is doing a marathon of the first season of The Walking Dead.)

The End of Relativism?

The beginning of a new year is always an occasion to reflect on the recent past, as the posts of my fellow bloggers Benjamin, Clément and Béatrice [to whom the opposite Calvin & Hobbes comic strip is dedicated] have shown. Though their interrogations mainly concern the purposes and practices of historians, I would like to add another one, which may be a bit more ‘philosophical’ – pardon the grand word! What has struck me during the year is the slow decline of what some thinkers call relativism.

Relativism (not to be confused with moral relativism), as I have argued here and there, is not the idea that everything is equal or that there is nothing demarcating the good from the bad, the true from the false. Instead, it is the observation that what we call truth or scientific facts or fair decisions is affected by the context in which we are located, and that they can be appraised differently in different communities or cultures. It is not surprising that relativism – a term sometimes used pejoratively by its detractors – has been associated with literary theorists such as Stanley Fish, because rhetoric is where it is used most conspicuously. My literary style will greatly change depending on the people I am addressing and, as a result, so will the meaning of what I am saying. For instance, while writing a scientific paper, I can call some previous contribution ‘misleading’ or ‘unfortunate’, while in front of researcher friends I will call it a ‘piece of crap’, and back at home, in a sign of deep fatigue and irritation, I will paraphrase Lennon and call it ‘the shittiest pile of shit ever’. Talking about Samuelson in private correspondence, Stigler wrote to Friedman: “It may merely be prejudice, but I’m inclined to write him off as an economist” [in Hammond, Making Chicago Price Theory: Friedman-Stigler Correspondence, 1945-1957, p. 97]. This is certainly not something he would have used – in spite of his renowned acerbic wit – in publication, and though Samuelson may have been conscious of such animosity, he certainly did not take it into account when he called Friedman “an able scholar” and “an old friend” [Samuelson, Economics From the Heart, p. xi]. There is nothing abnormal in this. Whatever our opinions are, we have different ways of communicating them to our interlocutors – from our closest friends to the scientific community and the public at large.

This, however, was seriously threatened in 2010, and I will mention only two events that struck me in this respect: the first is that a few people were legally fired from their jobs after talking badly about their supervisors on Facebook; the other is the whole Wikileaks affair. In the former, it is quite striking that people who wrote a few negative words about their work environment on their walls – like calling their boss an idiot, or their job crappy – were found guilty of serious professional misconduct, while we know that every day people spend much of their time at the workplace, near the coffee machine for instance, unfearfully disparaging other colleagues and immediate superiors. Why is something that is considered normal in the workplace suddenly demonized when it is done outside of it? The Wikileaks affair is quite similar, as it simply shows that when diplomats talk among themselves, they do not adopt the same discourse that they use publicly. Is there anything shocking about that? I don’t believe so. You may have to deal in a friendly manner with that head of state you believe is an arrogant and disagreeable human being, especially if world peace is threatened. Similarly, you can perfectly well envision with some allied country the use of military force against a country with which you are simultaneously conducting amiable negotiations – just in case these do not work, as Clausewitz believed. The fact that these seemingly inconsistent behaviors are suddenly judged negatively by law courts and public opinion at large will make people adopt the same discourse whomever they talk to. Whether we are blogging, writing academic papers, or chatting on our Facebook walls, should we adopt the same writing style? Some people obviously believe we should, and the huge informational database being constituted on the internet seems to put some pressure upon us to do so as well.

How much will our practices as historians [of economics] be affected by this? I believe History as we construct it is built upon the idea that things – ideas, objects, etc. – evolve and differ across different periods of time and among different communities. If they do not, there is simply no story to be told. The denial of relativism is then the denial of historicity. Happy new year!