HISRECO 2017: no decreasing returns, yet.

The 11th History of Recent Economics conference (HISRECO) took place at the University of Lucerne on April 21-22, 2017. As a co-organizer of this conference, alongside my dear friends Pedro Duarte and Verena Halsmayer, I am not well placed to express an opinion on it. Let’s just say that we haven’t entered the period of decreasing returns yet. We had a very nice roster that included historians of economics, historians, and STS scholars, and my impression from last year that the distance between those communities is narrowing has not been proven wrong. This is not to say that all the papers presented were perfect: they need not be, anyway. But the free-form discussions we had were as enthralling as ever. A quick summary follows.

Harro Maas (University of Lausanne) wrote on forecasting in the Netherlands, from the early postwar years of the Centraal Planbureau (CPB) to the aftermath of the 2008 crisis. He contrasted the practices of scientific modeling with the idiosyncratic practices of the quacks. The latter were rehabilitated as the only forecasters who managed to predict the Great Recession.

Marion Ronca (University of Zurich) talked about the Swiss economist Eugen Böhler and his influence on his country’s economic policies. Converted to Keynesianism after fighting against it, Böhler was nonetheless an intellectual who did not fit into the mainstream economics of his time. At the end of his career, he used Carl Jung’s concept of mythos to criticize the discipline.

Laetitia Lenel (Humboldt University of Berlin) studied the first years of the NBER, showing that not only did the methodologies used there differ from those adopted at the Cowles Commission, but so did the views of the role of policy. While Koopmans and his allies aimed at advising the government, Mitchell and Burns were more interested in collecting facts and educating the public at large.

Sarvnaz Lotfi (Virginia Tech) provided an account of Research and Development (R&D) in the postwar period. Her project is to contrast the view of R&D as the main explanation of macroeconomic growth (following Solow’s residual) with its practical value as assessed in accounting, management, and law. Ultimately, there is more disparity than consensus in the way scholars and policymakers envisage the value of R&D to a nation.

Roger Backhouse (University of Birmingham) attempted to assess MIT economist Paul Samuelson’s role in influencing the economic policies of John Kennedy. Samuelson did not participate directly in policy advising, choosing instead to reflect on policy through his textbook and interventions in the press. This illustrates his cautious, even ambiguous, stance towards politics.

Cleo Chassonnery-Zaigouche (University of Lausanne) provided an alternative account of the role of economists in the courtroom, focusing more specifically on James Gwartney’s expertise on racial and gender discrimination in the labor market. The way truth is assessed in court differs from the way it is assessed in an academic setting, affecting the view of economics as a science in the process.

Francesco Sergi (University of Paris-Sorbonne) studied the standard, internalist history of recent macroeconomics contained in the manuals used in central banks. He argues that these narratives, which aim at standardizing practices, also tend to “decontest the contestation” existing in the field. In his view, new neoclassical macroeconomics needs to be disaggregated, and it is the duty of historians to bring more dissent to the discipline.

Steve Medema (University of Colorado at Denver), finally, wrote on the place of non-welfarism in the debates over the Coase theorem. While economists typically tried to exclude non-welfarist – i.e. social justice related – arguments in the postwar period, those arguments were ubiquitous in the pieces criticizing Coase’s idea of a market-based solution to environmental issues. Medema argues that non-welfarist arguments can be considered proxies for ideology.

*

*                      *

Craufurd D. Goodwin (1935-2017)

On a sadder note, we learned during the first day of the conference that the great historian of economics and longtime History of Political Economy editor Craufurd Goodwin had passed away. Goodwin’s vigorous efforts to promote the history of economics did not consist in faint discourses about the vitality of the field but, rather, in a constant allegiance to the highest possible academic standards. The mere possibility of a conference like HISRECO is a testament to the excellent scholarship his endeavors helped to encourage. He was one of the true giants of our discipline and will be greatly missed. Our condolences go to his wife, Nancy, and to his friends and colleagues at the Center for the History of Political Economy at Duke University.

 


2016 as a ten-letter word

Robert N. Proctor (Photo: Linda A. Cicero)

At the end of November, as has been the case every year since 2004, Oxford Dictionaries revealed their choice for the word of the year. For 2016, they settled on “post-truth”. This adjective, defined as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”, echoed a number of events of the past few months, including the UK’s vote in favor of Brexit and, most infamously, the election of Donald J. Trump. Following the announcement, a few commentators were quick to observe that “post-truth” could be considered an emanation of postmodernism, the brainchild of post-1968 French philosophy and critical theory. Whether you buy this or not – I don’t – there’s no denying that “post-truth” has been everywhere in the press and on social networks. Yet, as a historian of science with little – if any – interest in questions of “truthfulness” and “falsity”, I would like to suggest another ten-letter word that more accurately describes what has been going on over the past few months – and, admittedly, over the past few decades as well, with 2016 representing, in my opinion, some kind of turning point in its development. This word is: “agnotology”.

For those of you who are not familiar with the concept elaborated by Robert N. Proctor in books such as The Cancer Wars: How Politics Shapes What We Know and Don’t Know About Cancer (1995) and Golden Holocaust: Origins of the Cigarette Catastrophe and the Case for Abolition (2012), suffice it to say that “agnotology” is the production and dissemination of ignorance – as well as the study of this phenomenon. Proctor’s argument, in a nutshell, is that knowledge is not created out of a vacuum we would call “ignorance” but, instead, that both knowledge and ignorance are social constructs, contingent on many social, political, and individual factors. The production and dissemination of ignorance, therefore, can be studied using the tools traditionally attached to the history of science, making the distinction between science and non-science not so significant in the process. When we look at the history of how the cigarette industry intentionally spread doubt about the evidence linking smoking to cancer, we see that there were always a few scientists willing to back the industry’s claims.

At first, I was skeptical about “agnotology” because I had read Naomi Oreskes and Erik Conway’s Merchants of Doubt, which does not use the term “agnotology” but tells a relatively similar story of ignorance dissemination, and I was unsatisfied with the way they tried to demarcate between the good, disinterested scientists fighting for truth and those who were paid by big corporations to spread false information – I had expressed my dissatisfaction on the INET version of this blog. However, I do not find the same problem in Proctor’s historical narrative, which is not so much interested in questions of demarcation as in the cultural and political context in which ignorance is produced and disseminated. Accordingly, agnotology has been used in the history of economics by Phil Mirowski and Edward Nik-Khah in a way that may seem controversial at first but which I found, after some resistance, increasingly convincing. After all, agnotology deals with producing and distributing something, so economics should not be too far away when we think about it. Two French economists have recently tried to use Proctor’s work, without using the term “agnotology”, to build a case against the critics of mainstream economics, but in my opinion – which I have expressed in a forthcoming review of their book for a French STS journal – they fell short of understanding the complexity of the concept and, quite ironically, ended up generating a lot of agnotology over the current status of their discipline.

Donald Trump: the rise of agnotologic governmentality?

But so much for these issues of scientific demarcation – or lack thereof. “Agnotology” is an enlightening word to describe 2016 because it is effectively applicable to the political issues of the day. Ignorance production and dissemination is not something relevant only to scientific issues. It is, in fact, a total social fact in the Maussian sense of the term, one that ties together cultural, psychological, and political elements. With the election of Donald Trump, I even wonder whether it is possible to assert that we are entering an age of agnotologic governmentality, a way of governing that uses ignorance as a political device. In using the term governmentality, I explicitly refer to Foucault’s Birth of Biopolitics. Governmentality, in Foucault’s conception, should not be confused with “government”. A regime of governmentality is enforced not just by the State but at many different levels where knowledge and power are connected. Replace knowledge with ignorance and you have some idea of what a Foucauldian version of agnotology could look like. Since his election in November, Donald Trump has been a master at disseminating so much information, both in the press and on social networks, that it is almost impossible to assess what his presidency will yield. But of course, this did not begin with his election: it is the way he has behaved since the very beginning of his campaign in the Republican primaries. At the time, it was summed up in one fantastically short tweet.

Now that “the comments section” has become the new President of the United States, we can argue that the kind of ignorance such comments sections typify will not only lead the most powerful nation but, as a result, will preside over the ways of the world at large. But ignorance is not a “top-down” phenomenon, and that is what makes it so stealthy, yet powerful. Ignorance is cultivated at every level of society and now, through more or less trustworthy internet news coverage, it is disseminated at a higher speed. Even academics and self-proclaimed “intelligent” people such as you and I can be subjected to it. If you have been a regular user of social networks over the past few months, I defy you to tell me that you have never fallen for clickbait, believing for at least a few minutes a piece of information later revealed to be either false or (mis)guided by a non-objective source. The bombing of Aleppo, for instance, has been the subject of so much news coverage that it is impossible to be sure that everything we were told was true. While there is no doubt that some information has been manipulated by pro-Syrian and Russian media, we should not be so naive as to believe that there is no propaganda on the other side, too. Increasingly complex conflicts and social issues such as this one are not easy to grasp, and we can all be deceived. Effective propaganda knows how to exploit the capacity we all have to doubt. It is no surprise that agnotology is often related to neoliberalism. It is not so much, I think, that there is a mechanical relation between the two but, instead, that both are so squeezed into the recesses of our everyday life that they are difficult to escape, unless we turn off our computers and start leading a more reclusive life – which may not be a bad idea after all.

Anyway, I am afraid I have conveyed the impression that 2016 was a very bad year, which makes this post similar to a lot of rants you have already read elsewhere. I should apologize for my lack of originality. But there is also a more positive message: as historians of science, we may be able to apply our critical toolbox to understanding how we got here and, hopefully, how we will be able to get out of it.

On Entering the Canon: Who Wins the Nobel Memorial Prize?

Meanwhile, my department – along with some others, I suspect – created a form for predicting the future Nobel Memorial Prize winner(s) in economics. In fact, it might be a good exercise for a sociologist of the economics profession (although a somewhat narrow one).
Who will get the Nobel? Many think of Olivier Blanchard (as Web of Science does) – and indeed, Blanchard might easily be confused with those who got the Nobel long ago, leaving us puzzled that he is still not among them. It would be an interesting twist if Paul Romer got the prize (recall his recent devastating critique of macroeconomics, as well as previous ones). From the older generation, people like Martin Feldstein seem to be overlooked. All these scholars are known for their penchant for ‘macroeconomic’ issues – whatever this means today (the last ‘macroeconomic’ Prize winner was Thomas Sargent in 2011).
Anyway, the intrigue remains, since the result can often be unexpected (as was the case, say, with Elinor Ostrom, or could have been, for some, with Maurice Allais or James Buchanan). Current priorities, even though the Prize is given for work done long ago, might still play a role – thus juxtaposing the economic science of the past with the ever-growing complexity of the present one. Multiple winners are also possible and frequent, and here we might get the most striking combinations, interweaving times, schools, and sub-disciplines. One could, for example, pull together Elhanan Helpman, Avinash Dixit, and Marc Melitz as theorists of international trade; Philippe Aghion, Peter Howitt, and Robert Barro as economic growth giants (oops, no Romer and too much Harvard here); or William Easterly and Robert Townsend as classics in ‘economic development’. A more interdisciplinary – but less probable – perspective would involve Harold Demsetz and Richard Posner as ‘institutionalists’ and, of course, people like H. Peyton Young and W. Brian Arthur, or Ernst Fehr (to my mind, Fehr should at some point get the Prize by all conceivable standards), or even Samuel Bowles and Herbert Gintis (with Robert Axelrod as another disciplinary counterpart to Ostrom); many of them could also be paired with the previously nominated William Baumol or Israel Kirzner. I know much less about econometricians; perhaps those who know more could give their suggestions.
But this is still a game. What would interest me a lot (and what, as far as I know, is only partly touched upon in the new book by Avner Offer and Gabriel Söderberg – which should nonetheless be fascinating) is the internal mechanics and tensions of the selection process, the struggles behind the nominations and their evaluation, and the techniques of compiling Nobel press releases and “scientific backgrounds”. We will never learn the ‘true’ story, but we have to come closer to it, because this kind of historical/sociological inquiry might be at least as instructive for understanding the dynamics of economic knowledge as the exercise in prediction.

The Tale of Three Universalisms, or How Mainstream Economics Meets Analytical Philosophy, They Both Roll Up Sleeves and Get to Work

Some time ago, in a conversation, a colleague of mine referred to John McCumber’s book Time in the Ditch, and I saw a reference to the same book in Roy Weintraub’s recent text on McCarthyism and the mathematization of economics. This coincidence – and the fact that I had known McCumber before as an important Hegel scholar – made me look more closely at a topic I have been brooding over for quite a while. Yes, we know, after the work of Phil Mirowski and Sonja Amadae, that the Cold War mattered for American economics in pushing it closer to the ideal of a ‘hard science.’ Well, almost simultaneously, at the beginning of the 2000s, McCumber told us that American philosophy had also been made more scientific in response to Cold War challenges. My claim here does not bear directly on Weintraub’s argument (that the influence of McCarthyism on the research practices of economists has been exaggerated and is not really supported by the evidence), but rather invites us to reflect on the more general affinities between mainstream economics and analytical philosophy prompted by this historical research.

The analogies (noted to me once by Eric Schliesser and, as far as I know, never really thematized – but I would be most grateful for any references) matter for me both historically (as parallels in what was happening in – largely Anglo-American – philosophy and economics in the last century) and systematically (in teaching us what kind of knowledge mainstream economics and analytical philosophy were and, to some extent, continue to be).

Meanwhile, along with and after McCumber’s book, a number of important studies emerged – such as Steve Fuller’s portrayal of Thomas Kuhn as a Cold warrior; George Reisch’s analysis of the formalization of postwar American philosophy of science; and Joel Isaac’s detailed and fascinating story of Donald Davidson’s entanglement with Patrick Suppes and other economists and decision theorists. Philosophy, along with the behavioral sciences, economics, and operations research, has thus taken its place in the thick historical narratives documenting the shift to a more applied (but not pragmatist!) and, at the same time, more formalized, rule-based, algorithmically oriented kind of knowledge. This shift involved, among many other things, a move away from the humanities (broadly conceived) in search of a transparent and universally comparable knowledge regime following a ‘tool shock’ (Isaac, again).

Now, is there any sense in juxtaposing analytical philosophy and mainstream economics? Apart from some obvious thematic overlaps – such as, in the case of Davidson and Suppes, value and action theory – there are general aspects worth thinking about. I would tentatively call them three universalisms, and I abstain, on purpose, from any strict separation between internalist and externalist perspectives. Of course one can find a lot of counterexamples, but what I sketch here are just general tendencies, to be beaten only by equally general and more plausible ones. This is not a comprehensive history; it is a perspective that might help illuminate the history and sociology of the economics and philosophy professions.

The first kind of universalism is fairly obvious: both disciplines value universal knowledge, clearly preferring generalizations over historical situatedness, abstraction over entanglement in cultural contexts, and formalized, reproducible truths supported by hard data over the relativisms of interpretation. One could elaborate, but I’ll just leave it here.

The second universalism is mostly rhetorical, and could perhaps be found in other disciplines, too. It consists in colonizing the vocabulary and continuously reproducing the pars pro toto trope – a certain type of economics suddenly becoming the whole of economics, a certain type of philosophy suddenly representing the whole of philosophy (even Isaac could not avoid this) – accompanied by extremely tough and protective boundary work (see, e.g., Tiago’s important paper on that).

The third universalism consists in the democratic and inclusive nature of both communities. We know this can be a spurious effect, we know that status and prestige play a role everywhere in academia, and still, in contrast with heterodoxy and continental philosophy, which are very much centered around ‘big’ figures, dead or alive, we cannot ignore the salience of the collective and collaborative nature of the profession on the other side (just think of the increasing population of Daron Acemoglu’s co-authors). In mainstream economics and analytical philosophy, the thinkers and poetic geniuses make way for ‘humble, competent people, on a level with dentists’; most problems are technical, and the solutions are near at hand. But this makes any exception, any unusual constellation, any identity shift all the more interesting, both sociologically and in terms of intellectual history.

 

DARPA, the NSF and the social benefits of economics: a comment on Cowen and Tabarrok

Tyler Cowen and Alex Tabarrok have a new piece in which they ironically note that economists are surprisingly shy when it comes to applying their own tools to evaluate the efficiency of the NSF economics grant program. Their target is a companion JEP article in which Robert Moffitt defends the policy relevance of economics against the latest round of attacks on NSF’s social science budget, not least Senator Coburn’s 2011 list of wasteful federal spending. Cowen and Tabarrok criticize Moffitt’s use of the poster child for the relevance of economics, Paul Milgrom’s research, which, legend says, has brought $60 billion to the US government through rounds of FCC spectrum auctions. This is a typical case of a crowding-out effect, they argue, since private firms like Comcast, which also saved $1.2 billion in the process, had huge incentives to fund that research program anyway. Likewise, they note, the quest to raise revenue in sponsored search auctions led Google engineers to rediscover the Vickrey-Clarke-Groves mechanism. NSF funding should thus be shifted to programs with high social benefits, for instance setting up a replication journal, supporting experimental projects with high fixed costs or – here they agree with Moffitt – deploying large publicly available datasets such as the Panel Study of Income Dynamics, which was specifically targeted by Coburn.

NSF grants are also biased toward research programs with a high probability of success, already well established and published in top journals, they add. Such “normal science” is hardly groundbreaking. The NSF should rather emulate DARPA and fund “high risk, high gain, and far out basic research” (which could include heterodox economics). They also suggest shifting from funding grants ex ante to giving prizes ex post (DARPA’s practice), because this creates competition between ideas. If a heterodox model provided better predictions than mainstream ones, an NSF prize would signal its superiority.

The paper is challenging and, as always with these authors, unrivalled in its clever application of economic tools. My problem is with:

1. their romanticized picture of DARPA

2. their lack of discussion of how the public “benefits” of economic research should be defined

1. Should the NSF emulate DARPA?

Cowen and Tabarrok’s suggestions are predicated on the impressive record of the Defense Advanced Research Projects Agency in promoting groundbreaking research, from hypersonic planes, driverless cars, and GPS to ARPANET and onion routing. This success is usually attributed to the philosophy and structure of the secretive defense institution, founded in 1958 to “prevent technological surprise like the launch of Sputnik” and later to “create technological surprise for US enemies.” Michael Belfiore, for instance, has described a “department of mad scientists” who profess to support “high risk, high gain, and far out basic research.” This is achieved through a flexible structure in which bureaucracy and red tape are avoided and the practical objectives of each project are carefully monitored. I have only skimmed Annie Jacobsen’s recent “uncensored” history of DARPA, but so far it does not seem to differ much from Belfiore’s idyllic picture. Yet digging into the history of specific projects yields a different picture. In his account of DARPA’s failed Strategic Computing program, Alex Roland explains that while the high-risk, high-gain tradition served as the default management scheme, the machine intelligence project was supervised by no fewer than eight different managers with diverging agendas. Robert Kahn, the project’s godfather, believed that research was contingent and unpredictable and wanted to navigate by technology push, whereas his colleague Robert Cooper insisted on demand pull and imagined an AI development plan oriented toward innovations he could sell. Some viewed expert systems as crucial while others dismissed them, which changed what applications could be drawn from the program.

Roland’s research exemplifies the difficulty DARPA’s officials had in agreeing on a representation of the scientific, technological, and innovative process that would yield maximum benefits. And benefits were to be evaluated in terms of defense strategy, which, as the history of Cold War science has shown, was far easier than evaluating the benefits of social programs. From cost-benefit analysis to GDP, hedonic prices, contingent valuation, VSL, and public economics, the expertise economists have developed is precisely about defining, quantifying, and evaluating “benefits.” But the historical record also shows that each of these quantifications has been fraught with controversy, and that when it comes to defining the social benefits of their science as a whole, economists are not even struggling with quantification yet. For the last 70 years, they have been stuck negotiating with their patrons a definition of “social,” “public,” or “policy” benefits consistent with the specific kind of knowledge they produce.

2. Fighting over “policy benefits”

Moffitt’s article is only the latest instantiation of a series of attempts to reconcile economists’ peculiar culture of mathematical modeling with external pressures to produce useful research, and their quest for independence with their need for relevance. This has required a redefinition of the terms “pure,” “applied,” “theoretical,” and “basic,” and Moffitt’s prose perfectly illustrates the difficulty and ambiguity of the endeavor:

The NSF Economics program provides support to basic research, although that term differs from its common usage in economics. Economists distinguish between “pure theory” and “applied theory,” between “pure econometrics” and “applied econometrics,” and between “microeconomic theory” and “applied microeconomics,” for example. But all these fields are basic in the sense used in government research funding, for even applied research in economics often does not concern specific programs (think of the vast literature on estimating the rate of return to education, for example, or the estimation of wage elasticities of labor supply). Nevertheless, much of the “basic” research funded by NSF has indeed concerned policy issues, which is not surprising since so much of the research in the discipline in general is policy-oriented and has become more so over time. Although most of that research has been empirical, there have been significant theoretical developments in policy areas like optimal taxation, market structure and antitrust, and school choice designs, to name only three.

For Moffitt, in other words, the nub of the funding struggle is that both theoretical and applied economics are considered “basic” by funding agencies because they are only indirectly relevant to specific policy programs. Trying to convince patrons to fund “basic” or “theoretical” research was an issue even before the NSF opened its social science division in 1960. At that time, economics’ major patron was the Ford Foundation, whose representatives insisted on funding policy-relevant research. Mathematically oriented economists like Jacob Marschak, Tjalling Koopmans, and Herbert Simon had a hard time convincing Thomas Carroll, head of the behavioral science division, that their mathematical models were relevant.

NSF’s economics funding remained minimal throughout the 1960s, climbing substantially only after the establishment of the Research Applied to National Needs (RANN) office in the early 1970s. Tiago Mata and Tom Scheiding explain that RANN funded research on social indicators and on data and evaluation methods for welfare programs. It closed in 1977, however, after Herbert Simon issued a report emphasizing that the applied research funded was “highly variable in quality and, on the average, not impressive.” The NSF continued to fund research in econometric forecasting, game theory, experimentation, and the development of longitudinal data sets, but in 1981 Reagan made plans to slash the NSF’s social science budget by 75%, forcing economists to spell out the social benefits of their work more clearly. Lobbying was intense and difficult. Kyu Sang Lee relates how the market organization working group, led by Stanley Reiter, singled out a recent experiment involving the Walker mechanism for allocating a public good as the most promising example of policy-relevant economic research. Lawrence Klein, Kenneth Arrow, and Zvi Griliches were asked to testify before the House of Representatives. The first highlighted the benefits of his macroeconometric models for the information industry, the second explained that economic tools were badly needed at a time when rising inflation and decreasing productivity needed remedy, and the third explained that

…the motivation for such selective cuts [could only be due to] vindictiveness, ignorance and arrogance: Vindictiveness, because many of the more extreme new economic proposals have found little support among established scientists. Because they have not flocked to support them, they are perceived as being captives of liberal left-wing ideologues; Ignorance, because this is just not so. It is ironic and sad that whoever came up with these cuts does not even recognize that most of the recent ‘‘conservative’’ ideas in economics – the importance of ‘‘rational expectations’’ and the impotency of conventional macro-economic policy, the disincentive effects of various income-support programs, the magnitude of the regulatory burden, and the arguments for deregulation – all originated in, or were provided with quantitative backing by NSF supported studies. And arrogance, in the sense that those suggesting these cuts do not seem to want to know what good economic policy can or should be. They do not need more research, they know the answers.

Sounds familiar? The ubiquity of Al Roth’s research on kidney matching in economists’ media statements, the proliferation of books such as Better Living Through Economics, Angus Deaton’s 2011 Letter from America, the 53 proposals by economists to rethink NSF’s future funding, and Moffitt’s article can all be interpreted as attempts to redefine the relationship between basic, applied, and policy-relevant research and to provide a framework to assess the public benefits of economic research. They all exhibit tensions between publicizing benefits and maintaining objectivity, independence, and prestige. Reconciling these contradictory goals has underwritten more than a century of terminological chicanery. In 1883, physicist Henry Rowland delivered a “Plea for Pure Science” in an attempt to divorce his research from the corrupting influence of money and materialism on “applied” physics. In an effort to promote both scientists’ autonomy and their ability to foster profitable industries and strategic military applications, Vannevar Bush introduced the term “basic science” in his 1945 Endless Frontier report. And this is how Moffitt ended up straddling pure, applied, basic, practical, theoretical, and empirical science. Economists might nevertheless be able to cut through these debates over the “policy benefits” of their science by turning them into a battle of indicators, as they successfully did with the concepts of growth and inequality.

Bonus question that no paper on NSF economics funding addresses: how has the NBER succeeded in monitoring 15% of NSF economics grants, and what are the consequences for the shape of economic research?

CFP – HISRECO 2017 in Lucerne

History of Recent Economics Conference, University of Lucerne – April 21-22, 2017

The eleventh History of Recent Economics Conference (HISRECO) will be held at the University of Lucerne on April 21-22, 2017. Since 2007 HISRECO has brought together researchers from various backgrounds to study the history of economics in the postwar period. It is the organizers’ belief that this period, during which economics became one of the dominant discourses in contemporary society, is worth studying for its own sake. The increasing availability of archival materials, along with the development of new perspectives inherited from the larger history and sociology of knowledge, has helped to provide insightful histories of the development of recent economic practices, ideas, and techniques. In particular, this area of research offers good opportunities to young scholars who are interested in interdisciplinary approaches to the history of economics.

We invite researchers in all related fields to submit a paper proposal of no more than 500 words. Even though the organizers are open to a wide range of approaches to the history of economics, paper proposals that address the interface between this field and the history and sociology of science, or cultural and science studies, will be particularly appreciated. Proposals should be sent electronically (as a pdf file) to Verena Halsmayer (verena [DOT] halsmayer [AT] unilu [DOT] ch) by October 14, 2016. Successful applicants will be informed by November 15, 2016.

Thanks to financial support from the University of Lucerne, FIPE (The Institute of Economic Research Foundation, Brazil), the European Scientific Coordination Network (GDRI, CNRS), and the KWI (Kulturwissenschaftliches Institut) Luzern, HISRECO has limited funds to partially cover travel and accommodation for up to four young scholars (PhD students or researchers who have obtained their PhD within the past two years, from July 2014 to October 2016). Young scholars should include in their proposal their current affiliation and, if applicable, the university and year of their PhD. Those needing more information about funding are welcome to approach the organizers.

For those who want to know more about HISRECO, a list of past conferences and contributors can be found at http://www.hisreco.org.

The organizers, Verena Halsmayer (University of Lucerne), Pedro Duarte (University of São Paulo), Yann Giraud (University of Cergy-Pontoise), and Joel Isaac (University of Cambridge).

How to soundtrack your perfect failed grant application?

Well, call me. I am a specialist.

There are many ways to do it. Generally your first reaction is a bit extreme, and no one does extreme like Converge.

Then, self-pity comes next. I would advise you to season it with a bit of British wit.

(Of course, if you know some pal who succeeded, you can still hum this, too!)

Then, you must find out who’s responsible for your failure, and that’s when you try to guess who your referees are.

But the real answer is much too obvious.

Of course, you were not alone on this grant application, so you should let your partners know about the outcome before they ask you.

So there are two possibilities: either you decide to pull yourself together, because, well, you’re a scientist, right?

Or you just let it go and move forward.