“Romantic Planet: Science and Literature within the Anthropocene”

My Literature Compass article went live last week, and I’ve been waiting to post about it. The piece originally set out to locate my book (The Age of Analogy) within the subfield of literary criticism called “Romantic science and literature.” Along the way, though, I became more interested in what the field might tell us about the Anthropocene and the new “planetary” solidarity that, some argue, is required for collective action on global warming.

I’ve been waiting to post the article while we worked on the html formatting of the Keats lines at the beginning. They’re the lines used as the epigraph for Rachel Carson’s classic Silent Spring, from “La Belle Dame Sans Merci,” and they go:

The sedge is wither'd from the lake
        And no birds sing.

Looks simple, right? Turns out, though, that formatting poetry in html is hard, in part because html makes it difficult to control line spacing, indentation, and other typographic details. The css that Literature Compass uses does have a block poetry class, but (ironically) it makes a hash of the lines (when the article was first published, the first line was broken at “the” and the second at “no,” producing “The sedge is wither’d / from the lake And / no birds sing”).

The folks at LC and the editor, Jonathan Sachs, have been working on it (and I feel kind of bad asking for the special attention), but the lines just look crazy when they’re not formatted right. Right now I’m seeing whether they can use the <pre> tag (as I have here) to preserve the formatting. I suggested the following inside the block quote:

<i class="icon icon__block icon__poem"><pre style="font-family:Arial;"><p>The sedge is wither’d from the lake,</p><p>         And no birds sing.</p></pre></i>

In the meantime, you can check the piece out in html or download the (properly formatted) pdf — it’s outside the paywall for another couple of weeks! Here’s the abstract:

This article surveys recent scholarship in Romantic science and literature, exploring what such studies may offer the recent “planetary turn” in ecocriticism and postcolonial research on the Anthropocene. Situating these studies in a longer critical history, it explores their implications for how we engage modern climate science. The “Romantic century” (1750–1850) marks both the dawn of the Anthropocene and a formative stage in its sciences and technologies, from the industrial revolution to modern theories of climate change and ecology. Because ecocritical writing (including green Romanticism and ecofeminism) and research into colonialism, empire, and global capitalism have traditionally been skeptical of Western science, their recent theorizations of “planetarity” do not adequately confront a new investment in the empirical claims of global warming. Positing that the traffic between science and social forms is asymmetric – two-way but uneven – the author argues that Romantic science and literature both furnishes a sophisticated historical epistemology for planetary studies and, in its concern for the technologies, genres, and social forms that produced the Anthropocene, an “epistemology of the climate” that may help us dig our way out.

Sampling the Novel (Dickens and Network “Connexion”)

A few weeks ago statistician Andrew Gelman posted an article that used Dickens’s social novels as an example of the perils of sampling networks (h/t to Jonathan Stray and Andrew Piper for tweeting about this). In standard statistical methodology, you can “sample” a larger diffuse or “atomistic” collection and get an accurate picture of what the larger group looks like; but when you sample a few points in a large network, those samples give a very poor picture of the larger network structure. It’s a bit like the difference between picking a handful of M&M’s out of a bag and making an inference about the total color distribution (reasonably accurate), and sampling a handful of molecules within a single M&M and making inferences about its larger shape, taste, paint pattern, etc. The former doesn’t have much structure, but the latter does — and that structure matters.

Here’s how Gelman applies this to Dickens:

In traditional survey research we have been spoiled. If you work with atomistic data structures, a small sample looks like a little bit of the population. But a small sample of a network doesn’t look like the whole. For example, if you take a network and randomly sample some nodes, and then look at the network of all the edges connecting these nodes, you’ll get something much more sparse than the original. For example, suppose Alice knows Bob who knows Cassie who knows Damien, but Alice does not happen to know Damien directly. If only Alice and Damien are selected, they will appear to be disconnected because the missing links are not in the sample.

This brings us to a paradox of literature. Charles Dickens, like Tom Wolfe more recently, was celebrated for his novels that reconstructed an entire society, from high to low, in miniature. But Dickens is also notorious for his coincidences: his characters all seem very real but they’re always running into each other on the street (as illustrated in the map above, which comes from David Perdue) or interacting with each other in strange ways, or it turns out that somebody is somebody else’s uncle. How could this be, that Dickens’s world was so lifelike in some ways but filled with these unnatural coincidences?

My contention is that Dickens was coming up with his best solution to an unsolvable problem, which is to reproduce a network given a small sample. What is a representative sample of a network? If London has a million people and I take a sample of 100, what will their network look like? It will look diffuse and atomized because of all those missing connections. The network of this sample of 100 doesn’t look anything like the larger network of Londoners, any more than a disconnected set of human cells would look like a little person.

So to construct something with realistic network properties, Dickens had to artificially fill in the network, to create the structure that would represent the interactions in society. You can’t make a flat map of the world that captures the shape of a globe; any projection makes compromises. Similarly you can’t take a sample of people and capture all its network properties, even in expectation: if we want the network density to be correct, we need to add in links, “coincidences” as it were. The problem is, we’re not used to thinking this way because with atomized analysis, we really can create samples that are basically representative of the population. With networks you can’t.

Gelman goes on to argue that the supposed “coincidences” of a Dickensian novel are an attempt to simulate network structure or “links” where the number of sampled nodes is too small to fill out a real map of the network’s structure. So coincidences simulate what would be major linkages in the actual network of London ca. 1850.
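Gelman’s claim is easy to check with a quick simulation. Here’s a minimal pure-Python sketch (my own toy example, not code from his post; the ring-lattice “social network” and all the numbers are stand-ins): build a fully connected network, survey a random 100 people, and look only at the connections among the surveyed.

```python
import random

random.seed(0)

# A stand-in for the London social network: a ring of n people,
# each acquainted with their 4 nearest neighbors. The full network
# is connected: anyone can reach anyone through intermediaries.
n = 1000
edges = set()
for i in range(n):
    for d in (1, 2):
        edges.add(frozenset((i, (i + d) % n)))

# Survey 100 random people and keep only the acquaintanceships
# *between* surveyed people (the induced subgraph).
sample = set(random.sample(range(n), 100))
induced = [e for e in edges if e <= sample]

# Count surveyed people with no surveyed acquaintances at all.
isolated = sum(1 for v in sample if not any(v in e for e in induced))

print(f"{len(edges)} edges in full network, "
      f"{len(induced)} among the sampled; "
      f"{isolated}/100 sampled people look isolated")
```

On a run like this, the full network’s 2,000 acquaintanceships shrink to a couple of dozen among the sample, and most of the surveyed “Londoners” appear to know nobody at all: exactly the diffuse, atomized picture that, on Gelman’s account, a novelist would have to repair with coincidences.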

It’s a cool idea — and it gets right to the heart of the famous question posed by the narrator of Dickens’s Bleak House:

What connexion can there be between the place in Lincolnshire, the house in town, the Mercury in powder, and the whereabout of Jo the outlaw with the broom, who had that distant ray of light upon him when he swept the churchyard-step? What connexion can there have been between many people in the innumerable histories of this world who from opposite sides of great gulfs have, nevertheless, been very curiously brought together!

But for Dickens, “connexion” obviously means more than association between characters. It has moral, filial, and, in Bleak House, even epidemiological dimensions. One of the questions that launched my book, The Age of Analogy, was what connects the various discursive registers that operate in Bleak House — what connects the legal system of the Court of Chancery to the salvage economy of Krook’s Court; what links the virtue of Esther Summerson’s narrative position to the smallpox that sickens her? (Ultimately, I came to believe that one thing that links them is a new way of thinking about analogy — between characters, social formations, and discursive vocabularies — as a way to get at the sedimentary nature of history and social formations. I say “believe” because, along the way, Bleak House & Dickens fell out of the project.)

But whether or not this is true, Dickens’s characters do not operate “atomistically” or even as atoms linked by coincidence. What they do and how they interact displays a great deal of structure that is not pure invention. One way to put it is that the network model has a dispersed physical and temporal dynamic that doesn’t lend itself to thinking about narrative. Narratives are not links, though narratives may feature interactions between characters (moments that would count as either “links” or “coincidences” in Gelman’s account). But they also convey important information about the transformation of individual characters, and their transit with respect to conditions beyond the social: geographical and economic movement, maturation from youth to age, etc. And narratives, through their invocation of generic history, constantly invoke links to modes of thought and histories of representation that, in some sense, exceed the network of the novel and even the network of London at any given time.

Gelman himself brings up another kind of sampling in the same article that I think provides a better way of thinking about how Dickens attempts to get at larger social structures, something he terms “fractal sampling”:

When you do a survey, you want to learn at all levels. For example, if you’re studying politics, you’ll want to know what’s happening nationally, you’ll want a nationally representative sample. But you’ll also want to know what’s happening at the state level, the city level, and the neighborhood level. You can’t expect to get good estimates for all the neighborhoods in the country or all the cities or even all the states, but you’ll want some information at all these levels. That’s what fractal sampling is all about.
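As a rough sketch of what fractal sampling might look like procedurally (my own toy illustration, not Gelman’s code; the state/city/resident hierarchy and all the sample sizes are made up): take a modest sample at the top level, then concentrate follow-up sampling inside the chosen units at each level down.

```python
import random

random.seed(1)

# A made-up three-level hierarchy: 20 states, 10 cities per state,
# 50 residents per city (all names are placeholders).
population = {
    f"state{s}": {
        f"city{s}.{c}": [f"person{s}.{c}.{p}" for p in range(50)]
        for c in range(10)
    }
    for s in range(20)
}

def fractal_sample(pop, n_states=5, n_cities=3, n_people=10):
    """Sample a few units at each level, then drill down inside them,
    so the final sample carries structure at every scale."""
    picked = {}
    for state in random.sample(sorted(pop), n_states):
        picked[state] = {}
        for city in random.sample(sorted(pop[state]), n_cities):
            picked[state][city] = random.sample(pop[state][city], n_people)
    return picked

sample = fractal_sample(population)
total = sum(len(people) for cities in sample.values()
            for people in cities.values())
print(total)  # 5 states x 3 cities x 10 people = 150 respondents
```

A flat random sample of 150 people nationwide would estimate national opinion well but say almost nothing about any particular city; the hierarchical sample gives up some national precision in exchange for usable pictures at the state and city levels, which is roughly the structure I want to attribute to Dickens below.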

Basically, the point is that you can change the sampling methodology in order to capture specific kinds of group and scalar structure. I think this is a better description of what Dickens’s novels do. For a given social question (configured through a specific subset, or subnetwork, within the larger world), each novel seems to seek out representative constellations of character that capture the key groups operating within that network. So, to return to Bleak House, the key problem seems to have to do with poverty and responsibility, as configured by different social and class positions within the city, and as they interplay with legal, administrative, religious, medical, and domestic networks. And if we go back to Bleak House’s famous question, it basically samples along those lines: a country house, a townhome, a servant (the “Mercury in powder”), a street-sweeping urchin (“Jo”), a metaphysical visitation (the “distant ray of light”), the dirty churchyard step. I used to read this as an open question that assembled a more or less random collection to pose in extremis the problem of connection that underwrites all of what Henry James would later term “loose, baggy monsters.” Now I think there’s a fair case to be made that the question embeds a set of structural relations that underwrite the fractal sampling of a wider network of encounters: country and city, estate and town, servant (and master), poor and rich, church (and the secular government that will usher Jo from that stoop), and worldly infrastructure (the stone of the church) in its tenuous possible connection to divine revelation (the light from above).

Of course, as Jonathan Grossman has taught us, there are lots of different kinds of networks in Dickens’s novels. But it’s interesting to think about how single sentences, “What connexion can there be…,” can be important nodes in bringing them together and suggesting their analogies.

kompromat (or: how I helped lose the election)

Today, as I’ve been relaunching my blog and migrating it from an Amazon EC2 cloud instance to a GoDaddy-hosted wordpress account, I’ve been listening to the president-elect hyperventilate over recent reports that Russia has compromising information on his business interests and peccadilloes, was at some point prepared to blackmail him, and had regular covert contact with his campaign. Crazy. Even crazier than the widespread reports that Russians used an army of hackers and trolls (especially Edward Snowden and Wikileaks) to help spread disinformation about the election and sabotage Hillary Clinton’s campaign. It’s like we’re living in a mashup of Bridge of Spies and Spies Like Us.

But for me, the craziest thing of all is that I seem to have played a (tiny) role.

Let me share a bit more about why I’m moving the site. Recently my domain has been down, and I’d been struggling to figure out why, since the server seemed to be up and running. Worse, I couldn’t access WordPress or even ssh into the site, which meant I couldn’t check and see where the traffic was coming from and I couldn’t export my old posts for relaunch. Not being much of a tech wizard, I set the problem aside sometime over the summer.

Well, I finally gave up, and I’m now reconstructing the old posts by combing through the WP database backups that I was emailing to myself on a weekly basis (this is a PITA, by the way, and means I’m losing all images and documents hosted on the old site; but there’s a great tip on how to pull posts from a WordPress DB here).

Now that the site is up and running again, I thought I’d check in to Google Analytics. I hadn’t thought to look before because, since the site was down, I figured there wouldn’t be anything to track. This is what I found:


Check the nationality.


And check out the “language” used by my top visitors.


So the vast majority of visitors to the site were from Russia (and Kyrgyzstan). And their preferred language was either Russian or something called “Secret.google.com … Vote for Trump!” And the traffic spiked through election day and then collapsed in December.

Finally, if you look at the pages they were visiting, you see several pages that I never placed on the site:



Now I’m not sure what this all adds up to. They certainly couldn’t have secure shelled into the server itself (I’d done a lot to harden that). My guess is that they found some other way to exploit WordPress and take over the server, including creating content. But I am shocked. I’d be curious how much this tracks what other WordPress hosts saw over the same period. Certainly, it was the last thing I expected.

A few years ago I was complaining that maintaining your own server meant having to fend off increasingly severe and sophisticated attacks from hackers/bots located outside the US (something I’m not really equipped to do). Now it seems clear that this was more than just a hassle — it’s actually dangerous. If you don’t know what you’re doing, you’re basically opening up a channel for others to use against the world.

Sorry, democracy. ¯\_(ツ)_/¯

moving day

When I first started this site I thought it would be fun to figure out how to run it on a free Amazon EC2 instance using WordPress. Years later and after countless crashes and security lapses, I’ve given up. Over the next few weeks I’ll be migrating all of my old posts to this site (hosted by GoDaddy). In the meantime, here’s something apropos from the Yeah Yeah Yeahs:


Talking TED (“Understanding Analogy: Theory and Method”)

A few months ago the Information Sciences Institute here at USC invited me to talk at one of their weekly Natural Language Seminars. They knew I’d been working on theorizing and analyzing analogies digitally, and wanted to hear more.

It was an exciting but daunting opportunity. How would I speak to an audience that thought about language and procedures for studying it in a radically different way? Several years ago I gave a talk like this at a conference for the Association of Computational Algebra. It didn’t go over well.

This time, I decided to experiment with a TED-style talk. There’s been a lot of criticism of the TED format. Most of it centers on whether the talks are accurate and informative or simply entertaining. Some do seem to be the intellectual equivalent of cotton candy — tasty but evanescent. But they are also, I think, a model for how to talk to a wider audience and enlist interest across cultural, institutional, and disciplinary boundaries.

So I studied up. There’s Nancy Duarte’s TED talk on TED talks, and Chris Anderson, the head of TED, has also shared his recipe. I think it boils down to three things. First, use biography (yours or another’s) to tell a coherent story that centers on the problem you work on. Second, have a clear transition from the problem to your answer. And finally, emphasize why that answer is powerful — what it changes about how we see the problem, and what it might mean for others. To put it differently, these talks rely on an analogy drawn between a personal narrative and a larger problem.

Put this way, it’s a recipe that applies to most of the good talks that I’ve seen, except TED talks are more personal and less complex. You have to put yourself forward and abandon qualifications, hedges, and the basic acknowledgement that others have been working on similar problems, often more successfully.

Despite discomfort with the TED format, I’ve been trying to figure out how I can get my scholarship out to a wider audience, especially communities beyond academia. This seemed like a great opportunity to experiment.

So I sat down and hammered it out. Meg was out of town, which meant that most of the writing happened with my daughter in my lap, and we practiced with her in the baby bjorn (she’s my biggest fan).

The final title: “Understanding Analogy: Theory and Method.” The folks at ISI posted it here. It doesn’t quite live up to the billing, but it worked. My auditors generally agreed that analogies are an important feature of new ideas and that I’d found a new way of looking at them. And since that talk we’ve been talking about collaborating on a machine learning tool that finds analogies. I’m recruiting undergrads for some initial work this summer. It will be exciting to see where this leads.

V21 @ INCS2015: The Chicago School of Victorian Studies

Chicago ca 1838 (Francis Castelnau, credit Wikimedia)

Sometimes it’s great to be last to a party. I just heard about the V21 Collective at INCS 2015 (my thanks and congrats to Narin Hassan + organizers for a huge success in Atlanta). I’ve had my head down trying to meet a big deadline and I’ve been almost completely absent from Facebook and Twitter. I’m less a Luddite than someone trying to actively manage a long addiction to technology. So I was surprised to learn about a debate, organized around a central manifesto, that placed some of my favorite scholars and good friends on both sides of a line in the sand.

The virtue and vice of manifestos is that they draw these lines. They’re inhospitable to the qualifications, allowances, readings, and citations that would fill out a more careful conversation and blunt the cutting edge. This allows them to be both gleeful and urgent, playful and purposeful. It’s fun to write that a spectre is haunting Victorian Studies. And we do feel haunted even if we have trouble seeing its shape. This is a great chance to engage that problem.

Chicago continues to hum along as an intellectual engine in part because it has been so savvy at leveraging the particular strengths of its faculties and its institutional resources, especially the U Chicago Press. And this is a smart opportunity to drum up some interest and help draw Victorian Studies into a warmer conversation that can excite energy, interest, and (most important to those of us pre-tenure) publications. Ben Morgan continues to help us think around the corner in advocating for venues that draw younger Victorian scholars into conversation at formative stages in their work. And while I haven’t met Anna Kornbluh yet I’m excited to talk about her book when we do. I met most of the folks now in V21 for the first time when we joined for the informal workshop that Ben organized at NAVSA here in Los Angeles. It was thrilling, electric. And the upcoming V21 symposium in Chicago is an auspicious way to collect some of this energy and give new charge to the discipline we work in. One can almost feel through the rails the vibrations of a coming special issue of Critical Inquiry. Maybe I fantasize.

I have to say, I struggle to draw lines myself, so it’s been hugely interesting to read the manifesto and some of the circulating criticism and responses. My immediate and lasting reaction is excitement. Finally (I still feel) there’s an active debate that has people widely engaged and talking but doesn’t feature the “crisis in the humanities” or the terrors of the Anthropocene (I’m not a hater, btw; I talk about both issues regularly).

But in talking about the debate with folks at INCS I gathered that few of us are really sure where this new line is, even if we share a general sense that we know what kind of scholarship is being targeted. One way to put it, in accord with Kathy Psomiades, is that we generally agree on the difference between a good and bad conference paper, but it’s easier to diagnose these in terms of the characteristic failures of a specific talk than the characteristics of bad papers generally. To paraphrase Anna Karenina, I think good papers share a basic felicity, but bad papers are bad in their own special ways.

So in the spirit of collective inquiry, I’d like to pitch in and help identify what makes good work compelling. Manifestos, above all, are a call to roll up our shirtsleeves. I’m sure we can all think about theses we’d like to see.* 10 is an arbitrary but shapely number. But I think, in the spirit of the manifesto’s format, it’s more appropriate to engage theses individually rather than in the negative or en masse. So I want to take up the first thesis, because I think it demonstrates both the strength and basic challenge of the boundary that is drawn:

1. Victorian Studies has fallen prey to positivist historicism: a mode of inquiry that aims to do little more than exhaustively describe, preserve, and display the past. Among its symptoms are a fetishization of the archival; an aspiration to definitively map the DNA of the period; an attempt to reconstruct the past wie es eigentlich gewesen; an endless accumulation of mere information. At its worst, positivist historicism devolves into show-and-tell epistemologies and bland antiquarianism. Its primary affective mode is the amused chuckle. Its primary institutional mode is the instrumentalist evisceration of humanistic ways of knowing.

In sharpening the distinction between good & bad Victorian Studies, the V21 authors have settled on the entrenched opposition between a sophisticated engagement with contemporary critical theory and what I think is more clearly a soft new historicism. This has really gotten under people’s skin. I think what we need instead is a discrimination between good and bad historicisms, and I want to explain why. For one, I’m guessing that “new historicism” doesn’t appear in the manifesto (instead of “positivist historicism”) because, even if most of the “bland antiquarianism” they identify operates under that paradigm’s blunter edge, the authors recognize that new historicism, at its cutting edge, was driven by deep and substantive theoretical reflection. It’s been almost twelve years (!) since Andrew Miller observed that “Victorian literature seems, on the evidence of [that] year’s publications, to remain confidently immured within an orthodox, loosely new-historical set of historiographical assumptions, devoted to understanding and judging individual texts by appeal to historical contexts sometimes richly-but often poorly-conceived” (SEL Autumn, 2003). Clearly the V21 authors don’t think that much has changed. I generally agree, but I think Miller gives a more precise definition of the problem. In making that point he also had the benefit of a review format, with the institutional capital and professional status to show us precise examples of what he meant. So perhaps we want a “newer historicism”?

Moreover, I think “positivist historicism” is an unfortunate substitute for new historicism, because in the service of a strong generalization, it reinscribes an even looser set of assumptions about what terms like “positivist” and “historicism” mean, when we Victorianists, of all literary scholars, need more precision. We can’t rely on Walter Benjamin’s Theses on History when characterizing two of the dominant historical paradigms of the nineteenth century. First, “historicism.” If Benjamin carefully studied Leopold von Ranke’s Geschichte der romanischen und germanischen Völker, I’ll eat my hat.** Ranke’s point in describing a history “wie es eigentlich gewesen” was that writing history was both a science and an art; historians should bring imaginative intuition in contact with strict critical attention, and set aside facile moralization or “lessons” (see Frederick Beiser’s discussion in The German Historicist Tradition (2011)). Who doesn’t agree with that? And if I do have to eat my hat, I’m still not sure why we should turn to Ranke or Benjamin (versus, say, Walter Scott) to characterize historicism for Victorian scholars or the Victorian period.

Similarly, Comtean positivism, for all of its late zaniness, was also a profound, and profoundly influential, attempt to think beyond history as a “mere accumulation of facts”; to elucidate the general patterns of history and think about their theoretical as well as historical implications for present society.*** As Auguste Comte noted in Martineau’s translation of his Positive Philosophy (1853), positivism itself was just one of many systems that addressed “the necessity of observing facts in order to form a theory, and having a theory in order to observe facts.” Comte’s writings were among many adaptations of Claude Henri de Saint-Simon’s own scientific historicism, which supercharged a whole range of thinking about the texture of historical difference, formal transformation, and the political implications of intellectual labor for the present. You don’t get to Karl Marx without Saint-Simon. And isn’t there a connection between Comte’s critique of the “theological stage” and Marx’s “spectre”?

In an even longer view, that strain of positivism was important to Émile Durkheim and the Annales school of French historicism, with its strong focus on time-series data and the cross-comparison of periods, and this in turn helped shape Michel Foucault’s thinking (as Thomas Flynn argued in his Sartre, Foucault, and Historical Reason, 2005). Foucault, a positivist historicist? Sure, among other things.

In a long enough view, of course, everything is connected to everything. For any set of ideas there are a host of mitochondrial Eves; that’s the poverty of intellectual history. But these exercises are intriguing because they help us to think about how we might return to movements like positivism or the nineteenth-century historicist turn in search of fresh ways to think about our present concerns. This is what is fun about manifestos; they give us something concrete to think with, push against, push off of.

So instead I imagine a Thesis 1 that calls for dynamic historicisms, instead of a static historicism that explains text in terms of context; reflexive historicisms, rather than a reflective historicism that doesn’t see how the objects of study change their times; and most of all, historicisms that recognize the heterogeneity of historical understanding within the nineteenth century. Bah, an outmoded call for “reflexivity”? Why not? Reading Thesis 1 this way, imagining my thinking as part of the qualifications that a manifesto does not permit, I understand V21 as a call to interrogate the historical procedures of the nineteenth century, in their connection with what we do today. To ask, in effect, who is the Victorian Foucault?

For what it’s worth, I think we do need, as Elaine Freedgood insists, to keep reading the basic theoretical texts that provide renewed precision and opportunity in thinking about our objects of study. That collection of touchstones continues to expand, particularly when we think critically and inventively about works in the nineteenth century. Surely, for instance, Darwin’s On the Origin of Species belongs in those ranks. Yet it’s my intuition (and here I agree with Ryan Fong) that renewed engagement happens in close contact with our teaching and as part of our effort to communicate our research and our ways of thinking to a larger audience. When I can catch someone’s interest over a dinner table, or I see a flash of recognition in the classroom, I know my formulation of a difficult perspective works.

At one of the INCS plenary sessions, on, appropriately, “Victorian Futures,” Dino Franco Felluga and Jay Clayton made the basic if sometimes uncomfortable point that our scholarship operates in a larger community of interests and yet (at least from my perspective) there are few institutional incentives to get our work out there, intervening in larger conversations. Under the conference theme of “mobility,” Felluga and Clayton urged us to mobilize our work, to get it and its hefty insights out to more audiences. Some of the many approaches they analyzed look more promising than others. As a student of the digital humanities, I think (with Lauren Goodlad) it will play an important part. Maybe the V21 conversation is an opportunity to get us moving and get our work into the larger world beyond the paywall.

The more general point is that, in place of (or alongside) the renewed “Presentism” proposed in V21’s Thesis 8, we (by which I mean we humanists generally) need a renewed evangelism. For some reason I feel like this is my most controversial suggestion. The fight for critical heft and innovation, in my view, is part of a larger fight for interest from graduating grade-school seniors, the fight for relevance in the minds of the parents and the larger community that encourage our students to choose specific paths, the fight for enrollment numbers. Students = jobs = better placement. Sure, we help our students see the light, but first we need them in our classrooms. Classes are our opportunity and we’re losing them. This isn’t particular to Victorian studies, but perhaps we can work on putting Victorian literature at the leading edge. This is a good start.

Finally, I hope this discussion proves to be a series of constructive lines in shifting sands: that we can work to reframe and develop the future of Victorian Studies without dividing it into camps. Aren’t we all historicists (of one stripe or another)? Who doesn’t know the year of Victoria’s accession? The first reform bill passed? That The Prelude & In Memoriam appeared? If we understand V21 as calling for renewed historicisms, as well as renewed formalisms, renewed digital humanities, renewed materialisms (the list goes on), it’s a really, really big tent. What can I say; I’m a social person. I’ll be excited to see everyone this summer, share a beer and plot our possible futures.




* As a side note, I can also imagine a thesis that addresses the problem of the “Victorian” as a period. I almost never self-identify this way unless I’m in some specific professional environment. Outside of those contexts, saying I’m a “Victorianist” provides a kind of characterization and a set of questions I’m not generally ready to tackle. Can we agree that among the major periodizations in English literature it’s the most challenging to justify? The period is explicitly defined in relation to a monarch rather than a set of ideas, historical movement, or dates. Sure, centennials and historical movements are arbitrary. As Martin Hewitt notes in his “10 Alternative Theses”, “all periods are contingent,” and I think this is their strength, if they are grasped and interrogated as such. But can you imagine someone in early modern studies saying, I’m a Jacobean? Victorian is on weakest footing as a periodizing term; as Priya Joshi has observed with respect to 20th-century India, it may be more intriguing to think about the work of the term and its literature outside of the normative bounds of period or geography (Yearbook of English Studies 2011). Irene Tucker, for instance, has given a persuasive account of how the period helps to designate a real transformation in historical sensibility, so that “by the time of [her] death, Victoria’s place has come to seem only the smallest part of what – and where – Victorianism is” (Victorian Studies Summer, 2003).


** I’m not ready/willing to dive into the Benjamin bibliography.


*** Of course, I realize that in using the term “positivist,” the authors mean rather what John Guillory has termed the “spontaneous philosophy of the critics,” after Althusser (Critical Inquiry, Winter 2002). But, as Guillory notes, positivism in this sense is more accurately the spontaneous philosophy of scientists; as an ethic of the transcendent value of the naked fact, it’s less meaningful (I hope) for Victorian Studies.

Surfing the Permanent Revolution: Digital Humanism at NAVSA 2013

This week I’m back from NAVSA. Well — not really back; it was just up the road in Pasadena. But I expect to spend some time nursing this (intellectual) hangover and thinking about the talks I saw and the questions that were raised there.

Most immediately, it’s clear that digital work has hit the pavement in 19th-century studies. Natalie Houston gave a fantastic talk about her “Visual Page” project, which uses Google’s Tesseract OCR engine to analyze formal elements in a print corpus of Victorian poetry. It was stunning how much a computer can learn about a poetry collection just from the blank spaces on the page. Maeve Adams gave an intriguing paper that read key terms across Victorian periodicals as “epistemic communities” and used this to ground a far-reaching argument about formalism in the 19th century. And Rachel Buurma expanded on her work on Charles Reade and his archives — an eccentric even among archive rats. As she put it, his wildly profuse collections of documents, indexes, and indexes on indexes add up to archives “on the way to becoming novels.” I’m almost convinced to read more Reade. It doesn’t sound like he would have appreciated YAHOO (I read the marginalia as: “In other words know the contents before you know anything about this”):

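Back to the “Visual Page” for a moment: the core trick, as I understand it, is that white space alone carries formal information. Here is a toy sketch of that intuition (my own reconstruction, not Houston’s actual pipeline; in practice you would work from the word bounding boxes Tesseract reports rather than raw pixels):

```python
# A toy version of the "Visual Page" intuition (my sketch, not the
# project's actual code): given a page as a binary bitmap (1 = ink,
# 0 = blank), measure each inked row's left indent and inked width.

def line_profile(page):
    """For each pixel row containing ink, return (left indent, inked width)."""
    profiles = []
    for row in page:
        if 1 in row:
            first = row.index(1)
            last = len(row) - 1 - row[::-1].index(1)
            profiles.append((first, last - first + 1))
    return profiles

def indent_signature(page):
    """Distinct indent levels: a rough proxy for stanza and verse structure."""
    return sorted({indent for indent, _ in line_profile(page)})

# A mock "page": two flush-left lines and one indented line
page = [
    [0, 1, 1, 1, 1, 0],
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 1, 1, 0],
    [0, 1, 1, 1, 0, 0],
]
print(indent_signature(page))  # -> [1, 3]
```

Two indent levels on a mock page already hint at verse structure; scaled to thousands of pages, that lets you start sorting stanza forms without reading a word.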
On Saturday I participated in a digital roundtable that Anne Helmreich of the Getty Foundation organized to field questions about research and pedagogy from conference attendees. The Prezi from my own talk, about some of the tools I’m using in class (Facebook as a social CMS and Google Drive for workshops), is posted here. My main point was that English seminars have always been “flipped”: focused on in-class workshopping and intellectual tinkering, which makes it easy to fold in digital tools. (I take my inspiration here from Jentery Sayers and his Maker Lab.) But I was more interested in hearing what the other panelists and the attendees had to say about the state of the digital union with C19 studies.

Most questions raised by the participants were about the ins and outs of digital scholarship: how to recruit technical collaborators (find research questions they’re interested in); how to find time and money for the work (no good answer there); how to use statistics (to be avoided while best standards are worked out); how to use undergraduate research more effectively (give them work that is tied to your own research, and break projects into discrete chunks). This last point was made by Dermot Ryan, current Undergraduate Research Director at Loyola Marymount. I suspect the dismal statistics for undergraduate research conducted in the humanities at LMU would be matched at USC. It’s a thorny problem. I’ve been thinking about ways to pull undergrads into my next digital research project. But as I focus on finishing my analogue book, there’s not much I can think of sending undergraduates out for, besides checking references. Clearly this is a problem with hermetic patterns of research. In order to frame more collaborative projects, we have to hash research questions into practices that depend less on our own idiosyncratic habits of mind and idiolects of convenience. We (or at least, I) need to be better at looping others in.

It was also a huge pleasure to meet Petra Dierkes-Thrun and learn more about the “Wilde Decadents” class she’s running at Stanford and its blog. The class generated tremendous interest; the work the students produced was read by visitors from across the globe. I’m frankly envious. She was particularly savvy in promoting the course and its Twitter account through academic networks and listservs like the Victoria List.

But perhaps the most intriguing contribution to the roundtable, to my mind, was Andrew Stauffer’s diagnosis of the NINES project. NINES is currently working to redefine itself to better serve the current wave of digital scholarship. As Andrew described it, NINES was originally envisioned as a coordinator and peer-review network for online collections produced by academics — sites like the Rossetti Archive, the Women Writers Project, and Darwin Online. They envisioned an academic internet populated by public research archives. Instead, the major commercial publishers and Google have digitized masses of texts and placed them behind paywalls. Gale’s NCCO database is a case in point. A corollary challenge is that NINES’s Collex originally provided a solution to the basic problem of finding a CMS that could furnish different kinds of academic content. But the widespread adoption of other open-source CMSs like Omeka diminishes the case for further investment in Collex. The folks at NINES are now trying to figure out how else they might support digital research — for instance, by producing new tools for digital analysis along the lines of DocuScope. I’m looking forward to their public launch of Juxta, which produces a visual collation of textual variants. There’s an undergraduate who’s been asking for a good tool to start DH work with, and this looks friendly enough to be promising. Andrew also suggested NINES might start convening seminars that bring humanists and engineers together to test new research avenues. It would be exciting to have an interdisciplinary research seminar that was formatively tied to a technical team rather than an academic department — tied to makers as well as thinkers.
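For the curious, the kind of word-level collation Juxta visualizes can be approximated with Python’s standard library. A rough sketch (difflib alignment, not Juxta’s actual algorithm), run on two hypothetical witnesses of the Keats lines:

```python
# A bare-bones take on collation (a stdlib sketch, not Juxta's
# algorithm): align two witnesses word by word and report the
# spans where they diverge.

from difflib import SequenceMatcher

def collate(base, witness):
    """Return a list of (op, base_text, witness_text) variant records."""
    a, b = base.split(), witness.split()
    matcher = SequenceMatcher(None, a, b)
    variants = []
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        if op != "equal":  # keep only the divergent spans
            variants.append((op, " ".join(a[i1:i2]), " ".join(b[j1:j2])))
    return variants

# Two invented witnesses, for illustration only
base = "The sedge is wither'd from the lake And no birds sing"
witness = "The sedge has wither'd from the lake And no bird sings"
print(collate(base, witness))
```

Juxta’s value added is the visualization layered on top of alignments like these; the alignment itself is old technology.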

At its heart, NINES is a classic disruption story. It announced a new chapter in 19th-c scholarship when it was launched in 2003 — the same year as NAVSA’s first conference. Both organizations are now at a crossroads (Dino Felluga handed his role as NAVSA’s head to Marlene Tromp on Saturday). Given the rapid change of our technical tools, no organization or project that locates itself in the digital sphere will be able to avoid regular reinvention. I spent a considerable amount of time getting the Monk project software up and running for an early experiment with my Darwin analysis. I invested even more time figuring out the Meandre suite, including a trip up to the UVic digital workshop, with hours both in person and via email, drawing on the expertise of Loretta Auvil and Boris Capitanu. That culminated in a single talk at the Seattle MLA on the global network imagined by Oliphant’s novels. The return on investment for this work has been relatively small. And now both Meandre and Monk have exhausted their funding and begun to recede into history. I’ve just now noticed that “monk-project” is embedded in the permalink for this post — a legacy of an early vision for this site.

Like any story, it has been a combination of design and contingency. I’ve been focused on cementing a traditional research profile, using the digital work to keep my hand in, waiting to mount my extended DH project when the book’s off. Each effort has given impulse to that trajectory. It’s still exciting to imagine the tools and methodologies that the next two and ten years will bring. And yet, as I listened to conference attendees ask what it would take to get trained in digital work, how to figure out the appropriate criteria for significance, how to adapt to new technologies — essentially, how to surf a continual revolution — it hit me what DH work signs you up for. A lifetime of fresh tarball installations, cribbed command prompts, endless help pages for new object libraries, and bewildering new GUIs. As the tools change, we reboot and relearn. We need to be honest about this. Off the top of my head, my current experiments with Python follow upon, in reverse order, exploring Ruby, Java, JavaScript and jQuery, MySQL, XSLT, CSS, Visual Basic, HTML, and Tcl (!). This sets aside humdrum life as a sysadmin for OS X, Linux, Amazon AMI, WinXP, MS-DOS, and Unix machines — not to mention WordPress itself. The most rigorous Ph.D. programs require two to three languages, not four or five.

If it sounds like I’m grousing, maybe I am. We need to emphasize the long dead ends as well as the triumphs of DH scholarship when we talk to curious peers. But the big “but” is that, as academics in the humanities, we’re tinkerers by trade — whether on our computers, in the classroom, or at the archive. For my part, I’d be exploring some version of these technologies in any likely case. It’s just so much time wasted stringing zeroes and ones unless I invest this labor in my research. Besides, I want to show my daughter what hacking looks like.

Mapping the World of Oliphant’s Novels

About a year ago, at the previous MLA, I participated in a panel on literary reactions to the Scottish Rising of 1745. I thought I’d written about it, but in the process of getting this server back up and running, I found this old draft post. As part of that panel, I gave a talk on Victorian reactions to the ’45, focusing on the novels of Margaret Oliphant and Robert Louis Stevenson. Part of the question I wanted to raise was whether the rising is typically understood as a site of political and historical closure that cements the constitution of “Britain” as a cultural entity. One way to get at this, I thought, was to see whether literature written about the rising emphasized Britain over Scotland and England.

But it was also a good opportunity to experiment with using network analysis and mapping to explore the geographic imagination of the nineteenth-century novel. One way to raise the question of political formation is to look at the locations that are explicitly cited in each novel, and to map out how they are connected. To do this, I extracted location entities from several of Stevenson’s novels using the Meandre framework (apparently now defunct), as well as from 65 of Oliphant’s, derived from the Internet Archive. I then produced a series of network graphs in Gephi to look at which locations are cited most frequently, and which other locations they tend to be cited with. An example plot is below: it shows locations referenced in Oliphant’s novels, sized by reference frequency and connected by proximity of reference.
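The recipe, abstracted away from the Meandre specifics, is mostly co-occurrence counting. A schematic sketch in Python (my reconstruction, with made-up stand-ins for the extracted entities; the CSV format is what Gephi’s Data Laboratory imports as a weighted edge list):

```python
# Building a location co-occurrence network in miniature (my own
# reconstruction, not the actual Meandre workflow): count how often
# extracted place names appear in the same passage, then emit a
# weighted edge list that Gephi can import (Data Laboratory ->
# Import Spreadsheet, with Source/Target/Weight columns).

from collections import Counter
from itertools import combinations

# Stand-ins for location entities already extracted, passage by passage
passages = [
    ["Edinburgh", "London"],
    ["Edinburgh", "London", "Carlingford"],
    ["Carlingford", "London"],
]

edges = Counter()
for places in passages:
    # every unordered pair of distinct places in a passage is an edge
    for a, b in combinations(sorted(set(places)), 2):
        edges[(a, b)] += 1

lines = ["Source,Target,Weight"]
for (a, b), weight in sorted(edges.items()):
    lines.append(f"{a},{b},{weight}")
csv_text = "\n".join(lines)  # save as e.g. edges.csv for Gephi
print(csv_text)
```

Gephi then handles the rest: degree or weighted degree for node size, and a force-directed layout for proximity.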

Vector graph of locations in Oliphant’s novels, sized by degree

I found it hard to figure out how to visualize the networks effectively in a talk. This was less of a problem for the Gephi visualizations, which are static, though images with a large number of nodes presented a challenge. One strategy I experimented with was to record a screen capture and edit it into a video that zoomed in as I spoke. In retrospect, it would have been more effective and flexible to use Prezi.

One question that working on the talk raised was how to evaluate the utility of these visualizations in the context of a talk. In the case of Oliphant, the justification accrues in the difficulty of assessing the range and depth of her fiction. An essayist and the author of more than ninety novels, works often serialized simultaneously across several publications, she produced a corpus that is almost impossible to wrap your head around. On the other hand, it gives you the chance to make some nice visualizations. Here are two animations I made using Gephi and Google Maps. The first is a network map of locations in her novels with node size and proximity scaled to the total number of links; the second is a world map with the locations geocoded.



Hacking: WYSIWYG

Two weeks ago I noted that someone had recently tried to get into my WordPress server. My firewall traced the query back to an IP in China, though I have no way to tell where it ultimately originated. I linked it to news of escalating activity from abroad; it seems that attempts to get into academic networks are sharply on the rise.

Then a week ago my server collapsed under what seemed to be a DDoS attack. I tried to restart it several times, but every time I got the server back up it was swamped with traffic. I’ve spent a good eight hours now launching a new server and migrating over content from a backup. Most of my posts are back, but I lost the last year’s worth of images. I’ve only been able to recreate or restore about half.

It’s all kind of creepy. And it may be beyond my capacity to try and stay on top of escalating security problems on a private blog. Apparently there’s a botnet that’s been hacking WordPress servers generally for the last several months. I like having my own site; I like the ability to post whatever content I want and try out different kinds of server technologies; my Omeka-based class last year depended on this capacity. But the bar is getting higher.

Machine Grading

A friend of mine drew my attention to the NYTimes’ recent article on advances in essay-grading software. It’s technology that will raise hackles on campuses around the country. The claim is that such programs are becoming sophisticated enough to grade college-level writing. Of course, their effectiveness is widely debated. The article helpfully includes a link to a study by Les Perelman that critiques the data being used to support such claims (he argues that sample-size problems, confusion between distinct kinds of essays and grading systems, and loose assertions undermine the argument). The software is getting better, but it still doesn’t look like it can quite replicate the scores produced by human graders.

But such criticism is an argument at the margins. There is now clearly room for debate on both sides. Machines are comparable on standardized tests. The long-term trajectory is evident: if machines are roughly as effective as a force of part-time human graders, standardized tests will end up using the software to save money. They’ll keep some humans in the loop cross-checking and validating, but the key incentives all point in the direction of greater automation. The reductive structures and simplistic arguments we train students to replicate for these tests have laid the groundwork. We’ve already whittled essay writing into an algorithm.
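To make that point embarrassingly literal, here is what such an algorithm might look like at its crudest (a caricature of my own devising, not any vendor’s actual model): a handful of surface features and a linear weighting.

```python
# A caricature of machine essay grading (my own toy, not any real
# grading engine): score on a 0-6 scale from surface features alone.

def score_essay(text, top=6.0):
    words = text.split()
    if not words:
        return 0.0
    features = {
        # longer reads as "developed," capped at 300 words
        "length": min(len(words) / 300.0, 1.0),
        # type/token ratio as a stand-in for vocabulary range
        "variety": len({w.lower() for w in words}) / len(words),
        # reward the transition words we drill into students
        "signposts": min(sum(w.lower().strip(",;.") in
                             {"however", "moreover", "therefore", "furthermore"}
                             for w in words) / 3.0, 1.0),
    }
    weights = {"length": 0.5, "variety": 0.3, "signposts": 0.2}
    return round(top * sum(weights[k] * features[k] for k in weights), 2)

print(score_essay("I like books."))
```

The unsettling part is not that this is a good model of writing; it’s that the five-paragraph essay is a good model of this.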