I’m having trouble sleeping, I’m thinking of what you said

At the risk of dragging Nany Kayo over to the blog (which I profoundly do not want), I wanted to mention a current (though fading slightly) Twitter tempest. It doesn't start here, but Ms. Reese (@debreese on Twitter) is a powerfully articulate writer, whose blog focus is exactly what it says it is--to notate the appearances of First Nation peoples in children's literature, and especially to praise the good portrayals, the people who got it right.

Neil Gaiman's Graveyard Book isn't exactly standard children's fare, though it does feature a child protagonist--but she's not so much talking about the book itself. She's talking about an interview he gave two years ago, wherein he said the controversial and now nigh-immortal (for the next two months or so) words:
The great thing about having an English cemetery is I could go back a very, very, very long way. And in America, you go back 250 years (in a cemetery), and then suddenly you’ve got a few dead Indians, and then you don’t have anybody at all, unless you decide to set it up in Maine or somewhere and sneak in some Vikings.
Now, I can state objectively, he is both right and wrong, here. Not so much about the "few dead Indians" line--I'm not touching that, and besides, Ms. Reese and kynn (who's also @kynn on Twitter) are doing an admirable job poking holes in what Gaiman said.

What I want to talk about is accountability, and disposable history.

First, the history bit, because that one's far easier. There aren't a lot of graveyards in America that go back two hundred and fifty years. There aren't a lot of communities in America that go back two hundred and fifty years. Let's trim off the last ten and just take it from 2000--that would be 1850, right?

If you grew up on the east coast, in any state housing the original thirteen colonies--then you have cemeteries that potentially go back far longer. The Jamestown colony in Virginia is the earliest with a potential "bury patch"; it was founded in 1607 and came under royal control in 1624. The Plymouth colony in Massachusetts came next, founded in the winter of 1620 at the site of an abandoned Native village--"abandoned" because all its inhabitants had perished when an epidemic of something dire swept the area.

(Now, it should be noted, though it matters little in terms of burials and burial locations, that the founders of the Plymouth colony were called Pilgrims--all the Church of England dissenters were called Pilgrims, due to their wanderings--but they were not Puritans. These terms have become interchangeable in American history, and that's just not the case. The goal of the Puritans was simple: to purify the Church of England's excesses by working within the Church on issues of reform. The Separatists, by contrast, wanted nothing to do with the Church of England, feeling--and probably not in error--that it retained too many practices from Roman Catholicism: kneeling, religious leaders who wore distinctive clothing, the use and upkeep of altars, and many other provisions. Interestingly--I'm thinking of the American LDS movement here--the Separatists called themselves the Saints, because they held themselves to strict higher standards, in accordance with what they believed were God's wishes. They called anyone who was not a Saint in good standing a Stranger, establishing resolute guidelines for Us-versus-Them thinking later on.)

However, as settlements and land seizures drifted westward, the dates grew later, and the preservation of history often took a back seat to simple survival. Also, stone-cutting is only profitable for memorial structures if you a) have someone trained to quarry stone, and b) have a quarry nearby. Many of the earliest cemeteries in America (not all, mind you, but MANY) were marked with wooden memorials, and these weathered as wood does through the years. Thus, by the time we hit the west coast, a cemetery with dates going back to 1900 is considered venerable and aged, because most of our cemeteries don't go back more than seventy-five years, if that.

Most people living near major cities, in fact, aren't even used to free-standing headstones; most of what they see are the in-ground stone tablets that are easier to mow over. And while headstones are still popular fixtures in October for decoration, most people (again, most, not all) don't see them on a daily basis at this point. And I guarantee you, if they do--and if they bother to stop their cars and go look at the headstones they see--they will rarely find any date older than 1910. This is on average, mind you, but unless we're talking really old burial grounds, that's about the standard.

(To touch briefly on the comments of Ms. Reese before we leave: no, I am not ignoring the histories--and the preserved burial places--of non-Europeans, here. There are First Nation preserved burial sites, there are some Japanese and Chinese graveyards that go back some time, there were the segregated burial grounds in the South, some of which are very ornate--not to mention until Katrina hit in '05, some very elaborate above-ground crypts and structures in New Orleans--and there were travelers from Denmark, China, Japan, France and Ireland among others who came, usually to the east coast, and ended up marrying into the population they found there, bringing the burial customs of their peoples along. I am not saying they didn't exist. I'm talking public perception here, the same public perception that says all Pilgrims were Puritans, and Pocahontas, a poor "Indian maid," married John Smith as an adult--hells, that one's three wrong things in one!--or that 1600s America was just like 1800s America, just with fewer cars. Public perception is frequently wrong.)

But now we have accountability, and that--to me, at least--is the really interesting thing.

First off, Twitter accountability is a strange thing. For one, anyone can claim to be anyone else; the only way we know for sure is if there's a Twitter "Verified account" checkmark on their profiles. And those aren't easy to get; Twitter seems to hand them out randomly (they do verify--I don't mean they don't check out who people are behind the screen--but there's no set process for gaining that Verified Account mark). Add to that that the tools for Twitter are only semi-easy to pick up and run with, and some tools--and conventions of the medium--only become clear over time.

(An example: at this point, most of us simply look below the tweet in question and click Retweet beside Reply for anything we want our group of followers to read. But some people still use the old method of copying the original tweet, then putting "RT"--"retweet"--in front of the person's Twitter name. Or the vast and diverse world of hashtags. You can hashtag anything by putting '#' in front of a word or phrase, which Twitter software picks up in searches later--things like #volcanogod or #SL2--but say "hashtag" to anyone who's not on Twitter, and they look at you funny.)
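(For the curious, the hashtag convention really is that mechanical. As a toy sketch only--Twitter's actual search indexing is its own beast, and this little pattern is my assumption, not theirs--picking hashtags out of a tweet looks roughly like this:

```python
import re

def extract_hashtags(tweet):
    """Find every #word token in a tweet, roughly the way search picks them up."""
    return re.findall(r"#\w+", tweet)

# The '#' prefix is all it takes to make a term searchable:
print(extract_hashtags("Offerings accepted nightly #volcanogod #SL2"))
```

Run that and you get back just the tagged terms, which is why a stray '#' in front of any word is enough to make it findable later.)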

There's also the steadily changing impression that Twitter is mostly just social communication. And, for the most part, that's still true, though marketers of all stripes and, I swear sometimes, every single "life manager" on the damn planet have discovered it long since.

Big businesses use it now. Amazon has a channel directly for people who are Associates, and they're likely not the only company that has turned the 'little social platform' into an employee communications downline. And news services are being rocked by the virtues and the flaws of this 'little social platform'--witness the Green Movement in Iran, unknown to most of the world, who pleaded with Twitter users to show their support by turning their avatars--the little icons seen beside each tweet--green. Suddenly, literally within hours, it seemed half of Twitter was green. It was a powerful statement, and without warning, an entire nation realized public opinion was rising and falling by the actions of one commentator with a laptop. Pretty huge stuph.

But, with all the sweeping changes and the rise of Twitter as A Power, the Twitter tools have remained fundamentally unchanged. And one of those tools is the ability to delete Tweets. Because sometimes people say bad things in social settings. They want the take-back. They want to apologize and move on.

There are two problems with this.

1. As of this writing, outgoing Tweets--either some, or all of them--are being archived by the Library of Congress. Why? I don't know. Is this a rumor? Yes, but one that's been confirmed by the Library of Congress. (On their blog, fittingly enough.)

2. What happens when an average person on Twitter decides to employ that "delete" feature--but happens to have hundreds of thousands of followers? You get what happened to Neil Gaiman, who is now being accused of suppression of his tweets in order to look better, or seem more noble, or...whatever it is.

Listen. I mean this. I am not talking about what he said; I've dealt a little with the popular perception at work for many people, and I'm not getting involved in the battle beyond that. What I am interested in is this: how famous does a person have to be before they have to live their entire life in public? Day by day, it's hard enough engaging in a dialogue with fans, some of whom are always going to be rabid, because that's just the nature of some fans. (I have been to enough conventions; trust me, I can speak from experience.) Personally, I think the whole controversy is being handled fairly well by the direct participants involved, and really, really, amazingly badly by some of Gaiman's fans, who've gone to both blogs in question and gotten overly snippy.

"Enhancing expressive features such as eye movement could eventually make avatar-mediated communication feel more trustworthy than online video, because only relevant visual cues need to be displayed, said Steptoe."

I'm not sure what unnerves me more: the incipient lack of all human socialization, or avatars "feeling" more trustworthy than talking to someone on webcam. But both are on the unnerving side.

And if you reload an inkjet cartridge with human cells, can you then print human skin? The answer, apparently, is yes. If this really works, it could be the end of tissue rejection, at least for skin problems.

And--for those diehard WoW fans who are also SL neo-Victorians, fulfill both desires in one place! Apparently the Spice Bread recipe on that page is adapted from a Victorian milk bread recipe, and is very heavy on the savory additions. Must try that out.

And that's it for this entry. Also, YAY! I'm not going to say the over-the-top dramatics are over, but yay for one entry (almost) without a mention. I'm so happy.

Ish.

Comments

Icterus Dagger said…
"There aren't a lot of communities in America that go back two hundred and fifty years. Let's trim off the last ten and just take it from 2000--that would be 1850, right?"

No, that would be 150 years. Did you mean 1750?

-iD
Edward Pearse said…
Seconded on the subtraction thing :-)

Most of the pre-seventeenth century didn't survive into the eighteenth but in a post dealing with public perception it's interesting you refer to Jamestown and Plymouth but make no mention of St. Augustine, which beat them by around 50 years. Public perception indeed.

Still, 250 years is about right as a median point. The East coast starts in the 1500s, but the West doesn't start till much later. Seattle didn't exist before 1851.

As for the Indian burials, my *personal* perception was that many tribes did not actually bury their dead. At least not inhumed in the way Europeans think of burial. Scaffold burials, cremation and sending the body downstream in canoes may allow for "burial grounds" in a spiritual sense but not a graveyard in an actual sense.

So yes, Gaiman could possibly have phrased it better, but Debbie Reese seems to be one of those people who make a career out of being offended.
Rhianon Jameson said…
You raise a good point about having debates on Twitter. It's a terrible medium for that...I've occasionally made an offhand comment, usually a joke, that will get someone upset, and then it's off to the races because there's no way of holding a reasonable conversation in 140 characters. (A friend of mine refuses to try Twitter because, as he put it, "At 140 characters, I'm still getting warmed up for the sentence ahead.")

Though it's certainly not original to me, I like the metaphor of Twitter as the Virtual Water Cooler. With a reasonable number of followers/following, it works pretty well. One person, not to name names, may point out a blog update; another may link to an interesting video or newspaper article; yet another may make an observation about the latest LL insanity. Because everyone self-selects who he or she follows, it's a reasonable assumption that most followers care at least to some degree about the water cooler remarks.

But this makes Twitter a terrible medium for fan interaction. With, say, a @neilhimself, with a million or so followers, it's no longer a virtual water cooler; it's the head of your company using the PA system to tell you his random thoughts. I'm not saying it's not useful, because fans might want to know the stuff in the tweets. But when the fan tries to interact? Even if only 1% decide to reply to any given tweet, that's 10,000 replies to Neil Gaiman. Twenty tweets a day? He's looking at 200,000 daily pieces of short fan mail. If you want him writing any more fiction, you'd better hope he ignores almost all replies.
Emilly Orr said…
Mr. Dagger,

My math is fail. But that further narrows things, because while some cities (surprisingly, Spokane, WA is one of them) have buildings dating back to the 1850s, I can't think of a single community I've lived in that has a graveyard with dates going back even to 1800!

(It must be noted, how'ver, the farthest east I've been is Colorado.)
Emilly Orr said…
Edward,

That's exactly what I'm talking about. Now that you reference it, I remember reading about it, but most public education mentions the Big Thirteen and moves up from there.

(Well, that, and I tend to edit out Florida anyway. Strange things come from there.)
Emilly Orr said…
Miss Jameson,

Thankfully, 'Twitterstorms' seem to be relatively brief when they start, too--I've had fights in ISC chat last longer.

But you're right--mostly I RT rather than reply to people who have more public-oriented lives (Amanda Palmer being the exception, but even then, I rarely reply back to her), but I'm an atypical fan. Most people have no issue with responding directly, sending fan letters later, personal emails...and even if 90% of that is supportive, the inflow must be daunting, as you mentioned.