Archive for the ‘critique’ Category
Alexander McQueen: Savage Beauty coming to the UK? – Telegraph
Savage Beauty is a remarkable exhibit, brilliantly curated, of one of the great artists of the fin-de-siècle. Surely he is England’s (nae, Scotland’s) greatest of the last 50, even 100, years in terms of power, vision, and execution.
And I never even saw the exhibit, just bought the catalogue/book.
Open Cures: A Protocol Outline for Mitochondrial Protofection
The open (source) development model the authors advocate is, for them as biologists, predicated on the software industry. Good! But at the same time, as a historian, I cannot help but recall prior instances where peer production and collaboration have occurred. The difference is this: never before has so much of our culture (“our”) been “owned,” or faceted by the claims of property. It’s as if the Lockean promise of America (land of no property, just land) has evaporated, leaving only the claims to what was and is and will be there. (Thus goeth the future, too: it’s been commodified into the present banality of expectation.)
What is left, of course–for there is not and never has been a land without property (remember not the Alamo but the Indians? They were here, in America, even before the Europeans named them so wrongly and then killed them with disease, slavery, and war)–what is left is the need to consider property as the domain of collaboration, not a patent for exclusion, with the thing available only as a bracketed commodity on the market, to those who can afford it.
Maria Popova: In a new world of informational abundance, content curation is a new kind of authorship » Nieman Journalism Lab » Pushing to the Future of Journalism
Does anyone recall Gissing’s New Grub Street? The book ends with the successful writer embarking upon the new commuter magazine, Chit-Chat, and abandoning the long slog of novel writing and real thought. Chit-Chat opposes thought because it surfs (surfaces?) the top of the head and ignores the plumb of heart. It goes without saying that Gissing found this offensive, and equally, that he found it inevitable, and equally that he probably would have wanted to be the one winning, for none of the authors depicted in this risibly bleak Naturalist account is particularly rewarding or otherwise worth copying, and none is even as sexy or powerful as London’s Martin Eden, which offers a different (American?) Naturalist take.
Do I see Twitter as chit chat? It *is* the vehicle of fast talk and it can encourage the semblance of conversation, and conversation–fast–is fun. But tweeting is not real dialogue, just as we can always discern when someone talks to his phone vs his friend face to face. It’s speech snippified. Okay, no problem there. But it is also not particularly revolutionary, just as SMS wasn’t–and has effectively, already, disappeared.
People will always find faster, if not better (“better” is the wrong evaluative term), ways of communicating, or of seeming to communicate. “Satisfactory” comes to mind rather than “better.”
What do I like doing? I like being at parties or seminars or classes or meetings and holding multiple conversations simultaneously, especially if they are on wildly disparate topics; I like being more than one person at any time, I suppose, where “person” is the spun, woven thread of a single conversation. I like losing myself in talk.
I think a lot of people are like this. Twitter seeks to provide the means by which one can effectuate that. But it doesn’t really. I can see something like that one day doing it perfectly, but in the meantime, one has to wonder: what’s wrong with just having a party, symposium, class, seminar, meeting, conference? With seeing people. And talking.
Data Privacy, Put to the Test in a Supreme Court Case – NYTimes.com
This case is enormously important. One way of looking at it is not from the perspective of “control,” as in, “I want to control my public persona,” which puts a lot of weight on “control,” but from the similar dynamic of what it means to be(come) a commodity. It’s not quite the same as chattel slavery; hardly. The object in question, the consumer who leaves a trail of identity constituting the soul which holds him to account, is not being asked to do anything. But the loss of privacy here, which is tantamount to the commercialisation of public identity by others despite the will of the person (and we can further include the marketing of genetic code, blood, and other body tissues: what’s the difference?), is effectively a modern form of slavery by other means.
The point I’m trying to make is, oddly, not the libertarian one that we absolutely own our own selves, but rather the more complex one that we never do. And for that reason, we need to establish defensible boundaries of identity, much as we have set perimeters defining the limits of our home, our family, even “our” nation. All these are conventions, though “family” is also probably a thing that comes before and after convention. But its extent, and the degree to which we can act on and with it, are things of convention, things, that is, of social definition and negotiation.
I.B.M.’s Watson – Computers Close In on the ‘Paris Hilton’ Problem – NYTimes.com
It’s an interesting point, and IBM has earned the feather in its cap. But it’s really silly to think that the contest means anything beyond the immediate demonstration of sophisticated hardware.
And if there is anything to take from this it’s probably simply that our present education system is lousy, or that factoids do not make the human, or that the biggest dirty secret about what makes humans human is that they are cheap. Historically, humans have been by and large cheaper than beasts of burden (when those were available) and beasts of war. Humans can mostly follow instructions better than, say, a cat, but probably not as well as a smart border collie (and they are all smart). Ask a shepherd which he’d rather have: a stupid human or smart dog.
What has made humans distinct is the effort put into that question by religious organisations that have sought to distinguish humans from everything else; and not all religions, of course, have done this. Indeed, numerically speaking, most have not. So we can look instead to the simplest fact of human distinction: the conscious boasting of it (“I am, I think, I am, I think!”), coupled with humans’ amazing ability to hang around, decade after decade, while even the smartest dog will never see its third decade.