I'm having an agreeable yuppie fantasy today.
That probably sounds contradictory. What it involves is a little window in the corner of my second monitor that shows me the exchange rate between the UK pound and the NZ dollar, and -- and this is the nifty bit -- updates itself once per minute. This window is provided courtesy of xe.com, a lovely little site that epitomises the service industry ideal of doing one thing and doing it well. And, most importantly, for free.
This rate is of interest to me because I still have a noticeable amount of money in the UK, and I've been meaning to bring it over, for house-buying purposes, when the rate looks good. As I type this, 1GBP would buy me NZ$2.68159, up from $2.67496 when I first checked this morning.
Over the day, the pound has been trending slightly upwards. But every time I watch it actually make its update, the pound weakens (which is bad). It's almost touched 2.686, before dropping back again when I got too excited and began watching too closely.
This is, of course, entirely in keeping with the known laws of economics, and will come as no surprise to anyone.
When the rate hits 2.71 -- which, for the record, I don't think is going to happen this week -- I'll call my trader and finally switch over my millions from sterling. That's the agreeable fantasy part, anyway. In the meantime, however, it really is pretty exciting to watch it out of the corner of my eye.
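The watch-and-wait routine above boils down to a tiny polling loop: check the rate once a minute, and raise the alarm when it crosses the trigger level. Here's a minimal sketch of that logic in Python -- note that `fetch_rate` is a stub returning a canned value (the real widget is xe.com's own; I'm not pretending to know their API), so only the threshold logic is real:

```python
import time

THRESHOLD = 2.71  # the GBP->NZD rate at which I call my trader


def fetch_rate():
    """Stub for the once-a-minute rate lookup.

    A real widget would query a live service here; this stand-in
    just returns this morning's rate for illustration.
    """
    return 2.68159


def should_transfer(rate, threshold=THRESHOLD):
    """True once the rate has hit the trigger level."""
    return rate >= threshold


def watch(poll_seconds=60, polls=1):
    """Poll the rate; return it if it crosses the threshold, else None."""
    for _ in range(polls):
        rate = fetch_rate()
        if should_transfer(rate):
            return rate
        time.sleep(poll_seconds)
    return None
```

With the stubbed rate stuck at 2.68159, `watch()` dutifully returns `None` -- which, per the known laws of economics, is exactly what happens whenever you look at it.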
(D'oh! Just lost another $200.)
So here's this information that, when I was entering the job market, would have been available only to braying young wine-bar-haunting gits with bicycle clips on their sleeves, whose employers paid more for the service than for the people to watch it... and today, it's freely available to anyone with a broadband connection. Today, a mere twenty years too late to make my fortune, I get to play at being a yuppie for free, without even having to take time out from my real job.
That's so cool.
Addendum (added Thursday lunchtime): So much for my powers of prediction. Today the pound is over $2.72. I'm not pushing my luck any further.
Wednesday, January 28, 2009
Tuesday, January 27, 2009
House hunting, year 2
It's curious, how different people's perceptions can be.
Looking out of the upstairs window from 4/33 Rukutai Street, Susan -- already unsympathetic -- muttered to me: "There's a stagnant stream outside."
There was indeed one of those water features that estate agents persist in calling "a stream", but look to me more like a poorly-finished ditch. I pointed it out to the agent.
"What's the problem? We're well above sea level here," he shrugged.
"Sea level be damned," I retorted. "What about mosquito level?"
"Ah, that's only a problem with stagnant water. That out there, that's running water."
I looked again at the water. A lethargic 'skater twitched its way across the surface.
"That's running, is it?"
Babbling, burbling, gurgling, rushing -- all of these would be entirely false descriptions. Truer descriptions would feature terms such as "ochre", "fecund" and, I felt, quite possibly "leprous". This water wasn't running, it wasn't even sauntering, I seriously doubted if it so much as got out of bed.
Have I mentioned what lying bastards estate agents are?
Stiffening drinks
"Moderate" drinking may actually help men perform in the sack, according to a study from the Department of Credulous Research at the University of Western Australia. It's a finding that runs so counter to popular belief that it is hard even to consider taking it seriously.
The study, sponsored (I'm guessing) by Fosters, is being variously reported in Australia, the UK and, for some reason, India. It's barely made a ripple in the USA, and New Zealand media has yet to pick it up. I'm wondering if this reflects the varying states of progress in the War on Fun, as staged in each country. I recall that, in Britain, there was a lot of fuss -- many years ago now -- about drinks manufacturers "suggesting", in their marketing, that a drink might be stiff in more ways than one...
The trouble with this is that, as an issue becomes politicised, it becomes impossible to write about it without taking a political stance. And it might ally you with people, such as brewers and advertising companies, that you'd rather not be associated with. So journalists will be put off such stories, even if they are real, for fear that covering them might damage their own carefully cultivated, politically-aware image.
Not that I'm sure this story is real. The story is based, New Scientist tells us, on an "anonymous postal survey" of 1770 Western Australian men. (Other sources report a sample size of 1580. Why the discrepancy?) I'm no epidemiologist, but "postal survey" doesn't sound very authoritative -- like "phone survey", but worse. Then there's the possibility that Western Australia might have cultural, social, economic or environmental factors that make it anomalous...
All of which just goes to show how badly I'm under the thumb of conventional, Shakespearean physick. I'm looking intently for flaws in the study; loth to accept that it might actually be valid. Because that would mean I might have been wrong.
Friday, January 23, 2009
The Internet is still evil
I lost a friend today. Well, that may be over-dramatic. I hope so. I'm quite optimistic that "lost" may be a temporary condition. And as for "friend"...
There are a few words coined to describe that curious, but still genuine, feeling that grows up between people who only know one another on the 'net. "Cyber-friend" makes her sound like a cutesy, friendly, prototype unstoppable futuristic assassin, and that's way too many adjectives for anyone. As for "e-pal", "netpal" -- surely "-pal" is only a valid relationship for under-13s. "Mate", "buddy", "chum"? Don't go there. "Online friend"? Just lacks punch.
Ho hum.
Whatever you call them, I have a few. Some appear in the list of blogs I follow or who follow this one -- and I really do appreciate all of you, folks. Please don't vanish. Others, for various reasons, don't appear in any of those lists, but I'm always aware of their presence, and pathetically grateful when they take time to comment or e-mail.
JasmineArdent was one of my favourite writers from thisisby.us. Frighteningly intelligent, perceptive and confident. And she liked to argue, as much as I do; she wasn't shy to speak her mind on whatever topic someone happened to be ranting about today. When she agreed with me, which was more often than not, I felt validated. When she argued, I felt stimulated to examine my position in depth. There's not many people who can make me feel like that.
And when I saw that her account had vanished...
It was a little like the feeling you get when you're strolling carelessly across a field on a sunny morning, and you look up to see that one of the cattle that had been dispassionately observing you from a distance has put down its head and is now running towards you. Or, miles from home, you reach the place where you know you left your car, but it's not there. (Yes, I've had both of those experiences.) In a moment, in the twinkling of an eye, the whole balance of the world changes.
I quit TIBU the same day, although I left my writing up. And, via e-mail, I talked Jasmine into starting a blog.
She was never too convinced by the idea. Blogging isn't the same as taking part in a hothouse, semi-closed community like TIBU. The daily stimulation, the constant feedback -- just aren't there.
And today I discovered that her blog has gone, too.
I hope she gets in touch. But if she doesn't... Has she been forced to delete it for professional reasons? personal reasons? has she joined a cult, or walked under a bus, or gone to work for the FBI, or in witness protection? -- from my perspective, there's no way of telling. I've lost e-friends before that way ("e-friends"? not as ugly, but kinda -- trivial), and it's always disappointing. But there's nothing I can do about it.
Be well, Jasmine. Write soon.
Thursday, January 22, 2009
So much for history
There's an awful lot of silly[1] talked about blogging. There's people who hate it, on the well-founded but rather illogical basis that bloggers talk way too much trivial claptrap, without pausing to wonder if anyone else might find it worth reading. But really that's just a retread of the same argument we, as a culture, had when writing was first invented; and again with paper, and the printing press, and television, and computers.
[1] Brought to you by the Campaign for Intuitive Nouns
So a lot of bollocks gets written. So what? "Ninety percent of everything is crap" said a wise man, years before the Internet was even conceived.
Then there's the even sillier opposite extreme, which holds that bloggers have made journalists obsolete. Yeah, like the light bulb made the sun obsolete. Of course there are functions sometimes performed by journalists that can, in principle, be done better by bloggers. And occasionally, it's true, a blogger does succeed in upstaging Old Media. But in general, bloggers quote a lot more from Old Media than vice versa.
Which kind of brings me to today's whine.
President St Barack's ascension to the thr... sorry, I mean "accession to the White House" -- and his very first act in office, before the cheers from his inaugural speech had died down, was to redesign the White House website.
But more has happened here than a redesign. Hundreds of thousands of pages of history have vanished. And millions of blog pages have been altered, in ways their authors will probably never even notice.
For instance, suppose you referred to President Bush's infamous "Mission Accomplished" speech from May 2003. You can find lots of accounts of that speech, mostly written and edited by unsympathetic commentators, all over the net; but the obvious authoritative source to link to is Bush's own press office.
But if you did that, tough luck. It's not there any more. If you click on the Google link that points to it, you find yourself in Obama's "Briefing Room".
I'm sure the text of this speech, and the other bazillions of press releases put out by the Bush White House, still exists in thousands of places. White House archives, Bush's personal files, more libraries than I can readily imagine. But none of those are as easily and simply accessible as the White House website.
Would it hurt to leave it there? After all, the date is built into the URL -- there's no danger of it getting in the way of Obama's speeches. It might pollute search results -- but it would be a trivial matter to set up the site search engine to prevent that.
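That "trivial matter" really is trivial. Because the old releases carry their dates in the URL, a site search engine only needs one filter to keep them out of current results while leaving the pages themselves online and linkable. A sketch, in Python -- the `/news/releases/YYYY/MM/` path layout here is an assumption for illustration, not a claim about how whitehouse.gov's search actually works:

```python
import re

# Hypothetical archive layout: dated releases live under /news/releases/YYYY/MM/.
DATED_RELEASE = re.compile(r"/news/releases/(\d{4})/\d{2}/")


def exclude_old_releases(urls, first_year_of_current_term):
    """Filter search results: drop dated releases from before the
    current term, keep everything else. The excluded pages stay
    online and reachable by direct link -- they just aren't searched."""
    kept = []
    for url in urls:
        match = DATED_RELEASE.search(url)
        if match and int(match.group(1)) < first_year_of_current_term:
            continue  # archived release from a previous administration
        kept.append(url)
    return kept
```

A one-function fix, in other words. The decision to delete the pages instead of filtering them was exactly that: a decision.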
What bothers me is that by sweeping Bush's own account under the electronic rug, Obama's people have cleared the way to rewrite the history of the Bush years in their own words.
Of course, this Orwellian process is not new. Bush did the same to all of Clinton's releases, when he first moved in. Good luck finding out, now, what Clinton originally told the nation about (for instance) the bombing of Krajina. But the Internet was younger then; blogging was in its infancy, and the practice of linking to sources was not nearly so established. Bush could, reasonably -- indeed, convincingly -- plead ignorance of what he did.
Obama, on the other hand, has already shown that he understands the importance of history-as-written. He's the first president, I believe, to have published two autobiographies before he even got the top job, thus retroactively turning his entire life into a presidential campaign. Consider this line from his new-look website: "President Obama swiftly responded to Hurricane Katrina."
What the... how the... ?
Okay -- from today's perspective, we are at no risk of understanding that to mean what it appears to say. But in twenty, thirty years' time, when both Katrina and Obama are fading memories, who knows how it will be parsed?
See, I remember another leader who surfed into power on the crest of the wave of hope and the promise of change, who began his term by clearing out the dead wood of the (thoroughly discredited) preceding administration. A leader who made it his first priority to reform the sickly organs of government, to lance the ideologically contaminated pustules of his devastatingly unpopular predecessors -- and above all, to rewrite recent history in his own terms.
And for all the parallels with Kennedy, I can't help but think that the name that flickers in my mind, as most closely resembling Obama in political style, is Tony Blair.
Monday, January 19, 2009
Oxymoron of the week:
"Indigenous immigrants".
I'm not sure if these people -- that is, the people who wrote and published those words -- simply have no idea of what the words mean, or if they have deeper conceptual problems with the idea that an individual word can have any such thing as "meaning" at all. There's a research paper waiting to be written there.
This idea provided free for anyone who wants to run with it.
Presence of mind
I saw this French film once -- I was just channel-hopping late one night, and there it was...
It was a gritty, violent story, featuring a group of urban yoof. What I remember is a scene where four or five of these young thugs are in a public toilet, talking about some extremely illegal deeds -- I don't remember the details, but it was something of the order of multiple murder -- you know, the kind of laws that the police will normally make at least some bona-fide effort to enforce. The kids are excited, they're desperate, they're expecting and ready for any kind of trouble. And in the middle of their conversation, a toilet flushes behind them; the cubicle door opens, and an elderly man -- maybe 70 or so -- shambles out.
As the toughs turn to look at him, he says (a line that was translated in the subtitles as): "Nothing like a really good shit."
It was the air of immense satisfaction with which he delivered this line that probably saved his life.
As he shuffles over to the washbasins next to the lads and washes his hands, he delivers a rambling but eloquent story of someone, perhaps an ancestor of his, who was on a train en route to Siberia when, at one stop, he got off to take a dump, and got so caught up in the excitement that the train rumbled off without him. Leaving him stranded in the freezing, unpeopled wastes of central Russia with his trousers down, and not another train due for weeks. But it was worth it, the old man feels, for the sake of taking the time to clear one's bowels properly.
Throughout this speech, the thugs stare open-mouthed, their feet fixed as if rooted, only their heads turning to follow the old man's movements about the room. It was as if their own lives, their own crimes, their own fear and anger and suspicion, were completely forgotten, for the time it took the old man to wash and dry his hands, leisurely, and then shuffle out of the door.
I wish I knew what the movie was called, because that one scene has stayed with me now for at least 15 years. And the older I get, the more I sympathise with his story. It's my constant hope that, come the day when I find myself in a life-threatening gang-based situation, I acquit myself with as much presence of mind as that old man.
Thursday, January 15, 2009
Intellectual vandalism
As a student, I read some book -- I think Michael Scott Rohan's original Winter of the World trilogy -- in which the hero, a blacksmith, had his legs deliberately broken to keep him from leaving town.
I thought then that this was the essence of barbarism. To deliberately maim someone -- to intentionally make an able body less able -- and to do it, for no better reason than that you didn't want the burden of having to treat them decently. Surely, I thought, this is what the Law is really for: to protect us from such arbitrary and selfish uses of power.
In 1989, the 80486 computer chip came onto the market -- the fourth in the series that began with the 8086, and direct ancestor of the various types of Pentiums that probably power the computer you're reading this on. It was shockingly expensive. But among its advances over its predecessor, the 80386, it had a floating-point co-processor built into the chip itself.
I'd always wanted one of those. It meant you could play games that involved drawing detailed graphics on the screen in real time. Things like flight simulators. I waited, eagerly, for the price to come down to the point where I could afford one.
In 1991, the chip manufacturer, Intel, produced the cheap version: the 80486SX. As everyone knew in those days, the "-SX" suffix meant "cut-down"; in this case, it meant "without the co-processor". To me that seemed the epitome of pointlessness. But by then I was a technical journalist, and it was my business to read lots of reports and writings about developments just like this. And I learned something that shocked me deeply:
The 486SX did have the co-processor built onto the chip -- it was exactly the same chip, built on the same lines in the same factory as the full version -- but the co-processor was artificially disabled. Like Rohan's blacksmith hero, the chip had been deliberately crippled to make it less useful.
In my economics lessons, I'd learned that the purpose of work is to add value to something. Everyone who adds value makes the world a slightly better, or at least richer, place; somebody, somewhere, gains some utility that they would not otherwise have had.
So what should I think of people who work, on purpose, to make a product less useful?
To me that seemed, and still seems, no better than vandalism, or at best theft. I can understand the motivations -- but then I can understand the motives of vandals and thieves, too, and it doesn't mean I accept them as legitimate.
And today the same argument is going on with a much higher profile, although no-one seems to recognise it.
It's called "digital rights management". What it means, in a nutshell, is that publishers add bits of code to their products that prevent them from being used in ways they otherwise could.
Now, the amount of virtual ink that's been spilt in debating the rights and wrongs of DRM in general is, approximately, enough to fill the Atlantic. You don't have to look far to find rants or measured opinions on either side of the subject, including some by yours truly. (I have mostly taken the position that publishers have no moral or legal right to do most of the things they do. The vast majority of their measures are used to enforce "rights" that the law was never meant to grant them in the first place.)
But that's not my real objection. That's just the legalistic formulation of the underlying problem, which is a moral one. It is intrinsically wrong, I believe, to devote time and effort to make your product less useful than it would be if you didn't. That means you're working to make the world a poorer place. And that means you're a vandal, and you belong in jail.
But our economic system is so perverse that it rewards such behaviour. Even worse: our laws are being perverted to not merely protect, but actively support, it. To my mind, that's enough to discredit the entire consumerist economic system; what we are fighting over now is whether the legal and political systems that go with it can be redeemed. They're meant to protect us, the consumers, from this kind of abuse. If they won't do that, what are they good for?
But that's not my real objection. That's just the legalistic formulation of the underlying problem, which is a moral one. It is intrinsically wrong, I believe, to devote time and effort to make your product less useful than it would be if you didn't. That means you're working to make the world a poorer place. And that means you're a vandal, and you belong in jail.
But our economic system is so perverse that it rewards such behaviour. Even worse: our laws are being perverted to not merely protect, but actively support, it. To my mind, that's enough to discredit the entire consumerist economic system; what we are fighting over now is whether the legal and political systems that go with it can be redeemed. They're meant to protect us, the consumers, from this kind of abuse. If they won't do that, what are they good for?
Wednesday, January 14, 2009
"These are not the facts you're looking for"
"Open-plan offices are bad for productivity and employee health", says an Australian study.
Now, I've been a technical journalist. I know how the news reports "studies". And this is one of those (countless) cases where I'd really like to see the parameters, as well as the results, of the original study. Because "open-plan office" can mean pretty much anything.
"The research found that the traditional design was better - small, private closed offices." There's that whore of a word, "traditional", again. What does it mean this time? How many industries have "traditionally" given people private offices?
Not journalism, for sure. Every newsroom in the world, I believe, is and always has been open-plan. It's fitting that the news should be written in a noisy, distracting environment, because that's just how most people will be reading it.
Education. How many schools give each teacher their own private office? Not many, where I come from. Heck, even at university, my tutor had to share an office.
Police officers. Hospital doctors and nurses. Travel agents. Bankers. Air traffic controllers. The 25% or so of the population that works in manufacturing or construction industries. The much higher number that works in retail. What percentage of these spend most of their working hours in their own private offices? And yet the expectation exists -- as if we all, somehow, thought we were destined to be treated like middle-managers in some 60s sitcom. In some sense, it feels almost as if we should expect that. A private office is a sign of status, of privilege, something we're somehow supposed to aspire to.
And that, of course, is how the issue has become politicised, and that's why I'm going to treat any study on the subject with grave suspicion. I now want to know, not only the complete employment history of these researchers, but also who funded them.
"The problem is that employers are always looking for ways to cut costs, and using open-plan designs can save 20% on construction."
And what about rent? Even allowing for the extra meeting rooms and social spaces you need to provide, you can still fit at least twice as many generous-sized desks in a given floor space if you don't have to lay out a whole room around each one.
I've spent a lot of my life working in at least four very different open-plan offices. (Well, six really, but for just four different companies.) In publishing, there is no doubt that open-plan is the only sane layout to have; if anyone in that industry claims that their colleagues are distracting them from their work, they haven't understood what their work is.
In software, the picture is more mixed. There's a persuasive argument for letting team members communicate with minimal barriers. In this view, if you put someone in a position where they have to stand up to communicate with team-mates, you cut down the flow of information by at least two-thirds; if you force them to get up and walk somewhere, it's more than 90%. But (the counter-argument goes), in practice these informal, personal communication channels don't scale well; they're fine when there's just three or four people in the room, but at any size bigger than that, someone will be left out of the loop. You need more formal communication, less reliance on informal.
On the other hand, there's an argument (put forcefully by, among others, Joel Spolsky) that says -- backed up by a whole pile of alleged facts -- that programmers in private offices are "more productive". Microsoft, apparently, is famous for putting all its programmers in private offices. Which just goes to show that productivity is no guarantee of quality.
I've enjoyed working in an open-plan office. It was quiet, because there was rigid self-discipline on all sides about talking; and the lack of privacy didn't bother me, because we all had a degree of trust in our management. You could spend a week doing nothing but surf the web, without feeling the need to conceal it; people would assume either you were working, or you were legitimately trying to work but simply blocked for the moment. Motivation was high, productivity was high.
But I've also seen the opposite. Partitions, stifling that easy, collegial non-verbal communication, while doing nothing to cut sound or provide privacy. Three or four conversations going on at once, sometimes across several intervening desks. Managers who make no pretence that they even want to know what we're doing, as long as it's work. Morale low, productivity suffers.
How many of these factors has the study taken into account? I don't know, and I doubt if I could find out, even if I tried. It's a politicised issue, and it's not hard to frame the parameters of any study to make sure it comes up with the right answer.
Monday, January 12, 2009
Dumbing-down for idiots
So I'm innocently rummaging about YouTube when I encounter, for the first time, the name of Charlotte Iserbyt, author of The Deliberate Dumbing-Down of America.
I've always been kinda wary of people who talk about "dumbing down". To me it's a code-phrase that translates, roughly, as "I'm perfect, therefore any change in the system that produced me must axiomatically be for the worse". But this woman has credentials. She's a "former Senior Policy Advisor in the US Department of Education". She's author of a book billed as "without doubt one of the most important publishing events in the annals of American education in the last hundred years." A "Barnes & Noble #1 Bestseller", no less. I was curious to see how you earn that kind of publicity.
Looks like the easiest way is to write it yourself.
False modesty is not one of Iserbyt's faults. In the first chapter of her tedious diatribe, she sets the tone by informing us: "Undoubtedly, this chapter may be one of the most important since the philosophies of Jean-Jacques Rousseau, Wilhelm Wundt, and John Dewey et al."
Excuse me?
Actually, on a closer reading, she's not claiming what I first thought she was trying to claim. What she wants to say, what she would say if she were capable of stringing together a coherent thought, is more like: "Although short, this chapter lays the key philosophical foundation for the whole book. It explains how the philosophies of Rousseau, Wundt, and Dewey et al. reflect a total departure from the traditional definition of education."
"Traditional definition of education"? Yes, that's her own phrase. And that -- fully two sentences into the thesis of this 738-page essay -- is where the gangrene sets in. She cites The New Century Dictionary of the English Language (Appleton-Century-Crofts: New York, 1927):
The drawing out of a person's innate talents and abilities by imparting the knowledge of languages, scientific reasoning, history, literature, rhetoric, etc. -- the channels through which those abilities would flourish and serve.
Hmmm... 1927. What exactly makes Messrs Appleton-Century-Crofts more authentically "traditional" than Rousseau, who predated them by more than a century and a half? If their ideas differ from Rousseau's, doesn't that imply that they are rejecting him, not vice-versa?
This she contrasts with "the new, dehumanizing definition" used by twentieth-century educational psychologists. The remainder of her book is packed with countless interpretations of this "definition". What she seems to object to, in them, is summed up in Prof Benjamin Bloom's dictum: "The purpose of education is to change the thoughts, feelings and actions of students."
Of course, to define education is a challenge that has flummoxed far better minds than Iserbyt's, over the centuries. Charles Dickens, for instance, devoted a lot of ink to ridiculing the educational theories of his day, which favoured the teaching of hard facts (Hard Times) and practicalities (Nicholas Nickleby). The closest he comes to approving of any schooling is the classics-based, "character-forming" liberal education received by David Copperfield.
Waitaminute -- "character-forming"? Wouldn't that be another word for "changing the thoughts, feelings and actions" of the subject? Admittedly, Copperfield's character is formed less by his teachers than by his peers in the school; but still, given the paragon that he grows into, Dickens must feel that his school is doing something right.
Or let's see what Montaigne has to say of education. Turns out, he also favours a, make no bones about it, liberal education with the emphasis on the teaching of sound and virtuous habits of mind, and with the pupil given great freedom in determining and shaping his own learning; he advocates "a tutor, who has rather a well-made than a well-filled head".
Let's go back further: beyond Dickens, beyond Rousseau and Montaigne. Plato believed that the highest purpose of education is "knowledge of the Good": that is to say, the ability to recognise the quality of virtue itself, the essential property that makes something -- anything -- "right" or "wrong". Or, in other words: it is to instil a moral sense. To change not merely the "thoughts, feelings and actions", but the very values and desires of the students.
Iserbyt's "traditional" definition of education is, in fact, anything but: it is a deliberate rejection of the ideas of those, in earlier generations, who have bothered to think about the subject. Like so many nineteenth-century "traditions", it was made new and radical by and for the Victorians; it is a dysfunctional product of a dysfunctional age.
In case you're curious about the other 737 pages of Iserbyt's work: I have, at considerable personal discomfort, skimmed through the meat of it (about 425 of those pages). She seems intent on proving that US education policy in the 20th century has been hijacked by Communists (she points the finger in particular at such anti-American subversives as Roosevelt and Eisenhower, although she also regards Reagan, Bush and Nixon as pansy-waisted sell-outs), and directed to the aim of selling out the US to some kind of ill-defined supra-national government. The cast of the conspiracy, and the precise nature of this "government", shifts from chapter to chapter: but she is in no doubt. Whoever They are, They're all in it together: traitors to her country.
Whatever that is.
In her YouTube video, shot by who-knows-whom, she casts herself as a naif suffering from a devastating failure of self-awareness. She is outraged to learn that to hold a political position, she is expected to know about practical politics. She is aghast to find herself being trained in the rudimentary techniques of persuasion and crowd manipulation -- tactics that will be familiar to anyone who's ever worked for a company with more than 100 employees. She is indignant that teachers are expected to "challenge and change" their students' views.
Among the training she received was how to silence "resisters" (dissidents) by enlisting them to serve on committees, flattering them that their opinions are important, while still ignoring them. I think the most telling testimony to her own obliviousness is this: that it still, to this day, doesn't seem to have occurred to her how she became a "policy adviser"...
I don't know if she's insane, an idiot, or just plain lying. I do know that her YouTube video has been viewed more than 50,000 times, and has a five-star average rating.
Thursday, January 8, 2009
Summer
It's a perfect day outside.
The sky is blue from horizon to horizon -- not a wisp of cloud, not so much as a contrail in sight. The breeze is just strong enough to freshen the air, and carries the scent of mown grass. I, regretfully, have to work, but many people are still on holiday, so the traffic is light and the streets half-empty.
All of which just leaves me wondering: why aren't more of you here? What are you waiting for?
Wednesday, January 7, 2009
Sanity test
If you've been awake during the past month, you may possibly have heard that the band Coldplay is being sued for infringing the copyright of guitarist Joe Satriani. Seems that Coldplay's hugely popular hit "Viva la Vida" uses the same melody and chord progression as Satriani's (hitherto) little-known "If I Could Fly".
Coldplay denies it. Not strenuously, not hotly, but quietly and firmly and with surprising dignity for a rock band. (Probably time to declare my interest here: I personally like the Coldplay song.)
YouTube is crawling with commentaries, combinations and comparisons of the two songs. To me, the best of these is this offering, which digs up no less than five previous versions of the same melody, the earliest dating right back to the 1960s. It's been said that there are only seven basic notes in music, and moreover there are fairly strong rules about how you can combine them in any given genre. So who, exactly, is going to say whose contribution deserves to be considered most important?
To me, this case is ever so slightly terrifying.
In Coldplay's favour: their song is substantially original, it's been a colossal popular (and critical) hit, they've got plenty of money to fight the case, and there is no evidence that Coldplay had ever heard Satriani's recording. In Satriani's favour: the melodies, rhythms and chord progressions are more than averagely similar. And -- and this is the point that I think has been largely overlooked -- he's American. There are lots of people commenting on those YouTube videos who think that the case is a slam-dunk for Satriani.
If Satriani wins, that will imply a reversal of the burden of proof in copyright infringement cases: the onus will be on songwriters to prove that they haven't "copied" another's work. I would see such an outcome as vindication of my "imaginary frontier" theory: that the American legal establishment is engaged in the biggest land grab in history, claiming absolute sovereignty over the entire realm of "intellectual property".
So what should happen here?
Well, for starters, copyright needs to loosen up. A lot.
Until very recently, if someone copied a basic theme, made their own changes and sold the work as their own, no-one turned a hair. So long as the changes were sufficient to count as an "original" contribution, the attitude was: "good on them, for enriching all our lives". The movie Clueless fails to credit Jane Austen for the original story, but nobody accused it of plagiarism. And that attitude has, so far, failed to lead to the death of artistic creativity. On the contrary, it encourages it: there's nothing more stifling to creativity than constantly policing your own thoughts against the possibility of "stealing" someone else's idea.
But the bar for plagiarism is slowly creeping downward. And that, it seems to me, really will lead to the end of creativity.
Make the most of art while we have it. If Van Gogh were working in today's legal climate, he'd have been sued by Cezanne for copying his style.
Tuesday, January 6, 2009
Can we dance on the grave yet?
I'm indebted to my friend Wisco, over at Griper Blade, for pointing me in the direction of this little gem (warning: PDF, 15Mb) from the White House website.
It's President Bush's own spin-doctors' picture of his "legacy". Frankly I don't know whether to mock or vomit. In the interests of brevity, I'll confine myself to the one cheap shot.
"Since the terrorist attacks of 9/11, the President has carried with him a symbol of sacrifice and courage: a police shield given to him by the mother of an officer who died at the World Trade Center."
I can't help but wonder what a police shield is meant to be a symbol of. "Protego et ministro" hardly seems to describe what GWB has done for his country. "Fabricate diem, pvnc" would come closer, in spirit at least.
Fitness equipment for geeks
Our star Christmas prezzie to ourselves was a Wii Fit.
For those of you who aren't entirely au fait with this latest in couch-potato fitness accessories: a Wii Fit is, basically, a very sensitive electronic scale that connects to your Wii (games console) and encourages you to do all kinds of exciting physical stuff to reduce your weight. You stand on the doodad and perform yoga, muscle, balance or aerobic exercises (per instruction) as the mood takes you; it tots up your time and chimes a happy validating little chord when you've clocked up half an hour.
Yogic deep breathing, for instance, can earn you two minutes. Jogging (running on the spot) can be variable length -- novices like me get a five-minute course, but by June I daresay I'll be up to the half-marathon level. And it has the advantage, over real jogging, that nobody can see you. Step exercises can earn you anything up to ten minutes in a single hit.
Then there's the balance games, which are video games controlled by leaning your body this way, that way and, for good measure, the other way. Skiing, ski-jumping, snowboarding, a penguin on the ice -- these frosty occupations will earn you about two minutes per hit -- for playing games. While you play, the trusty scale tracks your centre of gravity and uses this information to control the game.
Then it judges me -- assigns my "Wii Fit age" -- based on my powers of balance. The ability to stand perfectly still is highly prized in Wiiland. And it's harder than I'd thought.
Now, the first thing the scale told me, when I loaded myself onto it on Christmas Day, was that I was overweight. Bastard, I thought, and set myself a target to shed two kilos within two months, which will put me well within the "ideal" weight range.
Since then, my weight's been up and down like a stock market. But it has shown a slightly downward trend. On Sunday it told me, for the first time since I started, that I was no longer overweight.
I presume that's because I was taking the test before breakfast. (The device says that I probably fluctuate by about 1 kilo over the course of a day, so for consistency it's best to weigh oneself at around the same time daily.) Since then, I've been avoiding it. I figure the next news is probably going to be bad, so why rush it?
Interesting things I've noted:
- I tend to gain weight on working days. No surprise there: going to work is bad for me. It's official.
- By far the most effective ways of losing those Christmas kilos don't involve exercising at all: they're bathroom-related. (A long soak in a hot bath removes an astonishing amount of fat. What did you think I was talking about?)
Susan, however, has been going nuts. She's probably clocked enough hours on the Wii Fit to earn some kind of pilot's license. While on holiday -- from Christmas until this Monday -- she would take the test three, four times a day. And because she was burning so much energy, of course, she's been mainlining candy canes and similar health-giving foods.
So although she's unquestionably toned and beautiful, she's not losing weight. That's not a problem in theory -- she's already comfortably in the "ideal" region -- but the damn' device itself encourages us to set goals for weight change. "Staying much the same" isn't an option, as far as I can see.
On the whole, it's quite fun. And it speaks with a good English accent, which is nice. But I do wish it could be localised to the southern hemisphere. Being told to watch what I eat during these winter months is mildly annoying; come June, I anticipate being told to get out and enjoy the sunshine, which will be irritating beyond measure.
Friday, January 2, 2009
He came down from tumpty tumpty
The BBC is ridiculously proud of its oldest Christmas tradition: Carols from King's, a highest-of-high-church festival of carols and bible readings. For 80 years, the Beeb has been broadcasting it every Christmas Eve.
I heard it circa 1993, at 3 p.m. on Christmas Eve while driving home, and for the first time, I understood the appeal of church choirs. As soon as the first soloist struck up, the pure countertenor voice seemed to take me back to a time when Christmas wasn't a matter of preparation and panic, but anticipation and peace. Ever since, I've tried to time my Christmas Eve journeys to catch that broadcast. When the service begins, I can feel once again the peace that comes with knowing that -- Christmas is come, my deadline is passed, there is nothing more to do save drive through the gathering dusk, on fast-emptying roads, to journey's end, warmth and welcome. Home.
That's what "Carols from King's" means to me. Or rather, meant. (I have no interest in the TV broadcast. You listen while you're doing something else -- that's the whole point.)
Because the BBC, with its astonishingly lopsided sense of priorities, doesn't podcast it.
Of course there may be sound business reasons for that. It might worry that a podcast would damage its CD sales, but the poxy CD versions offer only the carols, not the readings; otherwise I'd buy one with pleasure. It's broadcast on the World Service, but -- well, if you can find a schedule and frequency details for that in New Zealand, let me know. ("Asia-Pacific" forsooth.)
This Christmas I tried to look at the New Zealand equivalent. The choir sang carols, for want of a better word, that I've never heard (and frankly never want to hear again), and the "lessons" were delivered by moralising clerics talking about doing good in the Real World. Very worthy, and tortuously dull.
This morning I was dreaming about setting up an equivalent carol service in New Zealand. We were rehearsing the traditional processional carol, "Once in Royal David's City", and I was following the words on a hymn sheet, when I found myself reading a verse I'd never heard. It was beautiful: it fit the meaning, rhythm and rhyme of the carol, it was haunting and lovely and perfect, and when I woke I couldn't remember a word of it.
All I know now is that my subconscious can write better poetry than I can.
Happy New Year, everyone.