Friday, August 20, 2010

Prescription sugar

I read the other day that Britain's National Health Service sometimes pays for homeopathic prescriptions.

Blah blah, insert rant here about taxpayers' money being given to snake-oil merchants, undermining the authority of the medical profession, and generally reducing western civilisation to a smoking ruin. But the fact is, I can't get terribly worked up about this.

The Department of Health says it doesn't know how much of the NHS drugs budget is spent on snake oil, but that it's "a tiny fraction (approximately 0.001%)". How they can quote that percentage without knowing an actual figure is fair grounds for suspicion, but even if they're out by a couple of orders of magnitude, it still suggests that there are more important things to worry about. Namely, the other 99.9(99)% of the budget.

I'd bet that a considerably higher fraction of that budget goes to AstraZeneca, a company with a documented record of deliberately hiding the truth about its own products. And the company is not unusual in that; what it's been caught doing is entirely standard industry procedure.

This isn't even news. We've known it for a quarter of a century. And yet we continue to pay these companies to poison us and lie to us about it.

I have a simple suggestion for the NHS: it should refuse to pay for any drug or treatment that is covered by a patent. Patients who want those treatments should have to find some other way to pay for them. Generic drugs are vastly cheaper, better tested (by 20 years of actual use), and more honestly marketed. Let the NHS stick to those.

(Remember, patents expire after 20 years, so that would still allow every drug and treatment that was available before 1990, and medicine wasn't exactly in the dark ages then.)

Of course, no sooner would this rule be suggested than a thousand and one "patients' groups" would form to lobby, loudly, for this or that exception. And most of them wouldn't even pause to think that they were acting as pawns of the pharmaceutical industry.

I wonder if the directors of those companies ever have nightmares of creating the perfect drug - one that actually cures a condition, with no bad side-effects? Because that would be the death of their business model.

Wednesday, August 18, 2010

Games and fun

Games play (heh) an important part in my life.

I blame my big brother. He introduced me to Dungeons & Dragons at the impressionable age of, ooh, about 13. It was new and exciting, a fun game in those days, before the news media and fundamentalist Christians had heard of it; a friendly, co-operative game, bringing socially inept geeks together in their common love of fantasy and fairy tales.

D&D has come a long way in the 30 years since then (mostly backwards, at least for the players, although obviously it's kept the publishers in business). But it's also given rise to an entirely new genre: the computer FRPG.

It's not widely appreciated, even by players of both, how different the computer FRPG is from the version with pencils and paper and polyhedral dice. Some people think that writing the perfect computer FRPG is just a matter of coding a big enough world, and faithfully and meticulously translating enough of the rules. But this thinking ignores the most important rule of all, which is the core of the difference.

Put simply: the pencil-and-paper game is a social event - a bunch of people get together to have fun. The computer version is a storytelling medium - the "author" draws up a plot, and the player's job is to find a way through it. There may be many different ways, some arriving at slightly different endpoints, but the basic framework and goal are not negotiable. (True, most human DMs start out with a similar "plot" in mind. But good ones will change it as they go along; only bad DMs will try to force their players to stick to a framework they planned from the start. A well-run game is not so much a story as a symposium.)

And that's why the computer FRPG is a form of art, no matter what Roger Ebert says on the subject - because it's not like the social game played between friends. Rather than spending so much time defining "art", I think Ebert should give more thought to his idea of a "game". Games like Neverwinter Nights, the Zelda family, Morrowind, Jade Empire - these are not about winning or losing, or even playing. They are about experiencing a story.

Some of them allow more latitude than others. Resident Evil imposes a fixed path, set out by constant short-term motivators. Neverwinter Nights - the most faithful translation of pen-and-paper rules I've seen - forces you to jump through the hoops laid out in the order required, because there's simply not much else to do. Oblivion, by contrast, gives you near-complete freedom - but there's still a single, central storyline to be completed.

The 2008 version of Prince of Persia is an extreme example. Not only do you have to follow the plot (with virtually no latitude in what order you do things or what skills your character develops), but when you screw it up, you're immediately returned ("restored") to the point just before you did so. This mechanism caused a lot of controversy at the time - some people thought it took the risk and the skill out of the game.

But, in truth, none of these games is about risk or skill, any more than watching a whodunnit is about your ability to outguess the fictional detective. They are about the experience.

And that, dear Mr Ebert, is art. Some of it is even, I would claim, good art.

I recently replayed Jade Empire, and I would say that, in script, acting, cinematography and (most importantly) storyline, it stands comparison with decent Hollywood action movies. Zelda: Twilight Princess is by turns engrossing, thrilling and touching, as well as beautifully visualised. Oblivion, while scriptwise a pale shadow of Morrowind, tries to make up for it with technical execution (I recall the first time, stealing through some tunnel, I saw an ogre ahead of me - I almost wet myself).

There's also, of course, plenty of bad art in the genre. Assassin's Creed has a confused and clichéd storyline, with little latitude to explore and no attempt to reconcile the inconsistencies. Ditto Freelancer and Neverwinter Nights 2. Bad writers keep you on the story railroad by putting in arbitrary, unexplained restrictions on what you can do, whereas the better games either trap you in a storyline where there is always an obvious, immediate short-term goal (Zelda, Resident Evil), or continually nudge you towards the plot with internally consistent motivators (Morrowind).

But even bad art is still art. Good art shows what it can aspire to.

If the computer FRPG were really just a digital version of the tabletop game, then Ebert would have a point. As it is - well, he should try playing through some of these games. Then let's see if he can still maintain that he hasn't experienced some kind of art.

Friday, August 13, 2010

Two concepts of quality

Researchers from Rice University's Department of Luddite Apologetics have found experimental evidence for what many of us have long suspected: that content matters more to viewers than picture quality. If you're enjoying the movie, you won't notice that it's grainy, scratchy, blocky or even black-and-white. Conversely, if the movie is in super-high resolution, that won't make you enjoy it any more.

Not surprising, perhaps. We know the brain is very good at filling in detail and smoothing over cracks. That's the whole principle on which movies work in the first place - if you show a series of still images quickly enough, the brain stitches them together into a single "moving" picture.

But it's always nice to have one's prejudices confirmed.

This should be terrible news for Sony, which has staked pretty much its entire product line on the assumption that we'll mortgage our firstborn to get higher-resolution video. Conversely, it's great news for TV viewers: you don't have to buy that HD screen and Blu-ray player, because it won't improve your enjoyment. Good movies are good without it, and crappy ones will still be crappy with it.

Unfortunately, Sony wouldn't be Sony if its plans were based on anything as fickle as "what we want". The poor old consumer is routinely stitched up with products that they either didn't ask for, or actually begged not to get - cellphone cameras, rolling news, American Idol, movie sequels (and prequels, and remakes, and Jar-Jar), cover versions, deep-ocean oil rigs, wars, Windows upgrades...

And HDTV is one of these. We're already being forced to accept "digital TV", on the laughable pretext that it will simultaneously allow more channels and better quality (which is a bit like wiring up your aircon so that it will only work when your heating is on maximum). Within ten years, I confidently predict, non-HD TVs will be hard to buy, ruinously expensive to service or repair, and incapable of receiving anything other than rolling news and reality TV. Thus requiring more movies and programs to be remade, to meet our higher expectations.

Consumerism. Gotta love it. After all, what choice do we have?

Monday, August 9, 2010

Biblical taxation

For some reason, there's been a lot of coverage lately of an American politician named Michele Bachmann, who favours a "biblical" model of taxation. "We render to God that which is God's and the Bible calls for ... maybe 10%", she apparently says.

Since I was brought up to believe that everyone - no matter how stupid or insane they sound at first encounter - has something to teach me, it occurred to me to wonder what a Biblical tax model might really look like.

Sadly, Bachmann has it quite wrong. When Jesus was questioned about taxation, He pointed out that money bears the image of Caesar, not God, and therefore (He implied) it is Caesar's domain (Matthew 22:21); the church has no call on it at all. What one should "give to God" is "what is God's" - a definition that, given the context, is clearly meant to exclude money. Paul, ever the pragmatist, recommends that Christians should give a fraction of their income for the upkeep of their church (1 Cor 16:1-2) - but he never mentions the 10% figure. And money donated to the church is, in any case, entirely separate from the issue of paying taxes; the donation is, very explicitly, not a tax - it is something that must be given voluntarily, "not reluctantly or under compulsion" (2 Cor 9:7). The "tithe" is an Old Testament concept, where it's levied by the Temple to support its works - again, quite separately from what the state or the king demand for their works.

Clearly, in conflating taxes with tithes, this Bachmann is on very unsound ground theologically.

I can, however, think of one example in the Bible where a righteous figure is charged with managing a secular tax regime. In Genesis 41:33-49, Pharaoh appoints Joseph as first minister of Egypt, to establish a tax rate of 20% in years of plenty; the idea being that it will be doled out in the lean years to follow (making Joseph, arguably, the first Keynesian).

This tax is raised for one purpose: to alleviate the effects of famine (recession) by feeding the hungry. It does not include any allowance for defence, law and order, education, fire safety, maintenance of public roads or buildings, the Pharaoh's majesty, or any other kind of public service - those are all extra, presumably levied by a separate, parallel set of collectors. This 20% is taken purely for the purpose of redistribution.

What could we do with a regime like that?

The UK's GDP per head, today, is around £27,000 per annum. Imagine if the government took 20% of that money (only a fraction of its actual spending, of course) and simply paid it out evenly to everyone over the age of 18. Assuming one-fifth of the population is under that age, that'd be a shade over £120 per adult per week.
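A quick sanity check of that figure, using the post's own assumptions (the £27,000 GDP per head, the 20% levy, and the one-fifth-under-18 guess are all taken from the text, not from official statistics). On these numbers the payout actually comes out nearer £130 a week - comfortably clearing the £120 mark:

```python
# Back-of-the-envelope check of the Universal Benefit payout,
# using the assumptions stated in the text above.
gdp_per_head = 27_000    # £ per person per year (post's figure)
levy = 0.20              # Joseph's "fifth" from Genesis 41
adult_fraction = 4 / 5   # assumed share of the population aged 18+

pot_per_head = gdp_per_head * levy                  # £ raised per person
annual_per_adult = pot_per_head / adult_fraction    # £ paid per adult per year
weekly = annual_per_adult / 52

print(f"£{weekly:.0f} per adult per week")
```

The division by `adult_fraction` is the step that matters: the levy is raised on the whole population's output but paid out only to adults, which is what lifts the £5,400-per-head pot to £6,750 per adult.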

That's enough to live on. Not "live well", of course - you'd probably have to share lodgings, and you couldn't support much of a family on it. But enough to take the edge off poverty. No matter what happens - employed, unemployed, self-employed, retired, on holiday, in education, in prison - every UK citizen would have a guaranteed top-up to whatever other income they could get. For life.

I think that would be a Good Thing.

It would remove the poverty trap - no more losing benefits when you gain income, because all lesser benefits are simply abolished. It would massively cut down on the government-related paperwork that afflicts ordinary people (I've been unemployed, I remember it with horror to this day - and what I had to put up with, including the 90% marginal tax rate, was only a fraction of the ordeal that's inflicted on the most vulnerable people in society when they try to claim, for instance, disability living allowance). It would establish a base level of income for everyone, tied directly to national income, thus reducing inequality. It would allow us to forget about "fully funded pensions" - pension income would be, quite transparently, paid for out of current income (which is what must happen anyway, it's the only thing that makes economic sense, and anyone who tells you different is trying to sell you something).

It would support rural areas and take the pressure off inner cities - honest people need not be quite so desperate for jobs. It would eliminate the state retirement age, and indeed the entire concept of "retirement" - you could stop working at whatever age you felt you could afford it, and change your mind at any time later, with no repercussions and no paperwork.

So how, specifically, would we go about paying for the Universal Benefit?

For starters, it's considerably more than the current jobseeker's allowance, or the basic state pension (even including the winter fuel allowance), or disability living allowance, carer's allowance, child benefit, maternity benefit... So we could scrap all of those - basically, reduce the Department for Work & Pensions to a rump department charged solely with keeping track of who's still alive and what bank account they want their money paid into. That would save about one-sixth of the entire government budget, or over 35% of the money needed, without levying a penny more in taxes.

Second, the Universal Benefit itself would be taxable. So while the poorest get the full £120 a week, a top-rate taxpayer would get only £72 (my tax rates may be a shade out of date, but never mind that for now). Let's call it, to a reasonable back-of-the-envelope level of precision, another 25% of the cost clawed back directly from taxpayers at present rates.

The remaining cost would be charged as a direct tax of 8% on all UK incomes. Since the money we're still looking for is 40% of the 20%, it follows that 8% of national income would fill the gap. Of course there's still the zero-tax band (below, say, £10k), so the actual rate for those paying would have to be a bit higher - say, 9-10%. We could call it "national insurance contributions", and no-one would even notice the difference.
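The funding arithmetic can be sketched in a few lines. All the percentages here are the post's own back-of-the-envelope estimates (the ~35% from scrapped benefits and the ~25% tax clawback), not audited figures:

```python
# Sketch of the Universal Benefit funding arithmetic.
# Inputs are the post's own rough estimates, expressed as fractions.
total_cost = 0.20               # scheme cost, as a share of national income
from_scrapped_benefits = 0.35   # share of the cost covered by abolished benefits
clawed_back_in_tax = 0.25       # share of the cost recovered by taxing the benefit

# Whatever the first two don't cover must come from a new direct tax.
remaining_share = 1 - from_scrapped_benefits - clawed_back_in_tax   # 40% of the cost
new_levy = remaining_share * total_cost     # as a share of national income

print(f"New levy: {new_levy:.0%} of national income")
```

That is, 40% of a cost worth 20% of national income works out at the 8% headline rate - before allowing for the zero-tax band, which pushes the effective rate on those who do pay a point or two higher.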

Who'd've thunk? It turns out that Bachmann has a fantastic idea. All that's at fault is her reading of the Bible.