Showing posts with label politics. Show all posts

Friday, March 11, 2011

Governor Pat Quinn is Doing a Great Job

As a resident of Illinois, I can tell you that Governor Pat Quinn is doing a great job. I couldn't be happier that we elected him, albeit by a small margin, rather than the Republican buffoon who ran against him. Unlike the Republicans who harp on the budget deficit without ever doing anything that would actually solve the problem, Quinn started working on fixing the budget the moment he got elected. We have seen the results of his hard work already: the state of Illinois repaid the entire amount that it owed our university. Had a Republican been elected as Governor of Illinois, we wouldn't have seen a dime of that money as the Republicans warned us before the election.

Quinn supports our new healthcare law, which shows that he is an intelligent, progressive person. He is adamant in his defense of a woman's right to be in control of her own uterus. He supports gay marriage and is in favor of enacting gun control legislation. Recently, he has demonstrated that, unlike the majority of American politicians, he is not afraid of antagonizing a huge corporation if that's what it takes to protect the interests of our state. 

Those who have been brainwashed by the Tea Party into believing that the government is always bad and the corporations are always good are trying to crucify Governor Quinn for his brave and intelligent decision to show Amazon that it is not above the law. Middle-of-the-road folks who are so spineless that they can't just pick a side already and stick to it screech that Quinn's new piece of legislation (the one that will force Amazon to pay taxes in Illinois) will be impossible to enforce. "Things are the way they are, nothing will ever change," they keep screeching. This defeatist attitude to life is not shared by Quinn and his supporters. We might not matter individually, but together we can achieve a lot. 

Thursday, March 10, 2011

Who Caused the Collapse of the Soviet Union? Part III

To continue our conversation about the collapse of the Soviet Union that we started here and here, I want to answer the question that people often have when they are told that there was no transfer of power when the system changed. If money and power remained in the same hands after the fall of the Soviet Union, people ask, how is it possible that nobody noticed? Weren't the citizens supposed to start asking questions as to why such a profound transformation as going from socialism to capitalism did not bring about a major transfer of power?

Of course, people would have asked these questions. They were prevented from doing so, though, by a very inventive diversionary tactic. The tactic in question consisted of presenting the people whose pictures you can see below as the new post-Soviet billionaires:

This is Roman Abramovich, whose fortune is estimated at $13.4 billion.

He is the 53rd richest person in the world. 

This guy is Boris Berezovsky. His fortune has dwindled in recent years (an expensive divorce, endless court cases, exile, etc.) and now stands at a puny $1 billion.

While he still served the purposes of the regime, it looked like his political and economic power was unrivaled.


This is another post-Soviet billionaire, Vladimir Gusinsky. He is now also in deep trouble with the regime. In the nineties, however, he owned pretty much everything in Russia. Except, of course, what the other guys whose pictures I posted owned.

So these are the people who were presented to us in the nineties as the all-powerful billionaires who now had all the money and the power of the former Soviet Union. And they all have one thing in common. It might not be obvious to an American eye that is used to seeing huge ethnic and racial variety on a daily basis. It is immediately obvious to any Soviet person, though, that these guys are Jews. (These are not the only billionaires of the nineties, of course. There are a few more, and most of them are also Jewish.)

In the early nineties, the people who were effectuating the so-called transition from the Soviet Union to a free market democracy (a transition that never really took place, of course) used this nifty little trick to distract the fiercely anti-semitic Soviet people from what was really going on. They appointed some very obviously Jewish guys to act as figureheads for the seemingly new regime. When the Jewish billionaires had served their purpose, they were thrown over by the regime. Now many of them are either in hiding or in exile. In my opinion, they had been chosen as figureheads from the pool of minor KGB informers. Of course, I have no data to substantiate this opinion but no other possibility makes sense logically.

In the next post in this series I will tell you who I think was really in power in the Soviet Union and why the decision was made to disband the USSR temporarily.

Wisconsin and the Future of This Country

In the face of massive protests by the state's citizens, Wisconsin Republicans demonstrated yet again that they couldn't care less about what the people of the state want. They just spat on the people's wishes and proceeded to vote to curtail the collective bargaining rights of the citizens. The Republican-spurred descent of this country into barbarity continues.
"In 30 minutes, 18 state senators un-did 50 years of civil rights in Wisconsin. Their disrespect for the people of Wisconsin and their rights is an outrage that will never be forgotten," Senate Minority Leader Mark Miller said.
Yes, it will. People will forget and proceed to vote for a Republican candidate in 2012 in happy, sheepish droves. CNN just dedicated 10 seconds to the events in Wisconsin and then offered a half-hour-long segment that used the words "Muslim" and "terrorist" in every other sentence. A few more scary stories about bad, threatening Muslims or Latinos, and people will forget how the Republicans deceived them in 2010 with their empty talk about jobs, the economy, etc., and will rush to vote for them yet again.

I'm starting to think that Americans are people who are happy to accept growing poverty, a constant threat of unemployment, the lack of social benefits that the rest of the civilized world enjoys freely, sinking living standards, etc., as long as they are promised that they will be able to force women to give birth against their will and prevent gays from getting married. Well, if that's what people want, they should definitely get it. If they are fine with the Republicans treating them like cattle and robbing them blind, let them vote the way they want.

Wednesday, March 9, 2011

Who Caused the Collapse of the Soviet Union? Part II

The first post in this series got a huge number of visitors, which makes me think that the topic is of interest to people and has to be developed further. So I'll keep writing on this subject until I run out of things to say (which will not be very soon.)

Now, the most important thing you need to do if you want to understand what happened to the Soviet Union and what's going on in its former republics right now is forget about the United States. I know that there are many people who like to believe that every single thing in the world is caused by the United States. Pseudo-liberals unwittingly demonstrate just how much they despise those of us from other countries by their insistence that if life in our countries does not correspond to their standards, that must have been caused by the interference of the US. This attitude is condescending, reductive and wrong. Today's reality of the former Soviet countries was created and is maintained by people in those countries. And it's not a reality that makes them unhappy, so fake compassion for us, poor unintelligent victims of the bad, all-powerful US, is completely misplaced. If that's the direction of your thoughts about us, you need to reexamine what psychological issues make you want to exaggerate the importance of your country at the expense of others.

Even Naomi Klein, who in her imaginative and often funny book The Shock Doctrine: The Rise of Disaster Capitalism demonstrates a grievous misunderstanding of post-Soviet Russia (she refers to Yeltsin as a Russian Pinochet, for Pete's sake), recognizes in a grudging manner that the Russians beat the IMF at its own game. Those of you who have read the book know that it's informed by Klein's extremely Americentric agenda. Still, even she doesn't manage to create a convincing account of American protagonism in the collapse of the Soviet Union and the further fate of the former Soviet republics.

Now that we have established a productive framework within which these events should be discussed, we will be able to continue exploring this topic.

How Not To Improve University Teaching

The British government has recently changed the rules on university funding. At present, students pay no more than £3,000 per year for their tuition, with the rest of the roughly £7,500 it costs to teach one student being paid for by the state.

From next year, students will pay up to £9,000 per year and the state will hardly pay anything. This was sold to the nation as a way to cut the budget deficit after the Recent Financial Unpleasantness, although it won't achieve this for several years, if at all, because the government will loan students the money upfront and they'll then gradually pay it back after they graduate.

However, another supposed benefit of the changes is that they'll give universities an incentive to improve their teaching. Students, we're told, will demand high quality teaching, now that they are the ones paying for it, and institutions which fail to provide this will lose out as students choose somewhere else.

Which is a cute little idea, and there may be a few people out there who actually believe it, but there's one problem: universities don't teach anyone, academics do. And academics have no incentive to teach well and, in most cases, no incentive to make sure that their university has a reputation for good teaching.

As an academic your career is research. The way you get a job, and a promotion, and grants and money and influence, is by publishing papers. You don't get ahead by teaching well. Academics teach because it's written into their university contract that they have to do a certain amount of teaching. Or in the case of junior academics who don't have a lectureship yet, they teach because their salaries are low relative to the cost of getting the qualifications required (BSc + MSc + PhD = £££) and they need the money.

This doesn't mean that all academics resent teaching, although sadly many do. Some are fine with it, and some enjoy it. Some, generally the latter ones, are extremely good at it. But even they have no incentive to be good at it or to improve their teaching. If it comes down to a choice between spending a week preparing a set of awesome lectures, or a week in the lab, the incentive is, by the nature of academic careers, always going to be towards research.

OK, but don't researchers benefit from working at a prestigious university? Doesn't that look good on your CV? Yes (although not as good as publications) but this doesn't mean they have an incentive to make their university more prestigious - because in most cases it's only "their" university for a few years at maximum.

Until you get to the level of tenured professor, if ever, you cannot assume that you'll be working in the same place for very long. Many academics will go to one university for their undergraduate degrees, another for their masters, another for their doctorate, and then another two or three as a junior faculty member before they "settle down" - and the majority don't make it that far. And these are not uncommonly on different continents. Tenured professors are the only ones with a material interest in the future of their institution, and they usually delegate their teaching anyway.

So what will the new fee changes achieve? They'll give university managers an incentive to try to improve teaching, but managers don't teach. So they'll try to get their academics to teach better - but it is far from clear that this is possible.

I think we'll be seeing more "training courses", "teaching support officers" and other managerial initiatives, along with ever-glossier marketing brochures, but whether this will achieve anything is doubtful. The essence of good teaching isn't training, it's motivation - you have to want to teach well. You have to be passionate about your subject, you have to care about your students, and you have to put in the hours.

So long as teaching doesn't contribute to academic career prospects, many will see it as a burden and these training courses as yet another distraction from their research - and from their teaching too, in fact. Change the nature of academia so that publications aren't everything and teaching is valued - then you'd improve teaching, and quite possibly research as well.


Tuesday, March 8, 2011

Who Caused the Collapse of the Soviet Union? Part I

Nothing annoys me more than hearing people discuss completely in earnest whether the collapse of the Soviet Union was brought about by Ronald Reagan or by somebody else. Such discussions make just as much sense as trying to figure out whether world peace was achieved by this or some other politician. "Well, there is no world peace," you'd say. Right you are. And there was no collapse of the Soviet Union. Not in any meaningful sense, that is. As to the end of the Cold War, if you seriously think it's over, you need to stop spending so much time listening to the American media and turn to some external sources of information every once in a while. The winner of the Cold War is yet to be decided but I somehow doubt that you can win any war by pretending it isn't taking place.

In case you want to know what really happened with the Soviet Union, North American media sources will not tell you anything intelligent. Every time I read an article or watch a news segment on the former USSR countries in the US or Canada, I am terrified at the sheer number of factual errors and ridiculous mistakes that I encounter. I read an article in Montreal's Gazette a few years ago that stated in no uncertain terms that radio was very popular in Russia nowadays because people had no money to buy TV sets. This made me realize that the woeful ignorance and ideological dishonesty of print-media journalists make writing about the former USSR the perfect ground for them to demonstrate their complete lack of investigative integrity. They just write whatever old bunch of lies will make the readers feel more relaxed and happy at any given moment.

In order to answer the question as to what happened to the Soviet Union, I want to give you small snippets from the biographies of the richest and most powerful people in Russia today. Tell me if you find anything these people have in common. I marked the relevant parts with bold type in case you don't feel like reading a lot today.
___________________________________

Vladimir Putin, the former President and now the Prime Minister (and the real ruler) of Russia:
Putin joined the KGB in 1975 upon graduation from university, and underwent a year's training at the 401st KGB school in Okhta, Leningrad. He then went on to work briefly in the Second Department (counter-intelligence) before he was transferred to the First Department, where among his duties was the monitoring of foreigners and consular officials in Leningrad, while using the cover of being a police officer with the CID. He served at the Fifth Directorate of the KGB, which combated political dissent in the Soviet Union. He then received an offer to transfer to the foreign-intelligence First Chief Directorate of the KGB and was sent for an additional year of training to the Dzerzhinsky KGB Higher School in Moscow and then, in the early eighties, to the Red Banner Yuri Andropov KGB Institute in Moscow (now the Academy of Foreign Intelligence).
_____________________________________
Vladimir Potanin, one of Russia's billionaires, former First Deputy Prime Minister of the Russian Federation.
Potanin was born into a high-ranking communist family. In 1978, Potanin entered the Faculty of International Economic Relations at the Moscow State Institute of International Relations (MGIMO), an elite school that groomed students for the Ministry of Foreign Affairs. . . In 1993, Potanin became President of the United Export Import Bank. From August 14, 1996 until March 17, 1997 he worked as First Deputy Prime Minister of the Russian Federation. Since August 1998, Potanin has held the positions of President and Chairman of the Board of Directors of the Interros Company. Potanin's Interros owns 25% of and controls the Russian nickel giant Norilsk Nickel.
_______________________________________
Mikhail Khodorkovsky is a Russian oligarch and businessman. In 2004, Khodorkovsky was the wealthiest man in Russia and was 16th on the Forbes list of billionaires. Now, this vile criminal is finally in jail.
He succeeded in building a career as a communist functionary. He became deputy head of Komsomol (the Communist Youth League) at his university. The Komsomol career was one of the ways to get into the ranks of communist apparatchiks and to achieve the highest possible living standards. After perestroika started, Khodorkovsky used his connections within the communist structures to gain a foothold in the developing free market. He used the help of some powerful people to start his business activities under the cover of Komsomol. Friendship with another Komsomol leader, Alexey Golubovich, helped him greatly in his further success, since Golubovich's parents held top positions in the State Bank of the USSR.
_______________________________________
Alexander Lebedev: In May 2008, he was listed by Forbes magazine as one of the richest Russians and as the 358th richest person in the world, with an estimated fortune of $3.1 billion. He owns a third of the airline Aeroflot, and is part owner of the Russian newspaper Novaya Gazeta and, with his son Evgeny Lebedev, owner of four UK newspapers: the London Evening Standard, The Independent, the Independent on Sunday and the new i newspaper.

In 1977, Alexander Lebedev entered the Department of Economics at the Moscow State Institute of International Relations. After he graduated in 1982, Lebedev started work at the Institute of Economics of the World Socialist System, doing research for his Kandidat (equivalent to a Ph.D.) dissertation, The Problems of Debt and the Challenges of Globalization. However, he soon transferred to the First Chief Directorate (Foreign Intelligence) of the KGB. He worked there and at its successor, the Foreign Intelligence Service, until 1992. In London he had the diplomatic cover of an economics attaché.
____________________________________________
Viktor Chernomyrdin was the founder and the first chairman of the Gazprom energy company, the longest-serving Prime Minister of Russia (1992–1998) and Acting President of Russia for a day in 1996. He was a key figure in Russian politics in the 1990s, and a great contributor to the Russian transition from a planned to a market economy.
Chernomyrdin began developing his career as a politician when he worked for the Communist Party in Orsk between 1967 and 1973. In 1973, he was appointed the director of the natural gas refining plant in Orenburg, a position which he held until 1978. Between 1978 and 1982, Chernomyrdin worked in the heavy industry arm of the Central Committee of the Communist party.
In 1982, he was appointed deputy Minister of the natural gas industries of the Soviet Union. Concurrently, beginning from 1983, he directed Glavtyumengazprom, an industry association for natural gas resource development in Tyumen Oblast. During 1985-1989 he was the Minister of gas industries.
____________________________________________________


I could continue this list practically ad infinitum, but I'm sure that everybody knows what I'm trying to say here. All of the major politicians and billionaires in Russia and the other former Soviet republics are former high-ranking members of the Communist Party, apparatchiks, and KGB employees. There was never any transfer of power, either political or economic. Absolutely the same people (or, rather, families) who ruled us before 1985 are still in power today. And if you want to know how and why that happened, wait for the second part of this post.

Sunday, February 27, 2011

Liberal Academia

The Washington Post realizes that if its conservative subscribers get any less educated than they already are, they will not be able to read even the simplistic swill that this newspaper is feeding them. As a result, it decided to dial back its hate campaign against the commie hippie latte-swigging tree-hugging college professors. Now it is trying to convince its readership that getting a higher education might not deal such a serious blow to their children's Republican convictions. A clumsy article trying to argue in a very impotent way that college campuses are not all that liberal appeared in The Washington Post recently. It's titled "Five Myths About Liberal Academia" and can be found here in case you really enjoy bad writing.

Whatever bill of goods The Washington Post is trying to sell to its conservative readers, the truth is different. Unless we are talking about a student who has been brainwashed to the point of not having a single thought of their own, college education will end up broadening their horizons and demonstrating to them that any conservatism is unnatural, meaningless, and unintelligent.

Contrary to what many conservatives fear, progressive professors don't use the classroom to voice their political convictions. We simply don't need to. When I come into the classroom, looking chic, fashionable and professional, and begin to share my knowledge with the students, my way of being is the best argument there could be against female subjection. I don't have to proclaim feminist slogans in the classroom. I make my point just by existing. In the same way, I make my students reconsider their dislike of immigrants. And of intelligent, knowledgeable, educated people. The list can be continued ad infinitum. (The dislike of people who use expressions such as ad infinitum could be added to the list.)

Every literary text we read in class brings the students closer to progressive values. For some unfathomable reason, there don't seem to be that many great writers who advocate accepting things the way they are, resisting all change, and trying to revert to some imaginary paradisaical moment in the past when things used to be perfect.

We teach our students to think for themselves, to identify gaping holes in any argument (such as the above-mentioned article in The WaPo), and to analyze and work with facts. We are not always successful, of course, but when we are, we end up creating more open-minded, intelligent, progressive people.

Conservatives exist on campus, of course. They are treated by everybody with compassion. Not because of their political beliefs, but because they are those hapless academics who never manage to publish anything. The conservative academics' CVs are very light on publications not because, as The WaPo article suggests, there is some bias against their so-called ideas in liberal publishing houses and journals. Rather, the very nature of research calls for the creation of something new, for progress, for a rejection of old certainties. A piece of research is always judged, first and foremost, on the basis of whether it contributes anything new to the understanding of the subject. The definition of a conservative is "Favoring traditional views and values; tending to oppose change." It is self-evident, I believe, why this kind of person will not be able to transform their area of expertise in any significant way by their research.

Conservative forces in this country might manage to push another Republican president into office in 2012 by the sheer force of their mass hysteria. That, however, will not stop things from changing, progressing, transforming. Theirs is a losing battle, which is why their rage is so virulent.

Wednesday, February 23, 2011

DOMA One Step Closer to an Inevitable Demise

The Attorney General will no longer defend the Defense of Marriage Act, a law that is ridiculous and offensive to any normal human being. And not a moment too soon. It's mind-boggling that in the twenty-first century a country like the US should cater to a small group of crazed religious fanatics by passing silly pieces of legislation such as DOMA. This is from the Attorney General's statement:
Much of the legal landscape has changed in the 15 years since Congress passed DOMA. The Supreme Court has ruled that laws criminalizing homosexual conduct are unconstitutional. Congress has repealed the military's Don't Ask, Don't Tell policy. Several lower courts have ruled DOMA itself to be unconstitutional. Section 3 of DOMA will continue to remain in effect unless Congress repeals it or there is a final judicial finding that strikes it down, and the President has informed me that the Executive Branch will continue to enforce the law. But while both the wisdom and the legality of Section 3 of DOMA will continue to be the subject of both extensive litigation and public debate, this Administration will no longer assert its constitutionality in court.
Finally, this Administration has stopped insulting all of us by supporting the Defense of Marriage Act, whose only value lies in placating the crazy religious fanatics who can't stop policing other people's personal lives for lack of their own.

Thursday, February 17, 2011

WMDs vs MDD

Weapons of Mass Destruction. Nuclear, chemical and biological weapons. They're really nasty, right?

Well, some of them are. Nuclear weapons are Very Destructive Indeed. Even a tiny one, detonated in the middle of a major city, would probably kill hundreds of thousands. A medium-sized nuke could kill millions. The biggest would wipe a small country off the map in one go.

Chemical and biological weapons, on the other hand, while hardly nice, are just not on the same scale.

Sure, there are nightmare scenarios - a genetically engineered supervirus that kills a billion people - but they're hypothetical. If someone does design such a virus, then we can worry. As it is, biological weapons have never proven very useful. The 2001 US anthrax letters killed 5 people. Jared Loughner killed 6 with a gun he bought from a chain store.

Chemical weapons are little better. They were used heavily in WW1 and the Iran-Iraq War against military targets and killed many but never achieved a decisive victory, and the vast majority of deaths in these wars were caused by plain old bullets and bombs. Iraq's use of chemical weapons against Kurds in Halabja killed perhaps 5,000 - but this was a full-scale assault by an advanced air force, lasting several hours, on a defenceless population.

When a state-of-the-art nerve agent was used in the Tokyo subway attack, after much preparation by the cult responsible, who had professional chemists and advanced labs, 13 people died. In London on the 7th July 2005, terrorists killed 52 people with explosives made from haircare products.

Nuclear weapons aside, the best way to cause mass destruction is just to make an explosion, the bigger the better; yet conventional explosives, no matter how big, are not "WMDs", while chemical and biological weapons are.

So it seems to me that the term and the concept of "WMDs" are fundamentally unhelpful. They lump together the apocalyptically powerful with the much less destructive. If you have to discuss everything except guns and explosives in one category, terms like "unconventional weapons" are better, as they avoid the misleading implication that all of these weapons are very, and equivalently, deadly; but grouping them together at all is risky.

That's WMDs. But there are plenty of other unhelpful concepts out there, some of which I've discussed previously. Take the concept of "major depressive disorder", for example. At least as the term is currently used, it lumps together extremely serious cases requiring hospitalization with mild "symptoms" which 40% of people experience by age 32.

WMDs vs MDD

Weapons of Mass Destruction. Nuclear, chemical and biological weapons. They're really nasty, right?

Well, some of them are. Nuclear weapons are Very Destructive Indeed. Even a tiny one, detonated in the middle of a major city, would probably kill hundreds of thousands. A medium-sized nuke could kill millions. The biggest would wipe a small country off the map in one go.

Chemical and biological weapons, on the other hand, while hardly nice, are just not on the same scale.

Sure, there are nightmare scenarios - a genetically engineered supervirus that kills a billion people - but they're hypothetical. If someone does design such a virus, then we can worry. As it is, biological weapons have never proven very useful. The 2001 US anthrax letters killed 5 people. Jared Loughner killed 6 with a gun he bought from a chain store.

Chemical weapons are little better. They were used heavily in WW1 and the Iran-Iraq War against military targets and killed many but never achieved a decisive victory, and the vast majority of deaths in these wars were caused by plain old bullets and bombs. Iraq's use of chemical weapons against Kurds in Halabja killed perhaps 5,000 - but this was a full-scale assault by an advanced air force, lasting several hours, on a defenceless population.

When a state-of-the-art nerve agent was used in the Tokyo subway attack, after much preparation by the cult responsible, who had professional chemists and advanced labs, 13 people died. In London on the 7th July 2005, terrorists killed 52 people with explosives made from haircare products.

Nuclear weapons aside, the best way to cause mass destruction is just to make an explosion, the bigger the better; yet conventional explosives, no matter how big, are not "WMDs", while chemical and biological weapons are.

So it seems to me that the term and the concept of "WMDs" is fundamentally unhelpful. It lumps together the apocalyptically powerful with the much less destructive. If you have to discuss everything except guns and explosives in one category, a term like "unconventional weapons" is better, because it avoids the misleading implication that all of these weapons are very, and equivalently, deadly; but grouping them together at all is risky.

That's WMDs. But there are plenty of other unhelpful concepts out there, some of which I've discussed previously. Take the concept of "major depressive disorder", for example. At least as the term is currently used, it lumps together extremely serious cases requiring hospitalization with mild "symptoms" which 40% of people experience by age 32.

Saturday, February 12, 2011

The Short 2000 Decade


Historian Eric Hobsbawm famously talked about "the Short 20th Century": 1914 to 1989. The idea being that 1900-1914 was pretty much like the 19th century, but everything really changed with the outbreak of World War 1. Only the fall of the Berlin Wall brought that era, broadly speaking an era defined by wars or the threat of wars in Europe, to an end.

I'm going to make a rash prediction now and say that the first decade of this century started on 9/11/2001, and it ended yesterday, 2/11/2011.

That it started on 9/11 is fairly obvious. Saying that the 9/11 era ended yesterday is why this is a possibly rash prediction, but I think it's fair to say that the game has just changed completely.

For the past decade the main story in world politics has been Islamic extremism. Of course Islamic extremism has not suddenly disappeared overnight; but the way in which the rest of the world deals with it will from now on have to be very different.

For the past 10 years, the people of Muslim countries have had very little say in the matter. Their governments - with just a couple of exceptions - were not democratic. More importantly, they were apparently safe and secure in being non-democratic.

They could have been overthrown and replaced from outside, but they had efficient internal security and there was no prospect of their being overthrown from within. So to all intents and purposes, our relationship with "the Islamic world" was our relationship with their governments. Get the governments on our side, or not, and the rest will follow, or not. We thought.

That worked, or seemed to, for 10 years. Never again. For better or worse, the people of Muslim countries are now an issue.

Tuesday, February 8, 2011

The Social Network and Anorexia

Could social networks be more important than the media in the spread of eating disorders?

There's a story about eating disorders roughly like this: eating disorders (ED) are about wanting to be thin. The idea that thinness is desirable is something that's spread by Western media, especially visual media i.e. TV and magazines. Therefore, Western media exposure causes eating disorders.

It's a nice simple theory. And it seems to fit with the fact that eating disorders, hitherto very rare, start to appear in a certain country in conjunction with the spread of Westernized media. A number of studies have shown this. However, a new paper suggests that there may be rather more to it: Social network media exposure and adolescent eating pathology in Fiji.

Fiji is a former British colony, a tropical island nation of less than a million. Just over half the population are ethnic native Fijian people. Until recently, these Fijians were relatively untouched by Western culture, but this is starting to change.

The authors of this study surveyed 523 Fijian high school girls. Interviews took place in 2007. They asked the girls various questions relating to, on the one hand, eating disorder symptoms, and, on the other, their exposure to various forms of media.

They looked at both individual exposure - hours of TV watched, electronic entertainment in the home - and "indirect" or "social network" exposure, such as TV watched by the parents, and the amount of electronic entertainment their friends owned. On top of this they measured Westernization/"globalization", such as the amount of overseas travel by the girls or their parents.

So what happened? Basically, social network media exposure, urbanization, and Westernization correlated with ED symptoms, but once you controlled for those variables, personal media exposure didn't. Here's the data; the column I've highlighted shows each variable controlled for the others. The correlations are pretty small (0 is none, 1.0 would be perfect) but significant.
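The "controlling for" step here is essentially partial correlation: regress out the other variables, then correlate what's left over. A minimal sketch in Python (NumPy only), with simulated data standing in for the study's; the variable names, coefficients, and sample values below are all invented for illustration, not the paper's actual numbers:

```python
# Partial correlation via regression residuals, on fake data in which
# social-network exposure drives symptoms and personal exposure matters
# only through its link with social exposure.
import numpy as np

rng = np.random.default_rng(0)
n = 523  # same sample size as the study

social = rng.normal(size=n)                    # friends'/parents' exposure
personal = 0.6 * social + rng.normal(size=n)   # own media exposure
symptoms = 0.5 * social + rng.normal(size=n)   # ED symptom score

def partial_corr(x, y, controls):
    """Correlation of x and y after regressing the controls out of both."""
    Z = np.column_stack([np.ones(n), controls])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

print(np.corrcoef(personal, symptoms)[0, 1])     # raw: clearly positive
print(partial_corr(personal, symptoms, social))  # controlled: near zero
print(partial_corr(social, symptoms, personal))  # social survives control
```

The pattern mirrors the paper's result: personal exposure correlates with symptoms on its own, but the correlation vanishes once social-network exposure is held constant, while the social-network effect survives the reverse adjustment.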


They conclude that:
Although consistent with the prevailing sociocultural model for the relation between media exposure and disordered eating... our finding, that indirect exposure to media content may be even more influential than direct exposure in this particular social context, is novel.
The idea that eating disorders are simply a product of a culture which values thinness as attractive has always seemed a bit shaky to me, because people with anorexia frequently starve themselves far past the point of being attractive, even by the unrealistic standards of magazines and movies.

In fact, if eating disorders were just an attempt to "look good", they wouldn't be nearly so dangerous as they are, because no matter how thin-obsessed our culture may be, no-one thinks this is attractive, or normal, or sane. But this, or worse, is what a lot of anorexics end up as.

On the other hand, eating disorders are associated with modern Western culture. There must be a link, but maybe it's more complicated than just "thin = good" causes anorexia. What if you also need the idea of "eating disorders"?

This was the argument put forward by Ethan Watters in Crazy Like Us (my review)... in his account of the rise of anorexia in Hong Kong. Essentially, he said, anorexia was vanishingly rare in Hong Kong until after the much-publicized death of a 14-year-old girl, Charlene Chi-Ying, in the street. As he put it:
In trying to explain what happened to Charlene, local reporters often simply copied out of American diagnostic manuals. The mental-health experts quoted in the Hong Kong papers and magazines confidently reported that anorexia in Hong Kong was the same disorder that appeared in the United States and Europe...

As the general public and the region's mental-health professionals came to understand the American diagnosis of anorexia, the presentation of the illness in [Hong Kong psychiatrist] Lee's patient population appeared to transform into the more virulent American standard. Lee once saw two or three anorexic patients a year; by the end of the 1990s he was seeing that many new cases each month.
Now it's important not to see this as trivializing the condition or as a way of blaming the victim; "they're just following a trend!". You only have to look at someone with anorexia to see that there is nothing trivial about it. However, that doesn't mean it's not a social phenomenon.

It's a long way from the data in this study to Watters' conclusions, but maybe not an impossible leap. Part of Westernization, after all, is exposure to Western ideas about what is healthy eating and what's an eating disorder...

Becker, A., Fay, K., Agnew-Blais, J., Khan, A., Striegel-Moore, R., & Gilman, S. (2011). Social network media exposure and adolescent eating pathology in Fiji. The British Journal of Psychiatry, 198(1), 43-50. DOI: 10.1192/bjp.bp.110.078675

Wednesday, February 2, 2011

Pharma: Tamed But Still A Big Beast

Everyone knows that Big Pharma go around lying, concealing data and distorting science in an effort to sell their pills. Right?

Actually, not so much. They used to, but most of the really scandalous stuff happened many years ago. The late 1980s through to about the turn of the century were the Golden Age of pharmaceutical company deception.

This is when we had drugs that don't work getting approved, with the trials showing that they don't work buried, and only now being uncovered. Data on drug-induced suicides seemingly fudged to make them seem less scary. Textbooks "written by" leading psychiatrists that were, allegedly, in fact ghost-written on behalf of drug companies. Ghost-writing programs with chuckle-some names like CASPPER. And so on.

But today, we have to give credit where credit's due: things have improved. Credit is due not to the companies but to the authorities who put a stop to this nonsense through rules. Mandatory clinical trial registration to ensure all the data is available and stop outcome cherry-picking. Anti-ghostwriting rules (though they're not yet universal). And so on.

What's shocking is how long it took to get these simple rules in place. The next generation of scientists and doctors will look back on the 1990s with disbelief: they let them do what? But at least we woke up eventually.

Still, there's more left to do. At the moment, the main problem, as I see it, is that different jurisdictions have different rules, with the best ideas confined to one particular place. For instance, the USA has by far the most sensible system of clinical trial registration and reporting. Europe needs to catch up (we are catching up, but slowly).

Yet the USA is also one of the only countries (with New Zealand) to permit direct-to-consumer (DTC) advertising for prescription drugs. To the rest of the world, this is really weird. We all have a right to free speech. But drug companies pushing drugs directly to patients just isn't a free speech issue, in Europe. Corporations don't speak, they advertise.

DTC replaces medical judgement with marketing, undermining the doctor-patient relationship. The patient is meant to present his symptoms and the doctor to make a diagnosis and prescribe a treatment; DTC encourages self-diagnosis and self-prescription instead, and the fact that a doctor is still, technically, in charge and has to sign the prescription means little in practice.

So there's a lot to be happy about, but there's also a lot still to do.

Friday, January 21, 2011

Democrats vs. Dictators

What makes a government democratic?

The obvious answer is: people voted it into power. But that's completely wrong.

People voted Hitler into power. The Nazi party won by far the biggest single share of the vote in the 1932 elections, which were as "free and fair" as any in the world at that time. The next election was less free, but only thanks to a technically constitutional emergency decree. Hitler's assumption of all executive and legislative powers was aided by dirty tricks, but it was pretty much above board.

The current provisional government of Tunisia has not won any elections. The overthrown dictatorship won many, though they weren't free because most opposition was banned. The current government, however, is seen as more democratic, because its role is to facilitate free and fair elections. It will then dissolve and give power to whoever wins them.

Maybe the provisional government of Tunisia isn't entirely democratic. But it's clearly more democratic than Hitler, even though Hitler won more elections.

So being elected into power has nothing to do with being a democrat or a dictator. Don't forget that. What is a democratic regime, then? I think it's this: a regime is democratic if it would peacefully hand over power were it to lose an election. If and only if you respect the people's choice to kick you out, you're a democrat. It's not about winning elections, it's about losing them.

Dictators aren't dictators because their people don't like them. It's because they're going to rule whether or not people like them. They rule: that's the basic political fact. If the people agree, great - and many are genuinely popular. If not, too bad.

What we've seen in the Ivory Coast recently, and in Zimbabwe over the past few years, is what happens when elected dictators lose elections: they don't accept it, and blood flows. If you want a soundbite: a dictator is someone who's willing to get blood on their hands if that's what it takes to keep a grip on power.

Wednesday, January 12, 2011

A Brief Guide to Being Shot in the Head

You know what this is about. I don't have anything especially useful to say about the recent tragedy, or the question of crazy vs. political: at this stage, it's all speculation. Let's wait for the trial.

But anyway, the incredible thing is that Rep. Gabrielle Giffords survived a bullet to the head. How?

One of the amazing things about the brain is that almost all of it is unnecessary. The bullet passed through Giffords's left cerebral cortex, various parts of which are responsible for moving the right side of the body, seeing and hearing things from the right, and, in most people, language. But the only part of the brain which you actually need in order to live is the brainstem, which sits at the top of the spinal cord.

The main reason you need your brainstem is that it controls breathing. It also controls your heart rate and blood pressure, but your heart pumps itself, without any input from the brain: the brain just does the fine tuning. Breathing, however, is controlled directly by several brainstem nuclei, and if you stop breathing, your blood will run out of oxygen and you'll die (without artificial ventilation.)

Damage to any other part of the brain is survivable. Of course, you might just bleed to death from the head injury, or get an infection; there's also the risk of brain swelling, which can be fatal by compressing the brainstem (amongst other problems). This is why doctors have removed a large part of Giffords's skull, to give the brain room.

But the brainstem can do a surprising amount on its own. In the early days of neuroscience, there was a bit of a fad for decerebrating animals, essentially removing everything except the brainstem. These animals were still "alive", at least in the sense that they weren't corpses; decerebrate cats can walk and run.

They don't walk to anywhere, but this shows that the spinal cord and brainstem can control movement and respond to sensory feedback. It's even on YouTube. The famous headless chicken that lived for over a year - that really happened, it's no myth - is another such case.